Paul Yuknewicz and Boris Scholl
The Azure SDK for .NET has come a long way since its first release. Functionality has been regularly added as the Azure platform and associated tools evolved, such as adding Azure Cloud Service projects to any Web app project, ASP.NET MVC versions 3 and 4 project support, and a dramatically streamlined publishing experience. This aligns with Microsoft’s overall vision to improve the tools used for cloud development and to better integrate with all aspects of the application development lifecycle.
As of June 2012, Azure offers three compute container options to develop and run applications. These options include Azure Web Sites (Preview) for quick and easy Web site and Web application deployment; Azure Virtual Machines (Preview) for durable Windows Server and Linux Infrastructure as a Service (IaaS) virtual machines (VMs) and applications running on them; and Azure Cloud Services, which provide reserved, infinitely scalable, n-tier options running on a Platform as a Service (PaaS). This article focuses on developing cloud services; you can learn more about all of the options at windowsazure.com.
We’ll walk through parts of the cloud service application development lifecycle using Visual Studio and highlight the new features as we progress through that lifecycle. After reading through the article, readers new to Azure should have a basic understanding of cloud development with Visual Studio, and readers who already have experience with Azure development in Visual Studio should have a good understanding of the new features available.
The Azure SDK for .NET June 2012 release includes Visual Studio tools, which work on top of Visual Studio 2010 SP1 and Visual Studio 2012 release candidate (RC) and above. For this article, we’ll use Visual Studio 2012 RC. The best way to install the tools is to start Visual Studio, open the New Project dialog box and select the Cloud node. This will show the Get Azure SDK for .NET project template.
That link will take you to the .NET Developer Center (bit.ly/v5MF7m) for Azure and highlight the all-in-one installer for the SDK.
Once everything is installed, you can go ahead and create an Azure Cloud Service project, the focal project for working with Azure Cloud Services. The cloud service is a compute container for infinitely scalable, highly available and multitier cloud applications. A helpful new feature in this SDK release is that it works side by side with the Azure SDK for .NET November 2011 (1.6) release. This means you can still work on your 1.6 projects without having to upgrade them. In general, you have two options for creating an Azure application. The first approach is to create an Azure project from scratch. To do that, start Visual Studio with elevated privileges, click on the File menu and choose New | Project to bring up the New Project dialog. Under Installed Templates | Visual C# (or Visual Basic or F#), select the Cloud node and select the Azure Cloud Service Project template to bring up the New Cloud Service Project dialog. This dialog lets you add roles to a cloud service.
If the June 2012 SDK is installed side by side with the November 2011 SDK, a dropdown for the SDK version is shown at the top of the dialog, which lets you choose the SDK version with which to create the role.
Before we proceed, let’s quickly review what roles and instances mean. A role is basically a definition for both the app and the PaaS-managed VM it will run on, defining, for example, which OS and modules should be installed on the VM, which diagnostics settings should be used and what endpoints are supposed to be exposed. Think of the role as your template that lets you stamp out as many or as few instances (that is, VMs) as you need to scale to the current demands on your cloud service. There are currently two types of roles you can create from Visual Studio: Web Roles, which host your Web front end on IIS, and Worker Roles, which run long-running background processing.
Instances directly correspond to VMs in the cloud running roles defined by the role template.
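To make the role “template” idea concrete, the role definition lives in the cloud service project’s ServiceDefinition.csdef file. The following is a minimal sketch for a single Web Role; the service and role names are placeholders for illustration:

```xml
<ServiceDefinition name="AzureCloudService1"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="MvcWebRole1" vmsize="Small">
    <Sites>
      <Site name="Web">
        <Bindings>
          <Binding name="Endpoint1" endpointName="Endpoint1" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <!-- Public HTTP endpoint exposed by every instance of this role -->
      <InputEndpoint name="Endpoint1" protocol="http" port="80" />
    </Endpoints>
    <Imports>
      <!-- Installs the Azure diagnostics module on each of the role's VMs -->
      <Import moduleName="Diagnostics" />
    </Imports>
  </WebRole>
</ServiceDefinition>
```

Every instance stamped out from this role gets the same OS, endpoints and modules; only the instance count varies per environment.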
Figure 1 shows the components of a cloud application and how they work together.
Figure 1 Common Components of a Cloud Service
As noted, another compute container type introduced with the June 2012 release is the Azure Virtual Machine (Preview). VMs allow you to create completely custom workloads with any OS or any software (for example, databases, app servers or legacy components). This is great for adding a custom tier, which can be used in conjunction with cloud services. You can use this container to create VMs from scratch or use VMs provided by the VM gallery in Azure. At the time this article was written, you need to first create VMs in the management portal (rather than from within Visual Studio), but as you’ll see later, you can explore VMs and their properties in Server Explorer, so it’s easy to reference VM workloads in your cloud service code. Note that unlike Web and Worker Role VMs, which are managed and updated by Microsoft, you’re responsible for updating and managing your Azure Virtual Machines yourself.
If your application is under heavy load, you might need to scale out. You can do that by adding more instances to your cloud service.
After adding roles to the cloud service and clicking OK, Visual Studio will create a solution that includes the cloud service project and a project corresponding to each role you added.
The Web Roles are ASP.NET Web application projects with only a few differences. The Web Role project, MvcWebRole1, contains references to the following assemblies that are not referenced with a standard ASP.NET Web application:

Microsoft.WindowsAzure.Configuration provides the CloudConfigurationManager class, which reads settings from the service configuration whether the app is running locally or in the cloud:

  private string conn = CloudConfigurationManager.GetSetting("MyConnectionString");

Microsoft.WindowsAzure.ServiceRuntime exposes the runtime environment of the role, such as the current role instance:

  var roleInstance = RoleEnvironment.CurrentRoleInstance;

Microsoft.WindowsAzure.StorageClient contains the client APIs for Azure storage, such as BLOBs, tables and queues:

  CloudStorageAccount storageAccount =
    CloudStorageAccount.Parse(conn);
  CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

Microsoft.WindowsAzure.Diagnostics lets you configure diagnostics data collection for the role, as shown in Figure 2.
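As a quick sketch of using that storage client, you could persist a piece of customer data to BLOB storage as follows; the container and blob names here are made up for illustration:

```csharp
// Assumes storageAccount and blobClient were created as shown above
CloudBlobContainer container = blobClient.GetContainerReference("customers");
container.CreateIfNotExist(); // Idempotent; creates the container on first use

// Store a simple customer record as a text blob
CloudBlob blob = container.GetBlobReference("contoso-phone.txt");
blob.UploadText("Contoso: 555-0100");
```

Because the connection string comes from CloudConfigurationManager, the same code writes to the storage emulator locally and to your Azure storage account in the cloud.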
Figure 2 Initializing the Diagnostics Monitor and Adding Some Windows Event Log Data Sources
public class WebRole : RoleEntryPoint
{
  public override bool OnStart()
  {
    DiagnosticMonitorConfiguration diagConfig =
      DiagnosticMonitor.GetDefaultInitialConfiguration();

    // Add the Windows Application event log as a data source
    diagConfig.WindowsEventLog.DataSources.Add("Application!*");

    // Transfer only error-level entries (and higher) to storage
    diagConfig.WindowsEventLog.ScheduledTransferLogLevelFilter = LogLevel.Error;

    DiagnosticMonitor.Start(
      "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", diagConfig);

    return base.OnStart();
  }
}
With the exception of Microsoft.WindowsAzure.Diagnostics and Microsoft.WindowsAzure.ServiceRuntime, all the preceding assemblies are referenced as NuGet packages, which makes it easier to service your application when newer versions become available.
The cloud service project contains the roles that are included in the cloud service, along with the definition and configuration files. It provides Azure-specific run, debug and publish functionality.
You can use the New Project | Cloud | Azure Cloud Service dialog to create a cloud service with any number of Web and Worker Roles and use a different template for each role. Once the project is created, you need to configure each role by double-clicking on the role node—in our example, MvcWebRole1 under the Roles folder—in Solution Explorer. The role designer allows you to configure important role-related settings. For example, you can set the number of instances of each role independently to improve performance and provide redundancy. In fact, Azure requires at least two instances in order to ensure the high availability defined in the compute service-level agreement.
All the settings are stored in the Azure Service configuration file. Starting with the November 2011 1.6 release, you can create multiple configuration files to support different scenarios. For example, you might want to have a service configuration that contains only two instances for your staging environment and four instances in your production environment. By default, there are configuration files for local and cloud environments, but you can also create custom configs using the <Manage…> entry in the Service Configuration dropdown, as shown in Figure 3. The Azure management portal provides support for changing some of the configuration settings using a Web UI post-deployment.
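To make the instance-count scenario concrete, each service configuration file carries its own count per role. A cloud configuration with four instances might look like the following sketch, where the service and role names are placeholders:

```xml
<!-- ServiceConfiguration.Cloud.cscfg -->
<ServiceConfiguration serviceName="AzureCloudService1" osFamily="1" osVersion="*"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="MvcWebRole1">
    <!-- Four VMs in production; the local or staging file might specify two -->
    <Instances count="4" />
  </Role>
</ServiceConfiguration>
```

Switching target environments then comes down to selecting a different .cscfg file rather than editing the cloud service project itself.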
Figure 3 Service Configuration Files
To best explain how this works, let’s look at how storage connection strings are handled. Assume you want to build an application that stores customer data such as names and phone numbers. When running locally (for example, in debugging mode), the data is stored in the local storage used by the storage emulator or in a local database engine such as SQL Server 2012 LocalDB. This local configuration won’t work in the cloud, because there’s no persistent storage engine such as LocalDB inside your role; instead, you want to take advantage of the Azure storage options such as Azure Storage or Azure SQL Database. The local connection string also points to local storage or the localhost DNS name, so it won’t work in the cloud either.
Instead, you need to specify an Azure storage account in the connection string to match the target environment in Azure. This is where the multiple configurations feature comes in handy. You can define settings such as connection strings in the Settings section of the role designer. Here you can add a new setting of type Connection string and call it, for example, “MyConnectionString.”
You can now enter the connection string by clicking the ellipsis button to bring up the Storage Account Connection String builder. First, you add the value for your local storage by selecting the option “Use the Azure storage emulator.”
Next, you enter the connection string you want to use for your published app. To do that, you need to switch the service configuration setting (shown in Figure 3) to Cloud and bring up the Storage Account Connection String builder again, as shown in Figure 4.
Figure 4 Storage Account Connection String Builder for Cloud Configuration
You can now enter the storage account credentials you want to use to store your customer data. The account key can be obtained by using the “manage keys” option in the storage section of the Azure Management Portal.
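For illustration, after both steps the setting differs only in its value across the two configuration files; the account name below is a placeholder and the key is elided:

```xml
<!-- ServiceConfiguration.Local.cscfg -->
<Setting name="MyConnectionString" value="UseDevelopmentStorage=true" />

<!-- ServiceConfiguration.Cloud.cscfg -->
<Setting name="MyConnectionString"
    value="DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=..." />
```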
As mentioned, you can now access the connection string value in your code by using the CloudConfigurationManager object like this:
var conn = CloudConfigurationManager.GetSetting("MyConnectionString");
Visual Studio defaults to using the local configuration for F5, so your app will automatically use local storage for debugging. When you publish with the Cloud configuration selected (default setting), your code will use the connection string pointing to Azure instead. You’ll see later how to choose the service configuration when you’re about to publish the application to Azure.
As you can see, one big benefit of supporting multiple configuration files is that you don’t need to update the Azure service configuration each time you publish to a different target environment, nor is there a need to maintain multiple Azure cloud service projects.
Another interesting new feature of the June 2012 release is Azure Caching (Preview). Like any caching solution, the aim is to improve performance and latency. Unlike other solutions, this one runs in your role instances and can serve as a distributed and highly available service to your role instances. It can be used as a provider for existing caching APIs such as ASP.NET caching, output cache, session-state cache or memcache. Caching is enabled and configured in a role. You can simply check the Enable Caching checkbox to get smart defaults and get caching up and running immediately.
Existing roles with excess memory can be co-located with the cache service, or you have the option to create a new dedicated cache role. The co-located option has the benefit that you can take advantage of the excess resources in role instances for which you’re already paying. Each in-role cache service can be configured to host multiple named caches that have individual policies that control each named cache’s availability, eviction policy and more. Each named cache can be configured for memcache compatibility and accessed programmatically, and serve as the backing store for existing caching solutions such as output cache or session-state cache. The new in-role cache service is compatible with the on-premises ApplicationServer APIs for caching to allow easy migration from on-premises to the cloud. As mentioned, you can even create a dedicated Caching Role by choosing the Cache Worker Role template when adding a new role to your project. More information on how to set up caching can be found in the MSDN Library article, “How To: Use Azure Caching (Preview),” at bit.ly/LRFStZ.
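Programmatic access to a named cache goes through the ApplicationServer caching APIs. The following is a minimal sketch assuming the cache client has been configured in the role’s web.config or app.config (which the caching NuGet package sets up by default); the cache key is made up for illustration:

```csharp
using Microsoft.ApplicationServer.Caching;

// Connect to the default named cache hosted in your role instances
DataCacheFactory cacheFactory = new DataCacheFactory();
DataCache cache = cacheFactory.GetDefaultCache();

// Cache a customer record for fast lookups across all instances
cache.Put("customer:42", "Contoso: 555-0100");
string phone = (string)cache.Get("customer:42");
```

Because the cache is distributed across your role instances, any instance can read back a value another instance put in.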
In addition to the traditional way of creating an Azure project, you can also “Azurify” an existing Web Application project. Let’s assume you already have a Web Application project such as an ASP.NET MVC 4 project that you want to deploy to Azure. Right-click on your project and choose the “Add Azure Cloud Service Project” command.
This adds a cloud service project to the solution. Visual Studio also adds the NuGet references needed for an Azure project to the MVC project. In addition, the tool sets the Copy Local property for the System.Web.MVC assembly to true because the assembly isn’t available on Azure.
Once you’ve created your project, you’ll likely need to debug it. You should always remember that Visual Studio needs to be started with admin-elevated privileges in order to debug an Azure application. As with any other Visual Studio project, you can start debugging by setting a breakpoint and hitting F5. It’s worth mentioning that in the 1.7 SDK release, Visual Studio and the Compute Emulator use IIS Express to host instances and LocalDB for development storage by default, as opposed to IIS and SQL Express in the 1.6 SDK release (don’t worry, these options still exist in the project properties).
When debugging, Visual Studio will automatically use the local service configuration. As with previous versions, you can use the Azure Compute Emulator to perform various operations on your deployments, such as viewing logs and restarting and deleting deployments (see Figure 5). To bring up the Compute Emulator, right-click on the Azure notification icon in the taskbar and click on Show Compute Emulator UI. Notice that the Azure Compute Emulator contains a new deployment that hosts two Web Role instances and two Worker Role instances.
Figure 5 The Azure Compute Emulator
Now that you’ve created, edited and debugged your application locally, you’re ready to deploy it to Azure. In general, it’s good practice to follow the application development lifecycle before doing a final publish to the Azure production environment. First, you should publish the application to a test environment. The test environment is basically a cloud service that you need to create and use for testing purposes only. This environment allows you to test if the application behaves as expected when hosted in Azure. Once the tests are successful, you can publish to the staging environment. In this environment, you can do user-acceptance tests to validate if the application provides the functionality for which it was designed. Finally, if all the tests pass, you can publish to the production environment.
The publishing process was improved considerably in the November 2011 release with the introduction of a new publishing wizard. To open the wizard, right-click on the Azure project and select Publish. Note that you need to right-click on the Azure project, not on a Web Application project such as an ASP.NET MVC project; otherwise, the Web publishing wizard for Web Deploy will launch instead of the Azure publishing wizard.
When you publish to Azure for the very first time, you need to click on the “Sign in to download credentials” link on the publish page to download the .publishsettings file. This file contains the metadata and credentials needed for Visual Studio to work with your Azure subscription. The .publishsettings download page will create an Azure management certificate for Visual Studio and embed it in the .publishsettings file along with your subscription details. All of these details will be installed and stored on your local development machine when imported. Important note: This file contains very sensitive information such as subscription IDs and your management certificate, so it’s best to store the file in a secure place or delete it immediately after importing.
Once the file is downloaded to the development machine, you can import it by using the Import button on the first page of the wizard. All of your subscriptions will now show up in the dropdown box, as shown in Figure 6. The nice thing about the dropdown box is that you can also manage your authentication settings such as creating new certificates, renaming credentials and so on.
Figure 6 The Publishing Wizard
Clicking the Next button takes you to the Common Settings tab (see Figure 7). This tab allows you to select an existing cloud service or even create a new one. Also, you can choose which environment you want to deploy to—production or staging—and which build service configuration you want to use. Remember we talked about multiple service configurations earlier in this article? This is the place where you can choose which ones to use for your deployment.
Figure 7 Common Settings Tab
Remote Desktop and Web Deploy can be enabled on that tab as well. Enabling Remote Desktop is really helpful if you want to connect to specific role instances on Azure later for diagnostic purposes. Once you’ve decided to enable Remote Desktop and the deployment has been successfully published, you can connect directly to the role instance using the “Connect using Remote Desktop” context menu on an instance in Server Explorer, as shown in Figure 8. The context menu of Server Explorer automatically displays menu items based on the functionality enabled during publish. In our example, we enabled Remote Desktop.
Figure 8 Connect Using Remote Desktop
There are some useful settings in the Advanced Settings tab of the publishing wizard. Here you can specify which storage account to use for publishing or create a new one. We recommend always creating or using storage accounts in the same location (that is, the datacenter) as the cloud service using them in order to avoid performance impacts caused by latency. In addition, you can control the update behavior of your deployment as well as the troubleshooting tools you want to use for your deployment.
Finally, after applying the settings you want to use for publishing, you’ll be taken to the summary page of the wizard.
The summary page lists all the settings you made and allows you to save a target profile for publishing. Unlike the cloud configuration, the profile stores the Visual Studio publish configuration. The target profile is basically an MSBuild definition file with the file extension .azurePubxml. All the settings applied in the publishing wizard are saved to that file. This comes in handy when you have different publishing settings or multiple target environments. Just think about testing and production environments. In the test environment, you’ll probably want to have IntelliTrace turned on, though in the production environment, you won’t. Now, instead of going through the wizard again before you can publish to production, you just need to select the publishing profile for the production environment. This is quite effective with regard to standardizing publishing.
One last thing worth mentioning for publishing: Visual Studio will perform a one-time update of the connection strings for diagnostics and caching in the service configuration file by default with the value for the publish storage account if the value is still UseDevelopmentStorage=true, as shown in Figure 9. This behavior can be disabled in the Role Designer by unchecking the “Update development storage connection strings …” checkbox.
Figure 9 Update Development Storage Connection Strings for Diagnostics and Caching
Clicking the Publish button kicks off the publishing process once the build succeeds. Because publishing can take some time, it’s useful that Visual Studio shows a detailed status of the publishing step currently being executed in the Azure Activity Log window. Although several operations are handled by the background publish process, you still get a pretty good idea of which step is being performed by looking at the Activity Log status. Figure 10 shows the Activity Log window after a successful publish.
Figure 10 The Activity Log Window
You can now browse to your cloud service in a browser by clicking the “Web site URL” in the window.
We already looked at how to connect to one of your instances using Server Explorer. While connecting using Remote Desktop to Azure Compute instances and Azure Virtual Machines is certainly a big asset, it’s worth pointing out some of the other useful features in Server Explorer. In the June 2012 SDK, you can connect directly to the Azure Service Bus by providing the namespace and the key for that namespace. Once connected, you have visibility into Service Bus queues and topics. You can even create new Service Bus queues and topics, as shown in Figure 11.
Figure 11 Creating a New Service Bus Queue
Also, you can add storage accounts to the Azure storage node and see what’s stored in the BLOBs and tables. Currently, you can only read the data in Azure storage using Server Explorer. Last, Server Explorer now also lists the Azure Virtual Machines and all the endpoints exposed by the VMs.
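The same Service Bus queues you see in Server Explorer can, of course, be created and used from code. The following is a hedged sketch using the Service Bus client library of this era; the namespace name, key and queue name are placeholders:

```csharp
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

Uri address = ServiceBusEnvironment.CreateServiceUri("sb", "mynamespace", string.Empty);
TokenProvider token = TokenProvider.CreateSharedSecretTokenProvider("owner", "yourIssuerKey");

// Create the queue if it doesn't exist yet
NamespaceManager namespaceManager = new NamespaceManager(address, token);
if (!namespaceManager.QueueExists("orders"))
  namespaceManager.CreateQueue("orders");

// Send a message to the queue
MessagingFactory factory = MessagingFactory.Create(address, token);
QueueClient client = factory.CreateQueueClient("orders");
client.Send(new BrokeredMessage("New order received"));
```

After running this, the “orders” queue and its growing message count show up under the Service Bus node in Server Explorer.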
Earlier you saw a code sample that initializes the diagnostics monitor to collect Windows event log data sources. Explaining how Azure diagnostics works goes beyond the scope of this article; however, let’s still take a quick look at it.
In addition to Windows event log data sources, you have the option to add code that will start collecting Azure trace logs, infrastructure logs, crash dumps and so on. More information on enabling and configuring Azure diagnostics can be found at bit.ly/MMkwiK.
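Building on the diagConfig object from Figure 2, a sketch of adding trace logs and a performance counter might look like the following; the counter and sample rate are illustrative choices:

```csharp
// Transfer Azure trace logs to storage every minute
diagConfig.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);
diagConfig.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;

// Sample CPU usage every five seconds and transfer it on the same schedule
diagConfig.PerformanceCounters.DataSources.Add(
  new PerformanceCounterConfiguration
  {
    CounterSpecifier = @"\Processor(_Total)\% Processor Time",
    SampleRate = TimeSpan.FromSeconds(5.0)
  });
diagConfig.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);
```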
The major difference in collecting diagnostics data in Azure versus an on-premises application is where the diagnostics data is stored. Azure stores the diagnostics data in Azure storage (we discussed the storage account for diagnostics earlier in the “Publishing to Azure” section). Some diagnostics data, for example IIS logs and IIS failed requests, is stored in BLOB storage; other data, such as trace logs, performance counters and Windows event logs, is stored in table storage. Figure 12 shows a storage account named intellitracetest with BLOB and table storage containing Azure diagnostics data. The BLOBs and tables are easily identifiable, as they start with “wad-” or “WAD.”
Figure 12 Storage Account Containing Diagnostics Data
In order to move the data to the storage, it’s important to always use the ScheduledTransferPeriod property. The following code shows an example for Windows event logs:
diagConfig.WindowsEventLog.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);
Enabling diagnostics definitely helps with troubleshooting and monitoring. But, as developers, we love the debugger. Currently, Visual Studio doesn’t provide out-of-the-box support for debugging applications running in Azure, but you can use familiar tools such as IntelliTrace and Profiling. Before we wrap up, let’s have a look at IntelliTrace. IntelliTrace is available only in the Visual Studio Ultimate edition; however, it’s such an invaluable feature for cloud developers that it’s definitely worth mentioning. Let’s assume you need to check whether the Worker Role in your project started as expected. Because there’s some tracing code in the WorkerRole : RoleEntryPoint class, you can use IntelliTrace to debug the role’s OnStart event.
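The article doesn’t reproduce that tracing code, but a typical sketch of trace statements in a Worker Role entry point, along the lines of the default project template, might look like this:

```csharp
using System.Diagnostics;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WorkerRole : RoleEntryPoint
{
  public override bool OnStart()
  {
    // This event is what you'd look for in IntelliTrace to confirm startup
    Trace.WriteLine("WorkerRole OnStart called", "Information");
    return base.OnStart();
  }

  public override void Run()
  {
    Trace.WriteLine("WorkerRole entry point called", "Information");
    while (true)
    {
      Thread.Sleep(10000);
      Trace.WriteLine("Working", "Information");
    }
  }
}
```

With diagnostics transfers configured, these same trace events also end up in table storage alongside the other diagnostics data discussed earlier.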
Remember that we previously discussed enabling IntelliTrace for your deployment? After the successful publish, when you right-click on a compute instance in Server Explorer and have IntelliTrace enabled for the deployment, you get another context menu item saying “View IntelliTrace logs” (it works the same way if you had enabled Profiling). Remember that enabling IntelliTrace and Profiling are mutually exclusive (a current limitation that we hope to remove in the future).
Once you click “View IntelliTrace logs,” Visual Studio downloads the logs and displays them, as shown in Figure 13.
Figure 13 Debugging Using IntelliTrace
You’re now able to see all of the events, and you can start debugging from the event in which you’re interested. The IntelliTrace window gives good information on the sequence of the events, particular events and exception information. Visual Studio takes you straight to the relevant code in the code editor when you click on an event.
Even though we just scratched the surface of all the Azure tools features, you should have an idea of how easy it is to develop and debug cloud service applications using Visual Studio. For the latest and greatest news and information about .NET development on Azure, please see bit.ly/v5MF7m.
Boris Scholl is a senior program manager with the cloud tools team for Visual Studio, focused on building end-to-end developer experiences for Azure. Before joining the team, he spent time working on the Visual Studio SharePoint tools team and as an architect in the Microsoft field designing SharePoint and cloud solutions.
Paul Yuknewicz is a principal program manager lead for cloud tools, Windows Forms and Visual Basic 6 for Visual Studio.
Thanks to the following technical experts for reviewing this article: Gordon Hodgenson, Jim Nakashima and Mohit Srivastava