I’m pleased to announce the release of the June 2010 Windows Azure Tools + SDK.
The Windows Azure Tools for Microsoft Visual Studio extend Visual Studio 2010 and Visual Studio 2008 to enable the creation, configuration, building, debugging, running, packaging and deployment of scalable web applications and services on Windows Azure.
Lots of new features for this release:
QFE to enable IntelliTrace on 32-Bit OS.
We’re really excited about this release as it gave us the opportunity to be more tools focused than we’ve been able to be in the past.
Hope you like it and as always – please send me feedback.
One of the cool new features of the June 2010 Windows Azure Tools + SDK is the integration of IntelliTrace to allow you to debug issues that occur in the cloud.
IntelliTrace support requires .NET 4 and Visual Studio 2010 Ultimate, and the cloud service has to be deployed with IntelliTrace enabled. If you are using a 32-bit OS, you need this patch/QFE.
To enable IntelliTrace, right click on the cloud service project and select “Publish”.
At the bottom of our publish dialog, click to select “Enable IntelliTrace for .NET 4 roles”.
You can also configure the IntelliTrace settings for the cloud. (These are separate from the settings in Tools | Options, which are used for the debug (F5) scenario that we currently do not support with Cloud Services/Development Fabric.)
A couple of notes about IntelliTrace settings.
We default to high mode, which is different from the F5 IntelliTrace settings in Visual Studio. The reason is that F5 IntelliTrace includes both debugger and IntelliTrace data, while in the cloud you are only able to get back IntelliTrace data.
Additionally, we exclude Microsoft.WindowsAzure.StorageClient.dll, as we found that the slowdown caused by IntelliTrace instrumentation resulted in timeouts to storage. You may want to remove the storage client assembly from the exclusion list.
To reset the IntelliTrace settings back to the default, you can delete “collectionplan.xml” from %AppData%\Microsoft\VisualStudio\10.0\Cloud Tools (%AppData% already expands to the Roaming folder).
Click “OK” to package up everything you need to IntelliTrace the web and worker host processes in the cloud and start the deployment process.
Note: There is a current limitation that child processes cannot be IntelliTrace debugged.
The deployment process is completely asynchronous, so you can continue to work while you wait for the deployment to complete, and you can track the progress through the Windows Azure Activity Log tool window.
After the deployment has completed, open up the Windows Azure Compute node in Server Explorer to browse hosted services deployed to Windows Azure.
You can add a hosted service by right clicking on the Windows Azure Compute node and selecting “Add Slot…”
This will bring up a dialog you can use to choose a slot or add/manage your credentials.
The Server Explorer will show you which Hosted Services have IntelliTrace enabled. They are the ones that have “(IntelliTrace)” beside the slot name.
Expand the nodes and navigate to an instance; you can get the IntelliTrace logs for that instance by right clicking on the instance node and selecting “View IntelliTrace Logs”.
Note: Normally, when a role process exits, it automatically gets restarted by the fabric, which causes the cycling role state behavior that some of you are familiar with. When IntelliTrace is enabled, an exiting role process is not restarted and is put into the “Unresponsive” state instead. This allows you to get the IntelliTrace logs for the failure.
Similar to how you can track the progress of deployment from the Windows Azure Activity Log, you can also track the download of the IntelliTrace logs asynchronously.
You’ll then see the IntelliTrace files open in Visual Studio.
You can now browse the Exception Data on the summary page, or put Visual Studio into debug mode by clicking an exception and clicking the “Start Debugging” button, or by double clicking on one of the threads in the thread list.
Being in debug mode will bring up the IntelliTrace tool window, which shows you all of the IntelliTrace events. You can filter between different categories, or “Switch to Calls View”, which shows the call stack and lets you drill in and out of the various methods.
You can also open up your source code and right click on a line and select “Search for this line in IntelliTrace”.
When the search is complete, you can click the navigation buttons at the top of the file to select one of the instances in which this line was executed, and use the historical debugging buttons on the left to debug forward and backward through the code, looking at the call stack and locals as you step through the control flow.
Debugging Common Issues Using IntelliTrace
Missing an Assembly in the Service Package:
This is by far one of the most common "works on the devfabric, fails in the cloud" issues. View the IntelliTrace log and look for FileNotFoundExceptions in the exception list or IntelliTrace events.
In the IntelliTrace events window:
Using an incorrect Windows Azure storage connection string:
This one is a little tougher, as there isn’t a top-level exception you can look at. Search IntelliTrace for the methods where you use connection strings and check the input and return values, for example on the CloudStorageAccount and DiagnosticMonitor calls.
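For reference, the storage connection string lives in your service configuration. Here is a sketch of a ServiceConfiguration.cscfg setting (the service name, role name, account name, and setting name are placeholder values, and the key is elided):

```xml
<ServiceConfiguration serviceName="MyService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole1">
    <Instances count="1" />
    <ConfigurationSettings>
      <!-- On the devfabric this is often "UseDevelopmentStorage=true",
           which will not work once you deploy to the cloud. -->
      <Setting name="DataConnectionString"
               value="DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=..." />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>
```

If a call like CloudStorageAccount.Parse is showing an unexpected input value in IntelliTrace, this is the file to check.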
Missing a CRT library in the Cloud:
For the scenario where you are calling into a native DLL but did not xcopy deploy the CRT along with your service package, an exception will surface in the IntelliTrace summary naming the native DLL that could not be loaded.
Using a 32 Bit Native Library in the Cloud:
This issue is very similar to the missing CRT example above; in this scenario you’ve been successfully developing on a 32-bit machine but get a failure in the cloud when the 32-bit DLL is loaded in a 64-bit process.
With IntelliTrace, an exception showing which native library failed to load is surfaced in the IntelliTrace summary screen.
In the case where the loading of the assembly is triggered by a method call outside of the startup code, you can double click the exception to get to the line of your code that made the call into native code that loaded the DLL.
Using code that requires admin access:
If you are running into this issue, you should be testing against the Development Fabric before deploying to the cloud. That said, our support data shows that this is one of the issues people run into.
I tried to do a registry write to HKLM, which fails in the following way:
An exception is shown in the IntelliTrace summary and when double clicked, will navigate to the line of code that is causing the exception.
Using an ASP.NET provider with the default SQL Server connection string in the cloud:
In this scenario, you are using the ASP.NET providers; the default MVC and ASP.NET Web Application templates both use them. In the devfabric they work fine, as they use SQL Express under the hood by default, but when you deploy to the cloud they no longer work. (An exception web page is shown after a wait.)
In opening the IntelliTrace summary, you will see the exception "Unable to connect to SQL Server database" with a stack trace that points to one of the providers, in my example, it was the SqlMembershipProvider.
Using a diagnostics connection string that uses HTTP endpoints:
In this scenario, you’ve deployed to the cloud but forgot to change your Windows Azure storage connection strings. If you incorrectly selected HTTP endpoints for the storage account and didn’t try running your app with the new connection strings on the devfabric before deploying, you can run into this problem.
When opening the IntelliTrace log, you will see an exception in the summary indicating that the endpoint is not a secure connection.
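The fix is to switch the diagnostics connection string to HTTPS endpoints. A sketch, with “myaccount” as a placeholder account name and the key elided:

```
Fails for diagnostics (HTTP endpoints):
  DefaultEndpointsProtocol=http;AccountName=myaccount;AccountKey=...

Works (HTTPS endpoints):
  DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=...
```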
To sum up, I’m really excited about this feature. I hope it will help a lot of people see into the cloud and diagnose issues, saving both time and frustration.
As part of the June 2010 release of the Windows Azure Tools, we now have a Windows Azure Storage browser in the Visual Studio Server Explorer:
This is our first cut at the feature, and we’ve been iterating fairly quickly on the Windows Azure Tools, so I’m excited about it not only for what it delivers today but also because it lays the foundation for the future. In this post, I’ll go over what you can and can’t do with the Windows Azure Storage browser and how we added some features to hopefully make it easier for you to handle navigating through large data sets.
Connecting to a Storage Account
The development storage node is always available under the Windows Azure Storage node and, when opened, will also start up the development storage if it isn’t already running.
To add a storage account in the cloud, right click on the Windows Azure Storage node and select “Add New Account…”
This will pop up a dialog that allows you to enter your storage account credentials:
This corresponds to the account name and key that you set up on the Windows Azure Developer Portal, for example:
In this case “serverexplorer” is the account name and you can use either the primary or secondary access keys as the account key.
That said, one of our design principles is not to ask you to enter the same information more than once. So if you’ve entered storage connection strings in your cloud service (specifically, if you’ve added connection strings as configuration settings in the roles of your Cloud Service), we find those and show them in the combo box. If you select one, we’ll fill out the name and key so that you don’t have to re-enter that same information.
Once you hit OK, the new storage account will be shown in the Server Explorer:
Browsing Blob Storage
To browse blobs, you can open up the storage account and then open up the “Blobs” node to see a list of the containers in that storage account.
By double clicking on a container, you can see all of the blobs that are in that container.
To help you handle large data sets, we retrieve the blob list 200 blobs at a time. As the blob list is downloading, you can click to pause or resume the download.
Pausing gives you the ability to download a blob, see blob properties (right click on a row and select “Properties”), or enter a filter by blob prefix.
Filtering by blob prefix occurs on the server side so only the filtered list is returned and shown.
Our thought is that by supporting both filtering and pause/resume, you will be able to use the Windows Azure Storage browser with containers that contain a large number of blobs.
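Under the covers, prefix filtering maps to the Blob service’s List Blobs REST operation and its prefix and maxresults parameters. A sketch of the request, where “myaccount”, “mycontainer”, and the “logs/” prefix are hypothetical:

```
GET https://myaccount.blob.core.windows.net/mycontainer?restype=container&comp=list&prefix=logs/&maxresults=200
```

Because the server only returns blobs whose names start with the prefix, the filtered list can stay small even when the container holds millions of blobs.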
We also support downloading blobs by double clicking on them. This will add a line item into the Windows Azure Activity Log window in Visual Studio which we use to track long running processes that relate to our Windows Azure Tools.
After downloading is complete, the blob will be opened in Visual Studio if the file type is supported.
One of the hard cuts for this release was edit/write support. We really hoped to add the ability to delete blobs and containers, because viewing and deleting really covers the core developer scenario. Unfortunately, we’ll have to wait for a future release to add that in... but again, I’m excited about the foundation this feature provides, and its integration into Visual Studio makes it really convenient.
Browsing Table Storage
Browsing Table Storage works in a very similar way.
When you open up a table, we download 200 rows at a time and allow you to pause/resume. If you pause you can filter using a WCF Data Services $filter query option.
What you can put in the text box is anything you would put after ‘$filter=’ in a WCF Data Services URI. For example, “Address gt ‘989 Redmond Way, Redmond WA’”.
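A few more example filter expressions in that syntax (the property names and values here are hypothetical, other than the standard PartitionKey, RowKey, and Timestamp properties):

```
PartitionKey eq 'Customers'
RowKey ge '1000' and RowKey lt '2000'
Timestamp gt datetime'2010-06-01T00:00:00Z'
(Age gt 21) and (City eq 'Redmond')
```

Note that string and datetime literals use single quotes, and datetime literals take the datetime'...' prefix form.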
Having a table viewer right in Visual Studio now allows you to view Windows Azure Diagnostic trace messages without having to leave Visual Studio.
Similar to the blob storage viewer, we also had to cut the edit/write capabilities for table storage.
We are really dying to get the edit/write capability and Queue capability into the product. Hopefully we’ll be able to schedule it soon!
We built the Windows Azure Storage browser for you, so let me know what you like, don’t like, and what features you want to see next!
Thank you to all of you who attended my session at TechEd 2010 - COS307 | Using Microsoft Visual Studio 2010 to Build Applications That Run on Windows Azure
Here are some of the key takeaways and links from the session:
Lots of New Tools
The June 2010 release of the Windows Azure Tools now includes:
The Web Platform Installer automates a number of the setup steps, such as installing IIS prior to installing the Windows Azure Tools for VS 2010.
Get the patches - http://msdn.microsoft.com/en-us/azure/cc974146.aspx
ASP.NET Web Roles vs ASP.NET Web Applications
The 3 differences are:
The NerdDinner sample code can be found at: http://nerddinner.codeplex.com/
ASP.NET Provider scripts for SQL Azure
To use the ASP.NET providers with SQL Azure, you can use these scripts to set up the database: http://support.microsoft.com/default.aspx/kb/2006191
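After running the scripts, you point the providers at SQL Azure by replacing the default LocalSqlServer connection string in web.config. A sketch, where the server name, database name, user, and password are all placeholders:

```xml
<connectionStrings>
  <!-- Remove the machine.config default (which points at SQL Express),
       then re-add LocalSqlServer pointing at SQL Azure. -->
  <clear />
  <add name="LocalSqlServer"
       connectionString="Server=tcp:myserver.database.windows.net;Database=aspnetdb;User ID=myuser@myserver;Password=mypassword;Encrypt=True;"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```

The ASP.NET membership, role, and profile providers all reference LocalSqlServer by default, so swapping this one entry redirects all of them.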
Using IntelliTrace to debug services / applications that are running in the Cloud
If you are attending TechEd 2010 next week, I really hope to see you there. I’ll be at the Windows Azure booth and speaking on Wednesday June 9th at 5:00 in Room 356.
I’ll be speaking about the end to end development experience for Windows Azure and have about 3-4 cool new things to show that I’m really excited about.
COS307 | Using Microsoft Visual Studio 2010 to Build Applications That Run on Windows Azure
A platform is only as powerful as the tools that let you build applications for it. This session focuses on using demos, not slides, to show the best way to use Visual Studio 2010 to develop Windows Azure applications. Learn tips, tricks and solutions to common problems when creating or moving an existing application to run on Windows Azure. Come see how Visual Studio 2010 supports all parts of the development cycle as we show how to take an ASP.NET application running on IIS and make it a scalable cloud application running on Windows Azure.
If you can’t make it, stay tuned, I’ll be posting links to all of the Windows Azure videos and I have some blog posts coming on those cool new things :)