• Cloudy in Seattle

    Migrating an Existing ASP.NET App to run on Windows Azure

    • 20 Comments

This post has 2 main parts.  The first part is an update to a post I wrote back in February 2009 about using an existing ASP.NET Web Application as a Web Role, and rolls in information from this post and this post.  The second part is about migrating an existing database running on SQL Express and ASP.NET providers to SQL Azure.

    I’ll start with the NerdDinner sample, so make sure you have ASP.NET MVC installed.  Although I used VS 2008 for the screen shots, this walkthrough is compatible with VS 2010.

    I’ve opened the solution in Visual Studio and removed the test project to keep things simple.

    image

    The first thing I need to do is make this Web Application project a Web Role. 

    I can do that one of two ways:

    1) Since I have the NerdDinner project open, I can add a Windows Azure Cloud Service to the solution.

    image

    Select “Windows Azure Cloud Service” and hit “OK”

    image

    Hit “OK” again, because we don’t need to add any Roles to this Cloud Service.

    image

    Right click on the “Roles” node in the Cloud Service project and select “Add | Web Role Project in solution…” 

    image

    Select the NerdDinner project.  Note that all of the Web Application projects in the solution will show up in this list. 

    image

2) The other option would have been to create a new Cloud Service project and add the NerdDinner project to it via Solution | Add | Existing Project…, then following the Add | Web Role Project in solution… step above.

    image

    We now have the following:

    image 

    Before I get to what it will take to hit F5 and make the NerdDinner application run as a Web Role, let’s discuss the differences between a Web Role and an ASP.NET Web Application.

    There are 4 differences and they are:

    • References to the Windows Azure specific assemblies: Microsoft.WindowsAzure.Diagnostics, Microsoft.WindowsAzure.ServiceRuntime, and Microsoft.WindowsAzure.StorageClient
• Bootstrap code in the WebRole.cs/vb file that starts the DiagnosticMonitor as well as defines a default behavior of recycling the role when a configuration setting change occurs (sketched below).
    • The addition of a trace listener in the web.config file: Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener.
• In the case of an MVC web application, the assembly reference to System.Web.Mvc may not have the Copy Local property set to “True” – you need to have this to ensure that the System.Web.Mvc assembly is available in the cloud.  The cloud VMs only contain the assemblies that come with the .NET Framework 3.5 SP1 redistributable. (System.Web.Mvc is not one of them, and this is actually a hard thing to diagnose today as your role will go into an initializing, starting, stopping loop.)  Setting “Copy Local” to True will ensure the assembly is added to the Service Package – the package that gets uploaded to the cloud and is used to run a Cloud Service on the local development fabric.

    image
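For reference, here's roughly what that bootstrap code in WebRole.cs (difference #2) looks like – a minimal sketch based on the v1.0 template, so the generated details may differ slightly:

using System.Linq;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Start the diagnostic monitor using the connection string name
        // defined in the service configuration.
        DiagnosticMonitor.Start("DiagnosticsConnectionString");

        // Recycle the role instance when a configuration setting changes.
        RoleEnvironment.Changing += RoleEnvironmentChanging;

        return base.OnStart();
    }

    private void RoleEnvironmentChanging(object sender, RoleEnvironmentChangingEventArgs e)
    {
        // Setting e.Cancel to true takes the instance offline and restarts
        // it so that it picks up the new setting values.
        if (e.Changes.Any(change => change is RoleEnvironmentConfigurationSettingChange))
        {
            e.Cancel = true;
        }
    }
}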

    Additionally, today we only support targeting .NET Framework 3.5 SP1.  Stay tuned for .NET 4 support.

    Except for #4, the other 3 differences aren’t strictly required.

    Chances are, at a minimum you are going to want to reference Microsoft.WindowsAzure.ServiceRuntime and Microsoft.WindowsAzure.Diagnostics, start the Diagnostic Monitor and add the trace listener so that you can write logs and gather other diagnostic information to diagnose issues or monitor the health of your application.
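For example, here's a minimal sketch of that kind of logging, assuming the Diagnostic Monitor has been started and the trace listener from the list above is registered in web.config (the class and message here are made up):

using System.Diagnostics;

public static class DinnerLog
{
    // Standard trace calls like this are picked up by the
    // DiagnosticMonitorTraceListener and collected as diagnostic data.
    public static void DinnerViewed(int dinnerId)
    {
        Trace.TraceInformation("Dinner {0} was viewed", dinnerId);
    }
}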

    If you use Windows Azure Storage, you are going to want to use the Microsoft.WindowsAzure.StorageClient library which provides a .NET interface to Windows Azure Storage.
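To give a flavor of that library, here's a hypothetical sketch of uploading a blob – the “DataConnectionString” setting name, container name and class are my own inventions, not part of NerdDinner:

using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public static class PhotoStore
{
    public static void UploadPhoto(string localPath, string blobName)
    {
        // Read the storage account from configuration (this requires a
        // configuration setting publisher to be set up in the role).
        CloudStorageAccount account =
            CloudStorageAccount.FromConfigurationSetting("DataConnectionString");

        // Create the container on first use, then upload the file.
        CloudBlobClient blobClient = account.CreateCloudBlobClient();
        CloudBlobContainer container = blobClient.GetContainerReference("dinner-photos");
        container.CreateIfNotExist();
        container.GetBlobReference(blobName).UploadFile(localPath);
    }
}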

    For the sake of this article, I’m just going to make sure that System.Web.Mvc has Copy Local set to true.

    Let’s try hitting F5 and seeing what we get.

    The application runs… almost…  if I click on “View Upcoming Dinners” I get an error. 

    image

    The connectionstrings.config file that is referenced in the web.config is not being found.

    <connectionStrings configSource="ConnectionStrings.config" />

    I need to make sure that this file is added to the Service Package.  I can do so by adding it to the NerdDinner project (right click on the NerdDinner project in Solution Explorer and select Add | Existing Item…)

    image

Now make sure the Build Action is set to Content.  This should be the default, but I want to call it out as a way to ensure that given files in the project get added to the Service Package – you may need to do this with some of your other files. 

    image

    Now, I hit F5 again, and everything works. But will it work in the cloud?

The answer is no – NerdDinner has a NerdDinner.mdf file it uses for data and it makes use of ASP.NET providers – both of these rely on SQL Express, which I have on my local machine but which is not available in the cloud VMs (and even if it were, you would need a story that works across multiple instances).

I have a decision to make.  I can use SQL Azure or I can rewrite the application to use Windows Azure Storage.  Since it is easy to do a search for “NerdDinner Azure” and find examples of the latter, I will do the former – the point of this article is primarily to focus on using an existing project in a Windows Azure Cloud Service.

    In the real world, you’ll want to consider your data and compare the long term costs / effort / requirements to make the right decision.

The key tool I need to migrate the data to SQL Azure is the SQL Server Management Studio 2008 R2 CTP that supports SQL Azure.  Links are here. This post may also be helpful.

I also need to have a database set up on SQL Azure.  Go to sql.azure.com and sign in. Create databases called NerdDinnerDB and aspprovidersdb.

    image

Make sure to set your firewall settings such that the machine where your app is running in the development fabric has access. For example, the rule shown below is not recommended, but makes development on multiple machines easy.

    image

    The steps to migrate the data are now:

    1. Migrate the NerdDinner.MDF
    2. Migrate the ASP.NET providers
    3. Change the connection strings.

    Let’s start with NerdDinner.MDF. 

    Open SQL Server Management Studio 2008 R2 and connect to .\SQLExpress. 

    Right click on Databases, and select “Attach…”


    image

    Click the Add… button and browse to the location of the NerdDinner.MDF file then click OK.

    image

    Now right click on the NerdDinner database that was just added, select Tasks | Generate Scripts…

    image

    This will bring you to the Generate and Publish Scripts wizard.  Click Next twice (selecting the default of the whole database) then click Advanced on the “Set Scripting Options” page.

    Scroll down to “Script for the database engine type” and select “SQL Azure Database”.

    image

    You also have an option to choose whether to script the schema only, the schema and data or the data only.  For the purposes of this walkthrough, you can choose the schema only or the schema and data.

    clip_image002

Finish the wizard, saving the script to a file, then open that file in SSMS.

Now we want to connect to SQL Azure.  Do so by clicking the Connect button in Object Explorer and selecting “Database Engine”. 

For the server name, enter the name you got from the SQL Azure portal, including the full domain name.  For example, in the SQL Azure portal the server name is listed as: zky996mdy7.database.windows.net

    image

    Correspondingly, enter this Server name in the Connect to Server dialog in SSMS:

    image

The login is the Administrator username and the password set up in the SQL Azure portal.  Note that @zky996mdy7 is appended to the username.

Click on Options >>, select the “Connection Properties” tab and enter NerdDinnerDB for “Connect to database”.

    image

    This puts the SQL Azure database in the Object Explorer in SSMS.

    Right click on the SQL Azure database and select “New Query”.  This will open a SQL Query window.

    image

Copy and paste the database script into the SQL Query window and hit Execute.  In the bottom status area of SSMS you should see that the query executed successfully and that it ran against your SQL Azure NerdDinnerDB database.

    image

Now we need to set up the ASP.NET providers.  This requires using provider scripts that we created for SQL Azure.  See this post for the scripts and more info.

Download the scripts and extract them.

    Open the InstallCommon.SQL script in SSMS.  Since we were last connected to the NerdDinnerDB and we now want to connect to the aspprovidersDB you created above in the SQL Azure portal, right click in the query window and select Connection | Disconnect. 

    image

    Follow that by right clicking in the SQL Query window and selecting Connection | Connect, entering the aspprovidersDB as the database to connect to in the options.

    image

Run the script, then open InstallMembership.SQL and InstallProfile.SQL and run those scripts as well.  Just be sure to always run these scripts against aspprovidersdb.

    Now we need to change the connection strings in the connectionstrings.config file we added to the NerdDinner project.  There is a connection string for the NerdDinner database and a connection string for the ASP Providers.

    Here’s an example to follow, replacing the server, user ID and Password appropriately. 

    <connectionStrings>
      <add name="ApplicationServices" connectionString="Server=tcp:zky996mdy7.database.windows.net;Database=aspprovidersdb;User ID=jnak;Password=<enter>;Trusted_Connection=False;" providerName="System.Data.SqlClient"/>
      <add name="NerdDinnerConnectionString" connectionString="Server=tcp:zky996mdy7.database.windows.net;Database=NerdDinnerDB;User ID=jnak;Password=<enter>;Trusted_Connection=False;" providerName="System.Data.SqlClient"/>
    </connectionStrings>

    Delete the database files in App_Data and hit F5 to run this in the Development Fabric. 

    A couple things to test to ensure that both the providers and the nerddinner database are working correctly: Register for an account, host a dinner and view all upcoming dinners.

    Now deploy this to the cloud -- Please see the Deploying a Cloud Service walkthrough if you need help.

    What’s cool to consider here is we started with a single instance ASP.NET web application and turned it into a scalable cloud application that has the capability to be load balanced across any number of instances.

As I look back at this walkthrough, it’s actually quite long, but over half of it is migrating the data to SQL Azure and there were a lot of screen shots.  The key things I want you to get out of this post are:

1) Add existing ASP.NET Web Applications to the same solution as a Cloud Service project and use Roles | Add | Web Role Project in solution…

    2) Understand the 4 differences between a Windows Azure Web Role project and a standard Web Application project

3) Figure out how you are going to handle your data; you have a few options, notably Windows Azure Storage and SQL Azure. Don’t forget about your providers.

    4) Walkthrough of migrating an existing database to SQL Azure.

Finally, whenever you are deploying to the cloud, it is useful to look into the service package and make sure that everything you need is in it.  See this post for more info – remember Windows Azure VMs only include the .NET Framework 3.5 SP1 redistributable.

  • Cloudy in Seattle

    ASP.Net MVC on Windows Azure with Providers

    • 19 Comments

    [For more recent information on using ASP.NET MVC with Windows Azure please see this post.]

    Before you get started with ASP.Net MVC and Windows Azure – please install this hotfix.

    Were you wondering why the sample project I got from Phil attached to my post, ASP.Net MVC on Windows Azure, didn't include the sample Windows Azure ASP.Net providers?

    The answer is that we wanted to get something out early that would unblock folks from trying out the MVC on Windows Azure scenario.  That sample solution accomplished that.

    We were also working out a problem we were seeing when using those providers with MVC.  The problem was that we would get a timeout after logging in or creating a new account when running on Windows Azure -- a problem you won't see when running locally against the Development Fabric.

    Luckily, Phil pointed me to a workaround which I've now verified solves this problem. (the fix is now in MVC).

I tend to get pinged from time to time about issues with RequestURL so I’ll give a pointer to a post on David’s blog that talks about that.

    This post will cover augmenting the existing "MVC on Windows Azure sample" with membership, role and session state providers as well as the workaround that ensures the sample works when you run on Windows Azure.

    It is again important to note that using ASP.Net MVC on Windows Azure is still a preview.

The sample code that goes along with this post is on the MSDN Code Gallery MVCCloudService and does not include the sample projects from the Windows Azure SDK; please follow the instructions below for adding and referencing them. (When you first load the project it will complain that the sample projects are missing.)

    Starting with the sample project attached to my last post, the first thing to do is to add the AspProviders and the StorageClient projects found in the Windows Azure SDK samples to the solution.

    These sample projects are found in the SDK install folder (C:\Program Files\Windows Azure SDK\v1.0 by default), where you'll see a zip file (samples.zip).  Copy this file to a writeable (i.e. not in Program Files) location and unzip it.

    Right click on the Solution node in Solution Explorer -> Add -> Existing Project:

    image

    Navigate to the directory where you unzipped the Windows Azure SDK samples, AspProviders\Lib and select AspProviders.csproj:

    image

    Do the same to add the StorageClient.csproj project from StorageClient\Lib:

    image

Then add project references to both of those projects by right clicking on the References node in the CloudService1_WebRole project and selecting "Add Reference...", choosing the Projects tab, selecting both the AspProviders and StorageClient projects and hitting "OK".

    image

Let's now add the Service Definition and Service Configuration settings and values needed to access the Windows Azure Storage Services by adding the following code to the Service Definition and Service Configuration files found in the Cloud Service project.

    The settings below are setup for the local Development Storage case.

    Note: When switching over to the *.core.windows.net endpoints, you'll have to use the https addresses (i.e. https://blob.core.windows.net) otherwise the providers will throw an exception.  Alternatively you could set allowInsecureRemoteEndpoints to true -- however that is not recommended.

    The definitions in ServiceDefinition.csdef:

    <ConfigurationSettings>
        <Setting name="AccountName"/>
        <Setting name="AccountSharedKey"/>
        <Setting name="BlobStorageEndpoint"/>
        <Setting name="QueueStorageEndpoint"/>
        <Setting name="TableStorageEndpoint"/>
        <Setting name="allowInsecureRemoteEndpoints"/>
    </ConfigurationSettings>

    The Values in ServiceConfiguration.cscfg:

    <Setting name="AccountName" value="devstoreaccount1"/>
    <Setting name="AccountSharedKey" value="Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="/>
    <Setting name="BlobStorageEndpoint" value="http://127.0.0.1:10000"/>
    <Setting name="QueueStorageEndpoint" value = "http://127.0.0.1:10001"/>
    <Setting name="TableStorageEndpoint" value="http://127.0.0.1:10002"/>
    <Setting name="allowInsecureRemoteEndpoints" value=""/>

The providers also have settings to specify the names of the tables for storing membership, role and session related data.  Note that, as the comment indicates, the values below are the only values that will work in the Development Storage case. 

    In the MVCWebrole web.config, change <appSettings/> to the following:

    <appSettings>
        <!-- provider configuration -->
        <!-- When using the local development table storage service only the default values given
         below will work for the tables (Membership, Roles and Sessions) since these are the names
         of the properties on the DataServiceContext class -->
        <add key = "DefaultMembershipTableName" value="Membership"/>
        <add key = "DefaultRoleTableName" value="Roles"/>
        <add key = "DefaultSessionTableName" value="Sessions"/>
        <add key = "DefaultProviderApplicationName" value="MvcCloudService"/>
        <add key = "DefaultSessionContainerName"/>
    </appSettings>

    The following connection string for SQL can be removed:

    <add name="ApplicationServices" connectionString="data source=.\SQLEXPRESS;Integrated Security=SSPI;AttachDBFilename=|DataDirectory|aspnetdb.mdf;User Instance=true" providerName="System.Data.SqlClient"/>
    

    We'll now add the providers, via the MVCWebRole web.config file (remove the existing SQL Server ones).  First the membership provider:

        <membership defaultProvider="TableStorageMembershipProvider" userIsOnlineTimeWindow = "20">
          <providers>
            <clear/>
    
            <add name="TableStorageMembershipProvider"
                 type="Microsoft.Samples.ServiceHosting.AspProviders.TableStorageMembershipProvider"
                 description="Membership provider using table storage"
                 applicationName="MvcCloudService"
                 enablePasswordRetrieval="false"
                 enablePasswordReset="true"
                 requiresQuestionAndAnswer="false"
                 minRequiredPasswordLength="1"
                 minRequiredNonalphanumericCharacters="0"
                 requiresUniqueEmail="true"
                 passwordFormat="Hashed"
                    />
    
          </providers>
        </membership>
    

    Role Manager provider:

        <roleManager enabled="true" defaultProvider="TableStorageRoleProvider" cacheRolesInCookie="true" cookieName=".ASPXROLES" cookieTimeout="30"
                     cookiePath="/" cookieRequireSSL="false" cookieSlidingExpiration = "true"
                     cookieProtection="All" >
          <providers>
            <clear/>
            <add name="TableStorageRoleProvider"
                 type="Microsoft.Samples.ServiceHosting.AspProviders.TableStorageRoleProvider"
                 description="Role provider using table storage"
                 applicationName="MvcCloudService"
                    />
          </providers>
        </roleManager>
    

    and the session state provider:

        <sessionState mode="Custom" customProvider="TableStorageSessionStateProvider">
          <providers>
            <clear />
            <add name="TableStorageSessionStateProvider"
                 type="Microsoft.Samples.ServiceHosting.AspProviders.TableStorageSessionStateProvider"
                 applicationName="MvcCloudService"
                 />
          </providers>
        </sessionState>
    

    You can consult the AspProvidersDemo project in the Windows Azure SDK samples as well. 

    Note: there is also a profile provider which would be added in the same manner.
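Once the providers are wired up, your application code doesn't change – the standard ASP.NET membership and role APIs simply route to table storage. As a hypothetical sanity check (the user name, password and role below are made up):

using System.Web.Security;

public static class ProviderSmokeTest
{
    public static void Run()
    {
        // These standard static APIs now go through the TableStorage*
        // providers configured above and persist to table storage.
        MembershipUser user = Membership.CreateUser("alice", "pa$$word1", "alice@example.com");
        Roles.CreateRole("Administrators");
        Roles.AddUserToRole(user.UserName, "Administrators");
    }
}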

    Before running, right click on the Cloud Service project node in Solution Explorer and select "Create Test Storage Tables".  (For more information on creating test storage tables, see this article)

    image

    And there you have it – ASP.Net MVC RC2 running on Windows Azure.

  • Cloudy in Seattle

    ASP.Net MVC Projects running on Windows Azure

    • 28 Comments

    [For more recent information on using ASP.NET MVC with Windows Azure please see this post.]

    Before you get started with ASP.Net MVC and Windows Azure – please install this hotfix.

    Strictly speaking, ASP.Net MVC projects are not supported on Windows Azure.  That is to say that we haven't spent the time to fully test all of the MVC scenarios when running on Windows Azure. 

    That said, for the most part, they do work, just as long as you know what tweaks you need to make in order to get up and running.

    I've attached a sample application that Phil and Eilon on the MVC team put together to help make it easier for you to get started.

    I’ll walk through the changes:

    1. Start by creating a Blank Cloud Service.  File –> New Project –> Visual C# –> Cloud Service –> Blank Cloud Service.  I call it MVCCloudService

    image

    2. Right click on the solution node in Solution Explorer and select “Add New Project”

    image

    3. In the Add New Project dialog, navigate to the Web node under Visual C# or Visual Basic and select the ASP.Net MVC Application (I call it MVCWebRole). 

    image

    4. In Solution Explorer, right click on MVCWebRole and select “Unload Project”

    image

    5. Right click again on MVCWebRole and select “Edit MVCWebRole.csproj”

    image

    6. Add <RoleType>Web</RoleType> to the top PropertyGroup in the project file.

    <Project ToolsVersion="3.5" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
      <PropertyGroup>
    {. . .}
        <RoleType>Web</RoleType>

7. Save the project file, then reload the project.

    image 

    8. If you want to use the Windows Azure runtime library, add a reference to Microsoft.ServiceHosting.ServiceRuntime.dll by right clicking on the References node and selecting “Add reference…”.  Scroll down in the .Net Tab and you’ll find it.

    image
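As a hypothetical example of what that reference enables (this sketch uses the CTP-era RoleManager API from Microsoft.ServiceHosting.ServiceRuntime; treat the helper and message as my own invention):

using Microsoft.ServiceHosting.ServiceRuntime;

public static class AzureLog
{
    public static void Info(string message)
    {
        // Write to the Windows Azure log; visible in the Development
        // Fabric UI locally and in the service logs in the cloud.
        RoleManager.WriteToLog("Information", message);
    }
}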

    9. Right click on the Roles node in the MVCCloudService project and select Add –> Web Role Project in solution…

    image

    Select the MVCWebRole project.

    image

    10. Set Copy Local = true on the MVC DLLs:

    • System.Web.Abstractions
    • System.Web.Mvc
    • System.Web.Routing

    These assemblies are not available when running on Windows Azure and need to be deployed with the Service Package.

Expand the References node in the MVCWebRole project, right click on System.Web.Abstractions and select Properties.  Change “Copy Local” to true.  Repeat for the other DLLs.

    image

    With these changes, the edit, build, debug, publish and deploy functionality will all work in Visual Studio with the Windows Azure Tools for Microsoft Visual Studio installed.

    That said, it is still "use at your own risk".

Note: The sample has not been modified to use the Windows Azure ASP.Net providers (for example, the membership provider), stay tuned.

    The sample project is attached to this blog post.

  • Cloudy in Seattle

    Using IntelliTrace to debug Windows Azure Cloud Services

    • 9 Comments

    One of the cool new features of the June 2010 Windows Azure Tools + SDK is the integration of IntelliTrace to allow you to debug issues that occur in the cloud.

    IntelliTrace support requires .NET 4, Visual Studio 2010 Ultimate and the cloud service has to be deployed with IntelliTrace enabled. If you are using a 32-Bit OS, you need this patch/QFE.

    To enable IntelliTrace, right click on the cloud service project and select “Publish”.

    image

    At the bottom of our publish dialog, click to select “Enable IntelliTrace for .NET 4 roles”.

    image

You can also configure the IntelliTrace settings for the cloud (these are separate from the settings in Tools | Options, which are used for the debug (F5) scenario that we currently do not support with Cloud Services/Development Fabric).

    A couple of notes about IntelliTrace settings. 

We default to high mode, which is different from the F5 IntelliTrace settings in Visual Studio.  The reason is that F5 IntelliTrace includes both debugger and IntelliTrace data, while in the cloud you are only able to get back IntelliTrace data.

Additionally, we exclude Microsoft.WindowsAzure.StorageClient.dll as we found that the slowdown caused by IntelliTrace instrumentation resulted in timeouts to storage.  You may find you will want to remove the storage client assembly from the exclusion list.

To reset the IntelliTrace settings back to the default, you can delete “collectionplan.xml” from %AppData%\Microsoft\VisualStudio\10.0\Cloud Tools

    Click “OK” to package up everything you need to IntelliTrace the web and worker host processes in the cloud and start the deployment process.

    Note: There is a current limitation that child processes cannot be IntelliTrace debugged.

The deployment process is completely asynchronous, so you can continue to work while you wait for the deployment to complete, and you can track the progress through the Windows Azure Activity Log tool window.

    image

     

    After the deployment has completed, open up the Windows Azure Compute node in Server Explorer to browse hosted services deployed to Windows Azure.

    You can add a hosted service by right clicking on the Windows Azure Compute node and selecting “Add Slot…”

    image

    This will bring up a dialog you can use to choose a slot or add/manage your credentials.

    image

    The Server Explorer will show you which Hosted Services have IntelliTrace enabled.  They are the ones that have “(IntelliTrace)” beside the slot name.

    image

Expand the nodes and navigate to an instance; you can get the IntelliTrace logs for that instance by right clicking on the instance node and selecting “View IntelliTrace Logs”.

    image

     

    Note: When a role process exits, it automatically gets restarted by the fabric, which causes the cycling role state behavior that some of you are familiar with.  When IntelliTrace is enabled, when a role process exits, it is not restarted and it is put into the “Unresponsive” state instead.  This allows you to get the IntelliTrace logs for the failure.

Similar to how you can track the progress of deployment from the Windows Azure Activity Log, you can also track the download of the IntelliTrace logs asynchronously.

    image

     

    You’ll then see the IntelliTrace files open in Visual Studio. 

    image

You can now browse the Exception Data on the summary page, or put Visual Studio into debug mode by clicking an exception and clicking the “Start Debugging” button or by double clicking on one of the threads in the thread list.

Being in debug mode will bring up the IntelliTrace tool window which will show you all of the IntelliTrace events.  You can filter between different categories and “Switch to Calls View”, which will show you the call stack, letting you drill in and out of various methods.

    You can also open up your source code and right click on a line and select “Search for this line in IntelliTrace”.

    image

When the search is complete, you can click on the navigation buttons at the top of the file to select one of the instances in which this code was called, and use the historical debugging buttons on the left to debug forward and backward through the code, looking at the call stack and locals as you step through the control flow.

    image

    Debugging Common Issues Using IntelliTrace

     

    Missing an Assembly in the Service Package:

    This is by far one of the most common "works on the devfabric, fails in the cloud" issues. View the IntelliTrace log and look for FileNotFoundExceptions in the exception list or IntelliTrace events.

    clip_image001

    In the IntelliTrace events window:

    clip_image002

    Using an incorrect Windows Azure storage connection string:

    This one is a little tougher as there isn’t a top level exception you can look at.  Search for the methods in IntelliTrace where you use connection strings and see the input and return values.  For example the CloudStorageAccount and DiagnosticMonitor calls.

    clip_image004

    Missing a CRT library in the Cloud:

For the scenario where you are calling into a native dll but did not xcopy deploy the CRT along with your Service Package, an exception will surface in the IntelliTrace summary naming the native dll that could not be loaded.

    clip_image005

    clip_image006

    Using a 32 Bit Native Library in the Cloud:

This issue is very similar to the missing CRT example above; in this scenario you’ve been successfully developing on a 32 bit machine but get a failure in the cloud when the 32 bit dll is loaded in a 64 bit process.

    With IntelliTrace, an exception showing which native library failed to load is surfaced in the IntelliTrace summary screen.

    clip_image007

In the case where the loading of the assembly is triggered by a method call that is outside of the startup code, you can double click the exception to get to the line of your code that made the call into native code that loaded up the dll.

    clip_image008

    Using code that requires admin access:

    If you are running into this issue, you should be testing against the Development Fabric before deploying to the cloud.  That said, our support data shows that this is one of the issues you run into.

    I tried to do a registry write to HKLM, which fails in the following way:
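Here's a hypothetical repro of that registry write (the subkey and value names are made up):

using Microsoft.Win32;

public static class AdminOnlyCode
{
    public static void WriteSetting()
    {
        // The role process does not run with admin rights, so this write to
        // HKEY_LOCAL_MACHINE throws (e.g. UnauthorizedAccessException) in the cloud.
        using (RegistryKey key = Registry.LocalMachine.CreateSubKey(@"SOFTWARE\MyCloudApp"))
        {
            key.SetValue("LastRun", "now");
        }
    }
}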

    An exception is shown in the IntelliTrace summary and when double clicked, will navigate to the line of code that is causing the exception.

    clip_image009

    clip_image010

    Using an ASP.NET provider with the default SQL Server connection string in the cloud:

In this scenario, you are using the ASP.NET providers; the default MVC and ASP.NET Web Application templates both use these. In the devfabric they work fine, as they use SQL Express under the hood by default. When you deploy to the cloud, they no longer work. (An exception web page is shown after a wait.)

    In opening the IntelliTrace summary, you will see the exception "Unable to connect to SQL Server database" with a stack trace that points to one of the providers, in my example, it was the SqlMembershipProvider.

    clip_image011

    clip_image012

Using a Diagnostics connection string that uses HTTP endpoints:

In this scenario, you’ve deployed to the cloud but forgot to change your Windows Azure storage connection strings. If you incorrectly selected HTTP endpoints for the storage account and didn't try running your app with the new connection strings on the devfabric before deploying, you can run into this problem.

    When opening the IntelliTrace log, you will see an exception in the summary indicating that the endpoint is not a secure connection.

    clip_image013

To sum up, I’m really excited about this feature.  I hope it will really help a lot of people see into the cloud and diagnose issues, saving both time and frustration.

  • Cloudy in Seattle

    Windows Azure Walkthrough: Table Storage

    • 19 Comments

    Please see the updated post for November 2009 and later

    This walkthrough covers what I found to be the simplest way to get a sample up and running on Windows Azure that uses the Table Storage Service. It is not trying to be comprehensive or trying to dive deep in the technology, it just serves as an introduction to how the Table Storage Service works.

    Please take the Quick Lap Around the Tools before doing this walkthrough.

    Note: The code for this walkthrough is attached, you will still have to add and reference the Common and StorageClient projects from the Windows Azure SDK.

    After you have completed this walkthrough, you will have a Web Role that is a simple ASP.Net Web Application that shows a list of Contacts and allows you to add to and delete from that list. Each contact will have simplified information: just a name and an address (both strings).

    image

    Table Storage Concepts

The Windows Azure Table Storage Service provides queryable structured storage. Each account can have 0..n tables.

    image

    Design of the Sample

    When a request comes in to the UI, it makes its way to the Table Storage Service as follows (click for larger size):

    clip_image002

The UI class (the aspx page and its code behind) is data bound through an ObjectDataSource to the SimpleTableSample_WebRole.ContactDataSource, which creates the connection to the Table Storage service, gets the list of Contacts, and inserts to and deletes from Table Storage.

    The SimpleTableSample_WebRole.ContactDataModel class acts as the data model object and the SimpleTableSample_WebRole.ContactDataServiceContext derives from TableStorageDataServiceContext which handles the authentication process and allows you to write LINQ queries, insert, delete and save changes to the Table Storage service.

    Creating the Cloud Service Project

    1. Start Visual Studio as an administrator

2. Create a new project: File | New | Project

    3. Select “Web Cloud Service”. This will create the Cloud Service Project and an ASP.Net Web Role. Call it “SimpleTableSample”

    image

    4. Find the installation location of the Windows Azure SDK. By default this will be: C:\Program Files\Windows Azure SDK\v1.0

    a. Find the file named “samples.zip”

    b. Unzip this to a writeable location

5. From the samples you just unzipped, add the StorageClient\Lib\StorageClient.csproj and HelloFabric\Common\Common.csproj to your solution by right-clicking on the solution in Solution Explorer and selecting Add | Existing Project.

    image

a. Common and StorageClient are libraries that are currently distributed as samples and that provide functionality to help you build Cloud Applications. Common adds access to settings and logging while StorageClient provides helpers for using the storage services.

    6. From your Web Role, add references to the Common and StorageClient projects you just added along with a reference to System.Data.Services.Client

    image

    image

    7. Add a ContactDataModel class to your Web Role that derives from TableStorageEntity. For simplicity, we’ll just assign a new Guid as the PartitionKey to ensure uniqueness. This default of assigning the PartitionKey and setting the RowKey to a hard coded value (String.Empty) gives the storage system the freedom to distribute the data.

    using Microsoft.Samples.ServiceHosting.StorageClient;
    public class ContactDataModel : TableStorageEntity
    {
        public ContactDataModel(string partitionKey, string rowKey)
            : base(partitionKey, rowKey)
        {
        }
    
        public ContactDataModel()
            : base()
        {
            PartitionKey = Guid.NewGuid().ToString();
            RowKey = String.Empty;
        }
    
        public string Name
        {
            get;
            set;
        }
    
        public string Address
        {
            get;
            set;
        }
    }

    8. Now add the ContactDataServiceContext to the Web Role that derives from TableStorageDataServiceContext.

    a. We’ll use this later to write queries, insert, remove and save changes to the table storage.

    using Microsoft.Samples.ServiceHosting.StorageClient;

    internal class ContactDataServiceContext : TableStorageDataServiceContext
    {
        internal ContactDataServiceContext(StorageAccountInfo accountInfo)
            : base(accountInfo)
        {
        }
    
        internal const string ContactTableName = "ContactTable";
    
        public IQueryable<ContactDataModel> ContactTable
        {
            get
            {
                return this.CreateQuery<ContactDataModel>(ContactTableName);
            }
        }
    }

9. Next add a ContactDataSource class. We'll fill this class out over the course of the next few steps.  This is the class that does all the hookup between the UI and the table storage service. Starting with the first part of the constructor, a StorageAccountInfo class is instantiated in order to get the settings required to make a connection to the Table Storage Service. (Note that this is just the first part of the constructor code; the rest is in step 12.)

    using Microsoft.Samples.ServiceHosting.StorageClient;
    using System.Data.Services.Client;
    public class ContactDataSource
    {
        private ContactDataServiceContext _ServiceContext = null;
    
        public ContactDataSource()
        {
            // Get the settings from the Service Configuration file
        StorageAccountInfo account =
            StorageAccountInfo.GetDefaultTableStorageAccountFromConfiguration();

    10. In order for the StorageAccountInfo class to find the configuration settings, open up ServiceDefinition.csdef and add the following to <WebRole/>. These define the settings.

        <ConfigurationSettings>
          <Setting name="AccountName"/>
          <Setting name="AccountSharedKey"/>
          <Setting name="TableStorageEndpoint"/>
        </ConfigurationSettings>  
    

    11. Likewise, add the actual local development values to the ServiceConfiguration.cscfg file. Note that the settings between both files have to match exactly otherwise your Cloud Service will not run.

• When you run in the Cloud, the AccountName and AccountSharedKey will be set to the values you get back from the Portal for your account. The TableStorageEndpoint will be set to the URL for the Table Storage Service: http://table.core.windows.net
    • Because these are set in the ServiceConfiguration.cscfg file, these values can be updated even after deploying to the cloud by uploading a new Service Configuration.
    • For the local development case, the local host and port 10002 (by default) will be used as the Table Storage Endpoint. The AccountName and AccountSharedKey are hard coded to a value that the Development Storage service is looking for (it’s the same for all 3, Table, Blob and Queue services).
    <ConfigurationSettings>
      <Setting name="AccountName" value="devstoreaccount1"/>
      <Setting name="AccountSharedKey" value="Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="/>
      <Setting name="TableStorageEndpoint" value="http://127.0.0.1:10002/"/>
    </ConfigurationSettings>

12. Next, continue to fill out the constructor (just after the call to GetDefaultTableStorageAccountFromConfiguration()) by instantiating the ContactDataServiceContext. Set the RetryPolicy, which applies only to the methods on the DataServiceContext (i.e. SaveChanges(), not the queries).

        // Create the service context we'll query against
        _ServiceContext = new ContactDataServiceContext(account);
        _ServiceContext.RetryPolicy = RetryPolicies.RetryN(3, TimeSpan.FromSeconds(1));
    }

    13. We need some code to ensure that the tables we rely on get created.  We'll do this on first request to the web site -- which can be done by adding code to one of the handlers in the global application class.  Add a global application class by right clicking on the web role and selecting Add -> New Item -> Global Application Class. (see this post for more information)

    image

    14. Add the following code to global.asax.cs to create the tables on first request:

    using Microsoft.Samples.ServiceHosting.StorageClient;
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;
        HttpContext context = app.Context;
    
    // Attempt to perform first request initialization
        FirstRequestInitialization.Initialize(context);
    
    }

    And the implementation of the FirstRequestInitialization class:

    internal class FirstRequestInitialization
    {
        private static bool s_InitializedAlready = false;
        private static Object s_lock = new Object();
    
    
        // Initialize only on the first request
        public static void Initialize(HttpContext context)
        {
            if (s_InitializedAlready)
            {
                return;
            }
    
            lock (s_lock)
            {
                if (s_InitializedAlready)
                {
                    return;
                }
    
                ApplicationStartUponFirstRequest(context);
                s_InitializedAlready = true;
            }
        }
    
        private static void ApplicationStartUponFirstRequest(HttpContext context)
        {
            // This is where you put initialization logic for the site.
            // RoleManager is properly initialized at this point.
    
            // Create the tables on first request initialization as there is a performance impact
            // if you call CreateTablesFromModel() when the tables already exist. This limits the exposure of
            // creating tables multiple times.
    
            // Get the settings from the Service Configuration file
            StorageAccountInfo account = StorageAccountInfo.GetDefaultTableStorageAccountFromConfiguration();
    
            // Create the tables
            // In this case, just a single table.  
            // This will create tables for all public properties that are IQueryable (collections)
            TableStorage.CreateTablesFromModel(typeof(ContactDataServiceContext), account);
        }
    }

15. When running in the real cloud, this code is all that is needed to create the tables for your Cloud Service. The TableStorage class reflects over the ContactDataServiceContext class and creates a table for each IQueryable<T> property, where the columns of that table are based on the properties of the type T of the IQueryable<T>.

    a. There is a bit more to do in order to get this to work in the local Development Storage case, more on that later.

    16. At this point, it’s just a matter of filling out the ContactDataSource class with methods to query for the data, insert and delete rows. This is done through LINQ and using the ContactDataServiceContext.

    a. Note: in the Select() method, the TableStorageDataServiceQuery<T> class enables you to have finer grained control over how you get the data.

    i. Execute() or ExecuteWithRetries() will access the data store and return up to the first 1000 elements.

    ii. ExecuteAll() or ExecuteAllWithRetries() will return all of the elements with continuation as you enumerate over the data.

    iii. ExecuteWithRetries() and ExecuteAllWithRetries() uses the retry policy set on the ContactDataServiceContext for the queries.

    b. Note: the use of AttachTo() in the Delete() method to connect to and remove the row.

    public IEnumerable<ContactDataModel> Select()
    {
        var results = from c in _ServiceContext.ContactTable
                      select c;
    
    TableStorageDataServiceQuery<ContactDataModel> query =
        new TableStorageDataServiceQuery<ContactDataModel>(results as DataServiceQuery<ContactDataModel>);
    IEnumerable<ContactDataModel> queryResults = query.ExecuteAllWithRetries();
    return queryResults;
}

public void Delete(ContactDataModel itemToDelete)
{
    _ServiceContext.AttachTo(ContactDataServiceContext.ContactTableName, itemToDelete, "*");
    _ServiceContext.DeleteObject(itemToDelete);
    _ServiceContext.SaveChanges();
}

public void Insert(ContactDataModel newItem)
{
    _ServiceContext.AddObject(ContactDataServiceContext.ContactTableName, newItem);
    _ServiceContext.SaveChanges();
}

17. The UI is defined in the aspx page and consists of 3 parts: the GridView which will display all of the rows of data, the FormView which allows the user to add rows, and the ObjectDataSource which databinds the UI to the ContactDataSource.

18. The GridView is placed after the first <div>. Note that in this sample, we’ll just auto-generate the columns and show the delete button. The DataSourceId is set to the ObjectDataSource, which will be covered below.

        <asp:GridView
            id="contactsView"
            DataSourceId="contactData"
            DataKeyNames="PartitionKey"
            AllowPaging="False"
            AutoGenerateColumns="True"
            GridLines="Vertical"
            Runat="server" 
            BackColor="White" ForeColor="Black"
            BorderColor="#DEDFDE" BorderStyle="None" BorderWidth="1px" CellPadding="4">
            <Columns>
                <asp:CommandField ShowDeleteButton="true"  />
            </Columns>
            <RowStyle BackColor="#F7F7DE" />
            <FooterStyle BackColor="#CCCC99" />
            <PagerStyle BackColor="#F7F7DE" ForeColor="Black" HorizontalAlign="Right" />
            <SelectedRowStyle BackColor="#CE5D5A" Font-Bold="True" ForeColor="White" />
            <HeaderStyle BackColor="#6B696B" Font-Bold="True" ForeColor="White" />
            <AlternatingRowStyle BackColor="White" />
        </asp:GridView>    

19. The FormView to add rows is really simple: just labels and text boxes with a button at the end to raise the “Insert” command. Note that the DataSourceID is again set to the ObjectDataSource and there are bindings to the Name and Address.

        <br />        
        <asp:FormView
            id="frmAdd"
            DataSourceId="contactData"
            DefaultMode="Insert"
            Runat="server">
            <InsertItemTemplate>
                <asp:Label
                        id="nameLabel"
                        Text="Name:"
                        AssociatedControlID="nameBox"
                        Runat="server" />
                <asp:TextBox
                        id="nameBox"
                        Text='<%# Bind("Name") %>'
                        Runat="server" />
                <br />
                <asp:Label
                        id="addressLabel"
                        Text="Address:"
                        AssociatedControlID="addressBox"
                        Runat="server" />
                <asp:TextBox
                        id="addressBox"
                        Text='<%# Bind("Address") %>'
                        Runat="server" />
                <br />
                <asp:Button
                        id="insertButton"
                        Text="Add"
                        CommandName="Insert"
                        Runat="server"/>
            </InsertItemTemplate>
        </asp:FormView>
    

    20. The final part of the aspx is the definition of the ObjectDataSource. See how it ties the ContactDataSource and the ContactDataModel together with the GridView and FormView.

        <%-- Data Sources --%>
    <asp:ObjectDataSource runat="server" ID="contactData"
        TypeName="SimpleTableSample_WebRole.ContactDataSource"
        DataObjectTypeName="SimpleTableSample_WebRole.ContactDataModel"
        SelectMethod="Select" DeleteMethod="Delete" InsertMethod="Insert">
    </asp:ObjectDataSource>

    21. Build. You should not have any compilation errors, all 4 projects in the solution should build successfully.

    22. Create Test Storage Tables. As mentioned in step 15, creating tables in the Cloud is all done programmatically, however there is an additional step that is needed in the local Development Storage case.

    a. In the local development case, tables need to be created in the SQL Express database that the local Development Storage uses for its storage. These need to correspond exactly to the runtime code. This is due to a current limitation in the local Development Storage.

b. Right click on the Cloud Service node in Solution Explorer and select “Create Test Storage Tables”.  This runs a tool that uses reflection to create the tables you need in a SQL Express database whose name corresponds to your Solution name.

    image

    i. ContactDataServiceContext is the type that the tool will look for and use to create those tables on your behalf.

    ii. Each IQueryable<T> property on ContactDataServiceContext will have a table created for it where the columns in that table will correspond to the public properties of the type T of the IQueryable<T>.

23. F5 to debug. You will see the app running in the Development Fabric using the Table Development Storage.

    image

Please see the Deploying a Cloud Service walkthrough to learn how to modify the configuration of this Cloud Service to make it run on Windows Azure.

  • Cloudy in Seattle

    The Windows Azure CGI Web Role Template Explained

    • 0 Comments

    Many of you may not even know this, but as part of the Windows Azure Tools for Microsoft Visual Studio, we ship a Role template called “CGI Web Role”.

    Today, it’s a little hard to find (yes, I’m foreshadowing that this is about to change!) as you have to create a Cloud Service project first, then add/replace the Web Role with a CGI Web Role.

    For example you could create a Blank Cloud Service project:

    image

In Solution Explorer, right click on the Roles node in the Cloud Service project and select Add | New Web Role Project…

    image 

    Select CGI Web Role:

    image

This adds to your Cloud Service an ASP.NET Web Application project tailored to be the configuration and file conduit for a FastCGI application:

    image

    What is the purpose of the CGI Web Role? 

    Even though Visual Studio doesn’t support languages like PHP, there are reasons for you to be interested in using Visual Studio when building a FastCGI application that runs on Windows Azure:

    • Configuration:
      • Includes the Web.roleconfig file which is used to specify the FastCGI application
      • Has a commented section in the web.config that describes how to add the FastCGI handler.
    • Running on the DevFabric
      • Once you have the project setup, you can hit F5 and have your application run on the DevFabric. Stop, edit and run it again with ease.
    • Packaging for deployment
      • Right click on the Cloud Service project in the Solution Explorer and select “Publish” – this will package your application for deployment.

    Not to mention that Visual Studio really has a first class source editor and a lot of other features you’ll be able to make use of.

    (Please see the Windows Azure SDK documentation and the FastCGI sample for more information on hosting a FastCGI application on Windows Azure.)

  • Cloudy in Seattle

    Walkthrough: Windows Azure Table Storage (Nov 2009 and later)

    • 11 Comments

This walkthrough covers what I found to be the simplest way to get a sample up and running on Windows Azure that uses the Table Storage Service. It serves as an introduction to both Windows Azure cloud services as well as using table storage.  Although there is a wealth of information out there on Windows Azure, I try to tie together a lot of that information for folks of all levels to consume.

    I originally wrote this walkthrough well over a year ago for our very first public release of Windows Azure at PDC ‘08.

Much has changed in the last year, and this post is an update to that original post that will work with our PDC ‘09 and v1.0 release of the Windows Azure Tools.

So what's changed from a dev perspective?  Overall not a lot, mainly because table storage leverages ADO.NET Data Services, and that is the core of how you work with Table Storage.  The way you connect to Windows Azure Storage has changed, namespaces and class names have changed, and there have been a few other tweaks.

    To be clear, this post is not trying to be comprehensive or trying to dive deep in the technology, it just serves as an introduction to how the Table Storage Service works.  Also, please take a look at the Quick Lap Around the Tools before doing this walkthrough.

    Note: The code for this walkthrough is attached to this blog post.

    After you have completed this walkthrough, you will have a Web Role that is a simple ASP.NET Web Application that shows a list of Contacts and allows you to add to and delete from that list. Each contact will have simplified information: just a name and an address (both strings).

    image

    Table Storage Concepts

The Windows Azure Table Storage Service provides queryable structured storage. Each account can have 0..n tables, i.e. an unlimited number of tables and entities, with no limit on table size. (The combined size of an entity cannot exceed 1MB, however.)

    image

Each entity in a table always has three properties: the PartitionKey, the RowKey and a Timestamp (not shown above for space/legibility reasons).  Together, the PartitionKey and RowKey form a unique key for an entity. 

Additionally, the PartitionKey and RowKey currently form the only index, and all results are returned sorted by PartitionKey and then by RowKey.

    Design of the Sample

    When a request comes in to the UI, it makes its way to the Table Storage Service as follows (click for larger size):

    clip_image002

The UI class (the aspx page and its code behind) is data bound through an ObjectDataSource to the WebRole1.ContactDataSource, which creates the connection to the Table Storage service, gets the list of Contacts, and inserts to and deletes from Table Storage.

    The WebRole1.ContactDataModel class acts as the data model object and the WebRole1.ContactDataServiceContext derives from TableServiceContext which handles the authentication process and allows you to write LINQ queries, insert, delete and save changes to the Table Storage service.

    Creating the Cloud Service Project

    1. Start Visual Studio as an administrator (Screen shots below are from VS 2008, VS 2010 is also supported)

    2. Create a new project: File | New Project

    3. Under the Visual C# node (VB is also supported), select the “Cloud Service” project type then select the “Windows Azure Cloud Service” project template. Set the name to be “SimpleTableSample”. Hit OK to continue.

    image

    This will bring up a dialog to add Roles to the Cloud Service.

    4. Add an ASP.NET Web Role to the Cloud Service, we’ll use the default name of “WebRole1”.  Hit OK.

    image

    Solution explorer should look as follows:

    image

5. We now want to set up the data model for the entity.  Right-click on WebRole1 in the Solution Explorer and select “Add Class”.  Call this class “ContactDataModel.cs” and hit OK.

6.  Add a using directive for the storage client library – a .NET library for using Windows Azure Storage.  The assembly reference was already added by Visual Studio.

    using Microsoft.WindowsAzure.StorageClient;

    7. Make the ContactDataModel class derive from the TableServiceEntity class. This brings in the PartitionKey, RowKey and Timestamp properties. (not necessary to derive from TableServiceEntity, but a convenience)

    8. For simplicity, we’ll just assign a new Guid as the PartitionKey to ensure uniqueness even though generally a GUID is not a good partition key.  If we were really building an address book, using a partition key that maps to a popular search field would be a good approach (contact name for example). 

In this case, since the PartitionKey is set to a new Guid and the RowKey is set to a constant hard coded value (String.Empty), the storage system distributes the data over many storage nodes, prioritizing scalability (spreading load over multiple servers) over the faster performance of operations on multiple entities in a single partition (entity locality).

    The key message here is that you’ll want to think about and make the right decision for your application/scenarios.  To learn more read the Programming Table Storage White Paper on windowsazure.com.
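For example, purely as an illustration (serviceContext here stands in for a ContactDataServiceContext instance like the one defined later in this walkthrough): if contact name were the PartitionKey, a lookup by name would be served from a single partition:

// Hypothetical: with Name as the PartitionKey, this query targets a
// single partition rather than spanning storage nodes.
var smiths = from c in serviceContext.ContactTable
             where c.PartitionKey == "Smith"
             select c;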

    public class ContactDataModel :TableServiceEntity
    {
        public ContactDataModel(string partitionKey, string rowKey)
            : base(partitionKey, rowKey)
        {
        }
    
        public ContactDataModel(): this(Guid.NewGuid().ToString(), String.Empty)
        {
        }
    
        public string Name { get; set; }
        public string Address { get; set; }
    }

    9. Now add the ContactDataServiceContext to the Web Role that derives from TableServiceContext.  Right click on WebRole1 and select Add | Class…  Name the class ContactDataServiceContext.cs.

    10.  Add the using directives.

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    11. Set the base class to be TableServiceContext.

    12. We’ll use the ContactDataServiceContext later to write queries, insert, remove and save changes to the table storage. One of the key things it does is provide the IQueryable<T> property that corresponds to a table.

    public class ContactDataServiceContext : TableServiceContext
    {
        public ContactDataServiceContext(string baseAddress, StorageCredentials credentials)
            : base(baseAddress, credentials)
        {
        }
    
        public const string ContactTableName = "ContactTable";
    
        public IQueryable<ContactDataModel> ContactTable
        {
            get
            {
                return this.CreateQuery<ContactDataModel>(ContactTableName);
            }
        }
    }

    Every IQueryable<T> property corresponds to a table in table storage.
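
    For instance, if the address book later needed a second table, adding another IQueryable<T> property would surface it – a sketch, where CompanyDataModel is a hypothetical entity class that is not part of this sample:

    // Hypothetical second table: each IQueryable<T> property maps to a table.
    public IQueryable<CompanyDataModel> CompanyTable
    {
        get
        {
            return this.CreateQuery<CompanyDataModel>("CompanyTable");
        }
    }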

    13. Let’s now add the ContactDataSource class; we’ll fill this class out over the course of the next few steps.  This is the class that does all the hookup between the UI and the table storage service.  Right click on WebRole1 | Add | Class… and enter the file name “ContactDataSource.cs”.

    14. Add a reference to the System.Data.Services.Client assembly.  Right click on the References node under WebRole1 and select Add Reference…

    image

    Then scroll down in the list and select System.Data.Services.Client and click OK.

    image

    15. Now add the using directives to the top of the ContactDataSource.cs file.

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;
    using System.Data.Services.Client;

    16. For simplicity, use the instantiation of the ContactDataSource class as the place to set up the connection to Windows Azure Storage.  This involves reading a connection string from the Windows Azure settings and creating the ContactDataServiceContext with that connection information.

    public class ContactDataSource
    {
        private ContactDataServiceContext _ServiceContext = null;
    
        public ContactDataSource()
        {
            var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
            _ServiceContext = new ContactDataServiceContext(storageAccount.TableEndpoint.ToString(), storageAccount.Credentials);
        }
    }

    17.  Set up the “DataConnectionString” setting by opening up the configuration UI for WebRole1.  Right click on the WebRole1 node under the Roles folder in the SimpleTableSample cloud service project and select “Properties”.

    image

    18. Switch to the Settings tab and click “Add Setting”.  Name it DataConnectionString, set the type to ConnectionString and click on the “…” button on the far right.

    image

    Hit “OK” to set the credentials to use Development Storage.  We’ll first get this sample working on development storage then convert it to use cloud storage later.
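
    For reference, this writes a setting like the following into ServiceConfiguration.cscfg – UseDevelopmentStorage=true is the shorthand connection string for the local development storage account:

    <Setting name="DataConnectionString" value="UseDevelopmentStorage=true" />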

    19. If you actually instantiated the ContactDataSource and ran this app, you would find that the CloudStorageAccount.FromConfigurationSetting() call would fail with the following message:

    ConfigurationSettingSubscriber needs to be set before FromConfigurationSetting can be used

    This message is in fact incorrect – I have a bug filed to get it fixed in the next release (after our November 2009 release) so that it says “Configuration Setting Publisher” and not “ConfigurationSettingSubscriber”.

    To resolve this, we need to add a bit of template code to the WebRole.cs file in WebRole1.  Add the following to the WebRole.OnStart() method.

    using Microsoft.WindowsAzure;

    #region Setup CloudStorageAccount Configuration Setting Publisher
    
    // This code sets up a handler to update CloudStorageAccount instances when their corresponding
    // configuration settings change in the service configuration file.
    CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
    {
        // Provide the configSetter with the initial value
        configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
    
        RoleEnvironment.Changed += (sender, arg) =>
        {
            if (arg.Changes.OfType<RoleEnvironmentConfigurationSettingChange>()
                .Any((change) => (change.ConfigurationSettingName == configName)))
            {
                // The corresponding configuration setting has changed, propagate the value
                if (!configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)))
                {
                    // In this case, the change to the storage account credentials in the
                    // service configuration is significant enough that the role needs to be
                    // recycled in order to use the latest settings. (for example, the 
                    // endpoint has changed)
                    RoleEnvironment.RequestRecycle();
                }
            }
        };
    });
    #endregion

    The comments (which I wrote and are included in the samples) should explain what is going on if you care to know.  In a nutshell, this code bridges the gap between the Microsoft.WindowsAzure.StorageClient assembly and the Microsoft.WindowsAzure.ServiceRuntime library – the Storage Client library is agnostic to the Windows Azure runtime, as it can be used in non-Windows Azure applications.

    This code essentially tells the Storage Client library how to get a setting value given a setting name, and sets up an event handler to handle setting changes while running in the cloud (for example, when the ServiceConfiguration.cscfg file is updated in the cloud).

    20. We need some code to ensure that the tables we rely on get created. Add the code to create the tables if they don't exist to the ContactDataSource constructor:

    public ContactDataSource()
    {
        var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
        _ServiceContext = new ContactDataServiceContext(storageAccount.TableEndpoint.ToString(), storageAccount.Credentials);
    
        // Create the tables
        // In this case, just a single table.  
        storageAccount.CreateCloudTableClient().CreateTableIfNotExist(ContactDataServiceContext.ContactTableName);
    }

    Note: For production code you’ll want to optimize the reading of the configuration settings and the call that creates the tables to improve performance – the focus of this post is to keep things simple.
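
    As a sketch of one such optimization – my illustration, not part of the sample – the table creation can be guarded so it only runs once per AppDomain instead of on every instantiation:

    public class ContactDataSource
    {
        // Hypothetical guard so CreateTableIfNotExist is only called once.
        private static bool _TablesCreated;
        private static readonly object _InitLock = new object();
    
        private ContactDataServiceContext _ServiceContext = null;
    
        public ContactDataSource()
        {
            var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
            _ServiceContext = new ContactDataServiceContext(storageAccount.TableEndpoint.ToString(), storageAccount.Credentials);
    
            if (!_TablesCreated)
            {
                lock (_InitLock)
                {
                    if (!_TablesCreated)
                    {
                        storageAccount.CreateCloudTableClient().CreateTableIfNotExist(ContactDataServiceContext.ContactTableName);
                        _TablesCreated = true;
                    }
                }
            }
        }
    }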

    21. Add the following code to the ContactDataSource.cs file:

    public IEnumerable<ContactDataModel> Select()
    {
        var results = from c in _ServiceContext.ContactTable
                      select c;
    
        var query = results.AsTableServiceQuery<ContactDataModel>();
        var queryResults = query.Execute();
    
        return queryResults;
    }
    
    public void Delete(ContactDataModel itemToDelete)
    {
        _ServiceContext.AttachTo(ContactDataServiceContext.ContactTableName, itemToDelete, "*");
        _ServiceContext.DeleteObject(itemToDelete);
        _ServiceContext.SaveChanges();
    }
    
    public void Insert(ContactDataModel newItem)
    {
        _ServiceContext.AddObject(ContactDataServiceContext.ContactTableName, newItem);
        _ServiceContext.SaveChanges();
    }

    Note: in the Select() method, the TableServiceQuery<T> class enables you to have finer-grained control over how you get the data.

    Note the use of AttachTo() in the Delete() method to connect to and remove the row.
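
    As a hedged sketch of that finer-grained control – the filter and retry values here are made up for illustration – you can attach a retry policy to the query, and Execute() then transparently follows continuation tokens:

    var query = (from c in _ServiceContext.ContactTable
                 where c.Name == "Alice"   // hypothetical filter
                 select c).AsTableServiceQuery<ContactDataModel>();
    
    // Retry up to 3 times, waiting 1 second between attempts.
    query.RetryPolicy = RetryPolicies.Retry(3, TimeSpan.FromSeconds(1));
    
    foreach (var contact in query.Execute())
    {
        // process each ContactDataModel
    }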

    22. The UI is defined in the aspx page and consists of 3 parts: the GridView which displays all of the rows of data, the FormView which allows the user to add rows, and the ObjectDataSource which databinds the UI to the ContactDataSource.

    23. The GridView is placed after the first <div>. Note that in this sample, we’ll just auto-generate the columns and show the delete button. The DataSourceId is set to the ObjectDataSource, which will be covered below.

        <asp:GridView
            id="contactsView"
            DataSourceId="contactData"
            DataKeyNames="PartitionKey"
            AllowPaging="False"
            AutoGenerateColumns="True"
            GridLines="Vertical"
            Runat="server" 
            BackColor="White" ForeColor="Black"
            BorderColor="#DEDFDE" BorderStyle="None" BorderWidth="1px" CellPadding="4">
            <Columns>
                <asp:CommandField ShowDeleteButton="true"  />
            </Columns>
            <RowStyle BackColor="#F7F7DE" />
            <FooterStyle BackColor="#CCCC99" />
            <PagerStyle BackColor="#F7F7DE" ForeColor="Black" HorizontalAlign="Right" />
            <SelectedRowStyle BackColor="#CE5D5A" Font-Bold="True" ForeColor="White" />
            <HeaderStyle BackColor="#6B696B" Font-Bold="True" ForeColor="White" />
            <AlternatingRowStyle BackColor="White" />
        </asp:GridView>    

    24. The FormView to add rows is really simple: just labels and text boxes with a button at the end to raise the “Insert” command. Note that the DataSourceID is again set to the ObjectDataSource and there are bindings to the Name and Address.

        <br />        
        <asp:FormView
            id="frmAdd"
            DataSourceId="contactData"
            DefaultMode="Insert"
            Runat="server">
            <InsertItemTemplate>
                <asp:Label
                        id="nameLabel"
                        Text="Name:"
                        AssociatedControlID="nameBox"
                        Runat="server" />
                <asp:TextBox
                        id="nameBox"
                        Text='<%# Bind("Name") %>'
                        Runat="server" />
                <br />
                <asp:Label
                        id="addressLabel"
                        Text="Address:"
                        AssociatedControlID="addressBox"
                        Runat="server" />
                <asp:TextBox
                        id="addressBox"
                        Text='<%# Bind("Address") %>'
                        Runat="server" />
                <br />
                <asp:Button
                        id="insertButton"
                        Text="Add"
                        CommandName="Insert"
                        Runat="server"/>
            </InsertItemTemplate>
        </asp:FormView>
    

    25. The final part of the aspx is the definition of the ObjectDataSource. See how it ties the ContactDataSource and the ContactDataModel together with the GridView and FormView.

        <%-- Data Sources --%>
        <asp:ObjectDataSource runat="server" ID="contactData" TypeName="WebRole1.ContactDataSource"
            DataObjectTypeName="WebRole1.ContactDataModel" 
            SelectMethod="Select" DeleteMethod="Delete" InsertMethod="Insert">    
        </asp:ObjectDataSource>
    

    26. Build. You should not have any compilation errors.

    27. Hit F5 to debug. You will see the app running in the Development Fabric using the Table Development Storage.

    image

    28. Now I want to switch this to use Windows Azure Storage, not the development storage.  The first step is to go to the Windows Azure Developer Portal and create a storage account.

    Login and click on “New Service”, and select “Storage Account”:

    image

    Fill out the service name, the public name and optionally choose a region.  You will be brought to a page that contains the following (note I rubbed out the access keys):

    image

    29. You will use the first part of the endpoint (jnakstorageaccount) and one of the access keys to fill out your connection string.

    30. Open the WebRole1 config again, bring up Settings | DataConnectionString and fill out the account name and the account key and hit OK.

    image
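
    For reference, the resulting setting in ServiceConfiguration.cscfg takes this form – the AccountKey value below is a placeholder for one of your access keys:

    <Setting name="DataConnectionString" value="DefaultEndpointsProtocol=https;AccountName=jnakstorageaccount;AccountKey=YOUR-ACCESS-KEY" />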

    31. Hit F5 to run the application again. 

    This time you will be running against cloud storage – note that the data you entered when running against the development storage is no longer there.

    Important Note: Before deploying to the cloud, the DiagnosticsConnectionString also needs to use storage account credentials and not the development storage account.

    32. Please see the Deploying a Cloud Service walkthrough to learn how to deploy this to the Windows Azure cloud.

    And there you have it – a walkthrough of using Windows Azure Table Storage.  For more information and to dig deeper, definitely check out the white paper Windows Azure Table – Programming Table Storage, and the MSDN docs that cover which subsets of ADO.NET Data Services work with table storage here: http://msdn.microsoft.com/en-us/library/dd135720.aspx and here: http://msdn.microsoft.com/en-us/library/dd894032.aspx

  • Cloudy in Seattle

    How to: Add an HTTPS Endpoint to a Windows Azure Cloud Service

    • 18 Comments

    Back in May I posted about Adding an HTTPS Endpoint to a Windows Azure Cloud Service and with the November 2009 release of the Windows Azure Tools, that article is now obsolete.

    In the last week I received a number of requests to post a new article about how to add HTTPS endpoints with the November 2009 release and later and I’m always happy to oblige!

    To illustrate how to add an HTTPS endpoint to a Cloud Service, I’ll start with the thumbnail sample from the Windows Azure SDK – the web role in that sample only has an HTTP endpoint, and I’ll walk through the steps to add an HTTPS endpoint to that web role.

    Open the Windows Azure SDK folder – for me, that’s C:\Program Files\Windows Azure SDK\v1.0 – and unzip the samples-cs.zip file to a writeable location.

    In the samples folder there is a sample called “Thumbnails”; open the solution Thumbnails.sln in Visual Studio (2008 or 2010 will work).

    Hit F5 to run the application and make sure everything works as expected. 

    If you aren’t familiar with this sample, it allows you to select an image locally via the web role which will upload that image to blob storage and will communicate the location to the worker via a queue.  The worker will generate a thumbnail for that image and the web role displays the thumbnail.

    Adding an HTTPS endpoint is a 3-step process:

    1. Configure the endpoint
    2. Upload the certificate to the Cloud
    3. Configure the SSL certificate (and then point the endpoint to that certificate)

    Adding the HTTPS Endpoint

    To configure the endpoint, open up the configuration UI on the WebRole by right clicking on the Thumbnails_WebRole node under the Roles node in the Solution Explorer and selecting “Properties”.

    image

    Switch to the Endpoints tab and click the checkbox to select “HTTPS”.

    image

    This will add the HTTPS endpoint but not specify the certificate.

    Switch to the Configuration page and uncheck the “Launch browser for: HTTP endpoint” option.  With this option unselected, when you run or debug the cloud service the default browser will only be launched for the HTTPS endpoint.

    image

    HTTPS Endpoints on the Local Development Fabric

    Click on the Debug menu, Debug | Start Debugging to package and run the cloud service on the local development fabric.

    The development simulation always uses a self-signed certificate issued to and issued by 127.0.0.1 which corresponds to the local host. This certificate is installed as part of the Windows Azure Tools + SDK.

    This is an important thing to note as the certificate configuration I'm about to describe below only applies when the application is running on the cloud.

    By default, this certificate is not root trusted, so running an application with an HTTPS endpoint on the development fabric will result in a certificate error.  Click on “Continue to this website (not recommended)” to browse to the web site:

    image

    For more information about the certificate Windows Azure installs including how to get rid of the certificate error when running on the development fabric -- see the end of this post.

    The Certificate

    To configure a certificate for use in the cloud, you need a certificate to upload to the cloud, and you need to configure that certificate for the role.

    For the purpose of this article, we’ll create and use a self-signed certificate.

    Open the IIS Manager and select “Server Certificates”.

    Select “Create Self-Signed Certificate…” under the “Actions” heading on the far right of the dialog.

    image

    After creating the cert, click on “Export…” to export the certificate to a pfx file. Provide a password you’ll remember. 

    image

    The benefit of using IIS to create the certificate is that it’s the easiest way I know to create a certificate that has the appropriate settings and an exportable private key.

    Uploading the Certificate

    We’ll now proceed with the upload step. Navigate to the Windows Azure Developer Portal and select a Hosted Service – this will be the same Hosted Service that you will deploy your application to later.

    Under the Certificates heading at the bottom of that page, select “Manage”.

    image

    This will bring you to the following page:

    image

    Upload the certificate by entering the name of the pfx file and the corresponding password you entered during the export step above, then click “upload”.

    Copy the certificate Thumbprint to your clipboard after it is installed to your Hosted Service.

    image

    Configuring the Certificate

    Go back to the Visual Studio Thumbnails_WebRole configuration UI, click on the Certificates tab and click “Add Certificate”.

    Give the certificate a name (e.g. sslCert), paste the Thumbprint into the Thumbprint field (you copied it to your clipboard after uploading the certificate to the portal), and leave the store location and name at the defaults of LocalMachine and My.

    image

    The certificates configuration page allows you to specify for a given role what certificates should be installed to the VM instances for that role and in which stores to install those certificates.

    In this case, you are telling Windows Azure to install the certificate you uploaded via the Portal to all VM instances that are created for the Thumbnails_WebRole web role.

    Switch to the Endpoints tab and select “sslCert” for the HTTPS certificate.
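
    Under the covers, the endpoint and certificate configuration end up in ServiceDefinition.csdef.  Here’s a sketch of what the tools generate, as I understand the November 2009 schema, with names matching the defaults used here:

    <WebRole name="Thumbnails_WebRole">
      <InputEndpoints>
        <InputEndpoint name="HttpsIn" protocol="https" port="443" certificate="sslCert" />
      </InputEndpoints>
      <Certificates>
        <Certificate name="sslCert" storeLocation="LocalMachine" storeName="My" />
      </Certificates>
    </WebRole>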

    Now deploy your application to the same Hosted Service in the Cloud where you uploaded the certificate.

    Once the application is deployed, you will be able to access it via http and https. You can see from the screen shot below, the certificate I uploaded is being used for the https endpoint for the cloud service I deployed:

    image

    Note: Since we uploaded a self-signed certificate, your web browser will display a certificate error when browsing to your HTTPS endpoint; using a real signed certificate will solve this problem.

    Also note that you may have to add intermediate certificates to complete the certificate chain.  You can do this by uploading additional certificates via the Portal and configuring those certificates in the Certificates tab of the role configuration UI. (more on this in a subsequent post)

    Known Issues:

    If you forget to set the SSL certificate name for the HTTPS endpoint on the Endpoints page of the role configuration UI, you’ll still be able to run on the local development fabric (as per above, it always uses the installed certificate), but when you publish, it will fail with an error in the error list:

    No SSL certificate specified for https input endpoint 'HttpsIn' of role 'Thumbnails_WebRole' 

    Windows Explorer (and your default browser) will still come up and the Thumbnails.cspkg file will appear as a 0 byte file which is misleading.  We will be fixing this in a future release such that it is more obvious that there was an error.

    Specifying the SSL Certificate Used by the Development Fabric to be Trusted

    To find the certificate that is installed by Windows Azure, run the Microsoft Management Console by typing “mmc” in the Start menu.

    Select “Add/Remove Snap-in…”:

    image

    And select “Certificates”, “Computer Account”:

    image

    Click “Next”, keep “Local Computer” selected, then hit “Finish” and “OK”.

    Under Personal\Certificates, you will see the 127.0.0.1 certificate that was installed.

    image

    Because the certificate is not root trusted (it’s installed to the Personal store), when you run applications that have an SSL endpoint on the local development fabric, the web browser will come up with a certificate error indicating that “There is a problem with this website’s security certificate”.

    This is expected. Click on “Continue to this website (not recommended)” to browse to the web site:

    image

    If you would like to make the certificate root trusted, and therefore not see the certificate errors, you can install the certificate to the “Trusted Root Certification Authorities” certificate store.  Simply drag it to the “Trusted Root Certification Authorities” folder in the mmc window.  (You can also move it to the Current User TRCA store if you prefer.)

    To be on the safe side, please don’t trust any HTTPS web sites with any valued information on a machine where you have made this change.

    Now if you made the 127.0.0.1 certificate root trusted, when you run the application, the web site will come up without the error:

    image

  • Cloudy in Seattle

    Windows Azure Walkthrough: Blob Storage Sample

    • 5 Comments

    Please see the updated post for November 2009 and later.

    This walkthrough covers what I found to be the simplest way to get a sample up and running on Windows Azure that uses the Blob Storage service. It is not trying to be comprehensive or trying to dive deep in the technology, it just serves as an introduction to how the Windows Azure Blob Storage Service works.

    Please take the Quick Lap Around the Tools before doing this walkthrough.

    Note: The code for this walkthrough is attached; you will still have to add and reference the Common and StorageClient projects from the Windows Azure SDK.

    After you have completed this walkthrough, you will have a Web Role that is a simple ASP.Net Web Application that shows a list of files that are stored and can be downloaded from Blob Storage. You can use the Web Role to add files to Blob storage and make them available in the list.

    image

    Blob Concepts

    Each storage account has access to blob storage. For each account there can be 0..n containers. Each container contains the actual blobs, each of which is a raw byte array. Containers can be public or private. In a public container, the URLs to the blobs can be accessed over the internet, while in a private container, only the account holder can access those blob URLs.

    Each Blob can have a set of metadata set as a NameValueCollection of strings.

    image

    Creating the Cloud Service Project

    1. Start Visual Studio as an administrator

    2. Create a new project: File | New Project

    3. Select “Web Cloud Service”. This will create the Cloud Service Project and an ASP.Net Web Role

    image

    4. Find the installation location of the Windows Azure SDK. By default this will be: C:\Program Files\Windows Azure SDK\v1.0

    a. Find the file named “samples.zip”

    b. Unzip this to a writeable location

    5. From the samples you just unzipped, add the StorageClient\Lib\StorageClient.csproj and HelloFabric\Common\Common.csproj to your solution by right-clicking on the solution in Solution Explorer and selecting Add –> Existing Project.

    image

    a. Common and StorageClient are libraries that are currently distributed as samples and that provide functionality to help you build Cloud Applications. Common adds access to settings and logging, while StorageClient provides helpers for using the storage services.

    6. From your Web Role, add references to the Common and StorageClient projects you just added.

    image

    7. Add the code to connect to the Blob Storage Service in Default.aspx.cs.

    a. This code gets account information from the Service Configuration file and uses that to create a BlobStorage instance.

    b. From the BlobStorage instance, a container is created with a name taken from the Service Configuration. The name used for the container is restricted to valid DNS names.

    c. The container is made public so that the URLs for the blobs are accessible from the internet.

    using Microsoft.Samples.ServiceHosting.StorageClient;
    namespace DownloadSite_WebRole
    {
        public partial class _Default : System.Web.UI.Page
        {
            private BlobContainer _Container = null;
    
            protected void Page_Load(object sender, EventArgs e)
            {
                try
                {
                    // Get the configuration from the cscfg file
                    StorageAccountInfo accountInfo = StorageAccountInfo.GetDefaultBlobStorageAccountFromConfiguration();
    
                    // Container names have the same restrictions as DNS names
                    BlobStorage blobStorage = BlobStorage.Create(accountInfo);
                    _Container = blobStorage.GetBlobContainer(RoleManager.GetConfigurationSetting("ContainerName"));
    
                    // returns false if the container already exists, ignore for now
                    // Make the container public so that we can hit the URLs from the web
                    _Container.CreateContainer(new NameValueCollection(), ContainerAccessControl.Public);
                    UpdateFileList();
                }
                catch (WebException webExcept)
                {
                }
                catch (Exception ex)
                {
                }
            }

    8. In order for the StorageAccountInfo class to find the configuration settings, open up ServiceDefinition.csdef and add the following to <WebRole/>. These define the settings.

        <ConfigurationSettings>
          <Setting name="AccountName"/>
          <Setting name="AccountSharedKey"/>
          <Setting name="BlobStorageEndpoint"/>
          <Setting name="ContainerName"/>
        </ConfigurationSettings>
    

    9. Likewise, add the actual local development values to the ServiceConfiguration.cscfg file. Note that the settings in the two files have to match exactly, otherwise your Cloud Service will not run.

    <ConfigurationSettings>
      <Setting name="AccountName" value="devstoreaccount1"/>
      <Setting name="AccountSharedKey" value="Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="/>
      <Setting name="BlobStorageEndpoint" value="http://127.0.0.1:10000/"/>
    
      <!-- Container, lower case letters only-->
      <Setting name="ContainerName" value="downloadsite"/>
    </ConfigurationSettings>
    

    10. When you run in the Cloud, the AccountName and AccountSharedKey will be set to the values you get back from the Portal for your account. The BlobStorageEndpoint will be set to the URL for the Blob Storage Service: http://blob.core.windows.net

    a. Because these are set in the ServiceConfiguration.cscfg file, these values can be updated even after deploying to the cloud by uploading a new Service Configuration.

    11. For the local development case, the local host and port 10000 (by default) will be used as the Blob Storage endpoint. The AccountName and AccountSharedKey are hard coded to values that the Development Storage service is looking for (they’re the same for all 3 services: Table, Blob and Queue).

    12. Next open up Default.aspx and add the code for the UI. The UI consists of:

    a. a GridView at the top

    b. Label and FileUpload control

    c. 2 Label and TextBox pairs (File Name and Submitter)

    d. Field validators to ensure that all of the fields are filled out before the file is uploaded.

    Add the following between the template generated <div></div> elements:

            <asp:GridView ID="fileView"
            AutoGenerateColumns="false" DataKeyNames="BlobName"
            Runat="server" onrowcommand="RowCommandHandler">
            <Columns>
                <asp:ButtonField Text="Delete" CommandName="DeleteItem"/>
                <asp:HyperLinkField
                    HeaderText="Link"
                    DataTextField="FileName"
                    DataNavigateUrlFields="FileUri" />
                <asp:BoundField DataField="Submitter" HeaderText="Submitted by" />
            </Columns>
        </asp:GridView>
    
        <br />
        <asp:Label id="filePathLabel" 
            Text="File Path:" 
            AssociatedControlID="fileUploadControl"
            Runat="server" />
        <asp:FileUpload ID="fileUploadControl" runat="server"/>
        <asp:requiredfieldvalidator id="filUploadValidator"
          controltovalidate="fileUploadControl"
          validationgroup="fileInfoGroup"
          errormessage="Select a File"
          runat="Server">
        </asp:requiredfieldvalidator>
        <br />
        <asp:Label
            id="fileNameLabel"
            Text="File Name:"
            AssociatedControlID="fileNameBox"
            Runat="server" />
        <asp:TextBox
            id="fileNameBox"
            Runat="server" />
        <asp:requiredfieldvalidator id="fileNameValidator"
          controltovalidate="fileNameBox"
          validationgroup="fileInfoGroup"
          errormessage="Enter the File Name"
          runat="Server">
        </asp:requiredfieldvalidator>
        <br />
        <asp:Label
            id="submitterLabel"
            Text="Submitter:"
            AssociatedControlID="submitterBox"
            Runat="server" />
        <asp:TextBox
            id="submitterBox"
            Runat="server" />
        <asp:requiredfieldvalidator id="submitterValidator"
          controltovalidate="submitterBox"
          validationgroup="fileInfoGroup"
          errormessage="Enter the Submitter Name"
          runat="Server">
        </asp:requiredfieldvalidator>
        <br />
        <asp:Button
            id="insertButton"
            Text="Submit"
            causesvalidation="true"
            validationgroup="fileInfoGroup"
            Runat="server" onclick="insertButton_Click"/>
        <br />
        <br />
        <asp:Label id="statusMessage" runat="server"/>
    

    13. If you now switch to design view, you will see:

    image

    14. In order to simplify the databinding to the UI, let’s add a FileEntry class that contains the metadata for each blob.

    public class FileEntry
    {
        public FileEntry(string blobName, Uri fileAddress, string name, string user)
        {
            BlobName = blobName;
            FileUri = fileAddress;
            FileName = name;
            Submitter = user;
        }
    
        public Uri FileUri
        {
            get;
            set;
        }
    
        public string BlobName
        {
            get;
            set;
        }
    
        public string FileName
        {
            get;
            set;
        }
    
        public string Submitter
        {
            get;
            set;
        }
    }

    15. Back in Default.aspx.cs, let’s add code to populate the GridView – one row for each blob found in storage. A FileEntry is created for each blob found in the container and put in a List.

    a. Note that in order to get the metadata for a blob, you need to call BlobContainer.GetBlobProperties(); the list of blobs returned from BlobContainer.ListBlobs() does not contain the metadata.

    using System.Collections.Generic;
    using System.Collections.Specialized;
    private void UpdateFileList()
    {
        IEnumerable<object> blobs = _Container.ListBlobs(string.Empty, false);
        List<FileEntry> filesList = new List<FileEntry>();
    
        foreach (object o in blobs)
        {
            BlobProperties bp = o as BlobProperties;
            if (bp != null)
            {
                BlobProperties p = _Container.GetBlobProperties(bp.Name);
                NameValueCollection fileEntryProperties = p.Metadata;
                filesList.Add(new FileEntry(p.Name, bp.Uri, fileEntryProperties["FileName"], fileEntryProperties["Submitter"]));
            }
        }
    
        fileView.DataSource = filesList;
        fileView.DataBind();
    }

    16. Add the code to upload a file to blob storage.

    a. Create a unique blob name by using a Guid, appending the existing file extension.

    b. Add metadata

    i. For the file name (friendly name to show instead of the blob name or URL)

    ii. For the Submitter

    c. Add the bytes and create the Blob

    d. The UI is refreshed after this operation

    protected void insertButton_Click(object sender, EventArgs e)
    {
        // Make a unique blob name
        string extension = System.IO.Path.GetExtension(fileUploadControl.FileName);
        BlobProperties properties = new BlobProperties(Guid.NewGuid().ToString() + extension);
    
        // Create metadata to be associated with the blob
        NameValueCollection metadata = new NameValueCollection();
        metadata["FileName"] = fileNameBox.Text;
        metadata["Submitter"] = submitterBox.Text;
    
        properties.Metadata = metadata;
        properties.ContentType = fileUploadControl.PostedFile.ContentType;
    
        // Create the blob
        BlobContents fileBlob = new BlobContents(fileUploadControl.FileBytes);
        _Container.CreateBlob(properties, fileBlob, true);
    
        // Update the UI
        UpdateFileList();
        fileNameBox.Text = "";
        statusMessage.Text = "";
    }

    17. Add code to delete the blob. This is as simple as calling BlobContainer.DeleteBlob(), passing in the blob name – in this case, the Guid + file extension generated during the upload.

    protected void RowCommandHandler(object sender, GridViewCommandEventArgs e)
    {
        if (e.CommandName == "DeleteItem")
        {
            int index = Convert.ToInt32(e.CommandArgument);
            string blobName = (string)fileView.DataKeys[index].Value;
    
            if (_Container.DoesBlobExist(blobName))
            {
                _Container.DeleteBlob(blobName);
            }
        }
        UpdateFileList();
    }

    18. Finally, let’s round out some of the error handling in Page_Load(). Right after _Container.CreateContainer(), let’s update the UI and properly catch any exceptions that could be thrown.

    using System.Net;
        _Container.CreateContainer(new NameValueCollection(), ContainerAccessControl.Public);
        UpdateFileList();
    }
    catch (WebException webExcept)
    {
        if (webExcept.Status == WebExceptionStatus.ConnectFailure)
        {
            statusMessage.Text = "Failed to connect to the Blob Storage Service, make sure it is running: " + webExcept.Message;
        }
        else
        {
            statusMessage.Text = "Error creating container: " + webExcept.Message;
        }
    }
    catch (Exception ex)
    {
        statusMessage.Text = "Error creating container: " + ex.Message;
    }

    19. Build and hit F5 to run the application.

    a. Notice that the Development Fabric and the Development Storage start up on your behalf and will continue to run until you close them.

    b. Note that the FileUpload control has a file size limit. You can modify it by changing the httpRuntime maxRequestLength attribute in web.config.

    image

    Please see the Deploying a Cloud Service walkthrough to learn how to modify the configuration of this Cloud Service to make it run on Windows Azure.

  • Cloudy in Seattle

    Walkthrough: Windows Azure Blob Storage (Nov 2009 and later)

    • 27 Comments

    Similar to the table storage walkthrough I posted last week, I updated this blog post for the Nov 2009/v1.0 and later release of the Windows Azure Tools.

    This walkthrough covers what I found to be the simplest way to get a sample up and running on Windows Azure that uses the Blob Storage service. It is not trying to be comprehensive or trying to dive deep in the technology, it just serves as an introduction to how the Windows Azure Blob Storage Service works.

    Please take the Quick Lap Around the Tools before doing this walkthrough.

    Note: The code for this walkthrough is attached to this blog post.

    After you have completed this walkthrough, you will have a Web Role that is a simple ASP.NET Web Application that shows a list of files that are stored and can be downloaded from Blob Storage. You can use the Web Role to add files to Blob storage and make them available in the list.

    image

    Blob Concepts

    Each storage account has access to blob storage. For each account there can be 0..n containers. Each container contains the actual blobs, each of which is a raw byte array. Containers can be public or private. In a public container, the URLs to the blobs can be accessed over the internet, while in a private container, only the account holder can access those blob URLs.

    Each Blob can have a set of metadata set as a NameValueCollection of strings.

    image

    Creating the Cloud Service Project

    1. Start Visual Studio as an administrator (Screen shots below are from VS 2008, VS 2010 is also supported)

    2. Create a new project: File | New Project

    3. Under the Visual C# node (VB is also supported), select the “Cloud Service” project type then select the “Windows Azure Cloud Service” project template. Set the name to be “SimpleBlobSample”. Hit OK to continue.

    image

    This will bring up a dialog to add Roles to the Cloud Service.

    4. Add an ASP.NET Web Role to the Cloud Service, we’ll use the default name of “WebRole1”.  Hit OK.

    image

    Solution explorer should look as follows:

    image

    We’ll now cover the implementation, which can be broken up into 5 different parts:

    1. Implementing the UI
    2. Connecting to Windows Azure storage
    3. Adding blobs
    4. Enumerating existing blobs
    5. Deleting blobs

    Implementing the UI

    5. Next open up Default.aspx and add the code for the UI. The UI consists of:

    • GridView at the top
    • Label and FileUpload control
    • 2 Label and TextBox pairs (File Name and Submitter)
    • Field validators to ensure that all of the fields are filled out before the file is uploaded.

    Add the following between the template generated <div></div> elements:

    <asp:GridView ID="fileView" AutoGenerateColumns="false" DataKeyNames="FileUri" runat="server"
        OnRowCommand="RowCommandHandler">
        <Columns>
            <asp:ButtonField Text="Delete" CommandName="DeleteItem" />
            <asp:HyperLinkField HeaderText="Link" DataTextField="FileName" DataNavigateUrlFields="FileUri" />
            <asp:BoundField DataField="Submitter" HeaderText="Submitted by" />
        </Columns>
    </asp:GridView>
    <br />
    <asp:Label ID="filePathLabel" Text="File Path:" AssociatedControlID="fileUploadControl"
        runat="server" />
    <asp:FileUpload ID="fileUploadControl" runat="server" />
    <asp:RequiredFieldValidator ID="filUploadValidator" ControlToValidate="fileUploadControl"
        ValidationGroup="fileInfoGroup" ErrorMessage="Select a File" runat="Server">
    </asp:RequiredFieldValidator>
    <br />
    <asp:Label ID="fileNameLabel" Text="File Name:" AssociatedControlID="fileNameBox"
        runat="server" />
    <asp:TextBox ID="fileNameBox" runat="server" />
    <asp:RequiredFieldValidator ID="fileNameValidator" ControlToValidate="fileNameBox"
        ValidationGroup="fileInfoGroup" ErrorMessage="Enter the File Name" runat="Server">
    </asp:RequiredFieldValidator>
    <br />
    <asp:Label ID="submitterLabel" Text="Submitter:" AssociatedControlID="submitterBox"
        runat="server" />
    <asp:TextBox ID="submitterBox" runat="server" />
    <asp:RequiredFieldValidator ID="submitterValidator" ControlToValidate="submitterBox"
        ValidationGroup="fileInfoGroup" ErrorMessage="Enter the Submitter Name" runat="Server">
    </asp:RequiredFieldValidator>
    <br />
    <asp:Button ID="insertButton" Text="Submit" CausesValidation="true" ValidationGroup="fileInfoGroup"
        runat="server" OnClick="insertButton_Click" />
    <br />
    <br />
    <asp:Label ID="statusMessage" runat="server" />

    6. If you now switch to design view, you will see:

    image

    You’ll also notice from the aspx that there is an event handler for OnRowCommand on the GridView to handle the DeleteItem command, IDs for the TextBoxes, and an event handler for the OnClick event on the Submit button.

    The code for these will be filled out further down in the walkthrough.

    Connecting to Windows Azure storage

    7. Open Default.aspx.cs and add the code to connect to the Blob Storage Service to the Page_Load() method.

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;
    
    namespace WebRole1
    {
        public partial class _Default : System.Web.UI.Page
        {
            private CloudBlobClient _BlobClient = null;
            private CloudBlobContainer _BlobContainer = null;
    
            protected void Page_Load(object sender, EventArgs e)
            {
                // Setup the connection to Windows Azure Storage
                var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
                _BlobClient = storageAccount.CreateCloudBlobClient();

    8.  Set up the “DataConnectionString” setting by opening up the configuration UI for WebRole1.  Right click on the WebRole1 node under the Roles folder in the SimpleBlobSample cloud service project and select “Properties”.

    image

    9. Switch to the Settings tab and click “Add Setting”.  Name it DataConnectionString, set the type to ConnectionString and click on the “…” button on the far right.

    image

    Hit “OK” to set the credentials to use Development Storage.  We’ll first get this sample working on development storage then convert it to use cloud storage later.

    10. If you actually tried to connect to Blob Storage at this point, you would find that the CloudStorageAccount.FromConfigurationSetting() call would fail with the following message:

    ConfigurationSettingSubscriber needs to be set before FromConfigurationSetting can be used

    This message is in fact incorrect – I have a bug filed to get it fixed so that it says “Configuration Setting Publisher” and not “ConfigurationSettingSubscriber”.

    To resolve this, we need to add a bit of template code to the WebRole.cs file in WebRole1.  Add the following to the WebRole.OnStart() method in WebRole.cs in the WebRole1 project.

    using Microsoft.WindowsAzure;

    public override bool OnStart()
    {
        DiagnosticMonitor.Start("DiagnosticsConnectionString");
    
        #region Setup CloudStorageAccount Configuration Setting Publisher
    
        // This code sets up a handler to update CloudStorageAccount instances when their corresponding
        // configuration settings change in the service configuration file.
        CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
        {
            // Provide the configSetter with the initial value
            configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
    
            RoleEnvironment.Changed += (sender, arg) =>
            {
                if (arg.Changes.OfType<RoleEnvironmentConfigurationSettingChange>()
                    .Any((change) => (change.ConfigurationSettingName == configName)))
                {
                    // The corresponding configuration setting has changed, propagate the value
                    if (!configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)))
                    {
                        // In this case, the change to the storage account credentials in the
                        // service configuration is significant enough that the role needs to be
                        // recycled in order to use the latest settings. (for example, the 
                        // endpoint has changed)
                        RoleEnvironment.RequestRecycle();
                    }
                }
            };
        });
        #endregion
    
        // For information on handling configuration changes
        // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.
        RoleEnvironment.Changing += RoleEnvironmentChanging;
    
        return base.OnStart();
    }

    The comments (which I wrote and are included in the samples) should explain what is going on if you care to know.  In a nutshell, this code bridges the gap between the Microsoft.WindowsAzure.StorageClient assembly and the Microsoft.WindowsAzure.ServiceRuntime library – the Storage Client library is agnostic to the Windows Azure runtime, as it can be used in non-Windows Azure applications.

    This code essentially tells the Storage Client library how to get a setting value given a setting name, and sets up an event handler to handle setting changes while running in the cloud (for example, when the ServiceConfiguration.cscfg file is updated in the cloud).

    The key point is that with this snippet of code in place, you now have everything in place to connect to Windows Azure storage and create a CloudBlobClient instance.

    Adding Blobs

    11. In order to add blobs to a container, you first need to set up a container.  Let’s add this code to the Page_Load() method in Default.aspx.cs.  For a production application, you will want to optimize this code to avoid doing all this work on every page load.

    Page_Load() should now look as follows.

    protected void Page_Load(object sender, EventArgs e)
    {
        // Setup the connection to Windows Azure Storage
        var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
        _BlobClient = storageAccount.CreateCloudBlobClient();
    
        // Get and create the container
        _BlobContainer = _BlobClient.GetContainerReference("publicfiles");
        _BlobContainer.CreateIfNotExist();
    
        // Setup the permissions on the container to be public
        var permissions = new BlobContainerPermissions();
        permissions.PublicAccess = BlobContainerPublicAccessType.Container;
        _BlobContainer.SetPermissions(permissions);
    
        // Show the current list.
        UpdateFileList();
    }

    Note: The container name must follow DNS naming restrictions (i.e. all lower case), and the container is created if it does not exist.  Additionally, the container is set to be a public container – i.e. the URIs to the blobs are accessible by anyone over the internet.

    Had this been a private container, the blobs in that container could only be read by code that has the access key and account name.
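
    For comparison, here’s a sketch of the one-line change that would make the container private instead – only code with the account name and key could then read the blobs:

    // Make the container private; the public blob URIs will stop working.
    var privatePermissions = new BlobContainerPermissions();
    privatePermissions.PublicAccess = BlobContainerPublicAccessType.Off;
    _BlobContainer.SetPermissions(privatePermissions);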

    12.  Let’s now add the code to Default.aspx.cs to add a blob when the “Submit” button on the UI is clicked (remember the event handler was defined in the aspx).

    A GUID is created for the blob name to ensure a unique name is used.  The file name and submitter are taken from the TextBoxes in the UI.

    Blob metadata – user-defined key/value pairs – is used to store the file name and submitter along with the blob.

    protected void insertButton_Click(object sender, EventArgs e)
    {
        // Make a unique blob name
        string extension = System.IO.Path.GetExtension(fileUploadControl.FileName);
    
        // Create the Blob and upload the file
        var blob = _BlobContainer.GetBlobReference(Guid.NewGuid().ToString() + extension);
        blob.UploadFromStream(fileUploadControl.FileContent);
    
        // Set the metadata into the blob
        blob.Metadata["FileName"] = fileNameBox.Text;
        blob.Metadata["Submitter"] = submitterBox.Text;
        blob.SetMetadata();
    
        // Set the properties
        blob.Properties.ContentType = fileUploadControl.PostedFile.ContentType;
        blob.SetProperties();
    
        // Update the UI
        UpdateFileList();
        fileNameBox.Text = "";
        statusMessage.Text = "";
    }

    Enumerating Existing Blobs

    13. In order to simplify the databinding to the UI, let’s add a FileEntry class that contains the data we want to show in the UI for each blob.  One instance of a FileEntry corresponds to one blob.

    Right click on WebRole1 and select Add | Class…

    image

    Name the class FileEntry.cs and hit OK.

    14. Fill out FileEntry.cs with the following code:

    public class FileEntry
    {
        public Uri FileUri { get; set; }
        public string FileName { get; set; }
        public string Submitter { get; set; }
    }

    15. Back in Default.aspx.cs, let’s add code to populate the GridView by getting the collection of blobs from ListBlobs() and creating a FileEntry for each item.

    FetchAttributes() is used to retrieve the blob metadata.

    using System.Collections.Generic;
    using System.Collections.Specialized;

    private void UpdateFileList()
    {
        // Get a list of the blobs
        var blobs = _BlobContainer.ListBlobs();
        var filesList = new List<FileEntry>();
    
        // For each item, create a FileEntry which will populate the grid
        foreach (var blobItem in blobs)
        {
            var cloudBlob = _BlobContainer.GetBlobReference(blobItem.Uri.ToString());
            cloudBlob.FetchAttributes();
    
            filesList.Add(new FileEntry() { 
                FileUri = blobItem.Uri,
                FileName = cloudBlob.Metadata["FileName"],
                Submitter = cloudBlob.Metadata["Submitter"]
            });    
        }
        
        // Bind the grid
        fileView.DataSource = filesList;
        fileView.DataBind();
    }

    Deleting Blobs

    16. Add code to delete the blob in the row command handler that was set up in the aspx. This is as simple as calling DeleteIfExists() on the CloudBlob instance, which is looked up using the blob address stored as the row’s DataKey (the blob’s name being the Guid + file extension generated during the upload).

    protected void RowCommandHandler(object sender, GridViewCommandEventArgs e)
    {
        if (e.CommandName == "DeleteItem")
        {
            var index = Convert.ToInt32(e.CommandArgument);
            var blobName = (string)fileView.DataKeys[index].Value;
        var blobContainer = _BlobClient.GetContainerReference("publicfiles"); // match the container name used in Page_Load (all lower case)
            var blob = blobContainer.GetBlobReference(blobName);
            blob.DeleteIfExists();
        }
    
        // Update the UI
        UpdateFileList();
    }

    Testing the Application

    17. Build and hit F5 to run the application.

    Note: The FileUpload control has a file size limit. You can modify it by changing the httpRuntime maxRequestLength attribute in web.config.
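
    As a sketch, a web.config entry like the following would raise that limit to roughly 8 MB – maxRequestLength is specified in KB, and the 8192 value is my example, not from the sample:

    <system.web>
      <httpRuntime maxRequestLength="8192" />
    </system.web>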

    image

    Moving from Development Storage to Cloud Storage

    18. Now I want to switch this to use Windows Azure Storage, not the development storage.  The first step is to go to the Windows Azure Developer Portal and create a storage account.

    Login and click on “New Service”, and select “Storage Account”:

    image

    Fill out the service name, the public name and optionally choose a region.  You will be brought to a page that contains the following (note I rubbed out the access keys):

    image

    19. You will use the first part of the endpoint (jnakstorageaccount) and one of the access keys to fill out your connection string.

    20. Open the WebRole1 config again, bring up Settings | DataConnectionString and fill out the account name and the account key and hit OK.

    image

    21. Hit F5 to run the application again. 

    This time you will be running against cloud storage – note that the data you entered when running against the development storage is no longer there.

    Important Note: Before deploying to the cloud, the DiagnosticsConnectionString also needs to use storage account credentials and not the development storage account.

    Please see the Deploying a Cloud Service walkthrough to learn how to deploy this to the Windows Azure cloud.

    For more information, please see the Blob Service API documentation and the Programming Blob Storage white paper.

    I know that there are a number of different concepts that have to be pieced together; hopefully this walkthrough has been helpful in understanding how everything fits together.

  • Cloudy in Seattle

    Using an Existing ASP.NET Web Application as a Windows Azure Web Role

    • 12 Comments

    [For an expanded walkthrough about using an existing ASP.NET Web Application in a Cloud Service and migrating data to SQL Azure, please see this post: http://blogs.msdn.com/jnak/archive/2010/02/08/migrating-an-existing-asp-net-app-to-run-on-windows-azure.aspx.  Updated info on using MVC with Windows Azure is in this post.] 

    One of the things which I’ve kind of covered in my MVC posts here and here is how to take an existing ASP.Net Web Application project and get it to run on Windows Azure as the Web Role.

    Based on some of the forum questions I’ve seen, I figured this could use a post on its own.

    I’ll start this off by making a new ASP.Net Web Application.  Note that there is currently a limitation in that only Web Applications (and not Web Sites) can be associated as Web Roles in the Windows Azure Tools for Visual Studio – if you really need to use a Web Site, you can do so with the Windows Azure SDK.  If you have a Web Site and are willing to convert it to a Web Application, Visual Studio provides a conversion tool.

    File –> New project brings up the new project dialog:

    image 

    Where I’ll select an ASP.Net Web Application.  (This works the same for all ASP.Net Web Application types.)

    This creates the Web Application project.

    Right-click on the Web Application project and select “Unload project”.

    image

    Then edit the csproj file:

    image

    This will bring up the csproj (same process for VB) file in the XML editor in Visual Studio.  To the top PropertyGroup, you need to add the RoleType element:

    <Project ToolsVersion="3.5" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
      <PropertyGroup>
        (. . .)
        <RoleType>Web</RoleType>
      </PropertyGroup>

    Save the project file (in this case the csproj).

    A note on RoleType – our tools use the RoleType element in the project to filter out projects in the solution that can be associated as either a Web or Worker role.  That is, when you right click on the roles node in the Cloud Service project and select to associate either the Web or Worker Role to a project in the solution – we use this element to give you a list of Web Role or Worker Role projects to choose from.

    Now, create a new blank Cloud Service:

    image

    After the Cloud Service project creation completes, in the Solution Explorer, add the Web Application project created above to the solution by right clicking on the solution and selecting Add –> Existing Project…

    image

    and selecting the Web Application project.

    Now add a Web Role to the Cloud Service by selecting the Web Application.  This is possible because the RoleType property was added to the csproj file.

    image

    Select the Web Application:

    image

    You now have a Cloud Service that has a Web Role that points to the ASP.Net Web Application that was created above.

    image

    One final thing to do: in the Web Application project, add a reference to the Microsoft.ServiceHosting.ServiceRuntime assembly.

    image

    This assembly contains the Fabric runtime APIs that you can call for logging, configuration and local storage.
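
    As a hedged sketch of the kind of call you can make once the reference is in place – API names are as I recall them from the January 2009 CTP, and “MySetting” is a hypothetical setting name:

    using Microsoft.ServiceHosting.ServiceRuntime;
    
    // For example, somewhere in the Web Application (e.g. Page_Load):
    if (RoleManager.IsRoleManagerRunning)
    {
        // Write an entry to the fabric log.
        RoleManager.WriteToLog("Information", "Page loaded.");
    
        // Read a setting defined in the Service Configuration.
        string value = RoleManager.GetConfigurationSetting("MySetting");
    }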

    And that’s it – hit F5 and you get debugging with your service running on the Development Fabric.  Publish the Cloud Service and you get a Windows Azure Service Package. 

    If you are using the ASP.Net providers such as membership, role, profile or session state, you can use the sample implementations that are included in the SDK samples and that use Cloud Storage.  See this post for more information on how to set those up.

    One of the bigger challenges you will have when trying to run an existing ASP.Net Web Application on Windows Azure is data.

    Windows Azure provides Blob, Queue and Table storage but doesn’t have a SQL Server story yet (nor does the richer SQL Data Services).  This means that you would have to rewrite your data access layer to use one of the Cloud Storage services.

    You will also need to make sure that you don’t have assumptions about the state of the machine you are running on – in the Windows Azure world, your service could easily be moved to a new VM.  Additionally, Windows Azure Cloud Services only run in a modified version of partial trust; if you make any calls that require Full Trust, those calls won’t work on Windows Azure.

    Finally, the scale model on Windows Azure is to increase instances of both the Web role and Worker role.  In order to make effective use of that model, it is quite likely that you’ll have to rework some of your code.

    That’s the story for the January 2009 CTP.  Keep in mind that we’re looking at how we can improve on this story and make it easier to move existing and new assets to and from Windows Azure. 

    Stay tuned.

  • Cloudy in Seattle

    Installing Certificates in Windows Azure VMs

    • 7 Comments

    A little while ago I posted How To: Add an HTTPS Endpoint to a Windows Azure Cloud Service which talked about the whole process around adding an HTTPS endpoint and configuring & uploading the SSL certificate for that endpoint.

    This post is a follow up to that post to talk about installing any certificate to a Windows Azure VM.  You may want to do this to install various client certificates or even to install the intermediate certificates to complete the certificate chain for your SSL certificate.

    In order to peek into the cloud, I’ve written a very simple app that enumerates the certificates in the Current User\Personal (or My) store.

    Create a new Windows Azure Cloud Service and add an ASP.NET Web Role to it. 

    Open up default.aspx and add the following between the empty <div></div> tags to set up a table that will be used to list the certificates.

    <asp:Table ID="certificateTable" runat="server">
        <asp:TableRow runat="server">
            <asp:TableCell runat="server">Friendly Name:</asp:TableCell>
            <asp:TableCell runat="server">Issued By:</asp:TableCell>
            <asp:TableCell runat="server">Issued To:</asp:TableCell>
            <asp:TableCell runat="server">Expiration Date:</asp:TableCell>
        </asp:TableRow>
    </asp:Table>
    

    Now, I have some simple code that opens up a certificate store and adds a row to the table for each certificate found with the Friendly Name, Issuer, Subject and Expiration date.

    using System.Security.Cryptography.X509Certificates;

    protected void Page_Load(object sender, EventArgs e)
    {
        X509Store store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
        try
        {
            store.Open(OpenFlags.OpenExistingOnly | OpenFlags.ReadOnly);
            foreach (X509Certificate2 cert in store.Certificates)
            {
                TableRow certificateRow = new TableRow();
    
                // Friendly Name
                TableCell friendlyNameCell = new TableCell();
                TextBox friendlyNameText = new TextBox();
                friendlyNameText.Text = cert.FriendlyName;
                friendlyNameCell.Controls.Add(friendlyNameText);
                certificateRow.Cells.Add(friendlyNameCell);
    
                // Issuer
                TableCell issuerCell = new TableCell();
                TextBox issuerText = new TextBox();
                issuerText.Text = cert.Issuer;
                issuerCell.Controls.Add(issuerText);
                certificateRow.Cells.Add(issuerCell);
                
                // Subject
                TableCell subjectCell = new TableCell();
                TextBox subjectText = new TextBox();
                subjectText.Text = cert.Subject;
                subjectCell.Controls.Add(subjectText);
                certificateRow.Cells.Add(subjectCell);
    
                // Expiration
                TableCell expirationCell = new TableCell();
                TextBox expirationText = new TextBox();
                expirationText.Text = cert.NotAfter.ToString("d");
                expirationCell.Controls.Add(expirationText);
                certificateRow.Cells.Add(expirationCell);
                
                // Add the TableRow to the Table
                certificateTable.Rows.Add(certificateRow);
            }
        }
        finally
        {
            store.Close();
        }
    }

    Build, publish and deploy this to the cloud and you’ll see that the CurrentUser\My store is empty. 

    image 

    Now that I have a way to show you the certificates that I install to the cloud, let’s follow the process to install 2 certificates.  One is a self-signed certificate with a private key that I created myself, and the other is an intermediate certificate I got from the VeriSign web site. 

    Installing a certificate to Windows Azure is a three step process. 

    1. Configure the certificates to install per role
    2. Upload the certificates via the Windows Azure Developer Portal per hosted service
    3. Deploy your app; when VMs are set up to host each of your role instances, those VMs will contain your certificates.

    Note: The configuration of the certificates is per role, but the upload of the certificates is per hosted service and applies to all of the roles in that hosted service.

    Configure the certificates.  Bring up the configuration UI for the Web Role in your solution, click on “Certificates” and click on “Add Certificate”.

    image

    Now click on the “. . .” to bring up the certificate selection dialog.  This currently enumerates the certificates in the CurrentUser\My store, however in future versions this will enumerate the certificates in LocalMachine\My.  The dialog specifies which store it is listing the certificates from.

    image

    In my example, I’m selecting a self-signed cert with an exportable private key that I set up ahead of time.

    This sets the thumbprint field for that certificate.

    image

    Now I have a CER file from VeriSign which is an intermediate cert.  To create it, I copied a big long chunk of text from their web site and saved it as a .CER file.

    I have a couple of choices on how to handle this.  I could install this certificate to the CurrentUser\My store and select it from the VS UI to get the thumbprint.

    Instead, I’m going to open the certificate, switch to Details and copy the Thumbprint from the certificate itself (the point of doing the 2 certificates is to show alternate ways of handling the certificates):

    image

    Now I click “Add Certificate” on the Certificates tab of the VS UI and paste in the thumbprint for my second certificate.  I also changed the names of the certificate entries to “SelfSigned” and “Intermediate” for clarity, but this certainly is not necessary.

    image

    Note: I changed the store location and store name to CurrentUser and My.  Typically you would install intermediate certs to the CA store, but to keep this sample simple I only want to enumerate one store, CurrentUser\My, which as we saw above is empty by default in Windows Azure.  So I’m installing the two example certificates to the CurrentUser\My store.

    The dropdown for store name currently contains My, Root, CA and Trust.  If those 4 are not sufficient, we do support you opening the service definition file and setting any string you would like for the Store Name.

    <Certificate name="SelfSigned" storeLocation="CurrentUser" storeName="<enter a value>" />
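    For reference, the entries from this walkthrough end up looking roughly like this in the service definition file, with the matching thumbprints in the service configuration file (the thumbprint values below are placeholders):

    <!-- ServiceDefinition.csdef, per role -->
    <Certificates>
      <Certificate name="SelfSigned" storeLocation="CurrentUser" storeName="My" />
      <Certificate name="Intermediate" storeLocation="CurrentUser" storeName="My" />
    </Certificates>

    <!-- ServiceConfiguration.cscfg, per role -->
    <Certificates>
      <Certificate name="SelfSigned" thumbprint="0000000000000000000000000000000000000000" thumbprintAlgorithm="sha1" />
      <Certificate name="Intermediate" thumbprint="1111111111111111111111111111111111111111" thumbprintAlgorithm="sha1" />
    </Certificates>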
    

    I now need to upload these certificates through the Developer Portal which means I need them as files.  I have the .CER file already but need to export my self-signed cert from the certificate store.

    To do so I run certmgr.msc, browse to the certificate, right click | All Tasks | Export…

    image

    This takes me through the Windows Certificate Export Wizard.  One of the key things for SSL certificates, and any other certificates where you need the private key in the cloud, is to ensure that you select “Yes, export the private key” and provide a password during the export.

    image

    After finishing the wizard, I now have 2 certificate files, the .CER file and the .PFX file that was just created.

    image

    Now I go to the Developer portal, and select my Hosted Service.  Down at the bottom I select “Manage” under the “Certificates” heading.

    image 

    This allows me to browse to the PFX file, type the same password I entered when exporting the certificate, and upload that certificate to the cloud.

    image 

    You’ll notice here that the Portal only allows uploading of PFX files but we need to upload a .CER file.  This will change in the future but for now, there is some simple code you can run to convert your .CER file to PFX.  I put the following code in a console application and ran it.

    using System.Security.Cryptography.X509Certificates;

    static void Main(string[] args)
    {
        // The path to the certificate.
        string certificate = @"C:\Users\jnak.REDMOND\Desktop\verisignintermediate.cer";

        // Load the certificate into an X509Certificate object.
        X509Certificate cert = new X509Certificate(certificate);

        // Re-export it in PKCS #12 (PFX) format and write it to disk.
        byte[] certData = cert.Export(X509ContentType.Pkcs12);
        System.IO.File.WriteAllBytes(@"C:\Users\jnak.REDMOND\Desktop\verisignintermediate.pfx", certData);
    }

    Now that I have a PFX file for my CER file, I upload it, leaving the password field blank.  The password only protects private keys, which CER files do not have, so this PFX file does not have one either.

    You can see here that both certificates are now installed to my Hosted Service.

    image

    Now I deploy my Cloud Service, which will enumerate the two certificates I installed to the CurrentUser\My store.  If you deployed the app already as I did above, just suspend and delete that deployment and deploy the app again with the new configuration.  (Both the service definition and service configuration files have changed, so the package (cspkg) and service configuration (cscfg) need to be deployed to Windows Azure again.)

    After you complete the deploy process and your app is running on Windows Azure, you’ll now see the following:

    image

    This shows that the VMs in the cloud for your role have the certificates you configured and uploaded installed.

    Finally I’ll reiterate that certificate configuration is on a per role basis.  As an example, if I added a worker role to this same Cloud Service, the VMs for the worker role instances would not contain the certificates we configured above unless you opened up the configuration UI for the Worker role and added the same entries to the Certificates tab.

    That said, because the certificates are uploaded per hosted service, you do not need to upload the certificates to the cloud more than once for a given Cloud Service.

  • Cloudy in Seattle

    Installing the Windows Azure Tools using the Web Platform Installer

    • 0 Comments

    Today, the IIS team released the Web Platform Installer 2.0 RTW.  Among the many cool new things (more tools, new applications, and localization to 9 languages) is the inclusion of the Windows Azure Tools for Microsoft Visual Studio 2008.

    Install the Windows Azure Tools for Microsoft Visual Studio 2008 using the Web Platform Installer.

    image

    Why should you care?  As many of you know, before using the Windows Azure Tools, you need to install and configure IIS, which requires figuring out how to do that and following multiple steps.  The Web Platform Installer (we call it the WebPI) makes installing the Tools, SDK and IIS as simple as clicking a few buttons.

    For example, on a fresh machine, when I use the WebPI to install the Tools – look at all of the dependencies that get brought in and installed for me.  I also don’t need to remember to install IIS as a separate step, *it just works*.

    clip_image002

    One thing I do want to point out is that if you want to browse around the WebPI to find the Windows Azure Tools, you first have to go to the Options dialog:

    image

    And select the “Developer Tools” checkbox.

    image

    A couple of other notes:

    • The Microsoft Web Platform home page is on microsoft.com/web
    • If you have trouble with the link to install the Windows Azure Tools, try installing the WebPI manually first, then clicking on the link.
    • The applications installed by the Web Platform Installer are intended to be run on IIS, not Windows Azure.  We’re working on resolving this moving forward.
  • Cloudy in Seattle

    Videos of the Windows Azure Sessions at PDC09

    • 5 Comments

    Here are the videos of the Windows Azure sessions at PDC09.  Lots of useful content, the sessions were well attended and well received.

    At the time of this writing, some of the videos are not yet posted but they will be by the end of the week.

    Enjoy.

    Windows Azure Sessions

    My session -- Tips and Tricks for Using Visual Studio 2010 to Build Applications that Run on Windows Azure

    Introductory

    Learn to Develop for Windows Azure

    Windows Azure Storage

    Windows Azure as an Open Platform

    SQL Azure Sessions

    Showcases

  • Cloudy in Seattle

    Add and Vote for Windows Azure Features

    • 0 Comments

    [Changing title to be more clear]

    Mike Wickstrand, the director of Windows Azure product planning, has put together a site where you can post and vote for Windows Azure ideas: http://www.mygreatwindowsazureidea.com

    The idea came from the success we had with the Silverlight feature suggestions page and we hope to duplicate that success.

    Please take the time to go to http://www.mygreatwindowsazureidea.com/ today, submit your ideas and cast your votes - and don't forget about the areas that are my passion - the developer experiences and tools.

  • Cloudy in Seattle

    Videos of the Windows Azure Sessions at MIX10

    • 1 Comment

    Had a great time at MIX10 this year, talked to a lot of customers and got a lot of great feedback – thank you.

    If you weren’t able to attend and are interested in Windows Azure, here are links to the Windows Azure sessions.  Videos of all of the sessions are available at: http://live.visitmix.com/Videos (some of the sessions will come online toward the end of the day)

    My session: Building and Deploying Windows Azure based Applications with Microsoft Visual Studio 2010.

    Day 1 Keynote - Scott Guthrie and Joe Belfiore

    Day 2 Keynote – Scott Guthrie, Dean Hachamovitch, Bill Buxton and Doug Purdy

    All Windows Azure sessions:

  • Cloudy in Seattle

    Windows Azure Instance & Storage Limits

    • 0 Comments

    Recently, a colleague of mine wrote about the Windows Azure instance limits: http://blog.toddysm.com/2010/01/windows-azure-role-instance-limits-explained.html

    His post is very complete and I recommend you have a look, but here is my take:

    These are default limits that are in place to ensure that Windows Azure will always have VMs available to all of our customers.  If you have a need for more capacity, we want to help!  Please contact us: http://go.microsoft.com/fwlink/?LinkID=123579

    The limits are:

    • 20 Hosted Service Projects
    • 5 Storage Accounts
    • 5 roles per Hosted Service (i.e. 3 different web roles + 2 different worker roles or any such combination)
    • 20 CPU cores across all of your Hosted Service Projects

    The first two are really easy to track: on the Developer Portal, when you go to create a new service, it’ll tell you how many you have left of each:

    image

    5 roles per Hosted Service is also easy to understand: this corresponds to the number of projects you can add as roles to your Cloud Service.  Here I am hitting my role limit:

    image

    So let’s talk real quick about the 20 CPU core limit – note that the limit is on CPU cores, not on instances. 

    When you configure your role, you can set the number of instances as well as the VM size:

    image

    The VM sizes of Small, Medium, Large and ExtraLarge are defined here: http://msdn.microsoft.com/en-us/library/ee814754.aspx

    Today the CPU cores for each VM size are as follows (subject to change, so always consult the MSDN link above for the latest information):

    VM Size       CPU Cores
    Small         1
    Medium        2
    Large         4
    ExtraLarge    8

    So the number of CPU cores for a role is (instance count) × (CPU cores for the selected VM size).

    If you add those up across all of the roles in all of your Hosted Service projects (staging and production slots), the total cannot exceed 20.

    Quick example: if you have 5 Hosted Service projects, each with 1 role running 2 instances at the Medium VM size, that’s 5 × 1 × 2 × 2 = 20 cores and you’ve hit the limit.

    The other key point is that stopping your deployment does not free up CPU cores; you need to delete the deployment to reduce your CPU core count.

    What about Windows Azure Storage quotas? 

    It just so happens that another colleague of mine has written about this: http://blogs.msdn.com/frogs69/archive/2009/12/17/storage-quotas-and-core-allocation-on-windows-azure.aspx

    Each storage account allows you to have 100TB of data across all of your blobs, tables and queues.  As mentioned above, you can have up to 5 storage accounts.

    If you are dealing with really large data sets, follow the link above to see the limits on blobs, the number of properties in a table entity, and queue messages.

  • Cloudy in Seattle

    The Easy Way to Install the Windows Azure Tools and SDK Pre-Requisites

    • 4 Comments

     (Update 9/24/2009 -- We're now included in the Web Platform Installer!  See: http://blogs.msdn.com/jnak/archive/2009/09/24/installing-the-windows-azure-tools-using-the-web-platform-installer.aspx)

    One of the first things I do when I get a new box is install the Windows Azure Tools and SDK. (don’t you?)

    Quite often, I forget to install and configure IIS7 – and receive the following message:

    image

    The Windows Azure SDK requires Internet Information Service 7.0 with ASP.Net Application Development components installed.

    So how do I install IIS and the required components? 

    You could follow the instructions buried on the download page… or you can use the Microsoft Web Platform Installer, which is a heck of a lot easier.  Let’s see how that would work.

    1. Navigate to http://microsoft.com/web and click on “Get the Microsoft Web Platform”

    image

    At the time of this writing, there is a v1.0 and v2.0 beta you can try.  We’ll show the 2.0 beta although you could use either.

    2. Click on the download button.

    3. Click on the Web Platform tab and customize the Web Server option with: ASP.Net, Default Document and CGI (if you want to run FastCGI apps) and any other features you want to add. 

    image

    4. You can also click to add a database (SQL Express) if you need it, as well as tools – including the free Visual Web Developer Express, which our Windows Azure Tools support.

    5. Click “Install” when you are ready. You’ll get an opportunity to review your selection, then the download and install will commence.

    6. Install a few hot fixes manually:

  • Install the Hotfix: Native Debugging Improvements
  • Install the Hotfix: Support for FastCGI on the Development Fabric
  • Install the Hotfix: Improve Visual Studio Stability

    7. Finally, if you are using WCF, you will want to install WCF HTTP Activation. (this is a .Net feature)

  • On Vista: From the Start menu, choose Settings | Control Panel | Programs | Programs and Features, click “Turn Windows Features On or Off”, and under Microsoft .Net Framework 3.0, select WCF HTTP Activation.

    On Windows Server 2008: In Server Manager under Features Summary, choose Add Features, then under .Net Framework 3.0 Features, select WCF Activation.

    8. Install the Windows Azure Tools

    This is the way I set up my new machines nowadays; it's just so much easier and I get all of the other web frameworks (like MVC and Silverlight) at the same time. 

  • Cloudy in Seattle

    Adding an HTTPS Endpoint to a Windows Azure Cloud Service

    • 7 Comments

    [Update: With the November 2009 release of the Windows Azure Tools - this post is now obsolete - an updated post is available here]

    Lately there have been a couple of threads on the forum and some internal email around setting up an https endpoint on a Windows Azure Cloud Service.

    A good starting point is this article, but there are some common issues that people run into that I wanted to talk about.

    First are the cert requirements. 

    • The certificate must contain a private key that is marked exportable
    • The certificate must have the Server Authentication Intended Purpose

    When running on the Development Fabric, the certificate also needs to be self-signed – this is to prevent any security issues around leaking the private key of a real certificate.
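    As an aside, if you don’t want to click through the IIS Manager steps below, the makecert tool from the Windows SDK can generate a certificate that meets these requirements directly in the CurrentUser\My store; a sketch (the subject name is just an illustration):

    makecert -r -pe -n "CN=localhost" -sky exchange -eku 1.3.6.1.5.5.7.3.1 -ss My -sr CurrentUser

    Here -r makes it self-signed, -pe marks the private key as exportable, and the -eku OID gives it the Server Authentication intended purpose.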

    Let’s walk through the steps to try an https endpoint on the Development Fabric:

    1) Open the ServiceDefinition.csdef file in the CloudService project in Visual Studio and add a second InputEndpoint to the WebRole:

      <WebRole name="WebRole">
        <InputEndpoints>
          <InputEndpoint name="HttpIn" protocol="http" port="80" />
          <InputEndpoint name="HttpsIn" protocol="https" port="443" />
        </InputEndpoints>
      </WebRole>
    

    2) If you have a self-signed certificate that meets the requirements above, you can skip ahead to step 9.  Otherwise, let’s use the IIS Manager to create a self-signed certificate.

    3) Open the IIS Manager and select “Server Certificates”

    image

    4) On the right side under “Actions”, select “Create Self-Signed Certificate…”

    image

    5) Follow the steps in the IIS Manager and you’ll have a new self-signed cert that supports Server Authentication and has an exportable private key.

    6) The newly created cert will be put in the Personal store in the Local Computer location. Windows Azure Tools (including cspack) look for the certs in the Personal store in the Current User location (we needed to settle on a location and didn’t want it to be one that requires admin elevation).

    7) To move the certs to the Current User location, you can run mmc, add the Certificates snap-in for both “My User Account” and “Computer Account” and drag and drop the certificates to the Personal store in the Current User location.  Alternatively, you can export and import.

    8) If you ever export/import the cert, make sure you export the private key and on import mark the key as exportable:

    image

    9) Right click on the Cloud Service project in the VS Solution Explorer and click “Properties”.  Click on the SSL tab and check to Enable SSL Connections under Development and click “Select from Store…”. 

    image

    10) Select your certificate.  Hit F5 to run.

    11) Navigate to the https endpoint -- the browser will complain as expected because you are using a self-signed certificate:

     image

    12) To see the actual ports that were used for your service, you can bring up the Development Fabric UI (right click on the Development Fabric tray icon) and click on the Service Details for your Deployment:

    image

    13) When you are ready to publish to the real cloud, use the SSL Cloud Service settings to select a certificate for Publish – this is the certificate that is used when publishing for deployment.

    Troubleshooting

    • If you see the error “Role start failed for one or more roles” when specifying an https endpoint, most likely this is because you are trying to use a certificate that does not have an exportable private key.
    • If you see the error “can't locate service descriptions”, most likely this is because you attempted to use a non self-signed certificate when running on the Development Fabric.
  • Cloudy in Seattle

    Detecting Design Mode

    • 11 Comments

    A while back, Brian posted an article about how we proposed to implement the equivalent of the Windows Forms Control.DesignMode property in Cider.

    In his follow up article, he talks about how we dropped the original proposal and worked with the WPF team to get the new DesignerProperties class into WPF's PresentationFramework assembly.  This change was driven by the feedback he got on his blog -- your feedback makes a difference.

    So instead of looking up a DependencyProperty through the AppDomain data context by name, you can check the DependencyProperty defined in the DesignerProperties class.  A far better solution that both Cider and Sparkle will support.

    When the designer starts up, it will change the default value of the IsInDesignMode attached property from false to true by overriding its metadata.  For parts of the designer like the Property Browser or adorner layer that will host controls in a runtime context (as part of the design time UI), the IsInDesignMode property will be set to false on those visual trees.

    What that means is that whenever a control is created, it will be able to check whether or not it is in design mode and get back a value of true if that control was created by the designer.  This will, for the most part, be the correct and final value, unless that control is hosted in the Property Browser or Adorner context where the IsInDesignMode property will switch from true to false for that control when it is parented to its final visual tree.

    Getting the IsInDesignMode Property

    Here is a simple example of making a runtime versus design time decision using the DesignerProperties class:

        public class CustomButton : Button
        {
            public CustomButton()
            {
                if (System.ComponentModel.DesignerProperties.GetIsInDesignMode(this))
                {
                    Content = "In Design Mode";
                }
                else
                {
                    Content = "Runtime";
                }
            }
        }

    Note how this code can be run from the constructor.

    Setting the IsInDesignMode Property

    Isn't setting the IsInDesignMode property the responsibility of the designer?  In most cases, yes... but not all cases. 

    Consider the situation where a control has design time adorners that pop up a dialog when clicked.  Since the dialog is not in the visual tree of the adorner layer, it will get the default value of the IsInDesignMode property, which is set to true when the designer starts up. 

    In this case, the control developer will want to set the IsInDesignMode property to false for that dialog so that the controls on that dialog operate in a runtime context.

    The DesignerProperties class has a SetIsInDesignMode method to accommodate this kind of scenario:

    System.ComponentModel.DesignerProperties.SetIsInDesignMode(element, newValue);

    Tracking Changes to the IsInDesignMode Property

    It is important to track changes to the IsInDesignMode property as its initial value may not be its final value -- see the Property Browser and adorner layer examples mentioned above.

    For example, the value of the IsInDesignMode property when your custom control is instantiated by Cider will almost always be true.

    If that control is being used within the Cider Property Browser in an Extended Editor you've written, it will change to false when that control is added into the Property Browser's visual tree, where the IsInDesignMode property is false.

    To track state changes of the IsInDesignMode property, override OnPropertyChanged.
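    A sketch of what that override might look like, reusing the CustomButton from above (what you do on the change is up to your control):

    protected override void OnPropertyChanged(DependencyPropertyChangedEventArgs e)
    {
        base.OnPropertyChanged(e);

        // The attached IsInDesignMode property changed -- for example, the
        // control was just parented into the Property Browser's visual tree.
        if (e.Property == System.ComponentModel.DesignerProperties.IsInDesignModeProperty)
        {
            Content = (bool)e.NewValue ? "In Design Mode" : "Runtime";
        }
    }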

  • Cloudy in Seattle

    Adding Files to your Windows Azure Service Package

    • 11 Comments

    When using the Windows Azure Tools for Visual Studio, there are two times that you end up creating a Windows Azure Service Package:

    1. When you build and run on the Development Fabric -- this is a folder based package, extension is csx.  This is used by the Development Fabric.
    2. When you use the "Publish" feature -- this creates a .cspkg file which is a zipped and encrypted version of the csx built in (1).  This is what you upload to the Cloud.

    This post explains how this process works so that you can have better control over what files end up in the package. 

    The first thing to know is that in both cases above, the contents of the package are created the same way.  The difference is that in case 2, the package is zipped and encrypted.

    Web Role

    The way the Web Role copies files to the Service Package is by using the internal ASP.Net _CopyWebApplication build target.  This build target copies the build outputs and content files.

    In other words, it doesn't copy all of the files in your project to the Service Package.  It has to either be a build output or a content file. 

    If you want your file to be copied, you can add it to the project:

    image

    and set the Build Action from "None":

    image

    To "Content". 

    image

    This marks the file as "Content" and will copy the file to your Service Package (both when debugging in the DevFabric and when you "Publish" to the cspkg).

    The other option you have is to keep the Build Action as "None" but set "Copy to Output Directory" to "Copy if Newer" (or "Copy Always").  The difference: when you set the Build Action to "Content", the file shows up in the root of the Web Role folder, whereas when you set the file to copy to the output directory, it shows up in the bin directory alongside your build output.

    image
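    For reference, the two options end up in the csproj as item entries along these lines (the file names here are hypothetical):

    <ItemGroup>
      <!-- Build Action = Content: copied to the root of the Web Role folder -->
      <Content Include="ReadMe.txt" />
      <!-- Build Action = None, Copy if Newer: copied to the bin directory -->
      <None Include="Extra.xml">
        <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
      </None>
    </ItemGroup>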

    One of the side effects of using the Web Application build target is that linked files are not supported.  We're working with the ASP.Net team to get this fixed in the future.

    Worker Role

    A Worker Role does not use the Web Application target that I referred to above, so the only option to have extra files copied to the Service Package is to keep the Build Action set to None and set Copy to Output Directory to "Copy if newer".

    Assemblies

    For assemblies that you need copied to the output directory, you can always add a reference to that assembly, even if you don't need it directly from your Web or Worker roles.  The reference has a "Copy Local" property which defaults to true.

    One exception: if the referenced assembly is in the GAC, "Copy Local" defaults to false, so you'll have to set it to true manually.

  • Cloudy in Seattle

    Windows Azure Storage Browser in the Visual Studio Server Explorer

    • 6 Comments

    As part of the June 2010 release of the Windows Azure Tools, we now have a Windows Azure Storage browser in the Visual Studio Server Explorer:

    image

    It is our first cut at this feature, and we've been iterating fairly quickly on the Windows Azure Tools, so I'm excited about having this feature not only for what it delivers today but also because it lays the foundation for the future.  In this post, I'll go over what you can and can’t do with the Windows Azure Storage browser and how we added some features to hopefully make it easier for you to handle navigating through large data sets.

    Connecting to a Storage Account

    The development storage node is always available under the Windows Azure Storage node and when opened will also start up the development storage if it isn’t already running.

    To add a storage account in the cloud, right click on the Windows Azure Storage node and select “Add New Account…”

    image

    This will pop up a dialog that allows you to enter your storage account credentials:

    image

    This corresponds to the account name and key that you set up on the Windows Azure Developer Portal, for example:

    image

    In this case “serverexplorer” is the account name and you can use either the primary or secondary access keys as the account key.

    That said, one of our design principles is not to ask the user to enter the same information more than once.  So if you’ve entered storage connection strings in your cloud service, specifically if you've added connection strings as configuration settings in the roles of your Cloud Service, we find those and show them in the combo box.  If you select one, we'll fill out the name and key so that you don't have to re-enter that same information.

    image
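    For reference, such a connection string setting looks something like this (the account key is a placeholder):

    DefaultEndpointsProtocol=https;AccountName=serverexplorer;AccountKey=YOUR-ACCOUNT-KEY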

    Once you hit OK, that new storage account will be shown in the Server Explorer:

    image

    Browsing Blob Storage

    To browse blobs, you can open up the storage account and then open up the “Blobs” node to see a list of the containers in that storage account.

    image 

    By double clicking on a container, you can see all of the blobs that are in that container. 

    One of the things we did to help you handle large data sets is get the blob list 200 blobs at a time.  As the blob list is downloading, you can click to pause or resume the download.

    image

    If you click to pause, it gives you the ability to download a blob, see blob properties (right click on a row and select “Properties”) or enter a filter by blob prefix.

    image

    Filtering by blob prefix occurs on the server side so only the filtered list is returned and shown.

    image
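    The filtering the browser does is the same kind of server-side prefix listing you can do yourself with the StorageClient library; a rough sketch (the connection string, container and prefix are placeholders):

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    CloudStorageAccount account = CloudStorageAccount.Parse(
        "DefaultEndpointsProtocol=https;AccountName=serverexplorer;AccountKey=YOUR-ACCOUNT-KEY");
    CloudBlobClient blobClient = account.CreateCloudBlobClient();

    // "mycontainer/images/" asks the server for blobs in mycontainer whose
    // names start with "images/" -- only the filtered list comes back.
    foreach (IListBlobItem item in blobClient.ListBlobsWithPrefix("mycontainer/images/"))
    {
        // item.Uri identifies each matching blob (or blob directory).
    }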

    Our thought is that by supporting both filtering and pause/resume, you will be able to use the Windows Azure Storage browser with containers that contain a large number of blobs.

    We also support downloading blobs by double clicking on them.  This will add a line item to the Windows Azure Activity Log window in Visual Studio, which we use to track long-running operations related to the Windows Azure Tools.

    After downloading is complete, the blob will be opened in Visual Studio if the file type is supported.

    image 

    One of the hard cuts for this release was the edit/write support.  We really hoped to add the ability to delete blobs and containers, because viewing and deleting really covers the core developer scenario.  Unfortunately, we’ll have to wait for a future release to add that in... but again, I'm excited about the foundation this feature provides, and its integration into Visual Studio makes it really convenient.

    Browsing Table Storage

    Browsing Table Storage works in a very similar way. 

    image

    When you open up a table, we download 200 rows at a time and allow you to pause/resume.  If you pause you can filter using a WCF Data Services $filter query option.

    What you can put in the text box to filter is anything you would put after ‘$filter=’ in a WCF Data Services query URI.  For example, “Address gt ‘989 Redmond Way, Redmond WA’”

    image
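    A couple more illustrative filters (the property names here are hypothetical and must exist on your entities):

    Timestamp gt datetime'2010-06-01T00:00:00'
    PartitionKey eq 'Customers' and Age ge 21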

    Having a table viewer right in Visual Studio now allows you to view Windows Azure Diagnostic trace messages without having to leave Visual Studio.

    image

    Similar to the blob storage viewer, we also had to cut the edit/write capabilities for table storage. 

    Finally

    We are really dying to get the edit/write capability and Queue capability into the product.  Hopefully we’ll be able to schedule it soon!

    We built the Windows Azure Storage browser for you, so let me know what you like, don’t like and what features you want to see next!

  • Cloudy in Seattle

    Silverlight MediaElement Playing a Video stored in Windows Azure Blob Storage

    • 15 Comments

    There are two things that I want to show in this post:

    1. That you can use Silverlight in a Windows Azure Cloud Service
    2. That you can stream a movie progressively via http (more about that here) from Windows Azure Blob storage

    The code is attached to this blog post.  Note that you will have to add and reference the Common and StorageClient projects that come as samples in the Windows Azure SDK.

    What I did is start with my Simple Blob Storage Walkthrough and adapt it to be a Video Store instead.

    image

    The top part of the app is really the same as in the Simple Blob Storage Walkthrough; another column was added to the GridView to provide play buttons.

    When you hit play, the Silverlight control sitting underneath the form part of the page uses a MediaElement to play the video.  The video is played directly from its location in blob storage.

    Adding Silverlight

    Starting where the Simple Blob Storage Walkthrough finished off, let's add Silverlight to that project.  (This assumes that you have the Silverlight Tools installed.  See here for more information.)

    Note that you may have to configure the MIME type in IIS for Silverlight -- you'll know if you get a "Could not download Silverlight application" error:

    Registering MIME type in development fabric

    To ensure solutions containing Silverlight clients work correctly in the development fabric, please ensure that the MIME type for the Silverlight .xap extension is configured correctly in IIS.

    1. Open Internet Information Services (IIS) Configuration Manager and select the server to manage (usually the top-level item on the left side)

    2. Double-click MIME Types on the right side

    3. Add an entry to map the .xap extension to the application/x-silverlight-app MIME type

    Right click on the solution node in the Solution Explorer and select Add -> New Project.  Select "Silverlight Application":

    image

    Name the project VideoPlayer.

    Select to "Link this Silverlight control into an existing Web site" and make sure that your Web Role is selected.  For this walkthrough I chose not to add a test page as I want to add the Silverlight control onto the existing page.

    image

    Solution Explorer will now contain a Silverlight project and will look like this:

    image

    Open up Default.aspx and at the bottom of the page, just after the statusMessage Label you had from the previous walkthrough, add the following snippet to add the Silverlight control onto the page:

            <asp:ScriptManager ID="ScriptManager1" runat="server">
            </asp:ScriptManager>
            <div style="height: 100%;">
                <asp:Silverlight ID="Xaml1" runat="server" Source="~/ClientBin/VideoPlayer.xap" MinimumVersion="2.0.30523"
                    Width="100%" Height="100%" />
            </div>

    You also need to register the Silverlight assembly, otherwise the asp:Silverlight tag will come up as unknown.  You can do this at the top of Default.aspx, right under the "<%@ Page ..." tag:

    <%@ Page Language="C#" AutoEventWireup="true" CodeBehind="Default.aspx.cs" Inherits="DownloadSite_WebRole._Default" %>
    <%@ Register Assembly="System.Web.Silverlight" Namespace="System.Web.UI.SilverlightControls"
        TagPrefix="asp" %>

    You may also have to add a reference to System.Web.Silverlight from your Web Role (it can be found on the .NET tab of the Add References dialog).

    Open up Page.xaml and change the Background property of the Grid to "Green" so that we can see the control on the aspx page.  Hit "F5" to debug.

    You should get the following:

    image

    Cool!  We have Silverlight!

    Coding up the Silverlight Page

    First we need a MediaElement to play the video.  Open up Page.xaml and add the MediaElement tag to the Grid as follows.  While we're at it, let's set the Width and Height of the Page to be 640 x 480 to make the video a little bigger and remove the Background attribute. 

    Name the MediaElement "mediaPlayer".

    <UserControl x:Class="VideoPlayer.Page"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" 
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" 
        Width="640" Height="480">
        <Grid x:Name="LayoutRoot">
            <MediaElement x:Name="mediaPlayer"/>        
        </Grid>
    </UserControl>

    Note: A couple of issues I've been seeing (with the PDC October 2008 CTP) while developing Silverlight applications running on a Windows Azure Web Role: debugging doesn't seem to be working, and the xap in the Web Role isn't always updating as expected, resulting in a stale Silverlight UI on F5.  For the second problem, I found right clicking on the Silverlight project and hitting "Rebuild", then right clicking on the Web Role and hitting "Rebuild" before hitting F5, resolved the problem.  We're actively investigating both issues.

    We'll just keep things really simple.  When the play button is clicked, we'll use JavaScript to call a method on the Silverlight control, passing in the URI of the video to play.

    Let's add a scriptable method to the Page class in Page.xaml.cs:

    using System.Windows.Browser;
    public partial class Page : UserControl
    {
        {. . .}
        [ScriptableMemberAttribute]
        public void Play(string fileUriString)
        {
            if (!string.IsNullOrEmpty(fileUriString))
            {
                Uri fileUri = new Uri(fileUriString);
                mediaPlayer.Source = fileUri;
            }
        }
    }

    If you want to know more about the ScriptableMemberAttribute and how all this works, please see the MSDN article here.

    The second part of making the control scriptable is to register the scriptable object.  That is done in App.xaml.cs in Application_Startup()

    using System.Windows.Browser;
    private void Application_Startup(object sender, StartupEventArgs e)
    {
        Page p = new Page();
        HtmlPage.RegisterScriptableObject("VideoPlayer", p);
    
        this.RootVisual = p;
    }

    Playing the Video

    In order to play the video, we need to add a "play" button to each row of the GridView.

    Open up Default.aspx and, in the Columns for the GridView, add a column at the end for the Play button.  We'll use the <input/> tag for this as we don't want a postback to occur when we click the button (that would re-initialize the Silverlight control).

    <asp:TemplateField>
        <ItemTemplate>
            <input type="button" value="Play"/>
        </ItemTemplate>
    </asp:TemplateField>

    To that button, let's hook up an event handler for when it gets clicked.  I did this programmatically in the RowDataBound event of the fileView GridView control. 

    The reason is that I wanted an easy way to pass the URI for the video to the Silverlight control.  In the RowDataBound event handler, I can get at the URI and pass it along as a parameter. 

    The event handler will be run on the client side in Javascript.

    In Default.aspx.cs in Page_Load(), register to handle the event:

    fileView.RowDataBound += new GridViewRowEventHandler(fileView_RowDataBound);

    Then handle the event by adding the event handler for each of the <input/> buttons that will pass the URI as a parameter.

    void fileView_RowDataBound(object sender, GridViewRowEventArgs e)
    {
        if (e.Row.RowType == DataControlRowType.DataRow)
        {
            FileEntry fe = (FileEntry)e.Row.DataItem;
            e.Row.Cells[3].Attributes.Add("onclick", "onGridViewRowSelected('" + fe.FileUri + "')");
        }
    }

    Finally, back in Default.aspx, we add the Javascript event handler that calls the scriptable method on our Silverlight control:

    <script language="javascript" type="text/javascript">
        function onGridViewRowSelected(fileUri) {
            var control = document.getElementById("Xaml1");
            control.content.VideoPlayer.Play(fileUri);
        }
    </script>

    Hit F5 to give it a try on the Development Fabric/Storage.  Upload a video sample (say from C:\Users\Public\Videos\Sample Videos) and then hit play.

    One final thing to remember is the upload file size limit on the FileUpload control, which is discussed in the Simple Blob Storage Walkthrough.  You're more likely to run into it here with video files.

    image
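    If you do hit the limit, the usual ASP.NET knob is the httpRuntime element in web.config; for example, something like this allows uploads up to 100 MB (maxRequestLength is in KB):

    <system.web>
      <httpRuntime maxRequestLength="102400" executionTimeout="3600" />
    </system.web>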

  • Cloudy in Seattle

    Windows Azure - Resolving "The Path is too long after being fully qualified" Error Message

    • 19 Comments

    When you run a cloud service on the development fabric, the development fabric uses a temporary folder to store a number of files including local storage locations, cached binaries, configuration, diagnostics information and cached compiled web site content.

    By default this location is: C:\Users\<username>\AppData\Local\dftmp

    For the most part you won’t really care about this temporary folder; the Windows Azure Tools will periodically clean up the folder so it doesn’t get out of hand.

    Note: To manually clean up the devfabric temporary folder, you can open an elevated Windows Azure SDK Command Prompt and run: “csrun /devfabric:shutdown” followed by “csrun /devfabric:clean”.  You really don’t need to do this but it can come in handy from time to time.

    There are some cases where the length of the path can cause problems.

    If the combination of your username, cloud service project name, role name and assembly name gets long enough, you can run into assembly or file loading issues at runtime.  When that happens, you will get the following error message when you hit F5:

    “The path is too long after being fully qualified.  Make sure the full path is less than 260 characters and the directory name is less than 248 characters.”

    For example, in my test, the path to one of the assemblies in my cloud service was:

    C:\Users\jnak\AppData\Local\dftmp\s0\deployment(4)\res\deployment(4).CloudServiceabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqr.WebRole1.0\AspNetTemp\aspNetTemp\root\aff90b31\aa373305\assembly\dl3\971d7b9b\0064bc6f_307dca01\Microsoft.WindowsAzure.Diagnostics.DLL

    which exceeds the 260 character path limit.

    If you aren’t married to your project and assembly names, you could name those differently so that they are shorter.

    The other workaround is to change the location of the development fabric temporary folder to be a shorter path.

    You can do this by setting the _CSRUN_STATE_DIRECTORY environment variable to a shorter path, say “C:\A” for example.

    image
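    For example, from a command prompt (any short path will do):

    setx _CSRUN_STATE_DIRECTORY C:\A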

    Make sure that you close Visual Studio and shut down the development fabric by using the “csrun /devfabric:shutdown” command I mentioned above or by clicking “Exit” on the Windows Azure tray icon.

    After making this change, my sample above was able to run without problem.

    Of course, this workaround really only buys you more characters and ultimately you may have to simply reduce your path lengths through renaming.

  • Cloudy in Seattle

    Fix available: ASP.Net MVC RC Crash in a Windows Azure Cloud Service Project

    • 27 Comments

    For those of you that have been unfortunate enough to experience the VS crash when using the ASP.Net MVC RC in a Windows Azure Cloud Service project, I have good news!

    The CLR team has distributed a hotfix.  The KB Article number is 963676 and is posted at http://support.microsoft.com/?kbid=963676

    The hotfix isn't specific to the MVC crash, it solves some crashes in the WPF Designer as well.

    You can download the hotfix here: https://connect.microsoft.com/VisualStudio/Downloads/DownloadDetails.aspx?DownloadID=16827&wa=wsignin1.0

    • For Vista/Windows Server 2008 32 bit: Use Windows6.0-KB963676-x86.msu
    • For Vista/Windows Server 2008 64 bit: Use Windows6.0-KB963676-x64.msu

    [update 8/27/2010] - This hotfix has been superseded: http://support.microsoft.com/kb/981574/.  Unfortunately, it appears we no longer host it on Connect, so you'll have to contact Microsoft support; see the KB article for more information.
