January, 2010

  • Cloudy in Seattle

    Installing Certificates in Windows Azure VMs

    • 7 Comments

    A little while ago I posted How To: Add an HTTPS Endpoint to a Windows Azure Cloud Service which talked about the whole process around adding an HTTPS endpoint and configuring & uploading the SSL certificate for that endpoint.

    This post is a follow up to that post to talk about installing any certificate to a Windows Azure VM.  You may want to do this to install various client certificates or even to install the intermediate certificates to complete the certificate chain for your SSL certificate.

    In order to peek into the cloud, I’ve written a very simple app that will enumerate the certificates of the Current User\Personal (or My) store.

    Create a new Windows Azure Cloud Service and add an ASP.NET Web Role to it. 

    Open up default.aspx and add the following between the empty <div></div> to setup a table that will be used to list the certificates.

    <asp:Table ID="certificateTable" runat="server">
        <asp:TableRow runat="server">
            <asp:TableCell runat="server">Friendly Name:</asp:TableCell>
            <asp:TableCell runat="server">Issued By:</asp:TableCell>
            <asp:TableCell runat="server">Issued To:</asp:TableCell>
            <asp:TableCell runat="server">Expiration Date:</asp:TableCell>
        </asp:TableRow>
    </asp:Table>
    

    Now, I have some simple code that opens up a certificate store and adds a row to the table for each certificate found with the Friendly Name, Issuer, Subject and Expiration date.

    using System.Security.Cryptography.X509Certificates;

    protected void Page_Load(object sender, EventArgs e)
    {
        X509Certificate2Collection selectedCerts = new X509Certificate2Collection();
    
        X509Store store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
        try
        {
            store.Open(OpenFlags.OpenExistingOnly | OpenFlags.ReadOnly);
            foreach (X509Certificate2 cert in store.Certificates)
            {
                TableRow certificateRow = new TableRow();
    
                // Friendly Name
                TableCell friendlyNameCell = new TableCell();
                TextBox friendlyNameText = new TextBox();
                friendlyNameText.Text = cert.FriendlyName;
                friendlyNameCell.Controls.Add(friendlyNameText);
                certificateRow.Cells.Add(friendlyNameCell);
    
                // Issuer
                TableCell issuerCell = new TableCell();
                TextBox issuerText = new TextBox();
                issuerText.Text = cert.Issuer;
                issuerCell.Controls.Add(issuerText);
                certificateRow.Cells.Add(issuerCell);
                
                // Subject
                TableCell subjectCell = new TableCell();
                TextBox subjectText = new TextBox();
                subjectText.Text = cert.Subject;
                subjectCell.Controls.Add(subjectText);
                certificateRow.Cells.Add(subjectCell);
    
                // Expiration
                TableCell expirationCell = new TableCell();
                TextBox expirationText = new TextBox();
                expirationText.Text = cert.NotAfter.ToString("d");
                expirationCell.Controls.Add(expirationText);
                certificateRow.Cells.Add(expirationCell);
                
                // Add the TableRow to the Table
                certificateTable.Rows.Add(certificateRow);
            }
        }
        finally
        {
            store.Close();
        }
    }
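
    As an aside, the same code can peek into a different store; only the X509Store constructor arguments change.  For example (a small variation, not part of the original sample), to enumerate LocalMachine\My instead of CurrentUser\My:

    // Enumerate the LocalMachine\My store instead of CurrentUser\My
    X509Store machineStore = new X509Store(StoreName.My, StoreLocation.LocalMachine);
    machineStore.Open(OpenFlags.OpenExistingOnly | OpenFlags.ReadOnly);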

    Build, publish and deploy this to the cloud and you’ll see that the CurrentUser\MY store is empty. 

    image 

    Now that I have a way to show you the certificates that I install to the cloud, let’s follow the process to install 2 certificates.  One is a self-signed certificate with a private key I created myself and the other is an intermediate certificate I got from the VeriSign web site. 

    Installing a certificate to Windows Azure is a three step process. 

    1. Configure the certificates to install per role
    2. Upload the certificates via the Windows Azure Developer Portal per hosted service
    3. Deploy your app; when the VMs are set up to host each of your role instances, those VMs will contain your certificates.

    Note: The configuration of the certificates is per role, but the upload of the certificates is per hosted service and the uploaded certificates will be used for all of the roles in that hosted service.

    Configure the certificates.  Bring up the configuration UI for the Web Role in your solution, click on “Certificates” and click on “Add Certificate”.

    image

    Now click on the “. . .” to bring up the certificate selection dialog.  This currently enumerates the certificates in the CurrentUser\My store; in future versions it will enumerate the certificates in LocalMachine\My.  The dialog specifies which store it is listing the certificates from.

    image

    In my example, I’m selecting a self-signed cert with an exportable private key that I set up ahead of time.

    This sets the thumbprint field for that certificate.

    image

    Now I have a .CER file from VeriSign which is an intermediate cert.  To create it, I copied a big long chunk of text from their web site and saved it as a .CER file.

    I have a couple of choices on how to handle this: I could install this certificate to the CurrentUser\My store and select it from the VS UI to get the thumbprint.

    Instead, I’m going to open the certificate, switch to Details and copy the Thumbprint from the certificate itself (the point of doing the 2 certificates is to show alternate ways of handling the certificates):

    image
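
    If you’d rather not copy the thumbprint out of the certificate dialog, you can also read it programmatically.  Here is a small sketch (the file path is just a placeholder for wherever you saved the .CER file):

    using System;
    using System.Security.Cryptography.X509Certificates;
    
    class ShowThumbprint
    {
        static void Main()
        {
            // Prints the SHA-1 thumbprint as upper case hex with no spaces
            X509Certificate2 cert = new X509Certificate2(@"C:\certs\intermediate.cer");
            Console.WriteLine(cert.Thumbprint);
        }
    }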

    Now I click “Add Certificate” on the Certificates tab of the VS UI and paste in the thumbprint for my second certificate.  I also changed the names of the certificate entries to “SelfSigned” and “Intermediate” for clarity, but that certainly is not necessary.

    image

    Note: I changed the store location and store name to CurrentUser and My.  Typically you will install intermediate certs to the CA store, but to keep this sample simple I only want to enumerate one store, CurrentUser\My, which as we saw above is empty by default in Windows Azure, so I’m installing the two example certificates there.

    The dropdown for store name currently contains My, Root, CA and Trust.  We do support you opening the service definition file and setting any string for the Store Name that you would like in the event that the 4 in the drop down are not sufficient.

    <Certificate name="SelfSigned" storeLocation="CurrentUser" storeName="<enter a value>" />
    

    I now need to upload these certificates through the Developer Portal which means I need them as files.  I have the .CER file already but need to export my self-signed cert from the certificate store.

    To do so I run certmgr.msc, browse to the certificate, right click | All Tasks | Export…

    image

    This takes me through the Windows Certificate Export Wizard.  One of the key things for SSL certificates, and any other certificates where you need the private key in the cloud, is to ensure that you select “Yes, export the private key” and provide a password during the export.

    image

    After finishing the wizard, I now have 2 certificate files, the .CER file and the .PFX file that was just created.

    image

    Now I go to the Developer portal, and select my Hosted Service.  Down at the bottom I select “Manage” under the “Certificates” heading.

    image 

    This allows me to browse to the PFX file, type the same password I entered when exporting the certificate, and upload that certificate to the cloud.

    image 

    You’ll notice here that the Portal only allows uploading of PFX files but we need to upload a .CER file.  This will change in the future but for now, there is some simple code you can run to convert your .CER file to PFX.  I put the following code in a console application and ran it.

    using System.Security.Cryptography.X509Certificates;

    static void Main(string[] args)
    {
        // The path to the certificate.
        string certificate = @"C:\Users\jnak.REDMOND\Desktop\verisignintermediate.cer";
    
        // Load the certificate into an X509Certificate object.
        X509Certificate cert = new X509Certificate(certificate);
        byte[] certData = cert.Export(X509ContentType.Pkcs12);
    
        System.IO.File.WriteAllBytes(@"C:\Users\jnak.REDMOND\Desktop\verisignintermediate.pfx", certData);
    }
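
    As an aside, if you did want the generated PFX to be password protected (not needed here, since a .CER has no private key to protect), the Export overload that takes a password works the same way:

    // Optional: protect the exported PFX with a password of your choosing
    byte[] certData = cert.Export(X509ContentType.Pkcs12, "a-password-of-your-choice");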

    Now that I have a PFX file for my CER file, I upload it, leaving the password field blank: the password is only used to protect private keys, which CER files do not have, and therefore neither does this PFX file.

    You can see here that both certificates are now installed to my Hosted Service.

    image

    Now I deploy my Cloud Service which will enumerate the two certificates I installed to the CurrentUser\My store.  If you deployed the app already as I did above, just suspend and delete that deployment and deploy the app again with the new configuration.  (Both the service definition and service configuration files have changed, so the package (cspkg) and service configuration (cscfg) need to be deployed again to Windows Azure.)

    After you complete the deploy process and your app is running on Windows Azure, you’ll now see the following:

    image

    This shows that the VMs in the cloud for your role have the certificates you configured and uploaded installed.

    Finally I’ll reiterate that certificate configuration is on a per role basis.  As an example, if I added a worker role to this same Cloud Service, the VMs for the worker role instances would not contain the certificates we configured above unless you opened up the configuration UI for the Worker role and added the same entries to the Certificates tab.

    That said, because the certificates are uploaded per hosted service, you do not need to upload them to the cloud more than once for a given Cloud Service.

  • Cloudy in Seattle

    Windows Azure Debugging: Matching an Instance in the DevFabric to its Process

    • 1 Comments

    One of the things I run into from time to time is the need/desire to match an instance I’m watching in the Development Fabric UI back to its process.

    This may be because I am sitting on a breakpoint in Visual Studio and want to look at the corresponding logs for that process in the DevFabric UI, or because I want to use some non-Windows Azure aware tool that requires a process ID.

    For example, given a Web and a Worker role with 3 instances each, I can watch the instances in the development fabric UI by right clicking on the Windows Azure task tray icon and selecting “Show Development Fabric UI”:

    image

    image

    This is useful for a couple of reasons: sometimes I want to double check the status of my roles and instances, but more commonly I want to watch the logs on a per-instance basis.

    Although the logs are shown in the Visual Studio output window, all of the messages for all of the instances show up there:

    image

    Which is fine for the common case where I’m debugging a single instance at a time but not so good as soon as I have more than one instance.

    Let’s set up an example: create a new Windows Azure Cloud Service in Visual Studio, add an ASP.NET Web Role and a Worker Role, and you’ll have the following in Solution Explorer:

    image 

    Set the instance count to 3 for each role by bringing up the role configuration UI for each role:

    image

    and setting “Instance count” to 3.

    image

    Now I’m going to set a breakpoint in the WorkerRole.cs file in the Worker Role project.  You can see that the default worker role code just loops and writes a trace message.

    image

    Hit F5 to start debugging the Cloud Service on the Development Fabric.

    When I hit a breakpoint – I can see that Visual Studio is debugging multiple WaWorkerHost.exe and WaWebHost.exe processes – these correspond to Worker and Web role instances respectively.

    image

    But which process corresponds to which logs in the Development Fabric UI? 

    That is, if I’m stopped on a breakpoint for a WaWorkerHost.exe process with ID 4460 – how do I know which log output in the Development Fabric UI corresponds to the process I’m currently stopped on?

    As it turns out, there are two ways I can get the information I want -- which is the process ID for the instances I’m looking at in the Development Fabric.

    Option 1: Check the log file.  If you look carefully, the log file will have the text “-parent xyz” where xyz is the process ID of the WaWorkerHost.exe or WaWebHost.exe that is hosting your worker or web role instance.

    image

    Option 2: Use the “Attach debugger…” functionality from the context menu in the Development Fabric.  Right click on one of the instance nodes in the Development Fabric and select “Attach debugger…”

    image 

    This will bring up a VS dialog that has both the process name and the process ID on it. 

    image

    I can then use that process ID to match an instance in the DevFabric UI to the process I’m stopped on in Visual Studio.

    Note: If you aren’t debugging your cloud service from Visual Studio and you choose to use the DevFabric UI “Attach debugger…” option: on x64 systems, this functionality only attaches the native debugger, so it will appear as though you can’t hit breakpoints in managed code even though you are attached.

    To work around this issue, make note of the process ID and then click “No, cancel debugging”.  Open up Visual Studio and select Debug | Attach to Process…

    image

    Select the process with the matching process ID and click “Attach”.  To explicitly select a debugger, before clicking “Attach”, click the “Select…” button which will allow you to specify which debugger you want to attach.

    image 
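
    A small suggestion of my own (not something the tools do for you): you can have each instance trace its own process ID so its log output in the DevFabric UI identifies the hosting process directly.  For example, a variation on the default worker role’s Run() loop:

    using System.Diagnostics;
    using System.Threading;
    
    public override void Run()
    {
        // Include the hosting process ID in every trace message
        int pid = Process.GetCurrentProcess().Id;
    
        while (true)
        {
            Thread.Sleep(10000);
            Trace.WriteLine("Working in process " + pid, "Information");
        }
    }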

    So far, debugging multiple instances on the development fabric isn’t done nearly as much as single-instance debugging; however, I have a feeling that as more and more apps adopt the Windows Azure scale-out model, there are going to be more cases where developers want to test/debug multi-instance scenarios locally.

    To that end, I hope this helps and let me know if you think we need to do more for this scenario.

  • Cloudy in Seattle

    Windows Azure Instance & Storage Limits

    • 0 Comments

    Recently, a colleague of mine wrote about the Windows Azure instance limits: http://blog.toddysm.com/2010/01/windows-azure-role-instance-limits-explained.html

    His post is very complete, I recommend you have a look but here is my take:

    These are default limits that are in place to ensure that Windows Azure will always have VMs available to all of our customers.  If you have a need for more capacity, we want to help!  Please contact us: http://go.microsoft.com/fwlink/?LinkID=123579

    The limits are:

    • 20 Hosted Service Projects
    • 5 Storage Accounts
    • 5 roles per Hosted Service (i.e. 3 different web roles + 2 different worker roles or any such combination)
    • 20 CPU cores across all of your Hosted Service Projects

    The first two are really easy to track: on the Developer Portal, when you go to create a new service, it’ll tell you how many you have left of each:

    image

    5 roles per Hosted Service is also easy to understand: this corresponds to the number of projects you can add as roles to your Cloud Service – here I am hitting my role limit:

    image

    So let’s talk real quick about the 20 CPU core limit – note that the limit is on CPU cores, not on instances. 

    When you configure your role, you can set the number of instances as well as the VM size:

    image

    The VM sizes of Small, Medium, Large and ExtraLarge are defined here: http://msdn.microsoft.com/en-us/library/ee814754.aspx

    Today the CPU cores for each VM size are: (subject to change so always consult the MSDN link above for the latest information)

    VM Size      CPU Cores
    Small        1
    Medium       2
    Large        4
    ExtraLarge   8

    So the number of CPU cores for a role is the (instance count) X (Number of CPU Cores for the selected VM Size).

    If you add those up across all of your roles in all of your Hosted Service projects (both staging and production slots), the total cannot exceed 20.

    Quick example: if you have 5 Hosted Service projects with 1 role, 2 instances per role and Medium VM size, you’ve hit the limit.
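
    To make the math concrete, here is that example as a quick calculation (the numbers are hypothetical and just mirror the scenario above):

    // 5 hosted services, each with 1 role running 2 Medium (2-core) instances
    int hostedServices = 5;
    int instancesPerRole = 2;
    int coresPerInstance = 2;   // Medium VM size
    
    int totalCores = hostedServices * instancesPerRole * coresPerInstance;   // = 20, the default limit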

    The other key point is that stopping your deployment is not enough to free up CPU cores; you also need to delete the deployment to reduce your CPU core count.

    What about Windows Azure Storage quotas? 

    It just so happens that another colleague of mine has written about this: http://blogs.msdn.com/frogs69/archive/2009/12/17/storage-quotas-and-core-allocation-on-windows-azure.aspx

    Each storage account allows you to have 100TB of data across all of your blobs, tables and queues.  As mentioned above, you can have up to 5 storage accounts.

    If you are dealing with really large data sets, follow the link above to see the limits on blobs, the number of properties in a table entity, and queue messages.

  • Cloudy in Seattle

    Windows Azure WCF Add Service Reference Patch and Windows 7

    • 0 Comments

    For those of you that watched my Tips & Tricks session at PDC or followed along on my blog post, you'll recall that I mentioned the following:

    In general, WCF works correctly on Windows Azure.  There is a problem using "add service reference" or svcutil but we have a patch to work around the problem.  The patch is installed in the cloud, and more information about this is here: http://code.msdn.microsoft.com/wcfazure/Wiki/View.aspx?title=KnownIssues (note that a web.config change is also required)

    One of the things this prompted a lot of folks to ask is "Where is the Windows 7 version of this patch?"

    Well, I'm happy to announce that we have released this QFE for Windows 7 and Windows Server 2008 R2: http://code.msdn.microsoft.com/KB977420

    I also recommend that if you are using WCF on Windows Azure, you spend time browsing the content on http://code.msdn.microsoft.com/wcfazure.

    You may also be interested in the REST service templates the WCF team has made available on the Visual Studio Gallery: http://visualstudiogallery.msdn.microsoft.com/en-us/842a05c3-4cc8-49d3-837f-5ec7e6b17e80 (this is the .NET 3.5 C# template, there are also .NET 3.5 VB and .NET 4 C# and VB templates)

    Note that the REST templates aren't currently directly supported as a Windows Azure Role in the New Project Dialog, but you can easily use that template in a Windows Azure Cloud Service by following the "Using an Existing Project" section of this post.

  • Cloudy in Seattle

    Windows Azure - Resolving "The Path is too long after being fully qualified" Error Message

    • 19 Comments

    When you run a cloud service on the development fabric, the development fabric uses a temporary folder to store a number of files including local storage locations, cached binaries, configuration, diagnostics information and cached compiled web site content.

    By default this location is: C:\Users\<username>\AppData\Local\dftmp

    For the most part you won’t really care about this temporary folder, the Windows Azure Tools will periodically clean up the folder so it doesn’t get out of hand.

    Note: To manually clean up the devfabric temporary folder, you can open an elevated Windows Azure SDK Command Prompt and run: “csrun /devfabric:shutdown” followed by “csrun /devfabric:clean”.  You really don’t need to do this but it can come in handy from time to time.

    There are some cases where the length of the path can cause problems.

    If the combination of your username, cloud service project name, role name and assembly name gets too long, you can run into assembly or file loading issues at runtime.  In that case, you'll get the following error message when you hit F5:

    “The path is too long after being fully qualified.  Make sure the full path is less than 260 characters and the directory name is less than 248 characters.”

    For example, in my test, the path to one of the assemblies in my cloud service was:

    C:\Users\jnak\AppData\Local\dftmp\s0\deployment(4)\res\deployment(4).CloudServiceabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqr.WebRole1.0\AspNetTemp\aspNetTemp\root\aff90b31\aa373305\assembly\dl3\971d7b9b\0064bc6f_307dca01\Microsoft.WindowsAzure.Diagnostics.DLL

    which exceeds the 260 character path limit.

    If you aren’t married to your project and assembly names, you could rename them so that they are shorter.

    The other workaround is to change the location of the development fabric temporary folder to be a shorter path.

    You can do this by setting the _CSRUN_STATE_DIRECTORY environment variable to a shorter path, say “C:\A” for example.

    image

    Make sure that you close Visual Studio and shut down the development fabric by using the “csrun /devfabric:shutdown” command I mentioned above or by clicking “Exit” on the Windows Azure tray icon.

    After making this change, my sample above was able to run without problem.

    Of course, this workaround really only buys you more characters and ultimately you may have to simply reduce your path lengths through renaming.

  • Cloudy in Seattle

    Walkthrough: Windows Azure Blob Storage (Nov 2009 and later)

    • 27 Comments

    Similar to the table storage walkthrough I posted last week, I updated this blog post for the Nov 2009/v1.0 and later release of the Windows Azure Tools.

    This walkthrough covers what I found to be the simplest way to get a sample up and running on Windows Azure that uses the Blob Storage service. It is not trying to be comprehensive or trying to dive deep in the technology; it just serves as an introduction to how the Windows Azure Blob Storage Service works.

    Please take the Quick Lap Around the Tools before doing this walkthrough.

    Note: The code for this walkthrough is attached to this blog post.

    After you have completed this walkthrough, you will have a Web Role that is a simple ASP.NET Web Application that shows a list of files that are stored and can be downloaded from Blob Storage. You can use the Web Role to add files to Blob storage and make them available in the list.

    image

    Blob Concepts

    Each storage account has access to blob storage. For each account there can be 0..n containers. Each container contains the actual blobs, each of which is a raw byte array. Containers can be public or private. In a public container, the URLs to the blobs can be accessed over the internet, while in a private container, only the account holder can access those blob URLs.

    Each Blob can have a set of metadata set as a NameValueCollection of strings.

    image

    Creating the Cloud Service Project

    1. Start Visual Studio as an administrator (Screen shots below are from VS 2008, VS 2010 is also supported)

    2. Create a new project: File | New Project

    3. Under the Visual C# node (VB is also supported), select the “Cloud Service” project type then select the “Windows Azure Cloud Service” project template. Set the name to be “SimpleBlobSample”. Hit OK to continue.

    image

    This will bring up a dialog to add Roles to the Cloud Service.

    4. Add an ASP.NET Web Role to the Cloud Service, we’ll use the default name of “WebRole1”.  Hit OK.

    image

    Solution explorer should look as follows:

    image

    We’ll now cover the implementation, which can be broken up into 5 different parts:

    1. Implementing the UI
    2. Connecting to Windows Azure storage
    3. Adding blobs
    4. Enumerating existing blobs
    5. Deleting blobs

    Implementing the UI

    5. Next open up Default.aspx and add the code for the UI. The UI consists of:

    • GridView at the top
    • Label and FileUpload control
    • 2 Label and TextBox pairs (File Name and Submitter)
    • Field validators to ensure that all of the fields are filled out before the file is uploaded.

    Add the following between the template generated <div></div> elements:

    <asp:GridView ID="fileView" AutoGenerateColumns="false" DataKeyNames="FileUri" runat="server"
        OnRowCommand="RowCommandHandler">
        <Columns>
            <asp:ButtonField Text="Delete" CommandName="DeleteItem" />
            <asp:HyperLinkField HeaderText="Link" DataTextField="FileName" DataNavigateUrlFields="FileUri" />
            <asp:BoundField DataField="Submitter" HeaderText="Submitted by" />
        </Columns>
    </asp:GridView>
    <br />
    <asp:Label ID="filePathLabel" Text="File Path:" AssociatedControlID="fileUploadControl"
        runat="server" />
    <asp:FileUpload ID="fileUploadControl" runat="server" />
    <asp:RequiredFieldValidator ID="filUploadValidator" ControlToValidate="fileUploadControl"
        ValidationGroup="fileInfoGroup" ErrorMessage="Select a File" runat="Server">
    </asp:RequiredFieldValidator>
    <br />
    <asp:Label ID="fileNameLabel" Text="File Name:" AssociatedControlID="fileNameBox"
        runat="server" />
    <asp:TextBox ID="fileNameBox" runat="server" />
    <asp:RequiredFieldValidator ID="fileNameValidator" ControlToValidate="fileNameBox"
        ValidationGroup="fileInfoGroup" ErrorMessage="Enter the File Name" runat="Server">
    </asp:RequiredFieldValidator>
    <br />
    <asp:Label ID="submitterLabel" Text="Submitter:" AssociatedControlID="submitterBox"
        runat="server" />
    <asp:TextBox ID="submitterBox" runat="server" />
    <asp:RequiredFieldValidator ID="submitterValidator" ControlToValidate="submitterBox"
        ValidationGroup="fileInfoGroup" ErrorMessage="Enter the Submitter Name" runat="Server">
    </asp:RequiredFieldValidator>
    <br />
    <asp:Button ID="insertButton" Text="Submit" CausesValidation="true" ValidationGroup="fileInfoGroup"
        runat="server" OnClick="insertButton_Click" />
    <br />
    <br />
    <asp:Label ID="statusMessage" runat="server" />

    6. If you now switch to design view, you will see:

    image

    You’ll also notice from that aspx that there is an event handler for OnRowCommand on the GridView to handle the DeleteItem command, IDs for the TextBoxes and an event handler for the OnClick event on the Submit button.

    The code for these will be filled out further down in the walkthrough.

    Connecting to Windows Azure storage

    7. Open Default.aspx.cs and add the code to connect to the Blob Storage Service to the Page_Load() method.

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;
    
    namespace WebRole1
    {
        public partial class _Default : System.Web.UI.Page
        {
            private CloudBlobClient _BlobClient = null;
            private CloudBlobContainer _BlobContainer = null;
    
            protected void Page_Load(object sender, EventArgs e)
            {
                // Setup the connection to Windows Azure Storage
                var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
                _BlobClient = storageAccount.CreateCloudBlobClient();

                // (Page_Load will be completed in the steps below)
            }
        }
    }

    8.  Set up the “DataConnectionString” setting by opening up the configuration UI for WebRole1.  Right click on the WebRole1 node under the Roles folder in the SimpleBlobSample cloud service project and select “Properties”.

    image

    9. Switch to the Settings tab and click “Add Setting”.  Name it DataConnectionString, set the type to ConnectionString and click on the “…” button on the far right.

    image

    Hit “OK” to set the credentials to use Development Storage.  We’ll first get this sample working on development storage then convert it to use cloud storage later.

    10. If you actually tried to connect to Blob Storage at this point, you would find that the CloudStorageAccount.FromConfigurationSetting() call would fail with the following message:

    ConfigurationSettingSubscriber needs to be set before FromConfigurationSetting can be used

    This message is in fact incorrect – I have a bug filed to get this fixed, to say “Configuration Setting Publisher” and not “ConfigurationSettingSubscriber”.

    To resolve this, we need to add a bit of template code to the WebRole.cs file in WebRole1.  Add the following to the WebRole.OnStart() method in WebRole.cs in the WebRole1 project.

    using Microsoft.WindowsAzure;

    public override bool OnStart()
    {
        DiagnosticMonitor.Start("DiagnosticsConnectionString");
    
        #region Setup CloudStorageAccount Configuration Setting Publisher
    
        // This code sets up a handler to update CloudStorageAccount instances when their corresponding
        // configuration settings change in the service configuration file.
        CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
        {
            // Provide the configSetter with the initial value
            configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
    
            RoleEnvironment.Changed += (sender, arg) =>
            {
                if (arg.Changes.OfType<RoleEnvironmentConfigurationSettingChange>()
                    .Any((change) => (change.ConfigurationSettingName == configName)))
                {
                    // The corresponding configuration setting has changed, propagate the value
                    if (!configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)))
                    {
                        // In this case, the change to the storage account credentials in the
                        // service configuration is significant enough that the role needs to be
                        // recycled in order to use the latest settings. (for example, the 
                        // endpoint has changed)
                        RoleEnvironment.RequestRecycle();
                    }
                }
            };
        });
        #endregion
    
        // For information on handling configuration changes
        // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.
        RoleEnvironment.Changing += RoleEnvironmentChanging;
    
        return base.OnStart();
    }

    The comments (which I wrote and are included in the samples) should explain what is going on if you care to know.  In a nutshell, this code bridges the gap between the Microsoft.WindowsAzure.StorageClient assembly and the Microsoft.WindowsAzure.ServiceRuntime library – the Storage Client library is agnostic to the Windows Azure runtime as it can be used in non-Windows Azure applications.

    This code essentially says how to get a setting value given a setting name and sets up an event handler to handle setting changes while running in the cloud (because the ServiceConfiguration.cscfg file was updated in the cloud).

    The key point is that with this snippet of code in place, you now have everything in place to connect to Windows Azure storage and create a CloudBlobClient instance.
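
    As an aside, if your application never needs to react to configuration changes while it is running, a minimal publisher (a simplification of the snippet above, not the version shipped in the samples) boils down to:

    CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
        configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)));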

    Adding Blobs

    11. In order to add blobs to a container, you first need to set up a container.  Let’s add this code to the Page_Load() method in Default.aspx.cs.  For a production application, you will want to optimize this code to avoid doing all this work on every page load.

    Page_Load() should now look as follows.

    protected void Page_Load(object sender, EventArgs e)
    {
        // Setup the connection to Windows Azure Storage
        var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
        _BlobClient = storageAccount.CreateCloudBlobClient();
    
        // Get and create the container
        _BlobContainer = _BlobClient.GetContainerReference("publicfiles");
        _BlobContainer.CreateIfNotExist();
    
        // Setup the permissions on the container to be public
        var permissions = new BlobContainerPermissions();
        permissions.PublicAccess = BlobContainerPublicAccessType.Container;
        _BlobContainer.SetPermissions(permissions);
    
        // Show the current list.
        UpdateFileList();
    }

    Note: The container is named with DNS naming restrictions (i.e. all lower case) and is created if it does not exist.  Additionally, the container is set to be a public container – i.e. the URIs to the blobs are accessible by anyone over the internet. 

    Had this been a private container, the blobs in that container could only be read by code that has the access key and account name.
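
    For completeness, keeping the container private is the same call with a different access type; a minimal variation on the Page_Load code above:

    // Keep the container private: only callers with the account name and key
    // can read these blobs
    var permissions = new BlobContainerPermissions();
    permissions.PublicAccess = BlobContainerPublicAccessType.Off;
    _BlobContainer.SetPermissions(permissions);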

    12.  Let’s now add the code to Default.aspx.cs to add a blob when the “Submit” button on the UI is clicked (remember the event handler was defined in the aspx).

    A GUID is created for the file name to ensure a unique blob name is used.  The file name and submitter are taken from the TextBoxes in the UI.

    Blob Metadata, or user defined key/value pairs, is used to store the file name and submitter along with the blob. 

    protected void insertButton_Click(object sender, EventArgs e)
    {
        // Make a unique blob name
        string extension = System.IO.Path.GetExtension(fileUploadControl.FileName);
    
        // Create the Blob and upload the file
        var blob = _BlobContainer.GetBlobReference(Guid.NewGuid().ToString() + extension);
        blob.UploadFromStream(fileUploadControl.FileContent);
    
        // Set the metadata into the blob
        blob.Metadata["FileName"] = fileNameBox.Text;
        blob.Metadata["Submitter"] = submitterBox.Text;
        blob.SetMetadata();
    
        // Set the properties
        blob.Properties.ContentType = fileUploadControl.PostedFile.ContentType;
        blob.SetProperties();
    
        // Update the UI
        UpdateFileList();
        fileNameBox.Text = "";
        statusMessage.Text = "";
    }

    Enumerating Existing Blobs

    13. In order to simplify the databinding to the UI, let’s add a FileEntry class that contains the data we want to show in the UI for each blob.  One instance of a FileEntry corresponds to a blob. 

    Right click on WebRole1 and select Add | Class…

    image

    Name the class FileEntry.cs and hit OK.

    14. Fill out FileEntry.cs with the following code:

    public class FileEntry
    {
        public Uri FileUri { get; set; }
        public string FileName { get; set; }
        public string Submitter { get; set; }
    }

    15. Back to Default.aspx.cs, let’s add code to populate the GridView by getting the collection of blobs from ListBlobs() and creating a FileEntry for each item.

    FetchAttributes() is used to retrieve the blob metadata.

    using System.Collections.Generic;
    using System.Collections.Specialized;

    private void UpdateFileList()
    {
        // Get a list of the blobs
        var blobs = _BlobContainer.ListBlobs();
        var filesList = new List<FileEntry>();
    
        // For each item, create a FileEntry which will populate the grid
        foreach (var blobItem in blobs)
        {
            var cloudBlob = _BlobContainer.GetBlobReference(blobItem.Uri.ToString());
            cloudBlob.FetchAttributes();
    
            filesList.Add(new FileEntry() { 
                FileUri = blobItem.Uri,
                FileName = cloudBlob.Metadata["FileName"],
                Submitter = cloudBlob.Metadata["Submitter"]
            });    
        }
        
        // Bind the grid
        fileView.DataSource = filesList;
        fileView.DataBind();
    }

    Deleting Blobs

    16. Add code to delete the blob in the row command handler that was set up in the aspx. This is as simple as calling DeleteIfExists() on the CloudBlob instance, which is retrieved using the Guid + file extension name generated during the upload.

    protected void RowCommandHandler(object sender, GridViewCommandEventArgs e)
    {
        if (e.CommandName == "DeleteItem")
        {
            var index = Convert.ToInt32(e.CommandArgument);
            var blobName = (string)fileView.DataKeys[index].Value;
            var blobContainer = _BlobClient.GetContainerReference("publicfiles");
            var blob = blobContainer.GetBlobReference(blobName);
            blob.DeleteIfExists();
        }
    
        // Update the UI
        UpdateFileList();
    }

    Testing the Application

    17. Build and hit F5 to run the application.

    Note: The FileUpload control has a file size limit. You can modify it by changing the httpRuntime maxRequestLength attribute in web.config.

    image

    Moving from Development Storage to Cloud Storage

    18. Now I want to switch this to use Windows Azure Storage, not the development storage.  The first step is to go to the Windows Azure Developer Portal and create a storage account.

    Login and click on “New Service”, and select “Storage Account”:

    image

    Fill out the service name, the public name and optionally choose a region.  You will be brought to a page that contains the following (note I rubbed out the access keys):

    image

    19. You will use the first part of the endpoint (jnakstorageaccount) and one of the access keys to fill out your connection string.

    20. Open the WebRole1 config again, bring up Settings | DataConnectionString and fill out the account name and the account key and hit OK.

    image

    21. Hit F5 to run the application again. 

    This time you will be running against cloud storage – note that the data you entered when running against the development storage is no longer there.

    Important Note: Before deploying to the cloud, the DiagnosticsConnectionString also needs to use storage account credentials and not the development storage account.

    Please see the Deploying a Cloud Service walkthrough to learn how to deploy this to the Windows Azure cloud.

    For more information, please see the Blob Service API documentation and the Programming Blob Storage white paper.

    I know that there are a number of different concepts that have to be pieced together; hopefully this walkthrough has been helpful in understanding how everything fits together.

  • Cloudy in Seattle

    Windows Azure Article in Visual Studio Magazine

    • 2 Comments

    Good article in Visual Studio Magazine this month on Windows Azure:  http://visualstudiomagazine.com/Articles/2010/01/01/App-Dev-from-the-Ground-Up.aspx

    Ok, I might be a bit biased -- I get a few mentions in the article that I'm pretty jazzed about :)

    The quotes are from my session at PDC '09.  In case you missed it, you can always catch it here: Tips and Tricks for Using Visual Studio 2010 to Build Applications that Run on Windows Azure

    Kathleen, if you read this, thanks for the quotes!

  • Cloudy in Seattle

    Walkthrough: Windows Azure Table Storage (Nov 2009 and later)

    • 11 Comments

    This walkthrough covers what I found to be the simplest way to get a sample up and running on Windows Azure that uses the Table Storage Service. It serves as an introduction to both Windows Azure cloud services and table storage.  Although there is a wealth of information out there on Windows Azure, I try to tie together a lot of that information for folks of all levels to consume.

    I originally wrote this walkthrough well over a year ago for our very first public release of Windows Azure at PDC ‘08.

    Much has changed in the last year, and this post is an update to that original post that will work with our PDC ‘09 and v1.0 release of the Windows Azure Tools.

    So what's changed from a dev perspective?  Overall not a lot, mainly because table storage leverages ADO.NET Data Services and that is the core of how you work with Table Storage.  The way you connect to Windows Azure Storage has changed, namespaces and class names have changed, and there have been a few other tweaks.

    To be clear, this post is not trying to be comprehensive or trying to dive deep in the technology; it just serves as an introduction to how the Table Storage Service works.  Also, please take a look at the Quick Lap Around the Tools before doing this walkthrough.

    Note: The code for this walkthrough is attached to this blog post.

    After you have completed this walkthrough, you will have a Web Role that is a simple ASP.NET Web Application that shows a list of Contacts and allows you to add to and delete from that list. Each contact will have simplified information: just a name and an address (both strings).

    image

    Table Storage Concepts

    The Windows Azure Table Storage Service provides queryable structured storage. Each account can have 0..n tables, i.e. an unlimited number of tables and entities, with no limit on table size (the combined size of an entity cannot exceed 1MB, however).

    image

    Each entity in a table always has three properties, the PartitionKey, the RowKey and Timestamp, that are not shown above for space/legibility reasons.  Together, the PartitionKey and RowKey form a unique key for an entity. 

    Additionally, the PartitionKey and RowKey currently form the only index, and all results are returned sorted by PartitionKey and then by RowKey.

    Design of the Sample

    When a request comes in to the UI, it makes its way to the Table Storage Service as follows (click for larger size):

    clip_image002

    The UI class (the aspx page and its code behind) is data bound through an ObjectDataSource to the WebRole1.ContactDataSource, which creates the connection to the Table Storage service, gets the list of Contacts, and inserts to and deletes from Table Storage.

    The WebRole1.ContactDataModel class acts as the data model object and the WebRole1.ContactDataServiceContext derives from TableServiceContext which handles the authentication process and allows you to write LINQ queries, insert, delete and save changes to the Table Storage service.

    Creating the Cloud Service Project

    1. Start Visual Studio as an administrator (Screen shots below are from VS 2008, VS 2010 is also supported)

    2. Create a new project: File | New Project

    3. Under the Visual C# node (VB is also supported), select the “Cloud Service” project type then select the “Windows Azure Cloud Service” project template. Set the name to be “SimpleTableSample”. Hit OK to continue.

    image

    This will bring up a dialog to add Roles to the Cloud Service.

    4. Add an ASP.NET Web Role to the Cloud Service, we’ll use the default name of “WebRole1”.  Hit OK.

    image

    Solution explorer should look as follows:

    image

    5. We now want to set up the data model for the entity.  Right-click on WebRole1 in the Solution Explorer and select “Add Class”.  Call this class “ContactDataModel.cs” and hit OK.

    6.  Add a using directive to the storage client library – a .NET library for using Windows Azure Storage.  The assembly reference was already added by Visual Studio.

    using Microsoft.WindowsAzure.StorageClient;

    7. Make the ContactDataModel class derive from the TableServiceEntity class. This brings in the PartitionKey, RowKey and Timestamp properties. (not necessary to derive from TableServiceEntity, but a convenience)

    8. For simplicity, we’ll just assign a new Guid as the PartitionKey to ensure uniqueness even though generally a GUID is not a good partition key.  If we were really building an address book, using a partition key that maps to a popular search field would be a good approach (contact name for example). 

    In this case, since the PartitionKey is set to a new GUID and the RowKey is set to a constant hard coded value (String.Empty), the storage system distributes the data over many storage nodes, prioritizing scalability (spreading load over multiple servers) over the faster performance of operations on multiple entities in a single partition (entity locality).

    The key message here is that you’ll want to think about and make the right decision for your application/scenarios.  To learn more read the Programming Table Storage White Paper on windowsazure.com.

    public class ContactDataModel : TableServiceEntity
    {
        public ContactDataModel(string partitionKey, string rowKey)
            : base(partitionKey, rowKey)
        {
        }
    
        public ContactDataModel(): this(Guid.NewGuid().ToString(), String.Empty)
        {
        }
    
        public string Name { get; set; }
        public string Address { get; set; }
    }

    9. Now add the ContactDataServiceContext class, which derives from TableServiceContext, to the Web Role.  Right click on WebRole1 and select Add | Class…  Name the class ContactDataServiceContext.cs.

    10.  Add the using directives.

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    11. Set the base class to be TableServiceContext.

    12. We’ll use the ContactDataServiceContext later to write queries, insert, remove and save changes to the table storage. One of the key things it does is provide the IQueryable<T> property that corresponds to a table.

    public class ContactDataServiceContext : TableServiceContext
    {
        public ContactDataServiceContext(string baseAddress, StorageCredentials credentials)
            : base(baseAddress, credentials)
        {
        }
    
        public const string ContactTableName = "ContactTable";
    
        public IQueryable<ContactDataModel> ContactTable
        {
            get
            {
                return this.CreateQuery<ContactDataModel>(ContactTableName);
            }
        }
    }

    Every IQueryable<T> property corresponds to a table in table storage.
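
    Tying this back to the PartitionKey discussion in step 8: as an illustration (hypothetical, not part of this sample), if the contact name were used as the PartitionKey, a query for a single contact would be served from one partition:

    // Hypothetical: a query that targets a single partition when the contact
    // name is used as the PartitionKey
    var results = from c in _ServiceContext.ContactTable
                  where c.PartitionKey == "Smith"
                  select c;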

    13. Let’s now add the ContactDataSource class. We'll fill this class out over the course of the next few steps.  This is the class that does all the hookup between the UI and the table storage service.  Right click on WebRole1 | Add | Class… and enter the file name to be “ContactDataSource.cs”.

    14. Add a reference to the System.Data.Services.Client assembly.  Right click on the References node under WebRole1 and select Add Reference…

    image

    Then scroll down in the list and select System.Data.Services.Client and click OK.

    image

    15. Now add the using directives to the top of the ContactDataSource.cs file.

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;
    using System.Data.Services.Client;

    16. For simplicity, use the instantiation of the ContactDataSource class as the location to set up the connection to Windows Azure Storage.  This involves reading a connection string from the Windows Azure settings and creating the ContactDataServiceContext with that connection information.

    public class ContactDataSource
    {
        private ContactDataServiceContext _ServiceContext = null;
    
        public ContactDataSource()
        {
            var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
            _ServiceContext = new ContactDataServiceContext(storageAccount.TableEndpoint.ToString(), storageAccount.Credentials);
        }
    }

    17.  Set up the “DataConnectionString” setting by opening up the configuration UI for WebRole1.  Right click on the WebRole1 node under the Roles folder in the SimpleTableSample cloud service project and select “Properties”.

    image

    18. Switch to the Settings tab and click “Add Setting”.  Name it DataConnectionString, set the type to ConnectionString and click on the “…” button on the far right.

    image

    Hit “OK” to set the credentials to use Development Storage.  We’ll first get this sample working on development storage then convert it to use cloud storage later.

    19. If you actually instantiated the ContactDataSource and ran this app, you would find that the CloudStorageAccount.FromConfigurationSetting() call would fail with the following message:

    ConfigurationSettingSubscriber needs to be set before FromConfigurationSetting can be used

    This message is in fact incorrect – I have a bug filed to get this fixed, and it should be fixed in the next release (after our November 2009 release), to say “Configuration Setting Publisher” and not “ConfigurationSettingSubscriber”.

    To resolve this, we need to add a bit of template code to the WebRole.cs file in WebRole1.  Add the following to the WebRole.OnStart() method.

    using Microsoft.WindowsAzure;

    #region Setup CloudStorageAccount Configuration Setting Publisher
    
    // This code sets up a handler to update CloudStorageAccount instances when their corresponding
    // configuration settings change in the service configuration file.
    CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
    {
        // Provide the configSetter with the initial value
        configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
    
        RoleEnvironment.Changed += (sender, arg) =>
        {
            if (arg.Changes.OfType<RoleEnvironmentConfigurationSettingChange>()
                .Any((change) => (change.ConfigurationSettingName == configName)))
            {
                // The corresponding configuration setting has changed, propagate the value
                if (!configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)))
                {
                    // In this case, the change to the storage account credentials in the
                    // service configuration is significant enough that the role needs to be
                    // recycled in order to use the latest settings. (for example, the 
                    // endpoint has changed)
                    RoleEnvironment.RequestRecycle();
                }
            }
        };
    });
    #endregion

    The comments (which I wrote and are included in the samples) should explain what is going on if you care to know.  In a nutshell, this code bridges the gap between the Microsoft.WindowsAzure.StorageClient assembly and the Microsoft.WindowsAzure.ServiceRuntime library – the Storage Client library is agnostic to the Windows Azure runtime as it can be used in non-Windows Azure applications.

    This code essentially says how to get a setting value given a setting name and sets up an event handler to handle setting changes while running in the cloud (because the ServiceConfiguration.cscfg file was updated in the cloud).

    20. We need some code to ensure that the tables we rely on get created. Add the code to create the tables if they don't exist to the ContactDataSource constructor:

    public ContactDataSource()
    {
        var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
        _ServiceContext = new ContactDataServiceContext(storageAccount.TableEndpoint.ToString(), storageAccount.Credentials);
    
        // Create the tables
        // In this case, just a single table.  
        storageAccount.CreateCloudTableClient().CreateTableIfNotExist(ContactDataServiceContext.ContactTableName);
    }

    Note: For production code you'll want to optimize the reading of the configuration settings and the call to create the tables to improve perf -- the focus of this post is to keep things simple.
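
    One simple way to do that (a sketch of one possible approach, not code from the attached sample) is to move the one-time work into a static constructor so it runs once per app domain rather than on every request:

    public class ContactDataSource
    {
        private static readonly CloudStorageAccount _StorageAccount;
        private ContactDataServiceContext _ServiceContext = null;
    
        static ContactDataSource()
        {
            // Read the configuration and create the table once per app domain
            _StorageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
            _StorageAccount.CreateCloudTableClient().CreateTableIfNotExist(ContactDataServiceContext.ContactTableName);
        }
    
        public ContactDataSource()
        {
            _ServiceContext = new ContactDataServiceContext(_StorageAccount.TableEndpoint.ToString(), _StorageAccount.Credentials);
        }
    }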

    21. Add the following code to the ContactDataSource.cs file:

    public IEnumerable<ContactDataModel> Select()
    {
        var results = from c in _ServiceContext.ContactTable
                      select c;
    
        var query = results.AsTableServiceQuery<ContactDataModel>();
        var queryResults = query.Execute();
    
        return queryResults;
    }
    
    public void Delete(ContactDataModel itemToDelete)
    {
        _ServiceContext.AttachTo(ContactDataServiceContext.ContactTableName, itemToDelete, "*");
        _ServiceContext.DeleteObject(itemToDelete);
        _ServiceContext.SaveChanges();
    }
    
    public void Insert(ContactDataModel newItem)
    {
        _ServiceContext.AddObject(ContactDataServiceContext.ContactTableName, newItem);
        _ServiceContext.SaveChanges();
    }

    Note: in the Select() method, the TableServiceQuery<T> class enables you to have finer grained control over how you get the data.

    Note the use of AttachTo() in the Delete() method to connect to the existing row before removing it; the "*" ETag makes the delete unconditional.

    22. The UI is defined in the aspx page and consists of 3 parts: the GridView, which will display all of the rows of data; the FormView, which allows the user to add rows; and the ObjectDataSource, which databinds the UI to the ContactDataSource.

    23. The GridView is placed after the first <div>. Note that in this sample, we’ll just auto-generate the columns and show the delete button. The DataSourceId is set to the ObjectDataSource, which will be covered below.

        <asp:GridView
            id="contactsView"
            DataSourceId="contactData"
            DataKeyNames="PartitionKey"
            AllowPaging="False"
            AutoGenerateColumns="True"
            GridLines="Vertical"
            Runat="server" 
            BackColor="White" ForeColor="Black"
            BorderColor="#DEDFDE" BorderStyle="None" BorderWidth="1px" CellPadding="4">
            <Columns>
                <asp:CommandField ShowDeleteButton="true"  />
            </Columns>
            <RowStyle BackColor="#F7F7DE" />
            <FooterStyle BackColor="#CCCC99" />
            <PagerStyle BackColor="#F7F7DE" ForeColor="Black" HorizontalAlign="Right" />
            <SelectedRowStyle BackColor="#CE5D5A" Font-Bold="True" ForeColor="White" />
            <HeaderStyle BackColor="#6B696B" Font-Bold="True" ForeColor="White" />
            <AlternatingRowStyle BackColor="White" />
        </asp:GridView>    

    24. The Form view to add rows is really simple, just labels and text boxes with a button at the end to raise the “Insert” command. Note that the DataSourceID is again set to the ObjectDataSource and there are bindings to the Name and Address.

        <br />        
        <asp:FormView
            id="frmAdd"
            DataSourceId="contactData"
            DefaultMode="Insert"
            Runat="server">
            <InsertItemTemplate>
                <asp:Label
                        id="nameLabel"
                        Text="Name:"
                        AssociatedControlID="nameBox"
                        Runat="server" />
                <asp:TextBox
                        id="nameBox"
                        Text='<%# Bind("Name") %>'
                        Runat="server" />
                <br />
                <asp:Label
                        id="addressLabel"
                        Text="Address:"
                        AssociatedControlID="addressBox"
                        Runat="server" />
                <asp:TextBox
                        id="addressBox"
                        Text='<%# Bind("Address") %>'
                        Runat="server" />
                <br />
                <asp:Button
                        id="insertButton"
                        Text="Add"
                        CommandName="Insert"
                        Runat="server"/>
            </InsertItemTemplate>
        </asp:FormView>
    

    25. The final part of the aspx is the definition of the ObjectDataSource. See how it ties the ContactDataSource and the ContactDataModel together with the GridView and FormView.

        <%-- Data Sources --%>
        <asp:ObjectDataSource runat="server" ID="contactData"     TypeName="WebRole1.ContactDataSource"
            DataObjectTypeName="WebRole1.ContactDataModel" 
            SelectMethod="Select" DeleteMethod="Delete" InsertMethod="Insert">    
        </asp:ObjectDataSource>
    

    26. Build. You should not have any compilation errors.

    27. F5 to debug. You will see the app running in the Development Fabric using the Table Development Storage.

    image

    28. Now I want to switch this to use Windows Azure Storage, not the development storage.  The first step is to go to the Windows Azure Developer Portal and create a storage account.

    Login and click on “New Service”, and select “Storage Account”:

    image

    Fill out the service name, the public name and optionally choose a region.  You will be brought to a page that contains the following (note I rubbed out the access keys):

    image

    29. You will use the first part of the endpoint (jnakstorageaccount) and one of the access keys to fill out your connection string.

    30. Open the WebRole1 config again, bring up Settings | DataConnectionString and fill out the account name and the account key and hit OK.

    image

    31. Hit F5 to run the application again. 

    This time you will be running against cloud storage – note that the data you entered when running against the development storage is no longer there.

    Important Note: Before deploying to the cloud, the DiagnosticsConnectionString also needs to use storage account credentials and not the development storage account.

    32. Please see the Deploying a Cloud Service walkthrough to learn how to deploy this to the Windows Azure cloud.

    And there you have it, a walkthrough of using Windows Azure Table Storage – for more information and to dig deeper, definitely check out the white paper here: Windows Azure Table – Programming Table Storage and the docs in MSDN that cover what subsets of ADO.NET Data Services work with table storage here: http://msdn.microsoft.com/en-us/library/dd135720.aspx and here: http://msdn.microsoft.com/en-us/library/dd894032.aspx
