• Cloudy in Seattle

    Where to Find your Windows Azure Billable Usage Info

    • 9 Comments

    This weekend I had someone email me asking me to blog about where to find the usage info used for billing because they were having a hard time finding it and figured others were having difficulty as well.

    Coincidentally, I saw an internal thread this morning where someone was asking the very same question so... Erik, I believe you are correct, a post will probably be helpful for folks.

    To find your usage info, go to:

https://mocp.microsoftonline.com (from the Developer Portal, you can click on the "Billing" link in the upper right hand corner).

    I'm going to walk through getting to the actual bill because I know some people get this far but get lost in some of these pages.

    Choose your country / region and hit continue.

    image

    Then click “Sign in now” or “Sign in” and sign in.

    image

    Followed by “View my bills”, then click on “View Online Bill/Invoice”

    image 

    From the “Online Bill” page you can select the Billing Period and click on one of the Usage Charge links in the middle of the page. This will bring you to a page similar to the following:

    image

    Essentially, everything you want to know about your bill will be there.  Note that this doesn’t update in real time and that there is a lag.

    Also note that you need to delete your deployment to stop the clock on compute hours.

    [Update 2/11/2010 - The intent of this post was to help you get to the right place where you can explore around.  If you are looking for more information on Data Transfer usage, SQL Azure and other usage charge pages, please see this post by Roger Jennings for more details.]

    Hope this helps.

  • Cloudy in Seattle

    Migrating an Existing ASP.NET App to run on Windows Azure

    • 19 Comments

This post has 2 main parts.  The first part is an update to a post I wrote back in February 2009 about using an existing ASP.NET Web Application as a Web Role, and rolls in information from this post and this post.  The second part is about migrating an existing database running on SQL Express and ASP.NET providers to SQL Azure.

    I’ll start with the NerdDinner sample, so make sure you have ASP.NET MVC installed.  Although I used VS 2008 for the screen shots, this walkthrough is compatible with VS 2010.

    I’ve opened the solution in Visual Studio and removed the test project to keep things simple.

    image

    The first thing I need to do is make this Web Application project a Web Role. 

    I can do that one of two ways:

    1) Since I have the NerdDinner project open, I can add a Windows Azure Cloud Service to the solution.

    image

    Select “Windows Azure Cloud Service” and hit “OK”

    image

    Hit “OK” again, because we don’t need to add any Roles to this Cloud Service.

    image

    Right click on the “Roles” node in the Cloud Service project and select “Add | Web Role Project in solution…” 

    image

    Select the NerdDinner project.  Note that all of the Web Application projects in the solution will show up in this list. 

    image

    2) The other option would have been to create a new Cloud Service project and add the NerdDinner project to it using Solution | Add | Existing Project… then following the step of Add | Web Role Project in solution…

    image

    We now have the following:

    image 

    Before I get to what it will take to hit F5 and make the NerdDinner application run as a Web Role, let’s discuss the differences between a Web Role and an ASP.NET Web Application.

    There are 4 differences and they are:

    • References to the Windows Azure specific assemblies: Microsoft.WindowsAzure.Diagnostics, Microsoft.WindowsAzure.ServiceRuntime, and Microsoft.WindowsAzure.StorageClient
    • Bootstrap code in the WebRole.cs/vb file that starts the DiagnosticMonitor as well as defines a default behavior of recycling the role when a configuration setting change occurs.
    • The addition of a trace listener in the web.config file: Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener.
• In the case of an MVC web application, the assembly reference to System.Web.Mvc may not have the Copy Local property set to “True” – you need to have this to ensure that the System.Web.Mvc assembly is available in the cloud.  The cloud VMs only contain the assemblies that come with the .NET Framework 3.5 SP1 redistributable. (System.Web.Mvc is not one of them and this is actually a hard thing to diagnose today as your role will go into an initializing, starting, stopping loop.)  Setting “Copy Local” to True will ensure the assembly is added to the Service Package – the package that gets uploaded to the cloud and used to run a Cloud Service on the local development fabric.

    image
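
Under the covers, the Copy Local property corresponds to the <Private> element on the assembly reference in the .csproj file; for an MVC project it ends up looking something like this (the exact version and public key token come from your MVC install):

<Reference Include="System.Web.Mvc, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35, processorArchitecture=MSIL">
  <Private>True</Private>
</Reference>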

    Additionally, today we only support targeting .NET Framework 3.5 SP1.  Stay tuned for .NET 4 support.

    Except for #4, the other 3 differences aren’t strictly required.

    Chances are, at a minimum you are going to want to reference Microsoft.WindowsAzure.ServiceRuntime and Microsoft.WindowsAzure.Diagnostics, start the Diagnostic Monitor and add the trace listener so that you can write logs and gather other diagnostic information to diagnose issues or monitor the health of your application.
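
For reference, the trace listener entry that a new Web Role’s web.config contains looks like the following (this is the snippet from the v1.0/v1.1 SDK templates; the assembly version may differ in your install):

<system.diagnostics>
  <trace>
    <listeners>
      <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
           name="AzureDiagnostics">
        <filter type="" />
      </add>
    </listeners>
  </trace>
</system.diagnostics>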

    If you use Windows Azure Storage, you are going to want to use the Microsoft.WindowsAzure.StorageClient library which provides a .NET interface to Windows Azure Storage.

    For the sake of this article, I’m just going to make sure that System.Web.Mvc has Copy Local set to true.

    Let’s try hitting F5 and seeing what we get.

    The application runs… almost…  if I click on “View Upcoming Dinners” I get an error. 

    image

    The connectionstrings.config file that is referenced in the web.config is not being found.

    <connectionStrings configSource="ConnectionStrings.config" />

    I need to make sure that this file is added to the Service Package.  I can do so by adding it to the NerdDinner project (right click on the NerdDinner project in Solution Explorer and select Add | Existing Item…)

    image

Now set the Build Action to Content.  This should be done by default, but I want to call it out as a way to ensure that given files in the project get added to the Service Package – you may need to do this with some of your other files. 

    image
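
For the curious, setting the Build Action to Content corresponds to a Content item in the .csproj file:

<ItemGroup>
  <Content Include="ConnectionStrings.config" />
</ItemGroup>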

    Now, I hit F5 again, and everything works. But will it work in the cloud?

The answer is no – NerdDinner has a NerdDinner.mdf file it uses for data and it makes use of ASP.NET providers – both of these rely on SQL Express, which I have on my local machine but which is not available in the cloud VMs (even if it were, you would need a story that works across multiple instances).

I have a decision to make.  I can use SQL Azure or I can rewrite the application to use Windows Azure Storage.  Since it is easy to do a search for “NerdDinner Azure” and find examples of the latter, I will do the former – the point of this article is primarily to focus on using an existing project in a Windows Azure Cloud Service.

    In the real world, you’ll want to consider your data and compare the long term costs / effort / requirements to make the right decision.

The key tool I need to migrate the data to SQL Azure is the SQL Server Management Studio R2 CTP that supports SQL Azure.  Links are here. This post may also be helpful.

I also need to have databases set up on SQL Azure.  Go to sql.azure.com and sign in. Create databases called NerdDinnerDB and aspprovidersdb.

    image

Make sure to set your firewall settings such that the machine where your app is running in the development fabric has access. The example below is not recommended, but makes development on multiple machines easy.

    image

    The steps to migrate the data are now:

    1. Migrate the NerdDinner.MDF
    2. Migrate the ASP.NET providers
    3. Change the connection strings.

    Let’s start with NerdDinner.MDF. 

    Open SQL Server Management Studio 2008 R2 and connect to .\SQLExpress. 

Right click on Databases, and select “Attach…”

image

    Click the Add… button and browse to the location of the NerdDinner.MDF file then click OK.

    image

    Now right click on the NerdDinner database that was just added, select Tasks | Generate Scripts…

    image

    This will bring you to the Generate and Publish Scripts wizard.  Click Next twice (selecting the default of the whole database) then click Advanced on the “Set Scripting Options” page.

    Scroll down to “Script for the database engine type” and select “SQL Azure Database”.

    image

    You also have an option to choose whether to script the schema only, the schema and data or the data only.  For the purposes of this walkthrough, you can choose the schema only or the schema and data.

    clip_image002

Finish the wizard, saving the script to a file, then open the file in SSMS.

Now we want to connect to SQL Azure.  Do so by clicking the Connect button in Object Explorer and selecting “Database Engine”. 

    For server name put the name you got from the SQL Azure portal, including the full domain name.  For example in the SQL Azure portal, the server name is listed as: zky996mdy7.database.windows.net

    image

    Correspondingly, enter this Server name in the Connect to Server dialog in SSMS:

    image

The login is the Administrator username and the password set up in the SQL Azure portal.  Note that @zky996mdy7 is appended to the username.

    Click on Options >> select the “Connection Properties” tab and enter NerdDinnerDB for “Connect to database”.

    image

    This puts the SQL Azure database in the Object Explorer in SSMS.

    Right click on the SQL Azure database and select “New Query”.  This will open a SQL Query window.

    image

Copy and paste the database script into the SQL Query window and hit Execute.  In the bottom status area of SSMS you should see that the query executed successfully and that it ran against your SQL Azure NerdDinnerDB database.

    image

Now we need to set up the ASP.NET providers.  This requires using provider scripts that we created for SQL Azure.  See this post for the scripts and more info.

    Download the scripts and extract.

    Open the InstallCommon.SQL script in SSMS.  Since we were last connected to the NerdDinnerDB and we now want to connect to the aspprovidersDB you created above in the SQL Azure portal, right click in the query window and select Connection | Disconnect. 

    image

    Follow that by right clicking in the SQL Query window and selecting Connection | Connect, entering the aspprovidersDB as the database to connect to in the options.

    image

Run the script, then open InstallMembership.SQL and InstallProfile.SQL and run those scripts as well.  Just be sure to always run these scripts against aspprovidersdb.

    Now we need to change the connection strings in the connectionstrings.config file we added to the NerdDinner project.  There is a connection string for the NerdDinner database and a connection string for the ASP Providers.

    Here’s an example to follow, replacing the server, user ID and Password appropriately. 

    <connectionStrings>
      <add name="ApplicationServices" connectionString="Server=tcp:zky996mdy7.database.windows.net;Database=aspprovidersdb;User ID=jnak;Password=<enter>;Trusted_Connection=False;" providerName="System.Data.SqlClient"/>
      <add name="NerdDinnerConnectionString" connectionString="Server=tcp:zky996mdy7.database.windows.net;Database=NerdDinnerDB;User ID=jnak;Password=<enter>;Trusted_Connection=False;" providerName="System.Data.SqlClient"/>
    </connectionStrings>

    Delete the database files in App_Data and hit F5 to run this in the Development Fabric. 

A couple of things to test to ensure that both the providers and the NerdDinner database are working correctly: register for an account, host a dinner and view all upcoming dinners.

    Now deploy this to the cloud -- Please see the Deploying a Cloud Service walkthrough if you need help.

    What’s cool to consider here is we started with a single instance ASP.NET web application and turned it into a scalable cloud application that has the capability to be load balanced across any number of instances.

As I look back at this walkthrough, it’s actually quite long, but over half of it is migrating the data to SQL Azure and there were a lot of screen shots.  The key things I want you to get out of this post are:

1) Add existing ASP.NET Web Applications to the same solution as a Cloud Service project and use Roles | Add | Web Role Project in solution…

    2) Understand the 4 differences between a Windows Azure Web Role project and a standard Web Application project

3) Figure out how you are going to handle your data.  You have a few options, notably Windows Azure Storage and SQL Azure. Don’t forget about your providers.

    4) Walkthrough of migrating an existing database to SQL Azure.

Finally, whenever you are deploying to the cloud, it is useful to look into the service package and make sure that everything you need is in it.  See this post for more info – remember Windows Azure VMs only include the .NET Framework 3.5 SP1 redistributable.

  • Cloudy in Seattle

    Windows Azure Compute Hours include the time your Deployment is Stopped

    • 0 Comments

One of my tenets on this blog is to not do posts that simply point to someone else’s post.  I’m breaking that tenet today because of something I just found out that I think is super important for all Windows Azure customers to be clear on, and I saw that Ryan Dunn had already posted about this.

    Windows Azure compute time is calculated based on the time that you’ve deployed the app, not the time your app is in the running state.

    For example, if you put your deployment in the stopped state, you will continue to be charged for Compute hours – the rate at which will correspond to the VM Size you have selected.

    clip_image002

    In order to not be charged Compute hours, you need to delete the deployment.  This will look as follows:

    clip_image004

Ryan goes on to show you how to use the PowerShell cmdlets to automate deleting your deployments; please check out his post.

    We are working on making this more obvious on the developer portal.

  • Cloudy in Seattle

    February 2010 Release of the Windows Azure Tools for Microsoft Visual Studio v1.1

• 1 Comment

    I’m pleased to announce that the Windows Azure Tools for Microsoft Visual Studio 1.1 (direct link while the release propagates) has been released to coincide with the general availability of Windows Azure.

    This release supports VS 2008 and the upcoming VS 2010 RC.  For VS 2010 Beta 2 please use the November 2009 release.

    New for version 1.1:

    • Windows Azure Drive: Enable a Windows Azure application to use existing NTFS APIs to access a durable drive. This allows the Windows Azure application to mount a page blob as a drive letter, such as X:, and enables easy migration of existing NTFS applications to the cloud.
    • OS Version Support: Allows a Windows Azure application to choose the appropriate Guest OS to run on in the cloud.
    • Bug Fixes
      • StorageClient: Expose account key from storage credentials, expose continuation tokens for pagination, and reduce maximum buffer size for parallel uploads.
      • Windows Azure Diagnostics: Fix path corruption for crash dumps, OnDemandTransfer now respects LogLevelFilter.
      • VS 2010: Improved packaging performance.
      • VS 2010: Support for linked files in Web projects.
      • VS 2010: Support for ASP.NET web project web.config transformations.
      • Certificate selection lists certificates from LocalMachine\My instead of CurrentUser\My.
      • Right click on Role under Roles folder to select whether to launch the browser against HTTP, HTTPS or not at all.

    Updated and additional samples are available at: http://code.msdn.microsoft.com/windowsazuresamples

  • Cloudy in Seattle

    Windows Azure Tools for Microsoft Visual Studio 2010 RC

    • 0 Comments

    Windows Azure Tools for Microsoft Visual Studio extend Visual Studio to enable the creation, building, configuring, debugging, running and packaging of scalable web applications and services on Windows Azure.

The February 2010 release of the Windows Azure Tools for Microsoft Visual Studio supports Visual Studio 2008 SP1 and the upcoming Visual Studio 2010 RC.

    We’re very excited about Visual Studio 2010! The RC, available soon, will support a broad ‘go-live’ license, increased performance and stability, and is an excellent vehicle for providing any remaining feedback prior to the final release build.

    Additionally, because of the improvements in the Visual Studio 2010 RC, we’ve made a number of RC specific improvements to our tools:

    • Improved packaging performance.
    • Support for linked files in Web projects.
    • Support for ASP.NET web project web.config transformations.

    If you are looking to download the Windows Azure Tools for Visual Studio 2010 Beta 2 -- please use the November 2009 release.

    Learn more at http://windowsazure.com

  • Cloudy in Seattle

    Installing Certificates in Windows Azure VMs

    • 7 Comments

    A little while ago I posted How To: Add an HTTPS Endpoint to a Windows Azure Cloud Service which talked about the whole process around adding an HTTPS endpoint and configuring & uploading the SSL certificate for that endpoint.

    This post is a follow up to that post to talk about installing any certificate to a Windows Azure VM.  You may want to do this to install various client certificates or even to install the intermediate certificates to complete the certificate chain for your SSL certificate.

In order to peek into the cloud, I’ve written a very simple app that will enumerate the certificates of the Current User\Personal (or My) store.

    Create a new Windows Azure Cloud Service and add an ASP.NET Web Role to it. 

Open up default.aspx and add the following between the empty <div></div> to set up a table that will be used to list the certificates.

    <asp:Table ID="certificateTable" runat="server">
        <asp:TableRow runat="server">
            <asp:TableCell runat="server">Friendly Name:</asp:TableCell>
            <asp:TableCell runat="server">Issued By:</asp:TableCell>
            <asp:TableCell runat="server">Issued To:</asp:TableCell>
            <asp:TableCell runat="server">Expiration Date:</asp:TableCell>
        </asp:TableRow>
    </asp:Table>
    

    Now, I have some simple code that opens up a certificate store and adds a row to the table for each certificate found with the Friendly Name, Issuer, Subject and Expiration date.

using System;
using System.Security.Cryptography.X509Certificates;
using System.Web.UI.WebControls;

protected void Page_Load(object sender, EventArgs e)
    {
        X509Certificate2Collection selectedCerts = new X509Certificate2Collection();
    
        X509Store store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
        try
        {
            store.Open(OpenFlags.OpenExistingOnly | OpenFlags.ReadOnly);
            foreach (X509Certificate2 cert in store.Certificates)
            {
                TableRow certificateRow = new TableRow();
    
                // Friendly Name
                TableCell friendlyNameCell = new TableCell();
                TextBox friendlyNameText = new TextBox();
                friendlyNameText.Text = cert.FriendlyName;
                friendlyNameCell.Controls.Add(friendlyNameText);
                certificateRow.Cells.Add(friendlyNameCell);
    
                // Issuer
                TableCell issuerCell = new TableCell();
                TextBox issuerText = new TextBox();
                issuerText.Text = cert.Issuer;
                issuerCell.Controls.Add(issuerText);
                certificateRow.Cells.Add(issuerCell);
                
                // Subject
                TableCell subjectCell = new TableCell();
                TextBox subjectText = new TextBox();
                subjectText.Text = cert.Subject;
                subjectCell.Controls.Add(subjectText);
                certificateRow.Cells.Add(subjectCell);
    
                // Expiration
                TableCell expirationCell = new TableCell();
                TextBox expirationText = new TextBox();
                expirationText.Text = cert.NotAfter.ToString("d");
                expirationCell.Controls.Add(expirationText);
                certificateRow.Cells.Add(expirationCell);
                
                // Add the TableRow to the Table
                certificateTable.Rows.Add(certificateRow);
            }
        }
        finally
        {
            store.Close();
        }
    }

Build, publish and deploy this to the cloud and you’ll see that the CurrentUser\My store is empty. 

    image 

Now that I have a way to show you the certificates that I install to the cloud, let’s follow the process to install 2 certificates.  One is a self-signed certificate with a private key I created myself and the other is an intermediate certificate I got from the VeriSign web site. 

    Installing a certificate to Windows Azure is a three step process. 

    1. Configure the certificates to install per role
    2. Upload the certificates via the Windows Azure Developer Portal per hosted service
3. Deploy your app and when VMs are set up to host each of your role instances, those VMs will contain your certificates.

Note: The configuration of the certificates is per role, but the upload of the certificates is per hosted service and will be used for all of the roles in that hosted service.

    Configure the certificates.  Bring up the configuration UI for the Web Role in your solution, click on “Certificates” and click on “Add Certificate”.

    image

    Now click on the “. . .” to bring up the certificate selection dialog.  This currently enumerates the certificates in the CurrentUser\My store, however in future versions this will enumerate the certificates in LocalMachine\My.  The dialog specifies which store it is listing the certificates from.

    image

In my example, I’m selecting a self-signed cert with an exportable private key that I’ve set up ahead of time.

    This sets the thumbprint field for that certificate.

    image

Now I have a CER file from VeriSign which is an intermediate cert.  The steps to create it were to copy a big long chunk of text from their web site and save it as a .CER file, which I did.

    I have a couple of choices on how to handle this, I could install this certificate to the CurrentUser\My store and select it from the VS UI to get the thumbprint.

    Instead, I’m going to open the certificate, switch to Details and copy the Thumbprint from the certificate itself (the point of doing the 2 certificates is to show alternate ways of handling the certificates):

    image

Now I click “Add Certificate” on the Certificates tab of the VS UI and paste in the thumbprint for my second certificate.  I also changed the names of the certificate entries to “SelfSigned” and “Intermediate” for clarity, but this certainly is not necessary.

    image

Note: I changed the store location and store name to CurrentUser and My.  Typically you will install intermediate certs to the CA store, but to keep this sample simple I only want to enumerate one store: CurrentUser\My, which as we saw above is empty by default in Windows Azure.  So I’m installing the two example certificates to the CurrentUser\My store.

    The dropdown for store name currently contains My, Root, CA and Trust.  We do support you opening the service definition file and setting any string for the Store Name that you would like in the event that the 4 in the drop down are not sufficient.

    <Certificate name="SelfSigned" storeLocation="CurrentUser" storeName="<enter a value>" />
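
For reference, with both certificates from this example configured, the full Certificates section of the service definition file would look something like this (the thumbprints themselves are stored in the service configuration file):

<Certificates>
  <Certificate name="SelfSigned" storeLocation="CurrentUser" storeName="My" />
  <Certificate name="Intermediate" storeLocation="CurrentUser" storeName="My" />
</Certificates>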
    

    I now need to upload these certificates through the Developer Portal which means I need them as files.  I have the .CER file already but need to export my self-signed cert from the certificate store.

    To do so I run certmgr.msc, browse to the certificate, right click | All Tasks | Export…

    image

This takes me through the Windows Certificate Export Wizard.  One of the key things for SSL certificates, and any other certificates where you need the private key in the cloud, is to ensure that you select “Yes, export the private key” and provide a password during the export.

    image

    After finishing the wizard, I now have 2 certificate files, the .CER file and the .PFX file that was just created.

    image

    Now I go to the Developer portal, and select my Hosted Service.  Down at the bottom I select “Manage” under the “Certificates” heading.

    image 

This allows me to browse to the PFX file, type the same password I entered when exporting the certificate, and upload that certificate to the cloud.

    image 

    You’ll notice here that the Portal only allows uploading of PFX files but we need to upload a .CER file.  This will change in the future but for now, there is some simple code you can run to convert your .CER file to PFX.  I put the following code in a console application and ran it.

using System.Security.Cryptography.X509Certificates;

static void Main(string[] args)
    {
        // The path to the certificate.
        string certificate = @"C:\Users\jnak.REDMOND\Desktop\verisignintermediate.cer";
    
        // Load the certificate into an X509Certificate object.
        X509Certificate cert = new X509Certificate(certificate);
        byte[] certData = cert.Export(X509ContentType.Pkcs12);
    
        System.IO.File.WriteAllBytes(@"C:\Users\jnak.REDMOND\Desktop\verisignintermediate.pfx", certData);
    }

Now that I have a PFX file for my CER file, I upload it, leaving the password field blank: the password is only used to protect private keys, which CER files do not have, and therefore this PFX file also does not have.

    You can see here that both certificates are now installed to my Hosted Service.

    image

Now I deploy my Cloud Service, which will enumerate the two certificates I installed to the CurrentUser\My store.  If you deployed the app already as I did above, just suspend and delete that deployment and deploy the app again with the new configuration.  (Both the service definition and service configuration files have changed, so the package (cspkg) and service configuration (cscfg) need to be deployed again to Windows Azure.)

    After you complete the deploy process and your app is running on Windows Azure, you’ll now see the following:

    image

    This shows that the VMs in the cloud for your role have the certificates you configured and uploaded installed.

    Finally I’ll reiterate that certificate configuration is on a per role basis.  As an example, if I added a worker role to this same Cloud Service, the VMs for the worker role instances would not contain the certificates we configured above unless you opened up the configuration UI for the Worker role and added the same entries to the Certificates tab.

That said, because the certificates are uploaded per hosted service, you do not need to upload the certificates to the cloud more than once for a given Cloud Service.

  • Cloudy in Seattle

    Windows Azure Debugging: Matching an Instance in the DevFabric to its Process

• 1 Comment

    One of the things I run into from time to time is the need/desire to match an instance I’m watching in the Development Fabric UI back to its process.

    This may be because I am sitting on a breakpoint in Visual Studio and want to look at the corresponding logs for that process in the DevFabric UI or because I want to use some non Windows Azure aware tool that requires a process ID.

    For example, given a Web and a Worker role with 3 instances each, I can watch the instances in the development fabric UI by right clicking on the Windows Azure task tray icon and select “Show Development Fabric UI”:

    image

    image

    This is useful for a couple of reasons, sometimes I want to double check the status of my roles and instances but more commonly I want to watch the logs on a per instance basis.

    Although the logs are shown in the Visual Studio output window, all of the messages for all of the instances show up there:

    image

    Which is fine for the common case where I’m debugging a single instance at a time but not so good as soon as I have more than one instance.

Let’s set up an example: create a new Windows Azure Cloud Service in Visual Studio, add an ASP.NET Web Role and a Worker Role, and you’ll have the following in Solution Explorer:

    image 

    Set the instance count to 3 for each role by bringing up the role configuration UI for each role:

    image

    and setting “Instance count” to 3.

    image

    Now I’m going to set a breakpoint in the WorkerRole.cs file in the Worker Role project.  You can see that the default worker role code just loops and writes a trace message.

    image

    Hit F5 to start debugging the Cloud Service on the Development Fabric.

    When I hit a breakpoint – I can see that Visual Studio is debugging multiple WaWorkerHost.exe and WaWebHost.exe processes – these correspond to Worker and Web role instances respectively.

    image

    But which process corresponds to which logs in the Development Fabric UI? 

    That is, if I’m stopped on a breakpoint for a WaWorkerHost.exe process with ID 4460 – how do I know which log output in the Development Fabric UI corresponds to the process I’m currently stopped on?

    As it turns out, there are two ways I can get the information I want -- which is the process ID for the instances I’m looking at in the Development Fabric.

    Option 1: Check the log file.  If you look carefully, the log file will have the text “-parent xyz” where xyz is the process ID of the WaWorkerHost.exe or WaWebHost.exe that is hosting your worker or web role instance.

    image

    Option 2: Use the “Attach debugger…” functionality from the context menu in the Development Fabric.  Right click on one of the instance nodes in the Development Fabric and select “Attach debugger…”

    image 

    This will bring up a VS dialog that has both the process name and the process ID on it. 

    image

I can then use that process ID to match an instance in the DevFabric UI to the process I’m stopped on in Visual Studio.

Note: If you aren’t debugging your cloud service from Visual Studio and you choose to use this DevFabric UI option to “Attach debugger…” – on x64 systems, this functionality only attaches the native debugger, so even though you are attached, it will appear that you can’t hit breakpoints in managed code.

To work around this issue, make note of the process ID and then click “No, cancel debugging”.  Open up Visual Studio and select Debug | Attach to Process…

    image

    Select the process with the matching process ID and click “Attach”.  To explicitly select a debugger, before clicking “Attach”, click the “Select…” button which will allow you to specify which debugger you want to attach.

    image 

So far we’ve seen that debugging multiple instances on the development fabric isn’t done nearly as much as single instance debugging; however, I have a feeling that as more and more apps adopt the Windows Azure scale out model, there are going to be more cases where developers want to test/debug multi-instance scenarios locally.

    To that end, I hope this helps and let me know if you think we need to do more for this scenario.

  • Cloudy in Seattle

    Windows Azure Instance & Storage Limits

    • 0 Comments

    Recently, a colleague of mine wrote about the Windows Azure instance limits: http://blog.toddysm.com/2010/01/windows-azure-role-instance-limits-explained.html

His post is very complete and I recommend you have a look, but here is my take:

    These are default limits that are in place to ensure that Windows Azure will always have VMs available to all of our customers.  If you have a need for more capacity, we want to help!  Please contact us: http://go.microsoft.com/fwlink/?LinkID=123579

    The limits are:

    • 20 Hosted Service Projects
    • 5 Storage Accounts
    • 5 roles per Hosted Service (i.e. 3 different web roles + 2 different worker roles or any such combination)
    • 20 CPU cores across all of your Hosted Service Projects

The first two are really easy to track: on the Developer portal, when you go to create a new service, it’ll tell you how many you have left of each:

    image

5 roles per Hosted Service is also easy to understand: this corresponds to the number of projects you can add as roles to your Cloud Service – here I am hitting my role limit:

    image

    So let’s talk real quick about the 20 CPU core limit – note that the limit is on CPU cores, not on instances. 

    When you configure your role, you can set the number of instances as well as the VM size:

    image

    The VM sizes of Small, Medium, Large and ExtraLarge are defined here: http://msdn.microsoft.com/en-us/library/ee814754.aspx

    Today the CPU cores for each VM size are: (subject to change so always consult the MSDN link above for the latest information)

VM Size      CPU Cores
Small        1
Medium       2
Large        4
ExtraLarge   8

So the number of CPU cores for a role is (instance count) × (number of CPU cores for the selected VM size).

If you add those up across all of your roles across all of your Hosted Service projects (staging and production slots), the total cannot exceed 20.

Quick example: if you have 5 Hosted Service projects, each with 1 role running 2 instances at the Medium VM size, that’s 5 × 2 instances × 2 cores = 20 cores – you’ve hit the limit.

The other key point is that stopping your deployment does not free up CPU cores; you need to delete the deployment to reduce your CPU core count.

    What about Windows Azure Storage quotas? 

    It just so happens that another colleague of mine has written about this: http://blogs.msdn.com/frogs69/archive/2009/12/17/storage-quotas-and-core-allocation-on-windows-azure.aspx

Each storage account allows you to have 100TB of data across all of your blobs, tables and queues.  As mentioned above, you can have up to 5 storage accounts.

If you are dealing with really large data sets, follow the link above to see the limits on blobs, the number of properties in a table entity, and queue messages.

  • Cloudy in Seattle

    Windows Azure WCF Add Service Reference Patch and Windows 7

    • 0 Comments

    For those of you that watched my Tips & Tricks session at PDC or followed along on my blog post, you'll recall that I mentioned the following:

    In general, WCF works correctly on Windows Azure.  There is a problem using "add service reference" or svcutil but we have a patch to workaround the problem.  The patch is installed in the cloud, and more information about this is here: http://code.msdn.microsoft.com/wcfazure/Wiki/View.aspx?title=KnownIssues (note that a web.config change is also required)

    One of the things this prompted a lot of folks to ask is "Where is the Windows 7 version of this patch?"

    Well, I'm happy to announce that we have released this QFE for Windows 7 and Windows Server 2008 R2: http://code.msdn.microsoft.com/KB977420

    I also recommend that if you are using WCF on Windows Azure, you spend time browsing the content on http://code.msdn.microsoft.com/wcfazure.

    You may also be interested in the REST service templates the WCF team has made available on the Visual Studio Gallery: http://visualstudiogallery.msdn.microsoft.com/en-us/842a05c3-4cc8-49d3-837f-5ec7e6b17e80 (this is the .NET 3.5 C# template, there are also .NET 3.5 VB and .NET 4 C# and VB templates)

    Note that the REST templates aren't currently directly supported as a Windows Azure Role in the New Project Dialog, but you can easily use that template in a Windows Azure Cloud Service by following the "Using an Existing Project" section of this post.

  • Cloudy in Seattle

    Windows Azure - Resolving "The Path is too long after being fully qualified" Error Message

    • 19 Comments

    When you run a cloud service on the development fabric, the development fabric uses a temporary folder to store a number of files including local storage locations, cached binaries, configuration, diagnostics information and cached compiled web site content.

    By default this location is: C:\Users\<username>\AppData\Local\dftmp

    For the most part you won’t really care about this temporary folder, the Windows Azure Tools will periodically clean up the folder so it doesn’t get out of hand.

    Note: To manually clean up the devfabric temporary folder, you can open an elevated Windows Azure SDK Command Prompt and run: “csrun /devfabric:shutdown” followed by “csrun /devfabric:clean”.  You really don’t need to do this but it can come in handy from time to time.

    There are some cases where the length of the path can cause problems.

If the combination of your username, cloud service project name, role name and assembly name gets so long, you can run into assembly or file loading issues at runtime.  You’ll see the following error message when you hit F5:

    “The path is too long after being fully qualified.  Make sure the full path is less than 260 characters and the directory name is less than 248 characters.”

    For example, in my test, the path to one of the assemblies in my cloud service was:

    C:\Users\jnak\AppData\Local\dftmp\s0\deployment(4)\res\deployment(4).CloudServiceabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqr.WebRole1.0\AspNetTemp\aspNetTemp\root\aff90b31\aa373305\assembly\dl3\971d7b9b\0064bc6f_307dca01\Microsoft.WindowsAzure.Diagnostics.DLL

    which exceeds the 260 character path limit.

If you aren’t married to your project and assembly names, you could name those differently so that they are shorter.

    The other workaround is to change the location of the development fabric temporary folder to be a shorter path.

You can do this by setting the _CSRUN_STATE_DIRECTORY environment variable to a shorter path, say “C:\A” for example.
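
If you prefer the command line to the System Properties UI, one way to set this as a user-level environment variable (assuming “C:\A” as the new location) is:

setx _CSRUN_STATE_DIRECTORY C:\A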

    image

Make sure that you close Visual Studio and shut down the development fabric by using the “csrun /devfabric:shutdown” command I mentioned above or by clicking “exit” on the Windows Azure tray icon.

    After making this change, my sample above was able to run without problem.

    Of course, this workaround really only buys you more characters and ultimately you may have to simply reduce your path lengths through renaming.

  • Cloudy in Seattle

    Walkthrough: Windows Azure Blob Storage (Nov 2009 and later)

    • 27 Comments

    Similar to the table storage walkthrough I posted last week, I updated this blog post for the Nov 2009/v1.0 and later release of the Windows Azure Tools.

This walkthrough covers what I found to be the simplest way to get a sample up and running on Windows Azure that uses the Blob Storage service. It is not trying to be comprehensive or to dive deep into the technology; it just serves as an introduction to how the Windows Azure Blob Storage Service works.

    Please take the Quick Lap Around the Tools before doing this walkthrough.

    Note: The code for this walkthrough is attached to this blog post.

    After you have completed this walkthrough, you will have a Web Role that is a simple ASP.NET Web Application that shows a list of files that are stored and can be downloaded from Blob Storage. You can use the Web Role to add files to Blob storage and make them available in the list.

    image

    Blob Concepts

Each storage account has access to blob storage. For each account there can be 0..n containers. Each container contains the actual blobs, each of which is a raw byte array. Containers can be public or private. In a public container, the URLs to the blobs can be accessed over the internet, while in a private container, only the account holder can access those blob URLs.
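
For example, a blob in a public container is addressable over the internet with a URL of the following form (the account, container and blob names here are made up):

http://myaccount.blob.core.windows.net/publicfiles/image001.jpg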

    Each Blob can have a set of metadata set as a NameValueCollection of strings.

    image

    Creating the Cloud Service Project

    1. Start Visual Studio as an administrator (Screen shots below are from VS 2008, VS 2010 is also supported)

    2. Create a new project: File | New Project

    3. Under the Visual C# node (VB is also supported), select the “Cloud Service” project type then select the “Windows Azure Cloud Service” project template. Set the name to be “SimpleBlobSample”. Hit OK to continue.

    image

    This will bring up a dialog to add Roles to the Cloud Service.

    4. Add an ASP.NET Web Role to the Cloud Service, we’ll use the default name of “WebRole1”.  Hit OK.

    image

    Solution explorer should look as follows:

    image

    We’ll now cover the implementation, which can be broken up into 5 different parts:

    1. Implementing the UI
    2. Connecting to Windows Azure storage
    3. Adding blobs
    4. Enumerating existing blobs
    5. Deleting blobs

    Implementing the UI

    5. Next open up Default.aspx and add the code for the UI. The UI consists of:

    • GridView at the top
    • Label and FileUpload control
    • 2 Label and TextBox pairs (File Name and Submitter)
    • Field validators to ensure that all of the fields are filled out before the file is uploaded.

    Add the following between the template generated <div></div> elements:

    <asp:GridView ID="fileView" AutoGenerateColumns="false" DataKeyNames="FileUri" runat="server"
        OnRowCommand="RowCommandHandler">
        <Columns>
            <asp:ButtonField Text="Delete" CommandName="DeleteItem" />
            <asp:HyperLinkField HeaderText="Link" DataTextField="FileName" DataNavigateUrlFields="FileUri" />
            <asp:BoundField DataField="Submitter" HeaderText="Submitted by" />
        </Columns>
    </asp:GridView>
    <br />
    <asp:Label ID="filePathLabel" Text="File Path:" AssociatedControlID="fileUploadControl"
        runat="server" />
    <asp:FileUpload ID="fileUploadControl" runat="server" />
    <asp:RequiredFieldValidator ID="filUploadValidator" ControlToValidate="fileUploadControl"
        ValidationGroup="fileInfoGroup" ErrorMessage="Select a File" runat="Server">
    </asp:RequiredFieldValidator>
    <br />
    <asp:Label ID="fileNameLabel" Text="File Name:" AssociatedControlID="fileNameBox"
        runat="server" />
    <asp:TextBox ID="fileNameBox" runat="server" />
    <asp:RequiredFieldValidator ID="fileNameValidator" ControlToValidate="fileNameBox"
        ValidationGroup="fileInfoGroup" ErrorMessage="Enter the File Name" runat="Server">
    </asp:RequiredFieldValidator>
    <br />
    <asp:Label ID="submitterLabel" Text="Submitter:" AssociatedControlID="submitterBox"
        runat="server" />
    <asp:TextBox ID="submitterBox" runat="server" />
    <asp:RequiredFieldValidator ID="submitterValidator" ControlToValidate="submitterBox"
        ValidationGroup="fileInfoGroup" ErrorMessage="Enter the Submitter Name" runat="Server">
    </asp:RequiredFieldValidator>
    <br />
    <asp:Button ID="insertButton" Text="Submit" CausesValidation="true" ValidationGroup="fileInfoGroup"
        runat="server" OnClick="insertButton_Click" />
    <br />
    <br />
    <asp:Label ID="statusMessage" runat="server" />

    6. If you now switch to design view, you will see:

    image

    You’ll also notice from that aspx that there is an event handler for OnRowCommand on the GridView to handle the DeleteItem command, IDs for the TextBoxes and an event handler for the OnClick event on the Submit button.

    The code for these will be filled out further down in the walkthrough.

    Connecting to Windows Azure storage

    7. Open Default.aspx.cs and add the code to connect to the Blob Storage Service to the Page_Load() method.

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;
    
    namespace WebRole1
    {
        public partial class _Default : System.Web.UI.Page
        {
            private CloudBlobClient _BlobClient = null;
            private CloudBlobContainer _BlobContainer = null;
    
            protected void Page_Load(object sender, EventArgs e)
            {
                // Setup the connection to Windows Azure Storage
                var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
            _BlobClient = storageAccount.CreateCloudBlobClient();

            // (the rest of Page_Load is filled in in the steps below)
        }
    }
}

8.  Set up the “DataConnectionString” setting by opening up the configuration UI for WebRole1.  Right click on the WebRole1 node under the Roles folder in the SimpleBlobSample cloud service project and select “Properties”.

    image

    9. Switch to the Settings tab and click “Add Setting”.  Name it DataConnectionString, set the type to ConnectionString and click on the “…” button on the far right.

    image

    Hit “OK” to set the credentials to use Development Storage.  We’ll first get this sample working on development storage then convert it to use cloud storage later.

    10. If you actually tried to connect to Blob Storage at this point, you would find that the CloudStorageAccount.FromConfigurationSetting() call would fail with the following message:

    ConfigurationSettingSubscriber needs to be set before FromConfigurationSetting can be used

    This message is in fact incorrect – I have a bug filed to get this fixed, to say “Configuration Setting Publisher” and not “ConfigurationSettingSubscriber”.

    To resolve this, we need to add a bit of template code to the WebRole.cs file in WebRole1.  Add the following to the WebRole.OnStart() method in WebRole.cs in the WebRole1 project.

using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

    public override bool OnStart()
    {
        DiagnosticMonitor.Start("DiagnosticsConnectionString");
    
        #region Setup CloudStorageAccount Configuration Setting Publisher
    
        // This code sets up a handler to update CloudStorageAccount instances when their corresponding
        // configuration settings change in the service configuration file.
        CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
        {
            // Provide the configSetter with the initial value
            configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
    
            RoleEnvironment.Changed += (sender, arg) =>
            {
                if (arg.Changes.OfType<RoleEnvironmentConfigurationSettingChange>()
                    .Any((change) => (change.ConfigurationSettingName == configName)))
                {
                    // The corresponding configuration setting has changed, propagate the value
                    if (!configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)))
                    {
                        // In this case, the change to the storage account credentials in the
                        // service configuration is significant enough that the role needs to be
                        // recycled in order to use the latest settings. (for example, the 
                        // endpoint has changed)
                        RoleEnvironment.RequestRecycle();
                    }
                }
            };
        });
        #endregion
    
        // For information on handling configuration changes
        // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.
        RoleEnvironment.Changing += RoleEnvironmentChanging;
    
        return base.OnStart();
    }

The comments (which I wrote and which are included in the samples) should explain what is going on if you care to know.  In a nutshell, this code bridges the gap between the Microsoft.WindowsAzure.StorageClient assembly and the Microsoft.WindowsAzure.ServiceRuntime library – the Storage Client library is agnostic to the Windows Azure runtime, as it can be used in non Windows Azure applications.

    This code essentially says how to get a setting value given a setting name and sets up an event handler to handle setting changes while running in the cloud (because the ServiceConfiguration.cscfg file was updated in the cloud).

    The key point is that with this snippet of code in place, you now have everything in place to connect to Windows Azure storage and create a CloudBlobClient instance.

    Adding Blobs

11. In order to add blobs to a container, you first need to set up a container.  Let’s add this code to the Page_Load() method in Default.aspx.cs.  For a production application, you will want to optimize this code to avoid doing all this work on every page load.

    Page_Load() should now look as follows.

    protected void Page_Load(object sender, EventArgs e)
    {
        // Setup the connection to Windows Azure Storage
        var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
        _BlobClient = storageAccount.CreateCloudBlobClient();
    
        // Get and create the container
        _BlobContainer = _BlobClient.GetContainerReference("publicfiles");
        _BlobContainer.CreateIfNotExist();
    
        // Setup the permissions on the container to be public
        var permissions = new BlobContainerPermissions();
        permissions.PublicAccess = BlobContainerPublicAccessType.Container;
        _BlobContainer.SetPermissions(permissions);
    
        // Show the current list.
        UpdateFileList();
    }

    Note: The container is named with DNS naming restrictions (i.e. all lower case) and is created if it does not exist.  Additionally, the container is set to be a public container – i.e. the URIs to the blobs are accessible by anyone over the internet. 

    Had this been a private container, the blobs in that container could only be read by code that has the access key and account name.
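
To illustrate the difference, here is a minimal sketch of reading a blob from a private container using the StorageClient library; the container and blob names here are hypothetical:

// A sketch: blobs in a private container can only be read with the account credentials.
var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("privatefiles"); // hypothetical private container
var blob = container.GetBlobReference("somefile.txt");            // hypothetical blob name
string contents = blob.DownloadText();                            // issues an authenticated request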

    12.  Let’s now add the code to Default.aspx.cs to add a blob when the “Submit” button on the UI is clicked (remember the event handler was defined in the aspx).

A GUID is created for the file name to ensure a unique blob name is used.  The file name and submitter are taken from the TextBoxes in the UI.

    Blob Metadata, or user defined key/value pairs, is used to store the file name and submitter along with the blob. 

    protected void insertButton_Click(object sender, EventArgs e)
    {
        // Make a unique blob name
        string extension = System.IO.Path.GetExtension(fileUploadControl.FileName);
    
        // Create the Blob and upload the file
        var blob = _BlobContainer.GetBlobReference(Guid.NewGuid().ToString() + extension);
        blob.UploadFromStream(fileUploadControl.FileContent);
    
        // Set the metadata into the blob
        blob.Metadata["FileName"] = fileNameBox.Text;
        blob.Metadata["Submitter"] = submitterBox.Text;
        blob.SetMetadata();
    
        // Set the properties
        blob.Properties.ContentType = fileUploadControl.PostedFile.ContentType;
        blob.SetProperties();
    
        // Update the UI
        UpdateFileList();
        fileNameBox.Text = "";
        statusMessage.Text = "";
    }

    Enumerating Existing Blobs

13. In order to simplify the databinding to the UI, let’s add a FileEntry class that contains the data we want to show in the UI for each blob.  One instance of a FileEntry corresponds to a blob. 

    Right click on WebRole1 and select Add | Class…

    image

    Name the class FileEntry.cs and hit OK.

    14. Fill out FileEntry.cs with the following code:

    public class FileEntry
    {
        public Uri FileUri { get; set; }
        public string FileName { get; set; }
        public string Submitter { get; set; }
    }

    15. Back to Default.aspx.cs, let’s add code to populate the GridView by getting the collection of blobs from ListBlobs() and creating a FileEntry for each item.

    FetchAttributes() is used to retrieve the blob metadata.

    using System.Collections.Generic;
    using System.Collections.Specialized;

    private void UpdateFileList()
    {
        // Get a list of the blobs
        var blobs = _BlobContainer.ListBlobs();
        var filesList = new List<FileEntry>();
    
        // For each item, create a FileEntry which will populate the grid
        foreach (var blobItem in blobs)
        {
            var cloudBlob = _BlobContainer.GetBlobReference(blobItem.Uri.ToString());
            cloudBlob.FetchAttributes();
    
            filesList.Add(new FileEntry() { 
                FileUri = blobItem.Uri,
                FileName = cloudBlob.Metadata["FileName"],
                Submitter = cloudBlob.Metadata["Submitter"]
            });    
        }
        
        // Bind the grid
        fileView.DataSource = filesList;
        fileView.DataBind();
    }

    Deleting Blobs

16. Add code to delete the blob in the row command handler that was set up in the aspx. This is as simple as calling CloudBlob.DeleteIfExists() for the blob, where the blob name is the Guid + file extension generated during the upload.

    protected void RowCommandHandler(object sender, GridViewCommandEventArgs e)
    {
        if (e.CommandName == "DeleteItem")
        {
            var index = Convert.ToInt32(e.CommandArgument);
            var blobName = (string)fileView.DataKeys[index].Value;
        var blobContainer = _BlobClient.GetContainerReference("publicfiles"); // container names are all lower case
            var blob = blobContainer.GetBlobReference(blobName);
            blob.DeleteIfExists();
        }
    
        // Update the UI
        UpdateFileList();
    }

    Testing the Application

    17. Build and hit F5 to run the application.

    Note: The FileUpload control has a file size limit. You can modify it by changing the httpRuntime maxRequestLength attribute in web.config.
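
For example, to raise the limit to roughly 100 MB, you could set the following in web.config (the value is in KB; the default is 4096, i.e. 4 MB):

<system.web>
  <httpRuntime maxRequestLength="102400" />
</system.web>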

    image

    Moving from Development Storage to Cloud Storage

    18. Now I want to switch this to use Windows Azure Storage, not the development storage.  The first step is to go to the Windows Azure Developer Portal and create a storage account.

    Login and click on “New Service”, and select “Storage Account”:

    image

    Fill out the service name, the public name and optionally choose a region.  You will be brought to a page that contains the following (note I rubbed out the access keys):

    image

19. You will use the first part of the endpoint (jnakstorageaccount) and one of the access keys to fill out your connection string.

    20. Open the WebRole1 config again, bring up Settings | DataConnectionString and fill out the account name and the account key and hit OK.

    image

    21. Hit F5 to run the application again. 

    This time you will be running against cloud storage – note that the data you entered when running against the development storage is no longer there.

    Important Note: Before deploying to the cloud, the DiagnosticsConnectionString also needs to use storage account credentials and not the development storage account.

    Please see the Deploying a Cloud Service walkthrough to learn how to deploy this to the Windows Azure cloud.

    For more information, please see the Blob Service API documentation and the Programming Blob Storage white paper.

I know that there are a number of different concepts that have to be pieced together; hopefully this walkthrough has been helpful in understanding how everything fits together.

  • Cloudy in Seattle

    Windows Azure Article in Visual Studio Magazine

    • 2 Comments

    Good article in Visual Studio Magazine this month on Windows Azure:  http://visualstudiomagazine.com/Articles/2010/01/01/App-Dev-from-the-Ground-Up.aspx

    Ok, I might be a bit biased -- I get a few mentions in the article that I'm pretty jazzed about :)

    The quotes are from my session at PDC '09.  In case you missed it, you can always catch it here: Tips and Tricks for Using Visual Studio 2010 to Build Applications that Run on Windows Azure

    Kathleen, if you read this, thanks for the quotes!

  • Cloudy in Seattle

    Walkthrough: Windows Azure Table Storage (Nov 2009 and later)

    • 11 Comments

This walkthrough covers what I found to be the simplest way to get a sample up and running on Windows Azure that uses the Table Storage Service. It serves as an introduction to both Windows Azure cloud services as well as using table storage.  Although there is a wealth of information out there on Windows Azure, I try to tie together a lot of that information for folks of all levels to consume.

    I originally wrote this walkthrough well over a year ago for our very first public release of Windows Azure at PDC ‘08.

    Much has changed in the last year, and this post is an update to that original post that will work with our PDC ‘09 and v1.0 release of the Windows Azure Tools.

    So what's changed from a dev perspective?  Overall not a lot, mainly because table storage leverages ADO.NET Data Services and that is the core of how you work with Table Storage.  The way you connect to Windows Azure Storage has changed, namespaces and class names have changed, and there have been a few other tweaks.

    To be clear, this post is not trying to be comprehensive or trying to dive deep in the technology, it just serves as an introduction to how the Table Storage Service works.  Also, please take a look at the Quick Lap Around the Tools before doing this walkthrough.

    Note: The code for this walkthrough is attached to this blog post.

    After you have completed this walkthrough, you will have a Web Role that is a simple ASP.NET Web Application that shows a list of Contacts and allows you to add to and delete from that list. Each contact will have simplified information: just a name and an address (both strings).

    image

    Table Storage Concepts

    The Windows Azure Table Storage Service provides queryable structured storage. Each account can have any number of tables, and each table any number of entities; there is no limit on the size of a table. (the combined size of an entity's properties cannot exceed 1MB, however)

    image

    Every entity in a table always has three properties: the PartitionKey, the RowKey and the Timestamp, which are not shown above for space/legibility reasons.  The PartitionKey and RowKey together form a unique key for an entity.

    Additionally, the PartitionKey and RowKey currently form the only index, and all results are returned sorted by PartitionKey and then by RowKey.

    Design of the Sample

    When a request comes in to the UI, it makes its way to the Table Storage Service as follows (click for larger size):

    clip_image002

    The UI class (the aspx page and its code behind) is data bound through an ObjectDataSource to the WebRole1.ContactDataSource, which creates the connection to the Table Storage service, gets the list of Contacts, and handles inserts to and deletes from Table Storage.

    The WebRole1.ContactDataModel class acts as the data model object and the WebRole1.ContactDataServiceContext derives from TableServiceContext which handles the authentication process and allows you to write LINQ queries, insert, delete and save changes to the Table Storage service.

    Creating the Cloud Service Project

    1. Start Visual Studio as an administrator (Screen shots below are from VS 2008, VS 2010 is also supported)

    2. Create a new project: File | New Project

    3. Under the Visual C# node (VB is also supported), select the “Cloud Service” project type then select the “Windows Azure Cloud Service” project template. Set the name to be “SimpleTableSample”. Hit OK to continue.

    image

    This will bring up a dialog to add Roles to the Cloud Service.

    4. Add an ASP.NET Web Role to the Cloud Service, we’ll use the default name of “WebRole1”.  Hit OK.

    image

    Solution explorer should look as follows:

    image

    5. We now want to set up the data model for the entity.  Right-click on WebRole1 in the Solution Explorer and select “Add Class”.  Call this class “ContactDataModel.cs” and hit OK.

    6.  Add a using directive for the storage client library – a .NET library for using Windows Azure Storage.  The assembly reference was already added by Visual Studio.

    using Microsoft.WindowsAzure.StorageClient;

    7. Make the ContactDataModel class derive from the TableServiceEntity class. This brings in the PartitionKey, RowKey and Timestamp properties. (not necessary to derive from TableServiceEntity, but a convenience)

    8. For simplicity, we’ll just assign a new Guid as the PartitionKey to ensure uniqueness even though generally a GUID is not a good partition key.  If we were really building an address book, using a partition key that maps to a popular search field would be a good approach (contact name for example). 

    In this case, since the PartitionKey is set to a new GUID and the RowKey is set to a constant, hard-coded value (String.Empty), the storage system distributes the data over many storage nodes, prioritizing scalability (spreading load over multiple servers) over the faster performance of operations on multiple entities in a single partition. (entity locality)

    The key message here is that you’ll want to think about and make the right decision for your application/scenarios.  To learn more read the Programming Table Storage White Paper on windowsazure.com.

    public class ContactDataModel :TableServiceEntity
    {
        public ContactDataModel(string partitionKey, string rowKey)
            : base(partitionKey, rowKey)
        {
        }
    
        public ContactDataModel(): this(Guid.NewGuid().ToString(), String.Empty)
        {
        }
    
        public string Name { get; set; }
        public string Address { get; set; }
    }
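    As an illustration of the alternative, here’s a hypothetical constructor (not part of the sample) that partitions by contact name instead, so point queries by name would hit a single partition:

    // Hypothetical alternative: partition by contact name for entity locality on
    // name lookups, at the cost of less even load distribution across nodes.
    public ContactDataModel(string name)
        : this(name, Guid.NewGuid().ToString())
    {
        Name = name;
    }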

    9. Now add the ContactDataServiceContext class, which derives from TableServiceContext, to the Web Role.  Right click on WebRole1 and select Add | Class…  Name the class ContactDataServiceContext.cs.

    10.  Add the using directives.

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    11. Set the base class to be TableServiceContext.

    12. We’ll use the ContactDataServiceContext later to write queries, insert, remove and save changes to the table storage. One of the key things it does is provide the IQueryable<T> property that corresponds to a table.

    public class ContactDataServiceContext : TableServiceContext
    {
        public ContactDataServiceContext(string baseAddress, StorageCredentials credentials)
            : base(baseAddress, credentials)
        {
        }
    
        public const string ContactTableName = "ContactTable";
    
        public IQueryable<ContactDataModel> ContactTable
        {
            get
            {
                return this.CreateQuery<ContactDataModel>(ContactTableName);
            }
        }
    }

    Every IQueryable<T> property corresponds to a table in table storage.

    13. Let’s now add the ContactDataSource class. We'll fill this class out over the course of the next few steps.  This is the class that does all the hookup between the UI and the table storage service.  Right click on WebRole1 | Add | Class… and enter the file name to be “ContactDataSource.cs”.

    14. Add a reference to the System.Data.Services.Client assembly.  Right click on the References node under WebRole1 and select Add Reference…

    image

    Then scroll down in the list and select System.Data.Services.Client and click OK.

    image

    15. Now add the using directives to the top of the ContactDataSource.cs file.

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;
    using System.Data.Services.Client;

    16. For simplicity, use the instantiation of the ContactDataSource class as the location to set up the connection to Windows Azure Storage.  This involves reading a connection string from the Windows Azure settings and creating the ContactDataServiceContext with that connection information.

    public class ContactDataSource
    {
        private ContactDataServiceContext _ServiceContext = null;
    
        public ContactDataSource()
        {
            var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
            _ServiceContext = new ContactDataServiceContext(storageAccount.TableEndpoint.ToString(), storageAccount.Credentials);
        }
    }

    17.  Set up the “DataConnectionString” setting by opening up the configuration UI for WebRole1.  Right click on the WebRole1 node under the Roles folder in the SimpleTableSample cloud service project and select “Properties”.

    image

    18. Switch to the Settings tab and click “Add Setting”.  Name it DataConnectionString, set the type to ConnectionString and click on the “…” button on the far right.

    image

    Hit “OK” to set the credentials to use Development Storage.  We’ll first get this sample working on development storage then convert it to use cloud storage later.

    19. If you actually instantiated the ContactDataSource and ran this app, you would find that the CloudStorageAccount.FromConfigurationSetting() call would fail with the following message:

    ConfigurationSettingSubscriber needs to be set before FromConfigurationSetting can be used

    This message is in fact incorrect – it should say “Configuration Setting Publisher”, not “ConfigurationSettingSubscriber”.  I have a bug filed to get this fixed, and the fix should appear in the next release after our November 2009 release.

    To resolve this, we need to add a bit of template code to the WebRole.cs file in WebRole1.  Add the following using directive to the top of WebRole.cs, then add the code below to the WebRole.OnStart() method.

    using Microsoft.WindowsAzure;

    #region Setup CloudStorageAccount Configuration Setting Publisher
    
    // This code sets up a handler to update CloudStorageAccount instances when their corresponding
    // configuration settings change in the service configuration file.
    CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
    {
        // Provide the configSetter with the initial value
        configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
    
        RoleEnvironment.Changed += (sender, arg) =>
        {
            if (arg.Changes.OfType<RoleEnvironmentConfigurationSettingChange>()
                .Any((change) => (change.ConfigurationSettingName == configName)))
            {
                // The corresponding configuration setting has changed, propagate the value
                if (!configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)))
                {
                    // In this case, the change to the storage account credentials in the
                    // service configuration is significant enough that the role needs to be
                    // recycled in order to use the latest settings. (for example, the 
                    // endpoint has changed)
                    RoleEnvironment.RequestRecycle();
                }
            }
        };
    });
    #endregion

    The comments (which I wrote and which are included in the samples) should explain what is going on if you care to know.  In a nutshell, this code bridges the gap between the Microsoft.WindowsAzure.StorageClient assembly and the Microsoft.WindowsAzure.ServiceRuntime library – the Storage Client library is agnostic to the Windows Azure runtime, as it can be used in non-Windows Azure applications.

    This code essentially says how to get a setting value given a setting name and sets up an event handler to handle setting changes while running in the cloud (because the ServiceConfiguration.cscfg file was updated in the cloud).

    20. We need some code to ensure that the tables we rely on get created. Add the code to create the tables if they don't exist to the ContactDataSource constructor:

    public ContactDataSource()
    {
        var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
        _ServiceContext = new ContactDataServiceContext(storageAccount.TableEndpoint.ToString(), storageAccount.Credentials);
    
        // Create the tables
        // In this case, just a single table.  
        storageAccount.CreateCloudTableClient().CreateTableIfNotExist(ContactDataServiceContext.ContactTableName);
    }

    Note: For production code you'll want to optimize the reading of the configuration settings and the call to create the tables to improve perf -- the focus of this post is to keep things simple.
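    One hypothetical way to do that (not part of the walkthrough) is to cache the result so the table-existence check only hits storage once per app domain:

    // Sketch: double-checked locking so CreateTableIfNotExist is called at most once
    private static bool _TablesCreated;
    private static readonly object _TablesLock = new object();
    
    private static void EnsureTablesCreated(CloudStorageAccount storageAccount)
    {
        if (_TablesCreated) return;
        lock (_TablesLock)
        {
            if (!_TablesCreated)
            {
                storageAccount.CreateCloudTableClient()
                    .CreateTableIfNotExist(ContactDataServiceContext.ContactTableName);
                _TablesCreated = true;
            }
        }
    }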

    21. Add the following Select(), Delete() and Insert() methods to the ContactDataSource.cs file:

    public IEnumerable<ContactDataModel> Select()
    {
        var results = from c in _ServiceContext.ContactTable
                      select c;
    
        var query = results.AsTableServiceQuery<ContactDataModel>();
        var queryResults = query.Execute();
    
        return queryResults;
    }
    
    public void Delete(ContactDataModel itemToDelete)
    {
        _ServiceContext.AttachTo(ContactDataServiceContext.ContactTableName, itemToDelete, "*");
        _ServiceContext.DeleteObject(itemToDelete);
        _ServiceContext.SaveChanges();
    }
    
    public void Insert(ContactDataModel newItem)
    {
        _ServiceContext.AddObject(ContactDataServiceContext.ContactTableName, newItem);
        _ServiceContext.SaveChanges();
    }

    Note: in the Select() method, the TableServiceQuery<T> class enables you to have finer grained control over how you get the data.

    Note: the use of AttachTo() in the Delete() method to connect to and remove the row.
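    To make the TableServiceQuery<T> note concrete, here’s a hedged sketch (the where clause and retry values are purely illustrative):

    var query = (from c in _ServiceContext.ContactTable
                 where c.PartitionKey == partitionKey // hypothetical filter
                 select c).AsTableServiceQuery<ContactDataModel>();
    
    // Configure retries; Execute() transparently follows continuation tokens
    query.RetryPolicy = RetryPolicies.Retry(3, TimeSpan.FromSeconds(1));
    
    foreach (var contact in query.Execute())
    {
        // process each contact
    }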

    22. The UI is defined in the aspx page and consists of 3 parts: the GridView, which displays all of the rows of data; the FormView, which allows the user to add rows; and the ObjectDataSource, which databinds the UI to the ContactDataSource.

    23. The GridView is placed after the first <div>. Note that in this sample, we’ll just auto-generate the columns and show the delete button. The DataSourceId is set to the ObjectDataSource, which will be covered below.

        <asp:GridView
            id="contactsView"
            DataSourceId="contactData"
            DataKeyNames="PartitionKey"
            AllowPaging="False"
            AutoGenerateColumns="True"
            GridLines="Vertical"
            Runat="server" 
            BackColor="White" ForeColor="Black"
            BorderColor="#DEDFDE" BorderStyle="None" BorderWidth="1px" CellPadding="4">
            <Columns>
                <asp:CommandField ShowDeleteButton="true"  />
            </Columns>
            <RowStyle BackColor="#F7F7DE" />
            <FooterStyle BackColor="#CCCC99" />
            <PagerStyle BackColor="#F7F7DE" ForeColor="Black" HorizontalAlign="Right" />
            <SelectedRowStyle BackColor="#CE5D5A" Font-Bold="True" ForeColor="White" />
            <HeaderStyle BackColor="#6B696B" Font-Bold="True" ForeColor="White" />
            <AlternatingRowStyle BackColor="White" />
        </asp:GridView>    

    24. The FormView to add rows is really simple: just labels and text boxes with a button at the end to raise the “Insert” command. Note that the DataSourceID is again set to the ObjectDataSource and there are bindings to the Name and Address.

        <br />        
        <asp:FormView
            id="frmAdd"
            DataSourceId="contactData"
            DefaultMode="Insert"
            Runat="server">
            <InsertItemTemplate>
                <asp:Label
                        id="nameLabel"
                        Text="Name:"
                        AssociatedControlID="nameBox"
                        Runat="server" />
                <asp:TextBox
                        id="nameBox"
                        Text='<%# Bind("Name") %>'
                        Runat="server" />
                <br />
                <asp:Label
                        id="addressLabel"
                        Text="Address:"
                        AssociatedControlID="addressBox"
                        Runat="server" />
                <asp:TextBox
                        id="addressBox"
                        Text='<%# Bind("Address") %>'
                        Runat="server" />
                <br />
                <asp:Button
                        id="insertButton"
                        Text="Add"
                        CommandName="Insert"
                        Runat="server"/>
            </InsertItemTemplate>
        </asp:FormView>
    

    25. The final part of the aspx is the definition of the ObjectDataSource. See how it ties the ContactDataSource and the ContactDataModel together with the GridView and FormView.

        <%-- Data Sources --%>
        <asp:ObjectDataSource runat="server" ID="contactData"     TypeName="WebRole1.ContactDataSource"
            DataObjectTypeName="WebRole1.ContactDataModel" 
            SelectMethod="Select" DeleteMethod="Delete" InsertMethod="Insert">    
        </asp:ObjectDataSource>
    

    26. Build. You should not have any compilation errors.

    27. F5 to debug. You will see the app running in the Development Fabric using the Table Development Storage.

    image

    28. Now I want to switch this to use Windows Azure Storage, not the development storage.  The first step is to go to the Windows Azure Developer Portal and create a storage account.

    Login and click on “New Service”, and select “Storage Account”:

    image

    Fill out the service name, the public name and optionally choose a region.  You will be brought to a page that contains the following (note I rubbed out the access keys):

    image

    29. You will use the first part of the endpoint (jnakstorageaccount in this case) and one of the access keys to fill out your connection string.

    30. Open the WebRole1 config again, bring up Settings | DataConnectionString and fill out the account name and the account key and hit OK.

    image

    31. Hit F5 to run the application again. 

    This time you will be running against cloud storage – note that the data you entered when running against the development storage is no longer there.

    Important Note: Before deploying to the cloud, the DiagnosticsConnectionString also needs to use storage account credentials and not the development storage account.

    32. Please see the Deploying a Cloud Service walkthrough to learn how to deploy this to the Windows Azure cloud.

    And there you have it, a walkthrough of using Windows Azure Table Storage.  For more information and to dig deeper, definitely check out the white paper here: Windows Azure Table – Programming Table Storage, and the docs in MSDN that cover what subsets of ADO.NET Data Services work with table storage here: http://msdn.microsoft.com/en-us/library/dd135720.aspx and here: http://msdn.microsoft.com/en-us/library/dd894032.aspx

  • Cloudy in Seattle

    How to: Add an HTTPS Endpoint to a Windows Azure Cloud Service

    • 18 Comments

    Back in May I posted about Adding an HTTPS Endpoint to a Windows Azure Cloud Service and with the November 2009 release of the Windows Azure Tools, that article is now obsolete.

    In the last week I received a number of requests to post a new article about how to add HTTPS endpoints with the November 2009 release and later and I’m always happy to oblige!

    To illustrate how to add an HTTPS endpoint to a Cloud Service, I’ll start with the thumbnail sample from the Windows Azure SDK – the web role in that sample only has an http endpoint and I’ll walk through the steps to add an HTTPS endpoint to that web role.

    Open the Windows Azure SDK folder, for me, that's C:\Program Files\Windows Azure SDK\v1.0 and unzip the samples-cs.zip file to a writeable location.

    In the samples folder, there is a sample called “Thumbnails”, open the solution Thumbnails.sln in Visual Studio. (2008 or 2010 will work)

    Hit F5 to run the application and make sure everything works as expected. 

    If you aren’t familiar with this sample, it allows you to select an image locally via the web role which will upload that image to blob storage and will communicate the location to the worker via a queue.  The worker will generate a thumbnail for that image and the web role displays the thumbnail.

    Adding an HTTPS endpoint is a 3 step process:

    1. Configure the endpoint
    2. Upload the certificate to the Cloud
    3. Configure the SSL certificate (and then point the endpoint to that certificate)

    Adding the HTTPS Endpoint

    To configure the endpoint, open up the configuration UI on the WebRole by right clicking on the Thumbnails_WebRole node under the Roles node in the Solution Explorer and selecting “Properties”.

    image

    Switch to the Endpoints tab and click the checkbox to select “HTTPS”.

    image

    This will add the HTTPS endpoint but not specify the certificate.

    Switch to the Configuration page and uncheck the “Launch browser for: HTTP endpoint” option. By unselecting this option, on run or debug of the cloud service, the default browser will only be launched for the HTTPS endpoint.

    image

    HTTPS Endpoints on the Local Development Fabric

    Click on the Debug menu, Debug | Start Debugging to package and run the cloud service on the local development fabric.

    The development simulation always uses a self-signed certificate issued to and issued by 127.0.0.1 which corresponds to the local host. This certificate is installed as part of the Windows Azure Tools + SDK.

    This is an important thing to note as the certificate configuration I'm about to describe below only applies when the application is running on the cloud.

    By default, this certificate is not root trusted and running an application with an https endpoint on the development fabric will result in a certificate error on the local development fabric.  Click on “Continue to this website (not recommended)” to browse to the web site:

    image

    For more information about the certificate Windows Azure installs including how to get rid of the certificate error when running on the development fabric -- see the end of this post.

    The Certificate

    To configure a certificate for use in the cloud, you will need a certificate that you upload to the cloud and then configure for the role.

    For the purpose of this article, we’ll create and use a self-signed certificate, generated using the IIS Manager.

    Open the IIS Manager and select “Server Certificates”.

    Select “Create Self-Signed Certificate…” under the “Actions” heading on the far right of the dialog.

    image

    After creating the cert, click on “Export…” to export the certificate to a pfx file. Provide a password you’ll remember. 

    image

    The benefit of using IIS to create the certificate is that it is the easiest way I know to create a certificate that has the appropriate settings and an exportable private key.
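    If you prefer a command line, makecert can also produce a self-signed certificate with an exportable private key – a sketch from memory (run from a Visual Studio command prompt; the CN value is just an example):

    makecert -r -pe -n "CN=mytestcert" -ss my -sr localMachine -sky exchange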

    Uploading the Certificate

    We’ll now proceed with the upload step. Navigate to the Windows Azure Developer Portal and select a Hosted Service – this will be the same Hosted Service that you will deploy your application to later.

    On the Certificates heading at the bottom of that page, select “Manage”.

    image

    This will bring you to the following page:

    image

    Upload the certificate by entering the name of the pfx file and the corresponding password you entered during the export step above, and click “Upload”.

    Copy the certificate Thumbprint to your clipboard after it is installed to your Hosted Service.

    image

    Configuring the Certificate

    Go back to the Visual Studio Thumbnails_WebRole configuration UI, click on the Certificates tab and click “Add Certificate”.

    Give the certificate a name (i.e. sslCert), paste in the Thumbprint in the Thumbprint field (you copied it to your clipboard after uploading the certificate to the portal) and leave the store location and name at the default of LocalMachine and My.

    image

    The certificates configuration page allows you to specify for a given role what certificates should be installed to the VM instances for that role and in which stores to install those certificates.

    In this case, you are telling Windows Azure to install the certificate you uploaded via the Portal to all VM instances that are created for the Thumbnails_WebRole web role.

    Switch to the Endpoints tab and select “sslCert” for the HTTPS certificate.

    Now deploy your application to the same Hosted Service in the Cloud where you uploaded the certificate.

    Once the application is deployed, you will be able to access it via http and https. You can see from the screen shot below, the certificate I uploaded is being used for the https endpoint for the cloud service I deployed:

    image

    Note: Since we uploaded a self-signed certificate, your web browser will display a certificate error when browsing to your https endpoint, using a real signed certificate will solve this problem.

    Also note that you may have to add intermediate certificates to complete the certificate chain.  You can do this by uploading additional certificates via the Portal and configuring those certificates in the Certificates tab of the role configuration UI. (more on this in a subsequent post)

    Known Issues:

    If you forget to set the SSL certificate name for the HTTPS endpoint on the Endpoints page of the role configuration UI, you'll still be able to run on the local development fabric (as per above, it always uses the installed certificate) but when you publish, it will fail with an error in the error list:

    No SSL certificate specified for https input endpoint 'HttpsIn' of role 'Thumbnails_WebRole' 

    Windows Explorer (and your default browser) will still come up and the Thumbnails.cspkg file will appear as a 0 byte file which is misleading.  We will be fixing this in a future release such that it is more obvious that there was an error.

    Specifying the SSL Certificate Used by the Development Fabric to be Trusted

    To find the certificate that is installed by Windows Azure, run the Microsoft Management Console by typing “mmc” in the Start menu.

    Select “Add/Remove Snap-in…”:

    image

    And select “Certificates”, “Computer Account”:

    image

    Click “Next” to accept “Local computer”, then hit “OK”.

    Under Personal\Certificates, you will see the 127.0.0.1 certificate that was installed.

    image

    Because the certificate is not root trusted (it’s installed to the Personal store), when you run applications that have an SSL endpoint on the local development fabric, the web browser will come up with a certificate error indicating that “There is a problem with this website’s security certificate”.

    This is expected. Click on “Continue to this website (not recommended)” to browse to the web site:

    image

    If you would like to make the certificate root trusted and therefore not see the Certificate errors, you can install the certificate to the “Trusted Root Certification Authorities” Certificate store. Simply drag it to the “Trusted Root Certification Authorities” folder in the mmc window. (you can also move it to the Current User TRCA store if you prefer)

    To be on the safe side, please don’t trust any HTTPS web sites with any valued information on a machine where you have made this change.

    Now if you made the 127.0.0.1 certificate root trusted, when you run the application, the web site will come up without the error:

    image

  • Cloudy in Seattle

    Web Site Projects and Windows Azure

    • 0 Comments

    Currently (November 2009), the Windows Azure Tools for Visual Studio only support Web Application projects – the type of Web projects that have a project file and are compiled.

    For most folks, the reason to choose a Web Site project was the ability to update it easily on the server, and generally, the target customer for Windows Azure is typically using a Web Application project. (for a good article about the differences and when to use each one, please see this post)

    Because Windows Azure has a deployment model where you can’t update the files on the server itself and considering the target customer, this seemed like a reasonable approach.

    That said, there are a lot of apps out there that are in the Web Site format that folks want to deploy to Windows Azure and we’re figuring out the best way to support this moving forward.

    For the time being, most folks are doing the conversion from Web Site to Web Application project and I wanted to point to a good post off the Visual Web Developer blog that will help make this easier: http://blogs.msdn.com/webdevtools/archive/2009/10/29/converting-a-web-site-project-to-a-web-application-project.aspx

  • Cloudy in Seattle

    Add and Vote for Windows Azure Features

    • 0 Comments

    [Changing title to be more clear]

    Mike Wickstrand, the director of Windows Azure product planning has put together a site where you can post and vote for Windows Azure ideas: http://www.mygreatwindowsazureidea.com

    The idea came from the success we had with the Silverlight feature suggestions page and we hope to duplicate that success.

    Please take the time to go to http://www.mygreatwindowsazureidea.com/ today to submit your ideas and your votes - and don't forget about the areas that are my passion - the developer experiences and tools.

  • Cloudy in Seattle

    ASP.NET Provider Scripts for SQL Azure

    • 4 Comments

    If you want to use the ASP.NET Providers (membership, role, personalization, profile, web event provider) with SQL Azure, you'll need to use the following scripts or aspnet-regAzure.exe tool to set up the database: http://support.microsoft.com/default.aspx/kb/2006191

    Currently the only provider which is not supported on SQL Azure is the session state provider.

    Personally, I like using SSMS 2008 R2 to connect to SQL Azure and using the Query window to run the scripts. (if you already have SSMS 2008 installed, you can use that as well, just connect from the Query window itself, not the Object Explorer as that will fail)

    Note: I use SQL Server Authentication with the following credentials:

    • Server name: <servername>.database.windows.net
    • Login: <username>@<servername>

  • Cloudy in Seattle

    Videos of the Windows Azure Sessions at PDC09

    • 5 Comments

    Here are the videos of the Windows Azure sessions at PDC09.  Lots of useful content, the sessions were well attended and well received.

    At the time of this writing, some of the videos are not yet posted but they will be by the end of the week.

    Enjoy.

    Windows Azure Sessions

    My session -- Tips and Tricks for Using Visual Studio 2010 to Build Applications that Run on Windows Azure

    Introductory

    Learn to Develop for Windows Azure

    Windows Azure Storage

    Windows Azure as an Open Platform

    SQL Azure Sessions

    Showcases

  • Cloudy in Seattle

    ASP.NET MVC and Windows Azure (November 2009 edition)

    • 7 Comments

    With the November release of the Windows Azure Tools for Microsoft Visual Studio, we’ve done some things in Visual Studio 2010 to make it easier to use ASP.NET MVC and Windows Azure together.

    Note:  In the November release of the Tools + SDK, Windows Azure only supports .NET 3.5 SP1, .NET 4.0 projects are not supported.

    Creating a New Project

    The first thing you’ll notice in Visual Studio 2010 is that we now have a project template option for an ASP.NET MVC 2 Web Role.  (Click on File | New | Project… | Windows Azure Cloud Service)

    Note: To use ASP.NET MVC and Windows Azure together on Visual Studio 2008 – please follow the steps under “Using an Existing Project”.

    clip_image002

    This makes it easy to create a new ASP.NET MVC project in the context of a cloud service.  This is available in Visual Studio 2010 and Visual Web Developer 2010 Express.

    The differences between an ASP.NET MVC Web Role and a regular ASP.NET MVC project are the following:

    1) Windows Azure specific references to the diagnostics library, runtime and storage client.

    clip_image004

    2) Setting the CopyLocal property of the System.Web.Mvc reference to true to ensure that it gets copied up to the cloud, which will not have it by default. (see the .csproj sketch after this list)

    clip_image006

    3) Web.config is modified to add a Windows Azure specific TraceListener (Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener)

    4) WebRole.cs file that includes template code to bootstrap the logging and diagnostics infrastructure as well as a default behavior for handling configuration changes.

    Aside from that, the project is the same as any ASP.NET MVC project that you would create outside the context of a Windows Azure cloud service.
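    For reference, setting CopyLocal to true in the IDE corresponds to the Private metadata on the reference in the .csproj file – roughly the following (the version and public key token shown are what I’d expect for ASP.NET MVC 2, so verify against your machine):

    <Reference Include="System.Web.Mvc, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35, processorArchitecture=MSIL">
      <Private>True</Private>
    </Reference>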

    Using an Existing Project

    To use an existing ASP.NET MVC project, simply add it to the solution as an existing project.  This is also how you would use ASP.NET MVC Projects (or any existing ASP.NET Web Application project) on VS 2008.

    Note:  In the November release of the Tools + SDK, Windows Azure only supports .NET 3.5 SP1, .NET 4.0 projects are not supported.  If you are adding a project that targets .NET Framework 4.0, please change the target framework in the project properties to use the .NET Framework 3.5.

    clip_image008

    Right click on the Roles node in the Cloud Service project and select Add | Web Role Project in Solution…

    clip_image010

    Then select the project you just added.

    clip_image012

    Your existing MVC project is now associated as a web role.  The changes that are made automatically for a new project (adding references, startup code, CopyLocal=true and the web.config trace listener change) you will have to make manually to get that functionality.
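    For the startup code piece, the generated WebRole.cs looks roughly like the following (a sketch from memory, not copied verbatim from the template):

    using System.Linq;
    using Microsoft.WindowsAzure.Diagnostics;
    using Microsoft.WindowsAzure.ServiceRuntime;
    
    public class WebRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // Start collecting diagnostics data using the DiagnosticsConnectionString setting
            DiagnosticMonitor.Start("DiagnosticsConnectionString");
    
            // Default behavior: recycle the role when a configuration setting changes
            RoleEnvironment.Changing += RoleEnvironmentChanging;
    
            return base.OnStart();
        }
    
        private void RoleEnvironmentChanging(object sender, RoleEnvironmentChangingEventArgs e)
        {
            // Setting e.Cancel to true requests a recycle of this role instance
            if (e.Changes.Any(change => change is RoleEnvironmentConfigurationSettingChange))
            {
                e.Cancel = true;
            }
        }
    }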

    Using ASP.NET Providers

    The default ASP.NET MVC project template makes use of the ASP.NET providers (membership, role and profile), which rely on SQL Server.  You have two choices for reworking these providers: either use SQL Azure or use the Windows Azure sample providers that use Windows Azure storage.

    Because ASP.NET will set up SQL Express automatically for the providers, if you don’t migrate your providers to use one of the cloud data options, you will find that the providers just work when running on the local development fabric but will then fail when running on the cloud where a local SQL Server instance is not available.

    To convert your project to use the Windows Azure Storage based sample implementation of the ASP.NET providers see this post: http://bit.ly/1M1HSN

    Currently the existing aspnet_regsql.exe tool and the ASP.NET functionality to generate the database for the providers does not work with SQL Azure.  We have some scripts coming that will workaround the problem.  I’ll update this post when those become available.  [Update:  these scripts are now available, please see: http://blogs.msdn.com/jnak/archive/2009/11/24/asp-net-provider-scripts-for-sql-azure.aspx]

    Known Issues

    There are two known issues with using ASP.NET MVC and Windows Azure Tools:

    1) Creating a unit tests with the Visual Basic version of ASP.NET MVC results in a unit test project that fails to build.  This is because the Windows Azure Tools creates the projects as .NET 3.5 projects.  The workaround is to not create a unit test project.  This will be resolved in the RTM version of Visual Studio. Note: this is not an issue with Visual C#.

    2) Creating an ASP.NET MVC Web Role on Visual Web Developer 2010 Express results in a project that fails to build:

    Error 1 The type name 'MembershipCreateStatus' could not be found. This type has been forwarded to assembly 'System.Web.ApplicationServices, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35'. Consider adding a reference to that assembly. c:\users\vslab2\documents\visual studio 2010\Projects\CloudService7

    The workaround is to set the Target framework of the MVC project to .NET Framework 3.5.

  • Cloudy in Seattle

    Using the Sample Windows Azure ASP.NET Providers

    • 14 Comments

    Previously, the sample Windows Azure ASP.NET providers were included in the samples folder that was installed with the SDK.

    As of the November 2009 release of the Windows Azure Tools & SDK, this is no longer the case.  The samples are available online at http://code.msdn.microsoft.com/windowsazuresamples.

    To use these samples:

    1. Download the samples and unzip (These are no longer included as part of the samples installed to the SDK folder)

    2. Add the AspProviders/Lib/AspProviders.csproj project to the solution by right clicking on the solution and selecting Add | Existing Project… and navigating to the AspProviders.csproj file.

    3. Add a reference from your ASP.NET MVC 2 Web role to the AspProviders sample library by right clicking on the “references” folder in the ASP.NET MVC project and selecting “Add Reference…”

    image

    and selecting the AspProviders assembly:

    image

    4. Open the web.config file and add/change the providers. 

    You can set the applicationName appropriately for your application. 

    These sections are added under the system.web element. Note that I'm still tracking down some issues I'm seeing with the profile provider, will update this post when I know more.

    Membership Provider:

        <membership defaultProvider="TableStorageMembershipProvider" userIsOnlineTimeWindow = "20">
          <providers>
            <clear/>
    
            <add name="TableStorageMembershipProvider"
                 type="Microsoft.Samples.ServiceHosting.AspProviders.TableStorageMembershipProvider"
                 description="Membership provider using table storage"
                 applicationName="AspProvidersDemo"
                 enablePasswordRetrieval="false"
                 enablePasswordReset="true"
                 requiresQuestionAndAnswer="false"
                 minRequiredPasswordLength="1"
                 minRequiredNonalphanumericCharacters="0"
                 requiresUniqueEmail="true"
                 passwordFormat="Hashed"
                    />
    
          </providers>
        </membership>
    

    Role Manager Provider:

      <roleManager enabled="true" defaultProvider="TableStorageRoleProvider" cacheRolesInCookie="true" cookieName=".ASPXROLES" cookieTimeout="30"
                     cookiePath="/" cookieRequireSSL="false" cookieSlidingExpiration = "true"
                     cookieProtection="All" >
          <providers>
            <clear/>
            <add name="TableStorageRoleProvider"
                 type="Microsoft.Samples.ServiceHosting.AspProviders.TableStorageRoleProvider"
                 description="Role provider using table storage"
                 applicationName="AspProvidersDemo"
                    />
          </providers>
        </roleManager>
    

    Session State Provider:

            <sessionState mode="Custom" customProvider="TableStorageSessionStateProvider">
                <providers>
                    <clear />
                    <add name="TableStorageSessionStateProvider"
                         type="Microsoft.Samples.ServiceHosting.AspProviders.TableStorageSessionStateProvider"
                         applicationName="AspProvidersDemo"
                 />
                </providers>
            </sessionState>
    

    5. Change the existing appSettings element to contain the storage endpoints and the following provider configuration:

      <appSettings>
        <add key = "TableStorageEndpoint" value="http://127.0.0.1:10002/devstoreaccount1"/>
        <add key = "BlobStorageEndpoint" value="http://127.0.0.1:10000/devstoreaccount1"/>
        <add key = "AccountName" value="devstoreaccount1"/>
        <add key = "AccountSharedKey" value="Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="/>
    
        <!-- provider configuration -->
        <!-- When using the local development table storage service only the default values given
         below will work for the tables (Membership, Roles and Sessions) since these are the names
         of the properties on the DataServiceContext class -->
        <add key = "DefaultMembershipTableName" value="Membership"/>
        <add key = "DefaultRoleTableName" value="Roles"/>
        <add key = "DefaultSessionTableName" value="Sessions"/>
        <add key = "DefaultProviderApplicationName" value="ProviderTest"/>
        <add key = "DefaultProfileContainerName"/>
        <add key = "DefaultSessionContainerName"/>
      </appSettings>
    
    

    You can now remove the “ApplicationServices” connection string as all of the providers that referenced it are gone.

    6. Hit F5 to debug the application and your application is now using the sample ASP.NET providers that run against Windows Azure storage!

    What’s next?  You can bet that we’ll put some more work into these providers to make them better and potentially even provide them as a real library. 

    Stay tuned.

  • Cloudy in Seattle

    Tips and Tricks for Using Visual Studio 2010 to Build Applications that Run on Windows Azure

    • 0 Comments

    Thanks to everyone who attended my session at PDC today.  I really hope that you took something useful away from it.

    As mentioned in the session, I’m posting the set of tips that were covered. 

    You can now view the session online at: http://microsoftpdc.com/Sessions/SVC53

    If you missed my session, I went through a "green field" new Windows Azure Cloud Service project developer scenario as well as a "brown field" moving an existing ASP.NET web application to run on Windows Azure developer scenario. 

    Through the context of those two walkthroughs, I covered the following Tips and Tricks.

    Tips for Getting Started

    1. The Web Platform Installer automates a number of the steps to install the Windows Azure Tools for VS 2008 or to install IIS prior to installing the Windows Azure Tools for VS 2010.

    2. Get the patches - http://msdn.microsoft.com/en-us/azure/cc974146.aspx

    3. Name your projects before creating them.  When you add roles to your cloud service solution, it is much easier to rename them at that stage than after the solution and projects have been created.

    clip_image002

    The reason is that even though you can use the Solution Explorer to rename the projects after creation, the folders where those projects reside will not be renamed.

    4. Include the Static Content module and register the Silverlight mime type in IIS – I actually didn’t mention this one (on purpose) in my session as it fell below the cut line, but the development fabric uses IIS under the hood, and every so often we run into folks who get unexpected results when they run their apps on IIS because they are used to running them on the ASP.NET Development Server.

    To enable the static content module: See tip #1 above (use WebPI) or see this article http://technet.microsoft.com/en-us/library/cc732612(WS.10).aspx

    To register the xap mime type (typically only needed for older OSs):

    1. Open Internet Information Services (IIS) Configuration Manager and select the server to manage (usually the top-level item on the left side)
    2. Double-click MIME Types on the right side
    3. Add an entry to map the .xap extension to the application/x-silverlight-app MIME type
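    If you’d rather script the mime type registration, something along these lines should work – appcmd syntax quoted from memory, so treat it as a sketch:

    %systemroot%\system32\inetsrv\appcmd set config /section:staticContent /+"[fileExtension='.xap',mimeType='application/x-silverlight-app']"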

    If you are using WCF, don’t forget to install the WCF Activation Windows feature – Control Panel | Programs | Turn Windows features on or off | Microsoft .NET Framework 3.5.1 | Windows Communication Foundation HTTP Activation

    Tips During Development

    5. Always keep the Cloud Service project as the StartUp project to get the desired Run/Debug on the development fabric behavior.

    6. Know the 3 differences between web roles and web application projects.  The 3 differences are:

    • References to the Windows Azure specific assemblies: Microsoft.WindowsAzure.Diagnostics, Microsoft.WindowsAzure.ServiceRuntime, and Microsoft.WindowsAzure.StorageClient
    • Bootstrap code in the WebRole.cs/vb file that starts the DiagnosticMonitor as well as defines a default behavior of recycling the role when a configuration setting change occurs.
    • The addition of a trace listener in the web.config file: Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener.
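    For reference, that trace listener entry in web.config looks approximately like this (assembly version from the November 2009 release; verify against your generated project):

    <system.diagnostics>
      <trace>
        <listeners>
          <add name="AzureDiagnostics"
               type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
        </listeners>
      </trace>
    </system.diagnostics>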

    7. In general, WCF works correctly on Windows Azure.  There is a problem using "add service reference" or svcutil but we have a patch to work around the problem.  The patch is installed in the cloud, and more information about this is here: http://code.msdn.microsoft.com/wcfazure/Wiki/View.aspx?title=KnownIssues (note that a web.config change is also required)

    8. Use the tools’ configuration UI.  Click on a role under the “Roles” node in Solution Explorer and select “Properties”.

    image

    This will bring up a UI over the Service Definition and Service Configuration file sections for that selected Role:

    image 

    9. Each role instance runs in its own process, and all role processes are attached to the debugger

    • Web Roles run in WaWebHost.exe, one process per instance
    • Worker Roles run in WaWorkerHost.exe, one process per instance

    image

    When debugging, Visual Studio will attach to all of those processes as well as your web browser to enable debugging across roles and across instances. 

    You can correlate the role instance you are debugging to the instance in the development fabric by looking at the log messages and matching the PID.  You can also right click and attach to the debugger from the development fabric and the subsequent dialog also shows the PID.

    These processes in the Cloud will be 64-bit processes running on 64 bit Windows.  Keep that in mind especially if you are calling native methods through pinvoke.
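    A tiny illustrative check if you depend on native DLLs (purely a sketch):

    // In the cloud, role processes are 64-bit; a 32-bit-only native DLL will fail to load there.
    bool is64BitProcess = (IntPtr.Size == 8);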

    10. Project settings for debugging in role projects are read when running on the development fabric – although we don’t read all of the values, we do pay attention to the project properties debugger settings.  For example, on a web application project | project properties | Web tab, you can configure whether you want to do Native Code debugging or Silverlight debugging.

    image

    11. Add multiple cloud service projects to the solution - you can use different cloud service projects to manage different configuration/definition files and different role configurations.

    Tips for Migration

    The NerdDinner sample code can be found at: http://nerddinner.codeplex.com/

    12. Use an existing web application project as a web role

    clip_image002[5]

    Right click on the Roles node in the Cloud Service project and select Add | Web Role Project in Solution…

    clip_image004

    Then select the project you just added.

    clip_image006

    Then refer to tip #6 above (know the differences between an ASP.NET Web App project and a Web Role project) and selectively add references, trace listener and startup code as desired.

    13. ASP.NET Provider scripts for SQL Azure

    To use the ASP.NET providers with SQL Azure, you can use these scripts: http://support.microsoft.com/default.aspx/kb/2006191 to setup the database.

    14. Know what Visual Studio tools work with SQL Azure.

    Use SQL Server Management Studio 2008 R2 CTP3 (November) to connect to SQL Azure. (SSMS 2005 is not compatible, and SSMS 2008 only supports SQL Azure from the Query window)

    Tips for Deployment

    15. Test your app on the local development fabric with cloud storage.  There are two reasons to do this:

    • Deploying your application to the cloud in stages makes it much easier to diagnose any issues you encounter.  Being able to debug the app locally while you move the data to the cloud will help you flush out a number of issues
    • When your application is running in the cloud, run additional roles locally to debug issues that are related to production data that would otherwise be difficult to reproduce locally or hard to debug in the cloud.

    16. Crack the Service Package using the _CSPACK_FORCE_NOENCRYPT_ environment variable. See this post for more information.

    17. Use the Service Management APIs and PowerShell cmdlets to automate deployment.  See this post for more information.

    The script I used to deploy the service was:

    $cert = Get-Item cert:\CurrentUser\My\<enter cert>
    $sub = "<enter subscription ID>"
    $servicename = '<enter service name>'
    $package = "<enter url to service package in blob storage>"
    $config = "<enter path to service configuration file>"

    Add-PSSnapin AzureManagementToolsSnapIn

    Get-HostedService $servicename -Certificate $cert -SubscriptionId $sub |
        New-Deployment Staging $package $config -Label 'PDC09Staging' |
        Get-OperationStatus –WaitToComplete

    Get-HostedService $servicename -Certificate $cert -SubscriptionId $sub |
        Get-Deployment -Slot Staging |
        Set-DeploymentStatus 'Running' |
        Get-OperationStatus -WaitToComplete

    csmanage (tool that exercises the Service Management APIs) and other samples that aren't included in the Windows Azure SDK can be found here: http://code.msdn.microsoft.com/windowsazuresamples

  • Cloudy in Seattle

    Tools to Configure Windows Azure Service Definition and Configuration Files

    • 0 Comments

    One of the features we added to the November 2009 release of the Windows Azure Tools is a set of tools to configure the Service Definition and Service Configuration files – yes, no more XML editing!

    To access the “Service Model Configuration Pages”, right click on the role under the Cloud Service “Roles” node in Solution Explorer and select “Properties”. (you can also double click the role)

    image

    This will bring up our UI over the definition and configuration files:

    image

    Here are a few more screen shots.

    Settings, including being able to create connection strings:

    image

    Endpoints for web roles:

    image

    Endpoints for Worker Roles – more flexible than web roles:

    image

    Local Storage:

    image

    Certificate management.  Declare the install of certificates in the VMs for the role that is being configured.  The certificates need to be uploaded separately through the Windows Azure Developer Portal.

    image

  • Cloudy in Seattle

    3 Short Videos - Introduction to Windows Azure Tools for Microsoft Visual Studio 2010

    • 0 Comments

     [Update 2/10/2010 - These videos moved so I needed to update the links.]

    Straight from the halls of Microsoft - a 3 video series on building Windows Azure applications using Visual Studio 2010 Beta 2.  The videos showcase the Windows Azure Tools.

    The goal was to focus on the developer experiences and to keep the videos relatively short (around 10 minutes each) so that they are easily consumed.

    Part 1: Windows Azure: Getting the Tools, Creating a Project, Creating Roles and Configuration (14 minutes, 26 seconds)

    Windows Azure: Getting the Tools, Creating a Project, Creating Roles and Configuration

    Part 2: Running and Debugging a Windows Azure Application Locally with Visual Studio (6 minutes, 11 seconds)

    Running and Debugging a Windows Azure Application Locally with Visual Studio

    Part 3: Deploying Windows Azure Applications from Visual Studio  (6 minutes, 56 seconds)

    Deploying Windows Azure Applications from Visual Studio

  • Cloudy in Seattle

    November 2009 Release of the Windows Azure Tools and SDK

    • 1 Comments

    Today we released several new features for Windows Azure through the Windows Azure Tools and SDK.  (Use the direct link while the release propagates)

    We look forward to discussing these new Windows Azure features, in detail, at PDC '09.

    This release adds support for Visual Studio 2010 Beta 2 and VWD Express 2010 Beta 2.

    Lots of changes and new features in the November 2009 release:

    • Service Model UI: A redesigned and significantly more complete interface for manipulating Role configuration information. To access, double-click on a role node in the Solution Explorer.
    • Additional role templates: Support for ASP.NET MVC 2 (2010 only), F# worker roles (2010 only), and WCF Service Application web roles.
    • Support for dynamically creating tables: The Create Tables functionality is now performed automatically; there is no longer a need to right-click and select Create Tables… on the project after your table definitions have changed.
    • Full support for and installation of the November Windows Azure SDK release:
      • The sample storage client has been replaced by a new production quality library.
      • New Diagnostics library enables logging using .NET APIs and enables the collection of diagnostic information from the service.
      • Service Runtime library updated to support inter-role communication and notification of configuration changes.
      • Support for input endpoints on Worker Roles.
      • Higher fidelity simulation of Development Storage: supports all current cloud storage features, including dynamically creating tables.
      • Ability to choose the size of the VM for a role instance.
      • Ability to persist data in local storage even after the role is recycled.
      • Ability to manage certificates to install to the role VMs.

    Updated and additional samples are available at: http://code.msdn.microsoft.com/windowsazuresamples

    I’m pretty excited about this for a few reasons. 

    • We've been working really hard on it for quite a while now and it feels so good to see it go live. 
    • As you can see from the list above, we’ve packed it full of new things that you’ve been asking for!
    • You can now use Visual Studio 2010 Beta 2 with the Windows Azure Tools
    • I have a ton of things to blog about :) (including updating some of my walkthroughs to work against this new release)

    Let me know what you think!

  • Cloudy in Seattle

    Windows Azure Platform TCO/ROI Analysis Tool

    • 0 Comments

    We just released a tool to help you figure out how much money you can save by switching to Windows Azure.  It’s quite comprehensive and I’m sure will be quite useful for a lot of folks.

    http://www.microsoft.com/windowsazure/tco/

    To give you a really rough idea of what this looks like:

    image
