October, 2008

  • Cloudy in Seattle

    Windows Live Web Role for Windows Azure

    • 2 Comments

    In my prior post that talked about the difference between the Cloud Service templates and the Role templates, I mentioned that out of the box, we only supported one kind of Web Role -- an ASP.Net Web Application.  (That is, if you used the Roles node in the Cloud Service (ccproj) project to add or replace a Web Role, you only got the option to add an ASP.Net Web Application.)

    The cool thing is that we made the role templates extensible so other teams can add in Windows Azure versions of their templates.

    The first team to do this? The Windows Live Tools team -- they have an add-in to make incorporating Windows Live services into your Web application easier.

    Check out Vikas' post on the Windows Live Tools release that has templates for Windows Azure.  This is really cool.

    (and before you ask, there isn't an add-in for ASP.Net MVC at this point.  We do have a sample project that will make it easy for you to get started with MVC on Windows Azure)

  • Cloudy in Seattle

    PDC Hands on Labs

    • 1 Comments

    The Azure Services Training Kit - PDC Preview, which contains the Hands on Labs that folks are doing at the PDC, is now available for download.

    From the download page:

    The Azure Services Training Kit will include a comprehensive set of technical content including samples, demos, hands-on labs, and presentations that are designed to help you learn how to use the Azure Services Platform. This initial PDC Preview release includes the hands-on labs that were provided at the PDC 2008 conference. These labs cover the broad set of Azure Services including Windows Azure, .NET Services, SQL Services, and Live Services. Additional content will be included in future updates of this kit.

  • Cloudy in Seattle

    Windows Azure Walkthrough: Blob Storage Sample

    • 5 Comments

    Please see the updated post for November 2009 and later.

    This walkthrough covers what I found to be the simplest way to get a sample up and running on Windows Azure that uses the Blob Storage service. It is not trying to be comprehensive or to dive deep into the technology; it just serves as an introduction to how the Windows Azure Blob Storage Service works.

    Please take the Quick Lap Around the Tools before doing this walkthrough.

    Note: The code for this walkthrough is attached; you will still have to add and reference the Common and StorageClient projects from the Windows Azure SDK.

    After you have completed this walkthrough, you will have a Web Role that is a simple ASP.Net Web Application that shows a list of files that are stored and can be downloaded from Blob Storage. You can use the Web Role to add files to Blob storage and make them available in the list.

    image

    Blob Concepts

    Each storage account has access to blob storage. For each account there can be 0..n containers. Each container contains the actual blobs, each of which is a raw byte array. Containers can be public or private. In a public container, the URLs to the blobs can be accessed over the internet, while in a private container, only the account holder can access those blob URLs.

    Each Blob can have a set of metadata set as a NameValueCollection of strings.

    image
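
    For illustration only, here's a small sketch of what "public" means in practice: anyone who knows a blob's URL can fetch it with plain HTTP. The URL below assumes the local Development Storage endpoint and the "downloadsite" container used later in this walkthrough, and the blob name is made up:

    // Sketch: downloading a blob from a public container over plain HTTP.
    // This URL format is for local Development Storage; in the cloud the host is
    // <account>.blob.core.windows.net instead.
    using System.Net;

    byte[] bytes = new WebClient().DownloadData(
        "http://127.0.0.1:10000/devstoreaccount1/downloadsite/some-blob-name.txt");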

    Creating the Cloud Service Project

    1. Start Visual Studio as an administrator

    2. Create a new project: File –> New –> Project

    3. Select “Web Cloud Service”. This will create the Cloud Service Project and an ASP.Net Web Role

    image

    4. Find the installation location of the Windows Azure SDK. By default this will be: C:\Program Files\Windows Azure SDK\v1.0

    a. Find the file named “samples.zip”

    b. Unzip this to a writeable location

    5. From the samples you just unzipped, add the StorageClient\Lib\StorageClient.csproj and HelloFabric\Common\Common.csproj to your solution by right-clicking on the solution in Solution Explorer and selecting Add –> Existing Project.

    image

    a. Common and StorageClient are libraries that are currently distributed as samples and provide functionality to help you build Cloud Applications. Common adds access to settings and logging, while StorageClient provides helpers for using the storage services.

    6. From your Web Role, add references to the Common and StorageClient projects you just added.

    image

    7. Add the code to connect to the Blob Storage Service in Default.aspx.cs.

    a. This code gets account information from the Service Configuration file and uses that to create a BlobStorage instance.

    b. From the BlobStorage instance, a container is created with a name taken from the Service Configuration. The name used for the container is restricted to valid DNS names.

    c. The container is made public so that the URLs for the blobs are accessible from the internet.

    using Microsoft.Samples.ServiceHosting.StorageClient;
    namespace DownloadSite_WebRole
    {
        public partial class _Default : System.Web.UI.Page
        {
            private BlobContainer _Container = null;
    
            protected void Page_Load(object sender, EventArgs e)
            {
                try
                {
                    // Get the configuration from the cscfg file
                    StorageAccountInfo accountInfo = StorageAccountInfo.GetDefaultBlobStorageAccountFromConfiguration();
    
                    // Container names have the same restrictions as DNS names
                    BlobStorage blobStorage = BlobStorage.Create(accountInfo);
                    _Container = blobStorage.GetBlobContainer(RoleManager.GetConfigurationSetting("ContainerName"));
    
                    // returns false if the container already exists, ignore for now
                    // Make the container public so that we can hit the URLs from the web
                    _Container.CreateContainer(new NameValueCollection(), ContainerAccessControl.Public);
                    UpdateFileList();
                }
                catch (WebException webExcept)
                {
                }
                catch (Exception ex)
                {
                }
            }

    8. In order for the StorageAccountInfo class to find the configuration settings, open up ServiceDefinition.csdef and add the following to <WebRole/>. These define the settings.

        <ConfigurationSettings>
          <Setting name="AccountName"/>
          <Setting name="AccountSharedKey"/>
          <Setting name="BlobStorageEndpoint"/>
          <Setting name="ContainerName"/>
        </ConfigurationSettings>
    

    9. Likewise, add the actual local development values to the ServiceConfiguration.cscfg file. Note that the settings between both files have to match exactly; otherwise your Cloud Service will not run.

    <ConfigurationSettings>
      <Setting name="AccountName" value="devstoreaccount1"/>
      <Setting name="AccountSharedKey" value="Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="/>
      <Setting name="BlobStorageEndpoint" value="http://127.0.0.1:10000/"/>
    
      <!-- Container, lower case letters only-->
      <Setting name="ContainerName" value="downloadsite"/>
    </ConfigurationSettings>
    

    10. When you run in the Cloud, the AccountName and AccountSharedKey will be set to the values you get back from the Portal for your account. The BlobStorageEndpoint will be set to the URL for the Blob Storage Service: http://blob.core.windows.net (a sketch of a cloud-targeted configuration follows below).

    a. Because these are set in the ServiceConfiguration.cscfg file, these values can be updated even after deploying to the cloud by uploading a new Service Configuration.
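
    For illustration, a cloud-targeted ServiceConfiguration.cscfg would look something like the following -- the account name and shared key here are just placeholders for the values you get from the Portal:

    <ConfigurationSettings>
      <Setting name="AccountName" value="youraccountname"/>
      <Setting name="AccountSharedKey" value="[shared key from the Portal]"/>
      <Setting name="BlobStorageEndpoint" value="http://blob.core.windows.net"/>
      <Setting name="ContainerName" value="downloadsite"/>
    </ConfigurationSettings>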

    11. For the local development case, the local host and port 10000 (by default) will be used as the Blob Storage Endpoint. The AccountName and AccountSharedKey are hard coded to a value that the Development Storage service is looking for (it’s the same for all 3, Table, Blob and Queue services).

    12. Next open up Default.aspx and add the code for the UI. The UI consists of:

    a. a GridView at the top

    b. Label and FileUpload control

    c. 2 Label and TextBox pairs (File Name and Submitter)

    d. Field validators to ensure that all of the fields are filled out before the file is uploaded.

    Add the following between the template generated <div></div> elements:

            <asp:GridView ID="fileView"
            AutoGenerateColumns="false" DataKeyNames="BlobName"
            Runat="server" onrowcommand="RowCommandHandler">
            <Columns>
                <asp:ButtonField Text="Delete" CommandName="DeleteItem"/>
                <asp:HyperLinkField
                    HeaderText="Link"
                    DataTextField="FileName"
                    DataNavigateUrlFields="FileUri" />
                <asp:BoundField DataField="Submitter" HeaderText="Submitted by" />
            </Columns>
        </asp:GridView>
    
        <br />
        <asp:Label id="filePathLabel" 
            Text="File Path:" 
            AssociatedControlID="fileUploadControl"
            Runat="server" />
        <asp:FileUpload ID="fileUploadControl" runat="server"/>
        <asp:requiredfieldvalidator id="filUploadValidator"
          controltovalidate="fileUploadControl"
          validationgroup="fileInfoGroup"
          errormessage="Select a File"
          runat="Server">
        </asp:requiredfieldvalidator>
        <br />
        <asp:Label
            id="fileNameLabel"
            Text="File Name:"
            AssociatedControlID="fileNameBox"
            Runat="server" />
        <asp:TextBox
            id="fileNameBox"
            Runat="server" />
        <asp:requiredfieldvalidator id="fileNameValidator"
          controltovalidate="fileNameBox"
          validationgroup="fileInfoGroup"
          errormessage="Enter the File Name"
          runat="Server">
        </asp:requiredfieldvalidator>
        <br />
        <asp:Label
            id="submitterLabel"
            Text="Submitter:"
            AssociatedControlID="submitterBox"
            Runat="server" />
        <asp:TextBox
            id="submitterBox"
            Runat="server" />
        <asp:requiredfieldvalidator id="submitterValidator"
          controltovalidate="submitterBox"
          validationgroup="fileInfoGroup"
          errormessage="Enter the Submitter Name"
          runat="Server">
        </asp:requiredfieldvalidator>
        <br />
        <asp:Button
            id="insertButton"
            Text="Submit"
            causesvalidation="true"
            validationgroup="fileInfoGroup"
            Runat="server" onclick="insertButton_Click"/>
        <br />
        <br />
        <asp:Label id="statusMessage" runat="server"/>
    

    13. If you now switch to design view, you will see:

    image

    14. In order to simplify the databinding to the UI, let's add a FileEntry class that contains the metadata for each blob.

    public class FileEntry
    {
        public FileEntry(string blobName, Uri fileAddress, string name, string user)
        {
            BlobName = blobName;
            FileUri = fileAddress;
            FileName = name;
            Submitter = user;
        }
    
        public Uri FileUri
        {
            get;
            set;
        }
    
        public string BlobName
        {
            get;
            set;
        }
    
        public string FileName
        {
            get;
            set;
        }
    
        public string Submitter
        {
            get;
            set;
        }
    }

    15. Back in Default.aspx.cs, let's add code to populate the GridView, one row for each blob found in storage. A FileEntry is created for each blob found in the container and put in a List.

    a. Note that in order to get the metadata from a blob, you need to call BlobContainer.GetBlobProperties(), the list of blobs returned from BlobContainer.ListBlobs() does not contain the metadata.

    using System.Collections.Generic;
    using System.Collections.Specialized;
    private void UpdateFileList()
    {
        IEnumerable<object> blobs = _Container.ListBlobs(string.Empty, false);
        List<FileEntry> filesList = new List<FileEntry>();
    
        foreach (object o in blobs)
        {
            BlobProperties bp = o as BlobProperties;
            if (bp != null)
            {
                BlobProperties p = _Container.GetBlobProperties(bp.Name);
                NameValueCollection fileEntryProperties = p.Metadata;
                filesList.Add(new FileEntry(p.Name, bp.Uri, fileEntryProperties["FileName"], fileEntryProperties["Submitter"]));
            }
        }
    
        fileView.DataSource = filesList;
        fileView.DataBind();
    }

    16. Add the code to upload a file to blob storage.

    a. Create a unique blob name by using a Guid and appending the existing file extension.

    b. Add metadata

    i. For the file name (friendly name to show instead of the blob name or URL)

    ii. For the Submitter

    c. Add the bytes and create the Blob

    d. The UI is refreshed after this operation

    protected void insertButton_Click(object sender, EventArgs e)
    {
        // Make a unique blob name
        string extension = System.IO.Path.GetExtension(fileUploadControl.FileName);
        BlobProperties properties = new BlobProperties(Guid.NewGuid().ToString() + extension);
    
        // Create metadata to be associated with the blob
        NameValueCollection metadata = new NameValueCollection();
        metadata["FileName"] = fileNameBox.Text;
        metadata["Submitter"] = submitterBox.Text;
    
        properties.Metadata = metadata;
        properties.ContentType = fileUploadControl.PostedFile.ContentType;
    
        // Create the blob
        BlobContents fileBlob = new BlobContents(fileUploadControl.FileBytes);
        _Container.CreateBlob(properties, fileBlob, true);
    
        // Update the UI
        UpdateFileList();
        fileNameBox.Text = "";
        statusMessage.Text = "";
    }

    17. Add code to delete the blob. This is as simple as calling BlobContainer.DeleteBlob() passing in the blob name. In this case, it is the Guid + file extension generated during the upload.

    protected void RowCommandHandler(object sender, GridViewCommandEventArgs e)
    {
        if (e.CommandName == "DeleteItem")
        {
            int index = Convert.ToInt32(e.CommandArgument);
            string blobName = (string)fileView.DataKeys[index].Value;
    
            if (_Container.DoesBlobExist(blobName))
            {
                _Container.DeleteBlob(blobName);
            }
        }
        UpdateFileList();
    }

    18. Finally, let's round out some of the error handling in Page_Load(). Right after _Container.CreateContainer(), let's update the UI and properly catch any exceptions that could get thrown.

    using System.Net;
        _Container.CreateContainer(new NameValueCollection(), ContainerAccessControl.Public);
        UpdateFileList();
    }
    catch (WebException webExcept)
    {
        if (webExcept.Status == WebExceptionStatus.ConnectFailure)
        {
            statusMessage.Text = "Failed to connect to the Blob Storage Service, make sure it is running: " + webExcept.Message;
        }
        else
        {
            statusMessage.Text = "Error creating container: " + webExcept.Message;
        }
    }
    catch (Exception ex)
    {
        statusMessage.Text = "Error creating container: " + ex.Message;
    }

    19. Build and hit F5 to run the application.

    a. Notice that the Development Fabric and the Development Storage start up on your behalf and will continue to run until you close them.

    b. Note that the FileUpload control has a file size limit. You can modify it by changing the httpRuntime maxRequestLength attribute in web.config.
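
    As a sketch (the 8192 value here is just an example, not a recommendation), the web.config change would look something like this:

    <system.web>
      <!-- maxRequestLength is specified in KB; the ASP.Net default is 4096 (4 MB) -->
      <httpRuntime maxRequestLength="8192" />
    </system.web>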

    image

    Please see Deploying a Cloud Service to learn how to modify the configuration of this Cloud Service to make it run on Windows Azure.

  • Cloudy in Seattle

    ASP.Net MVC Projects running on Windows Azure

    • 28 Comments

    [For more recent information on using ASP.NET MVC with Windows Azure please see this post.]

    Before you get started with ASP.Net MVC and Windows Azure – please install this hotfix.

    Strictly speaking, ASP.Net MVC projects are not supported on Windows Azure.  That is to say that we haven't spent the time to fully test all of the MVC scenarios when running on Windows Azure. 

    That said, for the most part, they do work, just as long as you know what tweaks you need to make in order to get up and running.

    I've attached a sample application that Phil and Eilon on the MVC team put together to help make it easier for you to get started.

    I’ll walk through the changes:

    1. Start by creating a Blank Cloud Service.  File –> New Project –> Visual C# –> Cloud Service –> Blank Cloud Service.  I call it MVCCloudService

    image

    2. Right click on the solution node in Solution Explorer and select “Add New Project”

    image

    3. In the Add New Project dialog, navigate to the Web node under Visual C# or Visual Basic and select the ASP.Net MVC Application (I call it MVCWebRole). 

    image

    4. In Solution Explorer, right click on MVCWebRole and select “Unload Project”

    image

    5. Right click again on MVCWebRole and select “Edit MVCWebRole.csproj”

    image

    6. Add <RoleType>Web</RoleType> to the top PropertyGroup in the project file.

    <Project ToolsVersion="3.5" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
      <PropertyGroup>
    {. . .}
        <RoleType>Web</RoleType>

    7. Save the project file, then reload the project.

    image 

    8. If you want to use the Windows Azure runtime library, add a reference to Microsoft.ServiceHosting.ServiceRuntime.dll by right clicking on the References node and selecting “Add reference…”.  Scroll down in the .Net Tab and you’ll find it.

    image
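
    For example, here's a minimal sketch of using the runtime library from a controller action. It assumes you have defined a setting named "WelcomeMessage" in ServiceDefinition.csdef and ServiceConfiguration.cscfg (the same pattern used in the Blob Storage walkthrough above) -- the setting name is made up for this sketch:

    using Microsoft.ServiceHosting.ServiceRuntime;
    using System.Web.Mvc;

    public class HomeController : Controller
    {
        public ActionResult Index()
        {
            // "WelcomeMessage" is a hypothetical setting defined in the csdef/cscfg;
            // RoleManager reads its value from the Service Configuration at runtime.
            ViewData["Message"] = RoleManager.GetConfigurationSetting("WelcomeMessage");
            return View();
        }
    }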

    9. Right click on the Roles node in the MVCCloudService project and select Add –> Web Role Project in solution…

    image

    Select the MVCWebRole project.

    image

    10. Set Copy Local = true on the MVC DLLs:

    • System.Web.Abstractions
    • System.Web.Mvc
    • System.Web.Routing

    These assemblies are not available when running on Windows Azure and need to be deployed with the Service Package.

    Expand the References node in the MVCWebRole project, right click on System.Web.Abstractions, and select Properties.  Change “Copy Local” to true.  Repeat for the other DLLs.

    image

    With these changes, the edit, build, debug, publish and deploy functionality will all work in Visual Studio with the Windows Azure Tools for Microsoft Visual Studio installed.

    That said, it is still "use at your own risk".

    Note: The sample has not been modified to use the Windows Azure ASP.Net providers (for example, the membership provider); stay tuned.

    The sample project is attached to this blog post.

  • Cloudy in Seattle

    Try Windows Azure Now

    • 1 Comments

    Want to try Windows Azure right now?  Install the Windows Azure SDK and the Windows Azure Tools for Microsoft Visual Studio.

    This will give you a local version of the Fabric and Storage services (called the Development Fabric and Development Storage) that will allow you to build, run and debug your Cloud Services just as they would run on Windows Azure.

  • Cloudy in Seattle

    Video Walkthrough: A Quick Lap around Windows Azure Tools for Microsoft Visual Studio

    • 6 Comments

    Watch my screencast that introduces you to the Windows Azure Tools for Microsoft Visual Studio.

    http://wm.microsoft.com/ms/msdn/azure/visualstudioazure.wmv

    image

  • Cloudy in Seattle

    Windows Azure Walkthrough: Table Storage

    • 19 Comments

    Please see the updated post for November 2009 and later.

    This walkthrough covers what I found to be the simplest way to get a sample up and running on Windows Azure that uses the Table Storage Service. It is not trying to be comprehensive or to dive deep into the technology; it just serves as an introduction to how the Table Storage Service works.

    Please take the Quick Lap Around the Tools before doing this walkthrough.

    Note: The code for this walkthrough is attached; you will still have to add and reference the Common and StorageClient projects from the Windows Azure SDK.

    After you have completed this walkthrough, you will have a Web Role that is a simple ASP.Net Web Application that shows a list of Contacts and allows you to add to and delete from that list. Each contact will have simplified information: just a name and an address (both strings).

    image

    Table Storage Concepts

    The Windows Azure Table Storage Services provides queryable structured storage. Each account can have 0..n tables.

    image

    Design of the Sample

    When a request comes in to the UI, it makes its way to the Table Storage Service as follows (click for larger size):

    clip_image002

    The UI class (the aspx page and its code-behind) is data bound through an ObjectDataSource to the SimpleTableSample_WebRole.ContactDataSource, which creates the connection to the Table Storage service, gets the list of Contacts, and inserts to and deletes from Table Storage.

    The SimpleTableSample_WebRole.ContactDataModel class acts as the data model object, and the SimpleTableSample_WebRole.ContactDataServiceContext derives from TableStorageDataServiceContext, which handles the authentication process and allows you to write LINQ queries and to insert, delete and save changes to the Table Storage service.

    Creating the Cloud Service Project

    1. Start Visual Studio as an administrator

    2. Create a new project: File –> New –> Project

    3. Select “Web Cloud Service”. This will create the Cloud Service Project and an ASP.Net Web Role. Call it “SimpleTableSample”

    image

    4. Find the installation location of the Windows Azure SDK. By default this will be: C:\Program Files\Windows Azure SDK\v1.0

    a. Find the file named “samples.zip”

    b. Unzip this to a writeable location

    5. From the samples you just unzipped, add the StorageClient\Lib\StorageClient.csproj and HelloFabric\Common\Common.csproj to your solution by right-clicking on the solution in Solution Explorer and selecting Add –> Existing Project.

    image

    a. Common and StorageClient are libraries that are currently distributed as samples and provide functionality to help you build Cloud Applications. Common adds access to settings and logging, while StorageClient provides helpers for using the storage services.

    6. From your Web Role, add references to the Common and StorageClient projects you just added, along with a reference to System.Data.Services.Client.

    image

    image

    7. Add a ContactDataModel class to your Web Role that derives from TableStorageEntity. For simplicity, we’ll just assign a new Guid as the PartitionKey to ensure uniqueness. This default of assigning the PartitionKey and setting the RowKey to a hard coded value (String.Empty) gives the storage system the freedom to distribute the data.

    using Microsoft.Samples.ServiceHosting.StorageClient;
    public class ContactDataModel : TableStorageEntity
    {
        public ContactDataModel(string partitionKey, string rowKey)
            : base(partitionKey, rowKey)
        {
        }
    
        public ContactDataModel()
            : base()
        {
            PartitionKey = Guid.NewGuid().ToString();
            RowKey = String.Empty;
        }
    
        public string Name
        {
            get;
            set;
        }
    
        public string Address
        {
            get;
            set;
        }
    }

    8. Now add the ContactDataServiceContext to the Web Role that derives from TableStorageDataServiceContext.

    a. We’ll use this later to write queries, insert, remove and save changes to the table storage.

    using Microsoft.Samples.ServiceHosting.StorageClient;

    internal class ContactDataServiceContext : TableStorageDataServiceContext
    {
        internal ContactDataServiceContext(StorageAccountInfo accountInfo)
            : base(accountInfo)
        {
        }
    
        internal const string ContactTableName = "ContactTable";
    
        public IQueryable<ContactDataModel> ContactTable
        {
            get
            {
                return this.CreateQuery<ContactDataModel>(ContactTableName);
            }
        }
    }

    9. Next add a ContactDataSource class. We'll fill this class out over the course of the next few steps.  This is the class that does all of the hookup between the UI and the Table Storage service. Starting with the first part of the constructor, a StorageAccountInfo instance is created in order to get the settings required to make a connection to the Table Storage Service. (Note that this is just the first part of the constructor code; the rest is in step 12.)

    using Microsoft.Samples.ServiceHosting.StorageClient;
    using System.Data.Services.Client;
    public class ContactDataSource
    {
        private ContactDataServiceContext _ServiceContext = null;
    
        public ContactDataSource()
        {
            // Get the settings from the Service Configuration file
            StorageAccountInfo account =
                StorageAccountInfo.GetDefaultTableStorageAccountFromConfiguration();

    10. In order for the StorageAccountInfo class to find the configuration settings, open up ServiceDefinition.csdef and add the following to <WebRole/>. These define the settings.

        <ConfigurationSettings>
          <Setting name="AccountName"/>
          <Setting name="AccountSharedKey"/>
          <Setting name="TableStorageEndpoint"/>
        </ConfigurationSettings>  
    

    11. Likewise, add the actual local development values to the ServiceConfiguration.cscfg file. Note that the settings between both files have to match exactly otherwise your Cloud Service will not run.

    • When you run in the Cloud, the AccountName and AccountSharedKey will be set to the values you get back from the Portal for your account. The TableStorageEndpoint will be set to the URL for the Table Storage Service: http://table.core.windows.net
    • Because these are set in the ServiceConfiguration.cscfg file, these values can be updated even after deploying to the cloud by uploading a new Service Configuration.
    • For the local development case, the local host and port 10002 (by default) will be used as the Table Storage Endpoint. The AccountName and AccountSharedKey are hard coded to a value that the Development Storage service is looking for (it’s the same for all 3, Table, Blob and Queue services).
        <ConfigurationSettings>
          <Setting name="AccountName" value="devstoreaccount1"/>
          <Setting name="AccountSharedKey" value="Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="/>
          <Setting name="TableStorageEndpoint" value="http://127.0.0.1:10002/"/>
        </ConfigurationSettings>

    12. Next, continue to fill out the constructor (just after the call to GetDefaultTableStorageAccountFromConfiguration()) by instantiating the ContactDataServiceContext. Set the RetryPolicy, which applies only to the methods on the DataServiceContext (i.e. SaveChanges()) and not to the query.

        // Create the service context we'll query against
        _ServiceContext = new ContactDataServiceContext(account);
        _ServiceContext.RetryPolicy = RetryPolicies.RetryN(3, TimeSpan.FromSeconds(1));
    }

    13. We need some code to ensure that the tables we rely on get created.  We'll do this on first request to the web site -- which can be done by adding code to one of the handlers in the global application class.  Add a global application class by right clicking on the web role and selecting Add -> New Item -> Global Application Class. (see this post for more information)

    image

    14. Add the following code to global.asax.cs to create the tables on first request:

    using Microsoft.Samples.ServiceHosting.StorageClient;
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;
        HttpContext context = app.Context;
    
        // Attempt to perform first request initialization
        FirstRequestInitialization.Initialize(context);
    
    }

    And the implementation of the FirstRequestInitialization class:

    internal class FirstRequestInitialization
    {
        private static bool s_InitializedAlready = false;
        private static Object s_lock = new Object();
    
    
        // Initialize only on the first request
        public static void Initialize(HttpContext context)
        {
            if (s_InitializedAlready)
            {
                return;
            }
    
            lock (s_lock)
            {
                if (s_InitializedAlready)
                {
                    return;
                }
    
                ApplicationStartUponFirstRequest(context);
                s_InitializedAlready = true;
            }
        }
    
        private static void ApplicationStartUponFirstRequest(HttpContext context)
        {
            // This is where you put initialization logic for the site.
            // RoleManager is properly initialized at this point.
    
            // Create the tables on first request initialization as there is a performance impact
            // if you call CreateTablesFromModel() when the tables already exist. This limits the exposure of
            // creating tables multiple times.
    
            // Get the settings from the Service Configuration file
            StorageAccountInfo account = StorageAccountInfo.GetDefaultTableStorageAccountFromConfiguration();
    
            // Create the tables
            // In this case, just a single table.  
            // This will create tables for all public properties that are IQueryable (collections)
            TableStorage.CreateTablesFromModel(typeof(ContactDataServiceContext), account);
        }
    }

    15. When running in the real cloud, this code is all that is needed to create the tables for your Cloud Service. The TableStorage class reflects over the ContactDataServiceContext class and creates a table for each IQueryable<T> property, where the columns of that table are based on the properties of the type T of the IQueryable<T>.

    a. There is a bit more to do in order to get this to work in the local Development Storage case, more on that later.

    16. At this point, it’s just a matter of filling out the ContactDataSource class with methods to query for the data, insert and delete rows. This is done through LINQ and using the ContactDataServiceContext.

    a. Note: in the Select() method, the TableStorageDataServiceQuery<T> class enables you to have finer grained control over how you get the data.

    i. Execute() or ExecuteWithRetries() will access the data store and return up to the first 1000 elements.

    ii. ExecuteAll() or ExecuteAllWithRetries() will return all of the elements with continuation as you enumerate over the data.

    iii. ExecuteWithRetries() and ExecuteAllWithRetries() use the retry policy set on the ContactDataServiceContext for the queries (see the sketch after the code below).

    b. Note: the use of AttachTo() in the Delete() method to connect to and remove the row.

    public IEnumerable<ContactDataModel> Select()
    {
        var results = from c in _ServiceContext.ContactTable
                      select c;
    
        TableStorageDataServiceQuery<ContactDataModel> query =
            new TableStorageDataServiceQuery<ContactDataModel>(results as DataServiceQuery<ContactDataModel>);
        IEnumerable<ContactDataModel> queryResults = query.ExecuteAllWithRetries();
        return queryResults;
    }

    public void Delete(ContactDataModel itemToDelete)
    {
        _ServiceContext.AttachTo(ContactDataServiceContext.ContactTableName, itemToDelete, "*");
        _ServiceContext.DeleteObject(itemToDelete);
        _ServiceContext.SaveChanges();
    }

    public void Insert(ContactDataModel newItem)
    {
        _ServiceContext.AddObject(ContactDataServiceContext.ContactTableName, newItem);
        _ServiceContext.SaveChanges();
    }
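
    As an aside -- and this is just a sketch, not part of the attached sample -- if you only wanted the first batch of up to 1000 entities rather than walking the continuation tokens, Select() could use ExecuteWithRetries() instead of ExecuteAllWithRetries():

    // Sketch: returns at most the first 1000 entities and still honors the retry
    // policy set on the ContactDataServiceContext.
    var results = from c in _ServiceContext.ContactTable
                  select c;

    TableStorageDataServiceQuery<ContactDataModel> query =
        new TableStorageDataServiceQuery<ContactDataModel>(results as DataServiceQuery<ContactDataModel>);
    IEnumerable<ContactDataModel> firstBatch = query.ExecuteWithRetries();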

    17. The UI is defined in the aspx page and consists of 3 parts: the GridView, which displays all of the rows of data; the FormView, which allows the user to add rows; and the ObjectDataSource, which databinds the UI to the ContactDataSource.

    18. The GridView is placed after the first <div>. Note that in this sample, we'll just auto-generate the columns and show the delete button. The DataSourceID is set to the ObjectDataSource, which is covered below.

        <asp:GridView
            id="contactsView"
            DataSourceId="contactData"
            DataKeyNames="PartitionKey"
            AllowPaging="False"
            AutoGenerateColumns="True"
            GridLines="Vertical"
            Runat="server" 
            BackColor="White" ForeColor="Black"
            BorderColor="#DEDFDE" BorderStyle="None" BorderWidth="1px" CellPadding="4">
            <Columns>
                <asp:CommandField ShowDeleteButton="true"  />
            </Columns>
            <RowStyle BackColor="#F7F7DE" />
            <FooterStyle BackColor="#CCCC99" />
            <PagerStyle BackColor="#F7F7DE" ForeColor="Black" HorizontalAlign="Right" />
            <SelectedRowStyle BackColor="#CE5D5A" Font-Bold="True" ForeColor="White" />
            <HeaderStyle BackColor="#6B696B" Font-Bold="True" ForeColor="White" />
            <AlternatingRowStyle BackColor="White" />
        </asp:GridView>    

    19. The FormView to add rows is really simple: just labels and text boxes with a button at the end to raise the “Insert” command. Note that the DataSourceID is again set to the ObjectDataSource and there are bindings to the Name and Address.

        <br />        
        <asp:FormView
            id="frmAdd"
            DataSourceId="contactData"
            DefaultMode="Insert"
            Runat="server">
            <InsertItemTemplate>
                <asp:Label
                        id="nameLabel"
                        Text="Name:"
                        AssociatedControlID="nameBox"
                        Runat="server" />
                <asp:TextBox
                        id="nameBox"
                        Text='<%# Bind("Name") %>'
                        Runat="server" />
                <br />
                <asp:Label
                        id="addressLabel"
                        Text="Address:"
                        AssociatedControlID="addressBox"
                        Runat="server" />
                <asp:TextBox
                        id="addressBox"
                        Text='<%# Bind("Address") %>'
                        Runat="server" />
                <br />
                <asp:Button
                        id="insertButton"
                        Text="Add"
                        CommandName="Insert"
                        Runat="server"/>
            </InsertItemTemplate>
        </asp:FormView>
    

    20. The final part of the aspx is the definition of the ObjectDataSource. See how it ties the ContactDataSource and the ContactDataModel together with the GridView and FormView.

        <%-- Data Sources --%>
        <asp:ObjectDataSource runat="server" ID="contactData"
            TypeName="SimpleTableSample_WebRole.ContactDataSource"
            DataObjectTypeName="SimpleTableSample_WebRole.ContactDataModel"
            SelectMethod="Select" DeleteMethod="Delete" InsertMethod="Insert">
        </asp:ObjectDataSource>

    21. Build. You should not have any compilation errors; all 4 projects in the solution should build successfully.

    22. Create Test Storage Tables. As mentioned in step 15, creating tables in the Cloud is all done programmatically, however there is an additional step that is needed in the local Development Storage case.

    a. In the local development case, tables need to be created in the SQL Express database that the local Development Storage uses for its storage. These need to correspond exactly to the runtime code. This is due to a current limitation in the local Development Storage.

    b. Right click on the Cloud Service node in Solution Explorer and select “Create Test Storage Tables”. This runs a tool that uses reflection to create the tables you need in a SQL Express database whose name corresponds to your Solution name.

    image

    i. ContactDataServiceContext is the type that the tool will look for and use to create those tables on your behalf.

    ii. Each IQueryable<T> property on ContactDataServiceContext will have a table created for it where the columns in that table will correspond to the public properties of the type T of the IQueryable<T>.

    23. F5 to debug. You will see the app running in the Development Fabric using the Table Development Storage.

    image

    Please see Deploying a Cloud Service to learn how to modify the configuration of this Cloud Service to make it run on Windows Azure.

  • Cloudy in Seattle

    Submitting a Bug against the Windows Azure Tools and SDK

    • 2 Comments

    If you have questions you want to have answered, or want to participate in community discussions, the Windows Azure Forum is a great place to go. 

    If you've found a bug or have a suggestion, submitting a bug/suggestion directly to our product team is a great way to help us identify the issues that exist and to help us build a better product. 

    Reporting a bug also allows you to track the progress of the issue, which is harder to do if we open a bug report for you in our internal systems. Also, customer reported issues get higher priority, so it's more likely to be fixed. 

    To submit a bug against the Windows Azure Tools and SDK, please go to the Microsoft Visual Studio Connect site:

    https://connect.microsoft.com/VisualStudio

    Sign in with your Live ID and click on "Submit a bug" or "Submit a Suggestion".

    You will then be taken to the following page:

    image

    There, in the Version combobox, you can select either "Windows Azure SDK" or "Windows Azure Tools for Visual Studio".

    image

    Oftentimes bugs are hard to reproduce, which makes them hard for us to address. If you can let us know the exact steps for reproducing the problem from scratch, that makes it easier for us to track down and fix the bug.

    The behavior you expected, as well as stack traces, error messages and any other diagnostic information, really helps as well.

    You can also attach a solution or project that reproduces the problem. If at all possible, please narrow down the code and the solution to the simplest possible form that reproduces the problem. Please remove any intellectual property or personally identifiable information from your samples.

  • Cloudy in Seattle

    MSDN Articles on Windows Azure Tools

    • 0 Comments

    I've posted a number of technical articles on MSDN that will help you to get acquainted with the Windows Azure Tools for Microsoft Visual Studio.  Have a look:

    Introduction

    Deploying a Service (Interesting for folks who don't have a token yet but want to see the experience)

    Creating an HTTPS endpoint

    Help

     

  • Cloudy in Seattle

    Cloud Service Project Templates vs Role Project Templates

    • 5 Comments

    If you create a new project in Visual Studio but your solution doesn't contain a Cloud Service project (ccproj), you won't get the Visual Studio integration for building Windows Azure services that you would expect.  For example, you won't get debug/run in the Development Fabric, you won't get integration with the Development Storage service, and you won't get publishing support.

    The reason is that you have created a Role project, instead of a Cloud Service Project. Today, you can have 0 or 1 Web Roles (which is an ASP.Net Web Application) and 0 or 1 Worker Roles (which is a UI-less .Net application) in each service.

    When you create a new Cloud Service project, make sure you are choosing a template from the "Cloud Service" node and not the Roles node in the New Project dialog:

    image

    The question would then be, what is the Roles node in the New Project dialog used for?

    It's used when you want to:

    1) Add a role to an existing Cloud Service project.  For example, the following Cloud Service project contains a web role.  You can use the roles node to add a Worker Role project to that Cloud Service.

    image

    This will bring up the new project dialog with the Roles node selected.  Out of the box, we only support C# and VB ASP.Net Web Application Web Roles and C# and VB Worker Roles.

    image

    Likewise, if the Cloud Service contains a Worker Role and not a Web Role, you can add a Web Role to that project in the same way.

    2) Replace an existing Role with a new Role Project.  For example, you have a Cloud Service that contains a Web Role, and you want to replace it with a new Web Role.  You can right click on the Web Role association and select to associate it with a new Web Role:

    image

    This will bring up the new project dialog with the Roles node selected:

    image 

    The final thing you can do is associate a role with an existing Role project by adding that Role project to the solution, right clicking on either the Web or Worker role association node in Solution Explorer, and selecting to associate that role with a "Role Project in solution..."

  • Cloudy in Seattle

    Using the Windows Azure SDK Samples in Visual Studio

    • 2 Comments

    The Windows Azure SDK contains a set of samples that will help you get started with and to learn about Windows Azure.  In this post, I want to cover some potential issues that some of you may run into. 

    Potential issue #1: You try to load one of the Windows Azure SDK Samples, found as a zip file in the Windows Azure SDK install folder (default: C:\Program Files\Windows Azure SDK\v1.0), by opening the sln (for example HelloFabric.sln), and you receive the following error:

    image

    ccproj cannot be opened because its project type (.ccproj) is not supported by this version of the application.  To open it, Please use a version that supports this type of project.

    Solution: You've installed the Windows Azure SDK but have not installed the Windows Azure Tools for Microsoft Visual Studio.  Install the tools to get Visual Studio support for building Windows Azure services.

    Potential Issue #2:

    You click File->New->Project or "Create Project" from the start page and you don't see the option to create Cloud Service projects:

    i.e. you see this:

    image

    Instead of this:

    image

    Solution: As with potential issue #1, you've installed the Windows Azure SDK but have not installed the Windows Azure Tools for Microsoft Visual Studio.  Install the tools to get Visual Studio support for building Windows Azure services.

    Potential Issue #3:

    You try to open one of the samples in the Windows Azure SDK by opening up the sln in Visual Studio and receive one or both of the following security warnings:

    image

    image 

    Solution: You are opening the files from an untrusted source location (such as the Program Files directory or a file share).  Copy the samples to a writeable, full trust location.

    Potential Issue #4:

    You hit "F5" to debug and you see the ASP.Net Development Server instead of the Development Fabric:

    image

    Or, you click "Publish" and you see the following dialog instead of packaging and navigating to the Azure Services Developer Portal:

    image

    Solution: You need to ensure that the Cloud Service project is the startup project.  In Solution Explorer, right click the project that has the globe and 3 blocks and select "Set as Startup Project". 

    In Visual Web Developer Express Edition, you simply need to click on that project in order to make it bold.

    image

    The bolded project in Solution Explorer is the startup project.

    image

  • Cloudy in Seattle

    Now: Cloud Computing

    • 1 Comments

    As some of you may have noticed, this blog has gone dark for a while.  During that time it got an interesting face lift and a new title "Cloudy in Seattle".

    As it turns out, I switched teams and am now a proud member of the (Cloud) Computing Tools team... but that also meant I couldn't blog about the cool things I've been working on. 

    With the announcement in today's keynote at the PDC, I'm happy to say I can now start blogging about it: the Azure Services Platform and, in particular, Windows Azure.

    When you distill it out, Windows Azure is all about hosting your web applications, storage, management and development.  That is, you can host, scale and manage web applications in Microsoft data centers.

    At the same time, it incorporates some truly cool features like the Fabric controller that manages resources, load balances across your running instances (which incidentally, you can configure on the fly and as easily as editing or uploading a new configuration file) and manages upgrades and failures to maintain availability.

    My part in all this?  You guessed it, I'm working on developer tools -- that's my passion.  My product: Windows Azure Tools for Microsoft Visual Studio.  An add-in for Visual Studio 2008 that brings the developer experience you would expect for building Cloud Applications to Visual Studio.

    The tools make it easy to get started, configure, edit, build, package, debug and publish for deployment.

    I'm really jazzed about what's ahead, and I'm jazzed to be able to start sharing.  Stay tuned.
