November, 2008

  • Cloudy in Seattle

    ASP.Net MVC on Windows Azure with Providers

    • 19 Comments

    [For more recent information on using ASP.NET MVC with Windows Azure please see this post.]

    Before you get started with ASP.Net MVC and Windows Azure – please install this hotfix.

    Were you wondering why the sample project I got from Phil attached to my post, ASP.Net MVC on Windows Azure, didn't include the sample Windows Azure ASP.Net providers?

    The answer is that we wanted to get something out early that would unblock folks from trying out the MVC on Windows Azure scenario.  That sample solution accomplished that.

    We were also working out a problem we were seeing when using those providers with MVC.  The problem was that we would get a timeout after logging in or creating a new account when running on Windows Azure -- a problem you won't see when running locally against the Development Fabric.

    Luckily, Phil pointed me to a workaround, which I've now verified solves this problem (the fix is now in MVC).

    I tend to get pinged from time to time about issues with RequestURL, so I'll give a pointer to a post on David's blog that talks about that.

    This post will cover augmenting the existing "MVC on Windows Azure sample" with membership, role and session state providers as well as the workaround that ensures the sample works when you run on Windows Azure.

    It is again important to note that using ASP.Net MVC on Windows Azure is still a preview.

    The sample code that goes along with this post is on the MSDN Code Gallery: MVCCloudService. It does not include the sample projects from the Windows Azure SDK, so please follow the instructions below for adding and referencing them. (When you first load the project, it will complain that the sample projects are missing.)

    Starting with the sample project attached to my last post, the first thing to do is to add the AspProviders and the StorageClient projects found in the Windows Azure SDK samples to the solution.

    These sample projects are found in the SDK install folder (C:\Program Files\Windows Azure SDK\v1.0 by default), where you'll see a zip file (samples.zip).  Copy this file to a writeable (i.e. not in Program Files) location and unzip it.

    Right click on the Solution node in Solution Explorer -> Add -> Existing Project:

    image

    Navigate to the directory where you unzipped the Windows Azure SDK samples, AspProviders\Lib and select AspProviders.csproj:

    image

    Do the same to add the StorageClient.csproj project from StorageClient\Lib:

    image

    Then add project references to both of those projects by right-clicking on the References node in the CloudService1_WebRole project, selecting "Add Reference...", choosing the Projects tab, selecting both the AspProviders and StorageClient projects and hitting "OK".

    image

    Let's now add the settings and values needed to access the Windows Azure Storage Services by adding the following entries to the Service Definition and Service Configuration files found in the Cloud Service project.

    The settings below are setup for the local Development Storage case.

    Note: When switching over to the *.core.windows.net endpoints, you'll have to use the https addresses (i.e. https://blob.core.windows.net) otherwise the providers will throw an exception.  Alternatively you could set allowInsecureRemoteEndpoints to true -- however that is not recommended.
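    For example, when switched over to the hosted endpoints, the values in ServiceConfiguration.cscfg might look like this (a sketch; the account name and key below are placeholders for the values shown in the portal for your Storage Account):

    <Setting name="AccountName" value="youraccountname"/>
    <Setting name="AccountSharedKey" value="your primary access key"/>
    <Setting name="BlobStorageEndpoint" value="https://blob.core.windows.net"/>
    <Setting name="QueueStorageEndpoint" value="https://queue.core.windows.net"/>
    <Setting name="TableStorageEndpoint" value="https://table.core.windows.net"/>
    <Setting name="allowInsecureRemoteEndpoints" value="false"/>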

    The definitions in ServiceDefinition.csdef:

    <ConfigurationSettings>
        <Setting name="AccountName"/>
        <Setting name="AccountSharedKey"/>
        <Setting name="BlobStorageEndpoint"/>
        <Setting name="QueueStorageEndpoint"/>
        <Setting name="TableStorageEndpoint"/>
        <Setting name="allowInsecureRemoteEndpoints"/>
    </ConfigurationSettings>

    The Values in ServiceConfiguration.cscfg:

    <Setting name="AccountName" value="devstoreaccount1"/>
    <Setting name="AccountSharedKey" value="Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="/>
    <Setting name="BlobStorageEndpoint" value="http://127.0.0.1:10000"/>
    <Setting name="QueueStorageEndpoint" value = "http://127.0.0.1:10001"/>
    <Setting name="TableStorageEndpoint" value="http://127.0.0.1:10002"/>
    <Setting name="allowInsecureRemoteEndpoints" value=""/>

    The providers also have settings to specify the name of the table for storing membership, role and session related data.  Note that as the comment indicates, the values below are the only values that will work in the Development Storage case. 

    In the MVCWebrole web.config, change <appSettings/> to the following:

    <appSettings>
        <!-- provider configuration -->
        <!-- When using the local development table storage service only the default values given
         below will work for the tables (Membership, Roles and Sessions) since these are the names
         of the properties on the DataServiceContext class -->
        <add key = "DefaultMembershipTableName" value="Membership"/>
        <add key = "DefaultRoleTableName" value="Roles"/>
        <add key = "DefaultSessionTableName" value="Sessions"/>
        <add key = "DefaultProviderApplicationName" value="MvcCloudService"/>
        <add key = "DefaultSessionContainerName"/>
    </appSettings>

    The following connection string for SQL can be removed:

    <add name="ApplicationServices" connectionString="data source=.\SQLEXPRESS;Integrated Security=SSPI;AttachDBFilename=|DataDirectory|aspnetdb.mdf;User Instance=true" providerName="System.Data.SqlClient"/>
    

    We'll now add the providers via the MVCWebRole web.config file (remove the existing SQL Server ones).  First, the membership provider:

        <membership defaultProvider="TableStorageMembershipProvider" userIsOnlineTimeWindow = "20">
          <providers>
            <clear/>
    
            <add name="TableStorageMembershipProvider"
                 type="Microsoft.Samples.ServiceHosting.AspProviders.TableStorageMembershipProvider"
                 description="Membership provider using table storage"
                 applicationName="MvcCloudService"
                 enablePasswordRetrieval="false"
                 enablePasswordReset="true"
                 requiresQuestionAndAnswer="false"
                 minRequiredPasswordLength="1"
                 minRequiredNonalphanumericCharacters="0"
                 requiresUniqueEmail="true"
                 passwordFormat="Hashed"
                    />
    
          </providers>
        </membership>
    

    Role Manager provider:

        <roleManager enabled="true" defaultProvider="TableStorageRoleProvider" cacheRolesInCookie="true" cookieName=".ASPXROLES" cookieTimeout="30"
                     cookiePath="/" cookieRequireSSL="false" cookieSlidingExpiration = "true"
                     cookieProtection="All" >
          <providers>
            <clear/>
            <add name="TableStorageRoleProvider"
                 type="Microsoft.Samples.ServiceHosting.AspProviders.TableStorageRoleProvider"
                 description="Role provider using table storage"
                 applicationName="MvcCloudService"
                    />
          </providers>
        </roleManager>
    

    and the session state provider:

        <sessionState mode="Custom" customProvider="TableStorageSessionStateProvider">
          <providers>
            <clear />
            <add name="TableStorageSessionStateProvider"
                 type="Microsoft.Samples.ServiceHosting.AspProviders.TableStorageSessionStateProvider"
                 applicationName="MvcCloudService"
                 />
          </providers>
        </sessionState>
    

    You can consult the AspProvidersDemo project in the Windows Azure SDK samples as well. 

    Note: there is also a profile provider which would be added in the same manner.
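    For reference, a profile provider registration would look much like the others (a sketch, assuming the TableStorageProfileProvider class from the AspProviders sample):

        <profile defaultProvider="TableStorageProfileProvider">
          <providers>
            <clear/>
            <add name="TableStorageProfileProvider"
                 type="Microsoft.Samples.ServiceHosting.AspProviders.TableStorageProfileProvider"
                 description="Profile provider using table storage"
                 applicationName="MvcCloudService"/>
          </providers>
        </profile>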

    Before running, right click on the Cloud Service project node in Solution Explorer and select "Create Test Storage Tables".  (For more information on creating test storage tables, see this article)

    image

    And there you have it – ASP.Net MVC RC2 running on Windows Azure.

  • Cloudy in Seattle

    Silverlight MediaElement Playing a Video stored in Windows Azure Blob Storage

    • 15 Comments

    There are two things that I want to show in this post:

    1. That you can use Silverlight in a Windows Azure Cloud Service
    2. That you can stream a movie progressively via http (more about that here) from Windows Azure Blob storage

    The code is attached to this blog post.  Note that you will have to add and reference the Common and StorageClient projects that come as samples in the Windows Azure SDK.

    What I did is start with my Simple Blob Storage Walkthrough and adapt it to be a Video Store instead.

    image

    The top part of the app is essentially the same as in the Simple Blob Storage Walkthrough, except that another column was added to the GridView to provide play buttons.

    When you hit play, the Silverlight control sitting underneath the form part of the page uses a MediaElement to play the video.  The video is played directly from its location in blob storage.

    Adding Silverlight

    Starting where the Simple Blob Storage Walkthrough left off, let's add Silverlight to that project.  (This assumes that you have the Silverlight Tools installed; see here for more information.)

    Note that you may have to configure the MIME type in IIS for Silverlight -- you'll know if you get a "Could not download Silverlight application" error:

    Registering MIME type in development fabric

    To ensure solutions containing Silverlight clients work correctly in the development fabric, please ensure that the MIME type for the Silverlight .xap extension is configured correctly in IIS.

    1.     Open Internet Information Services (IIS) Configuration Manager and select the server to manage (usually the top-level item on the left side)

    2.     Double-click MIME Types on the right side

    3.     Add an entry to map the .xap extension to the application/x-silverlight-app MIME type
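    On IIS 7, steps 2 and 3 can also be done from an elevated command prompt (a sketch; the path assumes a default IIS install):

```shell
%windir%\system32\inetsrv\appcmd.exe set config /section:staticContent /+"[fileExtension='.xap',mimeType='application/x-silverlight-app']"
```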

    Right click on the solution node in the Solution Explorer and select Add -> New Project.  Select "Silverlight Application":

    image

    Name the project VideoPlayer.

    Select to "Link this Silverlight control into an existing Web site" and make sure that your Web Role is selected.  For this walkthrough I chose not to add a test page as I want to add the Silverlight control onto the existing page.

    image

    Solution Explorer will now contain a Silverlight project and will look like this:

    image

    Open up Default.aspx and at the bottom of the page, just after the statusMessage Label you had from the previous walkthrough, add the following snippet to add the Silverlight control onto the page:

            <asp:ScriptManager ID="ScriptManager1" runat="server">
            </asp:ScriptManager>
            <div style="height: 100%;">
                <asp:Silverlight ID="Xaml1" runat="server" Source="~/ClientBin/VideoPlayer.xap" MinimumVersion="2.0.30523"
                    Width="100%" Height="100%" />
            </div>

    You also need to register the Silverlight assembly; otherwise the asp:Silverlight tag will come up as unknown.  You can do this at the top of Default.aspx, right under the "<%@ Page ..." tag:

    <%@ Page Language="C#" AutoEventWireup="true" CodeBehind="Default.aspx.cs" Inherits="DownloadSite_WebRole._Default" %>
    <%@ Register Assembly="System.Web.Silverlight" Namespace="System.Web.UI.SilverlightControls"
        TagPrefix="asp" %>

    You may also have to add a reference to System.Web.Silverlight from your Web Role (it can be found on the .Net tab of the Add Reference dialog).

    Open up Page.xaml and change the Background property of the Grid to "Green" so that we can see the control on the aspx page.  Hit "F5" to debug.

    You should get the following:

    image

    Cool!  We have Silverlight!

    Coding up the Silverlight Page

    First we need a MediaElement to play the video.  Open up Page.xaml and add the MediaElement tag to the Grid as follows.  While we're at it, let's set the Width and Height of the Page to be 640 x 480 to make the video a little bigger and remove the Background attribute. 

    Name the MediaElement "mediaPlayer".

    <UserControl x:Class="VideoPlayer.Page"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" 
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" 
        Width="640" Height="480">
        <Grid x:Name="LayoutRoot">
            <MediaElement x:Name="mediaPlayer"/>        
        </Grid>
    </UserControl>

    Note: A couple of issues I've been seeing (with the PDC October 2008 CTP) while developing Silverlight applications running on a Windows Azure Web Role: debugging doesn't seem to be working, and the .xap in the Web Role isn't always updated as expected, resulting in a stale Silverlight UI on F5.  For the second problem, I found that right clicking on the Silverlight project and hitting "Rebuild", then right clicking on the Web Role and hitting "Rebuild" before hitting F5, resolved the problem.  We're actively investigating both issues.

    We'll just keep things really simple.  When the play button is clicked, we'll use Javascript to call a method on the Silverlight control passing in the URI that contains the video to play.

    Let's add a scriptable method to the Page class in Page.xaml.cs:

    using System.Windows.Browser;
    public partial class Page : UserControl
    {
        {. . .}
        [ScriptableMemberAttribute]
        public void Play(string fileUriString)
        {
            if (!string.IsNullOrEmpty(fileUriString))
            {
                Uri fileUri = new Uri(fileUriString);
                mediaPlayer.Source = fileUri;
            }
        }
    }

    If you want to know more about the ScriptableMemberAttribute and how all this works, please see the MSDN article here.

    The second part of making the control scriptable is to register the scriptable object.  That is done in App.xaml.cs, in Application_Startup():

    using System.Windows.Browser;
    private void Application_Startup(object sender, StartupEventArgs e)
    {
        Page p = new Page();
        HtmlPage.RegisterScriptableObject("VideoPlayer", p);
    
        this.RootVisual = p;
    }

    Playing the Video

    In order to play the video, we need to add a "play" button to each row of the GridView.

    Open up Default.aspx and, in the Columns for the GridView, add a column at the end for the Play button.  We'll use the <input/> tag for this as we don't want a postback to occur when we click the button (that would re-initialize the Silverlight control):

    <asp:TemplateField>
        <ItemTemplate>
            <input type="button" value="Play"/>
        </ItemTemplate>
    </asp:TemplateField>

    To that button, let's hook up an event handler for when it gets clicked.  I did this programmatically in the RowDataBound event of the fileView GridView control. 

    The reason is that I wanted an easy way to pass the URI for the video to the Silverlight control.  In the RowDataBound event handler, I can get at the URI and pass it along as a parameter. 

    The event handler will be run on the client side in Javascript.

    In Default.aspx.cs in Page_Load(), register to handle the event:

    fileView.RowDataBound += new GridViewRowEventHandler(fileView_RowDataBound);

    Then handle the event by adding the event handler for each of the <input/> buttons that will pass the URI as a parameter.

    void fileView_RowDataBound(object sender, GridViewRowEventArgs e)
    {
        if (e.Row.RowType == DataControlRowType.DataRow)
        {
            FileEntry fe = (FileEntry)e.Row.DataItem;
            e.Row.Cells[3].Attributes.Add("onclick", "onGridViewRowSelected('" + fe.FileUri + "')");
        }
    }

    Finally, back in Default.aspx, we add the Javascript event handler that calls the scriptable method on our Silverlight control:

    <script language="javascript" type="text/javascript">
        function onGridViewRowSelected(fileUri) {
            var control = document.getElementById("Xaml1");
            control.content.VideoPlayer.Play(fileUri);
        }
    </script>

    Hit F5 to give it a try on the Development Fabric/Storage.  Upload a video sample (say from C:\Users\Public\Videos\Sample Videos) and then hit play.

    One final thing to remember is the upload file size limit on the FileUpload control, which is discussed in the Simple Blob Storage Walkthrough.  You're more likely to run into it here with video files.

    image

  • Cloudy in Seattle

    Using the CloudDrive Sample to Access Windows Azure Logs

    • 9 Comments

    On Windows Azure, you can output trace messages when your Roles are running "in the cloud". 

    You write the messages by calling the RoleManager.WriteToLog() API in Microsoft.ServiceHosting.ServiceRuntime. 

    This post will cover how to:

    1. Copy the logs for your service running on Windows Azure to a blob storage container using the Azure Services Developer Portal
    2. Build and install the CloudDrive sample from the Windows Azure SDK
    3. Use the CloudDrive sample to access your Blob Storage Account
    4. Use CloudDrive to copy your logs to a local directory where you can open them

    This post assumes that you have a service running on Windows Azure that makes use of the RoleManager.WriteToLog() API.  If needed, please refer to the Quick Lap around the Windows Azure Tools and Deploying a Service on Windows Azure walkthroughs.
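    For reference, a minimal worker role that writes to the log might look like this (a sketch; the log name passed to WriteToLog is one of the levels the runtime supports, such as "Information"):

```csharp
using System.Threading;
using Microsoft.ServiceHosting.ServiceRuntime;

public class WorkerRole : RoleEntryPoint
{
    public override void Start()
    {
        // "Information" is the log to write to; other log names include
        // "Critical", "Error", "Warning" and "Verbose".
        RoleManager.WriteToLog("Information", "Worker role started.");

        while (true)
        {
            Thread.Sleep(10000);
        }
    }

    public override RoleStatus GetHealthStatus()
    {
        return RoleStatus.Healthy;
    }
}
```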

    You also need to install Windows PowerShell.

    On the project page for your Hosted Service:

    image

    Click on the "Configure..." button.  You will be directed to a page that allows you to choose which Storage Account to use (note: this is the friendly name for the Storage Account, not the account name; this is important later) and to specify the container name in Blob Storage where you want the logs to be copied.

    Note that the container name has the same restrictions as DNS names.

    image

    After you click "Copy Logs", you'll get the following message.

    image

    So how do you get the logs from blob storage?  The easiest way is to use the CloudDrive sample in the Windows Azure SDK.

    In the SDK install folder (C:\Program Files\Windows Azure SDK\v1.0 by default), you'll see a zip file (samples.zip).  Copy this file to a writeable (i.e. not in Program Files) location and unzip it. 

    A useful document on using CloudDrive can be found by opening:

    C:\{. . .}\samples\CloudDrive\CloudDriveReadme.mht

    Follow the steps to build and register CloudDrive as a PowerShell provider:

    Using CloudDrive requires registering it as a PowerShell provider, which puts the appropriate entries into the registry for PowerShell to locate the .dll.

    1. Open an elevated Windows Azure SDK command prompt by right clicking on Start | Programs | Windows Azure SDK (October 2008 CTP) | Windows Azure SDK Command Prompt
    2. Go to the CloudDrive sample folder
    3. Build the CloudDrive using the provided “buildme.cmd” script.
    4. Install/Run CloudDrive using the provided “runme.cmd” from within an elevated command prompt.

    After doing those steps, you can do the following:

    1. cd Blob:
    2. dir

    This will list your blob containers in Development Storage.  Since I've been using the local Blob Storage, you can see that I do in fact get a list of blob containers:

    image

    That's useful but what I want to do is change this sample so that I can read from the Storage Account where my logs have been saved.

    In the C:\{. . .}\samples\CloudDrive\Scripts directory, you'll find a file called MountDrive.ps1. 

    Create your own copy of this file, and modify the account, key, ServiceUrl and DriveName to match the values you got when creating your Storage Account on Windows Azure through the Azure Services Developer Portal. 

    For example, for the storage account I created with service name of "mvcproviderstorage":

    image

    Account: mvcproviderstorage
    Key: the Primary Access Key from the portal
    ServiceUrl: http://blob.core.windows.net
    DriveName: MyStorage (choose what you like)

    The modified file looks like this:

    function MountDrive {
        Param (
            $Account = "<insert storage service name>",
            $Key = "<insert primary key>",
            $ServiceUrl = "http://blob.core.windows.net",
            $DriveName = "<insert drive name of your choosing>",
            $ProviderName = "BlobDrive")

        # PowerShell snap-in setup
        add-pssnapin CloudDriveSnapin -ErrorAction SilentlyContinue

        # Create the credentials
        $password = ConvertTo-SecureString -AsPlainText -Force $Key
        $cred = New-Object -TypeName Management.Automation.PSCredential -ArgumentList $Account, $password

        # Mount storage service as a drive
        new-psdrive -psprovider $ProviderName -root $ServiceUrl -name $DriveName -cred $cred -scope global
    }

    MountDrive

    Note that you could either pass in the new parameters at the bottom or change the default values and get rid of the parameters in the call to MountDrive.  I chose the latter although you may choose the former so that you can mount more than one drive with the same script.

    Open up Windows PowerShell and do the following:

    1. Run the version of MountDrive.ps1 you created
    2. "cd MyStorage:" (or whatever you called your DriveName followed by a colon)
    3. dir

    You will now see the container that was created by the Azure Services Developer Portal when you chose to "copy logs".

    "cd" to that container and you will see that you will have a subdirectory named WebRole if your service contains a Web Role and a subdirectory named WorkerRole if your service contains a Worker Role.

    Within the WebRole or WorkerRole directories, you will see subdirectories for each one of the role instances.  For example: WebRole_IN_0 and WebRole_IN_1.  The log files will be contained inside those directories split up by 15 minute chunks.

    To copy a log file, do the following (make sure you can write to the destination folder): 

    copy-cd Events_UTC_xyz.xml c:\file.log

    To copy a directory, do the following (note the trailing '\'s -- CloudDrive is stricter than normal PowerShell in requiring the trailing slash for directories, since files and directories can have the same name):

    copy-cd WebRole\ c:\WebRole\

    You can now open and examine your log files.  (Tip: Internet Explorer shows the logs formatted nicely)

  • Cloudy in Seattle

    How to Diagnose and Fix Windows Azure Development Storage Service Issues

    • 2 Comments

    On this thread in the forum Frank Siegemund posted some great guidelines on how to diagnose and resolve issues related to using the development storage services on Windows Azure.

    I had a quick chat with Frank and asked him if I could repost that info in blog format.  Here it is, very useful:

    Let me provide some general guidelines on how to diagnose and resolve issues when using the development storage services. 

    The assumption is that you are using the StorageClient sample library from the Windows Azure SDK for accessing the storage services.

    (a) If your first access to any storage service fails, it could be due to any of the following explanations:
        (1) the service is not started (local development storage scenario)
        (2) you did not configure account information or storage endpoints correctly in the configuration files for your service
        (3) there is an error reading the account information

    (b) The easiest thing to check is issue (a1). 
        (1) Make sure the development storage tool is running (icon in the status bar); if not, you can start it from its entry in the Start menu
        (2) Make sure all services are running; right-click the development storage item and bring up the UI; the status for all services must be "running"
        (3) If you use table storage locally, make sure that you selected the right database; again, in the Development Storage UI you can see which database is selected in the Name column of the table storage row. The development storage has certain restrictions with respect to table names; follow the instructions in the Windows Azure documentation.
        (4) To make sure everything is in an initial state, you can reset all services in development storage using the UI
        (5) In rare cases you might have to clean up some port reservations by running: netsh http delete urlacl url=http://+:10002/

    (c) Checking issue (a2) requires you to look at the configuration files. There are two kinds of configuration files: The csdef/cscfg files for your service and the application configuration files (app.config, Web.config).

    When you use StorageClient, the StorageAccountInfo class will look into all of these files. In most cases you want to use the standard configuration strings (which makes it easier to access the configuration from StorageClient). The standard configuration strings are as follows:

    DefaultQueueStorageEndpointConfigurationString = "QueueStorageEndpoint"; 
    DefaultBlobStorageEndpointConfigurationString = "BlobStorageEndpoint"; 
    DefaultTableStorageEndpointConfigurationString = "TableStorageEndpoint"; 
    DefaultAccountNameConfigurationString = "AccountName"; 
    DefaultAccountSharedKeyConfigurationString = "AccountSharedKey";

    Be aware that when you configure endpoints with StorageClient in your configuration file, the account name is not part of the endpoint, but is specified separately!

    Here are two samples of a configuration in a cscfg file:

    <ConfigurationSettings>
      <Setting name="TableStorageEndpoint" value="http://127.0.0.1:10002/"/>
      <Setting name="BlobStorageEndpoint" value="http://127.0.0.1:10000/"/>
      <Setting name="AccountSharedKey" value="FjUfNl1…HHOlA=="/>
      <Setting name="AccountName" value="devstoreaccount1"/>
    </ConfigurationSettings>

    and

    <ConfigurationSettings>
      <Setting name="TableStorageEndpoint" value="http://table.core.windows.net/"/>
      <Setting name="BlobStorageEndpoint" value="http://blob.core.windows.net/"/>
      <Setting name="AccountSharedKey" value="FjUfNl1KiJ…OHHOlA=="/>
      <Setting name="AccountName" value="myaccountname"/>
    </ConfigurationSettings>

    Again: note that the account name is not specified in the endpoints.

    (d) Dealing with issue (a3) requires you to look at your code. In StorageClient, the class to use to read in configuration settings for accessing storage services is the StorageAccountInfo class. It has static methods that access the standard configuration strings (see above; this is used most often) or other configuration strings that you can specify explicitly.

    Be aware that the class looks up configuration settings in app.config/Web.config and in the settings that you provide for your service in the .csdef/.cscfg files. If the same configuration string is present in multiple files, the specifications in the .csdef/.cscfg files are always the ones that are considered most relevant.
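    As a sketch of how this looks in code (assuming the StorageClient sample library; "mycontainer" is just an example name):

```csharp
using Microsoft.Samples.ServiceHosting.StorageClient;

// Reads AccountName, AccountSharedKey and BlobStorageEndpoint via the
// standard configuration strings, checking the .csdef/.cscfg settings
// first and then app.config/Web.config.
StorageAccountInfo account =
    StorageAccountInfo.GetDefaultBlobStorageAccountFromConfiguration();

BlobStorage storage = BlobStorage.Create(account);
BlobContainer container = storage.GetBlobContainer("mycontainer");
```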

  • Cloudy in Seattle

    Reading a Server Side XML File on Windows Azure

    • 4 Comments

    A question that I got at PDC was whether or not you could read an XML file contained in your Web Role when running on Windows Azure as some folks use this for metadata such as a site map.  The short answer is "yes", just as you would any ASP.Net Web Application (using MapPath to get at the file).

    As an example, you could use a DataSet to read an XML file you have in the App_Data folder:

    DataSet games = new DataSet();
    games.ReadXml(MapPath("App_Data/VideoGames.xml"));
    
    gameView.DataSource = games;
    gameView.DataBind();

    Where gameView is defined as:

    <asp:GridView ID="gameView" runat="server"/>

    Likewise, you can use an XmlDataSource to do the same thing:

    <asp:XmlDataSource ID="gameSource" DataFile="~/App_Data/VideoGames.xml" runat="server" />
    <asp:GridView ID="gameView2" DataSourceID="gameSource" runat="server" />
    

    Your XML file can reside in App_Data, or it could be at the root and everything will work as you expect -- I've tried this on the local Development Fabric as well as on Windows Azure to verify.

    That said, if you look at the Windows Azure trust policy, you'll see that you can't write to files in App_Data. 

    If you really have a need for a local store, say for caching, you can use the RoleManager.GetLocalResource() method.  You do so by setting the <LocalStorage/> tag in the Service Definition file (csdef) found in the Cloud Service project (ccproj):

    <LocalStorage name="tempStorage" sizeInMB="1"/>

    Then in your server side code, you can access that local storage like this:

    ILocalResource locRes = RoleManager.GetLocalResource("tempStorage");

    Where ILocalResource gives you a path where you can read/write files.  Just be aware that you can't count on any files you write always being there, as the Fabric could stop your Role Instance and spin it up on a different VM; any kind of state should be stored in Windows Azure storage.

    That is, if you have data that is changing, use Blob or Table Storage.  If you have a large amount of static data and want to cache a subset locally for better performance, you can use Local Storage.
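    Putting that together, writing and reading a cache file in local storage might look like this (a sketch; it assumes the "tempStorage" resource defined above, with RootPath pointing at the allocated directory):

```csharp
using System.IO;
using Microsoft.ServiceHosting.ServiceRuntime;

ILocalResource locRes = RoleManager.GetLocalResource("tempStorage");

// Treat the local store as a disposable cache, not durable state --
// the directory won't survive the instance moving to another VM.
string cacheFile = Path.Combine(locRes.RootPath, "cache.txt");
File.WriteAllText(cacheFile, "cached data");
string contents = File.ReadAllText(cacheFile);
```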

  • Cloudy in Seattle

    SpaceBlock File Transfer Utility now Supports Windows Azure Blob Storage

    • 2 Comments

    Right on the heels of my post on how to use the CloudDrive sample to access the logs for your service running on Windows Azure, John Spurlock posted a link to SpaceBlock on the Windows Azure forum that now supports Windows Azure Blob Storage.

    This is really cool and actually a much easier way to access logs, just type in your service name and primary access key and away you go:

    image

    Just makes it really easy to view and transfer to and from your blob storage account. 

    I hope we'll see local host Development Storage endpoint support soon (with the devstoreaccount1 info pre-populated).

    ClickOnce install is here.

  • Cloudy in Seattle

    Visual Studio "Publish" of a Large Windows Azure Cloud Service may Fail

    • 2 Comments

    Update: This has been fixed in the January 2009 CTP of the Windows Azure Tools for Microsoft Visual Studio

    Shan McArthur who's been raising a lot of great issues and giving a lot of great feedback on the Windows Azure forum has recently blogged about an issue when using the "Publish" functionality in Visual Studio.

    Hani did some research and found that in our use of types in the System.IO.Packaging namespace, there is a point at which the code switches to using IsolatedStorage (most likely based on memory consumption), and when that happens, we fail due to the way the AppDomain we're running in is set up.

    In other words, it doesn't have to do with complex references but ultimately the size of the package being created.

    As indicated in Shan's post, you can work around the issue by using cspack from the command line.
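    The command line looks something like this (a sketch; the project names and paths below are placeholders, and the exact switches are described in the SDK documentation):

```shell
cd C:\MySolution
cspack CloudService1\ServiceDefinition.csdef /role:WebRole1;WebRole1\ /out:CloudService1.cspkg
```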

    We are actively working on the issue.

  • Cloudy in Seattle

    Windows Azure Tools: Getting the right Run/Debug and Publish Functionality

    • 4 Comments

    If you're using the Windows Azure Tools for Microsoft Visual Studio and you are finding that when you hit F5 or Ctrl+F5 you are getting the ASP.Net Development server:

    image

    Instead of the Development Fabric and Development Storage services:

    image image

    You need to make sure that the startup project in Visual Studio is set to the ccproj that was added when you created a Cloud Service.  Notice below how the CloudService22 project is bold, which indicates that it is the startup project.

    image

    You can set the startup project by right clicking on a project and selecting "Set as StartUp Project":

    image

    A useful setting to look at in the Solution property pages (right click on the solution node in the Solution Explorer and click "Properties") is Common Properties -> Startup Project.

    image

    Having this set to "Single startup project" and selecting the Cloud Service (ccproj) project is the way to go.  Having it set to the "Current Selection" causes the startup project to change whenever you activate a file in a different project.

    Cloud Service projects should have this set to a single startup project already.

    If you don't have a Cloud Service project in your solution, make sure that when you created your Cloud Service project, you selected a template from the Cloud Service node and not the Roles node in the New Project -> Project types dialog:

    image

    See my post on the difference between Cloud Service and Role Templates for more information.

    A Related Note on Publish

    When hitting "Publish" to create the cspkg file that you upload to the Azure Services Developer Portal to deploy your Cloud Service on Windows Azure, you need to make sure that you publish the Cloud Service project and not the Web Role.

    image

    Otherwise you'll get this dialog, which is not what you want when working on Windows Azure services:

    image

    This can be especially confusing when you use the Build -> Publish option:

    image 

    This is context sensitive to the selected project (not startup project) -- so just make sure the name to the right of the Publish action corresponds to the name of your Cloud Service project.

  • Cloudy in Seattle

    F# Windows Azure Templates

    • 1 Comment

    Recently I posted about the Windows Live Tools Web Role Template that integrates with the Windows Azure Tools for Microsoft Visual Studio as well as the ASP.Net MVC sample that helps you to get started with an ASP.Net MVC app you want to run on Windows Azure.

    Today I want to point out that Luke from the F# team has put together a set of F# templates that enable you to build an F# Worker Role.  He posted about them here and you can download them here.

    image
