UPDATE: Please refer to an updated version of this post for up-to-date steps working with Azure SDK November CTP and Silverlight 3 and Silverlight 4 Beta.

Windows Azure is a great platform for taking traditional web applications and rich internet applications (RIAs) to the next level with its high availability and scalability. After all, it has a scalable web role, which many understand as "the cloud web server". However, it's not exactly the same thing, and your applications might not, and likely will not, work as usual after moving to Windows Azure. Any moderately complex web application, especially a line-of-business (LOB) application, may be affected by one or more of the following:

·         The Azure web role is stateless: it doesn't maintain any session state, and your applications shouldn't rely on being able to continue whatever they're doing at any given moment in time

·         There is no traditional database in the cloud, so most database access technologies either don't work or work differently in Azure

·         The Azure load balancer is controlled by the Azure fabric and you have no control over it, so your requests can be routed to any Azure instance, breaking some "sticky" communication patterns

·         Azure applications cannot rely on a specific web server IP address

·         There are limitations on open ports and component deployment (what's available is available, and upgrading or adding components might mean moving to a different service level agreement (SLA), so this is not a technical problem you can solve)

In this series of posts I want to show how to overcome some of these limitations while porting traditional (if that's the right word) Rich Internet Applications to Windows Azure. I'm going to be using Silverlight 3 Beta for this exercise. Now, you might wonder: why would a Silverlight application be affected by changes on the server? After all, it's a client technology, right? Yes and no. Even though Silverlight has only a client runtime, a Silverlight application (even in SL3) is never a 100% client application (like a mouse driver, for example), because it depends on the context provided by the server. It must therefore care about that server connection even if it's running out of browser, and especially if it's now serving as the UI for a cloud application.

I'm going to start with one of the most common scenarios: moving the data access layer to the cloud. Almost every application will probably face this, considering that a database in the cloud differs from a traditional RDBMS in many ways, and even though SQL Data Services (SDS) is closing the gap very quickly, you'd still need to do things differently both on the server (Azure web/worker role) and on the client (Silverlight). Building a simple DAL for a web-hosted Silverlight application has been pretty straightforward: point the ADO.NET EF wizard at your database, select the data for a model, wrap that model with an ADO.NET Data Services data service, expose it as a Silverlight-enabled WCF web service, add a service reference in your Silverlight application, and you're ready to go with a full set of CRUD capabilities on a nice-looking object model. With Azure, though, this breaks at the very first step: there's no Entity Framework available for either Azure Storage or SDS yet. Below I describe one way to port to Azure Table Storage with minimal changes (I'll cover SDS next month when it becomes publicly available with its new RDBMS-like architecture and API).

Note: Steps 1-4 build one server-side model class, step 5 builds a second, server-side data service class, and steps 7-9 build a third, client-side data access class. Dots (...) represent similar or auxiliary code.

Step 1. Create POCO classes to represent your data model.

POCO stands for plain old CLR objects and looks like this:

public class Report
{
      public string PartitionKey { get; set; }
      public string RowKey { get; set; }

      public string Title { get; set; }
      public string Description { get; set; }
      public string Solution { get; set; }
      public string InitiatorId { get; set; }
      public string SolverId { get; set; }
      public DateTime DateCreated { get; set; }

      public Report()
      {
      }

      public Report(string title, string description, string initiatorId, string solverId)
      {
             this.Title = title;
             this.Description = description;
             this.InitiatorId = initiatorId;
             this.SolverId = solverId;
             this.DateCreated = DateTime.UtcNow;
             // PartitionKey + RowKey must uniquely identify the entity.
             // Keying both on the title would make two reports with the
             // same title collide, so key on initiator and creation time
             // (the same scheme used in step 8 below).
             this.PartitionKey = initiatorId;
             this.RowKey = this.DateCreated.Ticks.ToString();
       }
}
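For illustration, here is a hypothetical way to create an entity with this class. The key values shown are my own choice; the only hard requirement Azure Table Storage imposes is that PartitionKey and RowKey together uniquely identify the entity within the table:

```csharp
// Hypothetical usage of the Report POCO above.
// Partitioning by initiator groups a user's reports together;
// a ticks-based RowKey keeps each entity unique within that partition.
Report report = new Report("Server down", "Web role instance unreachable",
                           "user-42", "user-7")
{
      PartitionKey = "user-42",
      RowKey = DateTime.UtcNow.Ticks.ToString()
};
```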

 

Step 2. Decorate your POCO classes with the DataServiceKey attribute:

      [DataServiceKey("PartitionKey", "RowKey")]
      public class Report
      {
            ...

Step 3. Create a data model class to be used with ADO.NET DS:

public partial class AzureModel : TableStorageDataServiceContext
{
     public IQueryable<Report> Reports
     {
           get { return this.CreateQuery<Report>("Reports"); }
     }
     ...
}

 

Step 4. Provide an implementation of IUpdatable

Now, this is the interesting part. You need an IUpdatable implementation because ADO.NET Data Services uses it to write changes back to the storage. The ADO.NET Entity Framework's ObjectContext does not implement IUpdatable, but its ObjectContextServiceProvider does, so building a data service on an ObjectContext gives you update support for free. Using LINQ to SQL with a DataContext results in using the ReflectionServiceProvider, which has no built-in support for IUpdatable operations and relies on the data service context to implement it; the same applies to any other DataServiceContext-based context classes, such as StorageClient's TableStorageDataServiceContext used here. There are several custom IUpdatable implementations out there, but this one by Tom Laird-McConnell is particularly simple and works in Azure without modifications. The only thing you need to do is add it to your model class (via a partial class):

 

      public partial class AzureModel : IUpdatable
      {
            #region IUpdatable Members
                  ...
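To give a sense of what such an implementation involves, here is a minimal sketch of two of the interface's members. This is illustrative code under my own assumptions, not Tom Laird-McConnell's; a real implementation must cover all of the interface's members (GetResource, ResetResource, SaveChanges, ResolveResource, and so on):

```csharp
public partial class AzureModel : IUpdatable
{
      // Create a new entity instance and start tracking it for insertion.
      object IUpdatable.CreateResource(string containerName, string fullTypeName)
      {
            // Type.GetType only resolves types in this assembly or mscorlib;
            // a robust implementation would search all loaded assemblies.
            object resource = Activator.CreateInstance(Type.GetType(fullTypeName));
            this.AddObject(containerName, resource);
            return resource;
      }

      // Copy a property value onto the target entity via reflection.
      void IUpdatable.SetValue(object targetResource, string propertyName, object propertyValue)
      {
            targetResource.GetType()
                          .GetProperty(propertyName)
                          .SetValue(targetResource, propertyValue, null);
      }
}
```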

Step 5. Create a data service in a web role

Add a new ADO.NET Data Service item to your Azure web role project. You'll have to reference your DAL project to get access to the model, and then you can base your data service on it:

namespace psoWebRole
{
      public class AzureDS : DataService<AzureModel>
      {
            public static void InitializeService(IDataServiceConfiguration config)
            {
                  config.SetEntitySetAccessRule("*", EntitySetRights.All);
                  config.SetServiceOperationAccessRule("*", ServiceOperationRights.All);
            }
      }
}
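Note that the wildcard rules above grant full read/write access to every entity set, which is convenient for development but too open for anything public-facing. A sketch of a more restrictive configuration (the entity set name "Reports" is taken from the model above; adjust rights to your scenario):

```csharp
public static void InitializeService(IDataServiceConfiguration config)
{
      // Reads allowed on Reports, writes limited to appending new entities
      config.SetEntitySetAccessRule("Reports",
            EntitySetRights.AllRead | EntitySetRights.WriteAppend);
}
```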

 

Step 6. Add a service reference to your data service in your Silverlight project, as you would for any other web service.

Now a client-side counterpart of your AzureModel class is generated as a data context and is almost ready to be used in Silverlight.

 

Step 7. Initialize data context with service URI

// This is a context similar to the entity context for ADO.NET EF -
// generated entities
AzureModel ctx = new AzureModel(new Uri("AzureDS.svc", UriKind.Relative));
// Safest merge option: doesn't lose any values that
// were changed on the client
ctx.MergeOption = MergeOption.PreserveChanges;

Please note: you might want to switch between the NoTracking and PreserveChanges merge options here to get maximum query performance, but more about this in the next part.
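As a rough illustration of that trade-off (assuming the ctx instance from above):

```csharp
// Read-only screens: skip identity resolution and change tracking
// for faster query materialization
ctx.MergeOption = MergeOption.NoTracking;

// Edit screens: re-running a query won't overwrite values
// already changed on the client
ctx.MergeOption = MergeOption.PreserveChanges;
```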

Step 8. Modify your Silverlight code so that it uses LINQ (via DataServiceQuery) to query objects and the context's BeginSaveChanges to make updates:

// Persist a report
public void CreateReport(string liveUserId, string title, string description)
{
      // Associate with the current user
      var query = (from c in ctx.Users
                  where c.LiveUserId == liveUserId
                  select c) as DataServiceQuery<User>;

      // Execute the query
      if (query != null) query.BeginExecute(new AsyncCallback(a =>
      {
             try
             {
                   matchingUsers = query.EndExecute(a);
                   currentUser = matchingUsers.FirstOrDefault();

                   Report report = new Report();
                   report.Title = title;
                   report.Description = description;
                   report.InitiatorId = currentUser.PartitionKey;
                   report.DateCreated = DateTime.UtcNow;
                   // Key on initiator and creation time so each entity is unique
                   report.PartitionKey = report.InitiatorId;
                   report.RowKey = report.DateCreated.Ticks.ToString();
                   ctx.AddToReports(report);
                   ctx.BeginSaveChanges(OnCreateReportChangesCompleted, report);
             }
             catch (Exception ex)
             {
                  if (CreateReportError != null)
                        CreateReportError(this, new MessagingErrorEventArgs(ex));
             }
       }), null);
}

 

Step 9. Call EndSaveChanges to persist changes to Azure Table Storage:

void OnCreateReportChangesCompleted(IAsyncResult result)
{
      try
      {
            ctx.EndSaveChanges(result);
            CreateReportComplete(this, new CreateReportEventArgs((result.AsyncState as Report).Title));
      }
      catch (DataServiceRequestException ex)
      {
            HtmlPage.Window.Alert("Data service error occurred while creating report: " + ex.Response);
      }
      catch (Exception ex)
      {
            HtmlPage.Window.Alert("Error occurred while creating report: " + ex.Message);
      }
}

 

Of course, I omitted the assembly reference and event handler steps for brevity, but that's what the process looks like in a nutshell. I hope it saves you some time.