We’ve released an alpha of the “WebJobs SDK”, a simple framework that makes it crazy easy to write code that runs on Azure and binds against Azure storage. (The project was internally codenamed “SimpleBatch” and was also known to a few as “Project Awesome”.) Scott Hanselman has a great blog post about SimpleBatch, and there’s also an excellent getting-started tutorial.

The basic idea is that you write normal C# functions and then add bindings that connect their parameters to Azure storage (notably blobs, tables, and queues). We then have a dashboard that provides free diagnostics into your functions’ execution.

 

Simple code…

Consider an example where we want to resize an image and then stamp it with a watermark. We might write Resize() and ApplyWaterMark() functions like this:

    public class ImageFuncs
    {
        // Given an image stored in blob, resize it and save the output back to a blob.
        // Use WebImage class from System.Web.Helpers. 
        public static void Resize(
            [BlobInput(@"images-input/{name}")] Stream inputStream,
            [BlobOutput(@"images-output/{name}")] Stream outputStream)
        {
            WebImage input = new WebImage(inputStream);

            var width = 80;
            var height = 80;

            WebImage output = input.Resize(width, height);
            output.Save(outputStream);
        }

        // Take the resulting image from Resize() and stamp a watermark onto it. 
        // The watermark is the filename minus the extension. 
        public static void ApplyWaterMark(
            [BlobInput(@"images-output/{name}")] Stream inputStream,
            string name,
            [BlobOutput(@"images-watermarks/{name}")] Stream outputStream)
        {
            WebImage image = new WebImage(inputStream);
            image.AddTextWatermark(name, fontSize: 20, fontColor: "red");
            image.Save(outputStream);
        }
    }

    static class WebImageExtensions
    {
        // WebImage.Save() normally writes to a file path, so this extension
        // writes the encoded image bytes to an arbitrary Stream instead.
        public static void Save(this WebImage image, Stream output)
        {
            var bytes = image.GetBytes();
            output.Write(bytes, 0, bytes.Length);
        }
    }

The first thing to note is that these functions use regular .NET Framework types, so they can easily be unit tested in memory or against the local file system. This can really be just a normal console application – there’s nothing about it that needs to be Azure-aware.
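For example, a test can drive Resize() entirely in memory, with no JobHost and no Azure account. This is a sketch: it assumes a local test image named test.jpg exists and that the project references System.Web.Helpers (for WebImage).

```csharp
using System;
using System.IO;

class ResizeTests
{
    static void Main()
    {
        // "test.jpg" is a placeholder for any local test image.
        using (Stream input = File.OpenRead("test.jpg"))
        using (var output = new MemoryStream())
        {
            // Call the function directly -- no JobHost, no blob storage.
            ImageFuncs.Resize(input, output);

            // The resized image was written to the in-memory stream.
            Console.WriteLine(output.Length > 0);
        }
    }
}
```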

Bindings and Triggers

The magic above is in the [BlobInput] / [BlobOutput] attributes, which are pulled in via the Microsoft.WindowsAzure.Jobs NuGet package. These bind the Stream parameters directly to Azure blobs. The string passed to these attributes is a blob path of the form “container/blobname”, and {name} acts like a route parameter with basic pattern-matching rules. So “images-input/{name}” matches any blob in the container “images-input”, and the blob name is captured in the {name} route parameter.

As you can see in the ApplyWaterMark() function, you can capture route parameters directly via a parameter. E.g., the ‘string name’ parameter captures the {name} route parameter. This is useful if you need programmatic access to part of the blob’s name.
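To build intuition for the pattern matching, here’s an illustrative sketch (not the SDK’s actual matcher) of how a blob path pattern like “images-input/{name}” can be reduced to a regular expression with one named capture group per {token}:

```csharp
using System;
using System.Text.RegularExpressions;

public static class BlobPathPatternDemo
{
    // Sketch only: converts a blob path pattern such as "images-input/{name}"
    // into a regex, then matches an actual blob path against it.
    public static Match MatchBlobPath(string pattern, string blobPath)
    {
        string regex = "^" + Regex.Replace(
            Regex.Escape(pattern),
            @"\\\{(\w+)\}",   // Regex.Escape turned "{" into "\{"
            m => "(?<" + m.Groups[1].Value + ">[^/]+)") + "$";
        return Regex.Match(blobPath, regex);
    }
}
```

With this, matching “images-input/fruit.jpg” against “images-input/{name}” succeeds and captures {name} = “fruit.jpg”.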

The route parameters from the [BlobInput] are then passed to the [BlobOutput]. So Resize() would execute when a new blob “images-input/fruit.jpg” is detected, and it would produce an output blob “images-output/fruit.jpg”.

You may notice that the output of Resize() maps nicely to the input of ApplyWaterMark(). That provides a de facto way for one function to chain execution to another.

This example demonstrates bindings for blobs, but there are also bindings for queues and tables.
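As a rough illustration of the queue bindings, a queue-triggered function in the alpha looks something like the sketch below. The attribute names and the convention of taking the queue name from the parameter name reflect the alpha bits and may change in later releases, so treat this as a shape, not a contract:

```csharp
public class QueueFuncs
{
    // Sketch (alpha-era binding shape): runs when a message appears on the
    // "orderrequests" queue; in the alpha the queue name is inferred from
    // the parameter name. The value assigned to the out parameter is
    // enqueued on "orderresults".
    public static void ProcessOrder(
        [QueueInput] string orderrequests,
        [QueueOutput] out string orderresults)
    {
        orderresults = "processed: " + orderrequests;
    }
}
```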

 

Hosting and executing

So what actually reads the bindings, listens for the triggers, and invokes the functions? That’s the JobHost object (which lives in Microsoft.WindowsAzure.Jobs.Host). Here’s a canonical Main function:

        static void Main()
        {
            JobHost h = new JobHost();
            h.RunAndBlock();
        }

Short and sweet! With the default JobHost ctor, the Azure connection strings are pulled from the app.config:

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <connectionStrings>    
    <add name="AzureJobsRuntime" connectionString="DefaultEndpointsProtocol=https;AccountName=???;AccountKey=???"/>
    <add name="AzureJobsData" connectionString="DefaultEndpointsProtocol=https;AccountName=???;AccountKey=???"/>
  </connectionStrings>
  <startup>
    <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.5" />
  </startup>
</configuration>
AzureJobsData is the connection string that the BlobInput/BlobOutput attributes bind against.
AzureJobsRuntime is where the logging data goes. Both connection strings can point to the same account.

JobHost lives in Microsoft.WindowsAzure.Jobs.Host and it will:

  1. Get the Azure account connection strings from your app.config (the strings can also be passed directly to the ctor).
  2. Reflect over your code to find C# methods with the SimpleBatch attributes (much like how Web API discovers controllers).
  3. Listen for new blobs that match the [BlobInput] pattern.
  4. When a matching blob is found, invoke the function.
  5. Automatically log the invocation so that you can view the results in a separate dashboard.
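If you’d rather not rely on app.config, the connection strings can be passed straight to the constructor, per step 1 above. This is a sketch based on the alpha bits – check the package for the exact overload signature:

```csharp
static void Main()
{
    // Placeholder connection string; both roles can share one storage account.
    string conn = "DefaultEndpointsProtocol=https;AccountName=???;AccountKey=???";

    // Alpha-era overload taking the data and runtime (logging) connection
    // strings explicitly instead of reading them from app.config.
    JobHost h = new JobHost(conn, conn);
    h.RunAndBlock();
}
```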

 

The Dashboard

As functions execute, you’ll see the logs in the SimpleBatch Dashboard. The dashboard runs as an Azure Websites Site Extension and getting to the dashboard is detailed in the tutorial above.

Here’s what the dashboard homepage looks like. You’ll notice it lists the function invocation history on the right.

 

[Screenshot: the dashboard homepage, with the function invocation history on the right]

 

We can see that Resize() and ApplyWaterMark() each ran twice. We can click on an instance to see details about a specific execution:

[Screenshot: the details page for a single function execution]

 

Every function instance gets a unique GUID and a permalink showing its execution history. In this case, the dashboard is showing us:

  • basic information such as when the function ran, how long it took, and any console output
  • any exceptions thrown
  • an explanation of why this function ran in the first place (in this case, a new input blob was detected)
  • parameter information, including how many bytes each parameter read and even how long it spent in I/O
  • any child functions triggered by this function (e.g., Resize() creates a blob that causes ApplyWaterMark() to run)

There are additional navigation options, such as seeing which function instance wrote each of the input blobs.

Note that you get all of this diagnostic logging for free, without any additional instrumentation or configuration in your code. As Scott Hanselman said, “Minimal ceremony for maximum results”.

 

In summary

SimpleBatch makes it very easy to write code that runs on Azure, and it provides excellent diagnostics. It has two parts:

  1. Client-side dlls, available via the Microsoft.WindowsAzure.Jobs.Host NuGet package.
  2. A dashboard for viewing function results. The dashboard runs as an Azure Websites Site Extension.

 

 

Here are some more links to other resources