Using the CloudDrive Sample to Access Windows Azure Logs


On Windows Azure, you can output trace messages when your Roles are running "in the cloud". 

You write the messages by calling the RoleManager.WriteToLog() API in Microsoft.ServiceHosting.ServiceRuntime. 

This post will cover how to:

  1. Copy the logs for your service running on Windows Azure to a blob storage container using the Azure Services Developer Portal
  2. Build and install the CloudDrive sample from the Windows Azure SDK
  3. Use the CloudDrive sample to access your Blob Storage Account
  4. Use CloudDrive to copy your logs to a local directory where you can open them

This post assumes that you have a service running on Windows Azure that makes use of the RoleManager.WriteToLog() API.  If needed, please refer to the Quick Lap around the Windows Azure Tools and Deploying a Service on Windows Azure walkthroughs.

You will also need to install Windows PowerShell.

On the project page for your Hosted Service:

[Screenshot: Hosted Service project page in the Azure Services Developer Portal]

Click the "Configure..." button.  You will be directed to a page where you can choose the Storage Account (note: this is the friendly name of the Storage Account, not the account name, which is important later) and specify the container name in Blob Storage where you want the logs to be copied.

Note that the container name has the same restrictions as DNS names.

[Screenshot: Configure page with Storage Account and container name fields]

After you click "Copy Logs", you'll get the following message.

[Screenshot: confirmation message shown after clicking "Copy Logs"]

So how do you get the logs from blob storage?  The easiest way is to use the CloudDrive sample in the Windows Azure SDK.

In the SDK install folder (C:\Program Files\Windows Azure SDK\v1.0 by default), you'll see a zip file (samples.zip).  Copy it to a writeable location (i.e. not in Program Files) and unzip it.

A useful document on using CloudDrive can be found by opening:

C:\{. . .}\samples\CloudDrive\CloudDriveReadme.mht

Follow the steps to build and register CloudDrive as a PowerShell provider:

To use CloudDrive, it must be registered as a PowerShell provider, which adds the registry entries PowerShell needs to locate the .dll.

  1. Open an elevated Windows Azure SDK command prompt by right clicking on Start | Programs | Windows Azure SDK (October 2008 CTP) | Windows Azure SDK Command Prompt
  2. Go to the CloudDrive sample folder
  3. Build CloudDrive using the provided “buildme.cmd” script.
  4. Install/Run CloudDrive using the provided “runme.cmd” from within an elevated command prompt.

After doing those steps, you can do the following:

  1. cd Blob:
  2. dir

This will list your blob containers in Development Storage.  Since I've been using the local Blob Storage, you can see that I do in fact get a list of blob containers:

[Screenshot: list of blob containers in Development Storage]

That's useful, but what I really want is to point this sample at the Storage Account where my logs have been saved.

In the C:\{. . .}\samples\CloudDrive\Scripts directory, you'll find a file called MountDrive.ps1. 

Create your own copy of this file, and modify the Account, Key, ServiceUrl and DriveName to match the values you got when creating your Storage Account on Windows Azure through the Azure Services Developer Portal.

For example, for the storage account I created with service name of "mvcproviderstorage":

[Screenshot: details for the mvcproviderstorage Storage Account]

Account: mvcproviderstorage
Key: (your Primary Access Key)
ServiceUrl: http://blob.core.windows.net
DriveName: MyStorage (choose what you like)

The modified file looks like this:

function MountDrive {
    Param (
        $Account = "<insert storage service name>",
        $Key = "<insert primary key>",
        $ServiceUrl = "http://blob.core.windows.net",
        $DriveName = "<insert drive name of your choosing>",
        $ProviderName = "BlobDrive")

    # PowerShell snap-in setup
    add-pssnapin CloudDriveSnapin -ErrorAction SilentlyContinue

    # Create the credentials
    $password = ConvertTo-SecureString -AsPlainText -Force $Key
    $cred = New-Object -TypeName Management.Automation.PSCredential -ArgumentList $Account, $password

    # Mount the storage service as a drive
    new-psdrive -psprovider $ProviderName -root $ServiceUrl -name $DriveName -cred $cred -scope global
}

MountDrive

Note that you could either pass the parameters in the call to MountDrive at the bottom, or change the default values and call MountDrive with no arguments.  I chose the latter, although you may prefer the former so that you can mount more than one drive with the same script.
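If you do keep the parameters, mounting two accounts from one script might look like the following sketch.  The second account name, the keys and the drive names below are placeholders; substitute your own values:

```powershell
# Assumes the MountDrive function from MountDrive.ps1 is defined above,
# with its parameters intact.  All account names and keys here are placeholders.
MountDrive -Account "mvcproviderstorage" -Key "<primary access key>" `
           -ServiceUrl "http://blob.core.windows.net" -DriveName "MyStorage"
MountDrive -Account "<second storage service name>" -Key "<its primary key>" `
           -ServiceUrl "http://blob.core.windows.net" -DriveName "OtherStorage"
```

Each call creates a separate PowerShell drive, so you can "cd" between the two accounts in the same session.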

Open a Windows PowerShell prompt and do the following:

  1. Run the version of MountDrive.ps1 you created
  2. "cd MyStorage:" (or whatever you called your DriveName followed by a colon)
  3. dir

You will now see the container that was created by the Azure Services Developer Portal when you chose to "copy logs".

"cd" to that container and you will see a subdirectory named WebRole if your service contains a Web Role, and a subdirectory named WorkerRole if your service contains a Worker Role.

Within the WebRole or WorkerRole directories, you will see subdirectories for each of the role instances, for example WebRole_IN_0 and WebRole_IN_1.  The log files inside those directories are split into 15-minute chunks.
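Putting the steps together, navigating down to one instance's logs looks roughly like this.  The drive name is the one chosen in this walkthrough, and the container name placeholder stands in for whatever you entered in the portal:

```powershell
cd MyStorage:                              # the DriveName you chose in MountDrive.ps1
dir                                        # shows the container the portal created
cd "<your container name>\WebRole\WebRole_IN_0\"
dir                                        # lists the Events_UTC_*.xml log files
```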

To copy a log file, do the following (make sure you can write to the destination folder): 

copy-cd Events_UTC_xyz.xml c:\file.log

To copy a directory, do the following (note the trailing '\'s: CloudDrive is stricter than normal PowerShell in requiring the trailing slash for directories, since files and directories can have the same name):

copy-cd WebRole\ c:\WebRole\

You can now open and examine your log files.  (Tip: Internet Explorer shows the logs formatted nicely)
