• Cloudy in Seattle

    Windows Azure Tools: Getting the right Run/Debug and Publish Functionality

    • 4 Comments

    If you're using the Windows Azure Tools for Microsoft Visual Studio and you are finding that when you hit F5 or Ctrl+F5 you are getting the ASP.Net Development server:

    image

    Instead of the Development Fabric and Development Storage services:

    image image

    You need to make sure that the startup project in Visual Studio is set to the ccproj that was added when you created a Cloud Service.  Notice below how the CloudService22 project is bold, which indicates that it is the startup project.

    image

    You can set the startup project by right clicking on a project and selecting "Set as StartUp Project":

    image

    A useful setting to look at is in the Solution property pages (right click on the solution node in the Solution Explorer and click "Properties"): Common Properties -> Startup Project.

    image

    Having this set to "Single startup project" and selecting the Cloud Service (ccproj) project is the way to go.  Having it set to the "Current Selection" causes the startup project to change whenever you activate a file in a different project.

    Cloud Service projects should have this set to a single startup project already.

    If you don't have a Cloud Service project in your solution, make sure that when you created your Cloud Service project, you selected a template from the Cloud Service node and not the Roles node in the New Project -> Project types dialog:

    image

    See my post on the difference between Cloud Service and Role Templates for more information.

    A Related Note on Publish

    When hitting "Publish" to create the cspkg file that you upload to the Azure Services Development Portal to deploy your Cloud Service on Windows Azure, you need to make sure that you publish on the Cloud Service project and not the Web Role.

    image

    Otherwise you'll get this dialog, which is not what you want when working on Windows Azure services:

    image

    This can be especially confusing when you use the Build -> Publish option:

    image 

    This is context sensitive to the selected project (not startup project) -- so just make sure the name to the right of the Publish action corresponds to the name of your Cloud Service project.

  • Cloudy in Seattle

    How to Diagnose and Fix Windows Azure Development Storage Service Issues

    • 2 Comments

    On this thread in the forum Frank Siegemund posted some great guidelines on how to diagnose and resolve issues related to using the development storage services on Windows Azure.

    I had a quick chat with Frank and asked him if I could repost that info in blog format.  Here it is, very useful:

    Let me provide some general guidelines on how to diagnose and resolve issues when using the development storage services. 

    The assumption is that you are using the StorageClient sample library from the Windows Azure SDK for accessing the storage services.

    (a) If your first access to any storage service fails, it could be due to any of the following explanations:
        (1) the service is not started (local development storage scenario)
        (2) you did not configure account information or storage endpoints correctly in the configuration files for your service
        (3) there is an error reading the account information

    (b) The easiest thing to check is issue (a1). 
        (1) Make sure the development storage tool is running (icon in the status bar; if it is not running, you can start it from its entry in the Start menu)
        (2) Make sure all services are running; right-click the development storage item and bring up the UI; the status for all services must be "running"
        (3) If you use table storage locally, make sure that you selected the right database; again, in the Development Storage UI you can see which database is selected in the Name column in the table storage row. The development storage has certain restrictions with respect to table names; follow the instructions in the Windows Azure documentation.
        (4) To make sure everything is in an initial state, you can reset all services in development storage using the UI
        (5) In rare cases you might have to clean up some port reservations by running: netsh http delete urlacl url=http://+:10002/

    (c) Checking issue (a2) requires you to look at the configuration files. There are two kinds of configuration files: The csdef/cscfg files for your service and the application configuration files (app.config, Web.config).

    When you use StorageClient, the StorageAccountInfo class will look into all of these files. In most cases you want to use the standard configuration strings (which makes it easier to access the configuration from StorageClient). The standard configuration strings are as follows:

    DefaultQueueStorageEndpointConfigurationString = "QueueStorageEndpoint"; 
    DefaultBlobStorageEndpointConfigurationString = "BlobStorageEndpoint"; 
    DefaultTableStorageEndpointConfigurationString = "TableStorageEndpoint"; 
    DefaultAccountNameConfigurationString = "AccountName"; 
    DefaultAccountSharedKeyConfigurationString = "AccountSharedKey";

    Be aware that when you configure endpoints with StorageClient in your configuration file, the account name is not part of the endpoint, but is specified separately!

    Here are two samples of a configuration in a cscfg file:

    <ConfigurationSettings>

    <Setting name="TableStorageEndpoint" value="http://127.0.0.1:10002/"/>  

    <Setting name="BlobStorageEndpoint" value="http://127.0.0.1:10000/"/>  

    <Setting name="AccountSharedKey" value="FjUfNl1…HHOlA=="/>  

    <Setting name="AccountName" value="devstoreaccount1"/>  

    </ConfigurationSettings>

    and

    <ConfigurationSettings>

    <Setting name="TableStorageEndpoint" value="http://table.core.windows.net/"/>  

    <Setting name="BlobStorageEndpoint" value="http://blob.core.windows.net/"/>  

    <Setting name="AccountSharedKey" value="FjUfNl1KiJ…OHHOlA=="/>  

    <Setting name="AccountName" value="myaccountname"/>  

    </ConfigurationSettings>

    Again: note that the account name is not specified in the endpoints.

    (d) Dealing with issue (a3) requires you to look at your code. In StorageClient, the class to use to read in configuration settings for accessing storage services is the StorageAccountInfo class. It has static methods that access the standard configuration strings (see above; this is used most often) or other configuration strings that you can specify explicitly.

    Be aware that the class looks up configuration settings in app.config/Web.config and in the settings that you provide for your service in the .csdef/.cscfg files. If the same configuration string is present in multiple files, the specifications in the .csdef/.cscfg files are always the ones that are considered most relevant.
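    To make that concrete, here is a minimal sketch of reading the standard configuration strings through StorageClient -- these are the same calls used in the walkthroughs elsewhere on this blog:

    using Microsoft.Samples.ServiceHosting.StorageClient;

    // Looks up AccountName, AccountSharedKey and BlobStorageEndpoint via the
    // standard configuration strings, checking app.config/Web.config as well as
    // the .csdef/.cscfg settings (the latter win when both are present).
    StorageAccountInfo blobAccount =
        StorageAccountInfo.GetDefaultBlobStorageAccountFromConfiguration();
    BlobStorage blobStorage = BlobStorage.Create(blobAccount);

    // The table storage equivalent reads TableStorageEndpoint instead.
    StorageAccountInfo tableAccount =
        StorageAccountInfo.GetDefaultTableStorageAccountFromConfiguration();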

  • Cloudy in Seattle

    F# Windows Azure Templates

    • 1 Comments

    Recently I posted about the Windows Live Tools Web Role Template that integrates with the Windows Azure Tools for Microsoft Visual Studio as well as the ASP.Net MVC sample that helps you to get started with an ASP.Net MVC app you want to run on Windows Azure.

    Today I want to point out that Luke from the F# team has put together a set of F# templates that enable you to build an F# Worker Role.  He posted about them here and you can download them here.

    image

  • Cloudy in Seattle

    Windows Live Web Role for Windows Azure

    • 2 Comments

    In my prior post that talked about the difference between the Cloud Service templates and the Role templates I mentioned that out of the box, we only supported one kind of Web Role -- an ASP.Net Web Application.  (That is, if you used the Roles node in the Cloud Service (ccproj) project to add or replace a Web Role, you only got the option to add an ASP.Net Web Application.)

    The cool thing is that we made the role templates extensible so other teams can add in Windows Azure versions of their templates.

    The first team to do this? The Windows Live Tools team -- they have an add-in to make incorporating Windows Live services into your Web application easier.

    Check out Vikas' post on the Windows Live Tools release that has templates for Windows Azure.  This is really cool:

    (and before you ask, there isn't an add-in for ASP.Net MVC at this point.  We do have a sample project that will make it easy for you to get started with MVC on Windows Azure)

  • Cloudy in Seattle

    PDC Hands on Labs

    • 1 Comments

    The Azure Services Training Kit - PDC Preview, which contains the Hands on Labs that folks are doing at the PDC, is now available for download.

    From the download page:

    The Azure Services Training Kit will include a comprehensive set of technical content including samples, demos, hands-on labs, and presentations that are designed to help you learn how to use the Azure Services Platform. This initial PDC Preview release includes the hands-on labs that were provided at the PDC 2008 conference. These labs cover the broad set of Azure Services including Windows Azure, .NET Services, SQL Services, and Live Services. Additional content will be included in future updates of this kit.

  • Cloudy in Seattle

    Windows Azure Walkthrough: Blob Storage Sample

    • 5 Comments

    Please see the updated post for November 2009 and later.

    This walkthrough covers what I found to be the simplest way to get a sample up and running on Windows Azure that uses the Blob Storage service. It is not trying to be comprehensive or to dive deep into the technology; it just serves as an introduction to how the Windows Azure Blob Storage Service works.

    Please take the Quick Lap Around the Tools before doing this walkthrough.

    Note: The code for this walkthrough is attached; you will still have to add and reference the Common and StorageClient projects from the Windows Azure SDK.

    After you have completed this walkthrough, you will have a Web Role that is a simple ASP.Net Web Application that shows a list of files that are stored and can be downloaded from Blob Storage. You can use the Web Role to add files to Blob storage and make them available in the list.

    image

    Blob Concepts

    Each storage account has access to blob storage. For each account there can be 0..n containers. Each container holds the actual blobs, each of which is a raw byte array. Containers can be public or private. In public containers, the URLs to the blobs can be accessed over the internet while in a private container, only the account holder can access those blob URLs.

    Each Blob can have a set of metadata set as a NameValueCollection of strings.

    image
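    To make the hierarchy concrete, here is a minimal sketch using the StorageClient sample library (these are the same calls used in the walkthrough below; the container and blob names are just examples):

    using System.Collections.Specialized;
    using Microsoft.Samples.ServiceHosting.StorageClient;

    StorageAccountInfo account =
        StorageAccountInfo.GetDefaultBlobStorageAccountFromConfiguration();
    BlobStorage storage = BlobStorage.Create(account);

    // One account, many containers; container names must be valid DNS names (lower case)
    BlobContainer container = storage.GetBlobContainer("samplecontainer");

    // Public access means the blob URLs are reachable over the internet
    container.CreateContainer(new NameValueCollection(), ContainerAccessControl.Public);

    // Each blob is a raw byte array plus optional string metadata
    NameValueCollection metadata = new NameValueCollection();
    metadata["FileName"] = "hello.txt";

    BlobProperties properties = new BlobProperties("hello.txt");
    properties.Metadata = metadata;
    container.CreateBlob(properties, new BlobContents(new byte[] { 1, 2, 3 }), true);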

    Creating the Cloud Service Project

    1. Start Visual Studio as an administrator

    2. Create a new project: File -> New -> Project

    3. Select “Web Cloud Service”. This will create the Cloud Service Project and an ASP.Net Web Role

    image

    4. Find the installation location of the Windows Azure SDK. By default this will be: C:\Program Files\Windows Azure SDK\v1.0

    a. Find the file named “samples.zip”

    b. Unzip this to a writeable location

    5. From the samples you just unzipped, add the StorageClient\Lib\StorageClient.csproj and HelloFabric\Common\Common.csproj to your solution by right-clicking on the solution in Solution Explorer and selecting Add –> Existing Project.

    image

    a. Common and StorageClient are libraries that are currently distributed as samples and provide functionality to help you build Cloud Applications. Common adds access to settings and logging, while StorageClient provides helpers for using the storage services.

    6. From your Web Role, add references to the Common and StorageClient projects you just added.

    image

    7. Add the code to connect to the Blob Storage Service in Default.aspx.cs.

    a. This code gets account information from the Service Configuration file and uses that to create a BlobStorage instance.

    b. From the BlobStorage instance, a container is created with a name taken from the Service Configuration. The name used for the container is restricted to valid DNS names.

    c. The container is made public so that the URLs for the blobs are accessible from the internet.

    using Microsoft.Samples.ServiceHosting.StorageClient;
    using Microsoft.ServiceHosting.ServiceRuntime;   // for RoleManager.GetConfigurationSetting
    using System.Collections.Specialized;            // for NameValueCollection
    namespace DownloadSite_WebRole
    {
        public partial class _Default : System.Web.UI.Page
        {
            private BlobContainer _Container = null;
    
            protected void Page_Load(object sender, EventArgs e)
            {
                try
                {
                    // Get the configuration from the cscfg file
                    StorageAccountInfo accountInfo = StorageAccountInfo.GetDefaultBlobStorageAccountFromConfiguration();
    
                    // Container names have the same restrictions as DNS names
                    BlobStorage blobStorage = BlobStorage.Create(accountInfo);
                    _Container = blobStorage.GetBlobContainer(RoleManager.GetConfigurationSetting("ContainerName"));
    
                    // returns false if the container already exists, ignore for now
                    // Make the container public so that we can hit the URLs from the web
                    _Container.CreateContainer(new NameValueCollection(), ContainerAccessControl.Public);
                    UpdateFileList();
                }
                catch (WebException webExcept)
                {
                }
                catch (Exception ex)
                {
                }
            }

    8. In order for the StorageAccountInfo class to find the configuration settings, open up ServiceDefinition.csdef and add the following to <WebRole/>. These define the settings.

        <ConfigurationSettings>
          <Setting name="AccountName"/>
          <Setting name="AccountSharedKey"/>
          <Setting name="BlobStorageEndpoint"/>
          <Setting name="ContainerName"/>
        </ConfigurationSettings>
    

    9. Likewise, add the actual local development values to the ServiceConfiguration.cscfg file. Note that the settings between both files have to match exactly otherwise your Cloud Service will not run.

    <ConfigurationSettings>
      <Setting name="AccountName" value="devstoreaccount1"/>
      <Setting name="AccountSharedKey" value="Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="/>
      <Setting name="BlobStorageEndpoint" value="http://127.0.0.1:10000/"/>
    
      <!-- Container, lower case letters only-->
      <Setting name="ContainerName" value="downloadsite"/>
    </ConfigurationSettings>
    

    10. When you run in the Cloud, the AccountName and AccountSharedKey will be set to the values you get back from the Portal for your account. The BlobStorageEndpoint will be set to the URL for the Blob Storage Service: http://blob.core.windows.net

    a. Because these are set in the ServiceConfiguration.cscfg file, these values can be updated even after deploying to the cloud by uploading a new Service Configuration.
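    For reference, here is a rough sketch of what the same settings might look like when targeting the cloud (the account name and shared key below are placeholders for the values you get from the portal):

    <ConfigurationSettings>
      <Setting name="AccountName" value="myaccountname"/>
      <Setting name="AccountSharedKey" value="...shared key from the portal..."/>
      <Setting name="BlobStorageEndpoint" value="http://blob.core.windows.net/"/>
      <Setting name="ContainerName" value="downloadsite"/>
    </ConfigurationSettings>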

    11. For the local development case, the local host and port 10000 (by default) will be used as the Blob Storage Endpoint. The AccountName and AccountSharedKey are hard coded to a value that the Development Storage service is looking for (it’s the same for all 3, Table, Blob and Queue services).

    12. Next open up Default.aspx and add the code for the UI. The UI consists of:

    a. a GridView at the top

    b. Label and FileUpload control

    c. 2 Label and TextBox pairs (File Name and Submitter)

    d. Field validators to ensure that all of the fields are filled out before the file is uploaded.

    Add the following between the template generated <div></div> elements:

            <asp:GridView ID="fileView"
            AutoGenerateColumns="false" DataKeyNames="BlobName"
            Runat="server" onrowcommand="RowCommandHandler">
            <Columns>
                <asp:ButtonField Text="Delete" CommandName="DeleteItem"/>
                <asp:HyperLinkField
                    HeaderText="Link"
                    DataTextField="FileName"
                    DataNavigateUrlFields="FileUri" />
                <asp:BoundField DataField="Submitter" HeaderText="Submitted by" />
            </Columns>
        </asp:GridView>
    
        <br />
        <asp:Label id="filePathLabel" 
            Text="File Path:" 
            AssociatedControlID="fileUploadControl"
            Runat="server" />
        <asp:FileUpload ID="fileUploadControl" runat="server"/>
        <asp:requiredfieldvalidator id="filUploadValidator"
          controltovalidate="fileUploadControl"
          validationgroup="fileInfoGroup"
          errormessage="Select a File"
          runat="Server">
        </asp:requiredfieldvalidator>
        <br />
        <asp:Label
            id="fileNameLabel"
            Text="File Name:"
            AssociatedControlID="fileNameBox"
            Runat="server" />
        <asp:TextBox
            id="fileNameBox"
            Runat="server" />
        <asp:requiredfieldvalidator id="fileNameValidator"
          controltovalidate="fileNameBox"
          validationgroup="fileInfoGroup"
          errormessage="Enter the File Name"
          runat="Server">
        </asp:requiredfieldvalidator>
        <br />
        <asp:Label
            id="submitterLabel"
            Text="Submitter:"
            AssociatedControlID="submitterBox"
            Runat="server" />
        <asp:TextBox
            id="submitterBox"
            Runat="server" />
        <asp:requiredfieldvalidator id="submitterValidator"
          controltovalidate="submitterBox"
          validationgroup="fileInfoGroup"
          errormessage="Enter the Submitter Name"
          runat="Server">
        </asp:requiredfieldvalidator>
        <br />
        <asp:Button
            id="insertButton"
            Text="Submit"
            causesvalidation="true"
            validationgroup="fileInfoGroup"
            Runat="server" onclick="insertButton_Click"/>
        <br />
        <br />
        <asp:Label id="statusMessage" runat="server"/>
    

    13. If you now switch to design view, you will see:

    image

    14. In order to simplify the databinding to the UI, let's add a FileEntry class that contains the metadata for each blob.

    public class FileEntry
    {
        public FileEntry(string blobName, Uri fileAddress, string name, string user)
        {
            BlobName = blobName;
            FileUri = fileAddress;
            FileName = name;
            Submitter = user;
        }
    
        public Uri FileUri
        {
            get;
            set;
        }
    
        public string BlobName
        {
            get;
            set;
        }
    
        public string FileName
        {
            get;
            set;
        }
    
        public string Submitter
        {
            get;
            set;
        }
    }

    15. Back in Default.aspx.cs, let's add code to populate the GridView with one row for each blob found in storage. A FileEntry is created for each blob found in the container and put in a List.

    a. Note that in order to get the metadata for a blob, you need to call BlobContainer.GetBlobProperties(); the list of blobs returned from BlobContainer.ListBlobs() does not contain the metadata.

    using System.Collections.Generic;
    using System.Collections.Specialized;
    private void UpdateFileList()
    {
        IEnumerable<object> blobs = _Container.ListBlobs(string.Empty, false);
        List<FileEntry> filesList = new List<FileEntry>();
    
        foreach (object o in blobs)
        {
            BlobProperties bp = o as BlobProperties;
            if (bp != null)
            {
                BlobProperties p = _Container.GetBlobProperties(bp.Name);
                NameValueCollection fileEntryProperties = p.Metadata;
                filesList.Add(new FileEntry(p.Name, bp.Uri, fileEntryProperties["FileName"], fileEntryProperties["Submitter"]));
            }
        }
    
        fileView.DataSource = filesList;
        fileView.DataBind();
    }

    16. Add the code to upload a file to blob storage.

    a. Create a unique blob name by using a Guid and appending the existing file extension.

    b. Add metadata

    i. For the file name (friendly name to show instead of the blob name or URL)

    ii. For the Submitter

    c. Add the bytes and create the Blob

    d. The UI is refreshed after this operation

    protected void insertButton_Click(object sender, EventArgs e)
    {
        // Make a unique blob name
        string extension = System.IO.Path.GetExtension(fileUploadControl.FileName);
        BlobProperties properties = new BlobProperties(Guid.NewGuid().ToString() + extension);
    
        // Create metadata to be associated with the blob
        NameValueCollection metadata = new NameValueCollection();
        metadata["FileName"] = fileNameBox.Text;
        metadata["Submitter"] = submitterBox.Text;
    
        properties.Metadata = metadata;
        properties.ContentType = fileUploadControl.PostedFile.ContentType;
    
        // Create the blob
        BlobContents fileBlob = new BlobContents(fileUploadControl.FileBytes);
        _Container.CreateBlob(properties, fileBlob, true);
    
        // Update the UI
        UpdateFileList();
        fileNameBox.Text = "";
        statusMessage.Text = "";
    }

    17. Add code to delete the blob. This is as simple as calling BlobContainer.DeleteBlob() passing in the blob name. In this case, it is the Guid + file extension generated during the upload.

    protected void RowCommandHandler(object sender, GridViewCommandEventArgs e)
    {
        if (e.CommandName == "DeleteItem")
        {
            int index = Convert.ToInt32(e.CommandArgument);
            string blobName = (string)fileView.DataKeys[index].Value;
    
            if (_Container.DoesBlobExist(blobName))
            {
                _Container.DeleteBlob(blobName);
            }
        }
        UpdateFileList();
    }

    18. Finally, let's round out some of the error handling in Page_Load(). Right after _Container.CreateContainer(), let's update the UI and properly catch any exceptions that could get thrown.

    using System.Net;
        _Container.CreateContainer(new NameValueCollection(), ContainerAccessControl.Public);
        UpdateFileList();
    }
    catch (WebException webExcept)
    {
        if (webExcept.Status == WebExceptionStatus.ConnectFailure)
        {
            statusMessage.Text = "Failed to connect to the Blob Storage Service, make sure it is running: " + webExcept.Message;
        }
        else
        {
            statusMessage.Text = "Error creating container: " + webExcept.Message;
        }
    }
    catch (Exception ex)
    {
        statusMessage.Text = "Error creating container: " + ex.Message;
    }

    19. Build and hit F5 to run the application.

    a. Notice that the Development Fabric and the Development Storage start up on your behalf and will continue to run until you close them.

    b. Note that the FileUpload control has a file size limit. You can modify it by changing the httpRuntime maxRequestLength attribute in web.config.
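    For example, a sketch of that web.config change (maxRequestLength is specified in KB, so the 10240 below, roughly 10 MB, is just an illustrative value):

    <system.web>
      <!-- maxRequestLength is in KB -->
      <httpRuntime maxRequestLength="10240" />
    </system.web>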

    image

    Please see the Deploying a Cloud Service walkthrough to learn how to modify the configuration of this Cloud Service to make it run on Windows Azure.

  • Cloudy in Seattle

    ASP.Net MVC Projects running on Windows Azure

    • 28 Comments

    [For more recent information on using ASP.NET MVC with Windows Azure please see this post.]

    Before you get started with ASP.Net MVC and Windows Azure – please install this hotfix.

    Strictly speaking, ASP.Net MVC projects are not supported on Windows Azure.  That is to say that we haven't spent the time to fully test all of the MVC scenarios when running on Windows Azure. 

    That said, for the most part, they do work, just as long as you know what tweaks you need to make in order to get up and running.

    I've attached a sample application that Phil and Eilon on the MVC team put together to help make it easier for you to get started.

    I’ll walk through the changes:

    1. Start by creating a Blank Cloud Service.  File –> New Project –> Visual C# –> Cloud Service –> Blank Cloud Service.  I call it MVCCloudService

    image

    2. Right click on the solution node in Solution Explorer and select “Add New Project”

    image

    3. In the Add New Project dialog, navigate to the Web node under Visual C# or Visual Basic and select the ASP.Net MVC Application (I call it MVCWebRole). 

    image

    4. In Solution Explorer, right click on MVCWebRole and select “Unload Project”

    image

    5. Right click again on MVCWebRole and select “Edit MVCWebRole.csproj”

    image

    6. Add <RoleType>Web</RoleType> to the top PropertyGroup in the project file.

    <Project ToolsVersion="3.5" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
      <PropertyGroup>
    {. . .}
        <RoleType>Web</RoleType>

    7. Reload the project; be sure to save the project file.

    image 

    8. If you want to use the Windows Azure runtime library, add a reference to Microsoft.ServiceHosting.ServiceRuntime.dll by right clicking on the References node and selecting “Add reference…”.  Scroll down in the .Net Tab and you’ll find it.

    image
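    As a quick illustration of what the runtime library gives you, here is a minimal sketch (the controller and setting names are made up for this example; RoleManager.GetConfigurationSetting is the same call used in the storage walkthroughs on this blog, and the WriteToLog signature shown is an assumption):

    using Microsoft.ServiceHosting.ServiceRuntime;
    using System.Web.Mvc;

    namespace MVCWebRole.Controllers
    {
        public class StatusController : Controller
        {
            public string Setting()
            {
                // Reads a value defined in ServiceDefinition.csdef / ServiceConfiguration.cscfg
                string value = RoleManager.GetConfigurationSetting("MySetting");

                // Writes to the Development Fabric / Windows Azure logs
                RoleManager.WriteToLog("Information", "Read setting: " + value);

                return value;
            }
        }
    }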

    9. Right click on the Roles node in the MVCCloudService project and select Add –> Web Role Project in solution…

    image

    Select the MVCWebRole project.

    image

    10. Set Copy Local = true on the MVC DLLs:

    • System.Web.Abstractions
    • System.Web.Mvc
    • System.Web.Routing

    These assemblies are not available when running on Windows Azure and need to be deployed with the Service Package.

    Expand the References node in the MVCWebRole and right click on System.Web.Abstractions and select Properties.  Change “Copy Local” to true.  Repeat for the other DLLs

    image
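    If you prefer to edit the project file directly (as you did in step 5), Copy Local corresponds to the <Private> element on each reference. A rough sketch of what the entries end up looking like (version and key attributes are omitted here since they vary by MVC release):

    <ItemGroup>
      <Reference Include="System.Web.Abstractions">
        <Private>True</Private>
      </Reference>
      <Reference Include="System.Web.Mvc">
        <Private>True</Private>
      </Reference>
      <Reference Include="System.Web.Routing">
        <Private>True</Private>
      </Reference>
    </ItemGroup>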

    With these changes, the edit, build, debug, publish and deploy functionality will all work in Visual Studio with the Windows Azure Tools for Microsoft Visual Studio installed.

    That said, it is still "use at your own risk".

    Note: The sample has not been modified to use the Windows Azure ASP.Net provider (for example, the membership provider), stay tuned.

    The sample project is attached to this blog post.

  • Cloudy in Seattle

    Try Windows Azure Now

    • 1 Comments

    Want to try Windows Azure right now?  Install the following:

    This will give you a local version of the Fabric and Storage services (called the Development Fabric and Development Storage) that will allow you to build, run and debug your Cloud Services just as they would run on Windows Azure.

  • Cloudy in Seattle

    Video Walkthrough: A Quick Lap around Windows Azure Tools for Microsoft Visual Studio

    • 6 Comments

    Watch my screencast that introduces you to the Windows Azure Tools for Microsoft Visual Studio.

    http://wm.microsoft.com/ms/msdn/azure/visualstudioazure.wmv

    image

  • Cloudy in Seattle

    Windows Azure Walkthrough: Table Storage

    • 19 Comments

    Please see the updated post for November 2009 and later

    This walkthrough covers what I found to be the simplest way to get a sample up and running on Windows Azure that uses the Table Storage Service. It is not trying to be comprehensive or to dive deep into the technology; it just serves as an introduction to how the Table Storage Service works.

    Please take the Quick Lap Around the Tools before doing this walkthrough.

    Note: The code for this walkthrough is attached; you will still have to add and reference the Common and StorageClient projects from the Windows Azure SDK.

    After you have completed this walkthrough, you will have a Web Role that is a simple ASP.Net Web Application that shows a list of Contacts and allows you to add to and delete from that list. Each contact will have simplified information: just a name and an address (both strings).

    image

    Table Storage Concepts

    The Windows Azure Table Storage Service provides queryable structured storage. Each account can have 0..n tables.

    image

    Design of the Sample

    When a request comes in to the UI, it makes its way to the Table Storage Service as follows:

    clip_image002

    The UI class (the aspx page and its code-behind) is data bound through an ObjectDataSource to the SimpleTableSample_WebRole.ContactDataSource, which creates the connection to the Table Storage service, gets the list of Contacts, and inserts to and deletes from Table Storage.

    The SimpleTableSample_WebRole.ContactDataModel class acts as the data model object and the SimpleTableSample_WebRole.ContactDataServiceContext derives from TableStorageDataServiceContext which handles the authentication process and allows you to write LINQ queries, insert, delete and save changes to the Table Storage service.

    Creating the Cloud Service Project

    1. Start Visual Studio as an administrator

    2. Create a new project: File -> New -> Project

    3. Select “Web Cloud Service”. This will create the Cloud Service Project and an ASP.Net Web Role. Call it “SimpleTableSample”

    image

    4. Find the installation location of the Windows Azure SDK. By default this will be: C:\Program Files\Windows Azure SDK\v1.0

    a. Find the file named “samples.zip”

    b. Unzip this to a writeable location

    5. From the samples you just unzipped, add the StorageClient\Lib\StorageClient.csproj and HelloFabric\Common\Common.csproj to your solution by right-clicking on the solution in Solution Explorer and selecting Add -> Existing Project.

    image

    a. Common and StorageClient are libraries that are currently distributed as samples and provide functionality to help you build Cloud Applications. Common adds access to settings and logging, while StorageClient provides helpers for using the storage services.

    6. From your Web Role, add references to the Common and StorageClient projects you just added, along with a reference to System.Data.Services.Client.

    image

    image

    7. Add a ContactDataModel class to your Web Role that derives from TableStorageEntity. For simplicity, we’ll just assign a new Guid as the PartitionKey to ensure uniqueness. This default of assigning the PartitionKey and setting the RowKey to a hard coded value (String.Empty) gives the storage system the freedom to distribute the data.

    using Microsoft.Samples.ServiceHosting.StorageClient;
    public class ContactDataModel : TableStorageEntity
    {
        public ContactDataModel(string partitionKey, string rowKey)
            : base(partitionKey, rowKey)
        {
        }
    
        public ContactDataModel()
            : base()
        {
            PartitionKey = Guid.NewGuid().ToString();
            RowKey = String.Empty;
        }
    
        public string Name
        {
            get;
            set;
        }
    
        public string Address
        {
            get;
            set;
        }
    }

    8. Now add the ContactDataServiceContext to the Web Role that derives from TableStorageDataServiceContext.

    a. We’ll use this later to write queries, insert, remove and save changes to the table storage.

    using Microsoft.Samples.ServiceHosting.StorageClient;

    internal class ContactDataServiceContext : TableStorageDataServiceContext
    {
        internal ContactDataServiceContext(StorageAccountInfo accountInfo)
            : base(accountInfo)
        {
        }
    
        internal const string ContactTableName = "ContactTable";
    
        public IQueryable<ContactDataModel> ContactTable
        {
            get
            {
                return this.CreateQuery<ContactDataModel>(ContactTableName);
            }
        }
    }

    9. Next add a ContactDataSource class. We'll fill this class out over the course of the next few steps.  This is the class that does all the hookup between the UI and the table storage service. Starting with the first part of the constructor, a StorageAccountInfo class is instantiated in order to get the settings required to make a connection to the Table Storage Service. (Note that this is just the first part of the constructor code; the rest is in step 12.)

    using Microsoft.Samples.ServiceHosting.StorageClient;
    using System.Data.Services.Client;
    public class ContactDataSource
    {
        private ContactDataServiceContext _ServiceContext = null;
    
        public ContactDataSource()
        {
            // Get the settings from the Service Configuration file
        StorageAccountInfo account =
            StorageAccountInfo.GetDefaultTableStorageAccountFromConfiguration();

    10. In order for the StorageAccountInfo class to find the configuration settings, open up ServiceDefinition.csdef and add the following to <WebRole/>. These define the settings.

        <ConfigurationSettings>
          <Setting name="AccountName"/>
          <Setting name="AccountSharedKey"/>
          <Setting name="TableStorageEndpoint"/>
        </ConfigurationSettings>  
    

    11. Likewise, add the actual local development values to the ServiceConfiguration.cscfg file. Note that the settings between both files have to match exactly otherwise your Cloud Service will not run.

    • When you run in the Cloud, the AccountName and AccountSharedKey will be set to the values you get back from the Portal for your account. The TableStorageEndpoint will be set to the URL for the Table Storage Service: http://table.core.windows.net
    • Because these are set in the ServiceConfiguration.cscfg file, these values can be updated even after deploying to the cloud by uploading a new Service Configuration.
    • For the local development case, the local host and port 10002 (by default) will be used as the Table Storage Endpoint. The AccountName and AccountSharedKey are hard coded to a value that the Development Storage service is looking for (it’s the same for all 3, Table, Blob and Queue services).
        <ConfigurationSettings>
          <Setting name="AccountName" value="devstoreaccount1"/>
          <Setting name="AccountSharedKey" value="Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="/>
          <Setting name="TableStorageEndpoint" value="http://127.0.0.1:10002/"/>
        </ConfigurationSettings>

    12. Next, continue to fill out the constructor (just after the call to GetDefaultTableStorageAccountFromConfiguration()) by instantiating the ContactDataServiceContext. Set the RetryPolicy, which applies only to the methods on the DataServiceContext (i.e. SaveChanges()) and not to queries.

        // Create the service context we'll query against
        _ServiceContext = new ContactDataServiceContext(account);
        _ServiceContext.RetryPolicy = RetryPolicies.RetryN(3, TimeSpan.FromSeconds(1));
    }

    13. We need some code to ensure that the tables we rely on get created.  We'll do this on first request to the web site -- which can be done by adding code to one of the handlers in the global application class.  Add a global application class by right clicking on the web role and selecting Add -> New Item -> Global Application Class. (see this post for more information)

    image

    14. Add the following code to global.asax.cs to create the tables on first request:

    using Microsoft.Samples.ServiceHosting.StorageClient;
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;
        HttpContext context = app.Context;
    
        // Attempt to perform first request initialization
        FirstRequestInitialization.Initialize(context);
    
    }

    And the implementation of the FirstRequestInitialization class:

    internal class FirstRequestInitialization
    {
        private static bool s_InitializedAlready = false;
        private static Object s_lock = new Object();
    
    
        // Initialize only on the first request
        public static void Initialize(HttpContext context)
        {
            if (s_InitializedAlready)
            {
                return;
            }
    
            lock (s_lock)
            {
                if (s_InitializedAlready)
                {
                    return;
                }
    
                ApplicationStartUponFirstRequest(context);
                s_InitializedAlready = true;
            }
        }
    
        private static void ApplicationStartUponFirstRequest(HttpContext context)
        {
            // This is where you put initialization logic for the site.
            // RoleManager is properly initialized at this point.
    
            // Create the tables on first request initialization as there is a performance impact
            // if you call CreateTablesFromModel() when the tables already exist. This limits the exposure of
            // creating tables multiple times.
    
            // Get the settings from the Service Configuration file
            StorageAccountInfo account = StorageAccountInfo.GetDefaultTableStorageAccountFromConfiguration();
    
            // Create the tables
            // In this case, just a single table.  
            // This will create tables for all public properties that are IQueryable (collections)
            TableStorage.CreateTablesFromModel(typeof(ContactDataServiceContext), account);
        }
    }

    15. When running in the real cloud, this code is all that is needed to create the tables for your Cloud Service. The TableStorage class reflects over the ContactDataServiceContext class and creates a table for each IQueryable<T> property where the columns of that table are based on the properties of the type T of the IQueryable<T>.

    a. There is a bit more to do in order to get this to work in the local Development Storage case, more on that later.

    16. At this point, it’s just a matter of filling out the ContactDataSource class with methods to query for the data, insert and delete rows. This is done through LINQ and using the ContactDataServiceContext.

    a. Note: in the Select() method, the TableStorageDataServiceQuery<T> class enables you to have finer grained control over how you get the data.

    i. Execute() or ExecuteWithRetries() will access the data store and return up to the first 1000 elements.

    ii. ExecuteAll() or ExecuteAllWithRetries() will return all of the elements with continuation as you enumerate over the data.

    iii. ExecuteWithRetries() and ExecuteAllWithRetries() uses the retry policy set on the ContactDataServiceContext for the queries.

    b. Note: the use of AttachTo() in the Delete() method to connect to and remove the row.

    public IEnumerable<ContactDataModel> Select()
    {
        var results = from c in _ServiceContext.ContactTable
                      select c;
    
        TableStorageDataServiceQuery<ContactDataModel> query =
            new TableStorageDataServiceQuery<ContactDataModel>(results as DataServiceQuery<ContactDataModel>);
        IEnumerable<ContactDataModel> queryResults = query.ExecuteAllWithRetries();

        return queryResults;
    }

    public void Delete(ContactDataModel itemToDelete)
    {
        _ServiceContext.AttachTo(ContactDataServiceContext.ContactTableName, itemToDelete, "*");
        _ServiceContext.DeleteObject(itemToDelete);
        _ServiceContext.SaveChanges();
    }

    public void Insert(ContactDataModel newItem)
    {
        _ServiceContext.AddObject(ContactDataServiceContext.ContactTableName, newItem);
        _ServiceContext.SaveChanges();
    }

    17. The UI is defined in the aspx page and consists of 3 parts. The GridView which will display all of the rows of data, the FormView which allows the user to add rows and the ObjectDataSource which databinds the UI to the ContactDataSource.

    18. The GridView is placed after the first <div>. Note that in this sample, we'll just auto-generate the columns and show the delete button. The DataSourceId is set to the ObjectDataSource, which will be covered below.

        <asp:GridView
            id="contactsView"
            DataSourceId="contactData"
            DataKeyNames="PartitionKey"
            AllowPaging="False"
            AutoGenerateColumns="True"
            GridLines="Vertical"
            Runat="server" 
            BackColor="White" ForeColor="Black"
            BorderColor="#DEDFDE" BorderStyle="None" BorderWidth="1px" CellPadding="4">
            <Columns>
                <asp:CommandField ShowDeleteButton="true"  />
            </Columns>
            <RowStyle BackColor="#F7F7DE" />
            <FooterStyle BackColor="#CCCC99" />
            <PagerStyle BackColor="#F7F7DE" ForeColor="Black" HorizontalAlign="Right" />
            <SelectedRowStyle BackColor="#CE5D5A" Font-Bold="True" ForeColor="White" />
            <HeaderStyle BackColor="#6B696B" Font-Bold="True" ForeColor="White" />
            <AlternatingRowStyle BackColor="White" />
        </asp:GridView>    

    19. The FormView to add rows is really simple: just labels and text boxes with a button at the end to raise the "Insert" command. Note that the DataSourceID is again set to the ObjectDataSource and there are bindings to the Name and Address.

        <br />        
        <asp:FormView
            id="frmAdd"
            DataSourceId="contactData"
            DefaultMode="Insert"
            Runat="server">
            <InsertItemTemplate>
                <asp:Label
                        id="nameLabel"
                        Text="Name:"
                        AssociatedControlID="nameBox"
                        Runat="server" />
                <asp:TextBox
                        id="nameBox"
                        Text='<%# Bind("Name") %>'
                        Runat="server" />
                <br />
                <asp:Label
                        id="addressLabel"
                        Text="Address:"
                        AssociatedControlID="addressBox"
                        Runat="server" />
                <asp:TextBox
                        id="addressBox"
                        Text='<%# Bind("Address") %>'
                        Runat="server" />
                <br />
                <asp:Button
                        id="insertButton"
                        Text="Add"
                        CommandName="Insert"
                        Runat="server"/>
            </InsertItemTemplate>
        </asp:FormView>
    

    20. The final part of the aspx is the definition of the ObjectDataSource. See how it ties the ContactDataSource and the ContactDataModel together with the GridView and FormView.

        <%-- Data Sources --%>
        <asp:ObjectDataSource runat="server" ID="contactData"
            TypeName="SimpleTableSample_WebRole.ContactDataSource"
            DataObjectTypeName="SimpleTableSample_WebRole.ContactDataModel"
            SelectMethod="Select"
            DeleteMethod="Delete"
            InsertMethod="Insert">
        </asp:ObjectDataSource>

    21. Build. You should not have any compilation errors; all 4 projects in the solution should build successfully.

    22. Create Test Storage Tables. As mentioned in step 15, creating tables in the Cloud is all done programmatically; however, there is an additional step that is needed in the local Development Storage case.

    a. In the local development case, tables need to be created in the SQL Express database that the local Development Storage uses for its storage. These need to correspond exactly to the runtime code. This is due to a current limitation in the local Development Storage.

    b. Right-click on the Cloud Service node in Solution Explorer and select "Create Test Storage Tables". This runs a tool that uses reflection to create the tables you need in a SQL Express database whose name corresponds to your Solution name.

    image

    i. ContactDataServiceContext is the type that the tool will look for and use to create those tables on your behalf.

    ii. Each IQueryable<T> property on ContactDataServiceContext will have a table created for it where the columns in that table will correspond to the public properties of the type T of the IQueryable<T>.

    23. F5 to debug. You will see the app running in the Development Fabric using the Table Development Storage.

    image

    Please see the Deploying a Cloud Service walkthrough to learn how to modify the configuration of this Cloud Service to make it run on Windows Azure.

  • Cloudy in Seattle

    Submitting a Bug against the Windows Azure Tools and SDK

    • 2 Comments

    If you have questions you want to have answered, or want to participate in community discussions, the Windows Azure Forum is a great place to go. 

    If you've found a bug or have a suggestion, submitting a bug/suggestion directly to our product team is a great way to help us identify the issues that exist and to help us build a better product. 

    Reporting a bug also allows you to track the progress of the issue, which is harder to do if we open a bug report for you in our internal systems. Also, customer reported issues get higher priority, so they're more likely to be fixed. 

    To submit a bug against the Windows Azure Tools and SDK, please go to the Microsoft Visual Studio Connect site:

    https://connect.microsoft.com/VisualStudio

    Sign in with your Live ID and click on "Submit a bug" or "Submit a Suggestion".

    You will then be taken to the following page:

    image

    In the Version combobox, you can select either "Windows Azure SDK" or "Windows Azure Tools for Visual Studio".

    image

    Oftentimes bugs are hard to reproduce, which makes it hard for us to address them. If you can let us know the exact steps for reproducing the problem from scratch, that makes it easier for us to track down and fix the bug. 

    What behavior you expected as well as stack traces, error messages and any other diagnostic information really helps as well.

    You can also attach a solution or project that reproduces the problem. If at all possible, please narrow down the code and the solution to the simplest possible form that reproduces the problem. Please remove any intellectual property or personally identifiable information from your samples. 

  • Cloudy in Seattle

    MSDN Articles on Windows Azure Tools

    • 0 Comments

    I've posted a number of technical articles on MSDN that will help you to get acquainted with the Windows Azure Tools for Microsoft Visual Studio.  Have a look:

    Introduction

    Deploying a Service (Interesting for folks who don't have a token yet but want to see the experience)

    Creating an HTTPS endpoint

    Help

     

  • Cloudy in Seattle

    Cloud Service Project Templates vs Role Project Templates

    • 5 Comments

    If you create a new project in Visual Studio intending to build a Windows Azure service but your solution doesn't contain a Cloud Service project (ccproj), you won't get the Visual Studio integration for building Windows Azure services that you would expect.  For example, you won't get debug/run in the Development Fabric, you won't get integration with the Development Storage service, and you won't get publishing support.

    The reason is that you have created a Role project instead of a Cloud Service project. Today, you can have 0 or 1 Web Roles (an ASP.Net Web Application) and 0 or 1 Worker Roles (a UI-less .Net application) in each service.

    When you create a new Cloud Service project, make sure you are choosing a template from the "Cloud Service" node and not the Roles node in the New Project dialog:

    image

    The question would then be, what is the Roles node in the New Project dialog used for?

    It's used when you want to:

    1) Add a role to an existing Cloud Service project.  For example, the following Cloud Service project contains a web role.  You can use the roles node to add a Worker Role project to that Cloud Service.

    image

    This will bring up the new project dialog with the Roles node selected.  Out of the box, we only support C# and VB ASP.Net Web Application Web Roles and C# and VB Worker Roles.

    image

    Likewise, if the Cloud Service contains a Worker Role and not a Web Role, you can add a Web Role to that project in the same way.

    2) Replace an existing Role with a new Role Project.  For example, you have a Cloud Service that contains a Web Role, and you want to replace it with a new Web Role.  You can right click on the Web Role association and select to associate it with a new Web Role:

    image

    This will bring up the new project dialog with the Roles node selected:

    image 

    The final thing you can do, is associate a role with an existing Role project by adding that Role project to the solution, right clicking on either the Web or Worker role association node in Solution Explorer and select to associate that role with a "Role Project in solution..."

  • Cloudy in Seattle

    Using the Windows Azure SDK Samples in Visual Studio

    • 2 Comments

    The Windows Azure SDK contains a set of samples that will help you get started with and to learn about Windows Azure.  In this post, I want to cover some potential issues that some of you may run into. 

    Potential issue #1: You try to load one of the Windows Azure SDK Samples, found as a zip file in the Windows Azure SDK install folder (Default is: C:\Program Files\Windows Azure SDK\v1.0), by opening the sln (for example HelloFabric.sln) and you receive the following error:

    image

    ccproj cannot be opened because its project type (.ccproj) is not supported by this version of the application.  To open it, Please use a version that supports this type of project.

    Solution: You've installed the Windows Azure SDK but have not installed the Windows Azure Tools for Microsoft Visual Studio.  Install the tools to get Visual Studio support for building Windows Azure services.

    Potential Issue #2:

    You click File->New->Project or "Create Project" from the start page and you don't see the option to create Cloud Service projects:

    i.e. you see this:

    image

    Instead of this:

    image

    Solution: As with potential issue #1, you've installed the Windows Azure SDK but have not installed the Windows Azure Tools for Microsoft Visual Studio.  Install the tools to get Visual Studio support for building Windows Azure services.

    Potential Issue #3:

    You try to open one of the samples in the Windows Azure SDK by opening up the sln in Visual Studio and receive one or both of the following security warnings:

    image

    image 

    Solution: You are opening the files from an untrusted source location (such as the Program Files directory or a file share).  Copy the samples to a writeable, full trust location.

    Potential Issue #4:

    You hit "F5" to debug and you see the ASP.Net Development Server instead of the Development Fabric:

    image

    Or, you click "Publish" and you see the following dialog instead of packaging and navigating to the Azure Services Developer Portal:

    image

    Solution: You need to ensure that the Cloud Service project is the startup project.  In Solution Explorer, right-click the project that has the globe and 3 blocks and select "Set as Startup Project". 

    In Visual Web Developer Express Edition, you simply need to click on that project in order to make it bold.

    image

    The bolded project in Solution Explorer is the startup project.

    image

  • Cloudy in Seattle

    Now: Cloud Computing

    • 1 Comments

    As some of you may have noticed, this blog has gone dark for a while.  During that time it got an interesting face lift and a new title "Cloudy in Seattle".

    As it turns out, I switched teams and am now a proud member of the (Cloud) Computing Tools team... but that also meant I couldn't blog about the cool things I've been working on. 

    With the announcement in today's keynote at the PDC, I'm happy to say I can now start blogging about it, about the Azure Services Platform and in particular, Windows Azure.

    When you distill it out, Windows Azure is all about hosting your web applications, storage, management and development.  That is, you can host, scale and manage web applications in Microsoft data centers.

    At the same time, it incorporates some truly cool features like the Fabric controller that manages resources, load balances across your running instances (which incidentally, you can configure on the fly and as easily as editing or uploading a new configuration file) and manages upgrades and failures to maintain availability.

    My part in all this?  You guessed it, I'm working on developer tools -- that's my passion.  My product: Windows Azure Tools for Microsoft Visual Studio.  An add-in for Visual Studio 2008 that brings the developer experience you would expect for building Cloud Applications to Visual Studio.

    The tools make it easy to get started, configure, edit, build, package, debug and publish for deployment.

    I'm really jazzed about what's ahead, and I'm jazzed to be able to start sharing.  Stay tuned.

  • Cloudy in Seattle

    VSX Dev Con 2008 - Learn to Extend Visual Studio

    • 1 Comments

    Are you looking to extend Visual Studio?  (or do you already and want more help?  more Microsoft contacts?)

    You really need to check out VSX Dev Con.

    September 15 and 16 on the Microsoft campus.  See http://msdn.com/vsx/conference/ for more information.

  • Cloudy in Seattle

    Thank you for your feedback on Silverlight Control Licensing

    • 2 Comments

    Recently I posted a set of questions around Silverlight Control licensing.  I was really happy with the number of responses I got.  Thank you all for replying.

    To give you an idea of what the responses were like:

    • 15/17 Control Vendors license their controls
      • 15/15 write their own LicenseProvider or roll their own system (no surprise here but we wanted to validate that the LicFileLicenseProvider was not being used)
      • 13/15 perform design time validation
      • 13/15 perform runtime validation
    • 10/17 indicated that not having licensing support in Silverlight would affect their level of investment in Silverlight controls.

    Thanks again for the responses, the data really helps us to make better decisions.

  • Cloudy in Seattle

    Silverlight Licensing for 3rd Party Controls

    • 3 Comments

    For any of you building 3rd party controls for Silverlight - I really need your feedback!

    We are looking at what the requirements for licensing should be - and how important it is to you that we provide licensing support (i.e. LicenseProvider) for Silverlight controls.

    1) Do you use the LicFileLicenseProvider or do you roll your own LicenseProvider?

    2) Do you use both design time validation and runtime validation or just one?  If one, which one?

    3) Would the lack of licensing support affect your investment in building Silverlight controls?

    Please email me at "jnak" at microsoft and include your company name.

    Thank you so much.

  • Cloudy in Seattle

    XAML Object Mapping and WPF XAML Vocabulary Specifications

    • 0 Comments

    XAML specification published, added to Microsoft's Open Specification Promise

    Specs that are shared include the following:

    • Xaml Object Mapping Specification 2006

    • WPF Xaml Vocabulary Specification 2006

    If you are looking to write a file format import/export tool, these will definitely be a great help to you.

  • Cloudy in Seattle

    How to: Sub-property editing in the Visual Studio 2008 WPF Designer Property Browser

    • 5 Comments

    Recently I saw a post on the forums asking about how to get subproperties to show up in the Property Browser for WPF projects.  This inspired me to blog about it as I'm sure other people are also running into this question.

    What do I mean by sub-properties?  Consider the following screen shot of the Property Browser:

    Notice how to the left of the ExpandablePersonObject property there is a square with a '-' in it?  That is the expanded state for sub-properties.  When collapsed, it will show up as a '+'.

    The sub-properties are Age, Name and ShirtColor, which are properties on the ExpandablePersonObject type (the type shares the same name as the property).

    So the question is: how do I create a property for a type that shows my sub-properties?

    The short answer is that the property or the type of the property needs to have the ExpandableObjectConverter TypeConverter applied to it.  That said, in order to fully answer that question, I created a project that illustrates the various scenarios (see attached zip to this post). 

    There are two interesting cases, the first is for DependencyObject derived types and the second is for System.Object derived types.

    In the case of DependencyObjects, you pretty much get the behavior for free because Cider itself adds the ExpandableObjectConverter to DependencyObjects (note: Blend has the same behavior as Cider for all of this; we designed it together).

    So, if I have a type such as this:

        public class PersonDependencyObject : DependencyObject
        {
            public static readonly DependencyProperty AgeProperty = DependencyProperty.Register("Age", typeof(int), typeof(PersonDependencyObject));

            public int Age
            {
                get { return (int)GetValue(AgeProperty); }
                set { SetValue(AgeProperty, value); }
            }

    (. . .)

    and the property on the control that will be selected in the designer is:

            public PersonDependencyObject PersonDependencyObject { get; set; }

    The Age, ShirtColor and Name properties on the PersonDependencyObject type will be shown in the Property Browser.

    If the type of the property is derived from object:

        public class PersonObject
        {
            private string _Name = "";
            private int _Age = 0;
            private Brush _ShirtColor = Brushes.Blue;

            public string Name
            {
                get { return _Name; }
                set { _Name = value; }
            }
    (. . .)

    You have a little more work to do.

    You can either set the TypeConverter on the type itself:

        [TypeConverter(typeof(ExpandableObjectConverter))]
        public class ExpandablePersonObject (. . .)

    or on the property on the control:

            [TypeConverter(typeof(ExpandableObjectConverter))]
            public PersonObject PersonObjectWithExpansion

    If you do either of these, you will get the sub-properties showing up for that property.

    You will notice one other thing here: in order for sub-properties to show up, you need to ensure that the property is editable in XAML.  That is, if you programmatically set an instance on the property or set an instance as the default value, you'll get the + to expand/collapse, but the sub-properties will not show up underneath.

    For example if I were to programmatically set my property to an instance:

            public PersonControl()
            {
                InitializeComponent();
                PersonDependencyObject = new PersonDependencyObject();
            }

    Looking at the Property Browser:

    Notice how you get the expander in its expanded state, but you cannot see or edit the sub-properties.  This is because the property is not set in XAML.  We want to improve this experience in a future release; however, this is what we are stuck with for now.

    For properties that do not have a value, clicking the drop down and selecting a type will create a new instance in the XAML.  For example, if in my attached sample I select the "PersonObject" type in the drop down for the PersonObjectWithExpansion property, I will get the following added to the XAML:

            <local:PersonControl>
                <local:PersonControl.PersonObjectWithExpansion>
                    <local:PersonObject Age="0" Name="" ShirtColor="Blue" />
                </local:PersonControl.PersonObjectWithExpansion>
            </local:PersonControl>

    The sub-properties will be expanded and you can edit them using the property browser.

    You might now ask if there is a way to customize that drop down of types...  the answer is yes!  For this you will have to use the Metadata Store to add the NewItemTypesAttribute.  When you decorate a property with the NewItemTypesAttribute, you specify the list of types that can be instantiated and set as the value of that property -- i.e. the types that show up in the drop down.
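    To make that more concrete, here is a minimal sketch of how such metadata might be registered, assuming a design-time metadata assembly that implements IProvideAttributeTable.  The Metadata class name and the choice of types are hypothetical and not taken from the attached sample:

        using Microsoft.Windows.Design.Metadata;
        using Microsoft.Windows.Design.PropertyEditing;

        // Hypothetical design-time metadata class, typically registered from the
        // design assembly with [assembly: ProvideMetadata(typeof(Metadata))].
        internal class Metadata : IProvideAttributeTable
        {
            public AttributeTable AttributeTable
            {
                get
                {
                    AttributeTableBuilder builder = new AttributeTableBuilder();

                    // Limit the drop down for PersonObjectWithExpansion to PersonObject.
                    builder.AddCustomAttributes(
                        typeof(PersonControl),
                        "PersonObjectWithExpansion",
                        new NewItemTypesAttribute(typeof(PersonObject)));

                    return builder.CreateTable();
                }
            }
        }

    With metadata like this in place, only the types listed in the NewItemTypesAttribute appear in the drop down for that property.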

  • Cloudy in Seattle

    Cider (WPF and Silverlight Designer and Tools Team) is Hiring!

    • 1 Comments

    Want to work on the designer and tools for WPF and Silverlight?  Here's your opportunity, and it's an amazing one.

    We're really excited about all the cool work we have coming in the next Visual Studio release, code-named Dev10, and the division is investing in Cider to ensure that we have the team to make it all happen.

    Microsoft is a really unique and special place to work.  If you are passionate about software, next-gen technologies and the developer experience, click on the links below.

    Also take a look at our jobs page on WindowsClient.net.

  • Cloudy in Seattle

    Professional Developer and Designer tools for students at no charge

    • 1 Comments

    Check it out: https://downloads.channel8.msdn.com/

    DreamSpark is simple, it's all about giving students Microsoft professional-level developer and design tools at no charge so you can chase your dreams and create the next big breakthrough in technology - or just get a head start on your career.

  • Cloudy in Seattle

    Divelements SandRibbon for WPF

    • 1 Comments

    Following up on the great design time experience that they developed for their SandDock for WPF, divelements has just released their SandRibbon control that includes an impressive array of design time features.

    The SandRibbon is a control suite that allows you to add an Office 2007 Ribbon UI to your application.

    The good folks at divelements have made use of the following extensibility features:

    • Context Menu Actions
    • DefaultInitializer
    • DesignModeValueProvider
    • Adorners
    • Policy
    • PlacementAdapter
    • ParentAdapter

    It's really great to see the power of Cider's extensibility in action.
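
    To give a flavor of one of these features, here is a minimal sketch of a DefaultInitializer, which lets a control set sensible property values when it is dropped onto the design surface.  The class name and property values below are hypothetical and are not Divelements' actual code:

        using Microsoft.Windows.Design.Model;

        // Hypothetical initializer; the designer runs it when the associated
        // control is dropped onto the design surface from the Toolbox.
        internal class RibbonDefaultInitializer : DefaultInitializer
        {
            public override void InitializeDefaults(ModelItem item)
            {
                // Give the freshly dropped control a reasonable starting height.
                item.Properties["Height"].SetValue(120d);
            }
        }

    The initializer is associated with the control type through the Metadata Store (a FeatureAttribute in the control's design-time attribute table).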

  • Cloudy in Seattle

    The WPF Designer for Windows Forms Developers

    • 2 Comments

    One of the things we've been working on as a team is a guide to help Windows Forms developers ramp up on the WPF Designer in Visual Studio 2008.  Thanks to the great writing talents of Jim and the whole Cider PM team, this white paper is now available for all to enjoy.

    http://msdn2.microsoft.com/en-us/library/cc165605.aspx

  • Cloudy in Seattle

    Actipro WPF Studio - Updated with Improved Design Time Support

    • 0 Comments

    On 1/29/2008, Actipro released a new build of their WPF Studio.  You'll recall that I blogged about their use of extensibility to improve the design time interaction of their controls.  This new release ups the ante with drag-and-drop layout, a really cool image picker, and a host of other improvements across their task panes.

    Check out: http://www.actiprosoftware.com/Products/DotNet/WPF/Ribbon/DesignerSupport.aspx

    Here's a shot of the image picker:

    For a hint on how they did this, here is a tip from resident Cider Architect Brian Pepin:

    There is a service called ExternalResourceService that is defined in Microsoft.Windows.Design.Markup.  It allows you to enumerate the images (and other resources) in the project.  When you find an image you want, it is returned as a type called BinaryResource.  BinaryResource can be used to obtain a stream and it can also be used to obtain a special kind of URI called a "stream" URI.  If you pass this URI to a WPF API that uses URIs, or if you pass it to WebRequest.Create, you can get back a stream as well. 

    This is how we handle all resources in the designer and it allows you to ignore the details of the URI.
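
    As a rough illustration of that last point, here is a minimal sketch that turns a stream URI into an image.  The streamUri variable is hypothetical and stands in for the URI obtained from a BinaryResource:

        using System.IO;
        using System.Net;
        using System.Windows.Media.Imaging;

        // streamUri is assumed to hold the "stream" URI obtained from a BinaryResource.
        using (Stream stream = WebRequest.Create(streamUri).GetResponse().GetResponseStream())
        {
            // BitmapCacheOption.OnLoad reads the whole image up front,
            // so the stream can safely be closed afterwards.
            BitmapFrame image = BitmapFrame.Create(
                stream, BitmapCreateOptions.None, BitmapCacheOption.OnLoad);
        }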

    I also updated the screen shots on the previous post referenced above.
