Architecture + Strategy

Musings from David Chou - Architect, Microsoft

  • Architecture + Strategy

    Silverlight and Photosynth at Stargate Universe



    MGM and Microsoft partnered to build an application intended to give old and new fans of Stargate a first look at the main location of the new Stargate Universe show, an ancient starship called the Destiny. The sets for the Destiny are amongst the most sophisticated ever built for a TV show, accurately capturing the spirit of a spaceship that has been abandoned for millennia. 

    MGM wanted to give fans an accurate experience of what it would be like to walk around and explore the Destiny, and, using Microsoft Silverlight and Photosynth, was able to deliver an immersive 3-D experience.

    UPDATE 2009.09.04 – a Microsoft Case Study and video have been published for this project.


    Over the space of two days, several hundred photographs were taken of some of the locations on the Destiny starship. These include the fan-favorite Gate Room, from where the crew embarks on their adventures through the Stargate, as well as locations such as the ship’s control center, observation deck, and main shuttle.

    These photographs were used to create a number of 3D scenes using Microsoft Photosynth technology.  This technology analyzes pictures and creates a virtual 3D space using the different viewpoints from different pictures. Because the pictures were shot in high resolution, fans can zoom in and see the sets in great detail.


    See the Stargate Universe Photosynth set online.

    Application Scenario

    Photosynth requires no programming effort. All we need to do is take pictures (lots of pictures) and upload them; Photosynth does the rest, interpreting textures in each photo, correlating the photos by matching similar textures, and figuring out spatially where each photo was taken. Finally, all of the photos in a set are stitched into a 3D environment for visitors to navigate through.

    Compare Photosynth with traditional approaches to delivering 360-degree views of items, rooms, and so on, which require carefully planned photo shoots and then writing or obtaining specialized software to render the results and let viewers interact with them. The time and effort required to create a “synth” can literally be minutes. Then, with just a few HTML-level changes to embed the Silverlight viewer for Photosynth into a web page, anyone visiting that page can interact with the synth.
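    Embedding the viewer is, as noted, an HTML-level change. As a rough sketch (the hosted viewer URL and its `cid` query parameter are assumptions for illustration, not documented values), a page script could generate the embed markup like this:

```javascript
// Illustrative sketch only: the exact embed markup comes from the
// Photosynth site's own "embed" option; the URL and "cid" parameter
// below are assumptions, not a documented API.
function buildSynthEmbed(collectionId, width, height) {
  // The viewer loads in an <iframe> keyed by the synth's collection ID,
  // so embedding requires only this one HTML-level change to the page.
  return [
    '<iframe frameborder="0" scrolling="no"',
    ' width="' + width + '" height="' + height + '"',
    ' src="https://photosynth.net/embed.aspx?cid=' +
      encodeURIComponent(collectionId) + '">',
    '</iframe>'
  ].join('');
}
```

    The page would then insert this string into a placeholder element (for example via `innerHTML`).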

    A screen shot of the gallery view is shown below.


    Kind of like a photo gallery on steroids. However, Photosynth is not just a slideshow: it allows viewers to traverse the set of pictures in a non-linear manner and to see the spatial perspectives between pictures (where each photo was taken relative to the others).

    At the same time, specific considerations apply when shooting for Photosynth: basically, cover lots of angles with lots of overlap for overview shots, then take lots of close-ups for detailed shots. Here’s a short tutorial on how to shoot for Photosynth.

    For the Stargate Universe synth, extra attention was paid to the physical architecture of the Destiny set, and the series of shots was taken so that the resulting synth would give viewers the experience of walking through the set.

    Thus Photosynth can be leveraged in many scenarios, especially when a project has physical content that can be shared online: product showcases, exterior environments, interior walk-throughs, and so on. It is another simple and elegant way to enhance user acquisition and retention on websites.


  • Architecture + Strategy

    Silverlight 3 Beta


    You may have already heard about the many announcements made at Microsoft’s MIX09 conference this week in Las Vegas. One of the major announcements is the Beta availability of Silverlight 3 (and it does look like three’s the charm).

    Customer adoption to date

    Scott Guthrie during his keynote at MIX09 shared some encouraging statistics:

    • Silverlight launched 18 months ago, shipped Silverlight 2 six months ago
    • 350+ million installations globally
    • 300,000+ designers and developers
    • 200+ partners in 30 countries contributing to the ecosystem
    • 10,000+ applications globally
    • Microsoft has 200+ of its own websites built using Silverlight

    A few major customer adoption announcements were also made:

    • Netflix standardizing on Silverlight to deliver its online on-demand video instant watch service across PCs, Macs, and devices; leveraging PlayReady DRM and smooth/adaptive streaming capabilities
    • 2010 Winter Olympics at Vancouver – NBC announced that the event will again be delivered online using Silverlight, similar to the 2008 Beijing Olympics event
    • NCAA “March Madness” Men’s Basketball Championship – CBS Sports’ online coverage of all tournament games, delivered using Silverlight
    • 2008 Presidential Inauguration – the Presidential Inaugural Committee selected Silverlight to enable online video streaming of President Obama’s official swearing-in and Whistle Stop Tour events
    • Bondi Publishing (who owns the Playboy Archives) – working to deliver its set of magazines, including The New Yorker, Rolling Stone, and Playboy, online using Silverlight. The Playboy Archives are now live with search, navigation, and DeepZoom (best visited at home)
    • KEXP – a Seattle-based radio station showed off an out-of-browser version (that works when off-line) of their content browser and player
    • Kelley Blue Book – Perfect Car Finder application built using Silverlight
    • SAP – working to deliver Silverlight controls to be used in NetWeaver and Web Dynpro
    • Microsoft Worldwide Telescope – now Silverlight-enabled
    • Microsoft Virtual Earth – Silverlight Map Control, now available as a CTP, for any website to use for enhanced visualization of geo-location and mapping capabilities
    • Other mentions include Yahoo! Japan, CareerBuilder, Tencent QQ, BSkyB, ITV, Intuit, etc.

    Interesting Silverlight applications to see

    Some of the most compelling Silverlight applications I have seen (many are registered on Silverlight Showcase) are listed below.

    General Info:

    Media sites/demos:

    Rich application sites/demos:

    Casual Games:

    Reusable Controls Libraries (for enterprise applications):

    Now these are just some of my favorites. But with the pace of developers building cool Silverlight applications, this list may need to be updated very frequently (last updated 3/20/09).

    What’s new in Silverlight 3?

    Fully supported by Visual Studio and Expression Blend, highlights of the new features and functionality in Silverlight 3 include: major media enhancements; out-of-browser support, allowing Web applications to work on the desktop; significant graphics improvements, including 3D graphics support, GPU acceleration, and H.264 video support; and many features to improve RIA development productivity.

    Enhanced media support

    • Live and on-demand true HD (720p+) Smooth Streaming. IIS Media Services (formerly IIS Media Pack), an integrated HTTP media delivery platform, features Smooth Streaming which dynamically detects and seamlessly switches, in real time, the video quality of a media file delivered to Silverlight based on local bandwidth and CPU conditions.
    • Hardware accelerated HD playback. Leveraging graphics processor unit (GPU) hardware acceleration, Silverlight experiences can now be delivered in true full-screen HD (720p, 1080p, etc.).
    • Extensible media format support. In addition to VC-1/WMA, Silverlight 3 now supports MPEG-4-based H.264/AAC Audio. Also with the new extensible Raw AV pipeline, audio and video can be decoded outside the runtime and rendered in Silverlight, extending format support beyond the native codecs.
    • Digital rights management. Full, built-in support for DRM, powered by PlayReady Content Protection, enables protected in-browser experiences using AES encryption or Windows Media DRM.
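    Smooth Streaming’s actual switching heuristics live inside the IIS/Silverlight stack and are not public, but the core idea (before each fragment request, pick the highest encoded bitrate the measured bandwidth can sustain) can be sketched as follows; the safety factor and function names are illustrative assumptions:

```javascript
// Sketch of the core idea behind adaptive ("smooth") streaming: pick
// the highest available encoded bitrate that fits within a fraction of
// the measured bandwidth. The real client heuristics also weigh CPU
// load and buffer fullness.
function pickBitrate(availableBitrates, measuredBandwidthBps, safetyFactor = 0.8) {
  const budget = measuredBandwidthBps * safetyFactor;
  const sorted = [...availableBitrates].sort((a, b) => a - b);
  let choice = sorted[0]; // never drop below the lowest encoding
  for (const rate of sorted) {
    if (rate <= budget) choice = rate; // highest rate within budget wins
  }
  return choice;
}
```

    Because the selection happens per fragment, the stream can step up or down in quality seamlessly as conditions change, which is exactly the behavior the feature description above promises.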

    Enhanced graphics support

    • Perspective 3D Graphics. Content can now be applied to a 3D plane without writing any code. Live content can be rotated or scaled in space.
    • Pixel Shader effects. Image manipulation effects such as blur and drop shadow, using software-based rendering. Custom effects can also be created, and applied to any graphical content.
    • Bitmap Caching. Vector content, text, and controls can now be cached as bitmaps. This improves the rendering performance of applications and is useful for background content and for content which needs to scale without making changes to its internal appearance.
    • New Bitmap API. Support for writing pixels to bitmaps directly.
    • Themed application support. Runtime support for application theme updates driven by templates, and cascading style sheets.
    • Animation Effects. New effects such as spring and bounce. Developers can also now define their own mathematical functions to describe an animation.
    • Enhanced control skinning. Simplified skinning capabilities by keeping a common set of controls external from an application. This allows the sharing of styles and control skins between different applications.
    • Improved text rendering & font support. Enhanced rendering and rapid animation of text. Applications also load faster by enabling the use of local fonts.

    Enhanced rich internet application (RIA) support

    • Out-of-Browser capabilities
      • Run outside of browser. Light-weight, sandboxed companion experiences for the Web that run on the desktop. Enabled without any additional download of runtime or the need to write applications in a different way.
      • Consumer friendly non-administrator install. Applications are hosted in a cache and do not require any privileges to run.
      • Safer, more secure, sandboxed. An application can be trusted without security warnings, and all assets are stored in isolated storage.
      • Built in auto-update. An application will check for new versions on the server and update on launch.
      • Connectivity detection (on-line/off-line). Can detect a loss of connection (or react to event notifications), then choose to cache data locally until a network connection is restored.
      • Desktop integration. On Windows and Macs, applications can be saved as shortcuts on the desktop, one click away from your customer. On Windows 7, support will be provided for superbar integration, multi-touch, and location-awareness services such as GPS, so that your application can react to the user’s location.
    • Deep Linking. Support for deep linking, which enables bookmarking a page within a RIA.
    • Search Engine Optimization (SEO). By utilizing business objects on the server, together with ASP.NET controls and site maps, users can automatically mirror database-driven RIA content into HTML that is easily indexed by the leading search engines.
    • 60+ default controls with source code. Over 60 fully skinnable and customizable out-of-the-box controls such as charting and media, new layout containers such as dock and viewbox, and controls such as autocomplete, treeview and datagrid. The controls come with nine professionally designed themes, and the source code can be modified/recompiled or utilized as-is. Other additions include multiple selection in listbox controls, file save dialog, and support for multiple page applications with navigation.
    • Enhanced data support
      • Element-to-element binding. Enables property binding to CLR objects and other UI components via XAML, for instance binding a slider value to the volume control of a media player.
      • Data forms. Provides support for layout of fields, validation, updating and paging through data.
      • Data validation. Automatically catch incorrect input and warn the user with built-in validation controls.
      • Support for business objects on both client and server, with n-tier data support. Easily load, sort, filter, and page data. Includes a new built-in CollectionView to perform a set of complex operations against server-side data; a new set of .NET RIA Services supports these features on the server.
    • Improved performance
      • Application library caching. Reduces the size of applications by caching framework libraries on the client, which helps improve load times.
      • Enhanced DeepZoom. Allows users to fluidly navigate through larger image collections by zooming.
      • Binary XML. Allows communication with the server to be compressed, greatly increasing the speed at which data can be exchanged.
      • Local Connection. This feature allows communication between multiple Silverlight applications on the client-side without incurring a server roundtrip.
    • Enhanced Accessibility Features. Provides access to all system colors, allowing partially sighted users to apply changes such as high-contrast color schemes for ease of readability, reusing familiar operating system settings.
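    Among the data features above, element-to-element binding is easiest to picture as a plain observer pattern. Silverlight expresses it declaratively in XAML (for example, a slider’s Value bound to a media player’s Volume); this hand-rolled JavaScript sketch, with hypothetical names, shows the equivalent wiring:

```javascript
// Hand-rolled analog of element-to-element binding: a bindable value
// notifies every bound target property whenever it changes. Silverlight
// generates this wiring from XAML; the names here are illustrative.
function makeBindable(initial) {
  const listeners = [];
  let value = initial;
  return {
    get: () => value,
    set(v) {
      value = v;
      listeners.forEach(fn => fn(v)); // push the new value to all targets
    },
    bindTo(target, prop) {
      listeners.push(v => { target[prop] = v; });
      target[prop] = value; // sync the target immediately on bind
    }
  };
}
```

    For instance, binding a slider value to a player’s volume means every subsequent `set` on the slider updates `player.volume` automatically, with no glue code at the call site.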
  • Architecture + Strategy

    Multi-Enterprise Business Applications (MEBA) as Cloud-Based Next-Generation B2B Business Processes


    Multi-Enterprise Business Applications (MEBA) are a new class of applications that can be used to support business processes that span enterprise and organizational boundaries. MEBAs leverage best practices and patterns from service-oriented architecture (SOA) techniques and technologies, and specifically cloud-based platforms, to facilitate the next-generation B2B (or multi-enterprise) collaboration.

    This is a project I had the privilege to participate in for the past few months, along with my esteemed colleague, Wade Wegner, and under Jack Greenfield's leadership, as part of Microsoft's Platform Architecture Team. This project was highlighted at Microsoft’s Professional Developers Conference (PDC2008) in Los Angeles a few weeks ago, in the RedPrairie keynote demos that showcased Microsoft’s cloud computing platform, and discussed in more depth in one of the breakout sessions (see Wade's blog for more info).

    And more recently, we took a more architectural look at MEBAs at Microsoft's Strategic Architect Forum 2008 (SAF08).



    So what do we mean by “multi-enterprise business applications”? They are a new class of applications, different from the traditional data-driven applications that focus on managing data and resources and providing access to end users. They are more focused on implementing business processes that span enterprises, such as traditional B2B integration and collaboration scenarios. They leverage and build upon message-based integration and use well-defined protocols and roles, such as fundamental approaches and technologies used in enterprise service-oriented architecture initiatives. Actually, in a way, they represent an approach of extending enterprise SOA beyond the four walls of each enterprise, to integrate and work more seamlessly with other enterprises.

    At the same time, multi-enterprise business applications also have some differentiating requirements. Scenarios may include participants distributed throughout different parts of the world. And since they are intended to support mission-critical business activities, we need to have a very robust architecture that can ensure high availability, high reliability, a high-level of security; plus the need to have auditing, reporting, regulatory compliance, and so on; not significantly different from our enterprise IT architecture concerns from that perspective.

    And unlike traditional SOA applications that are more focused on functional capabilities within one enterprise, MEBAs extend the SOA concepts and technologies to business processes that span multiple enterprises. In addition, because MEBAs operate between organizations, their primary concerns are also different – community management, identity management, process execution management, multi-enterprise governance, etc.

    Moreover, unlike incumbents in the B2B software and service provider spaces, where implementations today consist mostly of customizing proprietary products and services, MEBA defines an architecture in which foundational, common services should be implemented and how they can be composited into applications that define business processes. The evolution and maturity of MEBAs also depend more on the community than on any one vendor maintaining a solution. MEBAs are a new class of applications, not a new class of products or solutions.

    Thus MEBAs can be used across many different industries, especially ones that, from a traditional B2B perspective, tend to have a lot of cross-organizational collaboration needs. During this phase of the MEBA project, we took a more detailed look at the supply chain industry, and found many capabilities that today leverage various forms of technologies and implementation approaches to facilitate communication and integration across multiple partners on supply chain networks (in effect, multiple enterprises). For our project, we chose one specific area, supply chain orchestration, to investigate further and to evaluate how MEBAs can meet its specific requirements. At the next lower level of detail, we identified a set of common scenarios in supply chain orchestration, then chose a few, such as search for capacity and product recall, to prototype by building on Microsoft's Azure Services Platform.

    Challenges Today

    So now, taking a few steps back, let’s talk about how some of these scenarios and capabilities are implemented today across the various industries. First of all, we have multiple protocols intended to standardize the interaction models and the data being exchanged. Some are industry-specific, such as RosettaNet, HL7 for healthcare, and FIX for financial services, while others are designed to be more general-purpose, such as ebXML, WS-BPEL (formerly BPEL4WS), WS-Choreography, Java Business Integration (JBI), and many others.

    But this also hints at one of the challenges we have today: in a way, there are simply too many standards. And many organizations are in the process of developing more standards to define how multi-enterprise collaboration should be facilitated in their industries (automotive and auto parts distribution, just to name a couple).

    The observation today is that B2B (integration and collaboration between multiple enterprises, or inter-enterprise SOA) is still relatively complex and difficult to implement and manage.

    For example, we have a very diverse set of technologies, collected from the past 25 years or so in various attempts at optimizing or streamlining communications between multiple organizations. Everything from EDI, FTP, on-premises software, integration service providers, B2B gateways, to the current class of SOA solutions that can be extended to support B2B scenarios.

    Beyond the underlying technologies, enterprises have very sophisticated needs and concerns in areas such as security, data ownership, management, and governance. What’s more, these needs and concerns are, in a way, amplified and made more complex when viewed from a cross-organizational perspective.

    Why MEBA?

    So integration, and enabling business processes across organizational boundaries, has been complex and difficult to accomplish for many years. Why do we think we now have a better chance at simplifying and streamlining efforts in this area, and moreover, why do we think MEBAs, as a new class of applications, may have a better chance at doing so?

    Traditional B2B, or multi-enterprise, communication and collaboration was complex to implement and often unreliable and error-prone: organizations had to deal with a multitude of technologies and standards, plus additional implementations with each partner organization on a one-to-one basis. MEBAs aim to streamline and simplify these inherent complexities by providing an architecture that builds on existing technologies and abstracts away infrastructural concerns.

    From a timing perspective, the growing inter-dependence and always-connected environment for businesses, and availability of key new technologies (Web services, SOA, cloud computing, etc.), are showing signs that the time is right to take a new approach at multi-enterprise (or B2B) interactions. MEBAs build on the best practices and proven technology models of today, and offer the potential to greatly streamline and simplify efforts of implementing business processes that span multiple enterprises.

    In general, MEBAs leverage best practices and build on the latest trends, such as the concept of an “Internet Service Bus” (ISB): an architecture and platform upon which multi-enterprise business applications can be constructed, simplifying and streamlining many aspects of B2B integration today while meeting the needs of the organizations and participants involved. We think this ISB concept will significantly transform the architecture, and how we design and implement multi-enterprise business applications.

    And of course, this doesn’t mean we intend to replace the work organizations have already done with protocols and standards (such as RosettaNet, ebXML, HL7, FIX, etc.). In fact, we think we can leverage that work and use MEBAs to provide implementations of these protocols, as well as an environment and platform to facilitate their orchestration and management.

    Thus MEBAs provide the potential of these high-level benefits:

    Business benefits – improve business agility, bottom-line revenue (reduced errors and cost, and faster response), and top-line revenue (higher automation, improved relationships with partners and customers)

    Technical benefits – simplify and streamline connectivity, improve visibility and governance, higher quality of service, focus on delivering business processes instead of infrastructure, leverage standards-based technologies, bridge multiple and disparate identity systems, etc.

    MEBA Reference Architecture

    Now let us show you what a reference architecture for multi-enterprise business applications may look like. Just as you would expect, there are many layers of capabilities from this perspective.

    At the bottom, these applications need a foundational services layer, which provides core infrastructure capabilities such as a compute or runtime environment, an identity management system, workflow execution and management, a robust messaging infrastructure, a data management solution, and operational management services.

    On top of that, we can build a layer of higher-level services, which are considered more functional and provide capabilities such as community management (extending from trading-party management in a traditional B2B perspective), services orchestration, business process management, party management, and so on.

    Finally, we can build various types of communities, or partner networks, that define working relationships between multiple organizations. For example, each organization can belong to multiple communities, and administrators of a particular community can invite and add new partners. We would also like to use model-driven techniques to define business processes, and to be able to provision one or more instances of a given business process, each supporting a specific community of trading partners.
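    The layering described above can be sketched in code. Everything here is a hypothetical illustration (the service names, the Contoso/Fabrikam organizations, and the product-recall process model are stand-ins, not a Microsoft API): foundational services at the bottom, functional services composed on top, and a community provisioned with a business-process instance.

```javascript
// Hypothetical sketch of the MEBA layering; all names are illustrative.
const foundational = {
  identity: (org) => ({ org, verified: true }), // stand-in identity service
  messaging: [],                                // stand-in message bus
};

const functional = {
  // Community management builds on the foundational identity service.
  createCommunity(name, adminOrg) {
    return { name, members: [foundational.identity(adminOrg)], processes: [] };
  },
  // Administrators can invite new partners into the community.
  invite(community, org) {
    community.members.push(foundational.identity(org));
    return community;
  },
  // Model-driven provisioning: one process instance per community.
  provisionProcess(community, processModel) {
    community.processes.push({ model: processModel, state: 'provisioned' });
    return community;
  },
};
```

    The point of the sketch is the dependency direction: communities are defined purely in terms of functional services, which in turn delegate to foundational ones, mirroring the three layers above.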

    Azure Services Platform

    When we saw the plans for the Azure Services Platform, we studied them to understand the kinds of applications it would support. Our conclusion was that it will spawn a second generation of B2B applications. We then coined the name MEBA to describe this new class of applications. Our focus here is to talk about MEBAs and how they can leverage this Azure Services Platform. So we will not get into the details of the Azure Services Platform, but we do want to articulate specifically how some of its components help support the MEBA concept.

    Windows Azure – the cloud compute platform, provides the underlying runtime environment for the foundational services and MEBA implementation. It offers high scalability and reliability, and global reach to partners across the world

    .NET Services – with its Service Bus, Access Control Services, and Workflow Services, provides the fundamental “Internet Service Bus” that can effectively address common concerns around identity and security, connectivity, and service orchestrations between multiple organizations

    SQL Services – provides the scalable and reliable cloud-based database solution, which helps to establish databases in the cloud to manage the data that is shared across the multiple organizations, and especially data that support the communities and their interactions

    Live Services – provides capabilities to connect to end users and support many aspects of human interaction needs

    Food for Thought

    MEBAs have the potential of completely changing the way enterprises interact with each other. If we further extend the MEBA vision and how it continues to simplify and streamline infrastructure concerns in facilitating business processes across organizations, we may see a world where business networks can be quickly and dynamically constructed to deliver business capabilities, simply by connecting the dots (or building with blocks). Business results delivered by the collective resources and capabilities of a network of multiple enterprises will become true differentiating factors, more significant than any one enterprise can deliver alone. New business models may also emerge, as enterprises increasingly participate in the larger scheme of things, sometimes as part of processes that span industries. The control of processes may begin to shift away from being enterprise-centric to being community-centric. Ultimately, enterprises can create new, more diverse, and more differentiating business offerings by leveraging the community of partners.

    This post is part of a series of articles on cloud computing and related concepts.

  • Architecture + Strategy

    Silverlight and Live Messenger at Photobucket


    Photobucket has collaborated with Microsoft to build Photobucket Visual Search, making use of Silverlight, Windows Live Services, and Photobucket's Open API platform. Photobucket Visual Search uses Silverlight to provide a rich and entertaining search experience, displaying photos and videos in an easy-to-browse interface. The experience surfaces search results as well as related terms, helping users find images they might not have found in the past.

    Recognizing that photos are about sharing a social experience, Photobucket decided to use Microsoft's new Windows Live Messenger Web Toolkit to socially enable this Visual Search experience. Windows Live Messenger has over 320 Million active users, with over 32 billion social relationships between them. Sharing an image with friends on Windows Live is as easy as a click of a button with Photobucket Visual Search.



    Photobucket is a very popular site on the Internet for uploading, sharing, linking and finding photos, videos and graphics. Photobucket is usually used for personal photographic albums, remote storage of avatars displayed on internet forums, and storage of videos. Photobucket's image hosting is often used for eBay, MySpace (now a corporate cousin), Bebo, Neopets and Facebook accounts, LiveJournals or other blogs, and message boards. Users may keep their albums private, allow password-protected guest access, or open them to the public.

    Below are some statistics (circa 2007) regarding the website.

    • 30+ million searches processed / day
    • 25 million unique site visitors/month in the US, and over 46 Million unique site visitors/month worldwide
    • Over 7 billion images uploaded
    • #31 in Top 50 Sites in the US
    • #41 top 100 Global Sites
    • 18th Largest Ad supported site in the US
    • 41.4% share of U.S. visits to photography web sites
    • 56% of users are under 35, and 52% are female

    This project is intended to enhance Photobucket’s user experience while mapping specific objectives to the following core business goals:

    • Leverage rich software to create a fun and engaging experience
    • Social communication features that provide increased web traffic
    • Encourage users to add their own tags and collect metadata in the process
    • Improve user acquisition and retention to the Photobucket website

    Richer visualization and interaction, plus social networking capabilities, were chosen as the means to achieve the goals of improving user acquisition and retention. Consequently, Silverlight and Windows Live Messenger Web Toolkit were chosen as the components from the Microsoft platform that can be leveraged for the Photobucket Visual Search project. Photobucket’s existing open API’s (HTTP/REST-based) are used directly to support the search client application.


    The project architecture consists of Photobucket’s existing server infrastructure, which provides the open REST-based APIs, images, albums, thumbnails, groups (featured, most active, most recent, most contributions, contests, etc.), static content, and so on. The website itself is a user interaction/presentation layer on top of the thousands of servers deployed as part of the content infrastructure. The website manages all the metadata associated with the massive amount of content, such as tags, descriptions, comments, image ownerships and relationships, user memberships, etc. Searches done on the Photobucket website are performed on the metadata and indexes, and the search results point to the actual locations of content and assets across the massively parallel content infrastructure.

    The content infrastructure consists of multiple farms of thousands of servers, each managing different types of content, with user data partitioned horizontally across the servers in each group. The content servers can be accessed directly using sub-domains, such as one for each image cluster and one for each thumbnail cluster.

    Application requests are managed by the website, and the open APIs are managed via a separate set of servers. Search queries retrieve an XML response representing the search hits, with associated metadata and actual image locations potentially pointing to hundreds of different servers where the content physically resides.

    The Silverlight search client implementation is designed to fully leverage the search service exposed through Photobucket’s open APIs. It captures the search query from the user, sends it for processing, interprets the returned XML, renders the results, and downloads individual thumbnails from their actual locations in the thumbnail server farm.
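    The flow above reduces to: send the query, parse the XML response, and collect the thumbnail locations to download. The element name in this sketch (`<thumb>`) is an assumption for illustration; Photobucket’s actual response schema is not reproduced here.

```javascript
// Sketch of the client-side parsing step: pull thumbnail locations out
// of the XML search response. The <thumb> element name is a hypothetical
// stand-in for whatever the real schema uses.
function extractThumbUrls(xml) {
  const urls = [];
  const re = /<thumb>([^<]+)<\/thumb>/g;
  let m;
  while ((m = re.exec(xml)) !== null) {
    urls.push(m[1]); // each URL may point at a different thumbnail server
  }
  return urls;
}
```

    The client would then issue one download per returned URL, hitting the thumbnail farm directly rather than routing the images through the website.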

    The Windows Live Messenger Web Toolkit implementation is mostly client-side. Photobucket hosts some of the files such as static images for branding the messenger interface, but most of the files are retrieved directly from the messenger service in the Live Services platform. Integration with the Silverlight Visual Search application is done through the JavaScript bridge.

    A diagram representing the logical architecture is shown below.


    The overall end-to-end architecture consists of these components:

    • Client – Silverlight 2, Windows Live Messenger Web Toolkit, JavaScript, CSS, HTML, etc.
    • Server (Photobucket) – Linux, Apache 2.2.4 (EL4), PHP 5.2.6, MySQL
    • Server (Live Services) – IIS 6, ASP.NET 2.0.50727
    • Tools – Visual Studio 2008, Expression Studio

    Silverlight Visual Search Development

    The default target dimensions for the application are 1024 × 768. However, the application supports browser resizing and will adjust accordingly when the user changes the browser window size dynamically.

    The application has two primary states:

    • Main View (Search & Explore) – This is the state the user will be in initially and for a majority of the search and explore functionality. The main view displays thumbnails of the image results. It also allows users to modify their searches, or explore further using alternative searches.
    • Zoom View – The secondary state of the application brings the user into a view where the focus is on the viewing aspect of the large image and the image options, including sharing with Live Services.

    The project team went through many data visualization designs for the search results. In the end, a simple one that mirrored the model used on the rest of the Photobucket site was chosen.

    Windows Live Messenger Web Toolkit Integration

    While in Zoom View, a user can share the photo via IM using Live Messenger. Doing so will prompt the user to sign in to, or register for, Live Messenger. If the user is already signed in, this will bring up the user’s Windows Live contact list.

    Delegated Authentication

    Signing in links a user’s Windows Live ID with the website, granting an authentication consent (stored as a site-wide consent token) to Photobucket, so that web pages generated by Photobucket can reuse the Live Messenger session without requiring the user to sign in on each page.
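The consent-reuse flow can be sketched generically: the first sign-in stores the consent token, and every later page checks for it before prompting. The storage mechanism and token fields below are invented for illustration and do not reproduce the actual Windows Live Delegated Authentication protocol.

```javascript
// Stand-in for server-side, site-wide per-user consent storage.
const consentStore = new Map();

function messengerSession(userId, promptForConsent) {
  let token = consentStore.get(userId);
  if (!token) {
    token = promptForConsent();      // one-time sign-in / consent UI
    consentStore.set(userId, token); // site-wide stored consent token
  }
  return { userId, token };          // reused on every subsequent page
}
```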

    The project team decided to leverage Live Messenger’s Delegated Authentication method because it is the least obtrusive with respect to Photobucket’s existing membership system, and because it is simple to implement; an identity federation approach would have been significantly more complex.

    Messenger Web Bar

    The Messenger Web Bar is a single UI Control that contains a full Windows Live Messenger experience. The Messenger Web Bar has the following functionality:

    • Contact list—The contact list enables the user to manage contacts and interact with them.
    • Conversation list—The conversation list contains all active conversations.
    • User area—The user area shows the user's presence and enables the user to update this information.
    • Cross-page navigation—Cross-page navigation enables the user to stay signed in to Live Messenger while navigating from page to page within your application Web site.

    This functionality is encapsulated within a small bar at the bottom of the page. Because the Live Messenger Web Bar works across pages, conversations that start on a page on the application Web site can continue on another page of the application Web site. This functionality enables an application Web site to use the Live Messenger Web Bar as a platform to deeply integrate Live Messenger functionality and data into the site. Making an application Web site more social with the Messenger Web Bar and UI Controls can significantly increase user engagement.

    Shown below is a picture of the Messenger Web Bar used as part of the Visual Search experience.


    Once the user is logged into the Windows Live Messenger service, the user can view and interact with the list of contacts managed in the Windows Live service. The user can also see presence information, i.e., which contacts are online or offline at the moment. An IM conversation can then be initiated just by selecting a contact. If the user initiates an IM conversation while in Zoom View, Visual Search will automatically insert a link that says “Check this photo” with a URL to the actual image.
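Seeding the conversation with the photo link is a small piece of glue code; a minimal sketch, assuming the message text from the article and a hypothetical helper name:

```javascript
// Compose the IM text Visual Search inserts when sharing from Zoom View.
function shareMessage(photoUrl) {
  return "Check this photo " + photoUrl;
}
```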

    Application Scenario

    This project demonstrated how Silverlight can be leveraged to add value to a website built entirely on the LAMP stack. It also showed how Live Services can be leveraged, without any significant custom development effort, to add social computing capabilities to any website. The combination of rich clients and the composition of multiple cloud-based services on the Web represents a Software Plus Services implementation approach, which enhances existing Web browsing models and improves user acquisition and retention.


    Cloud-optimized architecture and Advanced Telemetry


    One of the projects I had the privilege of working with this past year is the Windows Azure platform implementation at Advanced Telemetry. Advanced Telemetry offers an extensible, remote, energy-monitoring-and-control software framework suitable for a number of use case scenarios. One of their current product offerings is EcoView™, a smart energy and resource management system for both residential and small commercial applications. Cloud-based and entirely Web accessible, EcoView enables customers to view, manage, and reduce their resource consumption (and thus their utility bills and carbon footprint), all in real time via the intelligent on-site control panel and remotely via the Internet.


    Much more than Internet-enabled thermostats and device end-points, “a tremendous amount of work has gone into the core platform, internally known as the TAF (Telemetry Application Framework) over the past 7 years” (as Tom Naylor, CEO/CTO of Advanced Telemetry, wrote on his blog). The TAF makes up the server-side middleware implementation, provides the intelligence behind the network of control panels (with EcoView being one of the applications), and offers an interesting potential third-party application model.

    The focus of the Windows Azure platform implementation was moving the previously hosted server-based architecture into the cloud. Advanced Telemetry completed the migration in 2010, and the Telemetry Application Framework is now running on the Windows Azure platform. Tom shared some insight from the experience in his blog post “Launching Into the Cloud”. And of course, this effort was also highlighted as a Microsoft case study on multiple occasions:


    The Move to the Cloud

    As pointed out by the first case study, the initial motivation to adopt cloud computing was driven by the need to reduce operational costs of maintaining an IT infrastructure, while being able to scale the business forward.

    “We see the Windows Azure platform as an alternative to both managing and supporting collocated servers and having support personnel on our side dedicated to making sure the system is always up and the application is always running,” says Tom Naylor. “Windows Azure solves all those things for us effectively with the redundancy and fault tolerance we need. Because cost is based on usage, we’ll also be able to much more accurately assess our service fees. For the first time, we’ll be able to tell exactly how much it costs to service a particular site.”

    For instance, in the Channel 9 video, Tom mentioned that replicating the co-located architecture from Rackspace to the Windows Azure platform resulted in approximately 75% cost reduction on a monthly basis, in addition to other benefits. One of the major ‘other’ benefits is agility, which is arguably much more valuable than the cost reduction normally associated with cloud computing. In fact, as the second case study pointed out, in addition to breaking ties to an IT infrastructure, the Windows Azure platform became a change enabler that supported the shift to a completely different business model for Advanced Telemetry (from a direct-market approach to an original equipment manufacturer (OEM) model). The move to the Windows Azure platform provided the much-needed scalability (of the technical infrastructure), flexibility (to adapt to additional vertical market scenarios), and manageability (maintaining the level of administrative effort while growing the business operations). The general benefits cited in the case study were:

    • Opens New Markets with OEM Business Model
    • Reduces Operational Costs
    • Gains New Revenue Stream
    • Improves Customer Service

    Cloud-Optimized Architecture

    However, this is not just another simple story of migrating software from one data center to another. Tom Naylor understood the principles of cloud computing well, and saw the value in optimizing the implementation for the cloud platform instead of just using it as a hosting environment for the same thing from somewhere else. I discussed this in more detail in a previous post, Designing for Cloud-Optimized Architecture. Basically, it is about leveraging cloud computing as a way of computing and as a new development paradigm. Sure, conventional hosting scenarios do work in cloud computing, but there is more value and benefit to gain if an application is designed and optimized specifically to operate in the cloud, and built using unique features of the underlying cloud platform.

    In addition to the design principles around the “small pieces, loosely coupled” fundamental concept I discussed previously, another aspect of the cloud-optimized approach is to think about storage first, as opposed to compute. This is because, in cloud platforms like Windows Azure, we can build applications using cloud-based storage services such as Windows Azure Blob Storage and Windows Azure Table Storage, which are horizontally scalable distributed storage systems that can store petabytes of data and content without requiring us to implement and manage the infrastructure. This is, in fact, one of the significant differences between cloud platforms and traditional outsourced hosting providers.
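In Table Storage, every entity is addressed by a PartitionKey (the unit of horizontal scale and load distribution) and a RowKey (unique within its partition). A minimal sketch of a telemetry entity under that model; the field names other than PartitionKey/RowKey are invented, not Advanced Telemetry's actual schema:

```javascript
// Shape of a hypothetical Table Storage entity for a point reading.
function telemetryEntity(siteId, pointId, timestamp, value) {
  return {
    PartitionKey: siteId,              // all of a site's readings live together
    RowKey: timestamp + "_" + pointId, // unique within the partition
    Value: value
  };
}

const entity = telemetryEntity("site-042", "hvac-temp", "20100615T1200", 21.5);
```

Choosing the partition key up front matters: queries within one partition are cheap and ordered, while the service is free to spread partitions across many storage nodes.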

    In the Channel 9 video interview, Tom Naylor said “what really drove us to it, honestly, was storage”. He mentioned that the Telemetry Application Platform currently handles about 200,000 messages per hour, each containing up to 10 individual point updates (which roughly equates to 500 updates per second). While this traffic volume isn’t comparable to the top websites in the world, it still poses significant issues for a startup to store and access the data effectively. In fact, the data volume required the Advanced Telemetry team to cull the data periodically in order to keep the operational data at a relatively workable size.
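The arithmetic behind those figures is straightforward:

```javascript
// 200,000 messages/hour, each carrying up to 10 point updates.
const messagesPerHour = 200000;
const updatesPerMessage = 10;
const updatesPerSecond = (messagesPerHour * updatesPerMessage) / 3600;
// ~555 updates/second at peak, i.e. "roughly 500 updates per second"
```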

    “We simply broke down the functional components, interfaces and services and began replicating them while taking full advantage of the new technologies available in Azure such as table storage, BLOB storage, queues, service bus and worker roles. This turned out to be a very liberating experience and although we had already identified the basic design and architecture as part of the previous migration plan, we ended up making some key changes once unencumbered from the constraints inherent in the transitional strategy. The net result is that in approximately 6 weeks, with only 2 team members dedicated to it (yours truly included), we ended up fully replicating our existing system as a 100% Azure application. We were still able to reuse a large percentage of our existing code base and ended up keeping many of the database-driven functions encapsulated in stored procedures and triggers by leveraging SQL Azure.” Tom Naylor described the approach on his blog.

    The application architecture employed many cloud-optimized designs, such as:

    • Hybrid relational and NoSQL data storage – SQL Azure for data that is inherently relational, and Windows Azure Table Storage for historical data, events, etc.
    • Event-driven design – Web roles receiving messages act as event capture layer, but asynchronously off-loads processing to Worker roles
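The event-driven design in the second bullet can be sketched as a producer/consumer pair: the web role only captures the event and enqueues it, and a worker role drains the queue at its own pace. The in-memory array below stands in for a Windows Azure queue or Service Bus; names are illustrative.

```javascript
// Stand-in for a durable cloud queue.
const queue = [];

// Web role: capture the incoming message and return immediately.
function captureEvent(evt) {
  queue.push(evt);
}

// Worker role: process asynchronously, scaled independently of the web role.
function drainQueue(handler) {
  let processed = 0;
  while (queue.length > 0) {
    handler(queue.shift());
    processed++;
  }
  return processed;
}
```

Because the two roles only share the queue, each layer can be scaled up or down on its own.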

    Lessons Learned

    In the real world, things rarely go completely as anticipated/planned. And that was the case for this real-world implementation as well. :) Tom Naylor was very candid about some of the challenges he encountered:

    • Early adopter challenges and learning new technologies – Windows Azure Table and Blob Storage, and Windows Azure AppFabric Service Bus are new technologies and have very different constructs and interaction methods
    • “The way you insert and access the data is fairly unique compared to traditional relational data access”, said Tom, such as the use of “row keys, combined row keys in table storage and using those in queries”
    • Transactions – the initial design was very asynchronous: store in Windows Azure Blob storage and put a message in a Windows Azure Queue. But that resulted in a lot of transactions and significant costs under the per-transaction charge model for Windows Azure Queue, so the team had to leverage Windows Azure AppFabric Service Bus to reduce that impact
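The "combined row keys" Tom mentions usually exploit the fact that Table Storage returns rows sorted by RowKey within a partition: combining a reverse-ordered timestamp with a point id makes "most recent readings first" queries cheap. The exact key scheme below is an assumption for illustration, not Advanced Telemetry's actual design.

```javascript
// Build a combined row key: reverse-ordered timestamp + point id.
function combinedRowKey(epochMs, pointId) {
  const MAX_MS = 9999999999999; // far-future cap keeps keys fixed-width
  // Subtract from the cap so lexicographic order == newest first,
  // zero-padded so string comparison matches numeric comparison.
  const reversed = String(MAX_MS - epochMs).padStart(13, "0");
  return reversed + "_" + pointId;
}
```

With keys like this, a range query over the first rows of a partition returns the latest readings without scanning history.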

    The end result is an application that is horizontally scalable, allowing Advanced Telemetry to elastically scale the deployments of individual layers up or down according to capacity needs, as the application layers are nicely decoupled from each other, and the application is decoupled from the horizontally scalable storage. Moreover, the cloud-optimized architecture supports both multi-tenant and single-tenant deployment models, enabling Advanced Telemetry to support customers who have higher data-isolation requirements.
