Architecture + Strategy

Musings from David Chou - Architect, Microsoft


    Silverlight and Photosynth at Stargate Universe


    image

    MGM and Microsoft partnered to build an application intended to give old and new fans of Stargate a first look at the main location of the new Stargate Universe show, an ancient starship called the Destiny. The sets for the Destiny are amongst the most sophisticated ever built for a TV show, accurately capturing the spirit of a spaceship that has been abandoned for millennia. 

    MGM wanted to give fans an accurate experience of what it would be like to walk around and explore the Destiny, and by using Microsoft Silverlight and Photosynth it was able to deliver an immersive 3-D experience.

    UPDATE 2009.09.04 – a Microsoft Case Study and video has been published for this project. You can view the Case Study at http://www.microsoft.com/casestudies/Case_Study_Detail.aspx?casestudyid=4000005102, and the video at http://www.microsoft.com/video/en/us/details/c0387435-dd01-4c5a-87c9-6e87cedeee15.

    Solution

    Over the space of two days, several hundred photographs were taken of some of the locations of the Destiny starship. These include the fan-favorite Gate Room, from where the crew embarks on their adventures through the Stargate, as well as locations such as the ship’s control center, observation deck, and main shuttle.

    These photographs were used to create a number of 3D scenes using Microsoft Photosynth, a technology that analyzes pictures and creates a virtual 3D space from the different viewpoints in the different pictures. Because the pictures were shot in high resolution, fans can zoom in and see the sets in great detail.

    image

    See the Stargate Universe Photosynth set at http://stargate.mgm.com/photosynth/index.html

    Application Scenario

    Now, Photosynth does not require any programming effort. All we need to do is take pictures. Lots of pictures. Then we upload the pictures to Photosynth.net, and Photosynth does the rest: interpreting textures in each photo, correlating the photos by matching similar textures, and figuring out spatially where each photo was taken. Finally, all of the photos in one set are stitched into a 3D environment for visitors to navigate through.

    Compare Photosynth to traditional approaches to delivering 360-degree views of items, rooms, etc., which require carefully planned photo shoots and then writing or obtaining specialized software to render the results and let viewers interact with them. The time and effort required to create a “synth” can literally be minutes. Then, with just a few HTML-level changes to embed the Silverlight viewer for Photosynth into a web page, anyone visiting that page can interact with the synth.
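    As a rough sketch of what that HTML-level change looks like, a synth can be embedded with a single iframe. The URL and parameters below are illustrative assumptions (Photosynth.net generates the actual embed snippet for each synth):

```html
<!-- Hypothetical embed markup; Photosynth.net generates the real snippet per synth -->
<iframe src="http://photosynth.net/embed.aspx?cid=YOUR-COLLECTION-ID"
        width="500" height="380" frameborder="0" scrolling="no">
</iframe>
```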

    A screen shot of the gallery view is shown below.

    image

    Kind of like a photo gallery on steroids. However, Photosynth is not just a slideshow. It allows viewers to traverse through the set of pictures in a non-linear manner, and to see the spatial perspectives between pictures (where each photo was taken relative to other photos).

    At the same time, some specific considerations apply when taking photos for Photosynth. Basically, cover lots of angles with lots of overlap for overview shots, then take lots of close-ups for detailed shots. Here’s a short tutorial on how to shoot for Photosynth.

    For the Stargate Universe synth, extra attention was paid to the physical architecture of the Destiny set, and the series of shots was taken so that the resulting synth would give viewers the experience of walking through the set.

    Thus Photosynth can be leveraged in many scenarios, especially when a project has physical content that can be shared online: product showcases, exterior environments, interior walk-throughs, etc. It is another simple and elegant way to enhance user acquisition and retention on websites.

     


    Web 2.0 - A Platform Perspective


    Background & Primer

    "Web as a Platform" has been a much discussed topic since Tim O'Reilly used it as a tagline in the first Web 2.0 conference back in October of 2004, then described in more detail in a 2005 article, and the subsequent "Mind Map" graphic:

    image

    Since then many interpretations of the "Web platform" have existed, ranging from technical perspectives that focus on tools such as AJAX, RSS, REST, SOAP, mashups, and composite applications; to user-generated content and collective intelligence such as Wikipedia and YouTube; to social bookmarking/syndication such as del.icio.us and Digg; to social networks such as Facebook, MySpace, etc. These are just a few, and the list of sites and categories of sites that exemplify Web 2.0 principles has undergone explosive growth in the past few years.

    Collectively, the rich cluster of "Web 2.0" sites on the internet forms a services foundation upon which applications and functionality can be built, without any additional dedicated infrastructure. This marks a significantly different approach from "Web 1.0" site implementations, where each organization had to procure dedicated hardware, software, a hosting environment, etc. in order to provision a new application on the internet. As a result, this collection of cloud-based services forms a new kind of "platform" for creating a new breed of applications.

    Understanding Web as a Platform

    Without making this yet another attempt to define the specifics of Web 2.0 (or even Web 3.0, for that matter) and the internet platform, and leaving that debate to those who focus on semantics, I think we can look at "Web as a Platform" in its broadest terms: a platform that provides some sort of framework upon which people can build things, while encapsulating (or hiding) some of the underlying complexities.

    But this doesn't point directly to technical solutions; it really encompasses many categories of "stuff" (such as media, social interactions, implicit relationships, semantic connections, monetization methods, etc.) that can be leveraged and implemented on the Web today. I like how Fred Wilson put it:

    I believe the web is a platform. And that everything we need for an open ad market, or an open data architecture, or frankly most anything else, is available on the "web platform" today.

    So what can we do with the Web platform? There are many perspectives on this as well, such as Marc Andreessen's "layered" perspective:

    Level 1 - API access - Flickr, Delicious, Twitter, etc.
    Level 2 - API plug-in - Facebook
    Level 3 - Runtime environment - Ning, Salesforce.com, etc.

    And Alex Iskold's "building blocks" perspective:

    Storage Services - Amazon S3, GDrive, Windows Live Skydrive, etc.
    Messaging Services - Amazon Simple Queue Service, BizTalk Services, etc.
    Compute Services - Sun Grid
    Information Services - Amazon E-Commerce, Yahoo! Answers, Virtual Earth, etc.
    Search Services - Google Search API, Alexa Search Platform, Live Search, etc.
    Web 2.0 Services - del.icio.us, Flickr, Basecamp, etc.

    Again, without questioning the validity of these categorizations (there is plenty of discussion about that as well), I think both perspectives are valid in a general sense. Building blocks do exist, but at the same time, there are multiple layers of building blocks (or categories) in the Web platform.

    What this means is that building blocks in each layer can be utilized in various combinations/permutations to create the next layer up. These layers span two extremes: information and people. The layers closer to information consist of the Web application platforms we know today, such as ASP.NET, Silverlight, LAMP, Java, Ruby on Rails, etc.; these require more expert knowledge in development and technology and thus engage a smaller part of the overall population. The layers closer to people are still being formed as we speak, but in general they rely on higher forms of abstraction that provide services closer to our lives, while enabling the broad reach of larger audiences (the consumerization and democratization of technology come to mind). And today we are seeing higher and higher layers of platforms being created that allow people to connect, to organize, to find and use resources, to be social, and to basically "live" on the Web.

    Of course, the word "platform" is being used very loosely today, and new "platforms" and layers of platforms are being created almost daily. Marshall Kirkpatrick took a brief look at some of the most hyped new platforms. For example, the most recent and significant incarnations of higher-level Web platforms are probably the Facebook Platform and Google OpenSocial.

    From a platform layer perspective, the Facebook Platform and Google OpenSocial, even though aimed at doing different things (lots of debate on this too), are built on top of other existing layers. Applications built on top of the Facebook Platform use a combination of traditional Web app technologies like HTML, CSS, JavaScript, XML, etc., but their benefits are derived from building blocks available on the Facebook Platform, in the form of mashups of external services building blocks, explicit foundation blocks (such as News Feeds, Status, Events, FBML, FQL, configuration and provisioning systems, etc.), and implicit foundation blocks (social graphs, a software distribution/dissemination channel, monetization, a 50+ million and still growing user base, etc.). A major characteristic of this platform is that it is very easy to develop against, which democratizes development and allows more and more people to participate in the social experience. In essence this platform further narrows the gap between technology and people (and is thus categorized as a higher-layer platform). The result is a wildly viral and vital platform that accounts for more than 5,000 deployed applications today, a number growing exponentially.

    From a higher level, it seems that a "Web OS" of some sort is starting to take shape, as we can draw many parallels to the layered, subsystem-based, and componentized approaches in modern computer operating system and software architectures. But I am not yet sure it would be of value to apply traditional thinking to defining a "standard" Web platform stack, needlessly preempting more knowledgeable people and risking further fragmenting the evolution.

    In general, though, today we can definitely see the Web maturing into a very viable platform. News such as Amazon S3 exceeding 99.99% uptime should remove most doubts about the reliability of cloud-based services. But I think it is a platform with a spectrum of choices (layers and building blocks) that people with different skillsets can leverage and add value to. The choices available in the full spectrum are all relevant, despite some idealists' claim that newer and higher-level models (such as the higher layers of the platform in the context of this post) will completely commoditize and subsume older and lower-level ones. I tend to think that, while more and more attention will be focused on the newer and higher-level models, we will continue to see lots of innovation on the lower-level platforms. We will simply see more and more people involved in the overall ecosystem, with a large infusion of participants with non-technical skillsets increasingly involved at the higher levels. This, I think, is the true goal of Web 2.0: connecting people and democratizing/bridging the technology chasm.

    What's Next?

    It's always interesting to try to take a peek at what may be possible in the future.

    Democratization in software development - Recent advances in the Web platform (raising layers of abstraction), model-driven architectures, etc., will increasingly simplify software development efforts for the higher level platforms. Two very notable examples are Yahoo! Pipes and Microsoft Popfly.

    The Implicit Web - Increasing specialization in making sense of the dynamic aspects of user behaviors and activities in the online world; for example, search engines finally grasping user intent (via click streams, combinational media consumption habits, etc.). This is also an area where the Facebook Platform may be able to glean insights from the reactions its applications elicit from members, layered on top of the static social graphs.

    Privacy Controls - With so much attention on enabling the "read-write" Web, and increasing openness, a need for better privacy control will inevitably arise. Web idealists argue that traditional data silos (or intellectual property as we know it today) will need to be opened up and made interoperable in the new world. Again, I believe a hybrid model somewhere between the two extremes (of fully open and completely closed architectures) usually works out better for users. From this perspective, yes, the highly protected enterprise data silos of today will need to open up, but only enough to add value for the users. To do that, some kind of interoperable privacy control is required.

    Ubiquitous Access w/ Rich User Experiences - A consistent and seamless experience for people accessing their information, applications, and services, across a full spectrum of connected devices and systems. At the same time, highly targeted user experiences implemented for the appropriate form factors are available to take advantage of the latest hardware and device innovations.

    There are many more, such as the data/semantic Web, evolutionary intelligence, changes in social trends, etc. It'll be interesting to see how things pan out in this space.


    This post is part of a series:


    Silverlight 3 Beta


    You may have already heard about the many announcements made at Microsoft’s MIX09 conference this week in Las Vegas. One of the major announcements is the Beta availability of Silverlight 3 (and it does look like three’s the charm).

    Customer adoption to date

    Scott Guthrie during his keynote at MIX09 shared some encouraging statistics:

    • Silverlight launched 18 months ago, shipped Silverlight 2 six months ago
    • 350+ million installations globally
    • 300,000+ designers and developers
    • 200+ partners in 30 countries contributing to the ecosystem
    • 10,000+ applications globally
    • Microsoft has 200+ of its own websites built using Silverlight

    A few major customer adoption announcements were also made:

    • Netflix standardizing on Silverlight to deliver its online on-demand video instant watch service across PCs, Macs, and devices; leveraging PlayReady DRM and smooth/adaptive streaming capabilities
    • 2010 Winter Olympics at Vancouver – NBC announced that the event will again be delivered online using Silverlight, similar to the 2008 Beijing Olympics event
    • NCAA “March Madness” Men’s Basketball Championship – CBS Sports Online coverage of all tournament games delivered using Silverlight at mmod.ncaa.com
    • 2008 Presidential Inauguration – the Presidential Inauguration Committee selected Silverlight to enable online video streaming of President Obama’s official swear-in and Whistle Stop Tour events
    • Bondi Publishing (who owns the Playboy Archives) – working to deliver its set of magazines, including the New Yorker, Rolling Stone, and Playboy, online using Silverlight. Playboy Archives is now live with search, navigation, and DeepZoom at playboy.covertocover.com (best to visit at home)
    • KEXP – a Seattle-based radio station showed off an out-of-browser version (that works when off-line) of their content browser and player
    • Kelley Blue Book – Perfect Car Finder application using Silverlight at www.kbb.com/kbb/PerfectCarFinder/PhotoEdition.aspx
    • SAP – working to deliver Silverlight controls to be used in NetWeaver and Web Dynpro
    • Microsoft Worldwide Telescope – now Silverlight enabled at www.worldwidetelescope.org/webclient
    • Microsoft Virtual Earth – Silverlight Map Control for any website to use for enhanced visualization of geo-location and mapping capabilities CTP now available
    • Other mentions including Yahoo! Japan, CareerBuilder, 10 Cent QQ, BSkyB, ITV, Intuit, etc.

    Interesting Silverlight applications to see

    Some of the most compelling Silverlight applications I have seen (many are registered on Silverlight Showcase) are listed below.

    General Info:

    Media sites/demos:

    Rich application sites/demos:

    Casual Games:

    Reusable Controls Libraries (for enterprise applications):

    Now these are just some of my favorites. But with the pace of developers building cool Silverlight applications, this list may need to be updated very frequently (last updated 3/20/09).

    What’s new in Silverlight 3?

    Fully supported by Visual Studio and Expression Blend, highlights of the new features and functionality in Silverlight 3 include: major media enhancements such as H.264 video support; out-of-browser support, allowing Web applications to work on the desktop; significant graphics improvements, including perspective 3D graphics and GPU acceleration; and many features to improve RIA development productivity.

    Enhanced media support

    • Live and on-demand true HD (720p+) Smooth Streaming. IIS Media Services (formerly IIS Media Pack), an integrated HTTP media delivery platform, features Smooth Streaming which dynamically detects and seamlessly switches, in real time, the video quality of a media file delivered to Silverlight based on local bandwidth and CPU conditions.
    • Hardware accelerated HD playback. Leveraging graphics processor unit (GPU) hardware acceleration, Silverlight experiences can now be delivered in true full-screen HD (720p, 1080p, etc.).
    • Extensible media format support. In addition to VC-1/WMA, Silverlight 3 now supports MPEG-4-based H.264/AAC Audio. Also with the new extensible Raw AV pipeline, audio and video can be decoded outside the runtime and rendered in Silverlight, extending format support beyond the native codecs.
    • Digital rights management. Full and built-in support for DRM, powered by PlayReady Content Protection enables protected in-browser experiences using AES encryption or Windows Media DRM.

    Enhanced graphics support

    • Perspective 3D Graphics. Content can now be applied to a 3D plane without writing any code. Live content can be rotated or scaled in space.
    • Pixel Shader effects. Image manipulation effects such as blur and drop shadow, using software-based rendering. Custom effects can also be created, and applied to any graphical content.
    • Bitmap Caching. Vector content, text, and controls can now be cached as bitmaps. This improves the rendering performance of applications and is useful for background content and for content which needs to scale without making changes to its internal appearance.
    • New Bitmap API. Support for writing pixels to bitmaps directly.
    • Themed application support. Runtime support for application theme updates driven by templates, and cascading style sheets.
    • Animation Effects. New effects such as spring and bounce. Developers can also now write their own mathematical functions to describe an animation.
    • Enhanced control skinning. Simplified skinning capabilities by keeping a common set of controls external from an application. This allows the sharing of styles and control skins between different applications.
    • Improved text rendering & font support. Enhanced rendering and rapid animation of text. Applications also load faster by enabling the use of local fonts.

    Enhanced rich internet application (RIA) support

    • Out-of-Browser capabilities
      • Run outside of browser. Light-weight, sandboxed companion experiences for the Web that run on the desktop. Enabled without any additional download of runtime or the need to write applications in a different way.
      • Consumer friendly non-administrator install. Applications are hosted in a cache and do not require any privileges to run.
      • Safer, more secure, sandboxed. An application can be trusted without security warnings. All assets are stored in an isolated storage.
      • Built in auto-update. An application will check for new versions on the server and update on launch.
      • Connectivity detection (on-line/off-line). Can detect a loss of connection (or react to event notifications), then choose to cache data locally until a network connection is restored.
      • Desktop integration. On Windows and Macs, applications can be saved as shortcuts on the desktop and be one click away from your customer. On Windows 7, support will be provided for superbar integration, multi-touch, and location awareness services such as GPS support, so that your application can react to the user's location.
    • Deep Linking. Support for deep linking, which enables bookmarking a page within a RIA.
    • Search Engine Optimization (SEO). By utilizing business objects on the server, together with ASP.NET controls and site maps, developers can automatically mirror database-driven RIA content into HTML that is easily indexed by the leading search engines.
    • 60+ default controls with source code. Over 60 fully skinnable and customizable out-of-the-box controls such as charting and media, new layout containers such as dock and viewbox, and controls such as autocomplete, treeview and datagrid. The controls come with nine professionally designed themes, and the source code can be modified/recompiled or utilized as-is. Other additions include multiple selection in listbox controls, a file save dialog, and support for multiple-page applications with navigation.
    • Enhanced data support
      • Element-to-element binding. Enables property binding to CLR objects and other UI components via XAML, for instance binding a slider value to the volume control of a media player.
      • Data forms. Provides support for layout of fields, validation, updating and paging through data.
      • Data validation. Automatically catch incorrect input and warn the user with built-in validation controls.
      • Support for business objects on both client and server with n-Tier data support. Easily load, sort, filter and page data with added support for working with data. Includes a new built-in CollectionView to perform a set of complex operations against server side data. A new set of .NET RIA services supports these features on the server.
    • Improved performance
      • Application library caching. Reduces the size of applications by caching framework libraries on the client, which helps improve application load performance.
      • Enhanced DeepZoom. Allows users to fluidly navigate through larger image collections by zooming.
      • Binary XML. Allows communication with the server to be compressed, greatly increasing the speed at which data can be exchanged.
      • Local Connection. This feature allows communication between multiple Silverlight applications on the client-side without incurring a server roundtrip.
    • Enhanced Accessibility Features. Provides access to all system colors, allowing partially-sighted people to make changes such as high contrast color schemes for ease of readability by re-using familiar operating system controls.

    Silverlight and Live Messenger at Photobucket


    image

    Photobucket has collaborated with Microsoft to build Photobucket Visual Search (http://photobucket.com/visualsearch), making use of Silverlight, Windows Live Services and Photobucket's Open API platform. Photobucket Visual Search uses Silverlight to provide a rich and entertaining search experience, displaying photos and videos in an easy-to-browse interface. The experience surfaces search results as well as related terms, helping users find images they might not have found in the past.

    Recognizing that photos are about sharing a social experience, Photobucket decided to use Microsoft's new Windows Live Messenger Web Toolkit to socially enable this Visual Search experience. Windows Live Messenger has over 320 Million active users, with over 32 billion social relationships between them. Sharing an image with friends on Windows Live is as easy as a click of a button with Photobucket Visual Search.

    image

    Situation

    Photobucket is a very popular site on the Internet for uploading, sharing, linking and finding photos, videos and graphics. Photobucket is usually used for personal photographic albums, remote storage of avatars displayed on internet forums, and storage of videos. Photobucket's image hosting is often used for eBay, MySpace (now a corporate cousin), Bebo, Neopets and Facebook accounts, LiveJournals or other blogs, and message boards. Users may keep their albums private, allow password-protected guest access, or open them to the public.

    Below are some statistics (circa 2007) regarding the Photobucket.com website.

    • 30+ million searches processed / day
    • 25 million unique site visitors/month in the US, and over 46 Million unique site visitors/month worldwide
    • Over 7 billion images uploaded
    • #31 in Top 50 Sites in the US
    • #41 top 100 Global Sites
    • 18th Largest Ad supported site in the US
    • 41.4% share of U.S. visits to photography web sites
    • 56% of users are under 35, and 52% are female

    This project is intended to enhance Photobucket’s user experience while mapping specific objectives to the following core business goals:

    • Leverage rich software to create a fun and engaging experience
    • Add social communication features that drive increased web traffic
    • Encourage users to add their own tags and collect metadata in the process
    • Improve user acquisition and retention to the Photobucket website

    Richer visualization and interaction, plus social networking capabilities, were chosen as the means to achieve the goals of improving user acquisition and retention. Consequently, Silverlight and the Windows Live Messenger Web Toolkit were chosen as the components from the Microsoft platform to be leveraged for the Photobucket Visual Search project. Photobucket’s existing open APIs (HTTP/REST-based) are used directly to support the search client application.

    Architecture

    The project architecture consists of Photobucket’s existing server infrastructure, which provides the open REST-based APIs, images, albums, thumbnails, groups (featured, most active, most recent, most contributions, contests, etc.), static content, etc. The Photobucket.com website itself is a user interaction/presentation layer on top of the thousands of servers deployed as part of the content infrastructure. The Photobucket.com website manages all the metadata associated with the massive amount of content, such as tags, descriptions, comments, image ownerships and relationships, user memberships, etc. Searches done on the Photobucket website are performed against the metadata and indexes, and the search results point to the actual locations of content and assets across the massively parallel content infrastructure.

    The content infrastructure consists of multiple farms of thousands of servers, each managing a different type of content, with user data partitioned horizontally across the servers in each group. The content servers can be accessed directly using sub-domains of photobucket.com, such as i98.photobucket.com for one of the image clusters, or t104.photobucket.com for one of the thumbnail clusters.
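    As an illustration of this horizontal partitioning, a request router might map each piece of content to a cluster sub-domain with a simple hash. The cluster prefixes and hashing scheme below are assumptions for the sketch, not Photobucket's actual implementation:

```python
import hashlib

def shard_host(content_id: str, content_type: str = "image", num_shards: int = 128) -> str:
    """Map a content ID to a content-cluster sub-domain (illustrative only).

    Assumed convention: 'i' clusters serve full-size images,
    't' clusters serve thumbnails (e.g. i98.photobucket.com).
    """
    prefix = {"image": "i", "thumbnail": "t"}[content_type]
    # Hash the ID so content spreads evenly across the server farm,
    # and the same ID always routes to the same cluster.
    digest = hashlib.md5(content_id.encode("utf-8")).hexdigest()
    return f"{prefix}{int(digest, 16) % num_shards}.photobucket.com"

host = shard_host("user42/album7/photo.jpg")
```

    Deterministic hashing keeps the routing stateless: any front-end server can compute a piece of content's home cluster without a lookup.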

    Application requests are managed by the photobucket.com website, and the open APIs are managed via the api.photobucket.com servers. Search queries retrieve an XML response representing the search hits, with associated metadata and actual image locations potentially pointing to hundreds of different servers where the content physically resides.

    The Silverlight search client implementation is designed to fully leverage the search service exposed through Photobucket.com’s open APIs. It captures the user’s search query, sends it to api.photobucket.com for processing, interprets the returned XML, renders the results, and downloads individual thumbnails from their actual server locations in the thumbnail server farm.
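    The same request/parse cycle can be sketched outside of Silverlight. The endpoint, query parameters, and XML element names below are assumptions for illustration, not Photobucket's documented API:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlencode

# Hypothetical endpoint and parameter names, for illustration only.
SEARCH_URL = "http://api.photobucket.com/search"

def build_query(term: str, page: int = 1) -> str:
    """Compose the search request URL the client would send."""
    return f"{SEARCH_URL}?{urlencode({'term': term, 'page': page})}"

def parse_results(xml_text: str) -> list:
    """Pull titles and thumbnail locations out of the XML response;
    each thumbnail URL points into the thumbnail server farm."""
    root = ET.fromstring(xml_text)
    return [
        {"title": m.findtext("title"), "thumb": m.findtext("thumb")}
        for m in root.iter("media")
    ]

# A canned response in the assumed schema:
sample = """<results>
  <media><title>gate room</title><thumb>http://t104.photobucket.com/a/b.jpg</thumb></media>
  <media><title>observation deck</title><thumb>http://t98.photobucket.com/c/d.jpg</thumb></media>
</results>"""

hits = parse_results(sample)
```

    Note how each hit carries its own thumbnail location, so the client fetches assets directly from the content clusters rather than through the API servers.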

    The Windows Live Messenger Web Toolkit implementation is mostly client-side. Photobucket hosts some of the files such as static images for branding the messenger interface, but most of the files are retrieved directly from the messenger service in the Live Services platform. Integration with the Silverlight Visual Search application is done through the JavaScript bridge.

    A diagram representing the logical architecture is shown below.

    image

    The overall end-to-end architecture consists of these components:

    • Client – Silverlight 2, Windows Live Messenger Web Toolkit, JavaScript, CSS, HTML, etc.
    • Server (Photobucket) – Linux, Apache 2.2.4 (EL4), PHP 5.2.6, MySQL
    • Server (Live Services) – IIS 6, ASP.NET 2.0.50727
    • Tools – Visual Studio 2008, Expression Studio

    Silverlight Visual Search Development

    The default target dimensions for the application are 1024 x 768. However, the application supports browser resizing and will adjust accordingly when the user dynamically changes the browser window size.

    The application has two primary states:

    • Main View (Search & Explore) – This is the state the user will be in initially and for a majority of the search and explore functionality. The main view displays thumbnails of the image results. It also allows users to modify their searches, or explore further using alternative searches.
    • Zoom View – The secondary state of the application brings the user into a view where the focus is on the viewing aspect of the large image and the image options, including sharing with Live Services.

    The project team went through many data visualization designs for the search results. In the end, a simple design that mirrored the model used in the rest of the Photobucket site was chosen.

    Windows Live Messenger Web Toolkit Integration

    While in Zoom View, a user can share the photo via IM using Live Messenger. Doing so will prompt the user to sign in to, or register for, Live Messenger. If the user is already signed in, this will trigger the user's Windows Live contact list to pop up.

    Delegated Authentication

    Signing in links a user’s Windows Live ID with the Photobucket.com website, granting an authentication consent (stored as a site-wide consent token) to Photobucket, so that web pages generated by Photobucket can reuse the Live Messenger session without requiring the user to sign in on each page.

    The project team decided to leverage Live Messenger’s Delegated Authentication method because it is the least intrusive to Photobucket’s existing membership system and the simplest to implement; the identity federation approach would have been significantly more complex.
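    Conceptually, a stored consent token is a signed, expiring blob that each page can validate without prompting the user again. The sketch below uses generic HMAC signing to illustrate the idea; it is not the actual Windows Live Delegated Authentication protocol or its token format:

```python
import hashlib
import hmac
import time

# Placeholder secret; in a delegated-auth scheme the signing key is
# established between the application and the token issuer.
SECRET = b"app-secret-shared-with-token-issuer"

def issue_consent_token(user_id: str, ttl_seconds: int = 3600) -> str:
    """Issue a signed token recording the user's consent."""
    expires = int(time.time()) + ttl_seconds
    payload = f"{user_id}|{expires}"
    sig = hmac.new(SECRET, payload.encode("utf-8"), hashlib.sha256).hexdigest()
    return f"{payload}|{sig}"

def validate_consent_token(token: str) -> bool:
    """Check signature and expiry so pages can reuse the session."""
    try:
        user_id, expires, sig = token.rsplit("|", 2)
    except ValueError:
        return False
    payload = f"{user_id}|{expires}"
    expected = hmac.new(SECRET, payload.encode("utf-8"), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and int(expires) > time.time()

token = issue_consent_token("user@example.com")
```

    Because validation needs only the shared secret, any page on the site can check the stored token locally instead of redirecting the user to sign in again.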

    Messenger Web Bar

    The Messenger Web Bar is a single UI Control that contains a full Windows Live Messenger experience. The Messenger Web Bar has the following functionality:

    • Contact list—The contact list enables the user to manage contacts and interact with them.
    • Conversation list—The conversation list contains all active conversations.
    • User area—The user area shows the user's presence and enables the user to update this information.
    • Cross-page navigation—Cross-page navigation enables the user to stay signed in to Live Messenger while navigating from page to page within your application Web site.

    This functionality is encapsulated within a small bar at the bottom of the page. Because the Live Messenger Web Bar works across pages, conversations that start on a page on the application Web site can continue on another page of the application Web site. This functionality enables an application Web site to use the Live Messenger Web Bar as a platform to deeply integrate Live Messenger functionality and data into the site. Making an application Web site more social with the Messenger Web Bar and UI Controls can significantly increase user engagement.

    Shown below is a picture of the Messenger Web Bar used as part of the Visual Search experience.

    image

    Once the user is logged into the Windows Live Messenger service, the user can view and interact with the list of contacts managed in the Windows Live service. The user can also see presence information, in terms of who among the contacts is online or offline at the moment. An IM conversation can then be initiated just by selecting a contact. If the user initiates an IM conversation while in Zoom View, Visual Search will automatically insert a link that says “Check this photo” with a URL to the actual image.
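    The post doesn't document Photobucket's actual URL scheme, but the link insertion amounts to something like this hypothetical helper (the base URL and path format are illustrative assumptions):

```python
from urllib.parse import quote

def build_photo_share_message(image_path, base="http://photobucket.com"):
    # Hypothetical sketch of the message Visual Search inserts when an IM
    # conversation starts from Zoom View: a "Check this photo" line with a
    # URL-encoded link to the image being viewed.
    url = f"{base}/{quote(image_path)}"
    return f"Check this photo: {url}"


msg = build_photo_share_message("albums/my vacation.jpg")
# the space in the path is percent-encoded in the resulting link
```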

    Application Scenario

    This project demonstrated how Silverlight can add value to a website built entirely on the LAMP stack. It also showed how Live Services can be leveraged, without any significant custom development effort, to add social computing capabilities to any website. The combination of rich clients with a composition of multiple cloud-based services on the Web represents a Software Plus Services implementation approach, one that enhances existing Web browsing models and improves user acquisition and retention.


  • Architecture + Strategy

    Cloud-optimized architecture and Advanced Telemetry

    • 0 Comments

    image

    One of the projects I had the privilege of working on this past year is the Windows Azure platform implementation at Advanced Telemetry. Advanced Telemetry offers an extensible, remote, energy-monitoring-and-control software framework suitable for a number of use-case scenarios. One of its current product offerings is EcoView™, a smart energy and resource management system for both residential and small commercial applications. Cloud-based and entirely Web-accessible, EcoView enables customers to view, manage, and reduce their resource consumption (and thus their utility bills and carbon footprint), all in real time, via the intelligent on-site control panel and remotely via the Internet.

    image

    Much more than Internet-enabled thermostats and device end-points, “a tremendous amount of work has gone into the core platform, internally known as the TAF (Telemetry Application Framework) over the past 7 years” (as Tom Naylor, CEO/CTO of Advanced Telemetry wrote on his blog), which makes up the server-side middleware system implementation, and provides the intelligence to the network of control panels (with EcoView being one of the applications), and an interesting potential third-party application model.

    The focus of the Windows Azure platform implementation was moving the previously hosted server-based architecture into the cloud. Advanced Telemetry completed the migration in 2010, and the Telemetry Application Framework is now running on the Windows Azure platform. Tom shared some insight from the experience in his blog post “Launching Into the Cloud”. And of course, this effort was also highlighted as a Microsoft case study on multiple occasions:

     

    The Move to the Cloud

    As pointed out by the first case study, the initial motivation to adopt cloud computing was driven by the need to reduce operational costs of maintaining an IT infrastructure, while being able to scale the business forward.

    “We see the Windows Azure platform as an alternative to both managing and supporting collocated servers and having support personnel on our side dedicated to making sure the system is always up and the application is always running,” says Tom Naylor. “Windows Azure solves all those things for us effectively with the redundancy and fault tolerance we need. Because cost is based on usage, we’ll also be able to much more accurately assess our service fees. For the first time, we’ll be able to tell exactly how much it costs to service a particular site.”

    For instance, in the Channel 9 video, Tom mentioned that replicating the co-located architecture from Rackspace to the Windows Azure platform resulted in approximately 75% cost reduction on a monthly basis, in addition to other benefits. One of the major ‘other’ benefits is agility, which arguably is much more valuable than the cost reduction normally associated with cloud computing. In fact, as the second case study pointed out, in addition to breaking ties to an IT infrastructure, the Windows Azure platform became a change enabler that supported the shift to a completely different business model for Advanced Telemetry (from a direct-market approach to an original equipment manufacturer (OEM) model). The move to the Windows Azure platform provided the much-needed scalability (of the technical infrastructure), flexibility (to adapt to additional vertical market scenarios), and manageability (maintaining the level of administrative effort while growing the business operations). The general benefits cited in the case study were:

    • Opens New Markets with OEM Business Model
    • Reduces Operational Costs
    • Gains New Revenue Stream
    • Improves Customer Service

    Cloud-Optimized Architecture

    However, this is not just another simple story of migrating software from one data center to another data center. Tom Naylor understood well the principles of cloud computing, and saw the value in optimizing the implementation for the cloud platform instead of just using it as a hosting environment for the same thing from somewhere else. I discussed this in more detail in a previous post Designing for Cloud-Optimized Architecture. Basically, it is about leveraging cloud computing as a way of computing and as a new development paradigm. Sure, conventional hosting scenarios do work in cloud computing, but there is more value and benefits to gain if an application is designed and optimized specifically to operate in the cloud, and built using unique features from the underlying cloud platform.

    In addition to the design principles around the “small pieces, loosely coupled” fundamental concept I discussed previously, another aspect of the cloud-optimized approach is to think about storage first, as opposed to compute. This is because, in cloud platforms like the Windows Azure platform, we can build applications using cloud-based storage services such as Windows Azure Blob Storage and Windows Azure Table Storage, which are horizontally scalable distributed storage systems that can store petabytes of data and content without requiring us to implement and manage the infrastructure. This is, in fact, one of the significant differences between cloud platforms and traditional outsourced hosting providers.
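    As an illustration of “storage first” thinking, here is a hypothetical key scheme for telemetry history in Windows Azure Table Storage (the scheme itself is my assumption, not Advanced Telemetry's actual design). The essential up-front decisions are the PartitionKey, which determines how data spreads across storage nodes, and the RowKey, which determines sort order within a partition; inverting the timestamp makes the newest readings come first in a range scan:

```python
from datetime import datetime, timezone

def table_keys(site_id, reading_time: datetime):
    # PartitionKey groups one site's telemetry together for scale-out.
    partition_key = f"site-{site_id}"
    # RowKey: fixed-width inverted timestamp so newer rows sort first
    # lexicographically (Table Storage sorts rows by string RowKey).
    max_ticks = 10**19
    ticks = int(reading_time.timestamp() * 10**7)  # 100-ns "ticks"
    row_key = f"{max_ticks - ticks:020d}"
    return partition_key, row_key


pk, rk = table_keys(7, datetime(2010, 1, 1, tzinfo=timezone.utc))
```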

    In the Channel 9 video interview, Tom Naylor said “what really drove us to it, honestly, was storage”. He mentioned that the Telemetry Application Framework currently handles about 200,000 messages per hour, each containing up to 10 individual point updates (which roughly equates to 500 updates per second). While this level of traffic volume isn’t comparable to the top websites in the world, it still poses significant issues for a startup company to store and access the data effectively. In fact, the data volume required the Advanced Telemetry team to cull the data periodically in order to keep the operational data set at a workable size.
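    A quick back-of-the-envelope check of those figures:

```python
# Sanity-check the traffic figures quoted above.
messages_per_hour = 200_000
points_per_message = 10  # "up to 10 individual point updates" per message

updates_per_second = messages_per_hour * points_per_message / 3600
# ~556 updates/second at the upper bound, consistent with the
# "roughly 500 updates per second" figure from the interview
```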

    “We simply broke down the functional components, interfaces and services and began replicating them while taking full advantage of the new technologies available in Azure such as table storage, BLOB storage, queues, service bus and worker roles. This turned out to be a very liberating experience and although we had already identified the basic design and architecture as part of the previous migration plan, we ended up making some key changes once unencumbered from the constraints inherent in the transitional strategy. The net result is that in approximately 6 weeks, with only 2 team members dedicated to it (yours truly included), we ended up fully replicating our existing system as a 100% Azure application. We were still able to reuse a large percentage of our existing code base and ended up keeping many of the database-driven functions encapsulated in stored procedures and triggers by leveraging SQL Azure.” Tom Naylor described the approach on his blog.

    The application architecture employed many cloud-optimized designs, such as:

    • Hybrid relational and noSQL data storage – SQL Azure for data that is inherently relational, and Windows Azure Table Storage for historical data and events, etc.
    • Event-driven design – Web roles receiving messages act as the event-capture layer, but asynchronously offload processing to Worker roles
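    The event-driven design above can be sketched with standard-library stand-ins — `queue.Queue` in place of a Windows Azure Queue, and a thread in place of a Worker role (the Azure services themselves are durable and distributed; this only shows the decoupling pattern):

```python
import queue
import threading

event_queue = queue.Queue()  # stand-in for a Windows Azure Queue
processed = []

def web_role_receive(message):
    # Capture layer: acknowledge quickly, defer the real work.
    event_queue.put(message)

def worker_role():
    # Worker role: drain the queue and do the heavy processing.
    while True:
        msg = event_queue.get()
        if msg is None:  # shutdown sentinel for this sketch
            break
        processed.append(msg.upper())  # placeholder for real processing
        event_queue.task_done()

worker = threading.Thread(target=worker_role)
worker.start()
for m in ["temp=72", "hvac=on"]:
    web_role_receive(m)
web_role_receive(None)
worker.join()
# processed now holds the results, handled off the capture path
```

    Because the capture layer and the processing layer communicate only through the queue, each layer can be scaled out independently — which is exactly the elasticity benefit described below.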

    Lessons Learned

    In the real world, things rarely go completely as anticipated/planned. And it was the case for this real-world implementation as well. :) Tom Naylor was very candid about some of the challenges he encountered:

    • Early adopter challenges and learning new technologies – Windows Azure Table and Blob Storage, and Windows Azure AppFabric Service Bus are new technologies and have very different constructs and interaction methods
    • “The way you insert and access the data is fairly unique compared to traditional relational data access”, said Tom, such as the use of “row keys, combined row keys in table storage and using those in queries”
    • Transactions – the initial design was very asynchronous: store the payload in Windows Azure Blob storage and put a message in a Windows Azure Queue. But that resulted in a lot of transactions, and significant costs under the per-transaction charge model for Windows Azure Queue; the team had to leverage the Windows Azure AppFabric Service Bus to reduce that impact
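    The post doesn't detail the mitigation beyond the move to the AppFabric Service Bus, but the cost lever itself is easy to see: under a per-transaction charge model, packing N events into one message divides the billable transaction count by roughly N. An illustrative calculation (the batch size is hypothetical and the figures are not actual Azure pricing):

```python
def billable_transactions(events, batch_size):
    # Each enqueue is one billable transaction; batching packs many
    # events into a single enqueue. Ceiling division covers the last,
    # partially full batch.
    return -(-events // batch_size)

events_per_hour = 200_000
unbatched = billable_transactions(events_per_hour, 1)    # one event per message
batched = billable_transactions(events_per_hour, 100)    # 100 events per message
# batching by 100 cuts hourly billable transactions from 200,000 to 2,000
```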

    The end result is an application that is horizontally scalable, allowing Advanced Telemetry to elastically scale up or down the deployments of individual layers according to capacity needs: the application layers are cleanly decoupled from each other, and the application is decoupled from horizontally scalable storage. Moreover, the cloud-optimized architecture supports both multi-tenant and single-tenant deployment models, enabling Advanced Telemetry to support customers who have higher data-isolation requirements.
