Microsoft UK Faculty Connection - Site Home - MSDN Blogs


  • Microsoft UK Faculty Connection

    Accessing your Microsoft DreamSpark Azure account if you already have an OnTheHub DreamSpark Premium account



    Microsoft DreamSpark now offers FREE Azure for all students.

    However, if your institution has a DreamSpark Premium subscription with an OnTheHub ELMS store, you will need to sign up for your Azure subscription separately.


    Why do I need to sign up separately?

    The DreamSpark ELMS stores are hosted by Kivuto. Kivuto supplies software to students from a variety of companies such as Microsoft and Adobe, and it doesn't use a Microsoft Account to validate students.

    As Microsoft Azure requires a Microsoft Account for access, we can't automatically reconcile the email addresses registered with OnTheHub against Microsoft Accounts, because there's no guarantee that a student's store email address is the one associated with their Microsoft Account.

    So if you try to access Azure services with the academic email address registered to your OnTheHub store, you will receive the error below, which states 'This offer is only available to DreamSpark members'.


    Students with a DreamSpark Premium subscription whose institution has an OnTheHub store need to activate an additional account to access the FREE Azure services.

    This account will NOT impact your access to your institution's OnTheHub DreamSpark Premium resources.

    Here's how to get access to DreamSpark Azure services:

    1. Go to the DreamSpark website and select Create Account.


    2. Sign in with, or create, your Microsoft Account (e.g. Hotmail, Outlook, or your Shibboleth academic login). On the Academic Verification page, select the best verification option for yourself, fill in the necessary details, and click Verify and then Continue. More detailed instructions on account setup are available on the DreamSpark site.


    With your verified account you can now start using Microsoft Azure.

    3. Go to the Azure sign-up page and set up your Azure account.


    4. Get the most out of your membership and check out these free courses to help you learn more:

    Beginner Courses

    Azure Fundamentals:
  • Microsoft UK Faculty Connection

    Future Decoded, 10–11 Nov 2015, ExCeL London – FREE Event




    This year we're doing it bigger and bolder!

    Get free tickets

    In 2014, we gave you a tantalising glimpse into the future - this year we're doing it again.

    Future Decoded 2015 will host 10,000 of the brightest business decision makers, developers, IT pros and partners from across the UK and Europe. Attendees will be taken on a journey to decode the future of business and technology whilst gaining tangible insights into the latest social and economic changes. The Expo and Conferences will take place at the fantastic ICC Capital Suite at the ExCeL centre in London over two days: Tuesday 10th November 2015 for Enterprise, Partner and SMB audiences, and Wednesday 11th November 2015 for Technical and Developer audiences. Last year the event had almost 9,000 attendees and saw a social reach of 52 million.

    Watch this space for keynote and session announcements, leading up to the big event.

    Tuesday 10th Nov - The Business Day

    Get tickets

    If you're a large or small business or a valuable Microsoft partner, the Future Decoded Business day is perfect for you. We have scheduled a variety of activities to enable you to gain as much value as possible from your time at Future Decoded, including...
    • A series of customer success stories where you can hear directly from your peers about their transformation journey, sharing their insights and experiences.

    • A tailored selection of roundtables and briefings to explore key trends, identify opportunities, address common challenges and spark new ideas.

    • Presentations from our Partners and Microsoft Product Groups, with experts available in our Expo to help you decode 'the art of the possible' and apply it within your organisation.

    Wherever you are on your journey - digital transformation, enhancing customer experience, gaining data insights or becoming a truly modern business - there will be presentations and breakout sessions to empower you to reach your destination successfully, at the Future Decoded Business day.

    Tuesday 10th November 2015

    08:30 - 09:45

    09:45 - 12:00

    11:00 - 19:30
    Expo Open

    12:15 - 16:15
    Breakout Sessions

    16:45 - 17:45
    Closing Keynote

    17:45 - 19:00

    Wednesday 11th Nov - The Technical Day

    Get tickets

    If you're a developer, an IT professional or any other kind of propeller-head then the Future Decoded Technical Day is the place for you. Where else can you hang out with 4,000 like-minded folks and get...
    • Keynotes from top industry leaders presenting their vision on topics across Cloud, Web and the Future of Computing.

    • Deep technical tracks with world class speakers across programming languages, web, data, internet of things, cross platform apps and also Microsoft technologies like Windows 10 and Visual Studio.

    • Short, snappy demo sessions from leading UK Microsoft Researchers and Most Valued Professionals.

    Whether you build or manage bits that run on a device, in a browser, on a server, in a database or anywhere else, we've got something for you at the Future Decoded Technical Day.

    Wednesday 11th November 2015

    08:30 - 09:45

    09:45 - 12:30

    11:00 - 19:30
    Expo Open

    13:00 - 16:30
    Breakout Sessions

    16:30 - 17:30
    Closing Keynote

    17:30 - 19:00

    The agenda and sessions will be confirmed soon; see the event site for details.

  • Microsoft UK Faculty Connection

    Visual Studio Code – September Update (0.8.0)



    Visual Studio Code v0.8 is now live  

    Head over to the VS Code website to find out about everything that's included.

    There is lots of good stuff to explore so update now (or download VS Code if you have not tried it yet) and let us know how we are doing via Send-a-Smile and our issue tracking system.


    Using Unity3D? Try out this plugin.

    The plugin provides the following capabilities:

    An option to enable VS Code integration (Editor –> Assets –> Enable Integration), this updates your solution files and (more importantly) keeps them in sync.  This also sets the preferred external tool editor in the Unity preferences.

    It writes out the necessary (and sometimes hard to find) VS Code configuration files, including the ability to hide "non-code" files in the editor (hiding things like .sln, .csproj and the ever-present Unity .meta files).
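    As a rough illustration, the generated workspace settings might look something like the following. This is a sketch, not the plugin's actual output: the file name follows VS Code's standard `.vscode/settings.json` convention and `files.exclude` is the standard VS Code setting for hiding files, but the exact patterns the plugin writes are assumptions.

```shell
# Sketch of the kind of .vscode/settings.json the plugin writes to hide
# "non-code" files from the editor. The exact exclude patterns here are
# assumptions; files.exclude is VS Code's standard hide-from-explorer setting.
mkdir -p .vscode
cat > .vscode/settings.json <<'EOF'
{
    "files.exclude": {
        "**/*.meta": true,
        "**/*.sln": true,
        "**/*.csproj": true,
        "**/*.unityproj": true
    }
}
EOF
# Show the generated file
cat .vscode/settings.json
```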

    Automatically launches VS Code directly in your project folder, every time. No longer do you have to worry about keeping that window open, or switching around if you work on multiple projects.

    Please note

    One thing to be aware of: once you enable the VS Code integration, changing your preferred code editor in the External Tools preferences will have no effect, as the plugin takes over opening code files. If you want to use another editor, you'll have to disable the integration first using the same Enable Integration setting.


    These are just the main highlights as there are more features in there as well.

    Currently you need to download the plugin files from GitHub, but a package is coming to the Unity Asset Store; when it's available, I'll also post the link here.

    If you're interested in reading more about VS Code, see the VS Code website.

    You can also read all the VS Code documentation there.

  • Microsoft UK Faculty Connection

    Got an Internet of Things or data project on the go? Accelerate it now with $120k of support.




    Microsoft is offering to help accelerate UK start-ups who are developing IoT and Data solutions with $120,000 of free Azure cloud services, technology advisory, and go-to-market support. 

    Ten of the UK's most promising start-ups and innovators will be selected to receive:

    · $120,000 of Azure cloud credits

    · Free technology and tools, including Visual Studio with MSDN, Office 365 and more.

    · Microsoft Tech Advisory to support your development

    · Go To Market advisory and support

    · 1 month residency at Microsoft Ventures Accelerator.

    Apply now; applications close on 30th September.

    Learn more about IoT:


    Windows IoT device

    Rapidly prototype and build your Windows IoT solutions on a variety of devices running Windows 10 IoT Core. Windows 10 gives you powerful tools that let you develop fast and deploy to your device.


    Device connectivity

    Leverage the power of open frameworks like Connect-the-Dots to help connect your devices to Microsoft Azure. Microsoft Azure allows you to implement great solutions by leveraging advanced analytics services.


    Maker community

    Connect with other makers to share code and make contributions through GitHub. Join the community to influence future releases of the SDK.

  • Microsoft UK Faculty Connection

    AzureCon, Sept 29th 2015 - Be the first to see what's next. Attend the free, virtual event.



    Be the first to see what’s next. Attend the free, virtual event.

    Unleashed possibilities

    Be the first to hear about the latest Azure innovation. Find new ways to address your IT challenges and enable scenarios only possible with the cloud.

    Access to the experts

    Get unfiltered access to top Azure engineers and community members. And hear the latest from Scott Guthrie, Jason Zander, Bill Staples, Mark Russinovich, Scott Hanselman, T.K. “Ranga” Rengarajan, Shawn Bice, and others.

    Customers take the stage

    Hear from startups and enterprises who are accelerating, differentiating, and transforming their business with the cloud. See what they're up to with Azure and see what you can do next.

    • Airline
    • Food and beverage
    • Manufacturing
    • Oil and gas
    • Public sector
    • Retail
    • Startups

    Customizable learning

    Choose from more than 50 technical videos and learn about new Azure features and existing capabilities. Whether you're just getting started or want to dive deep, we've got you covered with sessions on:

    • Analytics
    • Azure platform
    • Compute
    • Data and storage
    • Developer services
    • Enterprise mobility
    • Internet of Things
    • Management
    • Media and CDN
    • Networking
    • Trusted cloud
    • Visual Studio

    Register Now

  • Microsoft UK Faculty Connection

    Teaching Minecraft? Why not host your own server on Microsoft Azure for £1.50 per month.




    Discover how to create a Minecraft Virtual Machine on Ubuntu on Microsoft Azure in minutes using a pre-prepared Virtual Machine Image from VM Depot.

    This image contains a fully configured Minecraft server with Spigot, a project which seeks to improve the performance of the Minecraft server, and mark2, a convenience wrapper for the Minecraft server tools.

    To play on a shared server world, using this virtual machine image, you need a full (paid) version of the Minecraft client. If you would like to try out Minecraft first, give the single player free client a go.

    There are many ways to deploy a Virtual Machine from VM Depot to Microsoft Azure.

    So what is VM Depot?

    VM Depot is a community-managed repository of Linux and FreeBSD virtual machine images for easy deployment to Microsoft Azure.

    VM Depot is a great resource for discovering and running open source software on Microsoft Azure.

    VM Depot makes it easy: publishers can share virtual machine images for free, and users can discover and deploy them to Microsoft Azure with a free trial.

    Setting Up Minecraft on Azure

    In this tutorial, we will focus on a feature called VM Depot Easy Deploy. This method is available within the VM Depot web application and is designed for those new to VM Depot and Microsoft Azure. If your knowledge of Azure is more sophisticated, you could also opt to run through this process using the Cross Platform Command Line tools or the Microsoft Azure Portal.

    Learn more about Minecraft

    Skip ahead to the next section if you want to proceed with deploying a Minecraft virtual machine on Azure, but if you are not sure if Minecraft is the right option for you then you might find the resources below helpful:

    Overview of Creating a Virtual Machine Using VM Depot Easy Deploy

    Deploy Now!

    The steps below outline what you need to do to get your VM up and running. Each step is explained in more detail below.

    1. Get a free trial Microsoft Azure subscription

    2. Click the "Easy Deploy" icon in VM Depot

      1. If necessary, set up VM Depot Easy Deploy

    3. Configure your VM

    4. Wait for an email (usually 5-15 minutes)

    5. Log in and start using your VM

    The Details

    Obtaining an Azure Subscription

    If you do not already have an active Microsoft Azure subscription, you can sign up for a free trial in order to follow along with this tutorial. If you have an MSDN subscription then you are probably aware that you have Azure credits included.

    All students now get Azure via DreamSpark; however, at present the student offer DOES NOT include Virtual Machines.

    The student offer includes the following services:

    • Azure App Service Web Apps is a part of a fully managed cloud offering that enables you to build and deploy web apps in seconds. Use ASP.NET, Java, PHP, Node.js or Python. Run popular web apps and CMS solutions. Set up continuous integration and deployment workflows with VSO, GitHub, TeamCity, Hudson or BitBucket – enabling you to automatically build, test and deploy your web app on each successful code check-in or integration tests.
    • MySQL Database from ClearDB adds the power of MySQL to your Web Apps. With ClearDB MySQL you can deploy more kinds of web apps and CMS solutions such as WordPress, Joomla, Acquia Drupal, phpBB, and more.
    • Application Insights provides a 360° view across availability, performance and usage of your ASP.NET services and mobile applications for Windows Phone, iOS and Android platforms. Search and analyze your data to continuously improve your application, prioritize future investments and improve overall customer experience.
    • Visual Studio Online is the fastest and easiest way yet to plan, build, and ship software across a variety of platforms. Get up and running in minutes on our cloud infrastructure without having to install or configure a single server.

    To learn more about Azure, see What is Azure?

    First time Set-Up

    The first time you deploy an image using VM Depot Easy Deploy, you will need to provide your Azure subscription details. This information can be readily obtained from Azure once your subscription has been activated. Your browser will cache this information locally, which means that while your first use of the Easy Deploy feature is simple, subsequent uses will be even easier. Note, however, that because the details are stored in your browser you will need to reconfigure them on each browser you use. If, for any reason, the authentication information contained within these settings is invalidated, the Easy Deploy web tool will prompt you to update it. You can also remove these settings from your browser cache, should you wish to do so (see Advanced Configuration below).

    The Deployment Process

    For this exercise, we have elected to use Minecraft (distribution Ubuntu). You may find that many other Minecraft images are available on VM Depot. The availability of both newer and older versions may be useful for testing purposes. The deployment process for any Virtual Machine image is the same regardless of which one you select.

    Once you have identified a VM Depot image that you would like to deploy, click "Create Virtual Machine".

    Screenshot of VM Depot displaying the search results page

    Set-Up VM Depot Easy Deploy

    If you have previously provided your publish settings file, you will bypass this step and be directed straight to the deployment configuration page.

    As discussed above, the first time you use the Easy Deploy feature, you will need to provide your Azure Subscription profile settings. Simply drag and drop the appropriate file (download it from Azure) onto this page (or click to browse to the file). A link to download your publish settings file will also be conveniently provided on this page should you need to re-download it at any point in the future.

    Drag and Drop, or Browse to, Azure Publish Settings.

    Configure Your Virtual Machine

    On the deployment configuration page, you will be required to, at a minimum, provide a password for the default "azureuser" account and accept the terms of use. You can also see (and change) values for the DNS name and username. In addition, the default VM name (the name used in Azure to manage the image), VM Size and deployment region, are displayed. If you want to change these simply click them.

    There is minimal information that you must provide to configure your VM

    An important item to call to your attention: the username specified here is not the username for the Minecraft application; it is the default user on the VM. This means that you will use this username to login to the virtual machine itself. The image description on VM Depot should contain any required usernames and passwords.

    You may also click on the “Advanced” link to access additional configuration options. More information about these options is detailed in the next section of this tutorial.

    Virtual Machine Creation

    After you have read and accepted the terms and conditions, you should click “Create Virtual Machine” to schedule the creation of your VM. Doing so will generate the following confirmation page, which also includes status information. Once you arrive at this page, you have completed all the deployment steps. An email will notify you when your VM is ready to be used (usually within 5-15 minutes).

    The confirmation page shows a handy progress bar and descriptive status message.

    For the curious, the confirmation page shows real-time status updates pertaining to your deployment. You may return to this page at any time by selecting "VM Deploy Status" within the "My Account" sidebar menu on the left side of the screen (only available while you are logged in). This page will tell you the current status of your VM both during and after deployment; later, you can also use it to review which VMs are currently running.

    Changing Password

    To change the default password, you need to SSH into your server.

    You can get the SSH details from the Azure Portal; look on the right-hand side under 'quick glance'.


    SSH (port 22)

    The default username is "mineadmin" and the password is "~1qaz2wsx" (it's better to change it as soon as possible; once logged in, run the standard Linux passwd command to set a new password).

    Starting Minecraft Server

    SSH into your server

    Navigate to the Minecraft folder

    cd minecraft

    Start the Services by typing

    mark2 start

    You can now connect to your Azure Minecraft server from the Minecraft client using your VM's DNS name (by default a Minecraft server listens on port 25565).

    So how much does it cost to run?

    Using the Azure pricing calculator and the specification of the default settings, here are your costs.


    The Azure pricing calculator is a great online tool which allows you to forecast expected costs per month based on the number of virtual machines. The costs above are for running a server 24 hours per day, 5 days per week.

    For a school lab with 6 hours of lessons per week, so around 24 hours per month, the cost would be about £1.50.
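    The arithmetic behind that £1.50 figure can be sketched as follows. Note the hourly rate used here is inferred from the article's own monthly figure, not an official Azure price; always check the Azure pricing calculator for current rates.

```shell
# Back-of-the-envelope costing for the school lab scenario.
# The £0.0625/hour rate is an assumption derived from the quoted £1.50/month;
# real prices come from the Azure pricing calculator.
HOURS_PER_WEEK=6
WEEKS_PER_MONTH=4
HOURS=$((HOURS_PER_WEEK * WEEKS_PER_MONTH))
echo "VM hours per month: $HOURS"
awk -v h="$HOURS" 'BEGIN { printf "Monthly cost at £0.0625/hour: £%.2f\n", h * 0.0625 }'
```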


    Advanced Configuration Options

    As noted above, you may also take advantage of several advanced options. You can change the default settings before launching your image by clicking on the “Advanced” link on the configuration page. The screenshot below illustrates the available options:

    The advanced configuration page provides a number of additional options to configure your VM (text provides full details).

    Most users will only have one subscription associated with their account. If you have a Microsoft account with multiple Azure subscriptions enabled, each will be listed within the "Azure Subscription" dropdown at the bottom of this section. If you plan to change this, it is advisable to do so first, as many of the other options depend on the selection here. If you wish to use a subscription not listed, then select the "Use another Azure Profile" link to remove the current settings from the cache and provide a new Azure Publish Settings file.

    The “VM Name” field defines the name assigned to the VM in Azure. The portal and other management tools use this value to identify the VM to the user. The default names provided may not be very descriptive, but they are often sufficient for a small number of VMs. For those who wish to do so, you may configure a more memorable VM name here.

    “VM Size” refers to the size of the Virtual Machine you wish to create, in terms of processor, memory, disk size and maximum IOPS. Descriptions of the codes used within this field are available in the Azure documentation on Virtual Machine Sizes for Azure.

    The “Region” field specifies which data-center you wish to deploy to. The dropdown contains all valid regions for your deployment.

    The default “Storage Account” is the first attached to your subscription. If you prefer to use a different account, simply select it from the options provided within the dropdown menu. Alternatively, you may create a new storage account using the text box provided.

    Lastly, VM Depot Easy Deploy will automatically open any “Endpoints” defined by the publisher as being necessary. In the majority of cases, this setting will remain unchanged. If you have specific requirements, you have the option to add or remove endpoints, as needed.



    Using the VM Depot Easy Deploy feature will enable you to deploy a community provided Virtual Machine image, such as Minecraft, in minutes.

    Once your virtual machine has started, you can download the Minecraft client, connect to your new world server and build to your heart's content.


    Further Reading

    For more information on Minecraft, see the official Minecraft website.

  • Microsoft UK Faculty Connection

    Unity3D Version 5.2 – now with Windows UWP and Visual Studio Support


    Today is an exciting day for cross-platform game developers using Unity.


    Unity 5.2 brings you Windows 10 and Universal Windows Platform (UWP) build options.

    A UWP app can run on any Windows-based device, including Windows Phone, Xbox One, and Windows 10 PCs and tablets!

    See the Windows Dev Center for more details on game development on Windows 10, and if you're interested in cross-play for your title between Xbox and PC, see the Xbox developer resources.

    Native Visual Studio Integration


    Unity 5.2 comes with a much tighter Visual Studio integration for a vastly improved coding and debugging experience on Windows machines.


    The Unity installer will offer to install Visual Studio Community 2015 and Visual Studio Tools for Unity (formerly known as UnityVS). Everything just works out of the box!

    The full low-down

    Check out the Unity 5.2 Release Notes to find out what else is new. We’ve added multiscene lightmap baking, support for 3ds Max’s biped rig, Occlusion Culling improvements and more…

  • Microsoft UK Faculty Connection

    Continuous Integration and testing using Visual Studio Online




    Both Visual Studio Online and Team Foundation Server 2015 make it easy to automate continuous integration.

    There is a quick video which shows the continuous integration workflow and a DevOps walkthrough using Visual Studio 2015.

    For the purpose of this blog I am going to walk you through an example of using Visual Studio Online (VSO) with an existing Git repository, and then look at some best practices for setting up testing and deployments.

    Preliminary requirements

    Set up Visual Studio Online via DreamSpark. Visual Studio Online is the fastest and easiest way yet to plan, build, and ship software across a variety of platforms. Get up and running in minutes on our cloud infrastructure without having to install or configure a single server.

    Using Visual Studio Online and Git

    1. Create the Team Project and Initialize the Remote Git Repo
    2. Open the Project in Visual Studio, Clone the Git Repo and Create the Solution
    3. Create the Build Definition
    4. Enable Continuous Integration, Trigger a Build, and Deploy the Build Artifacts
    5. Deploying the build artefacts to our web application host server

    Getting Started

    1. Create the Team Project and Initialize the Remote Git Repo

    Create a new team project by logging in to VSO, going to the home page, and clicking on the New… link.


    Enter a project name and description. Choose a process template.

    Select Git version control, and click on the Create Project button.


    The project is created. Click on the Navigate to project button.


    The team project home page is displayed.

    We now need to initialize the Git repo.

    Navigate to the CODE page, and click on Create a ReadMe file.

    The repo is initialized and a Master branch created.

    For simplicity I will be setting up the continuous integration on this branch.


    Below shows the initialized master branch, complete with the ReadMe file.


    2. Open the Project in Visual Studio, Clone the Git Repo and Create the Solution

    Next we want to open the project in Visual Studio and clone the repo to create a local copy.

    Navigate to the team project’s Home page, and click on the Open in Visual Studio link.


    Visual Studio opens with a connection established to the team project.

    On the Team Explorer window, enter a path for the local repo, and click on the Clone button.


    Now click on the New… link to create a new solution.


    Select the ASP.NET Web Application project template, enter a project name, and click on OK.


    Choose the ASP.NET 5 Preview Web Application template and click on OK.


    Now add a unit test project by right-clicking on the solution in the solution explorer, selecting the Add New Project option, and choosing the Unit Test Project template. I have named my test project CITest.Tests.

    Your solution should now look like this.


    The UnitTest1 test class is generated for us, with a single test method, TestMethod1. TestMethod1 will pass as it has no implementation.

    Add a second test method, TestMethod2, with an Assert.Fail statement. This second method will fail, which will indicate that the CI test runner has been successful in finding and running the tests.

    using System;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    namespace CITest.Tests
    {
        [TestClass]
        public class UnitTest1
        {
            [TestMethod]
            public void TestMethod1()
            {
            }

            [TestMethod]
            public void TestMethod2()
            {
                Assert.Fail("failing a test");
            }
        }
    }

    Save the change, and build the solution.

    We now want to commit the solution to the local repo and push from the local to the remote. To do this, select the Changes page in the Team Explorer window, add a commit comment, and select the Commit and Push option.
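    For command-line users, Team Explorer's Commit and Push action corresponds to ordinary git commands. The sketch below demonstrates them in a throwaway local repository; the push itself is shown commented out, since it needs the real VSO remote.

```shell
# What Team Explorer's "Commit and Push" does, expressed as git commands.
# Demonstrated in a throwaway local repo; the file here is a placeholder.
git init ci-demo
cd ci-demo
git config user.email "demo@example.com"
git config user.name "Demo User"
echo "// solution files would live here" > placeholder.cs
git add .
git commit -m "Add web application and test project"
# git push origin master   # pushes the local commits to the remote VSO repo
git log --oneline
```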


    The master branch of the remote Git repo now contains a solution, comprising a web application and a test project.

    3. Create a Build Definition

    We now want to create a VSO build definition.

    Navigate to the team project’s BUILD page, and click on the + button to create a new build definition.


    Select the Visual Studio template and click on OK.


    The Visual Studio build definition template has 4 build steps –

    1. Visual Studio Build – builds the solution
    2. Visual Studio Test – runs tests
    3. Index Sources & Publish Symbols – indexes the source code and publishes symbols to .pdb files
    4. Publish Build Artifacts – publishes build artifacts (dlls, pdbs, and xml documentation files)

    For now accept the defaults by clicking on the Save link and choosing a name for the build definition.


    We now want to test the build definition. Click on the Queue build… link.


    Click on the OK button to accept the build defaults.


    We are taken to the build explorer. The build is queued, and once it is running we will see the build output.


    The build has failed on the Build Solution step, with the following error message –

    The Dnx Runtime package needs to be installed.

    The reason for the error is that we’re using the hosted build pool and so we need to install the DNX runtime that our solution targets prior to building the solution.

    Return to Visual Studio and add a new file to the solution items folder. Name the file Prebuild.ps1, and copy the following PowerShell script into the file.

    # bootstrap DNVM (the dnvminstall.ps1 URL goes in the DownloadString call)
    &{ iex ((New-Object Net.WebClient).DownloadString('')) }

    # load up the global.json so we can find the DNX version
    $globalJson = Get-Content -Path $PSScriptRoot\global.json -Raw -ErrorAction Ignore | ConvertFrom-Json -ErrorAction Ignore

    if ($globalJson)
    {
        $dnxVersion = $globalJson.sdk.version
    }
    else
    {
        Write-Warning "Unable to locate global.json to determine the DNX version; using 'latest'"
        $dnxVersion = "latest"
    }

    # install DNX
    # only installs the default (x86, clr) runtime of the framework.
    # If you need additional architectures or runtimes you should add additional calls
    # ex: & $env:USERPROFILE\.dnx\bin\dnvm install $dnxVersion -r coreclr
    & $env:USERPROFILE\.dnx\bin\dnvm install $dnxVersion -Persistent

    # run DNU restore on all project.json files in the src folder, using 2>&1 to redirect stderr to stdout for badly behaved tools
    Get-ChildItem -Path $PSScriptRoot\src -Filter project.json -Recurse | ForEach-Object { & dnu restore $_.FullName 2>&1 }

    The script bootstraps DNVM, determines the target DNX version from the solution’s global.json file, installs DNX, and then restores the project dependencies included in all the solution’s project.json files.

    With the Prebuild.ps1 file added, your solution should now look like this.


    Commit the changes to the local repo and push them to the remote.

    We now need to add a Powershell build step to our build definition.

    Return to VSO, and edit the build definition. Click on the + add build step… link and add a new PowerShell build step.


    Drag the PowerShell script task to the top of the build steps list, so that it is the first step to run. Click on the Script filename ellipses and select the Prebuild.ps1 file.

    Click on Save and then Queue build… to test the build definition.


    This time all build steps succeed.


    However, if we look more closely at the output from the Test step, we see a warning – No results found to publish. But we added 2 test methods to the solution?

    The clue is in the second “Executing” statement which shows that the vstest.console was executed for 2 test files – CITest.Tests.dll, which is good. And Microsoft.VisualStudio.QualityTools.UnitTestFramework.dll, which is bad.


    We need to modify the Test build step to exclude the UnitTestFramework.dll file.

    Edit the build definition, select the Test step, and change the Test Assembly path from **\$(BuildConfiguration)\*test*.dll;-:**\obj\** to **\$(BuildConfiguration)\*tests.dll;-:**\obj\**.
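    The effect of that filter change can be illustrated with plain shell globbing. One caveat: the VSO filter is case-insensitive, while shell globs are case-sensitive, so the capitalised patterns below stand in for the lowercase ones in the build step; the two assembly names are from the walkthrough above.

```shell
# Illustrate why *test*.dll also picked up the unit test framework assembly,
# while *Tests.dll only matches our own test project. Shell globs are
# case-sensitive, hence the capitalised stand-in patterns.
matches() {  # usage: matches PATTERN FILENAME
  case "$2" in
    $1) echo "$2 matches $1" ;;
    *)  echo "$2 does not match $1" ;;
  esac
}
matches '*Test*.dll' 'CITest.Tests.dll'
matches '*Test*.dll' 'Microsoft.VisualStudio.QualityTools.UnitTestFramework.dll'
matches '*Tests.dll' 'CITest.Tests.dll'
matches '*Tests.dll' 'Microsoft.VisualStudio.QualityTools.UnitTestFramework.dll'
```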

    Click on Save and then click on Queue Build…


    The build now fails. But this is what we want to happen. TestMethod2 contains an Assert.Fail() statement, and so we are forcing the Test build step to fail as shown below. We have successfully failed (not something I get to say often), hence proving that the tests are being correctly run.


    4. Enable Continuous Integration, Trigger a Build, and Deploy the Build Artifacts

    We have a working Pre-build step that downloads the target DNX framework, a working Build step that builds the solution, and a working Test step that fails due to TestMethod2.

    We will now set up continuous integration, and then make a change to the UnitTest1 class in order to remove (fix) TestMethod2. We will then commit and push the change, which should trigger a successful build thanks to the continuous integration.

    Edit the build definition, and navigate to the Triggers tab. Check the Continuous Integration (CI) check-box and click on Save.


    Edit the UnitTest1.cs file in Visual Studio, and delete the TestMethod2 method. Commit and push the changes.

    Return to VSO and navigate to the BUILD page. In the Queued list we should now see a build waiting to be processed, which it will be in due course.


    All build steps should now succeed.

    The target DNX version is installed onto the build host. The solution is built. The tests are run. The symbol files are generated. And finally, the build artifacts are published.

    So we have a new build that has been tested.

    5. Deploying the build artifacts to our web application host server

    If we are hosting our web application on Windows Azure, we can add an Azure Web Application Deployment step to our build definition and in so doing have the build artifacts automatically deployed to Azure when our application is successfully built and tested.

    Alternatively, we can manually download the build artifacts and copy them to our chosen hosting server. To do this, navigate to the completed build queue and open the build. Then click on the Artifacts tab and click on the Download link. A .zip file containing the artifacts will be downloaded.

    Test, Test, Test

    So we now have the site built using continuous deployment. Now let's look at how we can do testing.


    A prerequisite for executing build definitions is to have your build agent ready. Here are the steps to set up your build agent; you can find more details in this blog.

    Create a build definition and select the “Visual Studio” template.


    Selecting the Visual Studio template will automatically add a Build task and a Unit Test task. Please fill in the parameters needed for each of the tasks. The Build task is straightforward: it just takes the solution that has to be built and the configuration parameters. As I mentioned earlier, this solution contains product code, unit test code, and also the automated Selenium tests that we want to run as part of build validation.


    The final step is to add the required parameters for the Unit Test task: the Test Assembly and the Test Filter criteria. One key thing to notice below in this task is that we take the unit test DLL, enumerate all the tests in it, and run them automatically. You can include a test filter criteria and filter on traits defined in test cases if you want to execute specific tests. Another important point: unit tests in the Visual Studio Test task always run on the build machine and do not require any deployment or additional setup. See figure 3 below.


    Using Visual Studio Online for Test Management

    1. Setting up machines for application deployment and running tests
    2. Configuring for application deployment and testing
    3. Deploying the Web Site using Powershell
    4. Copy Test Code to the Test Machines
    5. Deploy Visual Studio Test Agent
    6. Run Tests on the remote Machines
    7. Queue the build, execute tests and test run analysis
    8. Configuring for Continuous Integration

    Getting Started

    1. Setting up machines for application deployment and running tests

    Once the Build is done and the Unit tests have passed, the next step is to deploy the application (website) and run functional tests.

    Prerequisites for this are:

    1. An already provisioned and configured Windows Server 2012 R2 machine with IIS on which to deploy the web site, or a Microsoft Azure Website.
    2. A set of machines with all browsers (Chrome, Firefox, and IE) installed, on which the Selenium tests will run automatically.

    Please make sure PowerShell Remoting is enabled on all the machines.
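    If remoting is not already on, it can be enabled from an elevated PowerShell prompt on each machine. A minimal sketch is shown below; the TrustedHosts setting is a lab-only convenience, not a production recommendation.

    ```powershell
    # Run in an elevated PowerShell session on each target machine.
    Enable-PSRemoting -Force

    # Lab-only convenience: trust all hosts for WinRM client connections.
    # In production, list specific machine names instead of '*'.
    Set-Item WSMan:\localhost\Client\TrustedHosts -Value '*' -Force
    ```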

    Once the machines are ready, go to the Test Hub -> Machine page to create the required machine configuration as shown in the screenshots below.

    Enter a machine group name and the FQDN/IP address of the IIS/web server machine that was set up earlier. You will also need to enter the admin username and password for the machine for all further configurations. The application under test should always be a test environment, not a production environment, as we are running integration tests targeting the build.


    For the test environment, give the environment a name and add the IP addresses of all the lab machines that were already set up with the browsers. As I mentioned earlier, the test automation system is capable of executing all tests in a distributed way and can scale up to any number of machines (we will cover this in another blog).

    At the end of this step, the Machines hub should contain one application-under-test environment and one test environment; in this example the machine groups are named “Application Under Test” and “Test Machines” respectively.

    2. Configuring for application deployment and testing

    In this section, we will show you how to add a deployment task for deploying the application to the web server, and remote test execution tasks to execute integration tests on remote machines.

    We will use the same build definition and enhance it to add the following steps for continuous integration:

    3. Deploying the Web Site using Powershell

    We first need to copy all the website files to the destination. Click on “Add build step”, add the “Windows Machine File Copy” task, and fill in the required details for copying the files. Then add the “Run PowerShell on Target Machines” task to the definition for deploying and configuring the application environment. Choose “Application Under Test”, the machine group we set up earlier, so that the web application is deployed to the web server. Choose a PowerShell script for deploying the website (if you do not have a deployment web project, create one), and make sure to include this script in the solution/project. This task executes the PowerShell script on the remote machine to set up the web site and perform any additional steps the website needs.
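    As an illustration, a minimal deployment script of this kind might look like the sketch below. This is an assumption-laden sketch, not the task's own script: the site name, port, and paths are invented for the example, and your script will depend on how your website is packaged.

    ```powershell
    # Hypothetical Deploy-WebSite.ps1, run on the target machine by the
    # "Run PowerShell on Target Machines" task. All names are examples.
    param(
        [string]$SiteName     = "CISampleSite",
        [string]$PhysicalPath = "C:\inetpub\CISampleSite",
        [int]   $Port         = 8080
    )

    Import-Module WebAdministration   # IIS administration cmdlets

    # Make sure the content directory exists, then (re)create the IIS site.
    New-Item -ItemType Directory -Path $PhysicalPath -Force | Out-Null
    if (Get-Website -Name $SiteName) {
        Remove-Website -Name $SiteName
    }
    New-Website -Name $SiteName -Port $Port -PhysicalPath $PhysicalPath
    ```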




    4. Copy Test Code to the Test Machines

    As the Selenium UI tests, which we are going to use as integration tests, are also built as part of the Build task, add the “Copy Files” task to the definition to copy all the test files to the “Test Machines” machine group which was configured earlier. You can choose any test destination directory; in the example below it is “C:\Tests”.



    5. Deploy Visual Studio Test Agent

    To execute tests on remote machines, you first deploy and configure the test agent. To do that, all you need is a task where you supply the remote machine information. Setting up lab machines is as easy as adding a single task to the workflow. This task will deploy the Test Agent to all the machines and configure them automatically for the automation run. If the agent is already available and configured on the machines, this task is a no-op.

    Unlike older versions of Visual Studio, you no longer need to manually copy and set up the test controller and test agents on all the lab machines. This is a significant improvement, as all the tasks can be done remotely and easily.



    6. Run Tests on the remote Machines

    Now that the entire lab setup is complete, the last task is to add the “Run Visual Studio Tests using Test Agent” task to actually run the tests. In this task, specify the test assembly information and a test filter criteria for the tests to execute. As part of build verification we want to run only P0 Selenium tests, so we will filter the assemblies using SeleniumTests*.dll as the test assembly.
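    For reference, on each test machine this is roughly equivalent to invoking the test runner with a test case filter; the DLL name and path below are illustrative only, as the task constructs the real command line itself.

    ```powershell
    # Illustrative only: run priority-0 tests from a Selenium test assembly.
    & vstest.console.exe C:\Tests\SeleniumTests.Web.dll /TestCaseFilter:"Priority=0"
    ```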

    You can include a runsettings file with your tests and pass any test run parameters as input. In the example below, we are passing the deployment location of the app to the tests using the $(addurl) variable.
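    A minimal .runsettings file for passing such a parameter might look like the fragment below. The parameter name webAppUrl and its value are assumptions for illustration; your tests read the value through the test framework's run parameters.

    ```xml
    <?xml version="1.0" encoding="utf-8"?>
    <RunSettings>
      <!-- The value can be overridden from the build, e.g. with $(addurl). -->
      <TestRunParameters>
        <Parameter name="webAppUrl" value="http://localhost:8080/" />
      </TestRunParameters>
    </RunSettings>
    ```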



    Once all the tasks are added and configured, save the build definition.

    7. Queue the build, execute tests and test run analysis

    Now that the entire set of tasks is configured, you can verify the run by queuing the build definition. Before queuing the build, make sure that the build machine and the test machine pool are set up.

    Once the build definition execution is complete, you will get a great build summary with all the required information needed for you to take the next steps.

    As per the scenario, we have completed the build, executed unit tests and also ran Selenium Integration Tests on remote machines targeting different browsers. 

    Build Summary has the following information:


    • A summary of the steps that have passed, color-coded on the left, with details in the right-side panel.
    • You can click on each task to see detailed logs.
    • From the tests results, you can see that all unit tests passed and there were failures in the integration tests.



    Next step is to drill down and understand the failures. You can simply click on the Test Results link in the build summary to navigate to the test run results.

    Based on feedback, we have created a Test Run summary page with a set of default charts and a mechanism to drill down into the results. The default summary page has the following built-in charts readily available: overall tests pass/fail, tests by priority, configuration, failure type, etc.



    If you want to drill deeper into the tests, click on the “Test Results” tab, where you can see each and every test: test title, configuration, owner, the machine where it was executed, etc.

    For each failed test, you can click on “Update Analysis” to analyze the test. In the summary below you can see that the IE Selenium tests are failing. You can click directly on the “Create Bug” link at the top to file bugs; it automatically takes all the test-related metadata from the results and includes it in the bug, which is very convenient.



    8. Configuring for Continuous Integration

    Now that the tests have all been investigated and bugs filed, you can configure the above build definition for Continuous Integration to run the build, unit tests, and key integration tests automatically for every subsequent check-in. Navigate to the build definition and click on Triggers.

    You have two ways to configure:


    • Select “Continuous Integration” to execute the workflow for all batched check-ins
    • Select a specific schedule for validating the quality after all changes are done.

    You can also choose both, as shown below; the daily scheduled build can then be used for other subsequent validations and for consumption by partners.


    Using the above definition, you are now set up for “Continuous Integration” of the product, automatically building, running unit tests, and running key integration tests to validate the builds. All the tasks shown above can be used in a Release Management workflow as well to enable Continuous Delivery scenarios.


    To summarize what we have achieved in this walkthrough:

    1. Created a simple build definition with build, unit testing and automated tests
    2. Used the simplified experience for configuring machines and test agents
    3. Seen the improvements in the build summary and test run analysis
    4. Configured the build definition for continuous integration on every check-in
  • Microsoft UK Faculty Connection

    What is Continuous Integration–GitHub, WebSite Deployment Example



    Continuous Integration is a software development practice where members of a team integrate their work frequently, usually each person integrates at least daily - leading to multiple integrations per day. Each integration is verified by an automated build (including test) to detect integration errors as quickly as possible.

    Many teams find that this approach leads to significantly reduced integration problems and allows a team to develop cohesive software more rapidly. 

    Learning more about Continuous Integration

    For an introduction, see Paul Duvall's appropriately titled book on the subject.

    For the broader process built around Continuous Integration, take a look at Jez Humble and Dave Farley's book.

    Why use Continuous Integration?

    Projects with Continuous Integration tend to have dramatically fewer bugs, both in production and in process. However, I should stress that the degree of this benefit is directly tied to how good your test suite is. As students and academics you get access to Visual Studio Online, and you can learn more about it in the Fundamentals of Visual Studio Online course via Microsoft Virtual Academy.

    You should find that it's not too difficult to build a test suite that makes a noticeable difference. Usually, however, it takes a while before a team really gets to the low level of bugs that they have the potential to reach. Getting there means constantly working on and improving your tests.

    If you have continuous integration, it removes one of the biggest barriers to frequent deployment. Frequent deployment is valuable because it allows your users to get new features more rapidly, to give more rapid feedback on those features, and generally become more collaborative in the development cycle. This helps break down the barriers between customers and development.

    Continuous Integration - where do you start?

    Some quick best practices are:

    1. Get the build automated. Get everything you need into source control so that you can build the whole system with a single command. For many projects this is not a minor undertaking, yet it's essential for any of the other things to work. Initially you may only build occasionally on demand, or just do an automated nightly build. While these aren't continuous integration, an automated nightly build is a fine step on the way.

    2. Introduce some automated testing into your build. Try to identify the major areas where things go wrong and get automated tests to expose those failures. Particularly on an existing project it's hard to get a really good suite of tests going rapidly, as it takes time to build tests up. See the online video learning on Software Testing Fundamentals via Microsoft Virtual Academy.

    3. Try to speed up the commit build. Continuous Integration on a build of a few hours is better than nothing, but getting down to that magic ten minute number is much better.

    4. If you are starting a new project, begin with Continuous Integration from the beginning. Keep an eye on build times and take action as soon as you start going slower than the ten minute rule. By acting quickly you'll make the necessary restructurings before the code base gets so big that it becomes a major pain.

    5. Learn from others.  Like any new technique it's hard to introduce it when you don't know what the final result looks like so do as much background research and learning as possible.

    Example of setting up Continuous Integration to deploy a Web Site

    Here are the basic steps for getting started with continuous integration

    1. Setting up a GitHub repository
    2. Creating a new Web app in Azure
    3. Setting up continuous integration
    4. Updating and committing new code

    Step 1: Setting up a GitHub repository

    If you don’t currently have a GitHub account, you’ll want to set one up and then log into it to follow the rest of this walkthrough. If you have a brand-new account, or one without any repositories, your browser will look like this:

    Initial screen for new GitHub accounts.

    On this screen, you’ll want to click on the + New Repository button, which will create a new repository for your code in your GitHub account.

    Green + New Repository button for creating new repos.

    Once you’ve clicked the + New Repository button, you’ll see a new screen that allows you to enter a repository name and a description, select either public or private settings, and choose whether to initialize with a README (we’ll leave this unchecked for our walkthrough). Once you’re satisfied with your repo’s unique name and descriptive text, click on the Create repository button to create this new repo with no code in it.

    GitHub repository setup screen and options.

    Brand-new GitHub repository waiting for your code.

    We want to make sure that this repository works, so you’ll want to use your favorite Git client to commit code back to the repo you’ve created. There are many different Git clients available, such as GitHub for Windows, posh-git, or Git Bash. Create or use any file that you’d like (and don’t mind sharing publicly) and then use your Git client to commit and push your file into the repository.

    Example of using posh-git client command line to add a file.
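    The posh-git steps in the screenshot boil down to the standard Git commands below; the directory name, file contents, and remote URL are placeholders for your own.

    ```shell
    # Create a local repository, add a file, and commit it.
    git init mysite
    cd mysite
    echo '<h1>Hello, Azure</h1>' > default.html
    git add default.html
    git commit -m "Add default.html"

    # Point the repo at GitHub and push (substitute your account and repo).
    # git remote add origin https://github.com/<your-account>/<your-repo>.git
    # git push -u origin master
    ```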

    Back in your web browser, refresh to see the newly-added file in your repository.

    The default.html file added via the Git client is now visible in the repository.

    Step 2: Creating a new Web app in Azure


    We’ve confirmed that our GitHub repository works, since we can commit and push changes, so now we can work on creating our Web app in Microsoft Azure.

    In your favorite browser, open the Azure website, log in, and then open the Azure Portal.

    Access your account either from the Account or Sign In buttons on the Microsoft Azure site.

    Access the Azure Portal.

    The main Azure Portal, a convenient dashboard for checking your Azure project status.

    From the main Azure Portal, in the lower right-hand corner, click on NEW to start creating a new web app.

    After selecting the NEW button, you’ll be offered a menu full of different things you can create on Azure, but for now, focus on Web+Mobile, and then from the Web+Mobile menu, choose Web app.

    Menus for selecting and creating a new Web app.

    Once you select Web app, a new menu will open up to the right side. You’ll want to enter a URL for your project, and choose to name and create a New AppService Plan. Keep the pricing on F1 Free for now, and create a New Resource Group. Azure will let you know if names are taken, so keep adjusting your naming if it turns out a name you’d like is unavailable. Keep the location used for your cloud development on South Central US for now, though in future projects you’ll be able to customize this if you’d like. Leave the checkbox for Add to Startboard checked, so that it’s easier to access this work in the near future.

    Once everything has been entered to your liking, hit the Create button.

    Completing the details for the new Web app.

    New tile depicting Web app creation in-process.

    Microsoft Azure will work on setting everything up and initial creation of your Web app, and then the tile will appear on your Startboard once it’s ready.

    Click on your web app’s tile to open up the related display, which gives you information about your web app’s live performance. 

    Web app information blade.

    Step 3: Setting up continuous integration

    Now that the app is ready for your work, you’ll want to connect it with your GitHub repository so that changes are synced across both. The first thing to do is go into Settings.

    Settings menu icon available from the Web app’s main page.

    This opens up a new blade with all of the settings available, and we want to select Deployment credentials. This will open up another new blade next to the Settings blade, called Set deployment credentials. Fill in the appropriate username and password for your deployments to use.

    Blades for Deployment credentials-related settings.

    Now that you’ve set that up, go back to the main blade for your web app and scroll down, far below the Monitoring graph. Toward the bottom, you’ll find a section for Deployment, and a tile for Set up continuous deployment. Go ahead and click on that box and we’ll work on the main setup for continuous integration with GitHub.

    Click the Set up continuous deployment tile to proceed.

    The Continuous Deployment menu blade will open up, so Choose Source, and then from that blade, select GitHub from the list of providers. 

    Select GitHub as your source for this Web app.

    After you’ve selected GitHub as your source, you may be prompted to authorize your GitHub account to enable the connection between GitHub and your Azure web app. Log in with your GitHub credentials (same as those used to create your repository), and then once you’ve authorized the connection between Azure and GitHub, you’ll be able to select the repository you want to integrate.

    Your Continuous Deployment blade will update with the chosen source and authorization name, and then you will need to choose the GitHub repository to integrate with under Choose Project.

    Once you’ve selected your repo, click the OK button in the Continuous Deployment blade.

    Select your repository from the Choose Project blade, click OK in Continuous Deployment.

    After you click OK, your code is fetched from the specified branch in your GitHub repository and deployed to Azure automatically. 

    Fetching code in progress.

    The deployment status will update on the Azure portal’s Deployments blade as soon as the deployment completes, and then you can see the Active Deployment under the portal blade.

    Successful code deployment, and updated Azure information.

    Now click on the Browse button in the Web app blade’s toolbar to see your code running in your Azure Web app, in your browser.

    Click Browse to see your code in action.

    Updated site visible in-browser.

    Step 4: Updating and committing new code

    Now that we’ve confirmed successful deployment of your code, we can work on updating it and confirming that those changes are pushed forward.

    Open the GitHub repository where you committed your code, and then open one of those committed files using the Edit this file toolbar icon.

    In the upper-right corner, click on Edit this file to start making changes on GitHub.

    In our example, we’ll add a new paragraph via GitHub, below our header.

    New paragraph added to the file on GitHub.

    Once you’re satisfied with your code updates, you’ll want to consider adding a commit message explaining the changes. Below the editor area, there’s a Commit changes section where coders can either take the default message, or provide a more detailed explanation to help give context to the changes. When you finish updating the commit message, go ahead and click on the Commit changes button to save your work to the repository.

    Main commit changes section on GitHub.

    Once your code is committed to your repo, continuous integration kicks in and the code on GitHub will be pulled into your Azure Web app and redeployed automatically. If you open up and watch the Azure Portal again after your GitHub commit, the change is quickly pulled in and deployed to your live site.

    Latest changes appear in Azure after GitHub repository’s code is updated.

    Now you can refresh your browser to see your changes live on your Azure web app.

    Updated version of the Web app in-browser

    Advantages and Disadvantages of Continuous Integration

    These are some of the advantages of continuous integration:

    • You catch build breaks early on.
    • In a distributed development environment where developers do not always communicate with one another, continuous integration is a great way to assure developers that the build they are working against is the latest one.
    • Continuous integration also causes fewer regressions.
    • The feedback loop is smaller.
    • A developer does not have to wait for the end of the day or week to find out how the check-in affected the build.
    • Integration testing moves up in the chain.
    • Every check-in goes through the integration testing where problems are caught early.
    • Continuous integration enforces better development processes.
    • Each developer is held accountable.
    • You always have a latest-and-greatest build to use in demos, showcases, etc.

    On the other hand, there are some disadvantages:

    • Maintenance overhead often increases.
    • Some teams find that the level of discipline required for continuous integration causes bottlenecks.

      This often requires a shift in the developer mindset.

    • The immediate impact of a check-in often causes a backup because programmers cannot check in partially completed code.

    In my next blog I will look at continuous integration using Microsoft Visual Studio and Visual Studio Online.

  • Microsoft UK Faculty Connection

    Microsoft Architectural Patterns and Practices–BluePrint Examples and Resources for Microsoft Azure



    The Microsoft blueprint resources illustrate real-world scenarios that use Microsoft Azure to manage and monitor data flows.


    These examples range from the very small to larger enterprise systems, leveraging Azure Big Compute, Big Data, and infrastructure services including Batch, HDInsight, Machine Learning, Virtual Machines, Virtual Network, and more to build cost-effective solutions.

    To build your next solution, check out the collection of scenario-based Microsoft Architectural Blueprints and additional resources in the Patterns & Practices GitHub repository, which includes source code and machine configurations.

    Azure in education

    Using Azure in your research or in teaching a course? Microsoft is committed to supporting education and has various programs to meet your needs.


    Empower faculty to leverage Microsoft Azure in teaching cutting edge courses

    See all services

    The Educator Grant is a program designed specifically to provide access to Microsoft Azure to college and university professors teaching advanced courses. As part of the program, faculty teaching Azure in their curricula are awarded subscriptions to support their course.

    To apply for an Educator Grant fill out this simple application form.

    Apply now


    Accelerate the speed of scientific discovery with Microsoft Azure

    The Microsoft Azure for Research program accelerates scholarly and scientific research by enabling academic, government, and industry researchers to use big data computations, collaboration, and data-intensive processing in the cloud. Take full advantage of the power and scalability of Microsoft Azure, a platform that supports frameworks like Azure Machine Learning and programming tools including Linux, Python, Java, Hadoop, and Microsoft .NET. Get access to a variety of tools and resources to maximize the benefits of cloud computing through the following:

    • Free access to Azure cloud computing and storage (submit proposals for Azure for Research Awards)
    • Training classes and webinars
    • Technical resources and support
    • Community discussion on LinkedIn (Microsoft Azure for Research group) and Twitter (@Azure4Research)

    For more information, visit
