random bits, rants and raves

  • allen@msft

    Remote Test Execution using Team Foundation Server 2015 RC2 and beyond



The Agents for Visual Studio suite, containing the test controller and agents, allowed you to execute your tests on a set of remote machines by providing the right configuration in your .testsettings file. Based on the feedback we have received, we are providing a new and simpler way to do your remote automated testing starting with Team Foundation Server (TFS) 2015 RC2.

In this blog post we will quickly set up a test rig for remote test execution using TFS 2015 RC2.



Prerequisites

1. A TFS 2015 RC2 installation with a Build vNext agent configured on a machine with internet access.
2. A (virtual) machine to be used for testing, with the following requirements


    Step-By-Step Guide

Head to your TFS homepage and navigate to the team project you want to work with. The slides below will guide you through the entire scenario.



So what has changed?

There have been a lot of changes to get the stability and resiliency that you demanded in test execution. To summarize the top few improvements:

1. The test controller has been removed as the intermediate hop; the agents now communicate directly with the TFS server over HTTP.
2. You can now run tests using third-party test frameworks like NUnit/xUnit/C++ with the relevant unit test adapter, similar to Visual Studio.
    3. Automated installation and configuration of your test agent on your test machines as a part of the Build workflow using our tasks in Build vNext.
    4. Improved reporting for automated runs in Test Management.


What are our limitations?

1. We do distribution on a per-test-container basis, i.e. if you have 2 machines and 2 test DLLs, we will execute all tests from testdll1 on machine1 and all tests from testdll2 on machine2.
2. We don’t have support for workgroup or isolated machines as yet.
3. We do not support Azure VMs as machines in the machine group used for testing.
4. Yes, the scenario can be used in VSO as well. However, we do not support hosted build pools, i.e. you would have to create a new build pool and register a new build agent with it.



    Why do we have a test agent and a build agent? Why can’t there just be a common agent?

    The build agent does provide you with unit testing facilities but only after you install Visual Studio on your build machine. The test agent has no such prerequisite. This is the sole reason why we have two agents instead of one. We are definitely looking at ways in which we no longer need to make such a distinction.

    Will I be able to run my java tests on my remote Linux/OSX machines using the test agent?

    Today the cross platform functionality is available in Build vNext. We are looking for the right data points before we build the remote testing scenario for other platforms.


We hope that this post helps you get started with the new remote test execution workflow on your on-prem TFS installation. Try it out and let us know what you think.


  • allen@msft

Important timeout tweaks for controller/agent communication


Depending on your network, you may run into test run failures associated with agent/controller communication. This primarily happens due to network latency, high data transfer between the controller and agent, or high CPU utilization on the agent. In such cases you can tweak some of the timeouts to improve the resiliency of the system.

Increase the agent timeout settings on the controller:

[Controller machine] In QTController.exe.config, located at <drive letter>:\Program Files (x86)\Microsoft Visual Studio <Visual Studio Version>\Common7\IDE\:

        <add key="AgentConnectionTimeoutInSeconds" value="120"/>
        <add key="AgentDeploymentTimeout" value="300"/>
        <add key="AgentSyncTimeoutInSeconds" value="300"/>



Increase these values to about three times their defaults.

[Agent machine] In QTAgentService.exe.config, located at <drive letter>:\Program Files (x86)\Microsoft Visual Studio <Visual Studio Version>\Common7\IDE:

Set ControllerConnectionPeriodInSeconds to 90 (tripling this one as well).
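For reference, the corresponding appSettings entry would be shaped like the other keys above (key name from this post, value as recommended):

        <add key="ControllerConnectionPeriodInSeconds" value="90"/>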

    Restart the services.

Apart from this, increasing the bucket size will help reduce controller/agent communication too.
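The bucket size is configured in your test settings; a minimal sketch, assuming the Buckets element sits under the Execution element of your .testsettings file:

        <Execution>
          <Buckets size="1000"/>
        </Execution>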

  • allen@msft

    Getting started with Fakes using Visual Studio


These are a good set of references for anyone looking to get started with Fakes:

  • allen@msft

    Test controller/agent usage recommendations


Over the years many people have asked us about the best way to configure and use the test controller and agent for remote test execution. Here is a summary of the best practices that we advocate.

    Setup and Settings

If you are on VS 2012 Update 1 (or later) or VS 2013, you can skip step 1.

1. Ensure that Visual Studio 2010 SP1 (Dev10 SP1) is installed on all machines
2. Controller setup
  • Install the latest patch available
  • Open "<VSINSTALLDIR>\Common7\IDE\QTController.exe.config" and add the following entry under the appSettings section

      <add key="ControllerJobSpooling" value="false"/>

  • Restart the controller
3. Agent setup
  • Install the latest patch available
  • Open "<VSINSTALLDIR>\Common7\IDE\QTAgentService.exe.config" and add the following entry

      <add key="RestartTestExecutionProcessForEachRun" value="true"/>

  • Restart the agent
4. Client setup
  • Install the latest patch available
  • Open "<VSINSTALLDIR>\Common7\IDE\mstest.exe.config" and add the following key

      <add key="DeleteTestDeploymentFiles" value="yes" />

5. In your test settings, keep the bucket size at 1000. This will result in a test run going to a smaller set of agents.

      <Buckets size="1000"/>

6. Set test run timeouts to ensure no run goes beyond acceptable limits

      <Execution>
        <Timeouts runTimeout="7200000" />

   (7200000 = 2 hours, specified in milliseconds)

7. Run your tests in parallel if your agents are on a multi-processor system

      <Execution parallelTestCount="0">
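Taken together, the relevant fragment of your .testsettings file might look roughly like the sketch below. This is only an illustration: the attribute values are the recommendations above, and the exact placement of the Buckets element should be verified against your .testsettings schema.

      <Execution parallelTestCount="0">
        <Timeouts runTimeout="7200000" />
        <Buckets size="1000" />
      </Execution>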

    Other Recommendations

1. The test controller keeps all results for a run in memory. Therefore it is necessary that test runs be split into logical groups. We recommend that you split tests into assemblies so that each test DLL has <= 5000 tests, and/or run the test controller in 64-bit mode:

      testcontrollerconfig.exe configure <… configuration parameters… > /platform64

2. When any component of the test rig (any of the agents or the controller) faces an issue (network disconnect, deployment failure, test crash), the current test run is aborted. The probability of the run getting aborted due to an agent failure goes down when a test run is sent to a smaller set of agents, so keep each run limited to the minimum number of test assemblies.
3. To scale out, use multiple such rigs and break up your test runs amongst these rigs as required.
  • allen@msft

    Getting the status of your test agents


Users routinely want to check the status of the test agents against which they are scheduling their remote test execution. The existing way of doing this was to fire up Visual Studio and open the “Manage Test Controllers” dialog.

With Visual Studio 2013 you can now check the status of your agents on the controller itself using the command line:

    TestControllerConfig.exe status [/testController:<testControllerUri>]

    testController              URI of the test controller. Default is localhost:6901

    Sample output

    Microsoft (R) Visual Studio Test Controller Configuration Tool
    Version 12.0…. for Microsoft Visual Studio 2013
    Copyright (c) Microsoft Corporation.  All rights reserved.

    Total number of agents         : 1

    Status of agents :
    Agent name                     : vstfs:///…
    Agent status                   : Disconnected

    Summary of all agents :
    Ready                          : 0
    Running tests                  : 0
    Offline                        : 0
    Deploying build                : 0
    Disconnected                   : 1

The agent statuses correspond to the existing known states listed here.

  • allen@msft

    Test Controller 2012 Update 3+


    New features introduced

1. Visual Studio Test Controller 2012 Update 3 and later now supports backward compatibility with TFS servers, so you can connect your 2013 test controller to TFS 2012 and later.
2. The test controller now works with hosted builds that have server drop locations.
  • allen@msft

Empty .coverage file with profiler related errors in the event logs


If you find yourself with an empty .coverage file and see errors similar to the ones below in your event logs, you most probably have a corrupt install.

    (info) .NET Runtime version 4.0.30319.17929 - The profiler has requested that the CLR instance not load the profiler into this process. Profiler CLSID: '{b19f184a-cc62-4137-9a6f-af0f91730165}'. Process ID (decimal): 12624. Message ID: [0x2516].

    (Error) TraceLog Profiler failed in initialization due to a lack of instrumentation methods, process vstest.executionengine.x86.exe



Verify the following to check whether your installation is intact:

a) The environment variable VS110COMNTOOLS is set to <vsinstalldir>\Common7\Tools

b) The registry key HKLM\SOFTWARE\Microsoft\VisualStudio\11.0\InstallDir is set to your <vsinstalldir>\Common7\IDE\

c) covrun32.dll and covrun64.dll exist in "<vsinstalldir>\Team Tools\Dynamic Code Coverage"
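A quick way to run these checks from a command prompt (paths as above; note that on 64-bit Windows the registry key may live under the Wow6432Node hive):

      rem (a) environment variable
      echo %VS110COMNTOOLS%

      rem (b) registry key
      reg query "HKLM\SOFTWARE\Microsoft\VisualStudio\11.0" /v InstallDir

      rem (c) coverage runtime DLLs
      dir "<vsinstalldir>\Team Tools\Dynamic Code Coverage\covrun*.dll"

If any of these checks fail, repairing the Visual Studio installation is the likely fix.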

  • allen@msft

    Troubleshooting missing data in Code Coverage Results


The Code Coverage tool in Visual Studio 11 instruments native and managed binaries (DLLs/EXEs) whenever they are loaded at runtime, provided they meet certain criteria. Code coverage information is then collected for these binaries.

    At the end of the Code Coverage run you see code coverage within the Code Coverage results window. The total code coverage as well as code coverage for each binary are reported.


    However in some cases you may end up with a .coverage file which shows 0% coverage in the Code Coverage Results window with an error similar to “Empty results generated: No binaries were instrumented …”

    This means code coverage results were not obtained for any binary.


In this article we will list the most common reasons for such a problem and provide resolutions for each:

    1. No tests were executed
    2. PDBs (symbol files) are unavailable/missing
    3. Using an instrumented or optimized binary
    4. Code executed is not managed (.NET) or native (cpp) code
    5. A custom .runsettings file with exclusions is being used



    No tests were executed


    Check your output window – Select “Tests” in the “Show Output from:” dropdown to see if there are any warnings or errors logged.


The Dev11 Code Coverage engine is dynamic in nature: binaries are instrumented only when they are loaded into memory.

However, if no tests are executed, nothing is loaded and there is nothing for Code Coverage to report.


Verify that the tests run fine without Code Coverage turned on by clicking “Run All”. Fix any issues you find here before using “Analyze Code Coverage”.



    PDBs (symbol files) are unavailable/missing


    Check that all the modules within the solution have their associated PDBs available alongside the binary


The Dev11 Code Coverage engine requires that every module have its associated PDB available and accessible during execution. If the PDBs are unavailable, we skip the module and do not provide any data for it.

Note: PDBs have a direct association with a DLL via the build. The DLL and PDB should come from the same build for the PDB to be recognized as valid.


If the PDBs are not installed alongside the binary but are present on some share, configure the code coverage settings to specify the path to pick the PDBs from. See “Customizing Code Coverage in Visual Studio 11”.


Install the PDBs alongside the binary, or customize the code coverage settings by specifying the path to pick the PDBs from.
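As a sketch, the Code Coverage configuration in a .runsettings file accepts symbol search paths; the share path below is a placeholder, and searching remote shares for PDBs can slow the run down:

      <CodeCoverage>
        <SymbolSearchPaths>
          <Path>\\myserver\symbols\ProjectX</Path>
        </SymbolSearchPaths>
      </CodeCoverage>

This fragment goes inside the Code Coverage data collector configuration of your .runsettings file (see the fuller example later in this post).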



Using an instrumented or optimized binary


    Check if the binary has undergone any form of advanced optimization (like the BBT optimization, Profile Guided Optimization, etc.) or has been instrumented by a profiling tool like vsinstr.exe.


If a binary has already been instrumented or optimized by another profiling tool, e.g. vsinstr.exe, we skip the binary and do not include it as part of the coverage results.


    Code coverage cannot be obtained for such binaries.



    Use the non-instrumented / non-optimized version of the binary.


In fact, it is recommended to turn off all optimizations and run code coverage with a non-optimized (CHK) build to get the best results.



    Code executed is not managed (.NET) or native (cpp) code


Verify whether your project has any .NET or C++ unit tests being run.


Code Coverage in Visual Studio 11 is available only for .NET and native (C++) code. If you are working with a language that does not fall into either of these categories, Code Coverage will not be available for that code.


    This can typically happen if you are developing tests with a third party unit test adapter extension.


No resolution is available in this case.



    A custom .runsettings file with exclusions is being used


Verify whether you are using a .runsettings file that specifies exclusion rules which prevent your DLLs from being instrumented.


You can run your unit tests with a custom .runsettings file that has the Code Coverage configuration specified in the DataCollectors node. In the Code Coverage configuration we allow exclusion of DLLs based on name, company name, public key token, etc. You might have, by mistake, excluded all your DLLs, or missed including some of your DLLs here.


Fix your .runsettings to have the minimal set of exclusions that do not exclude your DLLs, or explicitly include the DLLs you want coverage for.
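As an illustration, a trimmed .runsettings sketch with a Code Coverage data collector that includes your product DLLs and excludes only test assemblies might look like the following. The module path patterns are placeholders, and the DataCollector attributes are trimmed for brevity; refer to the “Customizing Code Coverage” documentation mentioned above for the full declaration.

      <?xml version="1.0" encoding="utf-8"?>
      <RunSettings>
        <DataCollectionRunSettings>
          <DataCollectors>
            <DataCollector friendlyName="Code Coverage">
              <Configuration>
                <CodeCoverage>
                  <ModulePaths>
                    <Include>
                      <!-- placeholder: match your product DLLs -->
                      <ModulePath>.*mycompany.*\.dll$</ModulePath>
                    </Include>
                    <Exclude>
                      <!-- keep exclusions minimal so product DLLs are not skipped -->
                      <ModulePath>.*tests\.dll$</ModulePath>
                    </Exclude>
                  </ModulePaths>
                </CodeCoverage>
              </Configuration>
            </DataCollector>
          </DataCollectors>
        </DataCollectionRunSettings>
      </RunSettings>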



    Further Analysis

To know why a specific DLL was not included by code coverage, you can further analyze the generated .coverage file using our command-line utility, CodeCoverage.exe:

cd "<VSInstallDir>\Team Tools\Dynamic Code Coverage Tools"

    CodeCoverage.exe analyze /include_skipped_modules my.coverage > analysis.xml

    my.coverage is the .coverage file generated for your test run found in the TestResults folder.


Here /include_skipped_modules will report every module that was inspected by the Code Coverage engine along with a reason code for why it was skipped. The reason codes map back to the sections above:

    • Symbols for the module were missing: see “PDBs (symbol files) are unavailable/missing”.
    • The module was excluded by configuration: see “A custom .runsettings file with exclusions is being used”.
    • The module was already instrumented or optimized: see “Using an instrumented or optimized binary”.
    • No code in the DLL could be instrumented: see “Code executed is not managed (.NET) or native (cpp) code”.
    • The engine itself failed due to some internal error: please try the run again.

If the output file is empty, see “No tests were executed”.

  • allen@msft

    Lab Management is now available


    Visual Studio 2010 Lab Management is now available for download.


    So how do you get started?

Install Visual Studio Team Foundation Server 2010 and the SCVMM Admin Console on a machine.

    Set up a SCVMM Server on a machine.

Install the Visual Studio 2010 Ultimate SKU to get the Microsoft Test Manager client and the agent SKUs. Find the detailed steps on the MSDN download page.

Then download and apply the patch on all your client and server machines (available here). This has all the right fixes, making it sturdy enough to be used in your live environments.


Soma is excited, and so is Brian, and so should you be. Try it out and send us your feedback while we keep working hard on the next release.

  • allen@msft

    Using the TFS - Best Practice Analyzer (BPA) for Lab Management


    As announced earlier the TFS Best Practice Analyzer (BPA) has been released for RC. You can install those bits from here.

One of the new and useful features in the RC release is the ability to run only the Lab Management checks.


    This new option will reduce your scan times when debugging lab related issues.

    We are looking for some early feedback on the existing functionality and suggestions moving forward. So download the bits and give it a whirl.

Note: The prerequisite for the lab scan is the VMM Admin Console 2008 R2.
