Benjamin Wright-Jones

Exploring data and distributed systems

  • Benjamin Wright-Jones

    Using PowerShell to deploy Windows Azure Virtual Machines and Windows Azure SQL Databases


    Author: Benjamin Wright-Jones (Microsoft)
    Contributors: Karthika Raman (Microsoft)
    Technical Reviewers: Guy Bowerman (Microsoft), Sanjay Nagamangalam (Microsoft)

    I have recently been exploring the use of IaaS (Infrastructure as a Service) to provide cloud-based virtual machines (VMs) as an alternative to laptop-based VMs, and also PaaS (Platform as a Service) for SQL databases.  I like the idea of carrying around a lighter, more portable laptop and using cloud services to help me day to day, in contrast to carrying a heavyweight workstation for Hyper-V usage.

    I know we can deploy VMs through the Azure Portal but I prefer an automated approach.  Fortunately, the new PowerShell cmdlets support Azure VM provisioning and also Azure SQL database provisioning (plus some other nice interfaces).  This enables me to quickly spin up a SQL Server VM or a SQL database in Azure.  I am actually quite amazed at what is possible with PowerShell; it is my new best friend.  PowerShell ISE in Windows 8 is superb, and I highly recommend it as a development environment due to the IntelliSense and the integrated cmdlet search pane.

    Below is the PowerShell script I wrote to provision a VM.  Unfortunately, some manual steps are still required if you wish to manage the SQL Server instance remotely through Management Studio, e.g. opening firewall ports, enabling TCP, etc.  This is all documented here.  Fortunately, it is a lot easier to connect to an Azure SQL database, and we can automate the registration of the provisioned instance in Management Studio.

    Azure VM Provisioning (SQL Server 2012 Evaluation Edition)

    Step 1. Download and register the Azure publishing certificate (a one-time event). 

    In order to use PowerShell with the Azure VM and SQL Database services, you will need to download and import the publishing file.  Fortunately, this is a simple process.  I also store my certificate on SkyDrive so I can access it wherever I go in case I need it again.

    Import-AzurePublishSettingsFile C:\...

    If you don’t import the publishing file, you may see an error similar to the one below when attempting to access the Azure services.

    An error occurred while making the HTTP request to https://management.core.
    /<server>?op=ResetPassword. This could be due to the fact that the server
    certificate is not configured properly with HTTP.SYS in the HTTPS case. This
    could also be caused by a mismatch of the security binding between the client
    and the server. 
    Step 2. View available subscriptions and set the correct subscription

    Get-AzureSubscription | Select SubscriptionName
    Select-AzureSubscription -SubscriptionName "<subscriptionname>"

    Incidentally, you may also need to associate a specific Azure Storage Vault (ASV) account with your subscription; for some reason the default value was null, so I had to allocate a specific account.  The Azure Storage account is required to host the VM disks which are provisioned during the creation of an Azure VM image.  

    Get-AzureStorageAccount | Select Label
    Set-AzureSubscription -SubscriptionName "Windows Azure MSDN - Visual Studio Ultimate" 
    -CurrentStorageAccount "<storageaccount>"
    Step 3. View the available Azure VM images and locations
    Get-AzureVMImage | Select ImageName
    Get-AzureLocation | Select DisplayName
    Step 4. Create an Azure VM

    I like my VMs big! ExtraLarge!  You can use New-AzureQuickVM, New-AzureVM or New-AzureVMConfig.  New-AzureQuickVM automatically creates and provisions the VM, so it does not require any additional steps. 

    New-AzureQuickVM -Windows -ServiceName "<azureservice>" -Name "<vmname>" 
    -ImageName "b83b3509582419d99629ce476bcb5c8__Microsoft-SQL-Server-2012-
    Evaluation-CY13Feb-SQL11-SP1-CU2-11.0.3339.0" -Password <password> -Location "North Europe"
    -InstanceSize "ExtraLarge"

    You should see something like this below. 


    If you see a DNS error, you may be trying to provision under a duplicate service name.  The error "DNS name already exists" is misleading: it actually refers to a duplicate service name, so change this in the parameters.  The service name is the name by which you reference and connect to the VM, e.g. <vmservice>

    Step 6. Start the Azure VM

    Start the Azure VM using the command below.

    Start-AzureVM -ServiceName "<servicename>" -Name "<vmname>"

    You can view the properties of your Azure VMs by using Get-AzureVM -ServiceName <ServiceName>.  I would also add that additional data disks can be attached using Add-AzureDataDisk, providing a simple way to increase the capacity of the provisioned Azure VM instance.
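As a rough sketch of attaching a new empty data disk (the disk size, label and LUN here are illustrative choices, not values from the original post), the VM configuration can be piped through Update-AzureVM to apply the change:

```powershell
# Attach a new 50 GB empty data disk to an existing VM (values are illustrative)
Get-AzureVM -ServiceName "<azureservice>" -Name "<vmname>" |
    Add-AzureDataDisk -CreateNew -DiskSizeInGB 50 -DiskLabel "DataDisk01" -LUN 0 |
    Update-AzureVM
```

Once attached, the disk still needs to be initialised and formatted inside the guest OS.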

    Step 7. Download the remote desktop connection file to your local desktop!

    Another great feature is the ability to automatically download the remote desktop file for the provisioned Azure VM. 

    Get-AzureRemoteDesktopFile -ServiceName "<ServiceName>" -name "<vmname>" -LocalPath "$ENV:userprofile\Desktop\myVm01.rdp"

    It is also possible to automatically launch and connect to your VM instance using the standard Remote Desktop client, mstsc.exe, which just eliminates another step in the process to connect to your Azure VM. 

    mstsc $ENV:userprofile\Desktop\myVm01.rdp

    Of course you could parameterise all of this to make life even simpler. 
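As a sketch of that parameterisation (the function name and parameters are my own invention, not part of the original script), the download-and-connect steps could be wrapped in a small helper:

```powershell
# Hypothetical helper: download the RDP file for a VM and launch the RDP client
function Connect-AzureVMDesktop {
    param(
        [Parameter(Mandatory)] [string] $ServiceName,
        [Parameter(Mandatory)] [string] $VMName
    )
    $rdpPath = "$ENV:userprofile\Desktop\$VMName.rdp"
    Get-AzureRemoteDesktopFile -ServiceName $ServiceName -Name $VMName -LocalPath $rdpPath
    mstsc $rdpPath
}

# Usage:
# Connect-AzureVMDesktop -ServiceName "<servicename>" -VMName "<vmname>"
```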

    Step 8. Connect!

    Simply enter the username and password and off you go.  You could create multiple PowerShell batch files, one for each Azure VM image type.  I am sticking with the SQL Server 2012 instance for now. 


    Cleaning up the environment is simple too, just a couple of PowerShell commands to stop and remove the provisioned VM.

    Step 9. Stop the Azure VM
    Stop-AzureVM  -ServiceName "<ServiceName>" -Name "<VmName>"
    Step 10. Remove/delete the Azure VM
    Remove-AzureVM -ServiceName "<ServiceName>" -Name "<VmName>"
    Step 11. Remove/delete the Azure VM disks

    The VHDs associated with the image are not automatically removed, so you will need to issue the Remove-AzureDisk command.  You can view the existing VHDs and the associated image and container using the Get-AzureDisk command as shown below.  You will notice that I only have one disk (VHD) associated with an image.  The other VHDs were from previous Azure VM deployments. 


    Removing (deleting) the VHD is simple.  The -DeleteVHD parameter is required if you wish to permanently delete the image from ASV, so use it with caution!

    Remove-AzureDisk -DiskName <diskname> -DeleteVHD


    Step 12. Remove the allocated cloud service
    Remove-AzureService -ServiceName "<servicename>" 
    Step 13. Remember to save your PowerShell script and parameterise it for one-click deployment!
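As a sketch of what that one-click parameterisation might look like (the function name and defaults are my own, and the image name must match a current Get-AzureVMImage result), the provisioning steps above could be wrapped like this:

```powershell
# Hypothetical wrapper around steps 4-7 above; parameter values are illustrative
function New-SqlAzureVM {
    param(
        [Parameter(Mandatory)] [string] $ServiceName,
        [Parameter(Mandatory)] [string] $VMName,
        [Parameter(Mandatory)] [string] $ImageName,
        [Parameter(Mandatory)] [string] $Password,
        [string] $Location = "North Europe",
        [string] $InstanceSize = "ExtraLarge"
    )

    # Provision the VM (New-AzureQuickVM creates and provisions in one step)
    New-AzureQuickVM -Windows -ServiceName $ServiceName -Name $VMName `
        -ImageName $ImageName -Password $Password -Location $Location `
        -InstanceSize $InstanceSize

    # Start it and pull down the RDP file, ready to connect
    Start-AzureVM -ServiceName $ServiceName -Name $VMName
    Get-AzureRemoteDesktopFile -ServiceName $ServiceName -Name $VMName `
        -LocalPath "$ENV:userprofile\Desktop\$VMName.rdp"
}
```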

    Azure SQL Database Provisioning

    On a related note, it is also possible to provision a Windows Azure SQL Database using the new PowerShell cmdlets, allowing me to rapidly deploy a cloud-based relational data store.  You must register the Azure publishing certificate unless this has been done previously (as in step 1 above).

    Import-AzurePublishSettingsFile C:\....
    Set-AzureSqlDatabaseServer -ServerName <server> -AdminPassword <password>

    Creating a new Azure SQL Database server is also easy; an example is provided below.  I chose North Europe due to my geographical location. 

    New-AzureSqlDatabaseServer -location "North Europe" -AdministratorLogin "<login>" 
    -AdministratorLoginPassword "<password>"

    You’ll need to create a firewall rule so you can connect to the new Azure SQL Database instance:

    New-AzureSqlDatabaseServerFirewallRule -ServerName <server> -RuleName <rulename> 
    -StartIPAddress "" -EndIPAddress ""
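To confirm the rule was created (a small sketch using the same placeholder server name), the firewall rules defined on the server can be listed:

```powershell
# List all firewall rules defined on the Azure SQL Database server
Get-AzureSqlDatabaseServerFirewallRule -ServerName <server>
```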

    As I was working through this, I discovered a VERY useful command which pops up a window with the help for a specific command; an example is shown below.  Omitting the -ShowWindow parameter will output the help details to the console window.

    Get-Help Set-AzureSqlDatabase -ShowWindow

    The next step was to create a SQL Server authenticated connection to the server hosting the Windows Azure SQL Database.  This is an important step as it establishes the context for the connection. 

    #specify sql auth credential
    $serverCredential = New-Object System.Management.Automation.PSCredential("<username>", 
        ("<password>" | ConvertTo-SecureString -AsPlainText -Force))

    #create a connection context
    $ctx = New-AzureSqlDatabaseServerContext -ServerName <servername> -Credential $serverCredential

    Incidentally, if you are wondering what is stored in the connection context then see below:

    ServerName        : <server>
    SessionActivityId : 850d5e6f-7201-4bad-8fd5-331086064d4a
    ClientSessionId   : e8d82a6d-0ed2-4aa3-9c38-3c3da924ab6a-2013-03-13 15:49:35Z
    ClientRequestId   : d2436b67-ecca-453d-8d7c-12619e599784-2013-03-13 16:02:36Z
    Databases         : {master} 

    Wondering what databases you can see?  Easy.

    Get-AzureSqlDatabase -Context $ctx

    Which returns all the databases deployed on the provisioned instance.  In the case below, only the master database was listed as no other databases are currently deployed.

    Name          : master
    CollationName : SQL_Latin1_General_CP1_CI_AS
    Edition       : Web
    MaxSizeGB     : 5
    CreationDate  : 12/03/2013 22:35:57

    Want a new database?  Easy again.

    New-AzureSqlDatabase -Context $ctx -DatabaseName <databasename> -Collation SQL_Latin1_General_CP1_CI_AS 
    -Edition Web -MaxSizeGB 5

    Interestingly, there are a host of DataServiceContext class options made available under the context of the connection, such as ServerMetrics and DatabaseMetrics.  This provides some interesting insight into the metadata for your Azure SQL database server, such as throttled connections and failures.  Unfortunately, the context commands are not documented right now, so this is just exploratory and the exposed properties may be removed in the future. 
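One way to explore those undocumented properties (just a sketch; it assumes the $ctx connection context created earlier) is to enumerate them with Get-Member:

```powershell
# List the properties exposed by the connection context, e.g. ServerMetrics, DatabaseMetrics
$ctx | Get-Member -MemberType Property
```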


    Beyond the ability to provision a Windows Azure SQL Database using PowerShell cmdlets, I can also save time by automatically registering the Azure SQL instance in SQL Server Management Studio by invoking the SQL Server 2012 PowerShell command New-Item, as below.  The AzureSqlDbServer1 reference is the friendly name which appears in the SQL Server Management Studio console.

    Import-Module sqlps
    cd "sqlregistration\Database Engine Server Group"
    New-Item AzureSqlDbServer1 -ItemType Registration -Value "server=<server>; 
    security=false; userid=<username>; password=<password>; initial catalog=<databasename>"


    Removing (or de-provisioning) the Azure SQL database, instance and Management Studio registration is simple.  The last command, Remove-Item, is a SQL Server PowerShell command to delete the Management Studio server registration and this must be invoked using sqlps as above.

    Remove-AzureSqlDatabase $ctx -DatabaseName "<dbname>"
    Remove-AzureSqlDatabaseServer -ServerName "<AzureSqlDbServer>"
    Import-Module sqlps
    Remove-Item AzureSqlDbServer1

    Closing Remarks

    The new PowerShell cmdlets for Azure are a fantastic way to easily provision either VMs or a database in the cloud.  I will be parameterising my scripts (and including try..catch blocks) to quickly create a Windows Azure Virtual Machine or Windows Azure SQL Database as needed (one-click deployment made easy!).  PowerShell ISE is also an excellent development environment which can be leveraged not only for Azure VM or SQL database provisioning but also for many more solution scenarios. 

    What is missing?

    • Provisioning Azure HDInsight clusters is not currently possible; however, this should be coming soon (refer to Programmatic Cluster Management). 
    • PowerShell cmdlets for Azure SQL Reporting are not currently available.
    • The ability to provision an Azure VM image with the full business intelligence stack deployed, i.e. SharePoint 2013, Power View and PowerPivot integration. 
    • PowerShell remoting to be automatically enabled in the Azure VM.
    • The ability to invoke SqlCmd and query Azure SQL databases through PowerShell (inc. support for Federations). 


    Managing Windows Azure SQL Databases with PowerShell

    Windows Azure SQL Database Management with PowerShell

    Getting Started with SQL Server on a Windows Azure Virtual Machine

  • Benjamin Wright-Jones

    Building a data mart to analyse web log traffic

    I recently had the opportunity to build a data mart in order to analyse web logs from IIS servers.  The solution utilised SQL Server 2012 Database Engine, Integration Services, Analysis Services and Excel 2010 (to slice/dice the data).   

    Loading IIS logs (text files with fixed-width spacing) was fairly challenging, as the column definitions would alter throughout the files.  I therefore used the .NET StreamReader class to read the metadata and detect changes in the fields. 

    The data flow is shown below.


    A snippet of the code in the Load Data (Source Script Component) is presented below

    Code Snippet
    // Get variables
    strSourceFile = Variables.vCurrentSourceFileName;
    intAuditLogFileIdentifier = Variables.vAuditExecutionFileIdentifier;

    try
    {
        // Create an instance of StreamReader to read from a file.
        // The using statement also closes the StreamReader.
        using (StreamReader sr = new StreamReader(strSourceFile))
        {
            String line;
            int intNumberOfFields = 0;
            string[] strListOfFields = null;

            Trace.WriteLine("Log File: " + strSourceFile);

            // Output the source file name as the first line (debugging purposes)
            OutputLogFileRawDataBuffer.AddRow();
            OutputLogFileRawDataBuffer.colRawData = strSourceFile;

            // Read and display lines from the file until the end of the file is reached.
            while ((line = sr.ReadLine()) != null)


    Extracting the data from the file was relatively straightforward.  I placed the string into an array based on the fixed spacing between fields.  From the data, I was able to extract useful information such as browser type i.e. Safari, Chrome, IE and even browser version. 

    IP addresses were mapped to geolocation using the free GeoLite information (CSV data imported into the database).  I converted longitude and latitude to the spatial geography data type and presented this against a world map (I wanted to validate that the GeoLite data correctly mapped to the IP address e.g. country/city to IP address).

    USE BiKitchen;

    DECLARE @SRID int = 4326
    DECLARE @pLat nvarchar(max)
    DECLARE @pLong nvarchar(max)
    DECLARE @g geography

    -- Check longitude and latitude for London
    SET @pLat = (SELECT CAST(Latitude AS nvarchar(max)) FROM [GeoData].[GeoLiteCity-Location] WHERE locid = 13547)
    SET @pLong = (SELECT CAST(longitude AS nvarchar(max)) FROM [GeoData].[GeoLiteCity-Location] WHERE locid = 13547)

    SET @g = geography::STPointFromText('POINT(' +
            @pLong + ' ' +
            @pLat + ')', @SRID).BufferWithTolerance(11000, 1000, 0)

    SELECT @pLat AS Latitude, @pLong AS Longitude
    -- SELECT @g.Lat, @g.Long

    -- Map the geography type to base world map data
    -- View the result in the spatial tab to validate coordinates
    SELECT @g AS spatiallocation
    UNION ALL SELECT geog FROM World_Borders

    The star schema was built and deployed in SQL Server 2012 Analysis Services (UDM).  I found named calculations to be an incredibly powerful way of extending the data model and making attributes more meaningful for end-users.


    The data was presented using Excel 2010; a screenshot is shown below.  I found slicers to be extremely useful.




    I thought it would be interesting to see what Excel 2013 had to offer, so I tried to create a Power View report, but this is not currently supported against the UDM.  There are, however, some nice enhancements to chart types, so I’ll be looking at this in more detail.


  • Benjamin Wright-Jones

    Redux: Using an SSIS package to monitor and archive the default trace file


    I’ve recently been asked for details regarding the SSIS package I wrote to copy and archive the default trace file which is generated by SQL Server.  The contents of the file can be quite useful for troubleshooting or diagnosis purposes.

    I’ve updated the package to work with SQL Server 2008 R2 and SQL Server 2012.  

    The detection of a new trace file is implemented using a Script Task which watches the \Log folder for new files.  When a new file is detected, it copies the previous file to an archive location.  The logic is written in Visual Basic .NET (not my personal choice, although I was limited to it in SQL Server 2005 when I wrote the package). 


    The archive process renames the file with the date and time and then copies the file to a chosen location.   I should point out that I use expressions on a few variables to alter outputs such as the filename i.e. date_time_filename. 

    I also noticed that the service needs appropriate permissions both to access the \LOG directory and to copy to the target directory; in my scenario, this was \\server\share.  When I was testing, I launched SSDT (SQL Server Data Tools) with Administrator privileges as a quick workaround to permission issues.


    Here is the code for the Script Task (apologies for the word wrap; the Live Writer plug-in seems to do this to fit it on the page).  I have commented out some of the WriteLine commands I was using to debug the package when it was initially developed.

    Code Snippet
    'Disclaimer:
    'The sample scripts and SSIS package are not supported under any Microsoft standard support program or service.
    'The sample scripts and SSIS package are provided AS IS without warranty of any kind.
    'Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose.
    'The entire risk arising out of the use or performance of the sample scripts and documentation remains with you.
    'In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits,
    'business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.

    Public Sub Main()

        Dim vars As Variables
        Dim strComputer As String
        Dim objWMIService As Object
        Dim colMonitoredEvents As Object
        Dim objEventObject As Object
        Dim strSourceDirectory As String
        Dim strServerName As String
        Dim strSourceErrorLogDirectory As String
        Dim strSourceErrorLogDirectoryWithQuotes As String

        Try
            ' Use the SSIS variables in this code for the WMI query
            strServerName = Dts.Variables("v_ServerName").Value
            strSourceErrorLogDirectory = Dts.Variables("v_DT_SourceLogDirectory").Value

            Console.WriteLine("Servername: " + strServerName)
            Console.WriteLine("Monitoring \Log Directory: " + strSourceErrorLogDirectory)

            ' Replace \ with \\\\ which is needed for the WMI query
            strSourceErrorLogDirectory = Replace(strSourceErrorLogDirectory, "\", "\\\\")
            strSourceErrorLogDirectoryWithQuotes = Chr(34) & strSourceErrorLogDirectory & Chr(34)

            'MsgBox("Server Name: " + strServerName)

            ' Connect to the WMI source
            objWMIService = GetObject("winmgmts:\\" & strServerName & "\root\cimv2")

            ' Monitor the error log folder for instances of ERRORLOG.1 as this is the file we want to archive
            ' The directory name is parameterised and populated from the SSIS variable

            ' Monitor the directory for new default trace files
            colMonitoredEvents = objWMIService.ExecNotificationQuery _
                ("SELECT * FROM __InstanceCreationEvent WITHIN 10 WHERE " _
                    & "Targetinstance ISA 'CIM_DirectoryContainsFile' and " _
                        & "TargetInstance.GroupComponent= " _
                                & "'Win32_Directory.Name=" & strSourceErrorLogDirectoryWithQuotes & "'")

            objEventObject = colMonitoredEvents.NextEvent()

            'MsgBox("A new file was just created: " + objEventObject.TargetInstance.PartComponent)

            Dim strReturned, strFilePath As String

            strReturned = objEventObject.TargetInstance.PartComponent
            strFilePath = Split(strReturned, "CIM_DataFile.Name=")(1)
            strFilePath = Replace(strFilePath, """", "")
            strFilePath = Replace(strFilePath, "\\", "\")
            'MsgBox("Sliced file: " + strFilePath)

            ' strFilePath is C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\LOG\log_nnn.trc
            ' Array element (6) is log_nnn.trc, this assumes the above directory structure
            Dim strFilename As String

            'strFilename = Split(strFilePath, "\")(6)
            'MsgBox("Split: " + strFilename)

            strFilename = System.IO.Path.GetFileName(strFilePath)
            'MsgBox("IO.Path: " + strFilename)

            ' If filename like log_ then enter this code path
            ' The default trace filename is always log_ so we can rely on this for filename matching
            If strFilename Like "log_*.trc" Then

                Console.WriteLine("A new default trace file was just created in \LOG called " + strFilename)
                Trace.WriteLine("A new default trace file was just created in \LOG called " + strFilename)

                ' Archive the previous default trace file
                'MsgBox("Default Trace found, now process the file")

                Dim arrContainer As Array
                Dim intTraceFileNumber As Integer
                Dim strArchiveFileName As String

                arrContainer = Split(strFilename, "_")

                'Console.WriteLine(arrContainer(0).ToString)
                'Console.WriteLine(arrContainer(1).ToString)

                ' Split 1111.trc so we only store 1111 to convert to int
                arrContainer = Split(arrContainer(1), ".")

                ' This is the active default trace file number
                'Console.WriteLine(arrContainer(0).ToString)

                ' Convert the active trace file number to int and decrease by 1
                intTraceFileNumber = CInt(arrContainer(0)) - 1

                ' Convert back to string and create the default trace file name
                strArchiveFileName = "log_" + CStr(intTraceFileNumber) + ".trc"
                'Console.WriteLine("Archiving: " + strArchiveFileName + " to " + Dts.Variables("v_DT_DestinationDefaultTraceDirectory").Value)

                'MsgBox(strArchiveFileName)

                'Write the filename to the SSIS variable
                Dts.Variables("v_DT_ActiveFileName").Value = strArchiveFileName

                Console.WriteLine("Archiving: " + strArchiveFileName + " to " + Dts.Variables("v_DT_DestinationDefaultTraceDirectory").Value)
                MsgBox("Output to SSIS Variable: " + Dts.Variables("v_DT_ActiveFileName").Value + " strFilename: " + strArchiveFileName)

                ' Indicate success to move on to the next step
                Dts.TaskResult = ScriptResults.Success
            End If

            ' Error handling
        Catch ex As Exception
            Console.WriteLine(System.DateTime.Now.ToString + " - SSIS Script Task Error: " + ex.Message.ToString)
            Dts.TaskResult = ScriptResults.Failure
        End Try

    End Sub


    I hope you find this useful.

    I will try and attach the SSIS package to this post later.

  • Benjamin Wright-Jones

    Redux: Using a C# script task in SSIS to download a file over http


    A few people have asked for further information about the C# script task which I blogged about (quite a while ago).  I mistakenly forgot to add the full source code, sorry everyone.  Here is the link to the original blog post:

    I have since imported the SSIS package into Visual Studio 2010 (BIDS) and the code compiles without error.  Some of the code below is truncated on the right (just a formatting issue I need to resolve) but the core of the code is there.  

    As always, let me know if there are any problems. 

    /*
       Download a file over http using Script Task in SQL Server 2008 R2 Integration Services.
       Two key variables: vSSOReportURL, which is constructed in a prior Script Task e.g. http://www..
       vSSOLocalFileName, which is the fully qualified reference for the downloaded file e.g. c:\
    */
    using System;
    using System.Data;
    using Microsoft.SqlServer.Dts.Runtime;
    using System.Windows.Forms;
    using System.Net;
    using System.Net.Security;

    namespace ST_7e897e41dd5945f3b77366d32f0a97e0.csproj
    {
        [Microsoft.SqlServer.Dts.Tasks.ScriptTask.SSISScriptTaskEntryPointAttribute]
        public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
        {
            #region VSTA generated code
            enum ScriptResults
            {
                Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
                Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
            };
            #endregion

            public void Main()
            {
                WebClient myWebClient;
                string RemoteURI;
                string LocalFileName;
                bool FireAgain = true;
                Variables vars = null;

                Dts.VariableDispenser.LockForRead("User::vSSOReportURL");
                Dts.VariableDispenser.LockForRead("User::vSSOLocalFileName");
                Dts.VariableDispenser.LockForWrite("User::vSSOReportURLIndicator");
                Dts.VariableDispenser.GetVariables(ref vars);

                try
                {
                    // Ignore certificate warnings
                    ServicePointManager.ServerCertificateValidationCallback =
                        new RemoteCertificateValidationCallback(delegate { return true; });

                    // Initiate webclient download, use default credentials (current login)
                    myWebClient = new WebClient();
                    myWebClient.Credentials = CredentialCache.DefaultCredentials;

                    RemoteURI = vars["User::vSSOReportURL"].Value.ToString();
                    LocalFileName = vars["User::vSSOLocalFileName"].Value.ToString();

                    // Log provider notification
                    Dts.Events.FireInformation(0, String.Empty,
                        String.Format("Downloading '{0}' from '{1}'", LocalFileName, RemoteURI),
                        String.Empty, 0, ref FireAgain);

                    // Download the file
                    myWebClient.DownloadFile(RemoteURI, LocalFileName);

                    // Set report URL indicator, this is used to determine the http source of the
                    // download i.e. vSSOReportURL or vSSOReportURLRetry for the message which is
                    // written to the table
                    vars["User::vSSOReportURLIndicator"].Value = 0;

                    // Return success
                    Dts.TaskResult = (int)ScriptResults.Success;
                }
                catch (Exception ex)
                {
                    // Catch and handle error
                    Dts.Events.FireError(0, String.Empty, ex.Message, String.Empty, 0);
                    Dts.TaskResult = (int)ScriptResults.Failure;
                }
            }
        }
    }

    The files are extracted using an Execute Process Task (with 7-Zip) as shown below:


    And the arguments are set using the expression (below).  There are probably better ways of doing this but I found this worked well.


    The .zip file is then archived using a File System task and the extracted file is renamed to .xlsx. 

  • Benjamin Wright-Jones

    Are you interested in Data Science?


    The University of Dundee is now gauging interest in a number of data science focused courses (see below).


    You can register interest via this link

  • Benjamin Wright-Jones

    Testing SQL Server HA using Hyper-V in Windows 8 Release Preview


    I often find the need to build and test SQL Server clusters to support native two-node or N+1 (multi-instance) scenarios and also to validate SQL Server 2012 AlwaysOn configurations.  I was previously running Windows Server 2008 R2 on my laptop but I had a few issues using this as a day-to-day operating system, e.g. no support for standby when Hyper-V was enabled, no dual boot (Hyper-V on/off) with BitLocker enabled, no Bluetooth support, etc.  Fortunately, Windows 8 meets all my needs and the experience has been excellent to date.

    Note: for rapid provisioning of other images, I created a fully patched base OS image which was sysprep’d (this is incredibly easy).  I now just copy the image if I want to create other server roles e.g. System Center 2012.
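As a rough sketch of that copy-and-provision step (the paths, names and memory size here are illustrative, and it assumes the Hyper-V PowerShell module that ships with Windows 8), creating a new guest from the sysprep’d base image might look like:

```powershell
# Copy the sysprep'd base image and build a new guest from it (values are illustrative)
Copy-Item "D:\VMs\Base\W2K8R2-Base.vhd" "D:\VMs\SC2012\SC2012.vhd"
New-VM -Name "SC2012" -MemoryStartupBytes 4GB -VHDPath "D:\VMs\SC2012\SC2012.vhd" -SwitchName "Private1"
Start-VM -Name "SC2012"
```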

    The only issue I have found to date is that host internet connectivity is affected (delayed) after defining multiple internal network adapters, so I switched to using private network adapters.  I only need internal adapters for host-to-guest connectivity, e.g. copying files.

    My hardware and software is defined below:

    • Laptop HP 8540w (16GB memory, 4 cores hyper-threaded)
    • Two internal SATA disks (hybrid)
    • Windows 8 Release Preview (x64) with Hyper-V enabled

    The environment consists of the following virtual guest images:

    • 1 x Domain Controller (also hosts the virtual storage)
    • 1 x Primary Node (node 1)
    • 1 x Secondary Node (node 2)

    All servers are running Windows Server 2008 R2 Enterprise Edition x64 with Service Pack 1.  The cluster role is enabled on node 1 and node 2. 


    The shared storage is provisioned using the iSCSI Software Target, which I defined on the domain controller.  I didn’t find a need to dedicate a specific storage server.  A screenshot of the virtual storage is presented below.




    Cluster validation is good; the only warning was inconsistent OS patch levels on the two cluster nodes, as shown below.



    After running cluster validation, I created a cluster and the final configuration is shown below. 

    Below is a screenshot of the raw cluster configuration.


    I’ll blog more following the SQL Server install.  I also plan to repeat this for Window Server 2012 and SQL Server 2012.

  • Benjamin Wright-Jones

    Transposing Columns onto Rows


    After a long period of absence, I have returned to posting blog articles.  This question popped up last week from an Oracle DBA while I was onsite with a customer.  The conversation went something along the lines of:

    Oracle DBA: “How can I place columns values on rows? I have a limited page width and need to alter the output”

    Microsoftie “ah.. well, that’s easy..”

    Consider the output below:



    but we want it to appear like this:



    The magic is really in the UNPIVOT function as shown below.

    CREATE DATABASE sandbox;
    USE sandbox;

    CREATE TABLE tblPerson
    (
        Email_Address varchar(50),
        First_Name varchar(50),
        Last_Name varchar(50)
    );

    INSERT INTO tblPerson VALUES ('', 'Ben', 'WJ');

    SELECT * FROM tblPerson;

    SELECT tblPivot.Property, tblPivot.Value
    FROM
    (
        SELECT CONVERT(sql_variant, Email_Address) AS Email_Address,
               CONVERT(sql_variant, First_Name) AS First_Name,
               CONVERT(sql_variant, Last_Name) AS Last_Name
        FROM tblPerson
    ) Person
    UNPIVOT (Value FOR Property IN (Email_Address, First_Name, Last_Name)) AS tblPivot;
  • Benjamin Wright-Jones

    BI Service Applications in SharePoint 2010


    One of my colleagues, Chris Bailiss, has written a number of great articles describing how BI service applications authenticate in SharePoint 2010.  If you are interested, then head over to part 1 now (link below):

  • Benjamin Wright-Jones

    Two new SQLCAT papers available, Spinlock and Latch Contention


    Diagnosing and Resolving Spinlock Contention on SQL Server

    Diagnosing and Resolving Latch Contention on SQL Server

  • Benjamin Wright-Jones

    Whitepaper Alert: Microsoft EDW Architecture, Guidance and Deployment Best Practices



  • Benjamin Wright-Jones

    Creating custom reports against the System Center Operations Manager 2007 R2 data warehouse


    It is possible to create custom reports, using SQL Server 2008 Reporting Services, against the System Center Operations Manager 2007 R2 data warehouse. 

    Reports can be designed in SQL Server 2008 Reporting Services to provide the end user with a better representation of the data since the built-in System Center reports are not ideal. 

    Fortunately the OperationsManagerDW database schema is documented on MSDN along with code samples which makes the task of creating a custom report a little easier. 

    Stored procedures were implemented to return the data from the OperationsManagerDW schema.  I prefer this approach as it provides more control over the code rather than embedding logic in the report itself.  Incidentally, the stored procedures were hosted in a separate database to avoid the support issues of embedding these in the OperationsManagerDW database.

    An example of the stored procedure logic to query the OperationsManagerDW is below:

    /********************************************************************************************
     *
     *    (c) Microsoft 2010  All rights reserved
     *
     *    The code contained in this file is provided "as is" without any warranty of any kind.
     *    The code is for reference purposes only and must not be relied on in connection with
     *    any operational purposes.
     *    Please refer to the terms and conditions which cover the provision of consulting
     *    services to you.
     *
     ********************************************************************************************
     *
     *          The following parameters are required:
     *          pSlot            (nvarchar, 255)
     *          pCurrentDate     (datetime)
     *
     ********************************************************************************************
     *
     *          Stored Procedure Creation Script
     *          [usp_ReportPerfCounterHealth]
     *
     *===========================================================================================
     * Modification History
     *-------------------------------------------------------------------------------------------
     * Version   Date        Author                Description
     *-------------------------------------------------------------------------------------------
     * 01.00.00  10/01/2011  B Wright-Jones        Created
     ********************************************************************************************/
    CREATE PROCEDURE [Schema].[usp_ReportPerfCounterHealth]
        @pSlot nvarchar(255),
        @pCurrentDate datetime
    AS
        SET NOCOUNT ON;

        SELECT
                [vME].[Path] AS [ServerName],
                [vPR].[ObjectName],
                [vPR].[CounterName],
                [vPRI].[InstanceName],
                AVG(SampleValue) AS Average,
                MIN(SampleValue) AS Minimum,
                MAX(SampleValue) AS Maximum,
                SUM(SampleValue) AS SumOfValue
        FROM
                [dbo].[vPerformanceRule] vPR
                INNER JOIN [dbo].[vPerformanceRuleInstance] vPRI
                ON         vPR.[RuleRowId] = vPRI.[RuleRowId]
                INNER JOIN [Perf].[vPerfRaw] vPRW
                ON         [vPRI].[PerformanceRuleInstanceRowId] = [vPRW].[PerformanceRuleInstanceRowId]
                INNER JOIN [dbo].[ManagedEntity] vME
                ON         [vPRW].[ManagedEntityRowId] = [vME].[ManagedEntityRowId]
        WHERE
                [vPRW].[DateTime] BETWEEN DATEADD(hh, -24, @pCurrentDate) AND @pCurrentDate  -- Filter for the last 24 hours
                AND vPR.[CounterName] = '<Performance Counter goes here>'                    -- Filter for the specific performance monitor counter
                AND vPRI.[InstanceName] = '<Instance goes here>'                             -- Filter for the specific slot
        GROUP BY
                [vME].[Path], [vPR].[ObjectName], [vPR].[CounterName], [vPRI].[InstanceName];

    The design time report is shown below.  The line graph shows specific counter values over the period of a month e.g. processor utilisation, web service requests etc. whereas the tablix controls display critical events, uptime, hits and so on for each server in the farm. 


    A datetime parameter was implemented in order to allow the user to select a date from the calendar control.


    The uptime field displays the result of a custom counter, which is the number of seconds since the service was started.  I implemented a Visual Basic function (shown below) on the report body to transform this into days, hours and minutes, which is more readable for the end user. 


    The expression references the custom code using the following syntax:


    The Visual Basic function to format the seconds as days, hours, minutes is below:

    Function SecondsToText(Seconds) As String
        Dim bAddComma As Boolean
        Dim Result As String
        Dim sTemp As String
        Dim days As String
        Dim hours As String
        Dim minutes As String

        If Seconds <= 0 Or Not IsNumeric(Seconds) Then
            SecondsToText = "0 seconds"
            Exit Function
        End If

        Seconds = Fix(Seconds)

        If Seconds >= 86400 Then
            days = Fix(Seconds / 86400)
        Else
            days = 0
        End If

        If Seconds - (days * 86400) >= 3600 Then
            hours = Fix((Seconds - (days * 86400)) / 3600)
        Else
            hours = 0
        End If

        If Seconds - (hours * 3600) - (days * 86400) >= 60 Then
            minutes = Fix((Seconds - (hours * 3600) - (days * 86400)) / 60)
        Else
            minutes = 0
        End If

        Seconds = Seconds - (minutes * 60) - (hours * 3600) - (days * 86400)

        If Seconds > 0 Then Result = Seconds & " second" & AutoS(Seconds)

        If minutes > 0 Then
            bAddComma = Result <> ""
            sTemp = minutes & " minute" & AutoS(minutes)
            If bAddComma Then sTemp = sTemp & ", "
            Result = sTemp & Result
        End If

        If hours > 0 Then
            bAddComma = Result <> ""
            sTemp = hours & " hour" & AutoS(hours)
            If bAddComma Then sTemp = sTemp & ", "
            Result = sTemp & Result
        End If

        If days > 0 Then
            bAddComma = Result <> ""
            sTemp = days & " day" & AutoS(days)
            If bAddComma Then sTemp = sTemp & ", "
            Result = sTemp & Result
        End If

        SecondsToText = Result
    End Function

    Function AutoS(Number)
        If Number = 1 Then AutoS = "" Else AutoS = "s"
    End Function

    The report can be executed directly by passing parameters in the URL, as documented here.  This was useful to test the report execution.  I used two parameters; an example of this URL structure is shown below:


    This can be achieved both in native and SharePoint integrated mode. 

    The fact that the OperationsManagerDW schema is documented has made the whole process of creating a custom report a lot easier and the flexibility of Reporting Services provides a better representation of the data to the end-user.

  • Benjamin Wright-Jones

    Using SQL Server 2008 R2 Best Practice Analyzer against Analysis Services


    SQL Server 2008 R2 BPA is an excellent, and often under-used, tool.  The title is a little misleading: it also works against SQL Server 2008 instances, not just R2 instances.

    I recently had the chance to use BPA against SQL Server 2008 Analysis Services, on an instance where the SQL Browser service was disabled (it is disabled by default).

    I entered the server\instance with the port number and ran the BPA tool only to be told that the incorrect SQL Server version was in use.  Very strange. 


    I checked that PowerShell 2.0 was installed.  I also had administrative permissions on both the OS and the SSAS instance.

    It turns out that the SQL Server Browser service must be running in order to discover the instance.  This is now filed as a bug which should hopefully be fixed in the next release. 

    The output of BPA is really useful and the rules point to knowledge base articles.


  • Benjamin Wright-Jones

    Debugging Parameters in the SSIS Data Flow (Script Component)


    I had unfortunately picked up the (bad) habit of debugging my SSIS packages by using message boxes at design time.  The SSIS data flow Script component does not allow debugging using this approach, so I had to find an alternative.  I discovered that it is possible to output parameters at design time using the FireInformation method in the Script component.  This puts the parameters in the progress window.

    // Output parameters into the SSIS progress window
    bool FireAgain = true;
    this.ComponentMetaData.FireInformation(0, "DataFlowDebug", "The value of time is " + Row.ParamValueTime.ToString(), "", 0, ref FireAgain);
    this.ComponentMetaData.FireInformation(0, "DataFlowDebug", "The value of environment is " + Row.ParamValueEnvironment.ToString(), "", 0, ref FireAgain);

    I like this approach although I should probably output the parameters to one of the other console windows.

  • Benjamin Wright-Jones

    Editing XML using PowerShell


    I recently had a requirement to edit an XML document using PowerShell.  Fortunately, this is a relatively easy task, as shown below.  I’ve used the system environment variable %USERPROFILE% to populate the path to the file.

    The script modifies a value in the XML document (highlighted in the example below) and ensures this is persisted using the .Save method.  The intention is to execute this script during user logon to ensure the value is always disabled in the application, ProClarity Web Professional 6.3 SP3.


    # Code changes CheckQuerySize="Yes" to CheckQuerySize="No"

    # Windows XP
    $xmlFile = "$env:userprofile\Application Data\ProClarity\Client\Options.xml"
    $xmlDoc = [XML](gc $xmlFile)
    $xmlDoc.ProClarity5Options.ProClarityApp.CheckQuerySize = "No"
    $xmlDoc.Save($xmlFile)

    # Windows Vista and Windows 7
    $xmlFile = "$env:userprofile\AppData\Roaming\ProClarity\Client\Options.xml"
    $xmlDoc = [XML](gc $xmlFile)
    $xmlDoc.ProClarity5Options.ProClarityApp.CheckQuerySize = "No"
    $xmlDoc.Save($xmlFile)

  • Benjamin Wright-Jones

    SQL Server 2008 Service Pack 2 CTP


    Just noticed that we have posted the SQL Server 2008 Service Pack 2 CTP; here is the link

    What's New:

    • SQL Server Utility The SQL Server 2008 R2 Utility supports SQL Server 2008 instances that have SP2 installed. The SQL Server Utility models SQL Server-related entities in a unified view. Utility Explorer and SQL Server Utility viewpoints in SQL Server Management Studio provide administrators a holistic view of resource health through an instance of SQL Server that serves as a utility control point (UCP). For more information about the SQL Server Utility, see Managing the SQL Server Utility.
    • Data-tier Applications SQL Server Management Studio in SQL Server 2008 R2 supports all data-tier application (DAC) operations for SQL Server 2008 instances that have SP2 installed. DAC is an entity that contains all databases and instance objects used by an application. A DAC provides a single unit for authoring, deploying, and managing the data-tier objects instead of having to manage them separately. A DAC allows tighter integration of data-tier development with the development of the associated application code. It also gives administrators an application level view of resource usage in their systems through the utility control point. For more information about data-tier applications, see Understanding Data-tier Applications.
    • Reporting Services in SharePoint Integrated Mode (Feature Pack) The Microsoft SQL Server 2008 Service Pack 2 (SP2) Reporting Services Add-in for Microsoft SharePoint Products 2007 allows you to connect to SQL Server 2008 R2 Report Servers. The new features include:
      • Improved handling of credentials and shared data sources.
      • Utilization of the existing instances of Report Builder configured on the Report Server.
      • Support for Report Part Gallery, shared datasets, and shared report items.
      • Report Builder 3.0
      • ATOM feed renderer. For more information, see the ‘Rendering Reports to Data Feeds’ section in "What's New in SQL Server 2008 R2".
  • Benjamin Wright-Jones

    Using Visual Studio Test Tools to troubleshoot and reproduce database performance issues


    The Visual Studio test suite contains a very useful set of tools to help troubleshoot and reproduce performance issues.  From a SQL Server perspective, I have found the tools invaluable as they allow me to replay HTTP workloads at volume against web servers and therefore the underlying database server.  

    The Visual Studio tools provide a number of great features such as test mixes, step load plans, real-time visualisations, support for performance monitor counters and SQL Server tracing. 

    Fiddler is an HTTP capture tool which also supports saving its output to a .webtest file.  Client activity can be captured easily using Fiddler and then imported into a Visual Studio Test Project as a workload. 

    This workload can be replayed against servers using a step pattern or constant load.  There are many different configuration options which are conceptually referred to as a scenario.


    Options such as warm-up or cool-down can be configured via the Run settings, as can the ability to capture SQL trace data; when enabled, this captures a pre-defined set of event classes, as defined here.


    SQL Server performance monitor counters can be added in the load test configuration as shown below.  Performance counters for named instances must be added manually (for some reason they are not discovered automatically).


    Real-time test data is displayed in Visual Studio and the results are saved either into a SQL Server Express database (installed as part of Visual Studio) or an existing SQL Server instance.  The latter requires manually configuring the data store connection string. 

    Load balancers can also introduce some interesting behaviour, such as binding your IP address to one web server; it is possible to circumvent this using the Visual Studio Test Load Agent software: How to: Use IP Switching with Agents

    As a side note, here are a few tips:

    • Be careful when using SQL Server Express Edition to store the results as the database has a size limitation (SQL Server 2005 and SQL Server 2008 Express have a 4GB limit, whilst SQL Server 2008 R2 Express has a 10GB limit)
    • Review the SQL Sizing Considerations information in the Load Test Agent readme. e.g. The SQL Express database is license-limited to store 10 GB of data, which is around 24 hours of load test data for a typical load test.
    • Install SQL Server Management Studio Express to manage the instance
    • The schema for the LoadTest database exists in C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE (or Visual Studio 8.0 if you’re using Visual Studio 2005)
    • If you want to use a different instance of SQL Server to host the database and store the test results, execute loadtestresultsrepository.sql (stored in the folder listed above), which creates a database called LoadTest
    • You will need to change the connection string in Visual Studio; this can be found under Test\Administer Test Controllers
    • The LoadTest schema differs between Visual Studio 2005 and Visual Studio 2008 so you can’t point Visual Studio 2008 at a previous LoadTest schema


    I also implemented DMV_Stats, a blocked process trace and performance monitor counter logs on the database server, as I like to have a record of waitstats, blocking/deadlocking and also any (perfmon) queues.   From a web server perspective, I recorded a counter log of the key IIS metrics as defined here.

    In summary, Visual Studio and Fiddler provide the ability to capture and replay HTTP traffic at load.  This can be useful for investigating and recreating web server and database performance issues.

  • Benjamin Wright-Jones

    Best Practice Analyzer is back


    BPA is back for SQL Server 2008 and SQL Server 2008 R2

  • Benjamin Wright-Jones

    Blogs from Microsoft Consultancy Services


    Microsoft Consultancy Services (UK) has recently started blogging.  These are team-based blogs rather than individual blogs.  I have listed a few of these blogs below:

    MCS UK Microsoft Business Intelligence Team Blog  This blog focuses on the Microsoft business intelligence technology stack e.g. SSIS, SSAS, PerformancePoint rather than transactional-based systems.

    MCS UK Solution Development Team Blog  This blog focuses on application development.

    MCS UK SharePoint Team Blog  This blog focuses on SharePoint solutions.

  • Benjamin Wright-Jones

    SQL Azure vs. SQL Server


    I’ve just read the SQL Azure vs. SQL Server whitepaper; it’s only a few pages, so it is one of the lighter documents.

    Some interesting and important points are summarised in the paper, e.g. you cannot control where your database files are placed; these points are also referenced in the Books Online topic Transact-SQL Support (SQL Azure Database).

    The following Transact-SQL features are not supported by SQL Azure:

    • Common Language Runtime (CLR)
    • Database file placement
    • Database mirroring
    • Distributed queries
    • Distributed transactions
    • Filegroup management
    • Global temporary tables
    • Spatial data and indexes
    • SQL Server configuration options
    • SQL Server Service Broker
    • System tables
    • Trace Flags

    50GB databases will also be here soon.

  • Benjamin Wright-Jones

    Backing up to NUL(L)?


    Some of you may be wondering why you would want to do this.  We had a real requirement in a performance benchmark to use FULL recovery and backup the transaction log but we did not want to retain the log changes.  We already had a gold backup which we restored after each test.

    I was wondering if it was still possible to BACKUP to NUL in SQL Server 2008 R2, and the answer is yes.  Warning: don’t ever do this in a production environment.

    BACKUP LOG sandbox TO DISK = 'NUL'
    The NUL keyword has only one ‘L’ because NUL is the reserved MS-DOS name for the null device.

    Interestingly, if I try to back up the log by adding a file extension to the NUL device name, this fails:

    BACKUP LOG sandbox TO DISK = 'NUL.bak'


    Msg 3201, Level 16, State 1, Line 1
    Cannot open backup device '…\Backup\NUL.bak'.

    Operating system error 2(failed to retrieve text for this error. Reason: 15105).
    Msg 3013, Level 16, State 1, Line 1
    BACKUP DATABASE is terminating abnormally


    But this works, and persists a real backup file in the default backup directory (look in msdb.dbo.backupmediafamily):


    BACKUP LOG sandbox TO DISK = 'NULL.bak'
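
    If you want to confirm where that NULL.bak file landed, the backup history in msdb can be queried.  A minimal sketch, using the sandbox database from the example above:

    ```sql
    -- Find the devices written by recent log backups of the sandbox database
    SELECT TOP (5) bs.backup_start_date, bmf.physical_device_name
    FROM msdb.dbo.backupset AS bs
    INNER JOIN msdb.dbo.backupmediafamily AS bmf
        ON bs.media_set_id = bmf.media_set_id
    WHERE bs.database_name = 'sandbox'
      AND bs.type = 'L'  -- 'L' = transaction log backup
    ORDER BY bs.backup_start_date DESC;
    ```

    The NUL backup appears in the history too, with 'NUL' as its physical device name, which is a useful reminder of why this trick should never reach production.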
  • Benjamin Wright-Jones

    SQL Server 2008 R2 Cumulative Update 1


    Hot on the heels of the RTM release of SQL Server 2008 R2, we have just announced the first cumulative update.  The knowledge base article is here and lists all the relevant fixes, which are essentially the fixes from SQL Server 2008 Service Pack 1 Cumulative Updates 5 to 7.

    As always, please review the knowledge base article and list of fixes before you consider deploying this update.

  • Benjamin Wright-Jones

    Hotfix Alert: Identifying poor I/O


    We have just released a hotfix to help identify poor I/O issues on Windows Server 2008 and Windows Server 2008 R2. This is particularly relevant for SQL Server environments as it will help the administrator determine if I/O performance issues are caused by the SAN.

    Whilst the output is not particularly easy to consume, it is a good step towards identifying I/O issues.

    The knowledge base articles:

    Add a fix to improve the logging capabilities of the Storport.sys driver to troubleshoot poor disk I/O performance in Windows Server 2008 R2

    A hotfix is available that improves the logging capabilities of the Storport.sys driver to troubleshoot poor performance issues for the disk I/O in Windows Server 2008

    The full details are available here

  • Benjamin Wright-Jones

    SQL Server 2008 R2 RTM BOL now available



  • Benjamin Wright-Jones

    Using SSIS with SQL Azure databases


    I’m quite surprised how easy it is to set up an SSIS package to read and write data to a SQL Azure database.  A simple data flow task with an ADO.NET Destination is easy to configure; just make sure the SQL Azure table has a clustered index, otherwise writing data will not work (this is a pre-requisite). 


    The SSIS team have added a BULK INSERT option to the ADO.NET Destination in SQL Server 2008 R2 to improve performance; this blog entry covers the details:

    On the topic of indexes, it looks like there is a bug in the scripting engine in SQL Server 2008 R2.  Scripting the index as DROP to an SSMS window returns the following; however, the ONLINE syntax is not supported in SQL Azure.
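
    To illustrate the shape of the problem (the index and table names here are hypothetical), SSMS scripts the drop with an ONLINE clause along these lines, and removing the WITH clause gives a form SQL Azure will accept:

    ```sql
    -- Script as generated by SSMS (rejected by SQL Azure because of the ONLINE option)
    DROP INDEX [cix_MyTable] ON [dbo].[MyTable] WITH ( ONLINE = OFF );

    -- Equivalent statement without the ONLINE option (accepted by SQL Azure)
    DROP INDEX [cix_MyTable] ON [dbo].[MyTable];
    ```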

  • Benjamin Wright-Jones

    Performance monitor, SQL Server and PAGE compression


    I tend to automate perfmon collection with logman.exe, logging either to binary (.blg) files or to a SQL Server 2008 database via a System DSN (ODBC data source), as follows:

    @echo off
    REM create the counter log, counters specified in .config file
    REM stores the counter log in .blg format; this can also be stored in a SQL Server database if required

    REM set environment variables
    SET SERVER1=\\benjones01
    SET CONFIGFILE=SQL2005BaselineCounters.config
    SET START=09/02/2010 16:00:00
    SET END=09/02/2010 17:00:00

    ECHO    ====================================================
    ECHO    Creating performance monitor counter sets on %SERVER1%
    ECHO    ====================================================

    logman create counter MicrosoftSSIS2005Perf -s %SERVER1% -f bin -b %START% -E %END% -si 5 -v mmddhhmm -o "Perfmon_SSIS_Performance" -cf %CONFIGFILE%


    In this example, the config file contains a list of the counters I am interested in:

    "\\SERVERNAME\MSSQL:Access Methods\*"
    "\\SERVERNAME\MSSQL:Buffer Manager\*"
    "\\SERVERNAME\MSSQL:Buffer Node(*)\*"
    "\\SERVERNAME\MSSQL:Exec Statistics(*)\*"
    "\\SERVERNAME\MSSQL:General Statistics\*"
    "\\SERVERNAME\MSSQL:Memory Manager\*"
    "\\SERVERNAME\MSSQL:Plan Cache(*)\*"
    "\\SERVERNAME\MSSQL:SQL Statistics\*"
    "\\SERVERNAME\MSSQL:Wait Statistics(*)\*"
    "\\SERVERNAME\Network Interface(*)\*"
    "\\SERVERNAME\System\Context Switches/sec"
    "\\SERVERNAME\System\Processor Queue Length"

    Logging can generate quite a lot of data depending on the counters chosen and the sample interval specified in the –si parameter for logman.exe.

    Logging to a SQL Server database makes it easier to query the data as the binary (blg) files can be awkward to deal with when they are very large.
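
    For example, assuming the standard perfmon SQL schema (the DisplayToID, CounterDetails and CounterData tables that logman/relog create), a query along these lines returns the samples for a single counter:

    ```sql
    -- Pull back one counter across all logged samples, tagged with the log run name
    SELECT d.DisplayString, cd.CounterDateTime, cdet.ObjectName, cdet.CounterName, cd.CounterValue
    FROM dbo.CounterData AS cd
    INNER JOIN dbo.CounterDetails AS cdet ON cd.CounterID = cdet.CounterID
    INNER JOIN dbo.DisplayToID AS d ON cd.GUID = d.GUID
    WHERE cdet.CounterName = 'Processor Queue Length'
    ORDER BY cd.CounterDateTime;
    ```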

    SQL Server 2008 also provides an added benefit in that I can compress the main table, [dbo].[CounterData], quite significantly using PAGE compression.  The stored procedure shown below provides an estimate of the space saving, which in this case was about a third of the original table size.

    EXEC sp_estimate_data_compression_savings 'dbo', 'CounterData', NULL, NULL, 'PAGE' ;
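
    To actually apply the compression, the table is rebuilt; sketched below for the dbo.CounterData table (a single statement whether or not the table has a clustered index):

    ```sql
    -- Rebuild the perfmon data table with PAGE compression
    ALTER TABLE dbo.CounterData REBUILD WITH (DATA_COMPRESSION = PAGE);
    ```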

    The screenshot below illustrates the table rebuild used to apply compression at the page level.  My laptop is definitely struggling whilst the data is being compressed…
