Benjamin Wright-Jones

Exploring Information, Infrastructure and Solution Architecture

Posts
  • Benjamin Wright-Jones

    How can I generate a T-SQL script for just the indexes?

    • 5 Comments

    I was trying to work out the best way to generate a T-SQL script for only the indexes and I thought that this would be really easy in the Management Studio interface.  I was mistaken.  SQL Server Management Studio also generates scripts for the tables when you want the indexes.  This is not ideal, so I looked at other methods: method 1) use T-SQL to generate a script from the metadata, or method 2) use SMO (SQL Server Management Objects).  Although I like doing things with T-SQL, I thought I’d give SMO a try and the result is below.  I just hope this is made easier in future releases.

    using System;
    using System.Collections.Generic;
    using Microsoft.SqlServer.Management.Common;
    using Microsoft.SqlServer.Management.Smo;
    using System.Text;
    using System.Collections.Specialized;

    namespace SQLRMOSMO
    {
        class SMOTest   // Scripts all indexes for a specified database
        {
            static void Main(string[] args)
            {
                string servername;
                string databasename;
                servername = "<server\\instance>";
                databasename = "<database>";

                Server server = new Server(servername);
                Database db = server.Databases[databasename];
                ScriptingOptions so = new ScriptingOptions();
                so.ScriptDrops = false;
                so.Indexes = true;
                so.IncludeIfNotExists = false;
                so.DriForeignKeys = false;
                so.FileName = @"c:\indexes.sql";   // verbatim string literal avoids escaping the backslash
                so.AppendToFile = true;
                foreach (Table t in db.Tables)
                {
                    foreach (Index i in t.Indexes)
                        i.Script(so);
                }

            }
        }
    }
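
    For reference, a minimal sketch of method 1 (building the scripts from the metadata with T-SQL) is shown below.  It is an illustration only: it handles plain non-clustered indexes and ignores included columns, filters and partitioning.

    SELECT 'CREATE '
           + CASE WHEN i.is_unique = 1 THEN 'UNIQUE ' ELSE '' END
           + 'NONCLUSTERED INDEX ' + QUOTENAME(i.name)
           + ' ON ' + QUOTENAME(SCHEMA_NAME(t.schema_id)) + '.' + QUOTENAME(t.name)
           + ' ('
           + STUFF((SELECT ', ' + QUOTENAME(c.name)
                    FROM sys.index_columns ic
                    JOIN sys.columns c
                      ON c.object_id = ic.object_id AND c.column_id = ic.column_id
                    WHERE ic.object_id = i.object_id
                      AND ic.index_id = i.index_id
                      AND ic.is_included_column = 0
                    ORDER BY ic.key_ordinal
                    FOR XML PATH('')), 1, 2, '')
           + ');' AS create_index_statement
    FROM sys.indexes i
    JOIN sys.tables t ON t.object_id = i.object_id
    WHERE i.type = 2                 -- non-clustered only
      AND i.is_primary_key = 0
      AND i.is_unique_constraint = 0;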

  • Benjamin Wright-Jones

    Using SSIS with SQL Azure databases

    • 4 Comments

    I’m quite surprised how easy it is to set up an SSIS package to read and write data to a SQL Azure database.  A simple data flow task with an ADO.NET Destination is easy to configure; just make sure the SQL Azure table has a clustered index, otherwise writing data will not work (this is a prerequisite).
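
    As an illustration, a target table with a clustered primary key such as the sketch below (table and column names are hypothetical) satisfies that prerequisite:

    CREATE TABLE dbo.WebLog
    (
        LogId    int IDENTITY(1,1) NOT NULL,
        LogDate  datetime NOT NULL,
        LogEntry nvarchar(4000) NULL,
        CONSTRAINT PK_WebLog PRIMARY KEY CLUSTERED (LogId)
    );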

    image

    The SSIS team has added a bulk insert option to the ADO.NET Destination in SQL Server 2008 R2 to improve performance; this blog entry covers the details: http://blogs.msdn.com/mattm/archive/2009/11/12/something-new-for-ssis-in-sql-server-2008-r2-november-ctp.aspx

    On the topic of indexes, it looks like there is a bug in the scripting engine in SQL Server 2008 R2.  Scripting the index as DROP to an SSMS window returns the following; however, the ONLINE syntax is not supported by SQL Azure.

    image
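
    The workaround is simply to strip the ONLINE clause from the generated statement before running it against SQL Azure, for example (hypothetical index and table names):

    -- Generated script includes an ONLINE option, e.g.:
    -- DROP INDEX [IX_WebLog_LogDate] ON [dbo].[WebLog] WITH ( ONLINE = OFF )

    -- A plain DROP INDEX works against SQL Azure:
    DROP INDEX [IX_WebLog_LogDate] ON [dbo].[WebLog];
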
  • Benjamin Wright-Jones

    How do I find out if Large Values out of Row is enabled?

    • 4 Comments

    I thought the SQL Server 2005 large value types out of row setting would be exposed as an sp_tableoption config value, similar to text in row - apparently not!  It took a while to discover that the current value is surfaced in sys.tables.  The following query returns the value I was looking for:

    SELECT [name] AS tablename, large_value_types_out_of_row FROM sys.tables
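
    For completeness, the behaviour itself is still toggled with sp_tableoption; it is only the current value that has to be read from sys.tables.  A small sketch (the table name is hypothetical):

    -- 1 = store varchar(max)/nvarchar(max)/varbinary(max)/xml values out of row
    EXEC sp_tableoption 'dbo.MyTable', 'large value types out of row', 1;

    -- Confirm the setting
    SELECT [name] AS tablename, large_value_types_out_of_row
    FROM sys.tables
    WHERE [name] = 'MyTable';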

  • Benjamin Wright-Jones

    SQL Server 2005 Extended Triggers

    • 4 Comments

    Ok, I am impressed with everything in SQL Server 2005, especially extended triggers to capture DDL actions.  I was playing around with these today and I can see how this will be a nice feature, especially for audit trails.  I've included a simple script below.

    -- create sample db
    create database trigtestdb
    go

    use trigtestdb
    go

    create table tblextrig (eventid int identity, eventdata xml)
    go

    -- create event-driven (DDL) trigger
    create trigger trig_ddl
    on database
    for create_table, drop_table, alter_table
    as
    insert tblextrig values (eventdata())
    go

    -- do some ddl
    create table ben (id int)
    drop table ben
    create table ben (id int)
    go

    -- check if the DDL events have been logged by the trigger
    select eventid, eventdata from tblextrig
    go

    -- clean up
    use master
    go

    drop database trigtestdb
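
    Before the clean-up step, the captured XML can also be shredded with XQuery to pull out the interesting pieces; a minimal sketch:

    -- extract the event type, login and statement from the captured XML
    select eventid,
           eventdata.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(128)') as event_type,
           eventdata.value('(/EVENT_INSTANCE/LoginName)[1]', 'nvarchar(128)') as login_name,
           eventdata.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(max)') as command_text
    from tblextrig
    go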

     

  • Benjamin Wright-Jones

    Whitepapers: Working with TempDB & Physical Database Storage

    • 4 Comments

    Two new whitepapers from the SQL Storage Team:

     

    Working with TempDB
    http://download.microsoft.com/download/4/f/8/4f8f2dc9-a9a7-4b68-98cb-163482c95e0b/WorkingWithTempDB.doc
     
    Physical Database Storage
    http://download.microsoft.com/download/4/f/8/4f8f2dc9-a9a7-4b68-98cb-163482c95e0b/PhysDBStor.doc

  • Benjamin Wright-Jones

    Building a data mart to analyse web log traffic

    • 4 Comments
    I recently had the opportunity to build a data mart in order to analyse web logs from IIS servers.  The solution utilised SQL Server 2012 Database Engine, Integration Services, Analysis Services and Excel 2010 (to slice/dice the data).   

    Loading IIS logs (text files with fixed-width spacing) was fairly challenging because the column definitions would change throughout the files.  I therefore used the .NET StreamReader class to read the metadata rows and detect changes in the fields.

    The data flow is shown below.

    image

    A snippet of the code in the Load Data (Source Script Component) is presented below

    // Get variables
    strSourceFile = Variables.vCurrentSourceFileName;
    intAuditLogFileIdentifier = Variables.vAuditExecutionFileIdentifier;

    try
    {
        // Create an instance of StreamReader to read from a file.
        // The using statement also closes the StreamReader.
        using (StreamReader sr = new StreamReader(strSourceFile))
        {
            String line;
            int intNumberOfFields = 0;
            string[] strListOfFields = null;

            Trace.WriteLine("Log File: " + strSourceFile);

            // Output the source file name as the first line (debugging purposes)
            OutputLogFileRawDataBuffer.AddRow();
            OutputLogFileRawDataBuffer.colRawData = strSourceFile;

            // Read and display lines from the file until the end of the file is reached.
            while ((line = sr.ReadLine()) != null)
     

    Extracting the data from the file was relatively straightforward.  I placed the string into an array based on the fixed spacing between fields.  From the data, I was able to extract useful information such as browser type (e.g. Safari, Chrome, IE) and even browser version.

    IP addresses were mapped to geolocation using the free GeoLite information http://www.maxmind.com/app/geolite (CSV data imported into the database).  I converted longitude and latitude to the spatial geography data type and presented this against a world map (I wanted to validate that the GeoLite data correctly mapped to the IP address e.g. country/city to IP address).

    USE BiKitchen;

    DECLARE @SRID int = 4326
    DECLARE @pLat nvarchar(max)
    DECLARE @pLong nvarchar(max)
    DECLARE @g geography

    -- Check longitude and latitude for London
    SET @pLat = (SELECT CAST(Latitude AS nvarchar(max)) FROM [GeoData].[GeoLiteCity-Location] WHERE locid = 13547)
    SET @pLong = (SELECT CAST(longitude AS nvarchar(max)) FROM [GeoData].[GeoLiteCity-Location] WHERE locid = 13547)

    SET @g = geography::STPointFromText('POINT(' +
            @pLong + ' ' +
            @pLat + ')', @SRID).BufferWithTolerance(11000, 1000, 0)

    SELECT @pLat AS Latitude, @pLong AS Longitude
    -- SELECT @g.Lat, @g.Long

    -- Map the geography type to base world map data
    -- View the result in the spatial tab to validate coordinates
    SELECT @g AS spatiallocation
    UNION ALL SELECT geog FROM World_Borders

    The star schema was built and deployed in SQL Server 2012 Analysis Services (UDM).  I found named calculations to be an incredibly powerful way of extending the data model and making attributes more meaningful for end users.

    image

    The data was presented using Excel 2010; a screenshot is shown below.  I found slicers to be extremely useful.

    image

    image

     

    I thought it would be interesting to see what Excel 2013 had to offer, so I tried to create a Power View report, but this is not currently supported against the UDM.   There are, however, some nice enhancements to chart types, so I’ll be looking at this in more detail.

    image 

  • Benjamin Wright-Jones

    Analysing SQL Server Error Logs; Useful Tool

    • 3 Comments

    I recently found quite a useful tool called WiLogUtl.exe, hidden in the Windows Installer SDK, which can be used to analyse/parse not just SQL Server error logs but any form of text-based log file.  The tool is included in the Windows Platform SDK (XP SP2), which can be downloaded from here: http://www.microsoft.com/msdownload/platformsdk/sdkupdate/.  The binary is installed under C:\Program Files\Microsoft Platform SDK for Windows XP SP2\Bin.  It provides a simple yet useful interface for browsing and analysing the log files generated by failed setups and service pack installs. 

    I've recently corrected this post (thanks to Matt Tisdel) as I included an incorrect URL (the correct one is shown above).  Further information on WiLogUtl.exe can be found in MSDN here http://msdn.microsoft.com/library/default.asp?url=/library/en-us/msi/setup/wilogutl_exe.asp and here http://msdn.microsoft.com/library/default.asp?url=/library/en-us/msi/setup/windows_installer_development_tools.asp

    On a slightly different topic, my new favourite RSS reader is now Sauce Reader from Synop http://www.synop.com/Products/SauceReader/Home - this is great for consolidating RSS reads.

  • Benjamin Wright-Jones

    New cumulative update++

    • 3 Comments

    Just noticed that a new cumulative update for SQL Server 2005 Service Pack 2 has been released.  The KB is here http://support.microsoft.com/kb/943656.  The incremental servicing model has also been clearly defined in the following knowledge base article: http://support.microsoft.com/kb/935897

  • Benjamin Wright-Jones

    Katmai whitepapers

    • 3 Comments

    There are some good Katmai whitepapers appearing on the Katmai site.  The list is available here http://www.microsoft.com/sql/2008/learning/whitepapers.mspx

    Performance and Scale http://www.microsoft.com/sql/techinfo/whitepapers/sql_2008_perfandscale.mspx 

    Manageability http://www.microsoft.com/sql/techinfo/whitepapers/sql_2008_manageability.mspx

    Delivering location intelligence with spatial data http://www.microsoft.com/sql/techinfo/whitepapers/spatialdata.mspx

    SQL Server 2008 Reporting Services http://www.microsoft.com/sql/techinfo/whitepapers/sql_2008_ssrs.mspx

    What's new for XML in SQL Server 2008 http://www.microsoft.com/sql/techinfo/whitepapers/sql_2008_xml.mspx

    Managing unstructured data http://www.microsoft.com/sql/techinfo/whitepapers/sql_2008_unstructured.mspx

    Server consolidation http://www.microsoft.com/sql/techinfo/whitepapers/hosting_sql_srv_consol.mspx

    Data warehousing http://www.microsoft.com/sql/techinfo/whitepapers/sql-for-datawarehousing_2008.mspx

    Data Programmability http://www.microsoft.com/sql/techinfo/whitepapers/sql_2008_dp.mspx

  • Benjamin Wright-Jones

    Using an SSIS package to monitor and archive the default trace file

    • 3 Comments

    I find it frustrating that the SQL Server 2005/2008 default trace is continually overwritten and there is no way to store x number of files or x MBs of data.  As a workaround, I developed an SSIS package to monitor the \LOG folder and automatically archive the default trace file whenever a new file is created.

    This consists of a FOR LOOP container, a Script Task and a File System Task plus a whole bunch of variables and property expressions.

    The guts of the package are really in the Script Task, as this is where I use a WMI query to monitor the \LOG folder for .trc files.  The file is then renamed (date-time-servername-file) and copied to another folder/share, which can be a UNC path, e.g. \\server\share.  This way I have a permanent record of the basic server activity for root cause analysis/troubleshooting. 
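
    As an aside, the location of the active default trace can be confirmed from T-SQL, and an archived copy can be read back with fn_trace_gettable for analysis (the UNC path below is hypothetical):

    -- Locate the active default trace file (the path points into the instance \LOG folder)
    SELECT id, path, max_size, is_default
    FROM sys.traces
    WHERE is_default = 1;

    -- Read an archived copy back into relational form
    SELECT *
    FROM fn_trace_gettable('\\server\share\20090101-myserver-log_123.trc', DEFAULT);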

    The screenshot below shows the basic structure of the package.

    image

  • Benjamin Wright-Jones

    Using a C# script task in SSIS to download a file over http

    • 3 Comments

    I recently had a requirement to automate downloading a file from a website and then perform ETL on the data in the file.  Fortunately, this is possible via the Script Task in SSIS (note that this is using SQL Server 2008 Integration Services).  I found a couple of web references for doing this in VB.NET, but I prefer C#, so I modified the code and made some adjustments to suit my (debugging) needs.  I use two package variables to store the URL (source) and filename (destination); these are read into RemoteURI and LocalFileName in the script below.

    This works really well and I can change the variables at run-time using property expressions.

    // Note: the Script Task project also needs "using System.Net;" (WebClient)
    // and "using System.Windows.Forms;" (MessageBox) alongside the default namespaces.
    public void Main()
           {
               WebClient myWebClient;
               string RemoteURI;
               string LocalFileName;
               bool FireAgain = true;

               Dts.Log("entering download..", 999, null);

               try
               {
                   myWebClient = new WebClient();

                   RemoteURI = Dts.Variables["User::vPipeline"].Value.ToString();
                   LocalFileName = Dts.Variables["User::vLocalFileName"].Value.ToString();

                   Console.WriteLine(RemoteURI);
                   Console.WriteLine(LocalFileName);

                   MessageBox.Show(RemoteURI);
                   MessageBox.Show(LocalFileName);

                   // Notification
                   Dts.Events.FireInformation(0, String.Empty, String.Format("Downloading '{0}' from '{1}'", LocalFileName, RemoteURI), String.Empty, 0, ref FireAgain);

                   // Download the file
                   myWebClient.DownloadFile(RemoteURI, LocalFileName);

                   Dts.TaskResult = (int)ScriptResults.Success;

               }

               catch (Exception ex)
               {
                   // Catch and handle error
                   Dts.Events.FireError(0, String.Empty, ex.Message, String.Empty, 0);
                   Dts.TaskResult = (int)ScriptResults.Failure;
               }

           }

  • Benjamin Wright-Jones

    Initializing a Transactional Replication Subscriber from an Array-Based Snapshot

    • 2 Comments

    After a few late nights, some coffee and a few review cycles, a new article has just been published on the SQLCAT site.  It provides an overview of the subscriber initialisation techniques for transactional replication and, more specifically, initialising from an array-based snapshot: http://sqlcat.com/technicalnotes/archive/2009/05/04/initializing-a-transactional-replication-subscriber-from-an-array-based-snapshot.aspx

  • Benjamin Wright-Jones

    New Microsoft Certification Tier

    • 2 Comments

    The Microsoft Certified Master (MCM) program was announced this week at Tech-Ed in the US.  The public URL is now live and all the information can be viewed here http://www.microsoft.com/learning/mcp/master/default.mspx

  • Benjamin Wright-Jones

    Debugging Parameters in the SSIS Data Flow (Script Component)

    • 2 Comments

    I had unfortunately picked up the (bad) habit of debugging my SSIS packages by using MessageBox.Show at design time.  The SSIS data flow Script component does not allow debugging using this approach, so I had to find an alternative.  I discovered that it is possible to output parameters at design time using the FireInformation method in the Script component.  This writes the parameters to the Progress window.

    // Output parameters into the SSIS progress window
    bool FireAgain = true;
    this.ComponentMetaData.FireInformation(0, "DataFlowDebug", "The value of time is " + Row.ParamValueTime.ToString(), "", 0, ref FireAgain);
    this.ComponentMetaData.FireInformation(0, "DataFlowDebug", "The value of environment is " + Row.ParamValueEnvironment.ToString(), "", 0, ref FireAgain);

    I like this approach although I should probably output the parameters to one of the other console windows.

  • Benjamin Wright-Jones

    Database mail is cool...

    • 2 Comments

    I'm now a big fan of Database Mail in SQL Server 2005 - it is so much better than SQLMail.  For starters, it is now cluster-aware (customers have been asking for this for a while now), supports SMTP and takes advantage of Service Broker.  Database Mail is configured at the database level, and various objects are installed in the database being enabled for mail.  The stored procedure, sp_send_dbmail, is one of these objects and it takes a variety of parameters, e.g. queries can be used and the results can be attached as a file.  The following script provides a simple example:

    EXEC dbo.sp_send_dbmail
        @profile_name = 'YUKON01_Database_Mail',
        @recipients = '<email address>',
        @body = 'This is a query test',
        @subject = 'Query Test',
        @query = '<T-SQL Statement>',
        @attach_query_result_as_file = 1 ;

    There are also two objects, sysmail_mailitems and sysmail_log, which can be queried to look at the metadata and determine when the last message was sent.  Please feel free to post any comments on your thoughts regarding Database Mail. 
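
    A minimal sketch of the kind of query I mean, assuming the standard msdb column names (sent_date and log_date):

    -- Most recent mail items and log entries (both objects live in msdb)
    SELECT TOP (10) * FROM msdb.dbo.sysmail_mailitems ORDER BY sent_date DESC;
    SELECT TOP (10) * FROM msdb.dbo.sysmail_log ORDER BY log_date DESC;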

  • Benjamin Wright-Jones

    SQL Server 2005 Training, Free!

    • 2 Comments
    Go here for free online learning for SQL Server 2005: https://www.microsoftelearning.com/sqlserver2005/ - not bad as a starting point...
  • Benjamin Wright-Jones

    Using vardecimal storage in SQL Server 2005 SP2 CTP2

    • 2 Comments

    As you probably know, Service Pack 2 for SQL Server 2005 introduces the vardecimal storage format.  Below is a step-by-step example using the AdventureWorks database.  There are a few inaccuracies in the SP2 CTP2 Books Online, although these are not major problems and can be easily resolved; I imagine discussions are ongoing about naming conventions for the vardecimal options. 

    USE master ;
    GO

    -- Enable vardecimal on database
    EXEC sp_db_vardecimal_storage_format 'AdventureWorks', 'ON' ;
    GO

    -- Check the vardecimal storage format state for all databases in the instance
    EXEC sp_db_vardecimal_storage_format
    GO

    -- Enable vardecimal compression at the table level
    USE AdventureWorks
    GO
    -- Note: The BOL example incorrectly references 'decimal data compression'
    EXEC sp_tableoption 'Sales.SalesOrderDetail', 'vardecimal storage format', 1
    GO

    -- Does not show vardecimal properties
    EXEC sp_help 'Sales.SalesOrderDetail'

    -- So, use the TableHasVarDecimalStorageFormat objectproperty
    USE AdventureWorks ;
    GO
    SELECT name, object_id, type_desc
    FROM sys.objects
    WHERE OBJECTPROPERTY(object_id,
    N'TableHasVarDecimalStorageFormat') = 1 ;
    GO

    -- Under the covers, this uses sys.dm_db_index_physical_stats to calculate the stats
    -- Documented in BOL CTP2 as sp_estimatedecimalcompression
    EXEC sp_estimated_rowsize_reduction_for_vardecimal 'Sales.SalesOrderDetail' ;

    -- Clean-up / disable vardecimal storage format
    USE AdventureWorks
    GO

    -- Disable table-level storage format
    EXEC sp_tableoption 'Sales.SalesOrderDetail', 'vardecimal storage format', 0
    GO
    USE master;
    GO


    -- Disable database property
    EXEC sp_db_vardecimal_storage_format 'AdventureWorks', 'OFF' ;
    GO

  • Benjamin Wright-Jones

    Redux: Using an SSIS package to monitor and archive the default trace file

    • 2 Comments

    I’ve recently been asked for details regarding the SSIS package I wrote to copy and archive the default trace file which is generated by SQL Server.  The contents of the file can be quite useful for troubleshooting or diagnosis purposes.

    I’ve updated the package to work with SQL Server 2008 R2 and SQL Server 2012.  

    The detection of a new trace file is implemented using a Script Task which watches the \LOG folder for new files.  When a new file is detected, the previous file is copied to an archive location.   The logic is written in Visual Basic .NET (not my personal choice, although I was limited to it in SQL Server 2005 when I wrote the package). 

    image

    The archive process renames the file with the date and time and then copies the file to a chosen location.   I should point out that I use expressions on a few variables to alter outputs such as the filename i.e. date_time_filename. 

    I also noticed that the service needs appropriate permissions both to access the \LOG directory and to copy to the target directory; in my scenario this was \\server\share.  When I was testing, I launched SSDT (SQL Server Data Tools) with Administrator privileges as a quick workaround to permission issues.

    image

    Here is the code for the Script Task (apologies for the word wrap; the Live Writer plug-in does this to fit it on the page).  I have commented out some of the WriteLine commands I was using to debug the package when it was initially developed.

    'Disclaimer:
    'The sample scripts and SSIS package are not supported under any Microsoft standard support program or service.
    'The sample scripts and SSIS package are provided AS IS without warranty of any kind.
    'Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose.
    'The entire risk arising out of the use or performance of the sample scripts and documentation remains with you.
    'In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits,
    'business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.

    Public Sub Main()

        Dim vars As Variables
        Dim strComputer As String
        Dim objWMIService As Object
        Dim colMonitoredEvents As Object
        Dim objEventObject As Object
        Dim strSourceDirectory As String
        Dim strServerName As String
        Dim strSourceErrorLogDirectory As String
        Dim strSourceErrorLogDirectoryWithQuotes As String

        Try
            ' Use the SSIS variables in this code for the WMI query
            strServerName = Dts.Variables("v_ServerName").Value
            strSourceErrorLogDirectory = Dts.Variables("v_DT_SourceLogDirectory").Value

            Console.WriteLine("Servername: " + strServerName)
            Console.WriteLine("Monitoring \Log Directory: " + strSourceErrorLogDirectory)

            ' Replace \ with \\\\ which is needed for the WMI query
            strSourceErrorLogDirectory = Replace(strSourceErrorLogDirectory, "\", "\\\\")
            strSourceErrorLogDirectoryWithQuotes = Chr(34) & strSourceErrorLogDirectory & Chr(34)

            'MsgBox("Server Name: " + strServerName)

            ' Connect to the WMI source
            objWMIService = GetObject("winmgmts:\\" & strServerName & "\root\cimv2")

            ' Monitor the error log folder for instances of ERRORLOG.1 as this is the file we want to archive
            ' The directory name is parameterised and populated from the SSIS variable

            ' Monitor the directory for new default trace files
            colMonitoredEvents = objWMIService.ExecNotificationQuery _
                ("SELECT * FROM __InstanceCreationEvent WITHIN 10 WHERE " _
                    & "Targetinstance ISA 'CIM_DirectoryContainsFile' and " _
                        & "TargetInstance.GroupComponent= " _
                                & "'Win32_Directory.Name=" & strSourceErrorLogDirectoryWithQuotes & "'")

            objEventObject = colMonitoredEvents.NextEvent()

            'MsgBox("A new file was just created: " + objEventObject.TargetInstance.PartComponent)

            Dim strReturned, strFilePath As String

            strReturned = objEventObject.TargetInstance.PartComponent
            strFilePath = Split(strReturned, "CIM_DataFile.Name=")(1)
            strFilePath = Replace(strFilePath, """", "")
            strFilePath = Replace(strFilePath, "\\", "\")
            'MsgBox("Sliced file: " + strFilePath)

            ' strFilePath is C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\LOG\log_nnn.trc
            ' Array element (6) is log_nnn.trc, this assumes the above directory structure
            Dim strFilename As String

            'strFilename = Split(strFilePath, "\")(6)
            'MsgBox("Split: " + strFilename)

            strFilename = System.IO.Path.GetFileName(strFilePath)
            'MsgBox("IO.Path: " + strFilename)

            ' If filename like log_ then enter this code path
            ' The default trace filename is always log_ so we can rely on this for filename matching
            If strFilename Like "log_*.trc" Then

                Console.WriteLine("A new default trace file was just created in \LOG called " + strFilename)
                Trace.WriteLine("A new default trace file was just created in \LOG called " + strFilename)

                ' Archive the previous default trace file
                'MsgBox("Default Trace found, now process the file")

                Dim arrContainer As Array
                Dim intTraceFileNumber As Integer
                Dim strArchiveFileName As String

                arrContainer = Split(strFilename, "_")

                'Console.WriteLine(arrContainer(0).ToString)
                'Console.WriteLine(arrContainer(1).ToString)

                ' Split 1111.trc so we only store 1111 to convert to int
                arrContainer = Split(arrContainer(1), ".")

                ' This is the active default trace file number
                'Console.WriteLine(arrContainer(0).ToString)

                ' Convert the active trace file number to int and decrease by 1
                intTraceFileNumber = CInt(arrContainer(0)) - 1

                ' Convert back to string and create the default trace file name
                strArchiveFileName = "log_" + CStr(intTraceFileNumber) + ".trc"
                'Console.WriteLine("Archiving: " + strArchiveFileName + " to " + Dts.Variables("v_DT_DestinationDefaultTraceDirectory").Value)

                'MsgBox(strArchiveFileName)

                'Write the filename to the SSIS variable
                Dts.Variables("v_DT_ActiveFileName").Value = strArchiveFileName

                Console.WriteLine("Archiving: " + strArchiveFileName + " to " + Dts.Variables("v_DT_DestinationDefaultTraceDirectory").Value)
                MsgBox("Output to SSIS Variable: " + Dts.Variables("v_DT_ActiveFileName").Value + " strFilename: " + strArchiveFileName)

                ' Indicate success to move on to the next step
                Dts.TaskResult = ScriptResults.Success
            End If

            ' Error handling
        Catch ex As Exception
            Console.WriteLine(System.DateTime.Now.ToString + " - SSIS Script Task Error: " + ex.Message.ToString)
            Dts.TaskResult = ScriptResults.Failure
        End Try

    End Sub

     

    I hope you find this useful.

    I will try and attach the SSIS package to this post later.

  • Benjamin Wright-Jones

    Editing XML using PowerShell

    • 1 Comments

    I recently had a requirement to edit an XML document using PowerShell.  Fortunately, this is a relatively easy task, as shown below.  I’ve used the %USERPROFILE% environment variable to populate the path to the file.

    The script modifies a value in the XML document (highlighted in the example below) and ensures this is persisted using the .Save method.  The intention is to execute this script during user logon to ensure the value is always disabled in the application, ProClarity Web Professional 6.3 SP3.

    clip_image002

    # Changes CheckQuerySize="Yes" to CheckQuerySize="No"

    # Windows XP
    $xmlFile = "$env:userprofile\Application Data\ProClarity\Client\Options.xml"
    $xmlDoc = [XML](gc $xmlFile)
    $xmlDoc.ProClarity5Options.ProClarityApp.CheckQuerySize = "No"
    $xmlDoc.Save($xmlFile)

    # Windows Vista and Windows 7
    $xmlFile = "$env:userprofile\AppData\Roaming\ProClarity\Client\Options.xml"
    $xmlDoc = [XML](gc $xmlFile)
    $xmlDoc.ProClarity5Options.ProClarityApp.CheckQuerySize = "No"
    $xmlDoc.Save($xmlFile)

  • Benjamin Wright-Jones

    Processing Excel files in SSIS 2008 – Lessons Learned

    • 1 Comments

    I have recently been working on a project where we need to load and transform data held in Excel 2007 into a SQL Server database.  The Excel sheets were fairly complex and had different structures.  Fortunately, SSIS 2008 provides some good tools to handle this situation. 

    Lessons learned

    1. It is possible to read the Excel metadata using the mechanism described in this knowledge base article: http://support.microsoft.com/kb/318452 HOW TO: Retrieve Meta Data from Excel by Using GetOleDbSchemaTable in Visual C# .NET.  You may be wondering why I want to do this.  Handling Excel sheets with different structures can be tricky, so reading the metadata can help determine control flow processing, i.e. which data flow to use to process the file.

    2. Remember, if you are testing on x64, your package will not execute if you are using the Excel source, since it is not supported on 64-bit; you need to set the Run64BitRuntime property to False, as shown below

    image

    3. The Script component is really great when it comes to data manipulation in the data flow.  This can be used as either a source, transformation or destination and allows you to manipulate the rows in the pipeline using either VB.NET or C#.

    4. As mentioned, Excel files can also be read using the Execute SQL task in the control flow, which is a nice feature e.g. SELECT * FROM $PivotData

    5. The File System Task can make file handling a lot easier.  Combined with variables and property expressions, dynamic file manipulation becomes a whole lot easier.  For example, after processing I move the file to another folder based on the outcome, e.g. success or failure. 

  • Benjamin Wright-Jones

    Transposing Columns onto Rows

    • 1 Comments

    After a long period of absence, I have returned to posting some blog articles.  This one popped up last week from an Oracle DBA while I was onsite with a customer.  The conversation went something along the lines of:

    Oracle DBA: “How can I place columns values on rows? I have a limited page width and need to alter the output”

    Microsoftie “ah.. well, that’s easy..”

    Consider the output below:

     

    image

    but we want it to appear like this:

    image

     

    The magic is really in the UNPIVOT operator, as shown below.

    CREATE DATABASE sandbox;
    
    USE sandbox;
    
    CREATE TABLE tblPerson
    (
        Email_Address varchar(50),
        First_Name varchar(50),
        Last_Name varchar(50)
    );
    
    INSERT INTO tblPerson VALUES
    ('ben@test.com', 'Ben', 'WJ')
    
    SELECT * FROM tblPerson;
    SELECT tblPivot.Property, tblPivot.Value
    FROM (SELECT CONVERT(sql_variant, Email_Address) AS Email_Address,
                 CONVERT(sql_variant, First_Name)    AS First_Name,
                 CONVERT(sql_variant, Last_Name)     AS Last_Name
          FROM tblPerson) Person
    UNPIVOT (Value FOR Property IN (Email_Address, First_Name, Last_Name)) AS tblPivot;
  • Benjamin Wright-Jones

    Testing SQL Server HA using Hyper-V in Windows 8 Release Preview

    • 1 Comments

    I often find the need to build and test SQL Server clusters to support native two-node or N+1 (multi-instance) scenarios and also to validate SQL Server 2012 AlwaysOn configurations.  I was previously running Windows Server 2008 R2 on my laptop, but I had a few issues using it as a day-to-day operating system, e.g. no support for standby when Hyper-V was enabled, no dual boot (Hyper-V on/off) with BitLocker enabled, no Bluetooth support, etc.  Fortunately, Windows 8 meets all my needs and the experience has been excellent to date.

    Note: for rapid provisioning of other images, I created a fully patched base OS image which was sysprep’d (this is incredibly easy).  I now just copy the image if I want to create other server roles e.g. System Center 2012.

    The only issue I have found to date is that host internet connectivity is affected (delayed) after defining multiple internal network adapters, so I switched to using private network adapters.   I only need internal adapters for host-to-guest connectivity, e.g. copying files.

    My hardware and software is defined below:

    • Laptop HP 8540w (16GB memory, 4 cores hyper-threaded)
    • Two internal SATA disks (hybrid)
    • Windows 8 Release Preview (x64) with Hyper-V enabled

    The environment consists of the following virtual guest images:

    • 1 x Domain Controller (also hosts the virtual storage)
    • 1 x Primary Node (node 1)
    • 1 x Secondary Node (node 2)

    All servers are running Windows Server 2008 R2 Enterprise Edition x64 with Service Pack 1.  The cluster role is enabled on node 1 and node 2. 

    Storage

    The shared storage is provisioned using the iSCSI Software Target, which I set up on the domain controller; I didn’t find a need to dedicate a specific storage server.  A screenshot of the virtual storage is presented below.

    image

     

    image

    Cluster validation passed; the only warning was inconsistent OS patch levels on the two cluster nodes, as shown below.

     

    image

    After running cluster validation, I created a cluster and the final configuration is shown below. 

    Below is a screenshot of the raw cluster configuration.

    image

    I’ll blog more following the SQL Server install.  I also plan to repeat this for Windows Server 2012 and SQL Server 2012.

  • Benjamin Wright-Jones

    Sysinternal Utilities on TechNet!

    • 1 Comments

    The Sysinternals tools are now on TechNet.  I know this is not specifically related to SQL Server, but they are very useful tools which can help diagnose performance problems.   I particularly like Process Explorer and Process Monitor (and they work on x64 & Vista too!).  Get them here: http://www.microsoft.com/technet/sysinternals/processesandthreadsutilities.mspx

  • Benjamin Wright-Jones

    SQL Server 2005 Service Pack 2 required for Longhorn & Vista

    • 1 Comments

    In case you were wondering, Longhorn & Vista will only support SQL Server 2005 Service Pack 2. This should be out as a Community Technology Preview (CTP) build sometime soon (I'll blog about the forthcoming changes once we have publicly announced the CTP details). A more detailed explanation on SQL Support for Longhorn & Vista is posted here: http://www.microsoft.com/sql/howtobuy/sqlonvista.mspx

  • Benjamin Wright-Jones

    Finding the NTFS Sector Size via DBCC

    • 1 Comments

    You can determine the formatted sector size of the database files using DBCC FILEHEADER ('<dbname>').  The DBCC command actually returns quite a lot of information, but we are only interested in the sector size column.
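
    A minimal invocation is shown below; AdventureWorks is used purely as an example, and the exact name of the sector size column may vary between builds.

    -- Undocumented command: returns one row per database file,
    -- including the formatted sector size
    DBCC FILEHEADER ('AdventureWorks');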
