SQL in Stockholm

SQL Server thoughts from the support team in Stockholm

SQL Server troubleshooting thoughts from Graham Kent, (ex) Senior Support Escalation Engineer at Microsoft Sweden.


    Unable to use forced parameterization when deploying plan guides in SQL Server 2005


    I was working a case last week where I wanted to put in place some fairly simple plan guides for a query that had picked a bad plan on a particular deployment of an application. I was expecting it to be an easy process. The query in question executed several tens of thousands of times, and had 2 parameters generated by the application. On the server I was analyzing, performance had degraded due to a bad plan, but I had a trace from another server running the same application, where I had captured the good plan. This ought to have been a classic case for a plan guide (since I couldn’t change the SQL batch coming from the application, and the server had been managed properly, so I couldn’t address the issue with a simple statistics update or the like). Anyway it drove me mad for a few hours as I couldn’t get it to work, so I thought I would share what turned out to be the simple reason why :-)

    I needed to do 2 things:

    1. Create a plan guide which forced parameterization of the query, since I had noted it wasn’t being parameterized by default.

    2. Create a plan guide which forced a specific plan on the parameterized form of the query

    Both of these are fairly standard steps; step 1 is documented here:

    Specifying Query Parameterization Behavior by Using Plan Guides

    and step 2 is documented here:

    Using the USE PLAN Query Hint
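
    The two steps above can be sketched roughly as follows. This is a minimal illustration, not the actual guides from the case: the query text, table, and guide names are hypothetical, and the USE PLAN XML is a placeholder for the showplan you would capture from the server producing the good plan.

    ```sql
    -- Step 1: a TEMPLATE plan guide forcing parameterization of the query.
    -- sp_get_query_template returns the parameterized form of the statement
    -- and the matching parameter list.
    DECLARE @stmt nvarchar(max), @params nvarchar(max);
    EXEC sp_get_query_template
        N'SELECT * FROM dbo.Orders WHERE OrderId = 1045 AND CustomerId = 72',
        @stmt OUTPUT, @params OUTPUT;

    EXEC sp_create_plan_guide
        @name = N'ForceParamGuide',
        @stmt = @stmt,
        @type = N'TEMPLATE',
        @module_or_batch = NULL,
        @params = @params,
        @hints = N'OPTION (PARAMETERIZATION FORCED)';

    -- Step 2: a SQL plan guide attaching a specific plan to the
    -- parameterized form produced above. The ShowPlanXML content below is
    -- a placeholder for the good plan captured in the trace.
    EXEC sp_create_plan_guide
        @name = N'ForcePlanGuide',
        @stmt = @stmt,
        @type = N'SQL',
        @module_or_batch = NULL,
        @params = @params,
        @hints = N'OPTION (USE PLAN N''<ShowPlanXML ...>...</ShowPlanXML>'')';
    ```

    Note that the statement text passed to sp_get_query_template has to match what the application sends, character for character, for the guide to be matched at run time.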

    I’ve done this many times before, but whatever I did, I couldn’t get step 1 to fly, so I couldn’t go any further. After much double-checking of what I was doing (including trying to set forced parameterization at the database level), I decided to take a step back and check the prerequisites of forced parameterization at the database level. Books Online lists the exceptions where parameterization cannot be forced, so I went back and checked them all, even though my SQL batch seemed extremely simple. I found that there were 2 problems:

    a) Deprecated query constructs

    b) Any statement that is run in the context of ANSI_PADDING or ANSI_NULLS set to OFF

    It turned out that the connections the application was using were issuing SET ANSI_NULLS statements manually, just after they connected. This statement is itself deprecated, which ruled out parameterization on its own, and ironically it did so doubly, since running with ANSI_NULLS set to OFF also prevents parameterization from being forced.
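
    If you want to check whether connections are running with these options off, the session DMV exposes the SET options in effect. A quick sketch (SQL Server 2005 and later):

    ```sql
    -- Sessions running with ansi_nulls = 0 or ansi_padding = 0 cannot have
    -- their statements parameterized by a forced-parameterization plan guide.
    SELECT session_id, program_name, ansi_nulls, ansi_padding
    FROM sys.dm_exec_sessions
    WHERE is_user_process = 1;
    ```

    This is how you can spot an application silently changing its connection settings without having to capture the SET statements themselves in a trace.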

    A little something to watch out for.


    How to Customise SQLDIAG in SQL Server 2008 to collect DBCC MEMORYSTATUS


    Although Books Online makes some mention of SQLDIAG customizations, it doesn’t give any explicit examples, so here’s one for you. I documented previously here how to customize your XML configuration file to connect to different machines and collect output from the standard collectors, and I also mentioned how you could download the PerfStats scripts and use the XML files they come supplied with. The following example, though, shows how to insert a custom script of your own, which can be anything of your choice. I’ll be using DBCC MEMORYSTATUS output to demonstrate. The thing to remember here is that I’m not saying this is the only way to collect such data, but that as ever, SQLDIAG allows you to do everything in one place, in a controlled and easy-to-administer way.

    Firstly we need a script which will collect DBCC MEMORYSTATUS repeatedly. The following one won’t win any design prizes but will do the job just nicely:

    WHILE 1=1

    BEGIN

    select 'start time:', GETDATE();

    DBCC MEMORYSTATUS

    WAITFOR DELAY '00:00:10'

    END;

    This will run DBCC MEMORYSTATUS every 10 seconds until you kill it.

    Next we need to create an XML configuration to run this script using SQLDIAG. I’ll take the default SQLDIAG.XML which unpacks when you run the sqldiag.exe for the first time. If you need any explanations of these pre-requisites then go back to the SQLDIAG link above and read the background.

    <CustomDiagnostics>
    <CustomGroup name="msinfo" enabled="true" />
    <CustomTask enabled="true" groupname="MsInfo" taskname="Get MSINFO32" type="Utility" point="Startup" wait="OnlyOnShutdown" cmd="start /B /WAIT MSInfo32.exe /computer %server% /report &quot;%output_path%%server%_MSINFO32.TXT&quot; /categories +SystemSummary+ResourcesConflicts+ResourcesIRQS+ComponentsNetwork+ComponentsStorage+ComponentsProblemDevices+SWEnvEnvVars+SWEnvNetConn+SWEnvServices+SWEnvProgramGroup+SWEnvStartupPrograms" />
    <CustomTask enabled="true" groupname="MsInfo" taskname="Get default traces" type="Copy_File" point="Startup" wait="OnlyOnShutdown" cmd="&quot;%sspath%log\log*.trc&quot; &quot;%output_path%&quot;" />
    <CustomTask enabled="true" groupname="MsInfo" taskname="Get SQLDumper log" type="Copy_File" point="Startup" wait="OnlyOnShutdown" cmd="&quot;%sspath%log\SQLDUMPER_ERRORLOG*.*&quot; &quot;%output_path%&quot;" />
    </CustomDiagnostics>

    If you look in the above section we already have one custom group, called MsInfo, which has 3 sub-tasks. (The format of these is worth noting if you wish to do any file manipulations or run command line tools.) What we want to do is add another CustomGroup, which will contain a single task: our DBCC script from above. The script itself needs to be saved into a file for the collector to access it, so in this scenario we’ll save the script into the same root directory as sqldiag.exe. If you’re planning on doing this for many different scripts and scenarios, you might be better off organising them in appropriately named sub-directories. So I’ve saved my script above as the following:

    C:\Program Files\Microsoft SQL Server\100\Tools\Binn\exec_dbcc_memorystatus.sql

    Then I need to add the following XML to tell SQLDIAG that I have a new custom group of diagnostics (called SQLscripts), which currently contains 1 custom task (our DBCC MEMORYSTATUS script). The new section looks like this:

    <CustomDiagnostics>
    <CustomGroup name="msinfo" enabled="true" />
    <CustomTask enabled="true" groupname="MsInfo" taskname="Get MSINFO32" type="Utility" point="Startup" wait="OnlyOnShutdown" cmd="start /B /WAIT MSInfo32.exe /computer %server% /report &quot;%output_path%%server%_MSINFO32.TXT&quot; /categories +SystemSummary+ResourcesConflicts+ResourcesIRQS+ComponentsNetwork+ComponentsStorage+ComponentsProblemDevices+SWEnvEnvVars+SWEnvNetConn+SWEnvServices+SWEnvProgramGroup+SWEnvStartupPrograms" />
    <CustomTask enabled="true" groupname="MsInfo" taskname="Get default traces" type="Copy_File" point="Startup" wait="OnlyOnShutdown" cmd="&quot;%sspath%log\log*.trc&quot; &quot;%output_path%&quot;" />
    <CustomTask enabled="true" groupname="MsInfo" taskname="Get SQLDumper log" type="Copy_File" point="Startup" wait="OnlyOnShutdown" cmd="&quot;%sspath%log\SQLDUMPER_ERRORLOG*.*&quot; &quot;%output_path%&quot;" />
    <CustomGroup name="SQLscripts" enabled="true" />
    <CustomTask enabled="true" groupname="SQLscripts" taskname="dbcc memorystatus" type="TSQL_Script" point="Startup" wait="No" cmd="exec_dbcc_memorystatus.sql"/>
    </CustomDiagnostics>

    Next I save this XML file under a new name, let’s say SQLDIAG_memstatus.XML, and then I just execute the sqldiag tool with an input file switch like this:

    sqldiag /ISQLDIAG_memstatus.XML

    and away we go. If you look in your output directory, you’ll find the following file:

    <yourservername>__exec_dbcc_memorystatus_Startup.OUT

    which contains all the DBCC MEMORYSTATUS output.

    All you need to do now is create custom XML files and scripts for as many scenarios as you can think of! If you examine the entries for the MsInfo tasks, they might give you a few ideas as well.


    TechDays Sweden: post presentation questions and follow ups


    Following my presentation here in Västerås yesterday there were a few people waiting to speak to me who I didn't have time for. If you were one of those people and want to ask some questions, you're more than welcome to post them here in the comments, or contact me through the contact section on this site, which comes direct to my work mail box. I'll attempt to follow up on everything, I just can't promise any timescales.


    Come and say Hi at TechDays Sweden tomorrow....

    I'll be presenting (my usual mantra on using SQLDIAG to troubleshoot your own problems) at 16:15 tomorrow. The support team will also be running a desk where there'll be 3 or 4 of us at all times to answer general support-related questions. I've got four 90 minute shifts on there, so come over and have a chat if you're not running off to loads of other talks.

    sys.dm_exec_query_stats DMV can report invalid information for parallel plans


    Elisabeth Redei asked me to respond to a question earlier today following a comment on her blog post about performance monitoring using DMVs. The question asked was

    "when will sys.dm_exec_query_stats be fixed to report worker time (cpu) for queries with parallel execution plans?"

    The answer is that we have already fixed it, but the fix has been checked into the next version of SQL Server (i.e. the next major release after SQL Server 2008). While fixing it we also fixed another problem, where the same DMV can report NULL in the dbid and objectid columns under certain circumstances.

    Currently there are no plans to backport this fix into SQL 2005 or 2008.


    New SQL Blogger - a must read


    An ex-colleague of mine Elisabeth Redei, has just opened a new blog at http://sqlblog.com/blogs/elisabeth_redei/default.aspx 

    Elisabeth is extremely knowledgeable and has spent many years in Microsoft SQL Server support roles. You should definitely add her to your reading list, if you haven't already seen her at sqlblog.com

    She's just posted a good article here about statistics:

    http://sqlblog.com/blogs/elisabeth_redei/archive/2009/03/01/lies-damned-lies-and-statistics-part-i.aspx
