CSS SQL Server Engineers

This is the official team Web Log for Microsoft Customer Service and Support (CSS) SQL Support. Posts are provided by the CSS SQL Escalation Services

I think I am getting duplicate query plan entries in SQL Server’s procedure cache


Before the post dives into the subject I need to point out that Keith did most of the work. I just kept pestering him with various scenarios until he sent me the e-mail content I needed. Thanks, Keith!

Keith devised a set of steps you can use to collect information about cached plans and their associated cache-key attributes. Using these queries you can track down entries in the procedure cache that share the same handles and determine which attribute differs, indicating why there appear to be duplicate entries for the same query. It is often as simple as a SET option difference.
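A quick way to see a SET-option difference in action (a sketch for a test server only, since DBCC FREEPROCCACHE clears the entire plan cache) is to run the same ad hoc statement under two different SET states and then look for the two resulting cache entries:

```sql
DBCC FREEPROCCACHE;  -- test servers only: this clears the whole plan cache

SET ARITHABORT ON;
SELECT name FROM sys.objects WHERE object_id = 4;

SET ARITHABORT OFF;
SELECT name FROM sys.objects WHERE object_id = 4;

-- Two cache entries for identical query text, differing only in the
-- set_options plan attribute
SELECT p.plan_handle, st.text
FROM sys.dm_exec_cached_plans p
CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) AS st
WHERE st.text LIKE '%sys.objects WHERE object_id = 4%'
  AND st.text NOT LIKE '%dm_exec_cached_plans%';
```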

From: Keith Elmore

Bob asked me to take a quick look and see if I could make some headway on understanding why there appear to be duplicate plans in cache for the same sql_handle and query_hash. In researching this, I found that if you call a procedure that references a temp table created outside of that scope, the cached plan has the session_id as part of its cache key. From http://technet.microsoft.com/en-us/library/ee343986(v=SQL.100).aspx:

If a stored procedure refers to a temporary table not created statically in the procedure, the spid (process ID) gets added to the cache key. This means that the plan for the stored procedure would only be reused when executed again by the same session. Temporary tables created statically within the stored procedure do not cause this behavior.

Because the customer is invoking this query via sp_executesql and the temp table is created outside of the sp_executesql, the above condition applies, and the theory is that this could be causing the larger number of entries even though the sql_handle and query_hash are the same. But in order to confirm this theory we need some additional data. If the customer wants to pursue this, the following queries are what I'd want to run:
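The pattern in question looks roughly like this (table and parameter names here are illustrative): the temp table is created in the outer batch, and the statement that references it is compiled inside sp_executesql, so the spid becomes part of its cache key and each session ends up with its own plan entry:

```sql
-- Temp table created in the outer scope...
CREATE TABLE #work (id int, payload varchar(100));

-- ...then referenced by a statement compiled inside sp_executesql.
-- Because #work was not created in the same scope, the session_id is
-- added to the cache key, so this plan is only reused by this session.
EXEC sp_executesql
    N'SELECT w.id, w.payload FROM #work AS w WHERE w.id = @id',
    N'@id int',
    @id = 1;

DROP TABLE #work;
```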

1. A single execution of this query from SSMS.

-- Look and see if there is any hash bucket with a large number of entries (> 20),
-- which may cause slower lookup of entries
select p1.*
from sys.dm_exec_cached_plans p1
join (select bucketid, count(*) as cache_entries,
             count(distinct plan_handle) as distinct_plans
      from sys.dm_exec_cached_plans p
      group by bucketid
      having count(*) > 20) as p2 on p1.bucketid = p2.bucketid

2. Run the following query from SSMS, which will save all of these "duplicate" plans into a permanent table that we’ll retrieve.

-- Save all of the "duplicate" plans for this specific query in a table in tempdb
select qs.sql_handle, qs.statement_start_offset, qs.statement_end_offset,
       qs.creation_time,
       qs.execution_count,
       qs.plan_generation_num,
       p.*
into tempdb..DuplicateCachePlans
from sys.dm_exec_query_stats qs
join sys.dm_exec_cached_plans p on qs.plan_handle = p.plan_handle
where qs.sql_handle = 0x0200000093281821F68C927A031EDA1B661FC831C10898D0
  -- query_hash is binary, so compare against a binary literal (no quotes)
  and qs.query_hash = 0x07BD94E2146FD875

3. From a command prompt, bcp out the data from the table above, as well as the plan_attributes data for each of these plans (add the appropriate server name with the -S parameter, and optionally add a path to where you want each output file written).

bcp "select * from tempdb..DuplicateCachePlans" queryout cached_plans.out -n -T

bcp "select p.plan_handle, pa.* from tempdb..DuplicateCachePlans p cross apply sys.dm_exec_plan_attributes (p.plan_handle) as pa" queryout plan_attributes.out -n -T
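If you would rather eyeball the differences directly in SSMS before exporting, you can restrict the attribute output to the attributes that actually participate in the cache lookup (this sketch assumes the table from step 2 still exists):

```sql
-- Only attributes with is_cache_key = 1 are part of the cache lookup,
-- so any value that varies across these plans (set_options, user_id,
-- the session-specific entry for outside-scope temp tables, etc.)
-- explains why separate entries exist for the same sql_handle.
SELECT p.plan_handle, pa.attribute, pa.value
FROM tempdb..DuplicateCachePlans p
CROSS APPLY sys.dm_exec_plan_attributes(p.plan_handle) AS pa
WHERE pa.is_cache_key = 1
ORDER BY pa.attribute, pa.value;
```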

4. Back in SSMS, you can drop the table created in step 2.

drop table tempdb..DuplicateCachePlans

-Keith

 Bob Dorr - Principal SQL Server Escalation Engineer

Comments
  • Indeed this is a very helpful post!

    What can be done to resolve such issues? Is a code change the only option?

  • Nice, thanks for squeezing out the good stuff and placing in the Public Domain.
