As promised way back in August, here is the first installment of a series of blogs on database performance from the application point of view.
Complex screens can cause database performance problems when each control (grid, list box, etc.) on the screen is treated as an individual object responsible for retrieving its own data. This pattern may seem too basic for experienced programmers to fall into, yet we still see it in production quite frequently. A complex screen is defined by most UI guidelines as one that contains 20 or more controls. The pattern shows up mostly in objects that retrieve data from the database (as opposed to saving data back to it). When many objects each send their own query to the database, the queries usually run serially, meaning the second query waits until the first one finishes.
An even worse pattern is a screen with multiple tab controls where each tab contains many controls, which just compounds the problem described above. Instead of retrieving all of the data on form load, why not retrieve each tab's data when the user clicks on it? Although it would be nice to have every tab display instantly by having the data already at the client, some of these tabs are rarely seen by the user. This approach does mean you need to know your application well enough to tell which tabs users visit frequently and which ones they rarely open. Then you can decide which to populate on form load, and which to load only when needed.
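Here is a minimal sketch of on-demand tab loading in WinForms. The control names (tabControl1, tabOrders, ordersGrid), the connection string, and the dbo.GetOrders procedure are all hypothetical placeholders for your own:

// Load a tab's data only the first time the user selects it.
// tabControl1, tabOrders, ordersGrid, and dbo.GetOrders are hypothetical names.
private bool ordersLoaded = false;

private void tabControl1_SelectedIndexChanged(object sender, EventArgs e)
{
    if (tabControl1.SelectedTab == tabOrders && !ordersLoaded)
    {
        using (SqlConnection conn = new SqlConnection(connString))
        {
            SqlDataAdapter da = new SqlDataAdapter("EXEC dbo.GetOrders", conn);
            DataTable dt = new DataTable();
            da.Fill(dt);   // Fill opens and closes the connection for us
            ordersGrid.DataSource = dt;
        }
        ordersLoaded = true;   // don't hit the database again for this tab
    }
}

The boolean flag is the important part: without it, every tab click becomes another round trip.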
Your first thought might be to multithread the application and make asynchronous queries to the database. If that seems like too much work, ADO.NET 2.0 lets you make asynchronous database calls and use WaitHandles to eliminate the complexity of explicit threading. Check out this sample from the Visual Studio Books-On-Line: ms-help://MS.VSCC.v80/MS.MSDN.v80/MS.VisualStudio.v80.en/WD_ADONET/html/f588597a-49de-4206-8463-4ef377e112ff.htm. Or, if you don’t have VS BOL handy, here are a couple of good samples: http://www.devx.com/dotnet/Article/26747 and http://www.xmlforasp.net/codeSection.aspx?csID=119.
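A sketch of the WaitHandle approach, assuming a connection string with "Asynchronous Processing=true" (required for BeginExecuteReader) and hypothetical procedure names. Note that each in-flight command needs its own connection:

// Fire two queries in parallel and wait for both (ADO.NET 2.0).
// asyncConnString must include "Asynchronous Processing=true";
// dbo.GetCustomer and dbo.GetOrders are hypothetical procedures.
SqlConnection conn1 = new SqlConnection(asyncConnString);
SqlConnection conn2 = new SqlConnection(asyncConnString);
conn1.Open();
conn2.Open();

SqlCommand cmd1 = new SqlCommand("EXEC dbo.GetCustomer", conn1);
SqlCommand cmd2 = new SqlCommand("EXEC dbo.GetOrders", conn2);

IAsyncResult ar1 = cmd1.BeginExecuteReader();
IAsyncResult ar2 = cmd2.BeginExecuteReader();

// Block until both queries complete -- no explicit threading code.
WaitHandle.WaitAll(new WaitHandle[] { ar1.AsyncWaitHandle, ar2.AsyncWaitHandle });

using (SqlDataReader r1 = cmd1.EndExecuteReader(ar1))
{
    // bind the first control from r1
}
using (SqlDataReader r2 = cmd2.EndExecuteReader(ar2))
{
    // bind the second control from r2
}

The queries now overlap on the server instead of running back to back, so the screen loads in roughly the time of the slowest query rather than the sum of all of them.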
An alternative solution is to cache some of the result sets on the client or web tier. If you are using SQL 2005, you can employ the new Cache Sync feature (formerly called Query Notification). By caching data that rarely changes, you avoid frequent round trips to the database. Here’s a link to the chapter on MSDN: http://msdn2.microsoft.com/en-us/library/ms175110.aspx. Use this technique sparingly, because the server keeps one worker thread open for each registered notification. It is best used on the mid-tier to manage a common cache, rather than a separate notification per user session.
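A rough sketch of the mid-tier cache pattern using SqlDependency (the .NET wrapper over Query Notification). The connection string, the dbo.Region table, and the cachedRegions field are hypothetical; also note that notification queries must use explicit column lists and two-part table names:

// Cache a result set on the mid-tier and invalidate it when SQL 2005
// signals a change. connString, dbo.Region, and cachedRegions are
// hypothetical names.
SqlDependency.Start(connString);   // call once per app domain, at startup

DataTable LoadRegions()
{
    using (SqlConnection conn = new SqlConnection(connString))
    {
        // Notification queries require explicit columns and two-part names.
        SqlCommand cmd = new SqlCommand(
            "SELECT RegionID, RegionDescription FROM dbo.Region", conn);
        SqlDependency dep = new SqlDependency(cmd);
        dep.OnChange += delegate(object s, SqlNotificationEventArgs e)
        {
            cachedRegions = null;   // drop the cache; reload on next request
        };
        conn.Open();
        DataTable dt = new DataTable();
        dt.Load(cmd.ExecuteReader());
        return dt;
    }
}

Because the notification lives on the shared mid-tier cache, you register one subscription for all users instead of one per session, which keeps the server-side thread cost fixed.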
My favorite solution for most applications is to have one object that makes one call to the database to retrieve multiple result sets, then loads each individual control. For example, your original code might call three stored procedures from the front end (or middle tier).
Each of these would make a separate round trip to the database and each would return a separate dataset to the calling program (even an empty data set would return the metadata of column names and data types).
This MasterProc would simply call the three original procedures in turn (GetCustomer, GetOrders, and GetOrderDetails are stand-ins for your actual procedure names):
CREATE PROCEDURE dbo.MasterProc
AS
EXEC dbo.GetCustomer
EXEC dbo.GetOrders
EXEC dbo.GetOrderDetails
This MasterProc would return three result sets and you would have to set up a loop in your program to process everything while there are more result sets to process (using IDataReader.NextResult()).
C# example handling multiple result sets (nwindConn is assumed to be an already-open SqlConnection):
SqlCommand myCMD = new SqlCommand("EXEC dbo.MasterProc", nwindConn);
SqlDataReader myReader = myCMD.ExecuteReader();
do
{
    while (myReader.Read())
    {
        // add your processing to interrogate which result set it is
        // so you know which control to load
    }
} while (myReader.NextResult());
myReader.Close();
Note: Don’t put this in production without adding some error handling.
Next blog in the series will be about "Academically correct object modeling".... [Kevin]