My name is Eddie Lau and I am a Principal SDET in Microsoft’s Sales and Marketing IT (SMIT) organization. As part of my role, I serve as the performance engineering lead, helping ensure our line-of-business applications meet their defined performance requirements and offer a positive, predictable experience for end users.
Microsoft Dynamics CRM 2011 is the core of our internal sales solution, and the business requires it to manage large quantities of entity data such as Activities and Contacts. This prompted us to evaluate the effect of enabling SQL Server Data Compression on Dynamics CRM tables with millions of records. We were particularly interested in the amount of storage savings that could be achieved, as well as the overall effect on application performance. Here is how the testing and evaluation was accomplished at a high level:
1. Conduct a baseline performance load test by creating custom web tests in Visual Studio 2012, or leverage the Performance Toolkit for Microsoft Dynamics CRM 2011.
2. Retrieve table counts to identify large tables as candidates for data compression. For example:
select object_name(id), rowcnt
from sysindexes
where indid in (0, 1)
and object_name(id) not like 'sys%'
order by rowcnt desc
It is recommended not to select tables that are frequently accessed or continuously updated, such as PrincipalObjectAccess. Entity base and extension base tables are good candidates.
3. Use the sp_spaceused system stored procedure to retrieve data and index sizes in KB. For example:
EXEC sp_spaceused 'ContactBase'
4. Compress the table (enable table compression) by executing ALTER TABLE ... REBUILD WITH. For example:
ALTER TABLE ContactBase REBUILD WITH (DATA_COMPRESSION = ROW)
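Before committing to a rebuild, the expected savings for a candidate table can be previewed with the sp_estimate_data_compression_savings system stored procedure (available since SQL Server 2008); a sketch for the ContactBase table:
EXEC sp_estimate_data_compression_savings
    @schema_name = 'dbo',
    @object_name = 'ContactBase',
    @index_id = NULL,           -- NULL estimates every index on the table
    @partition_number = NULL,   -- NULL estimates every partition
    @data_compression = 'ROW'
The procedure reports current and estimated compressed sizes per index, which helps when deciding between ROW and PAGE compression for a given table.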
5. Optional: Compress the table's underlying indexes (enable index compression) by executing ALTER INDEX ... REBUILD WITH. For example:
ALTER INDEX NC_FirstName ON dbo.ContactBase REBUILD WITH (DATA_COMPRESSION = ROW)
6. Execute sp_spaceused again to measure the new data and index sizes after the rebuild.
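Whether compression actually took effect can also be confirmed from the sys.partitions catalog view, whose data_compression_desc column reports NONE, ROW, or PAGE for each partition; a sketch for ContactBase:
select i.name as index_name,
       p.partition_number,
       p.data_compression_desc
from sys.partitions p
join sys.indexes i
    on p.object_id = i.object_id
   and p.index_id = i.index_id
where p.object_id = object_id('dbo.ContactBase')
Any nonclustered index that was not rebuilt will still show NONE here, which is why index compression is performed as a separate step.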
7. Execute the performance load test again to measure variations in transaction/page execution counts, response times, server resource consumption, etc. Note: the Visual Studio Load Test Report Excel plug-in provides an effective method for test run comparison.
In our testing with SQL Server 2012 ROW compression, an average table size reduction of 49% was achieved by compressing six tables, the smallest of which contains 6 million rows. From a performance perspective, 50% of the tested scenarios, comprising Quick Searches, loading system views, and viewing/updating entities, showed improvements in throughput and response times, while 42% of the scenarios performed at the same level. The remaining 8% showed some performance degradation, which was acceptable in our case.
It is important to note that there is a CPU overhead associated with SQL Server Data Compression. In our performance environment, we observed average SQL Server processor utilization increase from 55% to 74% during tests, which remained under our defined threshold.
In summary, the storage savings and performance improvements that SQL Server Data Compression offers were well demonstrated on our sales solution powered by Microsoft Dynamics CRM 2011.
Reference: Improving Microsoft Dynamics CRM Performance and Securing Data with Microsoft SQL Server 2008 R2