Information, tips, news and announcements about SQL Server Analysis Services and PowerPivot directly from the product team.
PowerPivot is an absolutely fantastic performance analysis tool. You might say that I’m biased, but wait until you see firsthand how easy it is to analyze large amounts of performance data from any number of computers. For example, I recently had to analyze resource utilization in a SharePoint farm with 16 front-end servers, 8 Excel Services application servers, 4 PowerPivot application servers, and 2 content database servers. Each load test sampled a variety of performance counters from all of these servers at one-second intervals and generated over 250 million rows of data, and I needed to analyze this data for individual servers as well as for server roles. Sounds like a lot of work? Piece of cake if you use PowerPivot! Here are the steps:
1. Configure Performance Monitor on all servers to log the performance counter data to a central SQL Server database.
2. Import the collected data into PowerPivot.
3. Analyze the data in Excel by using PivotTables and PivotCharts.
Point 1 is by far the most work-intensive step—especially if you have to configure Performance Monitor on some 30 or more servers. As a prerequisite, you must create a relational database on a central SQL server. Let’s call it PerfData. You also need to create a System Data Source Name (DSN) on every computer running Performance Monitor, pointing to the PerfData database. You can simplify this configuration task by importing the following .reg file content into your computers’ registry. Just make sure you replace <Server Name> with the name of your SQL server hosting the PerfData database. Use two backslashes as a separator if you must specify server and instance names, such as SERVER\\POWERPIVOT.
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\ODBC\ODBC.INI]

[HKEY_LOCAL_MACHINE\SOFTWARE\ODBC\ODBC.INI\ODBC Data Sources]
"PerfDSN"="SQL Server"

[HKEY_LOCAL_MACHINE\SOFTWARE\ODBC\ODBC.INI\PerfDSN]
"Driver"="C:\\Windows\\system32\\SQLSRV32.dll"
"Server"="<Server Name>"
"Database"="PerfData"
"Trusted_Connection"="Yes"
Having imported the .reg file content into the registry, you can verify the presence of the System DSN called PerfDSN by using ODBC Data Source Administrator, which is available in the Control Panel under Administrative Tools.
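If you prefer to verify the DSN programmatically rather than through the Control Panel, a small script can attempt a connection through it. The following is only an illustrative Python sketch; it assumes the third-party pyodbc package is available, and the DSN name matches the .reg file above.

```python
# Quick connectivity check for the PerfDSN system DSN.
# Assumes the pyodbc package is installed (pip install pyodbc);
# the DSN name follows the .reg file shown earlier.

def build_dsn_connection_string(dsn="PerfDSN", trusted=True):
    """Compose an ODBC connection string for a system DSN."""
    parts = [f"DSN={dsn}"]
    if trusted:
        parts.append("Trusted_Connection=yes")
    return ";".join(parts)

def check_dsn(dsn="PerfDSN"):
    """Return True if a connection through the DSN succeeds."""
    import pyodbc  # third-party dependency, only needed for the live check
    try:
        with pyodbc.connect(build_dsn_connection_string(dsn), timeout=5):
            return True
    except pyodbc.Error:
        return False
```

Running check_dsn() on each collector machine quickly confirms that the registry import worked and that the SQL server is reachable.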
The next step is to create a data collector in Performance Monitor, which uses the PerfDSN to connect to the PerfData database. Again, it’s best to automate this configuration task, this time by means of an XML template and a batch file. A sample XML template is attached to this blog post. The batch file has the following content (make sure you replace the user account and server names with actual information and add further lines as necessary).
logman import "PerfCollector" -xml "Template.xml" -u contoso\administrator p@ssword1 -s SERVER01
logman import "PerfCollector" -xml "Template.xml" -u contoso\administrator p@ssword1 -s SERVER02
Note that the specified user account requires write permissions to the PerfData database. Performance Monitor uses this account to write the performance counter values. Of course, it’s a good idea to test the configuration:
Tip: If Performance Monitor does not start the PerfCollector successfully, then either your PerfDSN configuration is incorrect (use ODBC Data Source Administrator to adjust the configuration) or the account you specified for the PerfCollector doesn’t have the required permissions (edit the PerfCollector properties in Performance Monitor and specify a different account under Run As on the General tab).
The performance data gathering system is almost complete. An optional but very useful piece is a batch file that starts and stops the PerfCollector on all test machines. The following batch can serve as a starting point. Just make sure you replace SERVER01 and SERVER02 with actual computer names, and add start and stop commands for further computers as necessary. The batch expects a command-line argument that specifies the number of minutes to run the PerfCollector on the specified computers. It automatically stops the data collection when the specified time has elapsed, which ensures that no PerfCollector instances are left running and helps to avoid flooding the PerfData database.
@echo off
setlocal ENABLEDELAYEDEXPANSION

REM Start the PerfCollector on all servers
@echo on
logman start "PerfCollector" -s SERVER01 -as
logman start "PerfCollector" -s SERVER02 -as
@echo off

REM Set start and end time for performance data collection
set /A Start=1
set /A End=%1

REM Loop for the specified number of minutes
:LOOP
if !Start! gtr !End! GOTO LOOP_END
choice /d y /t 60 > nul
set /A Start+=1
goto LOOP
:LOOP_END

@echo on
logman stop "PerfCollector" -s SERVER01 -as
logman stop "PerfCollector" -s SERVER02 -as
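For reference, the batch file's control flow amounts to a simple start/wait/stop pattern. The following Python sketch illustrates that pattern only; it is not a replacement for the batch, and the server and collector names are the same placeholders used above.

```python
# Illustrative sketch of the batch file's logic: start the collector on
# every server, wait the requested number of minutes, then stop it
# everywhere. SERVER01/SERVER02 are placeholders, as in the batch file.
import subprocess
import time

SERVERS = ["SERVER01", "SERVER02"]
COLLECTOR = "PerfCollector"

def logman_commands(action, servers=SERVERS, collector=COLLECTOR):
    """Build a logman command line for each server."""
    return [["logman", action, collector, "-s", s, "-as"] for s in servers]

def run_collection(minutes, execute=subprocess.run):
    """Start the collectors, wait, then always stop them."""
    for cmd in logman_commands("start"):
        execute(cmd)
    try:
        time.sleep(minutes * 60)
    finally:
        # Always stop, so no collector keeps flooding the PerfData database.
        for cmd in logman_commands("stop"):
            execute(cmd)
```

The try/finally mirrors the batch file's intent: even if the wait is interrupted, the stop commands still run.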
The hard work is done, but before moving on to PowerPivot, let’s briefly discuss why you need the PerfData database at all. After all, Performance Monitor could also write the data to comma- or tab-separated text files. The most obvious disadvantage of text files is that importing a large number of them takes a very long time, especially if you want to import hundreds of millions of rows into PowerPivot. Another significant disadvantage is that the text files tend to be extremely wide because Performance Monitor tracks every selected performance counter in a separate column. For example, the PerfCollector template includes approximately 200 counters, so PowerPivot would end up with very wide tables, which increase metadata overhead and decrease storage efficiency. Furthermore, the column names are not intuitive: they combine server name, performance object, and counter name, separated by backslashes. Cleaning this up for 200 columns is just too much work. Moreover, text files store counter values as strings, so you’d have to convert them into DateTime values and decimal numbers after the import. Again, a tedious task for 200 columns, so don’t torture yourself and just use a SQL Server database.
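To illustrate the cleanup that the PerfData database spares you, here is a hypothetical Python sketch of what unpivoting a wide Performance Monitor CSV export would involve. The two-column sample header is made up, but the backslash-separated column naming (\\server\object\counter) matches the format the text files use.

```python
# Sketch of the manual cleanup a perfmon CSV export would require:
# split each wide "\\server\object\counter" column into parts and
# convert every value from text to a typed number. The sample data
# below is a made-up, minimal two-column example.
import csv
import io
from datetime import datetime

sample = io.StringIO(
    '"(PDH-CSV 4.0)","\\\\SERVER01\\Processor(_Total)\\% Processor Time"\n'
    '"01/30/2011 10:00:01.000","12.5"\n'
)

def to_long_rows(csv_file):
    """Unpivot a wide perfmon-style CSV into (time, server, object, counter, value) rows."""
    reader = csv.reader(csv_file)
    header = next(reader)
    rows = []
    for record in reader:
        # Every value arrives as text and must be converted explicitly.
        stamp = datetime.strptime(record[0], "%m/%d/%Y %H:%M:%S.%f")
        for column, value in zip(header[1:], record[1:]):
            # Column names look like \\SERVER\Object(Instance)\Counter.
            server, obj, counter = column.lstrip("\\").split("\\", 2)
            rows.append((stamp, server, obj, counter, float(value)))
    return rows

rows = to_long_rows(sample)
```

With 200 counter columns per server, this string-splitting and type conversion is exactly the tedium the relational log avoids.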
Now, let’s have some fun with performance data:
If desired, rename the table to PerformanceData. This is about all it takes to prepare the data for basic analysis:
CounterValue (automatically changes to Sum of CounterValue)
Axis Fields (Categories): ObjectName, InstanceName, CounterName
% Processor Time
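For reference, the ObjectName, InstanceName, CounterName, and CounterValue columns used above come from the tables that Performance Monitor creates in the PerfData database (CounterData and CounterDetails in its standard SQL log schema), so a join between them makes a convenient import query for PowerPivot. The following is a hedged sketch; verify the table and column names against your own PerfData database before relying on them.

```python
# Sketch of an import query over Performance Monitor's SQL log schema.
# Table and column names reflect the standard schema (CounterData joined
# to CounterDetails on CounterID); confirm them against your database.

PERF_QUERY = """
SELECT d.MachineName,
       d.ObjectName,
       d.InstanceName,
       d.CounterName,
       c.CounterDateTime,
       c.CounterValue
FROM dbo.CounterData AS c
JOIN dbo.CounterDetails AS d
  ON c.CounterID = d.CounterID
"""

def fetch_performance_data(connection):
    """Run the join against an open ODBC-style connection (e.g., pyodbc)."""
    cursor = connection.cursor()
    cursor.execute(PERF_QUERY)
    return cursor.fetchall()
```

You can paste the same SELECT statement directly into the PowerPivot table import wizard instead of importing the raw tables.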
Additionally, if you want to analyze performance data based on server roles, you must add a lookup table to the PowerPivot model that associates each server with a role. A linked Excel table is a good choice for this purpose:
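The role lookup itself is just a server-to-role association. Here is a minimal sketch of the idea, with made-up role assignments for the placeholder server names, including the kind of per-role aggregation that the lookup table enables in the PivotTable.

```python
# Server-to-role lookup, the same association the linked Excel table
# provides. Server names are the earlier placeholders; the role
# assignments are examples only.
SERVER_ROLES = {
    "SERVER01": "Web Front End",
    "SERVER02": "Application Server",
}

def average_by_role(samples, roles=SERVER_ROLES):
    """samples: iterable of (server, counter_value) pairs.
    Returns a dict mapping each role to its mean counter value."""
    totals = {}
    for server, value in samples:
        role = roles.get(server, "Unknown")
        total, count = totals.get(role, (0.0, 0))
        totals[role] = (total + value, count + 1)
    return {role: total / count for role, (total, count) in totals.items()}
```

In PowerPivot the equivalent happens automatically once the lookup table is related to PerformanceData on the server name column.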
That’s it for now. Save the workbook and copy it to a document library or other location if you want to share it with others. Note also that you don’t ever need to repeat the modeling work for subsequent performance tests. Just refresh the data in PowerPivot and Excel to load the latest performance counter values from the PerfData database and your job is done! Of course, it might be worth adding some advanced features to this solution by using DAX. This will be the topic of my next blog post. Stay tuned.