A common question I continue to hear is how to create a report showing the results of all updates deployed to a system, whether an update is delivered through the traditional software updates mechanism or through standard software distribution, such as a package. This is an important question: in the world of software updates there are some updates, even from Microsoft, that never get published to the software updates catalog. In addition, there are updates from third parties that need to be deployed, and these are not part of the software updates catalog (at least by default; more on that shortly). Further, ConfigMgr 2012 SP1 now supports Unix/Linux and Mac clients but does NOT support the traditional software updates mechanism for those systems, so delivering updates to Unix/Linux and Mac clients requires traditional software deployment. Accordingly, software deployment continues to be an important mechanism for delivering updates, which underscores the importance of a unified reporting option to view the status of all update deployments, regardless of deployment mechanism.

In this article I will walk through one way you might go about this. Let me say at the very beginning that the method described here likely isn't an exhaustive approach, as it focuses on a Windows-only solution. Further, depending on the update, the way to track it may differ. So the goal here is not to provide a total solution but rather to provide guidance on at least one option and to help you start thinking about how this could be addressed in your unique scenarios.
Note: For this article we will focus on ConfigMgr 2012. The same approach, however, could be used in a ConfigMgr 2007 environment, though ConfigMgr 2007 has no support for Unix/Linux or Mac clients. Plus, ConfigMgr 2012 is full of awesomeness and every environment should be upgraded or well into the process!
OK, stage is set. Before we start discussion of the example scenario, let me first say that while the approach I describe below is useful, particularly for Unix/Linux and Mac clients, if you are deploying updates to Windows systems I would strongly urge you to consider the use of System Center Updates Publisher (SCUP). SCUP is a robust tool that allows you to build and publish custom updates for Windows systems directly into the update catalog. Using this tool there is no need to extend reporting, because the results surface just like any other update! Take a look at my previous blog posts discussing this tool – available here, here and here.
The example scenario will deploy an update that is not part of the standard catalog. In order to deploy and track this update I will need to do a few things:

1. Create a package to facilitate the deployment. Yes, you could also use an application for this in ConfigMgr 2012, but for a 'one-off' update a package makes more sense to me. In addition, a package is the only mechanism for deploying software updates to Unix/Linux and Mac clients, so it makes sense to have consistency in your non-software-updates deployment vehicle.
2. Modify default inventory to detect that the package was installed. In our example we will do this by collecting the updates that appear in the Win32_QuickFixEngineering WMI class.
3. Modify an existing software updates report to merge the data from both sources. For this example I will show how to modify one report. This will serve as a guide for other software update reports you might want to update.
Create a package to facilitate the deployment

Creating a package is a routine task in ConfigMgr so I won't go through the details here. To deploy the update, prepare a package and program and then deploy it to a collection of devices.
Modify default inventory to detect that the package was installed

OK, this item is key. We can deploy things all day, but unless we have a way to detect that the deployment was successful it doesn't help. Thankfully there are a few options here. First, all ConfigMgr deployments report their status back to the site server – for packages, deployment success is returned in the form of status messages. While this is one option that could be workable, it is not the one I've chosen here. Since package deployments will likely contain more than just software updates, using the returned success or failure status in a report would require filtering to focus on just the software updates.
There is also the option to scan the system for the presence of the deployed update and report back. Again, there are several ways we could do that – script, file system, Compliance Settings and more. In our example I will leverage the Win32_QuickFixEngineering WMI class. This WMI class is specifically designed to reflect the presence of all system-wide updates deployed. We also have an added advantage: this class specifically excludes any update supplied by Windows Update, so the result set we get comes pre-filtered to just the supplemental data we want. To read more about the Win32_QuickFixEngineering class, click here.
OK, good. So after the deployment completes we check the Win32_QuickFixEngineering class in WMI, and we can see that our update is now listed as installed.
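To make the shape of that data concrete, here is a minimal sketch in Python using mocked rows. The field names mirror the WMI class properties, but the KB numbers and the rows themselves are hypothetical examples, not data from this deployment:

```python
# Mocked rows mirroring Win32_QuickFixEngineering properties.
# On a real client these would come from a WMI query against the class;
# the KB numbers and values here are invented for illustration.
qfe_rows = [
    {"HotFixID": "KB2779562", "Description": "Update",
     "InstalledOn": "1/15/2013", "InstalledBy": "NT AUTHORITY\\SYSTEM"},
    {"HotFixID": "KB2736422", "Description": "Security Update",
     "InstalledOn": "1/20/2013", "InstalledBy": "CONTOSO\\admin"},
]

def hotfix_installed(rows, kb_id):
    """Return True if the given hotfix ID appears in the QFE list."""
    return any(row["HotFixID"] == kb_id for row in rows)

print(hotfix_installed(qfe_rows, "KB2779562"))
```

The same presence check is effectively what our report will perform later, only in SQL against the inventoried data.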
So now that this information is here, how do we collect it? That's easy – a simple modification to the default hardware inventory classes will do the trick. Fortunately, the Win32_QuickFixEngineering class is already defined in the default class list, so all we need to do is enable it.
To do this, modify the client settings. In my case I’m adding the modification to the default client settings.
Select ‘Set Classes’ and find the Quick Fix Engineering section. I have chosen to enable everything.
That's all there is to it. At this stage, wait a few minutes and then force a policy cycle on a test client machine. Wait a couple of minutes more and then force a Hardware Inventory cycle. Once the updated hardware inventory reports up, you can see the Quick Fix Engineering data by selecting a machine in a collection, right-clicking and choosing Start > Resource Explorer.
Modify an existing software updates report to merge the data from both sources

With the data gathered it's now time to turn our attention to reporting. To build our report we need three items:

1. A SQL query to pull our newly collected data.
2. An idea of a report we either want to build or one we want to modify.
3. Report Builder 2.0/3.0, depending on your version of SQL.
Let's start with our SQL query. The query is often the most intimidating part of the reporting process – especially for SQL neophytes. The Report Builder environment does allow you to build a SQL query, but I find it far better to build it in SQL itself, test it, and then tweak it for inclusion in the report. The query I use to pull the data is below. A quick note here: always build your queries using the SQL views. Doing so is easier and is the supported avenue for building reports.
select HotFixID0 as 'Update ID',
       Description0 as 'Update Type',
       InstalledOn0 as 'Date Installed',
       InstalledBy0 as 'Installed By'
from v_GS_QUICK_FIX_ENGINEERING
where ResourceID = @RscID
So this query was built in SQL, but if you paste it back into SQL as-is it won't run. Why? The answer is the inclusion of a variable (@RscID) in the query. The variable is necessary to snap into the existing report I will modify – and the variable is declared in the report, as you will soon see. If you wanted to run this directly in SQL you would either need to replace the variable with an explicit value or declare the variable and supply a value yourself. The report supplies the variable declaration and value, so we don't have to do it in the query at this stage. Confused? Hope not – but if so, read on and I'll tie this together.
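If you want to see the mechanics outside of ConfigMgr, here is a self-contained sketch using Python's sqlite3 module with a mocked-up stand-in for the v_GS_QUICK_FIX_ENGINEERING view. The column names mirror the real view, but the rows are invented, and sqlite uses ? placeholders where T-SQL uses named variables like @RscID; the idea of supplying the parameter value at execution time is the same:

```python
import sqlite3

# Build an in-memory stand-in for the v_GS_QUICK_FIX_ENGINEERING view.
# Column names mirror the real view; the rows are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE v_GS_QUICK_FIX_ENGINEERING (
        ResourceID   INTEGER,
        HotFixID0    TEXT,
        Description0 TEXT,
        InstalledOn0 TEXT,
        InstalledBy0 TEXT
    )""")
conn.executemany(
    "INSERT INTO v_GS_QUICK_FIX_ENGINEERING VALUES (?, ?, ?, ?, ?)",
    [(100, "KB2779562", "Update", "1/15/2013", "NT AUTHORITY\\SYSTEM"),
     (200, "KB2736422", "Security Update", "1/20/2013", "CONTOSO\\admin")])

# The report's query, with the @RscID variable replaced by a placeholder
# whose value we supply at execution time -- just as the report supplies
# the parameter value for us.
rows = conn.execute("""
    SELECT HotFixID0 AS 'Update ID',
           Description0 AS 'Update Type',
           InstalledOn0 AS 'Date Installed',
           InstalledBy0 AS 'Installed By'
    FROM v_GS_QUICK_FIX_ENGINEERING
    WHERE ResourceID = ?""", (100,)).fetchall()

print(rows)
```

Only the hotfix row for the supplied ResourceID comes back, which is exactly what the report needs when you pick a single computer.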
Next we need a report that we want to modify. The report I chose is 'Compliance 5: Specific Computer'. Remember, the goal is to list all patches deployed to a computer, whether deployed by Software Updates or by Software Distribution. Let's take a look at the standard report.
With Report Builder it is possible to build very basic to very complex reports from the ground up or by modifying existing reports – the choice is yours. In this example we will modify this standard report. Note that the report consists of several elements – Parameters, the Data Source and the DataSets.
What are Parameters? Think back to the last report you ran. Likely, when you launched the report you first had to fill in some information before viewing it. In the case of this report we are required to supply the target computer name (remember, this report pulls all updates for a given system) and can optionally supply the Vendor name or Update Classification. Note that the MachineID parameter is highlighted. This is the parameter used to reflect the machine you chose. The list of parameters presented to you is populated by SQL queries stored in the report's DataSets, and the choice you make is stored in the actual Parameter value. So, parameters first collect data that is used to feed the main SQL query (also a DataSet) – remember that, as we will come back to it shortly.
In addition to Parameters we have a Data Source. You must have at least one and could have multiple, though a single data source is typical for ConfigMgr reports. The Data Source simply defines which database we want to pull data from, where it is located and how to connect. As in this case, a single Data Source can be shared across multiple DataSets.
Finally, we have the DataSets. A given report requires at least one DataSet and may have several; in this case we have four. DataSet0 is the key dataset for the report, pulling all of the data we need to display it. Remember those parameter menus that show data, such as system name, vendor or update classification, to be chosen when you first open the report? That data has to come from somewhere – that somewhere is the queries associated with DataSets 1, 2 and 3.
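The cascade from parameter DataSet to main DataSet can be sketched outside of Report Builder. This standalone Python/sqlite3 example uses invented, drastically simplified stand-ins for the ConfigMgr views (the real views are much wider), but it shows the two-step flow: one query resolves the user's choice into a parameter value, and the main query then consumes that value:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Invented, simplified stand-ins for the views behind the
    -- parameter query and the main query.
    CREATE TABLE v_R_System (ResourceID INTEGER, Name0 TEXT);
    INSERT INTO v_R_System VALUES (100, 'WORKSTATION01'), (200, 'SERVER01');
    CREATE TABLE v_GS_QUICK_FIX_ENGINEERING (ResourceID INTEGER, HotFixID0 TEXT);
    INSERT INTO v_GS_QUICK_FIX_ENGINEERING VALUES (100, 'KB2779562');
""")

# Step 1: a "parameter" query resolves the computer name the user picked
# into a machine ID -- the job a parameter DataSet does for the report.
machine_id = conn.execute(
    "SELECT ResourceID FROM v_R_System WHERE Name0 = ?",
    ("WORKSTATION01",)).fetchone()[0]

# Step 2: the main query consumes that parameter value, just as
# DataSet0 consumes the MachineID parameter.
updates = conn.execute(
    "SELECT HotFixID0 FROM v_GS_QUICK_FIX_ENGINEERING WHERE ResourceID = ?",
    (machine_id,)).fetchall()

print(machine_id, updates)
```

In the report itself the plumbing between the two steps is the Parameter definition rather than a Python variable, but the data flow is the same.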
Let's take a look at the properties for DataSet0. Note that the query is displayed on the first page. In addition, the first page of the DataSet indicates we are embedding the DataSet into the report vs. using a shared DataSet (which we won't get into here), and also defines which Data Source this DataSet is using.
If you review the Parameters, we can begin to see how DataSets use Parameters that are defined by other DataSets. The arrows correlate the Parameter Values to the Parameter list, showing what has been defined. The highlighted Parameter Names are labels defined in the DataSet that correlate to the variables used in the DataSet's query. You can't see the variables in the query shown above because the query is too long, but you will see the variables being used in the custom DataSet we create shortly. I've also attached the custom report so you can explore this more directly.
Remember, just as DataSet0 does the job of executing the main query that populates the report, the other DataSets are there to pull the data that populates the report's initial drop-down menus and 'feeds' DataSet0.
OK, so now it's time to modify this report to include our data obtained from Win32_QuickFixEngineering. This modification takes two forms: first we add an additional DataSet and ensure it leverages the Parameters already defined, where applicable; then we make the visual modifications to the report.
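The report will keep the two sources in separate DataSets and tables, but conceptually the end state is a union of both. This standalone Python/sqlite3 sketch (invented table names, invented rows – the real catalog compliance views look nothing like this) shows the combined picture we are after for a single machine:

```python
import sqlite3

# Invented, simplified stand-ins for the two data sources; the real
# ConfigMgr views have different and much wider schemas.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE catalog_updates (ResourceID INTEGER, Title TEXT);
    INSERT INTO catalog_updates VALUES (100, 'Example Catalog Security Update');
    CREATE TABLE v_GS_QUICK_FIX_ENGINEERING (ResourceID INTEGER, HotFixID0 TEXT);
    INSERT INTO v_GS_QUICK_FIX_ENGINEERING VALUES (100, 'KB2779562');
""")

# One combined listing for a single machine: updates that came through
# the software updates mechanism plus those found only in QFE inventory.
merged = conn.execute("""
    SELECT Title AS 'Update', 'Software Updates' AS 'Source'
    FROM catalog_updates WHERE ResourceID = ?
    UNION ALL
    SELECT HotFixID0, 'Software Distribution'
    FROM v_GS_QUICK_FIX_ENGINEERING WHERE ResourceID = ?
""", (100, 100)).fetchall()

print(merged)
```

Whether you present this as one merged table or, as I do below, as two tables side by side in the same report, is purely a design choice.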
To add a DataSet select DataSets, right-click and select Add Dataset. Fill out the DataSet information as shown.
The Fields, Options and Filters pages are automatically populated – review them for completeness and then configure the Parameters page as shown. Remember our SQL variable, @RscID? In order for the query to work we need to associate it with one of the existing Parameters – that is the purpose of the Parameters page.
Cool – last thing to do then is to adjust the report formatting to display the data the way we want. This is totally a personal preference. Report Builder is very powerful here and operates similarly to Visual Studio in that it is a visual design environment. Just right-click on the design page and select to insert text, a chart, a table, whatever. This part is hard to illustrate in a post like this so I’ll just display the finished work and correlate to what I did with a few graphics.
All that’s left is to test run the report which we can also do in Report Builder.
Once satisfied, publish it to the ConfigMgr reporting server – Report Builder does that too. Just select the circle in the top left, choose Save As and point to the appropriate folder on the reporting server.
With it now saved to the Reporting Server, the report is visible and runnable in the ConfigMgr console – or directly through the reporting web page, whichever you like.
Enjoy and happy report building!