April, 2007

  • Jakub@Work

    Working with Management Pack Templates

    • 9 Comments

    Management pack templates provide a powerful way to create a single management pack object or a whole collection of them. Essentially, you author the fragment of a management pack you want created, leaving out some values that become the configuration of your template; when the template is executed, the missing values are supplied and the finished management pack object(s) are materialized and imported.

    Templates show up in the UI under the Authoring pane. In order for your own custom template to appear there, you need to create a folder for it and place the template in that folder as a folder item. It will show up without the folder, but it won't work quite right. Enabling template execution from the UI is outside the scope of this post, but should eventually be available as a tutorial on www.authormps.com.

    I've attached a sample management pack that essentially recreates the instance group template we use for creating groups.

    If you take a look at the template defined there, you will see that the configuration section is very similar to other management pack elements. The configuration schema here specifies what values must be provided to the template processor.

    The second section is the references for the actual template. The references refer to the references section of the management pack the template is in, by alias. There is one special alias defined, 'Self', which refers to the management pack the template is in.

    Finally, under the implementation section is the fragment of the management pack you want your template to create. You can have the configuration values replace any part of the implementation by referring to the values via the $TemplateConfig/<configuration variable name>$ format.
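
    To make the layout concrete, here is a rough sketch of the shape such a template takes (schematic only, not the attached sample verbatim; the exact element names and attributes are dictated by the management pack schema, and the configuration names simply mirror the ones used in the code later in this post):

    <Template ID="Sample.Template">
      <Configuration>
        <!-- Values the caller must supply when the template is processed -->
        <xsd:element name="Namespace" type="xsd:string" />
        <xsd:element name="TypeName" type="xsd:string" />
        <xsd:element name="LocaleId" type="xsd:string" />
        <xsd:element name="GroupDisplayName" type="xsd:string" />
        <xsd:element name="GroupDescription" type="xsd:string" />
        <xsd:element name="MembershipRules" type="xsd:anyType" />
      </Configuration>
      <References>
        <!-- Aliases resolved against the references of the containing management pack;
             'Self' refers to that management pack itself -->
        <Reference ID="Self" />
        <Reference ID="Windows" />
        <Reference ID="InstanceGroup" />
      </References>
      <Implementation>
        <!-- The management pack fragment to materialize; configuration values are spliced in
             wherever they are referenced, e.g.
             ID="$TemplateConfig/Namespace$.$TemplateConfig/TypeName$" -->
        ...
      </Implementation>
    </Template>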

    In the templates management pack you will also notice that I put the sample templates in a newly created folder. This ensures the UI behaves properly with any template output I produce. The template output can be placed in a folder, and the UI treats these folders as individual instances of template execution, so each run can be managed as a homogeneous unit even though it may have created a wide variety of management pack objects.

    The code below runs my template using the SDK.

    First, you will notice that I need to get the management pack the template is in, as well as the template itself. I need the management pack for two reasons. First, I need a management pack to run the template against; all the objects produced by the template will be placed in this management pack. Second, I need this particular management pack because it is not sealed, and any templates defined in a non-sealed management pack must be run against that same pack. If you seal the management pack that contains the template, you can run it against any non-sealed management pack.

    Next, I have to build the configuration for my template. This is just XML that matches the schema of my template. You will also notice that within my configuration I have to reference some management packs. These are supplied as additional references when processing the template. Note that if I want to use references that already exist in the management pack the template output will be placed in, the aliases must match the existing aliases for the same management packs.

    Finally, when I process my template, I provide additional information that is used to name the folder the template output is placed in. This is optional, but it is required if you want the output to show up in the UI and want to be able to delete it easily (by deleting everything in that folder). The method returns the folder the output was placed in.

    using System;
    using System.Collections.ObjectModel;
    using System.Text;
    using System.Xml;
    using Microsoft.EnterpriseManagement;
    using Microsoft.EnterpriseManagement.Configuration;

    namespace Jakub_WorkSamples
    {
        partial class Program
        {
            static void ProcessTemplate()
            {
                // Connect to the local management group
                ManagementGroup localManagementGroup = new ManagementGroup("localhost");

                // Get the template management pack. This is where we will store our template
                // output, since the sample template management pack is not sealed and the output
                // needs to be in the same management pack as the template in this case.
                ManagementPack templateManagementPack = localManagementGroup.GetManagementPacks(
                    "Template.Sample")[0];

                // Get the template you want to process
                MonitoringTemplate sampleTemplate = localManagementGroup.GetMonitoringTemplates(
                    new MonitoringTemplateCriteria("Name = 'Sample.Template'"))[0];

                // Populate the configuration for the template
                string formula =
                    @"<MembershipRule>
                        <MonitoringClass>$MPElement[Name=""Windows!Microsoft.Windows.Computer""]$</MonitoringClass>
                        <RelationshipClass>$MPElement[Name=""InstanceGroup!Microsoft.SystemCenter.InstanceGroupContainsEntities""]$</RelationshipClass>
                      </MembershipRule>";

                StringBuilder stringBuilder = new StringBuilder();
                XmlWriter configurationWriter = XmlWriter.Create(stringBuilder);
                configurationWriter.WriteStartElement("Configuration");
                configurationWriter.WriteElementString("Namespace", "Sample.Namespace");
                configurationWriter.WriteElementString("TypeName", "MyClass");
                configurationWriter.WriteElementString("LocaleId", "ENU");
                configurationWriter.WriteElementString("GroupDisplayName", "My Class");
                configurationWriter.WriteElementString("GroupDescription", "My Class Description");
                configurationWriter.WriteStartElement("MembershipRules");
                configurationWriter.WriteRaw(formula);
                configurationWriter.WriteEndElement();
                configurationWriter.WriteEndElement();
                configurationWriter.Flush();

                // Get the management packs for references
                ManagementPack windowsManagementPack =
                    localManagementGroup.GetManagementPack(SystemManagementPack.Windows);
                ManagementPack instanceGroupManagementPack =
                    localManagementGroup.GetManagementPack(SystemManagementPack.Group);
                ManagementPackReferenceCollection newReferences =
                    new ManagementPackReferenceCollection();
                newReferences.Add("Windows", windowsManagementPack);
                newReferences.Add("InstanceGroup", instanceGroupManagementPack);

                templateManagementPack.ProcessMonitoringTemplate(
                    sampleTemplate,
                    stringBuilder.ToString(),
                    newReferences,
                    "MyTemplateRunFolder",
                    "My Template Run",
                    "This is the folder for my sample template output");
            }
        }
    }

  • Jakub@Work

    Security Auditing

    • 1 Comments

    The SDK service supports success and failure auditing for every method invoked on the service. For the most part, the event includes only the SDK operation (e.g. UserRole__Get), the user that was validated for that operation, the actual SDK service method requesting the access check and the session id of the SDK client the access check was performed for. Enabling auditing is straightforward, as it leverages the object-level auditing mechanism built into Windows Server 2003:

    1. Open Local Security Policy, found under Administrative Tools
    2. Expand Local Policies and select Audit Policy
    3. Select "Audit object access"
    4. Right-click it and select Properties
    5. Enable "Success" and/or "Failure", depending on what you want to audit

    Shortly after enabling this, audit events for all SDK methods should start appearing in the Security event log.

    Aside from the information mentioned above, we do audit more information for some operations.

    For tasks, we audit: JobId, TaskId, TargetObjectId, whether the task requires encryption, every override and value applied (if any) and username and domain if an alternate account was provided for execution.

    For management pack change failures, including imports, updates and deletes, we audit security-related failures, recording which management pack failed to be changed and what element triggered the failure. This can occur for users in an Author or Advanced Operator profile who try to perform management pack operations outside their class scope.

    Finally, we audit additional information for user role related changes.
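
    If you want to pull these audit events programmatically rather than through Event Viewer, the standard .NET event log API is enough. Here is a minimal sketch (this is not part of the OpsMgr SDK, it has to run with administrative rights to read the Security log, and the exact layout of the message text is not spelled out here):

    using System;
    using System.Diagnostics;

    class AuditLogSketch
    {
        static void Main()
        {
            // Reading the Security log requires administrative privileges.
            EventLog securityLog = new EventLog("Security");

            foreach (EventLogEntry entry in securityLog.Entries)
            {
                // Object access audits show up as success/failure audit entries; the details
                // described above (operation, user, session id, ...) land in the message text.
                if (entry.EntryType == EventLogEntryType.SuccessAudit ||
                    entry.EntryType == EventLogEntryType.FailureAudit)
                {
                    Console.WriteLine("{0} {1}: {2}",
                        entry.TimeGenerated, entry.EntryType, entry.Message);
                }
            }
        }
    }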

  • Jakub@Work

    Exporting Management Packs

    • 8 Comments

    Customers have asked whether you can see what is inside a sealed management pack. The UI actually prevents this, but it is possible via the command shell or the SDK. In either case, you need to find the management pack you want to export through whatever method you prefer and then export it as below:

    Command Shell Example:

    Get-ManagementPack -Name "Microsoft.SystemCenter.2007" | Export-ManagementPack -Path "C:\"

    SDK Example:

    using System;
    using System.Collections.ObjectModel;
    using Microsoft.EnterpriseManagement;
    using Microsoft.EnterpriseManagement.Configuration;
    using Microsoft.EnterpriseManagement.Configuration.IO;

    namespace Jakub_WorkSamples
    {
        partial class Program
        {
            static void ExportManagementPack()
            {
                // Connect to the local management group
                ManagementGroup mg = new ManagementGroup("localhost");

                // Get any management pack you want
                ManagementPack managementPack = mg.GetManagementPack("Microsoft.SystemCenter.2007",
                    "31bf3856ad364e35",
                    new Version("6.0.5000.0"));

                // Provide the directory you want the file created in
                ManagementPackXmlWriter xmlWriter = new ManagementPackXmlWriter(@"C:\");
                xmlWriter.WriteManagementPack(managementPack);
            }
        }
    }

  • Jakub@Work

    MCF from non-Windows Clients

    • 7 Comments

    Note: This will only work in RTM

    In order to make our MCF web service work from non-Windows clients, the first step is to change the binding the MCF web service uses. In theory, what we ship with (wsHttpBinding) should work cross-platform, but at the moment there are no non-Windows products that generate a proper proxy for it, which makes it difficult to use (see Update below). If at all possible, please use a Windows-based client to talk to MCF, or a non-Windows-based proxy generator that fully supports wsHttpBinding, as the functionality is much richer, especially around errors. If you choose to proceed down this route, note that exceptions are not properly propagated (they all show up as generic fault exceptions) and it will be impossible to tell from the client what server-side errors occurred. If you have no choice, keep reading...

    If we switch the binding to basicHttpBinding, the service will act like an asmx web service and everything should work cross-platform with existing web service products. In order to actually use basicHttpBinding, however, the service requires some additional configuration. Ultimately, we need the caller to be presented as a Windows account. For cross-platform callers, since you can't use Windows authentication, you are forced to use client certificates and map them accordingly. In order to use client certificates, however, you need to set up SSL (you can also use message-level security, but I only set up and tested transport level). Here are the steps:


    1. Create a server certificate to use for your MCF endpoint to enable SSL (this certificate will need to be trusted by your clients)

    2. Import this certificate into the Local Machine store on the Root Management Server

    3. Setup the MCF endpoint to use SSL

    Since we are self-hosted, this cannot be done in IIS. You will need to find HttpCfg.exe (for Win2K3 this is found under SupportTools on the CD) and run the following command:

    HttpCfg.exe set ssl -i 0.0.0.0:6000 -h 82e8471434ab1d57d4ecf5fbed0f1ceeba975d8d -n LOCAL_MACHINE -c MY -f 2
     
    The 6000 is the port you are using in the configuration file. The "82e8471434ab1d57d4ecf5fbed0f1ceeba975d8d" is the thumbprint of the certificate you want to use (it can be found on the Details tab when viewing the certificate in the certificates snap-in). The -f 2 enables the server to accept client certificates.
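
    If it is easier to grab the thumbprint programmatically than from the snap-in, a small sketch like the following lists the certificates in the Local Machine Personal store along with their thumbprints (standard .NET 2.0 certificate classes; adjust the store name and location to wherever you imported the certificate):

    using System;
    using System.Security.Cryptography.X509Certificates;

    class ThumbprintSketch
    {
        static void Main()
        {
            // The server certificate from step 2 was imported into the Local Machine store.
            X509Store store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
            store.Open(OpenFlags.ReadOnly);

            foreach (X509Certificate2 certificate in store.Certificates)
            {
                // The thumbprint printed here is the value HttpCfg.exe expects after -h.
                Console.WriteLine("{0}: {1}", certificate.Subject, certificate.Thumbprint);
            }

            store.Close();
        }
    }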

    4. Update Microsoft.Mom.Sdk.ServiceHost.exe.config to look like the attached file

    You can pick whatever port you want to use, although I was never able to get 80 to work in testing.

    Also note that when I tried generating the proxy with wsdl.exe from a machine other than the server itself, it failed when my endpoint was defined as localhost. I had to specify the server name in the endpoint definition for the tool to work.
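
    For reference, the settings the config file needs to express correspond roughly to the following in code. This is a sketch only; the real change is made in Microsoft.Mom.Sdk.ServiceHost.exe.config as described above, and the snippet just illustrates the basicHttpBinding transport security settings and the certificate-to-Windows-account mapping involved:

    using System;
    using System.ServiceModel;
    using System.ServiceModel.Description;

    class BindingSketch
    {
        static void Main()
        {
            // basicHttpBinding with transport (SSL) security, where the caller presents
            // a client certificate instead of Windows credentials.
            BasicHttpBinding binding = new BasicHttpBinding();
            binding.Security.Mode = BasicHttpSecurityMode.Transport;
            binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Certificate;

            // The service side still needs the caller presented as a Windows account; WCF can
            // map the client certificate to the account named in its UPN (see step 6 below).
            ServiceCredentials credentials = new ServiceCredentials();
            credentials.ClientCertificate.Authentication.MapClientCertificateToWindowsAccount = true;

            Console.WriteLine("Binding security mode: " + binding.Security.Mode);
        }
    }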

    5. Restart the omsdk (OpsMgr Sdk Service) service

    6. Generate a client certificate

    There seem to be two ways to do this. The first, which I tried successfully, is to generate a client certificate that has, in the Subject Alternative Name field, the principal name of the user you want to map to. This will work if the CA that issues the certificate is an Enterprise CA in the domain your OpsMgr Sdk Service is running in. In the certificate details, the Subject Alternative Name field looks something like this:

    Other Name:
    Principal Name=youruser@yourdomain

    Alternatively, AD allows for the configuration of certificate mapping directly on the respective user object. I did not try this method as I do not have domain admin access on the domain I was testing on, but this should work as well.

    7. Use the client certificate in the request


    I tested this out using a proxy generated by wsdl.exe from the .NET Framework 2.0 and everything seemed to work. Things didn't work well with the 1.1 Framework's wsdl.exe, because some fields that can be null coming from MCF (such as DateTime fields) cannot be made nullable in 1.1. Whatever tool you use to generate proxies, it needs to be able to handle null values for value types.
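
    As a sketch of step 7 with a 2.0 wsdl.exe proxy, attaching the client certificate looks roughly like this (the proxy class name, thumbprint and URL below are placeholders; the real proxy class and its web methods come from the MCF WSDL):

    using System;
    using System.Security.Cryptography.X509Certificates;
    using System.Web.Services.Protocols;

    // Stand-in for the wsdl.exe-generated proxy; a 2.0 wsdl.exe proxy derives from
    // SoapHttpClientProtocol, which already exposes Url and ClientCertificates.
    class ConnectorFrameworkProxy : SoapHttpClientProtocol { }

    class McfClientSketch
    {
        static void Main()
        {
            // Load the client certificate from step 6; the thumbprint is a placeholder.
            X509Store store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
            store.Open(OpenFlags.ReadOnly);
            X509Certificate2Collection matches = store.Certificates.Find(
                X509FindType.FindByThumbprint,
                "0000000000000000000000000000000000000000", false);
            store.Close();

            // Point the proxy at the SSL endpoint configured earlier and attach the certificate;
            // the service maps it to the Windows account named in the certificate's UPN.
            ConnectorFrameworkProxy proxy = new ConnectorFrameworkProxy();
            proxy.Url = "https://yourserver:6000/ConnectorFramework";
            proxy.ClientCertificates.Add(matches[0]);
        }
    }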

    Update: I did some digging around and, although I did not test any of these, it seems there are some Java projects for WCF interop that should allow you to communicate directly with our wsHttpBinding. There is the JAX-WS project and Project Tango. There are probably more out there, but these are the ones I was reading about, and people report success using them to interop specifically with wsHttpBinding.
