November, 2010

  • Microsoft Dynamics NAV Team Blog

    Performance Analyzer 1.0 for Microsoft Dynamics

    • 4 Comments

    The Microsoft Premier - Dynamics team has created and compiled a set of scripts and tools to help analyze and troubleshoot SQL Server performance issues on the Dynamics products. These are the same tools that we use on a daily basis for collecting SQL performance data and troubleshooting SQL performance issues on all our Dynamics products*, and we want to make them available to our partners and customers. These tools rely heavily on the SQL Server DMVs, so they are only available for SQL Server 2005, SQL Server 2008, and SQL Server 2008 R2.

    This tool can aid in troubleshooting blocking issues, index utilization, long-running queries, and SQL configuration issues. Instructions for installing and using the tool are included in the download package. One nice feature of this tool is that it creates a database called DynamicsPerf and imports all the collected performance data into the DynamicsPerf DB, which can be backed up and restored on any SQL Server (2005 or 2008) for later analysis, making it "portable." The collection of performance data can also be automated via a SQL job, for which the scripts are provided.

    Performance Analyzer 1.0 for Microsoft Dynamics can be downloaded via the following MSDN link. The tool is updated on a fairly regular basis with bug fixes and new functionality, so please check often for new versions.

    http://code.msdn.microsoft.com/DynamicsPerf

    This tool and associated scripts are released "AS IS" and are not supported.

    *There is added functionality for Dynamics AX

    -Michael De Voe

  • Microsoft Dynamics NAV Team Blog

    Microsoft Dynamics NAV : Online Maps

    • 0 Comments

    The online map functionality that we've had in Dynamics NAV for some time now has changed for most countries after the maps were moved to Bing.

    As the maps were moved and the collections changed, the application code needs to be changed to align with this. This change is scheduled for the standard application.

    Meanwhile, it is fairly simple to make some basic changes to the current application code that will align the existing functionality with Bing Maps, at least for the basic search functionality. The code changes are few and fairly simple, but they stretch across several objects, so for simplicity I have uploaded a txt file containing these objects.

    The changes I made here only align with the new maps; the functionality is more or less the same as it was, with a few minor changes. Online map setup is a bit simpler, as no default links are needed or inserted. Remember, though, when using this: when the URL is built, it is the country/region code that decides which language the site opens in.

    Not all local sites are translated into local languages, so if one specifies NO (Norway) as the country, for example, the maps will still show in English. For some countries (Germany, Spain, Italy, France, ...) selecting the country will show the maps in the local language. Keep in mind that for countries whose collections are not in the local language, specifying the country name in the local language might confuse the search in some scenarios. For example, if your user resides in Norway and the customer resides in Sweden, the address URL will contain the country name Sverige (Norwegian for Sweden), and that will throw the search off in some scenarios.

    All you need here is either to change the code so the country code is not within the URL (but that might give more hits to select from), or simply to add a field called ENUName to table 9 (Country/Region). Use the data migration functionality to import the attached XML file containing the list of countries present in the demo data and their ENU translations. Include only the Code and ENUName fields when importing the file, and change the following line of codeunit 801 Online Map Utilities:

    replace:


    IF country.GET(Address[5]) THEN
      Address[5] := country.Name

    with


    IF country.GET(Address[5]) THEN
      Address[5] := country.ENUName

     

    and your URL will always be built with the country name in English, which works for searches in any language.
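    If some countries in table 9 do not have an ENUName filled in (for example, after a partial import), a small variation of the same lines can fall back to the local name. This is only a sketch, not part of the uploaded objects:

    // Sketch only: fall back to the local country name when no ENU translation exists.
    IF country.GET(Address[5]) THEN
      IF country.ENUName <> '' THEN
        Address[5] := country.ENUName
      ELSE
        Address[5] := country.Name;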

     

  • Microsoft Dynamics NAV Team Blog

    Test Automation Series 2 - Creation Functions

    • 0 Comments

    With the introduction of the NAV test features in NAV 2009 SP1 it has become easier to develop automated test suites for NAV applications. An important part of an automated test is the setup of the test fixture. In terms of software testing, the fixture includes everything that is necessary to exercise the system under test and to expect a particular outcome. In the context of testing NAV applications, the test fixture mainly consists of all values of all fields of all records in the database. In a previous post I talked about how to use a backup-restore mechanism to recreate the fixture and also about when to recreate the fixture.

    A backup-restore mechanism allows a test to use a fixture that was prebuilt at some other time, that is, before the backup was created. In this post I'll discuss the possibility of creating part of the fixture during the test itself. Sometimes this will be done inline, but typically the creation of new records will be delegated to creation functions that may be reused. Examples of creation functions may also be found in the Application Test Toolset.

    Basic Structure

    As an example of a creation function, consider the following function that creates a customer:

    CreateCustomer(VAR Customer : Record Customer) : Code[20] 
    
    Customer.INIT;
    Customer.INSERT(TRUE);
    CustomerPostingGroup.NEXT(RANDOM(CustomerPostingGroup.COUNT));
    Customer."Customer Posting Group" := CustomerPostingGroup.Code;
    Customer.MODIFY(TRUE);
    EXIT(Customer."No.")

    This function shows some of the idioms used in creation functions. To return the record, an output (VAR) parameter is used. For convenience, the primary key is also returned; when only the primary key is needed, this leads to slightly simpler code. Compare, for instance,

    CreateCustomer(Customer);
    SalesHeader.VALIDATE("Bill-to Customer No.",Customer."No.");

    with

    SalesHeader.VALIDATE("Bill-to Customer No.",CreateCustomer(Customer));

     Obviously, this is only possible when the record being created has a primary key that consists of only a single field.

    The actual creation of the record starts with initializing all the fields that are not part of the primary key (INIT). If the record type uses a number series (as Customer does), the record is then inserted to make sure any other initializations (in the insert trigger) are executed. Only then are the remaining fields set. Finally, the number of the created customer is returned.

    Field Values

    When creating a record, some fields will need to be given a value. There are two ways to obtain the actual values to be used: they can be passed in via parameters, or they can be generated inside the creation function. As a rule of thumb, when calling a creation function from within a test function, only the information that is necessary to understand the purpose of the test should be passed in. All the other values are "don't care" values and should be generated. The advantage of generating "don't care" values inside the creation functions, rather than passing them in as parameters, is that it becomes immediately clear which fields are relevant in a particular test simply by reading its code.

    For the generation of values, different approaches may be used depending on the type. The RANDOM, TIME, and CREATEGUID functions can all be used to generate values of different simple types (optionally combined with the FORMAT function).
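    For instance, inside a creation function the "don't care" fields of a customer could be filled as in the sketch below; the choice of fields and values is only an illustration:

    // Illustration only: generate "don't care" values for fields the test does not depend on.
    Customer.VALIDATE(Name,'Customer ' + FORMAT(CREATEGUID));   // unique, recognizable text
    Customer.VALIDATE("Credit Limit (LCY)",RANDOM(10000));      // arbitrary positive amount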

    When a field refers to another table, a random record from that table may be selected. The example shows how to use the NEXT function to move a random number of records forward. Note that the COUNT function is used to prevent moving too far forward. Also note that if this pattern is used a lot, there may be a performance impact.

    Although the use of random values makes it very easy to understand what is (and is not) important by reading the code, it can make failures more difficult to reproduce. A remedy to this problem is to record all the random values used, or simply to record the seed used to initialize the random number generator (the seed can be set using the RANDOMIZE function). In the latter case, the whole sequence of random values can be reproduced by using the same seed.
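    For example, a test could log the seed it uses and reuse that value when reproducing a failure; in this sketch, Seed is just a local Integer variable:

    // A fixed (or logged) seed makes the whole sequence of RANDOM values reproducible.
    Seed := 439784;
    RANDOMIZE(Seed);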

    As an alternative to selecting random records, a new record may be created to set a field that refers to another table.
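    In the customer example above, that could look like the following line, where CreateCustomerPostingGroup is assumed to be another creation function in the same library:

    // Instead of picking a random existing posting group, create the referenced record.
    Customer.VALIDATE("Customer Posting Group",CreateCustomerPostingGroup(CustomerPostingGroup));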

    Primary Key

    For some record types the primary key field is not generated by a number series. In such cases a simple pattern can be applied to create a unique key as illustrated by the creation function below:

    CreateGLAccount(VAR GLAccount : Record "G/L Account") : Code[20]; 
    
    GLAccount.SETFILTER("No.",'TESTACC*');
    IF NOT GLAccount.FINDLAST THEN
      GLAccount.VALIDATE("No.",'TESTACC000');
    GLAccount.VALIDATE("No.",INCSTR(GLAccount."No."));
    GLAccount.INIT;
    GLAccount.INSERT(TRUE);
    EXIT(GLAccount."No.")

    The keys are prefixed with TESTACC to make it easy to recognize the records created by the test when debugging or analyzing test failures. This creation function will generate accounts numbered TESTACC001, TESTACC002, and so forth. In this case the keys will wrap around after 999 accounts are created, after which this creation function will fail. If more accounts are needed, extra digits may simply be added.

    Input Parameters

    For some of the fields of a record you may want to control their values when using a creation function in your test. Instead of generating such values inside the creation function, input parameters may be used to pass them in.

    One of the difficulties when defining creation functions is to decide on what and how many parameters to use. In general the number of parameters for any function should be limited. This also applies to creation functions. Parameters should only be defined for the information that is relevant for the purpose of a test. Of course that may be different for each test.

    To avoid libraries that contain a large number of creation functions for each record, only include the most basic parameters. For master data typically no input parameters are required. For line data, consider basic parameters such as type, number, and amount.
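    As an illustration, a line-data creation function with only such basic parameters could look like the sketch below. The signature, the FINDLAST-based line numbering, and the 10000 step are assumptions made for this sketch (LineNo is a local Integer variable), not the actual Application Test Toolset code:

    CreateSalesLine(VAR SalesHeader : Record "Sales Header";LineType : Option;No : Code[20];Qty : Decimal;VAR SalesLine : Record "Sales Line")

    // Find the next free line number for this document.
    LineNo := 10000;
    SalesLine.SETRANGE("Document Type",SalesHeader."Document Type");
    SalesLine.SETRANGE("Document No.",SalesHeader."No.");
    IF SalesLine.FINDLAST THEN
      LineNo := SalesLine."Line No." + 10000;
    // Canonical sequence: INIT, validate primary key, INSERT(TRUE), validate other fields, MODIFY(TRUE).
    SalesLine.INIT;
    SalesLine.VALIDATE("Document Type",SalesHeader."Document Type");
    SalesLine.VALIDATE("Document No.",SalesHeader."No.");
    SalesLine.VALIDATE("Line No.",LineNo);
    SalesLine.INSERT(TRUE);
    SalesLine.VALIDATE(Type,LineType);
    SalesLine.VALIDATE("No.",No);
    SalesLine.VALIDATE(Quantity,Qty);
    SalesLine.MODIFY(TRUE)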

    Custom Setup

    In a particular test you typically want to control a slightly different set of parameters compared to the set of parameters accepted by a creation function in an existing library. A simple solution to this problem is to update the record inline after it has been returned by the creation function. In the following code fragment, for example, the sales line returned by the creation function is updated with a unit price.

    LibrarySales.CreateSalesLine(SalesHeader,SalesLine.Type::"G/L Account",AccountNo,Qty,SalesLine);
    SalesLine.VALIDATE("Unit Price",Amount);
    SalesLine.MODIFY(TRUE);

    When the required updates to a particular record are complex or needed often in a test codeunit, this pattern may lead to code duplication. To reduce code duplication, consider wrapping a simple creation function (located in a test helper codeunit) in a more complex one (located in a test codeunit). Suppose that for the purpose of a test a sales order needs to be created, and that the only relevant aspects of this sales order are that it is for an item and its total amount. Then a local creation function could be defined like this:

    CreateSalesOrder(Amount : Integer; VAR SalesHeader : Record "Sales Header") 
    
    LibrarySales.CreateSalesHeader(SalesHeader."Document Type"::Order,CreateCustomer(Customer),SalesHeader);
    LibrarySales.CreateSalesLine(SalesHeader,SalesLine.Type::Item,FindItem(Item),1,SalesLine);
    SalesLine.VALIDATE("Unit Price",Amount);
    SalesLine.MODIFY(TRUE)

    In this example a complex creation function wraps two simple creation functions. The CreateSalesHeader function takes the document type and a customer number (the customer is created here as well) as input parameters. The CreateSalesLine function takes the sales header, line type, number, and quantity as input. Here a so-called finder function is used that returns the number of an arbitrary item. Finder functions are a different type of helper function that will be discussed in a future post. Finally, note that the CreateSalesLine function needs the document type and number from the header; instead of using separate parameters, they are passed in together (with the SalesHeader record variable).
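    For completeness, the FindItem function used above could be as simple as the following sketch. This is only an illustration; a real finder function would typically also filter out records that cannot be used, such as blocked items:

    FindItem(VAR Item : Record Item) : Code[20]

    // Return the number of an arbitrary item.
    Item.FINDFIRST;
    EXIT(Item."No.")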

    Summary

    To summarize here is a list of tips to consider when defining creation functions:

    • Return the created record via an output (VAR) parameter
    • If the created record has a single-field primary key, return it
    • Make sure the assigned primary key is unique
    • If possible, have the key generated by a number series
    • The safest way to initialize a record is to make sure all triggers are executed in the same order as they would have been executed when running the scenario from the user interface. In general (when DelayedInsert=No) records are created by this sequence:
      • INIT
      • VALIDATE primary key fields
      • INSERT(TRUE)
      • VALIDATE other fields
      • MODIFY(TRUE)
    • Only use input parameters to pass in information necessary to understand the purpose of a test
    • If necessary add custom setup code inline
    • Wrap generic creation functions to create more specific creation functions
    • Instead of passing in multiple fields of a record separately, pass in the entire record

    These and some other patterns have also been used for the implementation of the creation functions included in the Application Test Toolset.

  • Microsoft Dynamics NAV Team Blog

    NAV 2009 Tips and Tricks: Create Notifications from Task Pages

    • 7 Comments

    You can create notifications from task pages such as customer cards or sales orders. You can use notifications as reminders, or as a way to send information to other NAV users in your company.

    A notification is displayed on the recipient's Role Center. By clicking the notification, the recipient opens the associated task page.

    1. To create a notification, open the task page from which you want to send a notification. For example, open a customer card, sales order, or other task page.

    2. In the FactBox area, scroll down to the Notes part.

    3. In the Notes part, click to create a new note.

    4. In the Enter a new note here text box, type a note.

    5. Select a recipient.

    6. Select Notify.

    7. Click Save.

    The notification is saved on the task page.

     

    The notification is also relayed to the recipient and is displayed on his/her Role Center under My Notifications.

    For more information about usability and the RoleTailored client, see the blog post Useful, Usable, and Desirable.

  • Microsoft Dynamics NAV Team Blog

    Integration to Payment Service in NAV 2009 R2

    • 4 Comments

    The Payment Service available from Online Services for Microsoft Dynamics ERP is an example of the growing availability of online services that users of ERP systems can benefit from connecting to. Adding functionality to the application through connecting to a service is new territory for us in the NAV development team, and we have learned a lot through this development project. We are looking forward to sharing the benefits of being able to expand the service, while we keep our focus on delivering the new NAV product.

    The Payment Service is hosted by Microsoft, and the number of available Payment Providers is growing. Today there are multiple Payment Providers, such as First Data, PayPal, and CyberSource, supporting the US and Canadian markets. The plan is to grow the number of payment providers so that the rest of the world can be supported as well. We are shipping the integration for all NAV-supported countries - even though the payment providers aren't available there yet - so the code is ready when the service becomes available.

    The integration to the Payment Service that is included in NAV 2009 R2 allows users of Microsoft Dynamics NAV to accept credit and debit card payments from Sales Orders and Invoices, as well as from the Cash Receipt Journal. The solution allows for both an authorization process and an automatic capture of the required amount during posting, as well as for more flexible use in the Cash Receipt Journal.

    Adding the integration to the online services has been done with a number of goals in mind:

    • Keeping it simple: Adding the integration to the Payment Service allows the user of NAV to work within NAV when accepting credit cards as payments. There is no need for a third-party add-on outside the normal environment. The payment information is built into the existing order entry process, using the Sales Orders and the Invoices as a starting point. This means a simple payment flow that doesn't require a huge effort to learn and set up. This supports the vision behind adding services to existing installations: they must add to the existing functionality without making it more complex.
    • Power of Choice: Secondly, choosing the online payment service allows users to choose the payment provider that suits their needs best. The transaction cost can differ per payment provider, and users are encouraged to investigate which one fits their scenario best. Depending on the payment provider, there is support for multiple credit cards and currencies. Out of the box there is built-in setup for Visa, MasterCard, American Express, and Discover.
    • Secure integration: Third, there has been a focus on ensuring that the information required to handle credit card transactions is kept as secure as possible and that the design adheres to the standards of the market. There are two aspects to consider here: the data that is stored in the ERP system and what is sent to the payment providers through the service.
      • On the ERP side, this includes encrypted storage of the customer's credit card number, as well as ensuring that users don't have access to the numbers.
      • The payment service itself is certified by following the guidance of the Payment Card Industry (PCI) Security Standards Council.

    Scenarios Covered by the Integration

    The areas that are relevant when describing the integration to the Payment Service can be described by the following scenarios:

    1. Authorization of the amount from the Sales Order or Invoice against the customer's credit card
    2. Capture of the actual amount and thereby also creating the payment in the system
    3. Voiding the authorized amount
    4. Refunding against an existing capture

    To describe the scenarios it is useful to think about the personas using the functionality; in this case we work with Susan, the Order Processor, as well as Arnie, the Accounts Receivable Administrator.

    As a part of Susan's work, she receives and processes the incoming orders from the sales representatives. In some cases she will talk to the customer to validate the orders and to ensure that items are available and that the price is correct according to the agreement. In some cases the customer may request to pay using a credit card instead of having to handle the invoice later. Susan has to ensure that the information required for using a credit card is available; if not, she will get the information from the customer.

    If Susan needs to be certain that the customer can pay the agreed amount, she can go ahead and authorize the amount against the provided credit card information. If the result is successful, the sales order can be shipped. When the sales order (or part of it) is posted, the actual capture of the amount on the sales order is automatically processed. When the capture is successful, the payment is automatically registered and the money will be received shortly.

    On the sales order there are two new fields for the credit card, as well as a requirement to use a specific Payment Method (described below in the Getting Started section).

     

    The scenario above is the simplest process supported by the new payment service integration. The following scenarios are also covered by the implementation:

    1. Partial posting of the sales order will only capture the amount that is posted. The rest can be captured later.
    2. It is possible from the Cash Receipt Journal to accept multiple credit cards against the same invoice. This is done by adding multiple lines in the journal - one per credit card.
    3. It is possible to use a credit card transaction to cover more than one invoice. Again, this is only possible from the Cash Receipt Journal.
    4. It is possible to void an existing authorization in case the amount is not needed. This is implemented as a manual step only.
    5. It is possible to refund an existing transaction as well as part of a transaction. This is done through the Credit Memo.

    All of the above transactions and connections to the payment service can be seen on the specific customer as well as on the specific documents. In all these places a Transaction Log has been implemented that shows the status of the current transactions and whether the connections have been successful.

    Getting Started

    Enabling the payment integration does require a couple of steps both inside and outside NAV:

    • First of all, it is required to sign up for the Payment Service. Details can be found here: Microsoft Dynamics Online Payments Introduction. The sign-up includes validating and choosing which Payment Provider to use. There are differences in cost and support, so some investigation is recommended. After signing up, a Live ID and a password are provided; these are required when setting up the connection.
    • Within Microsoft Dynamics NAV 2009 R2 there are a couple of steps that need to be completed:
      • First of all, the connection needs to be set up, and this is where the Live ID and the password are required. Please note that the Microsoft Dynamics ERP Payment Service Connection Setup is only available in the Classic client for security reasons.

      • For the connection to carry the correct currency, the currency field on the General Ledger Setup needs to be filled out. Please check the help documentation for the correct values.
      • Finally, a Payment Method must be created (for example, CRCARD) that uses the Payment Processor field, as well as a bank account with the currency that was signed up with.

    For more information please look at the following resources:

    -Rikke Lassen

     
