December, 2012

  • Microsoft Dynamics NAV Team Blog

    Reading and Writing Unicode Files using C/AL


    Hello,

We have had some partner suggestions for adding Unicode capabilities to the existing Microsoft Dynamics NAV file functions. Our recommendation is to use .NET Interop to achieve this functionality.

For example, you can use an instance of the System.IO.StreamWriter class to write to an OutStream, or the System.IO.StreamReader class to read from an InStream. You control the encoding through the System.Text.Encoding class, where you can select Default, Unicode, or UTF8 encoding.

• Please note that XMLports in Microsoft Dynamics NAV 2013 directly support importing and exporting flat text files in MS-DOS, UTF-8, or UTF-16 encoding by setting the new TextEncoding property.

    Writing Unicode text files

Let’s start with a small example of writing some text to a file in Unicode format.

    Declare the following variables:

    Name DataType Subtype
    outFile File  
    outStream OutStream  
    streamWriter DotNet System.IO.StreamWriter.'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' 
    encoding DotNet System.Text.Encoding.'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' 

    Then write this code, using a suitable path for the outFile:

      outFile.CREATE('c:\temp\OutFile.txt');

      outFile.CREATEOUTSTREAM(outStream);

      streamWriter := streamWriter.StreamWriter(outStream, encoding.Unicode);

     

      streamWriter.WriteLine('Hello World');

     

      streamWriter.Close();


      outFile.CLOSE();
• You can use 'Hello World' as above, or a text string with more special characters if you like. I added some Danish characters, 'Hello ÆØÅ World', to more easily see what’s going on.

Run the code and verify that the file is in Unicode.

    • One way to verify a file is in Unicode is to open it with Notepad and select File/Save As…, and then inspect the Encoding field.

Try changing the above example to use encoding.Default and verify that the file is in ANSI (code page) format (for example, using Notepad as above).
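The only change needed is the encoding argument in the StreamWriter constructor. Here is a sketch reusing the variables declared above (the file name is just an example):

```
  outFile.CREATE('c:\temp\OutFileAnsi.txt');
  outFile.CREATEOUTSTREAM(outStream);
  // encoding.Default writes the file in the system's ANSI code page
  streamWriter := streamWriter.StreamWriter(outStream, encoding.Default);
  streamWriter.WriteLine('Hello ÆØÅ World');
  streamWriter.Close();
  outFile.CLOSE();
```

With Default encoding, characters outside the current code page cannot be represented and may be replaced, which is one way to see the difference from the Unicode version.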

    Please note that if you use C/AL to write directly to the outStream, like this:

    outStream.WRITETEXT('Hello World');

    this is still handled using MS-DOS encoding and is compatible with previous versions of Microsoft Dynamics NAV.

    • You can open a file in MS-DOS encoding by starting Wordpad and in the Open file dialog select “Text Documents – MS-DOS Format (*.txt)”.

In Microsoft Dynamics NAV 2013, all normal text handling is done in Unicode. The data that is entered on pages – like customer names on the Customer Card – can therefore use the Unicode character set, be stored in the database, and be used in C/AL code.

    Let’s see how you can extend the above example with data from the Customer table:

    Add the customer variable:

    Name DataType Subtype
     customer Record   Customer

    And write this code – you can set a filter if needed:

      outFile.CREATE('c:\temp\Customers.txt');

      outFile.CREATEOUTSTREAM(outStream);

      streamWriter := streamWriter.StreamWriter(outStream, encoding.Unicode);

     

      customer.FINDSET();

      REPEAT

        streamWriter.WriteLine(customer.Name);

      UNTIL customer.NEXT = 0;

     

      streamWriter.Close();

      outFile.CLOSE();

     If you open the Customers.txt file with a Unicode file viewer it should contain all the Unicode details you have used for your customer names.

    Reading Unicode text files

Similar to the above, you can read Unicode text files by using the System.IO.StreamReader class to read from an InStream, as shown in the small example below:

    Declare the following variables:

    Name DataType Subtype
    inFile File  
    inStream InStream  
    streamReader DotNet System.IO.StreamReader.'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'
    encoding DotNet System.Text.Encoding.'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'
    txt Text  

    Then write this code, using a suitable path for the inFile:

      inFile.OPEN('c:\temp\Infile.txt');

      inFile.CREATEINSTREAM(inStream);

       

      streamReader := streamReader.StreamReader(inStream,encoding.Unicode);

      txt := streamReader.ReadToEnd();

  MESSAGE(txt);

    Create the Infile.txt as a Unicode file and add some Unicode characters to it. You should see them in the message box when you run the C/AL code.
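If you prefer to process the file line by line instead of reading it all at once, the StreamReader class also exposes the EndOfStream property and the ReadLine method, which can be called from C/AL in the same way. A sketch, reusing the variables declared above:

```
  inFile.OPEN('c:\temp\Infile.txt');
  inFile.CREATEINSTREAM(inStream);
  streamReader := streamReader.StreamReader(inStream, encoding.Unicode);

  // Read one line at a time until the end of the stream
  WHILE NOT streamReader.EndOfStream DO BEGIN
    txt := streamReader.ReadLine();
    MESSAGE(txt);
  END;

  inFile.CLOSE();
```

This avoids loading the whole file into a single Text variable, which matters for large files.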

     

     

    I hope these simple examples can get you started with Unicode file handling in C/AL. 

     

    Thanks and best regards!

    Hans Kierulff

  • Microsoft Dynamics NAV Team Blog

    Writing Unit Tests in C/AL


    What is a unit?

    A unit is any code that encapsulates an implementation. It could be one or a combination of individual functions or triggers.
    Thus, a unit could also refer to a local function. A unit, from the C/AL perspective, has the following:

    • Inputs: Database state, input parameters, global variables
    • Outputs: Return values, Parameters passed as VAR, Changed state of database, Error raised

    Why write a unit test?

    A unit test must pass an input to the unit and verify that the output is as expected. It should be coded as a procedure in a codeunit of Test subtype.

    Since unit tests are written to safeguard the implementation contract, they are not necessarily tests for functional requirements. The latter is the domain of functional tests. A successful refactoring project may result in changing some unit tests, but must not make any functional tests fail.

    Since unit tests are quick low-level tests, it is affordable to have many of them. A larger number of unit tests make it possible to have a smaller number of functional tests, as the base behavior already gets tested as one goes towards higher-level tests.

    What does a unit test have?

    Typically a unit test consists of the following sections.

    1. PRECONDITIONS - This is the first step, which creates the setup data required by the unit during the EXERCISE step. Since one of the purposes of unit testing is fast execution, data not required by the unit test at hand should not be created. It must be noted that the purpose is not to create an object that makes business sense (those that are made via GUI) but rather to create data that can serve as input to the called unit.
    2. EXERCISE - This should typically be a one-line call to the unit being tested.
    3. VERIFICATIONS - The output from a unit needs to be verified by means of assertions. The post-conditions could include invariants for a certain unit, i.e. rules that are always conformed to for a certain unit. Example: No value entry records should have Cost is Adjusted = FALSE after a cost adjustment routine has been executed.
    4. TEAR-DOWN - It may be necessary to restore the state of the database to what it was when the test started, so that other people’s tests can be executed from a “clean slate”. This step can be skipped if the TestIsolation property is enabled on the TestRunner codeunit.
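As a minimal sketch, a test procedure following these sections could look like the following (the procedure name, record, and field are illustrative only):

```
    [Test]
    PROCEDURE VerifyBlockedCanBeSet@1();
    VAR
      Item@1000 : Record 27;
    BEGIN
      // PRECONDITIONS : create only the data the unit needs
      Item.INIT;
      Item."No." := 'MyItem';
      Item.INSERT;

      // EXERCISE : a one-line call to the unit under test
      Item.VALIDATE(Blocked,TRUE);

      // VERIFICATIONS : assert on the unit's outputs
      Item.TESTFIELD(Blocked,TRUE);

      // TEAR-DOWN : skipped here; TestIsolation on the test runner
      // rolls back the database changes
    END;
```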

    What is the purpose of a unit test?

    1. Unit tests safeguard the implementation contract to ensure that a certain piece of code behaves as coded. This could be achieved by checking for the contract of the interface of a unit, if the unit is sufficiently small. Some examples of such units are:
      1. The OnValidate trigger for the Variant Code field in the Sales Line table.
      2. The MakeDateText procedure in the TextManagement codeunit.
    2. Improve the general code coverage of the application by writing many low-level tests covering a particular code path. This results in:
      1. Increased ratio of test code compared to application code that these tests cover.
      2. More specific verification of individual code lines in the application.
    3. Unit tests must run fast by executing the minimum amount of application code in order to reach the unit.
    4. In some cases, unit tests may also be used as an alternative to full functional tests, because:

      1. There may be a need to target a unit too small to justify a functional test and/or,
      2. Unit tests are faster to run and/or,
      3. The code-coverage metrics on the application code more closely resemble the code that is being tested, unlike functional tests, which may use application code to both create setup data as well as test it. For more information, see best practices for the PRE-CONDITIONS step.

    Best practices

    With the perspective of the above objectives, a list of best practices has been drafted, which may serve as a guideline to those who create unit tests. The list should be treated as an addition to already existing best practices for writing good C/AL test code.

    1. Have clear demarcation between the three sections: PRE-CONDITIONS, EXERCISE, and VERIFICATIONS. This improves readability.

      The PRE-CONDITIONS step:
    2. Create only data (records and fields in the records) that is needed for the test.
      1. Setup data should be created using direct assignments only (":="). Using VALIDATEs and other C/AL procedures to set up data may go through unnecessary paths unrelated to the test.
      2. Avoid calling functions or triggers in the production code, as that usually does more than what the unit test requires. For example, in a case where items need to be sold from inventory, the items may be added by two approaches:
        Avoid: making a posting through item journals.
        Prefer: creating item ledger entries directly.
      3. It may not be necessary to fill in all primary key fields, as they may not be relevant to the test. For example, in a test that checks the OnValidate trigger on the Location Code field of the Sales Line table, it may not be necessary to put in the Document No. field.
      4. Because of the above reasons, the use of libraries to create business data should be limited, as they tend to make "complete" business entities. Instead, use LOCAL helper functions to promote reuse of code. As an example, some unit tests may only need (as part of their setup data) an Item record, with the Base Unit of Measure field filled in. Although the base unit of measure for any item should always have Qty. per Unit of Measure set to 1, this requirement may be irrelevant in case of such unit tests.

        LOCAL PROCEDURE CreateItemWithBaseUOM@1(VAR Item@1000 : Record 27);
        VAR
          ItemUnitOfMeasure@1001 : Record 5404;
        BEGIN
          Item.INIT;
          Item.INSERT;
          ItemUnitOfMeasure."Item No." := Item."No.";
          ItemUnitOfMeasure.Code := 'NewCode';
          ItemUnitOfMeasure.INSERT;
          Item."Base Unit of Measure" := ItemUnitOfMeasure.Code;
          Item.MODIFY;
        END;

         
      5. Improving the performance:
        1. Database changes should be limited to INSERTs only; MODIFY calls should be avoided. This limits the number of database calls and speeds up the test. For example, to create an item with a base unit of measure:
          Avoid: the helper function that creates an item record in the previous example.
          Prefer:

          LOCAL PROCEDURE CreateItemWithBaseUOM@1(VAR Item@1000 : Record 27);
          VAR
            ItemUnitOfMeasure@1001 : Record 5404;
          BEGIN
            Item.INIT;
            ItemUnitOfMeasure."Item No." := Item."No.";
            ItemUnitOfMeasure.Code := 'NewCode';
            ItemUnitOfMeasure.INSERT;
            Item."Base Unit of Measure" := ItemUnitOfMeasure.Code;
            Item.INSERT;
          END;

        2. In keeping with the approach of minimal data creation, calls to INSERT(TRUE) should be avoided. Plain INSERTs should be used, as in the above example.
        3. If record variables are to be passed to helper functions, it is recommended to pass them as VAR even though the records themselves are not going to be altered. This is faster than passing the record variables by value.
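As an illustrative sketch (the procedure name is made up), even a helper that only reads the record would take it by reference:

```
        // VAR avoids copying the Item record on each call,
        // even though the helper only reads from it.
        LOCAL PROCEDURE VerifyItemHasBaseUOM@2(VAR Item@1000 : Record 27);
        BEGIN
          Item.TESTFIELD("Base Unit of Measure");
        END;
```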
      6. If the PRE-CONDITIONS step becomes very substantial in the test, it is an indication that the unit chosen is too big. In that case, it may be better to write a functional test instead of a unit test.

        The EXERCISE step:

      7. This step is usually a one-liner, making the call to the unit. For example, in a test that checks the OnValidate trigger on the Location Code field of the Sales Line table, it is sufficient to have this line:

        SalesLine.VALIDATE("Location Code", InputValueLocationCode);
      8. However, there could be instances where a supporting call needs to be made before the actual call to the target unit. Examples of such supporting calls are as follows (this is not an exhaustive list of such cases):
        1. Global variables need to be instantiated to the correct values before making the call. For example, in a test to check the creation of inventory movement for assembly:

          CreateInvtPickMovement.SetAssemblyLine(AssemblyLine);

          CreateInvtPickMovement.SetInvtMovement(TRUE);

          CreateInvtPickMovement.AutoCreatePickOrMove(WarehouseActivityHeader);

        2. A Report object needs to be prepared before it is run. The following code sample illustrates the call to test whether the report returns the bin content:

          BinContent.SETRANGE("Item No.",BinContent."Item No.");

          WhseGetBinContent.SETTABLEVIEW(BinContent);

          WhseGetBinContent.USEREQUESTPAGE(FALSE);

          WhseGetBinContent.InitializeReport(WhseWorksheetLine,WhseInternalPutAwayHeader,0);

          WhseGetBinContent.RUN;


        3. In case the unit is an action to be called on a page, some preparation for the corresponding TestPage object is needed, as illustrated when testing the Show Availability action on the Assembly Order page:

          AssemblyOrderTestPage.OPENEDIT;

          AssemblyOrderTestPage.ShowAvailability.INVOKE;

      9. It may not always be obvious what the EXERCISE step should comprise if the unit is inaccessible from outside the object it resides in. In that case, the call should be made to the public procedure (or trigger) that traverses the smallest path to the unit. The debugger can help you find the call stack from which a unit is called. For example, to test the LOCAL function InsertPickOrMoveBinWhseActLine in the Create Inventory Pick/Movement codeunit, the following line may be used as the EXERCISE step (note the comment added to clarify what is being tested):

        // The next line tests the InsertPickOrMoveBinWhseActLine procedure

        CreateInvtPickMovement.AutoCreatePickOrMove(WarehouseActivityHeader);

        Arranging the unit tests:
      10. It is recommended not to mix unit tests and functional tests in the same codeunit, because:
        1. Unit tests are typically quicker to run and may therefore be grouped with their own kind for the first level of regression testing.
        2. Since the two kinds of tests may have been written with different intents (by those making the implementation and by those testing it), it is a good idea to keep them in separate test codeunits as well.
      11. For clean categorization, it is a good idea to keep unit tests that call the same object (in the EXERCISE step) grouped together, either by placing them in the same codeunit or by placing them in adjacent (closely numbered) codeunits.

        Others:
      12. Comments are a big help in understanding the intent of the test creator, especially because a unit test may create non-intuitive or incomplete business entities that may be impossible to create from the user interface.
      13. It is recommended to switch on TestIsolation when running unit tests, as these tests may leave corrupted data behind.

    Examples of unit tests

    Examples are based on the W1 application code in Microsoft Dynamics NAV 2013.

    Example 1

    Business purpose

    Test that changing the Costing Method on an item to Specific leads to an error if the tracking code is not Serial number specific.

    The unit

    OnValidate trigger of the Costing Method field on the Item table:

      

    ...

    IF "Costing Method" = "Costing Method"::Specific THEN BEGIN
    ...
      IF NOT ItemTrackingCode."SN Specific Tracking" THEN
        ERROR(
          Text018,
    ...

     

    The test

         [Test]
        PROCEDURE VerifyErrorRaisedOnChangingCostingMethodToSpecific@1();
        VAR
          ItemTrackingCode@1001 : Record 6502;
          Item@1000 : Record 27;
        BEGIN
          // Changing the Costing Method on an Item card to Specific
          // leads to an error if the tracking code is not Serial number specific

          // SETUP : Make item tracking code without SN Specific Tracking
          ItemTrackingCode.INIT;
          ItemTrackingCode.Code := 'MyITCode';
          ItemTrackingCode."SN Specific Tracking" := FALSE;
          ItemTrackingCode.INSERT;

          // SETUP : Make item with above item tracking code
          Item.INIT;
          Item."No." := 'MyItem';
          Item."Item Tracking Code" := ItemTrackingCode.Code;

          // EXERCISE : Validate Costing method to Specific
          ASSERTERROR Item.VALIDATE("Costing Method",Item."Costing Method"::Specific);

          // VERIFY : error message
          IF STRPOS(GETLASTERRORTEXT,'SN Specific Tracking must be Yes') <= 0 THEN
            ERROR('Wrong error message');
        END;

    Observations

    The target unit was a very specific line, and the above test was therefore short and precise.

     

    Example 2

    Business purpose

    Test that posting a sales order creates a posted  shipment line with the correct quantity.

    The unit

    The OnRun trigger in Sales-Post codeunit.

    The test

         [Test]
        PROCEDURE TestPostedSalesQuantityAfterPosting@2();
        VAR
          Item@1001 : Record 27;
          SalesHeader@1000 : Record 36;
          SalesLine@1002 : Record 37;
          SalesShipmentLine@1003 : Record 111;
          ItemUnitOfMeasure@1005 : Record 5404;
          Quantity@1004 : Decimal;
        BEGIN
          // Post a sales order and verify the posted shipment line quantity.

          // SETUP : Make item with base unit of measure and posting group
          Item.INIT;
          Item."No." := 'MyItem';
          // Create unit of measure
          ItemUnitOfMeasure."Item No." := Item."No.";
          ItemUnitOfMeasure.Code := 'PCS';
          ItemUnitOfMeasure.INSERT;
          Item."Base Unit of Measure" := ItemUnitOfMeasure.Code;
          Item."Inventory Posting Group" := 'RESALE';
          Item.INSERT;

          // SETUP : Create sales header with item and quantity
          SalesHeader."Document Type" := SalesHeader."Document Type"::Order;
          SalesHeader."No." := 'MySalesHeaderNo';
          SalesHeader."Sell-to Customer No." := '10000';
          SalesHeader."Bill-to Customer No." := SalesHeader."Sell-to Customer No.";
          SalesHeader."Posting Date" := WORKDATE;
          SalesHeader."Document Date" := SalesHeader."Posting Date";
          SalesHeader."Due Date" := SalesHeader."Posting Date";
          SalesHeader.Ship := TRUE;
          SalesHeader.Invoice := TRUE;
          SalesHeader."Shipping No. Series" := 'S-SHPT';
          SalesHeader."Posting No. Series" := 'S-INV+';
          SalesHeader."Dimension Set ID" := 4;
          SalesHeader.INSERT;

          // SETUP : Create the sales line
          SalesLine."Document Type" := SalesHeader."Document Type";
          SalesLine."Document No." := SalesHeader."No.";
          SalesLine.Type := SalesLine.Type::Item;
          SalesLine."No." := Item."No.";
          Quantity := 7;
          SalesLine.Quantity := Quantity;
          SalesLine."Quantity (Base)":= Quantity;
          SalesLine."Qty. to Invoice" := Quantity;
          SalesLine."Qty. to Invoice (Base)" := Quantity;
          SalesLine."Qty. to Ship" := Quantity;
          SalesLine."Qty. to Ship (Base)" := Quantity;
          SalesLine."Gen. Prod. Posting Group" := 'RETAIL';
          SalesLine."Gen. Bus. Posting Group" := 'EU';
          SalesLine."VAT Bus. Posting Group" := 'EU';
          SalesLine."VAT Prod. Posting Group" := 'VAT25';
          SalesLine."VAT Calculation Type" := SalesLine."VAT Calculation Type"::"Reverse Charge VAT";
          SalesLine.INSERT;

          // EXERCISE : Call the codeunit to post sales header
          CODEUNIT.RUN(CODEUNIT::"Sales-Post",SalesHeader);

          // VERIFY : A Posted Shipment Line is created with the same quantity
          SalesShipmentLine.SETRANGE("Order No.",SalesHeader."No.");
          SalesShipmentLine.FINDLAST;
          SalesShipmentLine.TESTFIELD(Quantity,Quantity);
        END;

    Observations

    The targeted unit is large, and many lines execute before the code that sets the quantity on the Sales Shipment Line is reached. To reach this line, a large setup is needed in the test as well, which makes the unit test bulky. It may be better to write a functional test in this case.

    References

    1. Testing the application
    2. Testing pages
    3. TestIsolation Property

     

     

    Soumya Dutta from the NAV team

  • Microsoft Dynamics NAV Team Blog

    Technical videos for Microsoft Dynamics NAV 2013


    The Microsoft Dynamics NAV support team has posted videos on YouTube that illustrate various aspects of deploying and configuring Microsoft Dynamics NAV 2013, including tips for extending the core product. You can share the videos with other partners and with your customers – they are the same videos that our internal support engineers have been able to use for a while, and we hope you will enjoy them as much as we do.

    The following table lists the videos that are currently available. Note that more videos may be added later, so check back from time to time!

    Video: Microsoft Dynamics NAV 2013 – Technical – NAV Cluster
    Description: This video shows how Microsoft Dynamics NAV 2013 behaves with NLB clustering in a three-tier setup.

    Video: Microsoft Dynamics NAV 2013 – Technical – Unicode
    Description: This video shows what’s new in Microsoft Dynamics NAV 2013 for Unicode and how it works for Unicode data.

    Video: Microsoft Dynamics NAV 2013 – Technical – SharePoint
    Description: This video explains how to set up, configure, and use the new Microsoft Dynamics NAV 2013 SharePoint client. It also highlights some of the client’s features, as well as tips for debugging the client using commonly used SharePoint debug methods.

    Video: Microsoft Dynamics NAV 2013 – Technical – SQL Filtering
    Description: This video demonstrates the different options for filtering data that are available in the Microsoft Dynamics NAV 2013 query object.

    Video: Microsoft Dynamics NAV 2013 – Technical – SQL Query Object
    Description: This video demonstrates creating and executing a typical query object in Microsoft Dynamics NAV 2013.

    Video: Microsoft Dynamics NAV 2013 – Technical – SQL Queries with C/AL Code
    Description: This video demonstrates how to use a Microsoft Dynamics NAV 2013 query object in your C/AL code, including best practice tips.

    Video: Microsoft Dynamics NAV 2013 – Technical – SQL Joining Tables
    Description: This video demonstrates how to join multiple data items (tables) and filter results with a Microsoft Dynamics NAV 2013 query object.

    Video: Microsoft Dynamics NAV 2013 – Technical – Charts
    Description: This video shows how to create and modify a generic chart, and how to make a simple extended chart.

     

    Best regards,

     

    The Microsoft Dynamics NAV support team

     

  • Microsoft Dynamics NAV Team Blog

    Partners Meet Customer Demand for Faster Implementation


    “Our setup time is down from more than a week to only a few hours using RapidStart Services for Microsoft Dynamics NAV”

    RapidStart Services for Microsoft Dynamics NAV has helped Microsoft partner Abakion deploy their Purchase Order Management solution much faster. A standard setup used to take them over a week. With RapidStart Services, implementation time is down to only a few hours.

    RapidStart Services for Microsoft Dynamics NAV was introduced in response to customer demand for faster implementation. “Customers want to hear that it’s easy to convert data, so their transition to their new ERP system will be smooth,” says Claus Lundstrøm, Senior Product Manager at Abakion.

    What’s the big idea?

    The idea behind RapidStart Services was to enable companies to complete a setup with minimal training, using ready-to-use data templates. This shortens the implementation time and allows customers to focus on their business, rather than getting caught up in a time-consuming system switch.

    RapidStart Services, an integrated part of Microsoft Dynamics NAV, comes with standardized Configuration Packages and Questionnaires that make the whole process faster. It also provides customers with a project overview and the ability to automate importing data from their old system using Master Data Import.

    Data import is no longer a hassle

    Abakion started using RapidStart Services for Microsoft Dynamics NAV to shorten their setup process and respond to the competitive situation.

    “We learned how to use it quickly, and it has decreased our implementation time dramatically. Our Purchase Order Management add-on is now installed and configured at a customer site in only a few hours,” says Lundstrøm.

    Abakion now uses Master Data Import to configure and set up their solution. Not only is the data imported, but all the details are taken care of as well. The imported company-specific data includes user access rights and roles, which determine what each user has access to according to the role assigned to them.

    “If the data could not be imported with RapidStart, it would take at least a week to set up manually,” Lundstrøm says.

    Configuration templates save a lot of time

    Companies can use out-of-the-box Configuration Packages in RapidStart Services in order to save time. This makes it fast and easy to configure generic data, such as currency codes, posting groups, VAT templates, chart of accounts, payment terms, and countries.

    Existing Configuration Packages can always be extended later on, if needed. Companies can also choose to import their own customized Configuration Packages using Master Data Import.

    Abakion used their own customized text templates with specific text for multiple languages, importing 30 different text templates.

    About Abakion’s Purchase Order Management solution

    Abakion’s Purchase Order Management is an add-on solution built on Microsoft Dynamics NAV, which gives control over changes to purchase orders through a portal hosted on Microsoft Azure.

    Once the quotes/orders are created by the purchaser, they are then available in the portal, where vendors can update the information about a quote/order.

    “Then, via the portal, the vendor can inform the purchaser if he can deliver on time. If he can only partially deliver, he can make changes to the order or quote,” says Lundstrøm.

    The purchaser is automatically informed about all changes in Microsoft Dynamics NAV and can take necessary actions by looking at the overview of outstanding quotes and orders.

     

  • Microsoft Dynamics NAV Team Blog

    Disassembly in Microsoft Dynamics NAV 2013 Assembly Management


    Assembly Management was released as part of Microsoft Dynamics NAV 2013, and it includes a set of features designed for companies that supply products to their customers by combining components in simple processes, such as assembly, light manufacturing, and kitting.

    Kitting and Assembly Management

    Before the Microsoft Dynamics NAV 2013 release, customers and partners in selected regions and countries, such as North America, France, and Australia, may have been familiar with similar local functionality called Kitting. The uptake of Kitting in those geographies has been both the driver and the inspiration behind the global Assembly Management feature. Its value proposition and functional breadth were closely investigated by the NAV core development team, and Kitting’s key business and user requirements laid the conceptual foundation for the new Assembly Management feature.

    To ensure the quality of the new global Assembly Management feature, as well as its solid integration into the existing supply chain suite, the core development team produced a physical and conceptual design that differs significantly from the one in Kitting. The differences are many. While Kitting used the BOM journal to manage the assembly process, Assembly Management operates with the concept of an assembly order, which offers more functional flexibility, a better user interface, and extensibility. Last, but not least, the assembly order offers superior integration points to the rest of inventory, such as availability calculations and order promising, item tracking, supply planning, warehousing, and costing. This is understandable because the BOM journal, like any other journal in the application, differs from an order in flexibility and scope: the BOM journal’s main goal is to support users in quick data entry for a subset of specialized transactions, while bypassing any auxiliary inventory processes.

     Disassembly

    With the introduction of the assembly order, the BOM journal was deemed unnecessary. This also meant that a disassembly process, which the BOM journal partially supported, would have to be redesigned in Assembly Management along the same principles as the common assembly process. More specifically, a disassembly order to manage a reverse conversion process, i.e. from one item to many, would need to be added and integrated with the same supply chain features that the assembly order integrates with (see above). The disassembly order was not included in Microsoft Dynamics NAV 2013.
    Customers who assemble their items themselves (as opposed to purchasing them from outside or getting them back through returns), and who need to reverse a previously made assembly, can use the new Undo Assembly function.

    The option of keeping the BOM journal in the application for the sake of disassembly was rejected for several reasons:

    a) Though useful in some scenarios, the actual BOM journal support for a disassembly process was limited to automatic creation of negative and positive adjustments for finished items and components respectively.

    b) Its handling of cost flows required serious improvement.

    c) As pointed out earlier, a journal line offers no integration to other parts of the inventory processes.

    The core development team will be evaluating the possibility of including a properly designed disassembly feature in future NAV releases. Kitting customers that used the BOM journal for their disassembly tasks should approach their partners to discuss reintroducing a journal for disassembly scenarios. The core development team will make no design recommendations in this regard.

    For more information, see Assemble Items in the MSDN Library.

    Best regards,

    The NAV Supply Chain team
