Microsoft Dynamics NAV

Team Blog

  • Microsoft Dynamics NAV Team Blog

    Best Practices Tips and Tricks for Upgrading to Dynamics NAV 2013 R2 or Dynamics NAV 2015


    Earlier, we announced that Dynamics NAV 2009 SP1 and Dynamics NAV 2009 R2 would go out of mainstream support in January 2015, and that means that the Classic Client stack is no longer supported:

    Mainstream Support for Microsoft Dynamics NAV 2009, including 2009 SP1 and 2009 R2, ends on January 13, 2015

    We know that many of you have been upgrading to the latest version for years so that you can benefit from the various improvements and new functionality that new versions include. But with the Classic Client no longer supported, it's time to get your customers upgraded to the Dynamics NAV Windows client. That's why we have issued lots of documents and articles to help you in this process, including the option of automating the upgrade process.

    But over the past couple of years, we in the Support organization have collected a range of tips and tricks that we would like to share with you to help you get your customers upgraded. We will summarize those tips and tricks in two blog posts: this one and another one following soon.

    The purpose of this is to lay the groundwork for future upgrades and to guide you in the best practices for upgrading from the older Dynamics NAV platforms to the latest.

    This includes the following points:

    • Preparation - what should I take care of before I start?
    • What should I follow to guarantee a successful upgrade?
    • Known Issues and how to work around some complicated situations during upgrade.

    Preparations

    The first question is always, what should I take care of before I start? Do I have to prepare anything at all? What are the available toolkits to use, where do I find them?

    Available Upgrade variations and toolkits

    In this section, we describe the available toolkits and methods used to upgrade to Dynamics NAV 2013 R2 / Dynamics NAV 2015, so that you can plan which toolkit to use based on the current version of the database that needs to be upgraded.

    First, get the right upgrade toolkit for the job:

    Upgrading from:                        Toolkit / Variation:

    Microsoft Dynamics NAV 2013            Standard Upgrade Toolkit
    Microsoft Dynamics NAV 2013 / R2       Standard Upgrade Toolkit / Automated Data Upgrade
    Microsoft Dynamics NAV 2009 SP1        Upgrade first to NAV 2013, then to NAV 2013 R2 / NAV 2015
    Microsoft Dynamics NAV 2009 R2         Upgrade Toolkit directly to NAV 2013 R2 / NAV 2015
    Older versions                         Upgrade first to Microsoft Dynamics NAV 2009

    Get the upgrade toolkit from the product media of the version you upgrade to. At the end of this blog post, we list a couple of links to videos and steps.

    OK, now I have the right toolkit. What should I take care of before I start?

    To make sure that you will not face big issues while upgrading, or later on while working in Microsoft Dynamics NAV 2013 R2 / Microsoft Dynamics NAV 2015, please consider the following:

        • Are all objects compiled?
          • Make sure that all your objects are compiled before the conversion, because any uncompiled object is a potential source of many upgrade issues.
        • What to do with third-party objects?
          • Third-party objects also have to be fully compiled before you start.
          • You have to have a license that allows you to compile these objects.
          • If the third-party objects are old and no longer going to be used, delete them before you start.
          • In some situations, no such license exists because there is no longer a relation to the party that developed the component. In this case, work around it by deleting the objects from the database and cleaning them out of the Object table and the Object Metadata table manually in the older database before you start.
        • Are you using another language / localization different from W1?
          • Do you have a different collation?
            • There was a collation change between older and newer versions of NAV.
            • Starting with NAV 2013, we push the use of Windows collations over SQL Server collations. This means that if you used special SQL Server collations in older databases, you have some extra work preparing the data.
            • It is mainly about ensuring that no primary key fields depend on the old SQL collation, so if you have primary keys with any special characters, you might be obliged to fix these values before upgrading.
            • One example in the German language is the letter ‘ß’, which is treated as equal to the two letters ‘SS’ only in some special SQL collations.
            • If you do get a primary key violation related to such an example, be aware that it is a data issue, and it has to be fixed manually.
            • Consider that changing the collation of a huge database can be a long process that needs to be planned properly. We will return to collation issues later in this document.
          • Special caption problems?
            • It has been reported more than once that, while converting the data in NAV 2013 R2 / NAV 2015, you get errors related to an invalid value.
            • These values mainly come from the captions of some objects in the older database.
            • After converting the data from non-Unicode to Unicode and moving some values from C/AL to XML, some characters can produce a value that is not recognized by NAV 2013 R2 / NAV 2015.
            • These values are saved in metadata while or after those objects are compiled for the first time in NAV 2013 R2 / NAV 2015.
            • A typical error is:
              “Hexadecimal value 0x[whatever] is an invalid character”
              To check what is valid and invalid, see the link below:
              http://seattlesoftware.wordpress.com/2008/09/11/hexadecimal-value-0-is-an-invalid-character/
            • The problem here is that sometimes you cannot know exactly where this value might be.
            • If you do not know where the problem is, you can try removing the captions before starting the upgrade, carry on with the upgrade without them, and re-import them afterwards.
        • Do you have dimension codes and values, especially coming from older versions?
          • The structure of dimensions in Microsoft Dynamics NAV has changed a couple of times between versions, especially if you come from a very old version.
          • Before upgrading, make sure that any dimensions used in detail tables also exist in the dimension master table.
          • Simply put, you need to make sure that the relation between the Dimension Value table and all dimension-related transaction tables is well maintained. A famous example of a well-known error while upgrading to NAV 2013 R2 / NAV 2015 is the following:
            'Dimension value ID' must contain a value in 'dimension set items': 'dimension set id = 0, dimension code = COST CARRIERS'. Value cannot be null or empty
            or
            'Dimension Value ID must have a Value in Dimension Set Entry'
            • This is mainly about the relation between the Dimension Value table and the transaction tables containing dimension information.
            • For example, values that are in table 355 and do not exist in table 349 can cause such an error.

              You can check the consistency with a simple query (here checking the Ledger Entry Dimension table, table 355, against the Dimension Value table, table 349):

              select *
              from [CRONUS AG$Ledger Entry Dimension] A
              where A.[Dimension Value Code] not in
              (
                select B.Code
                from [CRONUS AG$Dimension Value] B
              )

          • If you find any inconsistency, check the values and either add them to the master table 349 or delete them from table 355.
          • The query above does not apply to all situations; it is just an example to illustrate how to approach such an issue.
          • Another reason for this error can be special characters in the dimension name, which might not be understood after the collation change that takes place when the database is opened for the first time with the Dynamics NAV 2013 client.
          • There was also a good blog discussing the same behavior: http://www.mibuso.com/forum/viewtopic.php?t=57192
          • Another example is also related to dimensions, in case you used the Service Management module in older versions.
            • In step 2 in NAV 2013 R2 / NAV 2015, you would get a similar error:
              The Metadata Object of Table 5930 was not found.
            • An older internal KB article about upgrading to NAV 2009 with the Service Management module stated the following:
              Due to major changes in the design of the Service Management functionality, it is necessary to remove unnecessary service-related data stored in the database to avoid data inconsistency and business logic gaps when upgrading to Microsoft Dynamics NAV 2009.
            • A simple solution is to delete all records related to table 5930 from the Posted Document Dimension table before even starting step 2.
        • Huge databases
          • Upgrading huge databases should be carefully planned, from planning the right resources up to planning enough time for testing, because the first time it might take you very long to work out how to optimize the upgrade for a huge database.
          • Every database is a different situation, so please take your time planning for every database based on the following:
            • Do you have your own vertical solutions / customized add-ons?
            • Do you have many companies in the same database?
            • Do you use FlowFields excessively in your customizations?
            • Is your database more or less than 100 GB? Did you plan enough hardware resources on your test server? Please consider this starting from the first test to avoid wasting time testing on weak machines.
            • Also think about involving Microsoft Support from the beginning, to get some advice while planning your upgrade.
          • Huge databases might run into performance issues while upgrading.
          • We will return to the subject of performance while upgrading later in this document, in more detail.
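
    As a side note to the collation point above: the ‘ß’ vs ‘SS’ equivalence can be illustrated outside NAV with a small sketch (plain Python, purely illustrative; the helper name and key values are made up, and a real check would of course run against the actual primary key columns of the old database):

```python
# Illustrative sketch only (not NAV or SQL code): find primary-key values
# that would collide once a collation treats 'ß' and 'ss' (and upper/lower
# case) as equal, the way some SQL Server collations do.
def collation_collisions(keys):
    groups = {}
    for key in keys:
        # str.casefold() lowercases and expands 'ß' to 'ss', which mimics
        # a case-insensitive, ß/ss-equivalent comparison.
        groups.setdefault(key.casefold(), []).append(key)
    return {norm: vals for norm, vals in groups.items() if len(vals) > 1}

# 'STRASSE' and 'STRAßE' are distinct byte-wise but equal under such a
# collation, so they would violate a primary key after the change.
print(collation_collisions(['STRASSE', 'STRAßE', 'MUNICH']))
```

    Finding such duplicates before the conversion lets you treat them as the data issue they are, as described above.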

    What should I follow to guarantee a successful upgrade?

    Upgrading to Dynamics NAV 2013 R2 / NAV 2015 can be complex and time-consuming if you do not know which guide to follow. For such a long process, you always need a step-by-step procedure to guide you to the end of the activity. We have many materials on this in the community, on PartnerSource, and on our blogs, but we thought it would be good to mention some of the recommended material that you can follow while upgrading.

    As we mentioned earlier in this blog post, it depends on which version of Dynamics NAV you are coming from. You can get an overview of the steps required in the MSDN Library here:

     

    Also, you can watch these videos:

    For upgrading from Dynamics NAV 2013 R2, you can use automation. For inspiration, watch this video:

    http://www.youtube.com/watch?v=cuYV4FO97U0

    This toolkit, with a step-by-step explanation of how to use it, can be found in the following blog post: http://blogs.msdn.com/b/nav/archive/2013/11/22/upgrade-toolkit-for-upgrading-data-from-microsoft-dynamics-nav-2009-r2-and-microsoft-dynamics-nav-2009-sp1-to-microsoft-dynamics-nav-2013-r2.aspx?PageIndex=2

    For upgrading from Dynamics NAV 2009 SP1/R2, you previously had to upgrade to Dynamics NAV 2013 first. Here is how you can upgrade directly to Dynamics NAV 2015:

    http://blogs.msdn.com/cfs-filesystemfile.ashx/__key/communityserver-components-postattachments/00-10-56-97-14/Upgrading-NAV-2009-R2-or-NAV-2009-SP1-Database-to-NAV-2015.pdf

    And the newly released blog post:

    http://blogs.msdn.com/b/nav/archive/2015/02/23/upgrading-from-microsoft-dynamics-nav-2009-r2-or-microsoft-dynamics-nav-2009-sp1-to-microsoft-dynamics-nav-2015.aspx

    If you upgrade from an earlier version than described here, you have to first upgrade to Dynamics NAV 2009 SP1.

    Be careful with uncompiled objects and the way data was stored in the older versions, as mentioned earlier in this document.

     

    As I mentioned before, the amount of information is too big for one blog post, so I will stop here and cover the rest of the tips and tricks in a future post.

    Abdelrahman Erlebach

    Microsoft Dynamics NAV Support EMEA

  • Microsoft Dynamics NAV Team Blog

    Cumulative Update 8 for Microsoft Dynamics NAV 2015 has been released


    Cumulative Update 8 includes all application and platform hotfixes and regulatory features that have been released for Microsoft Dynamics NAV 2015. 

     The cumulative update includes hotfixes that apply to all countries and hotfixes specific to the following local versions:

    •   AU - Australia
    •   AT - Austria
    •   BE - Belgium
    •   CH – Switzerland
    •   CZ – Czech Republic
    •   DE - Germany
    •   DK - Denmark
    •   ES - Spain
    •   FI  - Finland
    •   FR - France
    •   IS - Iceland
    •   IT - Italy
    •   NA - North America
    •   NL - Netherlands
    •   NO - Norway
    •   NZ - New Zealand
    •   RU – Russia
    •   SE - Sweden
    •   UK - United Kingdom

    Note: You must convert the database if you are upgrading to this cumulative update from a cumulative update earlier than Cumulative Update 8 (build 41370). For more information, see Converting a Database in Help for Microsoft Dynamics NAV. 
    However, if you then open the converted database in an older version of the development environment (such as Cumulative Update 7 or earlier), the database will be silently downgraded. You will not see a warning.
    This has no impact on database structure. Users connecting to a Microsoft Dynamics NAV Server that accesses this database will not be affected. But we recommend that you immediately convert the database again by opening it in the new development environment from Cumulative Update 8.
     

    Where to find Cumulative Update 8

    You can download the cumulative update from KB 3069272 – Cumulative Update 8 for Microsoft Dynamics NAV 2015 (Build 41370). 

    Additional Information

    For information about how to install the cumulative update, see How to Install a Microsoft Dynamics NAV 2015 Cumulative Update

    For information about how to work around a recent process change, see How to Get Back the 'Hotfix Directories' from NAV 2015 Cumulative Update 1.

    For a list of all cumulative updates for this version, see Released Cumulative Updates for Microsoft Dynamics NAV 2015.  

  • Microsoft Dynamics NAV Team Blog

    Introducing the Microsoft Dynamics NAV Application Profiler


    Did you ever wish you could monitor how your application code performs in real time? We have produced a sample and a video that can help you get started with C/AL tracing and application performance profiling.

    Get the Microsoft Dynamics NAV Application Profiler here, http://go.microsoft.com/fwlink/?LinkId=403898, and watch the video on how to install, configure and use this tool to enable C/AL code tracing and determine application performance during code execution here, http://go.microsoft.com/fwlink/?LinkId=403897.

     

    David Worthington and Dmytro Sitnik from the Dynamics NAV team

    Format: ???
    Duration: 7:00

  • Microsoft Dynamics NAV Team Blog

    Cumulative Update 7 for Microsoft Dynamics NAV 2013 R2 has been released


    Cumulative update 7 includes all application and platform hotfixes and regulatory features that have been released for Microsoft Dynamics NAV 2013 R2. 

    The cumulative update includes hotfixes that apply to all countries and hotfixes specific to the following local versions:

    • AU - Australia
    • AT - Austria
    • BE - Belgium
    • CH - Switzerland
    • DE - Germany
    • DK - Denmark
    • ES - Spain
    • FI  - Finland
    • FR - France
    • IS - Iceland
    • IT - Italy
    • NA - North America
    • NL - Netherlands
    • NO - Norway
    • NZ - New Zealand
    • RU – Russia
    • SE - Sweden
    • UK - United Kingdom

    Where to find cumulative update 7

    You can download cumulative update 7 from KB 2964528 – Cumulative Update 7 for Microsoft Dynamics NAV 2013 R2 (Build 36703 - our apologies for originally posting the wrong build number).

    For a full list of all hotfixes included in cumulative updates for Microsoft Dynamics NAV 2013 R2, see the following CustomerSource and PartnerSource pages: 

    CustomerSource:

    PartnerSource

    More Information

    For more information about cumulative updates for Microsoft Dynamics NAV 2013 R2, see Announcement of update rollups for Microsoft Dynamics NAV 2013 R2

  • Microsoft Dynamics NAV Team Blog

    Microsoft Dynamics NAV: Faster than ever.


     

    First, let me start by saying that based on the evidence so far: It is. Significantly.

    A number of the changes to the Dynamics NAV 2013 architecture contribute to the performance boosts that many of the tests have shown. To outline some of the changes:

    • Middle tier is 64 bit
    • We are no longer using cursors but MARS
    • Ability to auto-update SIFT fields when looping, rather than repeating the process for each call. SETAUTOCALCFIELDS reduces the number of statements issued
    • Pages containing FlowFields now issue ‘SmartSQL’ queries, basically one source table query with outer joins for each FlowField query, again reducing the ‘chattiness’
      and calculating SIFT values in one server round trip
    • Global cache
    • REPEATABLEREAD is default isolation level
    • Locking changes in posting routines, locking at later point and reducing overall blocking time

    … and more.

    However(!), a few things have surfaced over time that are not as explicitly documented as the changes above, nor as apparent, and might have unexpected side effects. I have collected some of the side effects that you might or might not be aware of and that might leave you panicking if you are not certain what you’re facing.

     

    • Pages (with FlowFields) will in general run faster on Dynamics NAV 2013 than on NAV 2009 due to SmartSQL queries and reduced chattiness. FlowFields on a page (all of them) are calculated in one go, which greatly reduces the number of statements issued. But SmartSQL queries are not cached. Also, since we changed to MARS, SQL Server seems to be more sensitive to an index's column cardinality than before. You might experience that (depending on various factors) some queries that have run fine in the past now take much longer; this is due to a poor execution plan on SQL Server. With dynamic cursors, the query plan optimizer tended to optimize for the ORDER BY, while with MARS, SQL Server is free to choose. It does a good job at it too, but on rare occasions it might choose poorly. These cases are exceptions and, contrary to popular belief, this is not caused by the SmartSQL queries. These queries will have the same poor execution plan and perform poorly even when isolated from the SmartSQL query.

     

    Consider the following example. This is just an illustration of the problem, constructed on CRONUS extended database:

    A lot of inventory transactions are posted through the Item Journal, generating a lot of entries in the Post Value Entry to G/L table. After this, when browsing an item list and opening an Item card, the page opens very slowly.

    After locating the offending query in the SQL profiler and running it isolated in Microsoft SQL Server Management Studio with ‘Include Actual Execution Plan’ option enabled, the plan looks similar to the one shown below:

     

     

    Each sub-query shows a reasonably small percentage of cost and no apparent reason for a bad execution plan. There are no obvious extreme costs; however, there is a Clustered Index Scan here:

     

    Looking at the filter that SQL Server applies, it filters on “Post Value Entry to G_L”.“Item_No_” = “Item”.”No_”:


     

     

    Although SQL Server reports the Operator Cost as small, it shows a CPU Cost in excess of 2.3 with approx. 1.94 executions. So it is likely scanning the table twice, and unfortunately the table has 4.7 million rows.

    Although it is not obvious from the actual SQL Server execution plan that this is a problem, profiling with SQL Server Profiler reports that the query uses more than 5 seconds of CPU while doing 131,519 reads to fetch 52 rows:

     
     

    The reason the Duration is on par with the CPU time is that all reads are logical reads from SQL Server buffers. Re-issuing the query after adding a supporting index shows this in SQL Server Profiler:

     


    So Reads were reduced by a factor of 100 and (warm) duration was reduced by a factor of 40.

    As you can see, these poor execution plans are not caused by SmartSQL. However, the fact that the SmartSQL queries don’t cache their results will only amplify the issue. To solve it, we have to tackle the performance of that one isolated query by creating a covering index to improve the execution plan.

    And no, it won’t help to merely customize the page or change the visibility of the field. As long as the field is contained in the page metadata (so unless it is removed from the page altogether), it will be calculated.

    So in short: if you do run into rather dramatic slowness of a page containing FlowFields in Dynamics NAV 2013 or higher, isolating and testing the FlowField queries separately (focusing on the ones with a clustered index scan, regardless of cost) should lead you to the culprit fairly quickly. A supporting index should resolve the problem.

    • Temp tables: These are now moved to .NET. When you create a temporary record, your private C# TempTableDataProvider object is created. It stores whatever records you insert into a C# Dictionary, with the primary key fields as the key for the Dictionary. The Dictionary is in memory and stores records in the order they are inserted. When you add a SETCURRENTKEY, an AVL tree is built on the fly at the first FindXX() you perform with that (current) key. In terms of performance, this is an expensive operation. This tree is in memory, however, so it will be cached and reused later. If you issue a query without calling SETCURRENTKEY, an AVL tree for the primary key is used.

    However, maintaining all this can consume quite a lot of memory on your middle tier, so plan for ample memory when scaling. Also, as mentioned above, querying temp tables is cached, but sorting them is a fairly expensive operation; just something to keep in mind.
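
    To make the cost model above concrete, here is a rough sketch (plain Python; an intentionally simplified model for illustration, not the actual C# classes or their API):

```python
# Simplified model of a NAV 2013 temp table: records live in a dictionary
# keyed by primary key; each SETCURRENTKEY ordering is built lazily at the
# first FindXX() and then cached in memory until the next insert.
class TempTableSketch:
    def __init__(self):
        self.rows = {}          # primary key -> record (dict of fields)
        self.order_cache = {}   # key-fields tuple -> sorted record list

    def insert(self, record):
        self.rows[record['No.']] = record
        self.order_cache.clear()   # inserts invalidate cached orderings

    def findset(self, key_fields=('No.',)):
        if key_fields not in self.order_cache:
            # The expensive step: building the ordering (NAV builds an AVL
            # tree here); repeated reads with the same key reuse the cache.
            self.order_cache[key_fields] = sorted(
                self.rows.values(),
                key=lambda r: [r[f] for f in key_fields])
        return self.order_cache[key_fields]

t = TempTableSketch()
t.insert({'No.': '20', 'Name': 'Alpha'})
t.insert({'No.': '10', 'Name': 'Beta'})
print([r['No.'] for r in t.findset()])            # ordered by primary key
print([r['Name'] for r in t.findset(('Name',))])  # ordered by current key
```

    The point of the sketch: every distinct sort order you ask for costs memory on the service tier and CPU the first time it is built, which is exactly why excessive sorting of large temp tables deserves attention.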

    • The next one is not strictly a performance issue, but it can have a fairly drastic performance side effect if you’re affected by it, so it deserves to be mentioned:

    You might (or might not) be aware that the transaction scope and behavior of a page action has changed in Microsoft Dynamics NAV 2013 and higher. This will be especially significant if you are calling functions in codeunits (or objects other than the source table), passing the REC as a parameter and intend to lock the record in that function.

    Consider the following example: You have added an action that invokes a function in a codeunit, which in turn locks/modifies the record (typically calling a custom posting routine).

    So OnAction trigger code is, for example, as follows:

     

      PostingFunction(Rec);

     

    Where PostingFunction is a function (in any object other than the source table).

    Now, the consequence of the previously mentioned transaction scope change, and of the fact that you’re locking the record in the function, is that the entire source table (Rec) you passed as a parameter in the example above is locked. In other words, you’re passing the current record as a parameter, but you are locking the whole table. Changing this behavior at this point would cause more damage than good due to all the functionality it would affect, so it won’t be changed. Fortunately, if
    you’re aware of this issue, the solution is very simple:

    SETSELECTIONFILTER(Rec);  // adding this line

    PostingFunction(Rec);

    Adding the line above reduces the scope of locking and leaves you locking just the one record (or the selection of records if on a list page). This applies to card pages as well.

     

    • Last, but not least: you might have noticed increased ASYNC_NETWORK_IO waits. These cause quite a lot of concern out there. However, this is a symptom and not a problem per se.

    When Dynamics NAV issues a query that returns thousands of rows, SQL Server will start a worker thread to return the result. If the application does not consume the rows as fast as they are delivered from SQL Server, then the SQL Server worker thread has to wait, and that shows up as wait type ASYNC_NETWORK_IO. If the application never consumes the entire result set, then the worker thread will hang around until the transaction is ended (when we close the statement) or, if it was a read transaction, for a longer period: either until the connection is closed (as idle) or until the statement is overwritten by another statement (when the statement cache is full).

    Example: when we do a FINDSET, a number of records (50, 100, ...) is retrieved. If it is a larger set, the query will run until all records are retrieved (or the buffer is full), even if only the first 10 are actually read by the application. Eventually the session goes to sleep, and after the transaction ends, sleeping sessions are removed. So in short, these waits merely reflect NAV data access methods and are not a problem as such. If you want to reduce them, make sure you’re reading all the rows you’re asking for when using FINDSET; otherwise use FIND(‘-‘) or FIND(‘+’).
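
    The “unconsumed result set” behavior described above can be sketched with a lazy producer (plain Python, purely illustrative; none of this is NAV or SQL Server code):

```python
# A lazily delivered result set: rows are only produced as the consumer
# reads them. If the consumer stops after a few rows, the rest stay
# pending on the producer side, which is what the SQL Server worker
# thread is doing while ASYNC_NETWORK_IO wait time accumulates.
def result_set(total_rows):
    for row_no in range(total_rows):
        yield row_no   # each yield models one row handed to the client

rows = result_set(1000)
first_ten = [next(rows) for _ in range(10)]
# The remaining 990 rows were requested but never consumed; the statement
# stays open until the transaction ends or the statement is closed.
print(first_ten)
```

    Reading everything you asked for (or asking only for what you will read) is what makes the waits go away, exactly as the FINDSET advice above suggests.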

     

    With thanks to Jesper Falkebo and Jens Klarskov Jensen

     

    Jasminka Thunes

    Microsoft CSS

     

    These postings are provided "AS IS" with no
    warranties and confer no rights. You assume all risk for your use.

  • Microsoft Dynamics NAV Team Blog

    MagicPath (DOWNLOADFROMSTREAM and UPLOADINTOSTREAM without a dialog box)


    The C/AL commands DOWNLOADFROMSTREAM and UPLOADINTOSTREAM serve the purpose of sending files between the RoleTailored client (RTC) and the NAV Server. A few times now, we have had the question: How can we use these functions without displaying the dialog box for selecting a file and folder name?

    This is how you can automatically download and upload files without any user interaction:

    The trick is to use MagicPath, like in codeunit 419 "3-Tier Automation Mgt.". MagicPath is initiated by setting the folder name to '<TEMP>' like this:
    DOWNLOADFROMSTREAM(IStream,'','<TEMP>', '',MagicPath);

     

    DOWNLOADFROMSTREAM

    The code example below will copy a specific file from the NAV Server to the RTC machine with no questions asked about folder or file name or anything else:


    IF NOT ISSERVICETIER THEN
      EXIT;
    FileToDownload := 'c:\Temp\ServerFile.txt';
    FileVar.OPEN(FileToDownload);
    FileVar.CREATEINSTREAM(IStream);
    DOWNLOADFROMSTREAM(IStream,'','<TEMP>', '',MagicPath);
    MESSAGE('Path = ' + MagicPath);

    FileVar.CLOSE;

     

    Variables:

    Name Data Type Length
    FileToDownload  Text 180
    FileVar File
    IStream  InStream
    MagicPath  Text 180

     


    Now we have the file on the RTC machine, and MagicPath tells us its location. The location will be something like this:
    C:\Users\[UserName]\AppData\Local\Temp\Microsoft Dynamics NAV\4612\__TEMP__ff7c5a286cfd463f9f7d92ae5b4757e2

    The number 4612 in the MagicPath comes from the Process ID of RTC.

     

    Handling files client side

    So, what if we wanted to rename it to a specific name? We have the FILE object in C/AL, but of course since C/AL runs on the NAV Server and not on RTC, this won't work since the purpose of the above is exactly to copy the file to the client machine. Instead, use this automation:

    'Microsoft Scripting Runtime'.FileSystemObject

    Then create an instance ClientSide:
    CREATE(FileSystemObject,TRUE,TRUE);


    So, if you wanted to continue the code above and place and name the file to something specific on the client's machine, add these lines:


    CREATE(FileSystemObject,TRUE,TRUE);
    DestinationFileName := 'c:\Temp\newfile.txt';
    IF FileSystemObject.FileExists(DestinationFileName) THEN
      FileSystemObject.DeleteFile(DestinationFileName,TRUE);
    FileSystemObject.CopyFile(MagicPath,DestinationFileName);
    FileSystemObject.DeleteFile(MagicPath,TRUE);

     

     
    UPLOADINTOSTREAM


    MagicPath works both ways. But while DOWNLOADFROMSTREAM creates MagicPath for you and tells you where it is, with UPLOADINTOSTREAM you need to know it in advance. Remember that the MagicPath location above includes the Process ID of RTC. One way could be to work that out somehow. But what I would suggest instead is to download a temp test file first, and then see where MagicPath downloads it to. The path for upload will be the same:


    // download a temp file to get MagicPath
    FileVar.CREATETEMPFILE;
    FileVar.CREATEINSTREAM(IStream);
    DOWNLOADFROMSTREAM(IStream,'','<TEMP>', '',MagicPath);
    FileVar.CLOSE;
    MESSAGE(MagicPath);


    Then extract the folder name from MagicPath:

    FOR i := STRLEN(MagicPath) DOWNTO 1 DO BEGIN
      IF MagicPath[i] = '\' THEN BEGIN
        MagicPath := COPYSTR(MagicPath,1,i); // keep everything up to (and including) the last '\'
        i := 1; // force the FOR loop to end
      END;
    END;

    Once you know the location of MagicPath, the next step is to copy the file you want to upload into that folder:

    FileToUpload := 'newfile.txt';
    FolderName := 'c:\Temp\';

    IF ISCLEAR(FileSystemObject) THEN
      CREATE(FileSystemObject,TRUE,TRUE);
    // FolderName and MagicPath both already end with '\'
    FileSystemObject.CopyFile(FolderName + FileToUpload,MagicPath + FileToUpload);

    Then use UPLOADINTOSTREAM to upload the file from MagicPath to the NAV Server:
    UPLOADINTOSTREAM('','<TEMP>','',FileToUpload,IStream);


    And finally, save the InStream to a file on the server:

    FileVar.WRITEMODE(TRUE);
    FileVar.CREATE('c:\Temp\OnServer.txt');
    FileVar.CREATEOUTSTREAM(OStream);
    COPYSTREAM(ostream,istream);
    FileVar.CLOSE;

     

    So, put all this together and the end result is:

    The file c:\Temp\ServerFile.txt gets downloaded to C:\Temp\NewFile.txt, and then uploaded back to the server as C:\Temp\OnServer.txt.

     

     

    Lars Lohndorf-Larsen

    Dynamics NAV Support EMEA

     

     

     

     

     

  • Microsoft Dynamics NAV Team Blog

    More Charts


    This is a follow up to this post:

    Using Client Extensibility in NAV 2009 SP1 to create Charts

    What's new in this post:

    2-series line charts. For example showing number of new Orders and Quotes for the last 10 days:

    2SeriesChart

    Item Sales Chart (basically identical to the Customer Sales Chart in the previous post):

    ItemSales

    Doughnut charts have been shined up a bit to make them look better:

    Don

    Double-click callback to NAV: When you double-click a chart, it will call the ControlAddin trigger. The example below will open up a list of Cust. Ledger Entries when double-clicking on a chart on the Customer Card page:

    Sales (LCY) - OnControlAddIn(Index : Integer;Data : Text[1024])
    CustLedgEntry.SETCURRENTKEY("Customer No.");
    CustLedgEntry.SETRANGE("Customer No.","No.");
    PAGE.RUN(0,CustLedgEntry);

    All you need is attached at the end of this post. It contains .dlls, including their source code, and a NAV codeunit. To implement some examples:

    Implement the charts add-on:

    1)  If you want to modify the attached c# project you must install "Microsoft Chart Controls Add-on for Microsoft Visual Studio 2008".

    2)  On any client machine where you want to run the charts, you must install "Microsoft Chart Controls for Microsoft .NET Framework 3.5".

    3)  Detach and unzip the attached file.

    4)  From the unzipped file, copy the files from \XTCharts\bin\Debug\ into the "Add-ins" sub folder of the RoleTailored Client (RTC).

    5)  And import the codeunit from XTChart.fob in Object Designer.

    6)  In a Classic Client, run table 2000000069 "Client Add-in" and enter one line (if it doesn't already exist from the previous post):

    Control Add-in Name:    XTChart
    Public Key Token:    d1a25808afd603da

    (making sure to have exactly these values, in the right case)

    Adding charts to pages:

    On the page where you want a chart, insert a new line of type Field. In the property "ControlAddIn", select the XTChart (XTChart;PublicKeyToken=d1a25808afd603da). This adds the control itself. To send data to the control you must specify SourceExpression too. Declare a Global variable called XTChartUtil, Type = Codeunit, Sub Type = Codeunit 75550 "XT Chart Util" to use in SourceExpression. If the chart is on a Customer Card page, then specify this SourceExpression:

    For a line-chart, showing "Sales (LCY)" for the last 6 months:  XTChartUtil.CustPointsChart(Rec)

    For a Doughnut chart, showing "Balance (LCY)" percentage of "Credit Limit": XTChartUtil.CustDoughnutChart(Rec)

    On an Item Card page, for a line chart showing Item "Sales (LCY)" for the last 6 months, use the SourceExpression XTChartUtil.ItemPointsChart(Rec)

    On the main Role Centre, to have a 2-series chart showing number of Quotes and Orders created in the past 10 days, I added the chart in Page 9060 with this SourceExpression: ChartUtil.SalesCueChart

    Then, for each example above, if you want to trigger any activity when the user double clicks the chart, just put some C/AL code on the corresponding OnControlAddIn-trigger.

    You can add charts in many other places, but not in the following locations, which cannot host Extensibility components:

    • Under a Repeater Control (i.e. a list page)
    • Action Pane
    • Command Bar
    • Filter Pane

    Note:

    In the example here, I added a chart to the main Role Centre. In a production environment this may not be such a good idea, for these two reasons:

    1)  If a chart goes wrong and crashes, for example because a user has not installed the Chart Controls for .NET (link above), it may crash the whole RTC. If this happens from a page which shows a chart, then at least the user knows that that particular page has a problem. If it happens on the Role Centre, then RTC will crash as soon as it is opened, the user cannot use RTC at all, and it may not be obvious why RTC crashes.

    2)  Performance is not great in the SalesCueChart example here, which counts Sales Header records filtered on "Document Date". If a company has thousands of orders, this will have some impact on performance. So, at least calculate the chart only on request, and not on the main page whether the user actually needs it or not.

    Development notes

    If you want to extend this to create new charts, then this is how the attached example works:

    Chart Data is generated in xml format in codeunit 75550 "XT Chart Util". It has a function for each type of chart. The xml data must have this format:

    <Chart>
      <ChartType>1</ChartType>            <-- Chart Type 1 = Lines, 2 = Doughnut and 3 = 3D lines
      <Data Title="Orders">               <-- Title is optional. If specified, it will show as legend on the chart
        <REC1 Title="16/01/11">0</REC1>   <-- Name the elements REC1, REC2, etc.
        <REC2 Title="17/01/11">2</REC2>
        <REC3 Title="18/01/11">0</REC3>
        <REC4 Title="19/01/11">2</REC4>
      </Data>
      <Data2 Title="Quotes">              <-- The Data2 section is optional. It is used for making 2-series charts. If there is a Data2 section, it must have the same number of elements as Data, with the same names.
        <REC1 Title="16/01/11">0</REC1>
        <REC2 Title="17/01/11">0</REC2>
        <REC3 Title="18/01/11">0</REC3>
        <REC4 Title="19/01/11">1</REC4>
      </Data2>
    </Chart>

    In case of a Doughnut chart, just generate an xml document like this:

    <Chart>
      <ChartType>2</ChartType>
      <Data>
        <REC1 Title="Balance">9.20</REC1>
        <REC2 Title="CreditLimit">90.80</REC2>
      </Data>
    </Chart>

    with 2 REC-elements which total up to 100.
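This document format is simple to generate from any XML library. As a reference point (not part of the add-in itself), here is a short Python sketch that emits the doughnut-chart document above; the function name is my own:

```python
import xml.etree.ElementTree as ET

def doughnut_chart_xml(slices):
    # slices: a list of (title, percentage) pairs that should total 100
    chart = ET.Element("Chart")
    ET.SubElement(chart, "ChartType").text = "2"  # 2 = Doughnut
    data = ET.SubElement(chart, "Data")
    for i, (title, value) in enumerate(slices, start=1):
        rec = ET.SubElement(data, "REC%d" % i, Title=title)
        rec.text = "%.2f" % value
    return ET.tostring(chart, encoding="unicode")

chart_xml = doughnut_chart_xml([("Balance", 9.20), ("CreditLimit", 90.80)])
```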

    The C# code is all inside one big try{} catch{} structure. If anything goes wrong, it will just show the data that was received in a MessageBox, but not what actually went wrong. Only if you comment out the try{} and catch{} section will it throw the actual error message to tell you what went wrong (it should log it in the Application log). But the downside of this is that RTC is also likely to crash.

    Lars Lohndorf-Larsen

    CSS EMEA

    This posting is provided "AS IS" with no warranties, and confers no rights

  • Microsoft Dynamics NAV Team Blog

    How to create/read xml file from Microsoft Dynamics NAV without using xmlports

    • 9 Comments

    Periodically we receive support requests about XMLport errors that are really missing features in XMLport functionality.
    Most of these requests are rejected because XMLports are not supposed to fulfill every possible XML standard scenario.
    So there will always be XML files which can't be created or read by a NAV XMLport.
    A possible workaround is to use the XML DOM automations.

    Create it by codeunit:

    ----------------------------------------

     OBJECT Codeunit 50052 xml create
    {
      OBJECT-PROPERTIES
      {
        Date=03/24/10;
        Time=[ 1:33:31 PM];
        Modified=Yes;
        Version List=;
      }
      PROPERTIES
      {
        OnRun=BEGIN
                CREATE(xmlDoc);
                xmlMgt.SetNormalCase;
                xmlProcessingInst:=xmlDoc.createProcessingInstruction('xml','version="1.0" encoding="UTF-8" standalone="yes"');

                CurrNode := xmlDoc.appendChild(xmlProcessingInst);
                CurrNode := xmlDoc.createElement('soapenv:Envelope');
                CurrNode := xmlDoc.appendChild(CurrNode);


                xmlMgt.AddAttribute(CurrNode,'xmlns:soapenv','http://schemas.xmlsoap.org/soap/envelope/');
                xmlMgt.AddAttribute(CurrNode,'xmlns:mbs','http://www.microsoft.com/mbs/xml');

                xmlMgt.AddElement(CurrNode,'soapenv:Header','','soapenv',NewChild);
                  CurrNode:=NewChild; //One level deeper
                  xmlMgt.AddElement(CurrNode,'soapenv:Body','','soapenv',NewChild);
                       CurrNode:=NewChild; //one level deeper
                       xmlMgt.AddElement(CurrNode,'mbs:enumeration','','mbs',NewChild);
                            CurrNode:=NewChild; //one level deeper
                            xmlMgt.AddElement(CurrNode,'mbs:table','Customers','mbs',NewChild);

                         recCustomer.SETRANGE("No.", '10000','20000'); //Filter only few records
                         IF recCustomer.FINDFIRST THEN BEGIN
                           REPEAT
                            vName   :=recCustomer.Name;
                            vNo     :=recCustomer."No.";
                            vContact:=recCustomer.Contact;
                              recCustomer.CALCFIELDS("Balance (LCY)");
                              vBalance:= FORMAT(recCustomer."Balance (LCY)");
                            vSPcode :=recCustomer."Salesperson Code";


                            xmlMgt.AddElement(CurrNode,'mbs:Customer','','mbs',NewChild);
                                  CurrNode1:=NewChild; //One level deeper, but keep current level too
                                  xmlMgt.AddElement(CurrNode1,'mbs:CustomerAuthentication','','mbs',NewChild);
                                         CurrNode2:=NewChild; //One level deeper to sublevel
                                         xmlMgt.AddElement(CurrNode2,'mbs:No',vNo,'mbs',NewChild);
                                         xmlMgt.AddElement(CurrNode2,'mbs:Name',vName,'mbs',NewChild);

                                  xmlMgt.AddElement(CurrNode1,'mbs:CustomerData','','mbs',NewChild);
                                         CurrNode2:=NewChild; //One level deeper to sublevel
                                         xmlMgt.AddElement(CurrNode2,'mbs:Balance',vBalance,'mbs',NewChild);
                                         xmlMgt.AddElement(CurrNode2,'mbs:SalespersonCode',vSPcode,'mbs',NewChild);
                                         xmlMgt.AddElement(CurrNode2,'mbs:Contacts','','mbs',NewChild);
                                             CurrNode1:=NewChild;//One level deeper
                                             xmlMgt.AddElement(CurrNode1,'mbs:Contact',vContact,'mbs',NewChild);


                            CLEAR(vName);
                            CLEAR(vNo)  ;
                            CLEAR(vContact);
                            CLEAR(vBalance);
                            CLEAR(vSPcode);

                           UNTIL recCustomer.NEXT=0;


                           xmlDoc.save('D:\xmlFile.xml');
                           CLEARALL;
                           MESSAGE('xmlFile.xml is created');
                         END;
              END;

      }
      CODE
      {
        VAR
          xmlDoc@1000 : Automation "{F5078F18-C551-11D3-89B9-0000F81FE221} 6.0:{F6D90F11-9C73-11D3-B32E-00C04F990BB4}:'Microsoft XML, v6.0'.DOMDocument";
          CurrNode@1003 : Automation "{F5078F18-C551-11D3-89B9-0000F81FE221} 6.0:{2933BF80-7B36-11D2-B20E-00C04F983E60}:'Microsoft XML, v6.0'.IXMLDOMNode";
          CurrNode1@1005 : Automation "{F5078F18-C551-11D3-89B9-0000F81FE221} 6.0:{2933BF80-7B36-11D2-B20E-00C04F983E60}:'Microsoft XML, v6.0'.IXMLDOMNode";
          CurrNode2@1013 : Automation "{F5078F18-C551-11D3-89B9-0000F81FE221} 6.0:{2933BF80-7B36-11D2-B20E-00C04F983E60}:'Microsoft XML, v6.0'.IXMLDOMNode";
          NewChild@1004 : Automation "{F5078F18-C551-11D3-89B9-0000F81FE221} 6.0:{2933BF80-7B36-11D2-B20E-00C04F983E60}:'Microsoft XML, v6.0'.IXMLDOMNode";
          xmlProcessingInst@1001 : Automation "{F5078F18-C551-11D3-89B9-0000F81FE221} 6.0:{2933BF89-7B36-11D2-B20E-00C04F983E60}:'Microsoft XML, v6.0'.IXMLDOMProcessingInstruction";
          xmlMgt@1002 : Codeunit 6224;
          "---- Variables----"@1006 : Integer;
          recCustomer@1012 : Record 18;
          vName@1007 : Text[30];
          vNo@1008 : Text[30];
          vContact@1009 : Text[30];
          vBalance@1010 : Text[30];
          vSPcode@1011 : Text[30];

        EVENT xmlDoc@1000::ondataavailable@198();
        BEGIN
        END;

        EVENT xmlDoc@1000::onreadystatechange@-609();
        BEGIN
        END;

        BEGIN
        END.
      }
    }

    ---------------------------------------
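The nesting that the AddElement calls build up maps onto a plain DOM tree. As a rough, hedged sketch of the same structure (customer data passed in as plain dicts, since there is no NAV database here), in Python:

```python
import xml.etree.ElementTree as ET

def build_envelope(customers):
    # customers: list of dicts with No, Name, Balance, SalespersonCode, Contact
    env = ET.Element("soapenv:Envelope", {
        "xmlns:soapenv": "http://schemas.xmlsoap.org/soap/envelope/",
        "xmlns:mbs": "http://www.microsoft.com/mbs/xml",
    })
    ET.SubElement(env, "soapenv:Header")
    body = ET.SubElement(env, "soapenv:Body")
    enum = ET.SubElement(body, "mbs:enumeration")
    ET.SubElement(enum, "mbs:table").text = "Customers"
    for c in customers:
        cust = ET.SubElement(enum, "mbs:Customer")
        auth = ET.SubElement(cust, "mbs:CustomerAuthentication")
        ET.SubElement(auth, "mbs:No").text = c["No"]
        ET.SubElement(auth, "mbs:Name").text = c["Name"]
        data = ET.SubElement(cust, "mbs:CustomerData")
        ET.SubElement(data, "mbs:Balance").text = c["Balance"]
        ET.SubElement(data, "mbs:SalespersonCode").text = c["SalespersonCode"]
        contacts = ET.SubElement(data, "mbs:Contacts")
        ET.SubElement(contacts, "mbs:Contact").text = c["Contact"]
    return ET.tostring(env, encoding="unicode")
```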

    And can be read by codeunit: 

    OBJECT Codeunit 50050 xml read
    {
      OBJECT-PROPERTIES
      {
        Date=03/24/10;
        Time=12:25:26 PM;
        Modified=Yes;
        Version List=;
      }
      PROPERTIES
      {
        OnRun=BEGIN
                ffile.OPEN('D:\XmlFile.xml'); //this must be your file name
                ffile.CREATEINSTREAM(strInStream);

                IF ISCLEAR(xmldomDoc) THEN CREATE(xmldomDoc);

                xmldomDoc.load(strInStream);

                xmlNodeList1 := xmldomDoc.getElementsByTagName('mbs:Customer');

                ii:=xmlNodeList1.length();

                FOR i:=0 TO xmlNodeList1.length()-1 DO BEGIN
                 xmldomElem1:= xmlNodeList1.item(i); //mbs:Customer
                 IF xmldomElem1.hasChildNodes() THEN
                    BEGIN
                      xmlNodeList2:= xmldomElem1.childNodes();
                      IF NOT ISCLEAR(xmlNodeList2) THEN
                      xmldomElem2:= xmlNodeList2.item(0); //mbs:CustomerAuthentication
                      IF NOT ISCLEAR(xmldomElem2) THEN
                      IF xmldomElem2.hasChildNodes() THEN
                         BEGIN
                         xmlNodeList3:= xmldomElem2.childNodes();
                         IF NOT ISCLEAR(xmlNodeList3) THEN
                          xmldomElem3:= xmldomElem2.firstChild();//mbs:No
                         IF NOT ISCLEAR(xmldomElem3) THEN
                          txtNo:=xmldomElem3.text();
                         xmldomElem3:=xmlNodeList3.item(1); //mbsName
                         IF NOT ISCLEAR(xmldomElem3) THEN
                          txtName:=xmldomElem3.text();

                      xmldomElem2:= xmlNodeList2.item(1); //mbs:CustomerData
                      IF NOT ISCLEAR(xmldomElem2) THEN
                      IF xmldomElem2.hasChildNodes() THEN
                         BEGIN
                         xmlNodeList3:= xmldomElem2.childNodes();
                         IF NOT ISCLEAR(xmlNodeList3) THEN
                          xmldomElem3:= xmldomElem2.firstChild();//mbs:Balance
                         IF NOT ISCLEAR(xmldomElem3) THEN
                          txtBalance:=xmldomElem3.text();
                         xmldomElem3:=xmlNodeList3.item(1); //mbsSalesPersonCode
                         IF NOT ISCLEAR(xmldomElem3) THEN
                          txtSPcode:=xmldomElem3.text();
                         END;

                         xmldomElem3:=xmlNodeList3.item(2); //mbs:Contacts
                         IF NOT ISCLEAR(xmldomElem3) THEN
                            txtContact:=xmldomElem3.text();

                     END;
                END;
                      MESSAGE('This is record "%1"\No "%2"\Name "%3"\Contact "%4"\Balance "%5"\Salesperson code "%6"\of Total "%7"',
                      FORMAT(i+1),
                      txtNo,
                      txtName,
                      txtContact,
                      txtBalance,
                      txtSPcode,
                      ii);

                END;
                ffile.CLOSE;
                CLEARALL;
              END;

      }
      CODE
      {
        VAR
          xmldomDoc@1000 : Automation "{F5078F18-C551-11D3-89B9-0000F81FE221} 6.0:{F6D90F11-9C73-11D3-B32E-00C04F990BB4}:'Microsoft XML, v6.0'.DOMDocument";
          xmlNodeList1@1005 : Automation "{F5078F18-C551-11D3-89B9-0000F81FE221} 6.0:{2933BF82-7B36-11D2-B20E-00C04F983E60}:'Microsoft XML, v6.0'.IXMLDOMNodeList";
          xmlNodeList2@1017 : Automation "{F5078F18-C551-11D3-89B9-0000F81FE221} 6.0:{2933BF82-7B36-11D2-B20E-00C04F983E60}:'Microsoft XML, v6.0'.IXMLDOMNodeList";
          xmlNodeList3@1019 : Automation "{F5078F18-C551-11D3-89B9-0000F81FE221} 6.0:{2933BF82-7B36-11D2-B20E-00C04F983E60}:'Microsoft XML, v6.0'.IXMLDOMNodeList";
          xmldomElem1@1007 : Automation "{F5078F18-C551-11D3-89B9-0000F81FE221} 6.0:{2933BF86-7B36-11D2-B20E-00C04F983E60}:'Microsoft XML, v6.0'.IXMLDOMElement";
          xmldomElem2@1010 : Automation "{F5078F18-C551-11D3-89B9-0000F81FE221} 6.0:{2933BF86-7B36-11D2-B20E-00C04F983E60}:'Microsoft XML, v6.0'.IXMLDOMElement";
          xmldomElem3@1011 : Automation "{F5078F18-C551-11D3-89B9-0000F81FE221} 6.0:{2933BF86-7B36-11D2-B20E-00C04F983E60}:'Microsoft XML, v6.0'.IXMLDOMElement";
          txtNo@1001 : Text[30];
          txtName@1002 : Text[30];
          txtContact@1009 : Text[30];
          txtBalance@1012 : Text[30];
          txtSPcode@1013 : Text[30];
          ffile@1003 : File;
          strInStream@1004 : InStream;
          i@1006 : Integer;
          ii@1018 : Integer;

        EVENT xmldomDoc@1000::ondataavailable@198();
        BEGIN
        END;

        EVENT xmldomDoc@1000::onreadystatechange@-609();
        BEGIN
        END;

        BEGIN
        END.
      }
    }
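The reading side can likewise be sketched with a namespace-aware parser instead of walking childNodes by index. A hedged Python sketch (function name my own) that pulls the same fields out of an envelope like the one above:

```python
import xml.etree.ElementTree as ET

# Namespace prefix map for the mbs elements used in the file
NS = {"mbs": "http://www.microsoft.com/mbs/xml"}

def read_customers(xml_text):
    root = ET.fromstring(xml_text)
    result = []
    # Iterate every mbs:Customer element, wherever it sits in the tree
    for cust in root.iter("{http://www.microsoft.com/mbs/xml}Customer"):
        result.append({
            "No": cust.findtext("mbs:CustomerAuthentication/mbs:No", namespaces=NS),
            "Name": cust.findtext("mbs:CustomerAuthentication/mbs:Name", namespaces=NS),
            "Balance": cust.findtext("mbs:CustomerData/mbs:Balance", namespaces=NS),
        })
    return result
```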

    That's all
    Special thanks to
    Rainer Kuhnlein

    Gedas Busniauskas (gediminb)
    Microsoft Customer Service and Support (CSS) EMEA

     

  • Microsoft Dynamics NAV Team Blog

    How to get a Dynamics NAV report with a Web Service

    • 9 Comments

    1. Create a new codeunit. In this scenario we will call this codeunit "CUWebReport" with ID 50000

    2. Navigate to "C/AL Globals" and create a function called "GenerateReport"


    3. Select "Locals"

    4. Select the "Return Value" tab

    5. Set "Return Type"=Text and "Length"=100


    6. With this completed, close the "C/AL Locals" window.

    7. With the "C/AL Globals" window active again, select the "Variables" tab.

    8. Create a variable called "filename" with "Data Type"=Text and "Length"=100


    9. Now let's add the following code to this codeunit:

    filename := 'C:\inetpub\PdfDocuments\';
    filename += FORMAT(CREATEGUID);
    filename := DELCHR(filename, '=', '{-}');
    filename += '.pdf';
    REPORT.SAVEASPDF(111,filename);
    EXIT(filename);


    10. Save and compile the codeunit.

    11. Now it's time to expose this codeunit as Web Service. Navigate to "Administration/IT Administration/General Setup/Web Services"

    12. Select codeunit 50000 and give it a service name; we use "Get_PDF_Report".


    13. Now it is time to verify that we can see this web service. Open this URL  http://localhost:7047/DynamicsNAV/WS/services.

    You should now see this message in your browser, and your Web Service can now be called:


    If you don't see this message, you might want to check that the service "Microsoft Dynamics NAV Business Web Services" has been started.

    14. Now it is time to call the Web Service. In this example we use Microsoft Visual Web Developer 2005 with the following code to call the Web Service:

    Partial Class _Default
        Inherits System.Web.UI.Page

        Protected Sub Button1_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles Button1.Click
            Dim service As localhost.Get_PDF_Report = New localhost.Get_PDF_Report()
            service.UseDefaultCredentials = True
            service.Url = "http://localhost:7047/DynamicsNAV/WS/CRONUS_International_Ltd/Codeunit/Get_PDF_Report"
            Response.ContentType = "application/pdf"
            Dim pdfFileName As String = service.GenerateReport()
            Response.TransmitFile(pdfFileName)

        End Sub
    End Class

    But how to consume this Web Service is not in scope for this blog, so we suggest you have a look at our online help on how to consume a Web Service from NAV 2009.

    Online help found here: http://msdn.microsoft.com/en-us/library/dd339004.aspx

    15. Now also remember to impersonate your web server application as an appropriate Dynamics NAV user.

    16. After compiling and running our project, we get this button in our browser.


    17. When activating this button, we get the report specified in Codeunit 50000 displayed as a PDF file in our browser.


    Conclusion: you can now give this URL to people who don't have access to Dynamics NAV, and they can run the report whenever they need it.
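The filename logic in step 9 (take a GUID, strip the braces and hyphens with DELCHR, append .pdf) is easy to mirror elsewhere. A hedged Python sketch of just that step, with a function name of my own:

```python
import uuid

def pdf_filename(folder):
    # CREATEGUID returns something like '{9A9C7DA4-...-...}';
    # DELCHR(filename, '=', '{-}') strips every '{', '-' and '}' character
    guid = "{%s}" % uuid.uuid4()
    cleaned = guid.translate(str.maketrans("", "", "{-}"))
    return folder + cleaned + ".pdf"
```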

    Thanks,

    Torben Meyhoff, SDE & Claus Lundstrøm, Program Manager, Microsoft Dynamics NAV

  • Microsoft Dynamics NAV Team Blog

    Reading and Writing Unicode Files using C/AL

    • 9 Comments

    Hello,

    We have had some partner suggestion for adding Unicode capabilities to the existing Microsoft Dynamics NAV File functions. What we recommend is to use .NET Interop to achieve this functionality.

    For example, you can use an instance of the System.IO.StreamWriter class to write to an OutStream or the System.IO.StreamReader class to read from an InStream. You control the encoding by using the System.Text.Encoding class where you select between Default, Unicode or UTF8 encoding.

    • Please note that XMLports in Microsoft Dynamics NAV 2013 directly support importing and exporting flat text files in MS-DOS, UTF-8, and UTF-16 encodings by setting the new TextEncoding property.

    Writing Unicode text files

    Let’s start with a small example on writing some text to a file in Unicode format.

    Declare the following variables:

    Name DataType Subtype
    outFile File  
    outStream OutStream  
    streamWriter DotNet System.IO.StreamWriter.'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' 
    encoding DotNet System.Text.Encoding.'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' 

    Then write this code, using a suitable path for the outFile:

      outFile.CREATE('c:\temp\OutFile.txt');

      outFile.CREATEOUTSTREAM(outStream);

      streamWriter := streamWriter.StreamWriter(outStream, encoding.Unicode);

     

      streamWriter.WriteLine('Hello World');

     

      streamWriter.Close();


      outFile.CLOSE();
    • You can use 'Hello World' as above, or a text string with some more special characters if you like. I added some Danish characters: 'Hello ÆØÅ World' to more easily see what's going on.

     Run the code and verify the file is in Unicode.

    • One way to verify a file is in Unicode is to open it with Notepad and select File/Save As…, and then inspect the Encoding field.

    Try changing the above example to use encoding.Default and verify the file is in ANSI (codepage) format (for example using Notepad as above).

    Please note that if you use C/AL to write directly to the outStream, like this:

    outStream.WRITETEXT('Hello World');

    this is still handled using MS-DOS encoding and is compatible with previous versions of Microsoft Dynamics NAV.

    • You can open a file in MS-DOS encoding by starting Wordpad and in the Open file dialog select “Text Documents – MS-DOS Format (*.txt)”.

    In Microsoft Dynamics NAV 2013, all normal text handling is done in Unicode, so data entered on pages (like customer names on the Customer Card) can utilize the Unicode character set, be stored in the database, and be used in C/AL code.

    Let’s see how you can extend the above example with data from the Customer table:

    Add the customer variable:

    Name DataType Subtype
     customer Record   Customer

    And write this code – you can set a filter if needed:

      outFile.CREATE('c:\temp\Customers.txt');

      outFile.CREATEOUTSTREAM(outStream);

      streamWriter := streamWriter.StreamWriter(outStream, encoding.Unicode);

     

      IF customer.FINDSET THEN
        REPEAT
          streamWriter.WriteLine(customer.Name);
        UNTIL customer.NEXT = 0;

     

      streamWriter.Close();

      outFile.CLOSE();

     If you open the Customers.txt file with a Unicode file viewer it should contain all the Unicode details you have used for your customer names.

    Reading Unicode text files

    Similar to the above you can read Unicode text files by using the System.IO.StreamReader class to read from an InStream as you can see in the small example below:

    Declare the following variables:

    Name DataType Subtype
    inFile File  
    inStream InStream  
    streamReader DotNet System.IO.StreamReader.'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'
    encoding DotNet System.Text.Encoding.'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'
    txt Text  

    Then write this code, using a suitable path for the inFile:

      inFile.OPEN('c:\temp\Infile.txt');

      inFile.CREATEINSTREAM(inStream);

       

      streamReader := streamReader.StreamReader(inStream,encoding.Unicode);

      txt := streamReader.ReadToEnd();

      MESSAGE(txt);

    Create the Infile.txt as a Unicode file and add some Unicode characters to it. You should see them in the message box when you run the C/AL code.
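The same write-then-read round trip looks very similar in other environments. As a hedged Python sketch (not C/AL): Python's 'utf-16' codec plays the role of System.Text.Encoding.Unicode, i.e. UTF-16 little-endian with a byte order mark:

```python
import os
import tempfile

def unicode_roundtrip(text):
    # Write the text as UTF-16 (the equivalent of StreamWriter with encoding.Unicode)...
    path = os.path.join(tempfile.mkdtemp(), "OutFile.txt")
    with open(path, "w", encoding="utf-16") as writer:
        writer.write(text + "\n")  # streamWriter.WriteLine equivalent
    # ...then read it back, as StreamReader(inStream, encoding.Unicode) would
    with open(path, "r", encoding="utf-16") as reader:
        return reader.read()
```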

     

     

    I hope these simple examples can get you started with Unicode file handling in C/AL. 

     

    Thanks and best regards!

    Hans Kierulff

  • Microsoft Dynamics NAV Team Blog

    Update Rollup 4 for Microsoft Dynamics NAV 2013 R2 has been released

    • 9 Comments

    Update rollup 4 includes all application and platform hotfixes and regulatory features that have been released for Microsoft Dynamics NAV 2013 R2 and includes hotfixes that apply to all countries and hotfixes specific to the following local versions:

    • AU – Australia
    • DE - Germany
    • DK - Denmark
    • FR - France
    • IT - Italy
    • NA – North America
    • NL – Netherlands
    • NZ – New Zealand
    • SE - Sweden
    • UK - United Kingdom

     

    New in update rollup 4 

    Beginning with update rollup 4, the update rollups for Microsoft Dynamics NAV 2013 R2 also include hotfixes for the upgrade toolkit.

     

    Where to find update rollup 4

    You can download update rollup 4 from KB 2930617 - Update Rollup 4 for Microsoft Dynamics NAV 2013 R2 (Build 36078).

    The hotfixes that have been released since update rollup 3 are listed in KB 2930617. For a full list of all hotfixes included in the update rollup, see the following CustomerSource and PartnerSource pages: 

    CustomerSource:

    PartnerSource

     

    More Information

    For more information about update rollups for Microsoft Dynamics NAV 2013 R2, see Announcement of update rollups for Microsoft Dynamics NAV 2013 R2.

  • Microsoft Dynamics NAV Team Blog

    Microsoft Dynamics NAV 2009 R2, Microsoft Dynamics NAV 2013, and Microsoft Dynamics NAV 2013 R2 Compatibility with Microsoft Windows Server 2012 R2

    • 9 Comments

    At the time when we released Microsoft Dynamics NAV 2009 R2, Microsoft Dynamics NAV 2013, and Microsoft Dynamics NAV 2013 R2, all supported client and server operating systems were tested. However since that time, Windows Server 2012 R2 has become available. 

    Over the last few months, the Microsoft Dynamics NAV team has been testing the compatibility, and we are now proud to announce that Microsoft Dynamics NAV 2009 R2, Microsoft Dynamics NAV 2013, and Microsoft Dynamics NAV 2013 R2 are compatible with Windows Server 2012 R2!

    For more information, see the latest version of the System Requirements in the MSDN Library.

  • Microsoft Dynamics NAV Team Blog

    Memory usage in Microsoft Dynamics NAV 2013 print preview

    • 9 Comments

    Periodically we receive support requests from customers telling us that the RTC print preview consumes all available memory and the computer hangs.

    The repro scenarios are more or less similar: run a report, click preview, page up and down, close the report. Run another report, click preview, page up and down, and… RTC hangs.
    If we look at memory usage, NAV is using all available memory.

    If we analyze what has happened, we see that with every page move, memory usage increases and increases until all memory is used.
    It looks like a typical memory leak, because in NAV 2009 the same repro works fine.

    However, it is not so simple. What NAV does in this scenario is load Microsoft Report Viewer 2010 with two files: the report definition (an RDL file describing the report structure) and the record set. That is all; subsequent actions such as preview, print, and export are managed by the report viewer. NAV 2009 does the same, except that it loads Report Viewer 2008 on .NET Framework 3.5, while NAV 2013 uses Report Viewer 2010 on .NET Framework 4.5. And here is the reason for the memory usage behavior: Report Viewer 2010 and .NET 4.5 manage memory in a different way (described as more advanced and more secure). This different way is what prompts the questions from customers, and there is really nothing we can do from the NAV platform side: the system works as it is designed to work.

    But things are not so bad. .NET manages memory quite intelligently: unused memory is released by the garbage collector, which periodically reviews memory blocks and releases those that are no longer used. And it actually works: if we monitor memory during report previewing and after the preview is closed, we see that memory usage increases while previewing and paging; right after the preview is closed, usage stays the same, but after 5-10 minutes the used memory decreases to the initial numbers. For example, in my tests RTC started at 98.5 MB; after a few previews, scrolls, and so on, usage increased to 215 MB and did not decrease when I closed the preview, but after a few minutes the garbage collector returned usage to 100.4 MB.
    We could force (in the NAV platform) the garbage collector to run more often so that memory is released faster, but this would increase processor usage, and in many cases that is worse than the memory usage.

    A few tricks can help users avoid memory problems:

    1. When we open a report preview, we see the report in "funny" interactive mode. This is a preview mode where we can change the report view in certain ways described in the report code; for example, in the report 'Customer – Top 10 List' the sorting can be changed. But exactly this mode is the memory monster. If we click the 'Print layout' button in the menu, we switch to 'page view', where we can change nothing in the report and see exactly what will be printed. This mode uses much less memory, so it is the easier view to use. A few additional points here: a) all reports initially open in interactive mode, so users need to change the mode manually; b) the mode is not available if the report does not allow preview, such as Sales Invoice or Order Confirmation.
    2. Running a report with SAVEASPDF works better if we call it directly without the request form. Users then need a few more actions to open the report, but this can be reasonable in some cases.
    3. Finally, if there is a requirement to run big reports with many pages (100 or more), it is better to run them on the server side, simply because the server is 64-bit and usually has more available RAM. This can be done by creating a codeunit which runs the report with SAVEASPDF, and then adding the codeunit to the job queue. The job queue executes the codeunit and creates the PDF file.

     

    New Information since Update Rollup 5:

     

     

     

    These postings are provided "AS IS" with no warranties and confer no rights. You assume all risk for your use.

     

    Gedas Busniauskas
    Microsoft Lithuania
    Microsoft Customer Service and Support (CSS) EMEA

     

  • Microsoft Dynamics NAV Team Blog

    Microsoft Dynamics NAV and SQL Server Database Compression

    • 9 Comments

    Microsoft SQL Server 2008 introduces two new compression features (available in Enterprise Edition) that are of interest to users with large databases. Data compression applies to individual tables and indexes within a database and comes in two options: ROW and PAGE. Backup compression is a new option when backing up a database to external storage. Both compression techniques can be used to reduce the amount of online storage required for databases. Data compression can be configured for an entire heap (a table with no clustered index), clustered index, non-clustered index, or indexed view. For partitioned tables and indexes, the compression option can be configured separately (and differently) for each partition. Compression can also reduce disk I/O and SQL Server memory utilization, because data is stored on disk in a compressed state and is also read into the SQL Server cache in a compressed state. Compression can add 10-30% increased CPU utilization, depending on which tables and indexes are compressed and what level of compression is used.

    For Dynamics NAV we recommend compressing only tables and indexes that have a read-to-write ratio of 80%/20% (this is a conservative threshold) or higher, because compressing tables with a higher write ratio can actually decrease performance. We also recommend using ROW compression if the space saving between ROW and PAGE compression is less than 10%; if the difference is over 10%, we recommend PAGE compression. The reasoning: if the space savings from PAGE compression are close to those of ROW compression, it is not worth incurring the additional overhead associated with PAGE compression. Examples of NAV tables that would benefit greatly from compression are the large "Entry" tables such as G/L Entry, Value Entry, and Item Ledger Entry. Examples of NAV tables that would not benefit from compression, and where performance may actually decrease, are "Line" tables where the data is temporary in nature, such as Sales Line, Purchase Line, and Warehouse Activity Line. SQL Server compression is completely transparent to the Dynamics NAV application.
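The rules of thumb above are easy to encode as a small decision helper. A hedged sketch (thresholds taken from the text, function name my own, not a Microsoft tool):

```python
def recommend_compression(read_pct, row_savings_pct, page_savings_pct):
    """Apply the rules of thumb from the text:
    - compress only if reads are at least 80% of index operations;
    - prefer PAGE only when it saves more than 10 percentage points over ROW."""
    if read_pct < 80:
        return "NONE"  # write-heavy tables may get slower when compressed
    if page_savings_pct - row_savings_pct > 10:
        return "PAGE"
    return "ROW"
```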

    Compression is configured on a per-table or per-index basis. On a table with a clustered index, the clustered index is the table, so compressing the clustered index is equal to compressing the table. Running ALTER TABLE ... REBUILD WITH (DATA_COMPRESSION = ...) is only necessary for heaps; for all tables with clustered indexes you can use ALTER INDEX ... REBUILD WITH (DATA_COMPRESSION = ...).
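    As a rough sketch of what this looks like in practice (the table and index names below are hypothetical examples; substitute the actual NAV company-prefixed names from your own database):

    ```sql
    -- Compress a table that has a clustered index: rebuilding the
    -- clustered index with a compression option compresses the table itself.
    ALTER INDEX [CRONUS$G_L Entry$0] ON [dbo].[CRONUS$G_L Entry]
    REBUILD WITH (DATA_COMPRESSION = PAGE);

    -- For a heap (a table with no clustered index), rebuild the table instead:
    ALTER TABLE [dbo].[MyHeapTable]
    REBUILD WITH (DATA_COMPRESSION = ROW);
    ```

    Both statements rebuild the object, so plan to run them in a maintenance window on large tables.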

    How do you determine the read/write ratio of an index? Luckily, SQL Server keeps track of this for us, and all we need to do is extract the data. SQL Server stores this information in the sys.dm_db_index_operational_stats DMV. Remember that DMVs are "recycled" each time the SQL Server service is restarted, so if SQL Server has only been up and running for a day or a week, this information will be of minimal use. Ideally, you would want the server to be up and running for several weeks, and through a month-end close, to get a true idea of the actual data access patterns of your database.

    The following query will tell you how long the SQL Server instance has been up and running:

    select 'Sql Server Service has been running for about '
           + cast((datediff(hh, create_date, getdate()))/24 as varchar(3)) + ' days and '
           + cast((datediff(hh, create_date, getdate())) % 24 as varchar(2)) + ' hours'
    from sys.databases where name = 'tempdb'

    The following query will give the approximate read/write balance of all the used indexes in the database:

    SELECT o.id,
           o.name,
           x.name,
           i.index_id,
           x.type_desc,
           s.rowcnt,
           i.leaf_update_count * 100.0 /
               (i.range_scan_count + i.leaf_insert_count
                + i.leaf_delete_count + i.leaf_update_count
                + i.leaf_page_merge_count + i.singleton_lookup_count
               ) as Writes,
           i.range_scan_count * 100.0 /
               (i.range_scan_count + i.leaf_insert_count
                + i.leaf_delete_count + i.leaf_update_count
                + i.leaf_page_merge_count + i.singleton_lookup_count
               ) as Reads
    FROM sys.dm_db_index_operational_stats (db_id(), NULL, NULL, NULL) i
    JOIN sys.sysobjects o ON o.id = i.object_id
    JOIN sys.indexes x ON x.object_id = i.object_id AND x.index_id = i.index_id
    JOIN sys.sysindexes s ON s.id = x.object_id AND s.indid = x.index_id
    WHERE (i.range_scan_count + i.leaf_insert_count
           + i.leaf_delete_count + i.leaf_update_count
           + i.leaf_page_merge_count + i.singleton_lookup_count) <> 0
    AND objectproperty(i.object_id,'IsUserTable') = 1

    You can also run a simulation with the sp_estimate_data_compression_savings stored procedure in SQL Server, which will run a sample of the data through the chosen compression level and give you an estimate of the space saved by enabling compression.
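    For example, the call below estimates the PAGE compression savings for the clustered index of a hypothetical NAV table (substitute your own schema, table name, and index ID; a NULL partition number covers all partitions):

    ```sql
    -- Estimate space savings for PAGE compression on index 1
    -- (the clustered index) of a hypothetical G/L Entry table.
    EXEC sp_estimate_data_compression_savings
        @schema_name      = 'dbo',
        @object_name      = 'CRONUS$G_L Entry',
        @index_id         = 1,
        @partition_number = NULL,
        @data_compression = 'PAGE';
    ```

    The result set reports the current size and the estimated compressed size, based on a sample of the table loaded into tempdb.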

    I strongly recommend reading the following MSDN article before engaging in database compression activities.

    http://msdn.microsoft.com/en-us/library/cc280449.aspx

    Note: You can use the following query to check to see if there are currently any compressed indexes in the database.

    SELECT o.name AS Table_Name, p.index_id AS Index_ID, i.name AS Index_Name,
        CASE
            WHEN p.data_compression = 1 THEN 'ROW Compression'
            WHEN p.data_compression = 2 THEN 'PAGE Compression'
            ELSE 'ERROR'
        END AS Compression_Type
    FROM sys.partitions p
    JOIN sys.objects o ON p.object_id = o.object_id
    JOIN sys.sysindexes i ON o.object_id = i.id AND p.index_id = i.indid
    WHERE p.data_compression IN (1,2)
    ORDER BY o.name, p.data_compression, p.index_id

     

    Michael De Voe

    Senior Premier Field Engineer

    Microsoft Dynamics

    Microsoft Certified Master - SQL Server 2008

  • Microsoft Dynamics NAV Team Blog

    About Object Metadata, and why I can't see object changes in RTC

    • 9 Comments

     

    A few times we have seen cases where changes to NAV objects are not seen in RTC until the NAV Server is restarted. This post explains a likely solution to such problems and, as we are on the topic anyway, also explains a bit about the process that turns C/AL code into metadata (C#).

     

    How object changes get to RTC


    When you save an object in Object Designer, table 2000000071 "Object Metadata" is updated as well. An easy way to see this part of the process in action is:
    Run table 2000000071 and delete the record for Page 1 (or any object). Then compile Page 1 from Object Designer and check that the record has been re-created in table 2000000071.

    If you look at this table in SQL Server Management Studio, you can see that it has SQL triggers which update the table "Object Tracking". It is the "Object Tracking" table that NAV Server uses to see when an object has changed, so that it knows to send updated Metadata to RTC.

    There are two ways that the NAV Server can get updated on activity in the "Object Tracking" table:

    • SQL Server Broker
    • Polling

     

    SQL Broker

    If SQL Server Broker is enabled, then NAV Server will rely on that to notify it when an object has changed. You can see whether the broker is enabled from SQL Server Management Studio under Database Properties, then under Options look for "Broker Enabled".
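    If you prefer a query to the Database Properties dialog, the broker state is also exposed through the sys.databases catalog view (the database name here is just a placeholder):

    ```sql
    -- is_broker_enabled = 1 means Service Broker is enabled for the database
    SELECT name, is_broker_enabled
    FROM sys.databases
    WHERE name = 'NAVDatabase';
    ```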

    If the broker is enabled, then when NAV Server starts up and on first activity (When first RTC connects), it will create a queue and a service under SQL Service broker. You can see this in SQL Server Management Studio under the NAV Database if you expand Service Broker. Here, under Queues and under Services you will see new objects with names like SqlQueryNotificationService-27b7fb21-74a7-4a63-876a-c96b8eecd583. These are created by NAV Server and removed again when NAV Server stops. Their job is to listen to the "Object Tracking" table and notify NAV Server when there are any changes there.
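    A quick way to see those auto-created objects without browsing the Object Explorer tree is to query the Service Broker catalog views in the NAV database (the LIKE pattern matches the naming convention shown above):

    ```sql
    -- Queues and services created by NAV Server for query notifications
    SELECT name FROM sys.service_queues
    WHERE name LIKE 'SqlQueryNotificationService-%';

    SELECT name FROM sys.services
    WHERE name LIKE 'SqlQueryNotificationService-%';
    ```

    If NAV Server is running with the broker enabled and at least one RTC has connected, you should see one entry in each result set; after NAV Server stops, they disappear again.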

     

    Polling

    If the broker is not enabled, then NAV Server will use polling, i.e. check for changes every now and then. You can see this in two ways. First in the Application Log the NAV Server will log this event shortly after startup:

    Service: MicrosoftDynamicsNavServer
    SQL Query Notifications are unavailable on SQL Server '.' in Database 'NAVDatabase'. The Object Change Listener has switched to polling.

    Secondly, when NAV Server is in polling mode and you start SQL Profiler, you will see this query being run regularly by ".Net SqlClient Data Provider":

    exec sp_execute 1,@lastKnownTimeStamp=463223

    This is a pre-prepared query which looks like this:

    SELECT [Object Timestamp], [Object Type], [Object ID], [Object Key] FROM [dbo].[Object Tracking] WHERE [Object Timestamp] > @lastKnownTimeStamp

    In polling mode, this is how NAV Server checks for object changes since last time it checked, so it knows whether to send updated object definitions (metadata) to RTC.

     

    Troubleshooting

    So that's the background. How do we handle the case where object changes are not seen in RTC until we restart NAV Server? If this problem happens, then switch methods from SQL Broker to polling, or vice versa. You switch by enabling / disabling the broker like this:

    ALTER DATABASE [MyNAVDatabase]
    --SET ENABLE_BROKER
      SET DISABLE_BROKER
    WITH ROLLBACK IMMEDIATE

    The "WITH ROLLBACK IMMEDIATE" part of this query is there to avoid what happened to me: without it, the query would just hang, or would only run if the database was put in single-user mode.

    In the cases we have seen, the broker has been enabled but had some kind of problem. A slightly cryptic message was recorded in the SQL Server Log every time an object was changed in Object Designer. This message was not logged in the Application log. So make sure to check in SQL Server Management Studio under Management -> SQL Server Logs -> Current. If this gives enough information to solve the problem then good. If not, then there is the option of switching to polling (disable the broker) until the root of the problem can be resolved.

    Enabling the broker is the preferred option, since it saves NAV Server from checking for object changes every so often. But if the broker doesn't work, then at least there is the option to disable it until any problems can be resolved.

     

    Lars Lohndorf-Larsen

    Dynamics NAV Support EMEA

     
