• Microsoft Dynamics NAV Team Blog

    Is SQL2000 more clever than SQL2005?


    I have seen a number of (good) arguments that some issues with Dynamics NAV on SQL2005 are caused by a bad query optimizer in SQL2005, and that the SQL team needs to fix this. After all, some issues we see on SQL2005 did not exist in SQL2000. This blog argues that SQL2005 may in fact be more clever than SQL2000, and gives some input into the discussion about whether SQL2000 is more clever than SQL2005, and whether there is anything to fix in SQL2005 regarding this specific issue.

    The scenario below is one where the reuse of a cached query plan causes a Dynamics NAV client to hang while browsing forms.


    Here is a repro scenario which shows why Dynamics NAV ends up causing a clustered index scan on SQL2005, while the same scenario on SQL2000 did not cause any such scans. It is based on a W1 5.00 demo database, and it requires a Solution Developer's license to run. Run the steps on both a SQL2000 and a SQL2005 database and you will see where the differences between the two platforms are:


    1.  Create a new G/L Account, No. 1105.
    2.  Create 50,000 new records in table 17. For this purpose, it doesn't matter whether you post these entries or just create a codeunit to insert the records.
    3.  Run table 17 from Object Designer, and change the G/L Account No. to 1105 for the first 3 and the last 3 entries.
    4.  On SQL Server, update statistics on this table:
    update statistics [CRONUS International Ltd_$G_L Entry]
    5.  Run Dynamics NAV with Maximized forms.
    6.  In Dynamics NAV, go to "Chart of Accounts" and drill down on the new account 1105 and you should see 6 entries. Make sure to place the cursor on the first entry. Then close the drill-down to go back to the "Chart of Accounts".
    7.  On SQL Server, run DBCC FREEPROCCACHE. This will clear out any cached query plans.
    8.  Start a profiler trace - include the following events (on top of the default ones)
      On SQL2005: Performance:Showplan Text, on SQL2000: Performance:Execution Plan
      Stored Procedures:SP:CacheHit
      Stored Procedures:SP:CacheInsert
    9.  In Dynamics NAV, drill down on account 1105. Then move the cursor with arrow-down until you get to the last entry. Then move back up to the top again with arrow-up.
    10.  Stop the profiler trace.


    On SQL2005, you should see one of the last entries causing a relatively large number of reads (2,079 reads in my tests). This is the offending query. The same query on SQL2000 causes far fewer reads (126 in my tests).

    The query looks like this:
    SELECT  * FROM "W1500"."dbo"."CRONUS International Ltd_$G_L Entry" WHERE (("G_L Account No_"=@P1)) AND  "G_L Account No_"=@P2 AND "Posting Date"=@P3 AND "Entry No_"<@P4 ORDER BY "G_L Account No_" DESC,"Posting Date" DESC,"Entry No_" DESC ',@p3 output,@p4 output,@p5 output,N'@P1 varchar(20),@P2 varchar(20),@P3 datetime,@P4 int','1108','1108',''2007-12-31 00:00:00:000'',52761

    Notice that the last parameter value is 52761. So the part of the query to focus on here, in fact reads:
    WHERE "Entry No_" < 52761

    Then take a look at the execution plan. SQL2005 uses the index [CRONUS International Ltd_$G_L Entry$0], which is the clustered index ("Entry No_"). SQL2000 uses the index [CRONUS International Ltd_$G_L Entry].[$1], which is the index that begins with "G_L Account No_". So based on this query it is not strange that SQL2005's plan causes many more reads than SQL2000's plan.


    Here is an important point to make: Neither SQL2000 nor SQL2005 compiled the query plan for this query. You can see from the presence of SP:CacheHit events in the profiler trace that the plan was taken from the plan cache. So in order to find out why the two versions of SQL make different plans, we need to go to the place where the plan was made.

    Go to the SP:CacheHit event and look at the data. Then go backwards in the trace until you find the SP:CacheInsert event with the same data. This is the place where the query plan was made. The query in this place looks like this:


    SELECT  * FROM "W1500"."dbo"."CRONUS International Ltd_$G_L Entry" WHERE (("G_L Account No_"=@P1)) AND  "G_L Account No_"=@P2 AND "Posting Date"=@P3 AND "Entry No_"<@P4 ORDER BY "G_L Account No_" DESC,"Posting Date" DESC,"Entry No_" DESC ',@p3 output,@p4 output,@p5 output,N'@P1 varchar(20),@P2 varchar(20),@P3 datetime,@P4 int','1108','1108',''2006-12-31 23:59:59:000'',1

    This time, the last parameter value is 1 (not 52761)! So this time, the part of the query to focus on is:
    WHERE "Entry No_" < 1

    Remember that "Entry No_" is also the clustered index.

     

    So here is the question: What is the best possible query plan for this query? And I think the answer is easy for this scenario: Use the clustered index to "scan" this one record! The number of Reads in the trace should also confirm this. In my tests, SQL2005 did 21 reads. SQL2000 did 245 Reads.

    So in this case, SQL2005 makes a better plan than SQL2000!


    The way that query plans are cached and reused has not changed between SQL2000 and 2005. The following points are valid for both versions:


    1.  When a query plan is designed, SQL Server takes the parameter values into consideration (in this example, whether the last parameter is 1 or 52761). This is also called parameter sniffing.
    2.  When a query plan is reused from cache, the parameter values are NOT taken into consideration. The query that the plan is valid for is converted into a hash value. SQL Server simply looks in the plan cache to see whether a plan exists for that hash value, and reuses the plan if one does. If SQL Server also had to revalidate the plan against the current parameter values, this would to some extent negate the whole purpose of reusing cached plans (performance).
    3.  SQL Server's query optimizer does not do any kind of risk assessment when it designs a query plan. There are no mechanisms in place to consider "If I put this plan into cache, and it is reused with other parameters, what is the potential damage?"

    These behaviours are fundamental to the current and previous versions of SQL Server, and most likely to future versions as well.
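    Point 2 above can be observed directly on SQL Server. The sketch below is a simplified illustration (it uses sp_executesql and a cut-down version of the query, not the exact cursor API that NAV calls) of how the same statement text gets one cached plan regardless of the parameter values:

    ```sql
    DBCC FREEPROCCACHE  -- start with an empty plan cache

    -- First execution: no plan exists for this statement text, so a plan is
    -- compiled with @P4 = 1 in mind (SP:CacheInsert in the profiler trace).
    EXEC sp_executesql
      N'SELECT * FROM "CRONUS International Ltd_$G_L Entry" WHERE "Entry No_" < @P4',
      N'@P4 int', @P4 = 1

    -- Second execution: the statement text hashes to the same value, so the
    -- cached plan is reused (SP:CacheHit), even though @P4 = 52761 might be
    -- served far better by a different plan.
    EXEC sp_executesql
      N'SELECT * FROM "CRONUS International Ltd_$G_L Entry" WHERE "Entry No_" < @P4',
      N'@P4 int', @P4 = 52761
    ```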

     

    So, for this scenario we can see that:
      - When the plan was made, SQL2005 made the most optimized plan.
      - The behaviour of caching plans and reusing them is the same on both SQL2000 and SQL2005.

     

    Without going into too many details here about how to troubleshoot a situation like this, there are various ways to handle it. The main methods for Dynamics NAV are:

      - Index hints:

    In this situation, if the query had included an index hint on the $1 index ("G_L Account No_"), then SQL2005 would not have chosen the clustered index as it did. The behaviour would have been like on SQL2000, and the problem query (2,079 reads) would not have happened. For more details about index hinting in Dynamics NAV, check the blog "Index Hinting in Platform Update for Microsoft Dynamics NAV 4.0 SP3 KB940718".

      - Recompile hints

    Adding a Recompile hint to a query is a way to tell SQL Server to make a new query plan, and not take one from cache. In this way you may get query plans that are better optimized for the current parameter values, but it also adds an overhead to SQL Server because making a new query plan always takes longer than re-using a cached one.
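    At the SQL level, a recompile hint is just OPTION (RECOMPILE) appended to the statement. A hedged sketch of what the query from this scenario would look like with the hint (parameter declarations omitted):

    ```sql
    SELECT *
    FROM "CRONUS International Ltd_$G_L Entry"
    WHERE "G_L Account No_" = @P1
      AND "Posting Date" = @P3
      AND "Entry No_" < @P4
    ORDER BY "G_L Account No_" DESC, "Posting Date" DESC, "Entry No_" DESC
    OPTION (RECOMPILE)  -- always compile a fresh plan for the current parameter values
    ```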

    - Lohndorf

  • Microsoft Dynamics NAV Team Blog

    Unique IDs for ISVs


    Problem Statement

    When you create an add-on solution, you must make sure that the new objects you create are given unique IDs (UIDs) from the range assigned to the add-on in question. This is done by using the appropriate license. There will be no problem as long as only one add-on solution is included in the developer’s license.

     

    However, if the developer’s license includes permission to create more than one add-on solution, it gets more complicated. In this case, the UIDs must be added manually during development instead of allowing them to be added automatically.

     

    One consequence of not manually applying the UIDs could be that two separate add-on solutions are given the same range of IDs. This will generate merge errors when you try to implement the two add-on solutions in the same Microsoft Dynamics NAV database.

     

    Manually setting the UIDs so that they comply with the appropriate add-on UID range is both a tedious and an error-prone task.

    Cause

    The Microsoft Dynamics NAV platform is not able to identify more than one range of unique IDs in a license.

    Resolution

    A function (with a fixed, predefined ID) can be added to codeunit 1. This function returns the base UID for the application, much like the SetGlobalLanguage function does for the application language. The function is called whenever Microsoft Dynamics NAV needs to find the base UID for the application. Starting from that base UID, the function finds the next available UID, which is then used for the new function, text constant etc in question.

     

    Notation:

    PROCEDURE GetUidOffset@212122() : Integer;

     

    Example:

    While creating an add-on solution, a developer at a Microsoft Certified Partner can change codeunit 1 (and restart Microsoft Dynamics NAV or re-open the company they’re working in – because of codeunit 1’s Single Instance-like behavior) to reflect the UID range that was assigned to the add-on in question.

     

        PROCEDURE GetUidOffset@212122() : Integer;
        BEGIN
          EXIT(Insert add-on solution ID here);
        END;

     

    When the partner adds a new text constant, function etc., the C/AL Editor will start at the inserted add-on solution ID and find the next available number.

    The procedure for creating an add-on solution and adding UIDs is:

    1.       Open Codeunit 1 and add PROCEDURE GetUidOffset@212122() : Integer;. This ensures that you use the appropriate add-on solution ID.

    2.       Save and compile.

    3.       Open Company.

    4.       Start developing ‘Add-on Solution 1’.

    5.       Save and compile ‘Add-on Solution 1’.

    6.       Change Codeunit 1 – delete PROCEDURE GetUidOffset@212122() : Integer;.

    7.       Save and compile Codeunit 1.

    You must repeat these steps for each separate Add-on UID range you work with.

    Known Issues

    ·         After this workaround has been implemented in codeunit 1, you must open a company in Microsoft Dynamics NAV before you open the C/AL designer. This is because codeunit 1 is not called before you open a company.

    ·         If this workaround is not implemented when you are developing an add-on solution, the UIDs will be assigned according to the old implementation and might cause problems when you merge more than one add-on solution into a Microsoft Dynamics NAV database.

     

     - martinni

  • Microsoft Dynamics NAV Team Blog

    Diagnose your SQL Server


    I assume that anyone reading this will be very familiar with collecting traces with SQL Server Profiler, and equally familiar with the two main limitations of SQL Server Profiler:

    1. It adds a big overhead to SQL Server, which is the last thing you need when troubleshooting performance problems, and
    2. The vast amount of data that you can end up with is hard to make sense of.

    This blog shows how to make each of these limitations much less of a problem. It is in two sections: Data collection, and Data analysis. The data collection part is very different depending on whether you run on SQL 2000 or SQL 2005. This blog focuses on SQL 2005, but there is a small section in the end about how to achieve the same on SQL 2000.


    Data collection
    There used to be (and still is) a tool called PSSDiag. PSS being "Product Support Services", the tool was designed for support engineers to collect trace files, logs, etc. from customers who may not know much about SQL Server. So the support engineer decides what data to collect, and then sends the customer a data collection tool which is easy to start.

    In SQL 2005, this tool is now part of SQL Server, and is called SQLDiag. This is how to use it:

    From a command prompt on the SQL Server machine, go into this folder:
    C:\Program Files\Microsoft SQL Server\90\Tools\Binn\
    then type:
    sqldiag

    The tool will start up, which will take a little while. If it shows any warnings, then ignore those - as long as it starts up. Once it is ready, it will display the following in green text:
    SQLDIAG Collection started.  Press Ctrl+C to stop.

    SQLDiag will now run in the background, collecting SQL Profiler traces and other information. By default, it will collect to a subfolder called SQLDIAG. If this folder doesn't exist, the tool will create it. Once the customer has seen enough of the problem(s) that you are troubleshooting, they just have to stop the tool (Ctrl+C), zip the content of the SQLDIAG subfolder, and send it to the support engineer for analysis.


    There is one optional parameter that should be used for the data collection. SQLDiag uses the file SQLDiag.XML to configure which SQL Profiler events and other events to capture. By default, this file does not contain much, so you should use another configuration file. Note: You should not modify SQLDiag.XML directly; make a copy of the file, and then modify the copy. At the moment there are no simple tools to help you with this, so unfortunately changing the configuration is not entirely simple. But attached is a configuration file (PSSDiagCustom.XML) which configures SQLDiag to collect trace files that are more similar to a standard SQL Profiler trace.

    To tell SQLDiag to use this file, use the /I parameter. For example, copy the attached file into a folder called C:\SQLDiagConfig\. Then run SQLDiag like this:
    SQLDiag /I C:\SQLDiagConfig\PSSDiagCustom.XML


    Now you should be able to use SQLDiag to collect information from the customer's system. SQLDiag creates a number of files in the output folder, but here we will only look at the .trc file that it creates. This is a normal SQL Profiler trace file which you can open and analyse with SQL Server Profiler.

    You can of course make .bat files to specify the commands instead of the customer having to go to the command prompt. You can also set SQLDiag up to run as a service, and then the customer just has to start this service. Or you can schedule the service to start and stop at certain times like any other service.


    Other things to be aware of are:
      - As mentioned, the default output folder for SQLDiag is C:\Program Files\Microsoft SQL Server\90\Tools\Binn\SQLDIAG\, but you can change it with the /O parameter. Always use a folder on the SQL Server itself; if you set it to a network location, you will add overhead to the tool.

      - SQLDiag can collect a lot of information; expect hundreds of megabytes. So you must make sure to have plenty of free disk space available. Also, the customer should run it only long enough to reproduce the problems they have. I would recommend running it for no more than an hour or so.

      - SQLDiag creates much less of an overhead on SQL Server than running a normal SQL Profiler trace because it does not have a user interface. Much of the overhead caused by SQL Profiler goes into displaying the collected information.


    Run "SQLDiag /?" from the command prompt to see the additional parameters that are available, or look up "SQLDiag Utility" in SQL Server online help for further information about the tool.

     

    Data analysis
    Once you have collected SQL Profiler trace(s) (whether you used SQLDiag or started a SQL Profiler trace manually), the next challenge is to analyse them. You can of course just open the trace file(s) in SQL Profiler and take a look, but this is not an easy way to spot the worst queries.

    Luckily, there is an easy syntax to load trace files into a SQL table, which will allow you to query the events ordered by Duration, Reads, Writes or anything else. This is the syntax:

    --Load trace files into a SQL table so it can be queried:
    SELECT * INTO temp_trc
    FROM ::fn_trace_gettable('c:\MyTrace.trc', default)


    SQLDiag will make trace files up to 350 MB in size, and then create a new trace file. So you will often end up with files like this in the output folder:
    Server__sp_trace.trc - 350 MB
    Server__sp_trace.trc_1 - 350 MB
    Server__sp_trace.trc_2 - 150 MB

    The "default" parameter in the syntax above means that it will automatically continue with the next file. So in this example, if you run this from SQL Management studio:

    SELECT * INTO temp_trc
    FROM ::fn_trace_gettable('c:\Server__sp_trace.trc', default)

    then it will automatically read all three files into the temp_trc table.

    If you want to limit how much to read into the temp_trc table, then instead of specifying "default", specify a number which will tell it the maximum number of files to read.

    This is useful because the command can take a long time to run, and take a lot of database space. So if for example you have collected 20 trace files, then it can be necessary to read 5-10 of them at a time.
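    For example, to read at most 5 of the rollover files instead of all of them (the table name temp_trc_part is just an example, so that it doesn't collide with a temp_trc table you may already have created):

    ```sql
    SELECT * INTO temp_trc_part
    FROM ::fn_trace_gettable('c:\Server__sp_trace.trc', 5)
    ```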


    The syntax will automatically create a new table in the current database. Once you have this table, you can add indexes. Useful indexes would be on Reads, Duration, Writes, etc.
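    For example (the index names here are just examples):

    ```sql
    CREATE INDEX idx_temp_trc_reads ON temp_trc (Reads)
    CREATE INDEX idx_temp_trc_duration ON temp_trc (Duration)
    CREATE INDEX idx_temp_trc_writes ON temp_trc (Writes)
    ```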

    And then you have an easy way to identify the "worst" queries, ordered by any of these, for example:

    SELECT Reads, * FROM temp_trc ORDER BY 1 DESC --or:
    SELECT Duration, * FROM temp_trc ORDER BY 1 DESC

    "ORDER BY 1" just means: order by the first column that you specify (in the examples above, Reads or Duration). Ordering by the column name instead would give an "Ambiguous column name 'Reads'" error, because the column appears twice in the result set: once explicitly, and once through the *.


    So in this way you can easily sort your events by anything you need. The top events may look like this:
    exec sp_execute 380,1,'20000'

    I would advise you to ignore these events, since they don't really tell you anything. Focus on events that you can relate to the activities that you are troubleshooting.
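    One way to filter them out when querying the table (TextData is a text column, so it is cast before the comparison; the 4000-character cut-off is an assumption that the interesting part of the statement is near the start):

    ```sql
    SELECT Reads, * FROM temp_trc
    WHERE CAST(TextData AS nvarchar(4000)) NOT LIKE 'exec sp_execute%'
    ORDER BY 1 DESC
    ```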

     

    Data collection on SQL2000

    As mentioned, SQLDiag is new in SQL2005. For SQL2000 you need a tool called PSSDiag. To download it, go to support.microsoft.com and just search for "PSSDIAG data collection utility". After installing this, run the file DiagConfig.exe, which lets you decide what events to collect. This part is actually a lot simpler than with SQLDiag because you have a graphical interface here.

    Once you have decided what to collect, click the "Save" button. This will create a file called \Customer\pssd.exe which you can send to the customer. You also have simple directions in the file \docs\PSSDIAG Instructions.doc to send to the customer. From here, the steps to start collecting are the same as with SQLDiag.

    - Lohndorf

  • Microsoft Dynamics NAV Team Blog

    Platform Roll-up Update for Microsoft Dynamics NAV 5.0


    I am pleased to announce that the first platform update is now available for Microsoft Dynamics NAV 5.0. You can receive the Platform Roll-up Update for Microsoft Dynamics NAV 5.0 KB943858 by requesting it here, just fill in the KB number (943858) and submit the request.

    This update contains a wide number of error corrections for the following components:

    • The Microsoft Dynamics NAV C/SIDE client
    • NODBC
    • C/FRONT
    • NAS
    • Server

    As a reminder, Microsoft Dynamics NAV platform updates are cumulative and contain all previously released hotfixes and updates for each version of Microsoft Dynamics NAV. This update involves a technical upgrade of your Microsoft Dynamics NAV ERP database. Please note that a technical upgrade can be a time-consuming process if the database contains several hundred gigabytes of data.

    I encourage everybody to download this update, but please note that index hinting is turned on by default if the Microsoft Dynamics NAV SQL Option is used; see more about index hinting here.

    Martin Nielander
    Program Manager

  • Microsoft Dynamics NAV Team Blog

    Employee Portal for Microsoft Dynamics NAV 5.0


    I am pleased to announce the update of Employee Portal for Microsoft Dynamics NAV 5.0. This update is now available for you to download at PartnerSource (login credentials are required).

    Employee Portal for Microsoft Dynamics NAV 5.0 includes all the known functionality from earlier versions of Employee Portal and in addition supports Windows SharePoint Services 3.0 (WSS 3.0) and Microsoft Office SharePoint Server 2007 (MOSS 2007).

    From the link above we have released:

    • Product for download
    • Fact Sheet to support the sales process
    • Installation Guide to support installation

    Mtoo Norrild

    Program Manager

  • Microsoft Dynamics NAV Team Blog

    How to use the new Employee Portal with Microsoft Dynamics NAV 4.0 SP3


    The released Employee Portal can also work with NAV 4.0 SP3. If you want to run the new Employee Portal with NAV 4.0 SP3, you need to turn off compression. This is because compression works differently on .NET Framework 1.1 and 2.0. Normal communication and encrypted communication both work fine. Partners will have to remove the compression in NAV and in the web.config of the portal.

     

    1.       Remove the “Use compression” check mark in NAV

     

    ·         To do this, open your NAV client and select the menu entry “Administration” – “Application Setup” – “Employee Portal”, and click the link to “Application Server Setup”.

    ·         The Application Server Setup card opens. Now navigate to the correct NAS record and uncheck the “Use compression” checkbox.

     

    2.       Remove the compression from the web.config of your SharePoint portal

     

    ·         On your machine running the SharePoint services, open Windows Explorer, navigate to the root path of your site (e.g. C:\Inetpub\wwwroot\wss\VirtualDirectories\80) and open the web.config file.

    ·         Go to the end of the file to the block “appSettings”.

    ·         Change the value of the node “UseCompression” from “1” to “0”.
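    After the change, the “appSettings” block at the end of web.config should look something like this (assuming the standard add-key notation; any other keys in the block are omitted here):

    ```xml
    <appSettings>
      <!-- 0 = compression off; required when running against NAV 4.0 SP3 -->
      <add key="UseCompression" value="0" />
    </appSettings>
    ```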

     

    Mtoo Norrild

    Program Manager

  • Microsoft Dynamics NAV Team Blog

    Exact Cost Reversal function becomes more humane in Microsoft Dynamics NAV 5.0

    The Exact Cost Reversal function - the function that helps users ensure that returned items, both in sales and purchase, are valued at exactly the same cost as the original transaction when being put back on inventory/drawn from inventory - has been available in NAV since version 3.01.

    Many NAV users are aware of and actively use this function when handling their return documents. "All" they have to do is fill in the "Apply-from/to Item Entry" field on the order line. Or even better, check the "Exact Cost Reversing Mandatory" field as part of S&R/P&P setup and use the Copy Document function; then the "Apply-from/to Item Entry" field on the copied return line will be filled in automatically.

    In practical terms, this seemingly easy task may turn out to be rather cumbersome and even obscure. Do a lookup in the "Apply-from Item Entry" field, and most likely you will be confronted with a huge number of entries among which you have to find the relevant one. Also, the Copy Document function will only allow you to copy and cost reverse posted shipments/receipts, not posted invoices.

    In the NAV 5.0 release, the exact cost reversal function received a usability facelift, while retaining the same core functionality. With the new Get Posted Document Lines to Reverse functions, available on return orders and credit memos, you now:
    • can access regular documents (rather than item ledger records) and copy one or multiple lines, be they from the same or different documents
    • will find invoices among the documents to copy from (as well as credit memos, if that's relevant), the most logical reference document in a returns situation
    • will not have to manually enter serial/lot numbers for the returned items, because the program automatically copies item tracking lines to the return document from the original document
    • can rely on the program to keep track of the already returned and cost-reversed quantity on the sales documents, so you don't have to worry about cost reversing the return of the same sold quantity more than once.

     
     

    Olga T. Mulvad
     

  • Microsoft Dynamics NAV Team Blog

    Convergence 2008 - More NAV Sessions Than Ever!


    Convergence 2008 is on the horizon - March 11-14 in Orlando, Florida.

    The preliminary agenda only showed a few Dynamics NAV sessions, but has recently been updated revealing the full glory of NAV-iness at the Conference.  This year's Convergence will have more sessions than ever before: currently 3 partner sessions, 1 customer general session, 28 breakouts and 10 interactive discussion sessions.  See the session catalog for details.

    Register here

    (Early Bird Registration finishes today, so move fast if you want to take advantage of that.)

    Convergence is a great event for anyone involved with Dynamics products.  We hope to see you in Orlando!

    - Ilana Smith

  • Microsoft Dynamics NAV Team Blog

    Microsoft Dynamics NAV and RFID


    Simply put, RFID technology will someday become just as universal as barcodes are today, bringing the benefits of fully automated inventory management, asset tracking and retail checkout to all of us, whether as consumers, retailers, distributors, shippers, manufacturers, suppliers or any other entity in the entire product life-cycle.

    In the Dynamics NAV group, we started discussing how best to bring forward an RFID solution for our partners and customers quite some time ago. It was always difficult, though, to find enough resources in the group to justify significant investment compared to other competing priorities (delivering NAV 5.0, making progress on NAV 6.0, service pack…). However, we simply could not ignore the importance of RFID.

    In my early days within the group, I had naïvely assumed that RFID was only relevant to large retailers such as Walmart, who could invest heavily in the technology in its early days. But then I saw the technology in action in the recently renovated Seattle public libraries, where I could check out a whole stack of books just by putting them on a small pad (seeing all the titles pop up on the screen automatically was quite a thrill for a tech geek! See related story here.), and I became a convert to its global adoption at all scales and sizes. I was also reminded by my team member Stuart that RFID adoption will most likely follow a route similar to that of bar codes.

    As industry big-wigs start mandating RFID tags from their partners on their inventory items, those partners in turn will look to outsource RFID tag printing and stamping to dedicated suppliers who will invest in RFID printing solutions. Thus every manufacturer and distributor, from large to small, will easily be able to outsource RFID labeling of their merchandise, just as they did for bar codes in the beginning, creating a viable marketplace for RFID-enabled tags and scanners in a very short span of time.

    It was then left to us to provide, in a timely manner, a solution that our partners and customers could benefit from. To begin with, we chose Microsoft BizTalk RFID technology, which provides a device-independent platform for integrating various RFID scanners, printers and solutions. This technology platform, coupled with the power of Dynamics NAV, gave us ample opportunity to build a showcase implementation.

    We narrowed down the scope of our efforts primarily based on available resources, time to market and partner considerations. The end result is a white paper, coupled with a sample implementation, that helps our partners quickly get up to speed on Microsoft BizTalk RFID technology and build their own solutions using the native extensibility and integration features of Dynamics NAV as they already know them today. Note that this sample itself is not a customer-ready implementation: it uses the sample RFID emulator device, and it uses a COM+ based event listener that is not ideal for production systems. Partners have already built numerous solutions using MSMQ or Microsoft SQL Server based integration for Dynamics NAV, and those will be better suited for production RFID systems.

    Download the Dynamics NAV RFID Whitepaper and Sample Implementation (PartnerSource access required)

    - Naveen Garg

  • Microsoft Dynamics NAV Team Blog

    Sure Step is Now Available to All Microsoft Dynamics Partners


    Doug Kennedy, Vice President Dynamics Partner Team, announced the availability of Microsoft Dynamics Sure Step today to all Dynamics partners - as part of the Worldwide Partner Conference Dynamics Keynote in New Orleans. Starting July 13, 2009, Sure Step is available to all Dynamics partners  to help drive increased partner productivity, better partner collaboration and improved customer satisfaction. For more information, see the Microsoft Dynamics Sure Step page on PartnerSource.

  • Microsoft Dynamics NAV Team Blog

    Join us at the Microsoft Dynamics NAV Sessions at Convergence in New Orleans


    We're just two weeks away from Convergence 2009 in New Orleans, March 9-13. We have created more than 70 sessions and Hands-on Labs in the Microsoft Dynamics NAV track. Many of the sessions and all Hands-on Labs will focus on the newest release, Microsoft Dynamics NAV 2009. For the list of sessions, see the session list on the Convergence web site.

    You can familiarize yourself with the new RoleTailored User Experience, Web Services and Reporting capabilities. You can learn about the different application areas, such as Manufacturing, Distribution, Finance and Service Management.  There are also industry related sessions where you can hear how Microsoft Dynamics NAV is used in different industries.

    To register for Convergence 2009, or for more information, see the Convergence web site.

    -Eva Sachse

  • Microsoft Dynamics NAV Team Blog

    Another Update to the Microsoft Dynamics NAV 2009 Developer and IT Pro Help


    We've released an update of the Microsoft Dynamics NAV 2009 Developer and IT Pro Help to MSDN and the Microsoft Download Center. This is the second of our periodic updates to developer and IT Pro content since NAV 2009 released in November.

    This release includes:

    • Bug fixes and new content. About 17% of the topics have been updated or are new in this release.
    • Debugging with Visual Studio. Added new topics about debugging RoleTailored client objects with Visual Studio.
    • Delegation. Added new topics about setting up delegation.
    • Form Transformation. Added new troubleshooting topics.
    • Improvements to Walkthroughs

    You can download the update and copy it into your NAV 2009 installation, updating the Help you receive from F1. If you have feedback on any of the content, please use the feedback link at the bottom of each page in the CHM, or use the Ratings and Feedback form on each MSDN page.

    - Bob, Elona, Jill, and John (the NAV dev & IT Pro writers)

  • Microsoft Dynamics NAV Team Blog

    Basic SQL - Restoring a SQL Server backup


    This post is part of "Overview of NAV-specific SQL features for application consultants".

    You can back up your Microsoft Dynamics NAV database either from a NAV client or from SQL Server Management Studio. To restore a backup made from SQL Server, follow these steps:

    1)  Open SQL Server Management Studio.
    2)  You don’t need to create a database first, as you do from NAV. Just right-click Databases, then select “Restore Database...”.
    3)  In the dialog that opens, type a name for your new database. You can name it anything. Then change the source from “From database” to “From device”. Device, in this case, just means file.
    4)  Click the AssistEdit button next to “From device”, which opens a new dialog box. Click Add, select the SQL backup you are restoring, and click OK.
    5)  SQL Server lists a few details about the backup you selected. Tick the “Restore” box, and then click OK:

    [Screenshot: the Restore Database dialog with the Restore box ticked]

    The new database should now appear in SQL Server Management Studio, and you can access it from a NAV client.
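    The same restore can be done in T-SQL instead of through the dialogs. The sketch below assumes a backup file at C:\Backups\NavDemo.bak and hypothetical logical file names; run RESTORE FILELISTONLY first and substitute the names it reports:

    ```sql
    -- List the logical file names contained in the backup
    -- (needed for the MOVE clauses below)
    RESTORE FILELISTONLY
    FROM DISK = N'C:\Backups\NavDemo.bak';

    -- Restore the backup into a new database; the MOVE clauses place the
    -- data and log files (logical names here are assumptions - use the
    -- names reported by RESTORE FILELISTONLY)
    RESTORE DATABASE [NavDemo]
    FROM DISK = N'C:\Backups\NavDemo.bak'
    WITH MOVE N'NavDemo_Data' TO N'C:\Data\NavDemo.mdf',
         MOVE N'NavDemo_Log'  TO N'C:\Data\NavDemo.ldf',
         RECOVERY;
    ```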

     

    Lars Lohndorf-Larsen (Lohndorf)

    Microsoft Dynamics UK

    Microsoft Customer Service and Support (CSS) EMEA

  • Microsoft Dynamics NAV Team Blog

    Dynamics NAV 5.0 SP1 and export to Excel

    • 0 Comments

    A number of issues previously reported when using the send-to Excel functionality in Dynamics NAV 5.0 are now corrected. These issues include (among others): decimals exported as text, Item/Customer numbers exported as decimals, and dates exported as decimals.

    To implement the correction, update to 5.0 SP1, update 1 (KB 956161, build 27191), use the default style sheets included in SP1, and make the following change to the default style sheet NavisionExportToExcel.xslt:

    Open the style sheet file in Notepad; the default file is NavisionFormToExcel, located in the Stylesheet folder of the Client folder.

    Browse to the following line:

    <xsl:when test="@subtype != 'number'">

    and replace it with:

    <xsl:when test="@datatype != 'Decimal'">

     

    Then locate the following line:

    <xsl:when test="@subtype = 'number'">

    and replace it with:

    <xsl:when test="@datatype = 'Decimal'">
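    For context, each of these lines is an xsl:when branch inside an xsl:choose element in the style sheet. Schematically, the fix switches the branch condition from the subtype attribute to the datatype attribute; the structure below is illustrative only, not copied verbatim from the shipped file:

    ```xml
    <xsl:choose>
      <!-- after the fix: branch on the cell's datatype attribute -->
      <xsl:when test="@datatype = 'Decimal'">
        <!-- emit the value as a number so Excel treats it as a decimal -->
      </xsl:when>
      <xsl:otherwise>
        <!-- emit the value as text (dates, item/customer numbers, etc.) -->
      </xsl:otherwise>
    </xsl:choose>
    ```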

     

    Jasminka Vukovic (jvukovic)

    Microsoft Dynamics NO


    Microsoft Customer Service and Support (CSS) EMEA

  • Microsoft Dynamics NAV Team Blog

    NAV 2009 - Report Designer - Introduction to the new environment

    • 0 Comments

    Microsoft Dynamics NAV has always had its own report designer. In NAV 2009 it still does, but in addition to this you can also use the Visual Studio (VS) report designer.

    NAV 2009 – both the classic and the new client – will still run reports designed in NAV’s report designer. So in a way, nothing has changed: you can still use the existing report designer. The VS report designer offers a lot of new options and features. The idea of this post is to describe which of those hundreds of features you actually need. For a simple report, only a very few features are needed to get started. This post tells you which ones they are.

     

    Old versus New


    The classic client can still only run reports designed in the classic report designer. The new client (Role Tailored client – RTC) can run reports designed with either the classic or with VS report designer. When RTC launches a report, it checks if a layout has been defined in VS Report designer. If it has, then it will run that. If no layout has been defined, it will launch the report engine from the classic client and run the report exactly like it would have been run from a classic client. This does require that a classic client has been installed as well, even if the user will never have to run this client.
    In this way you can use the classic report designer for some reports, and the VS report designer for others.
    Report design is still done from Object Designer in the classic client.


    VS report designer is a different environment. It has features that are not available in the classic report designer, but also limitations compared to it. For example, it does not support TransHeader and TransFooter sections. But as described above, the VS report designer is a choice you have – you can still use the classic report designer if there are things you can't achieve with the VS report designer.

    The new environment


    Sections are where you make your report layout in the classic report designer; a report designed this way will run in the classic client as well as in RTC. For RTC you can also create a report layout, which is done in VS report designer. So “Sections” refers to the classic report layout, while “Layout” refers to the layout done in VS.


    The classic report designer introduces four new options:


    Tools -> Create Layout Suggestion:


    This makes a “best effort” to transform your classic report design and create a suggested layout in VS report designer. This can do a lot of the hard work for you so you don’t have to start from scratch, or design sections first in the classic report designer, and then create the layout in VS report designer. The tool can’t guarantee to transform every report, but it will always at least give you a good start, and for many reports it will do all the work needed.

    Tools -> Delete Layout:


    This deletes the VS report layout.

    View -> Layout:


    This opens VS report designer and is where you will go to design your report for the RTC.

    View -> Request Page:


    The classic client runs forms while the new client runs pages. The same goes for request forms on a report. So if you want to add options to the report for the new client, then do that in the request page.

     

    VS Report Designer


    Going to View -> Layout opens VS Report Designer which looks like this:

    [Screenshot: the VS Report Designer layout window]

    The parts of this environment you need are:


    Toolbox:


    Normally you can switch between the Toolbox and “Website Data Source” (fields) as shown in the red circle in the picture above. But if for some reason the toolbox is not visible, then go to View -> Toolbox. If you use “Create Layout Suggestion”, it will add the necessary elements to the report, and you won’t need the toolbox. If you create a layout from scratch, then all you need from the toolbox – at least for simple reports – is a table:

    [Screenshot: the Toolbox]

     

    Website Data Source:


    This is where you select the fields to print on the report. It automatically shows anything you have added to sections in the classic report designer. So to add or remove fields here, go to sections and add or remove them there.
    To add fields from the Website Data Source, just drag and drop them into a table or wherever you want them displayed.

    VS report designer has hundreds more options, features and elements, but the ones mentioned here are all you need to get started.

    Workflow – designing reports


    So having explained which features you need - at least to create a simple report - this is how to use them:


    When you have finished your layout in VS report designer, close and save it; this brings you back to the classic report designer. Moving one line up or down there will prompt you to load the changed report layout. When you do, the VS report layout is saved in the report object itself.
    So exporting a report from Object Designer will export all of it, including the layout you have designed in VS report designer, whether you export the object as .txt, .xml or .fob. If you export the report as .txt or .xml, you can see the VS report layout at the bottom of the report, in a section called RDLDATA.


    Finally, to run the report, either run it directly from Start -> Run, like this to run report 99800:
    dynamicsnav:////runreport?report=99800
    or of course you can add the report to a page to run it from the new client.

     

    Lars Lohndorf-Larsen (Lohndorf)

    Microsoft Dynamics UK
