This post starts a summary of style sheet issues reported in Dynamics NAV 5.0 when using the Send-to Excel/Word functionality, and attempts to provide workaround suggestions for as many of these issues as possible.
Most issues are related to regional settings and the use of the decimal separator and/or date format. In 5.0, when the XML file with data is created by the application, only the values displayed on the form are passed, along with the decimal separator sign used. The style sheets determine the field type based on the format of the field value. As different date formats are used in different places, along with different decimal separator/thousand delimiter combinations, this can lead to a number of issues. In newer versions, a 'data type' attribute is passed along with the decimal separator and field value, to avoid ambiguous formatting issues.
Keep in mind that matrix forms and forms like Customer Sales History are not supported by style sheets at all.
The list below contains three of the issues reported in 5.0, some of which are corrected in 5.0 SP1, along with suggestions for possible workarounds.
Other reported issues will be handled in upcoming posts.
1. When running forms that contain several subforms (for example form 521, Application Worksheet) and selecting Send to Excel, the following error is reported in Excel:
Problems came up in the following area during load: Workbook Setting...
A counter value should be appended to the table box names created in Excel to avoid this issue. To do this:
Open the style sheet file in Notepad. The default file is NavisionFormToExcel, located in the Stylesheet folder of the client folder.
Browse to the following section and replace the marked line:
<xsl:template match="Control[@type='TableBox']">
  <Worksheet xmlns="urn:schemas-microsoft-com:office:spreadsheet">
    <xsl:attribute name="ss:Name">
      <xsl:variable name="TableBoxCaption">
        <xsl:value-of select="//Object/@caption"/>
      </xsl:variable>
      <xsl:value-of select="substring($TableBoxCaption,1,30)"/> <-- REPLACE THIS LINE -->
    </xsl:attribute>
with the following line:
<xsl:value-of select="substring($TableBoxCaption,1,28)"/><xsl:value-of select="position()"/>
2. Error when exporting date fields to Excel.
If a form contains date fields and the date format is dd.mm.yy or dd.mm.yyyy, the following error occurs when running Send to Excel with the decimal separator set to '.':
Problems came up in the following area during load: Table...
If the value of the decimal separator is '.', the style sheet interprets all fields containing only numeric values and '.' as decimals. Dates are then interpreted as invalid decimal values. One way of solving this in 5.0 is to check the number of dots and the number of digits between the dots; two dots with two digits between them implies that the value is a date. To do this:
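The detection rule the workaround implements can be sketched outside XSLT as well. The following Python snippet is an illustration of the rule only, not the style sheet code itself: a value with two dots and two digits between them is treated as a date, otherwise digits with at most one '.' are treated as a decimal.

```python
import re

def classify(value: str) -> str:
    """Classify a raw field value the way the workaround does."""
    # dd.mm.yy or dd.mm.yyyy: two dots, with two digits between them
    if re.fullmatch(r"\d{2}\.\d{2}\.\d{2}(\d{2})?", value):
        return "date"
    # digits with at most one '.' are treated as a decimal number
    if re.fullmatch(r"\d+(\.\d+)?", value):
        return "decimal"
    return "string"

print(classify("24.12.07"))   # a date in dd.mm.yy format
print(classify("1234.50"))    # a decimal with '.' as separator
```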
<xsl:when test="$DecimalSeparator = '.'">
<xsl:when test="$DecimalSeparator = ','">
<xsl:when test="$DecimalSeparator = '.'">
<xsl:attribute name="ss:Type">String</xsl:attribute> <xsl:value-of select="@value"/>
<xsl:when test="$DecimalSeparator = ','">
3. Date fields exported as decimals when Decimal Separator equals ‘,’.
The same workaround could be applied here as for the previous issue: checking the number of dots and the number of digits between the dots, where a count of 2 would normally imply that the value is a date.
1. Open the style sheet file in Notepad. The default file is NavisionFormToExcel, located in the Stylesheet folder of the client folder.
2. Browse to the following section and replace the section:
<xsl:when test="$DecimalSeparator = ','">
Jasminka Vukovic (jvukovic )
Microsoft Dynamics NO
In some situations, using the SQLIndex property on a key in Microsoft Dynamics NAV can harm performance. In this blog I
describe what to be careful about, and why the use of this property has been removed in the NAV 5 SP1 application.
The property is still available and it is still a valuable tool in performance tuning. But from SP1, it is not used anymore
in the standard application. The document "Changes in NAV 5.0 SP1.doc" on the SP1 product CD lists the 72 tables where the SQLIndex has been removed (Change A222).
Background: The key property SQLIndex was introduced in NAV version 4 SP1. The idea of the property is to make it possible to change
the index on the SQL Server while maintaining the same application (same sorting) in NAV.
The main use of the property is to make the SQL index more selective. In the NAV application there are many keys that begin
with an option which is not very selective, for example the primary key "Document Type","No." on the Sales Header table.
"Document Type" - having only 6 possible options - is not very selective, and SQL Server might choose not to use it. If the
index was changed to be the other way around ("No.","Document Type"), it would be much more selective and more efficient
for SQL Server to use in SELECT statements.
The other benefit of this property is to enable "covering indexes", so that you can have a few indexes to cover for most
searches, and then disable the maintenance of other indexes. For example an index on the "Sales Header" table beginning
with "No." can be used efficiently with many different filters, reducing the need to have one key for every possible exact filter.
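As an illustration (table and column names are simplified here, not the exact DDL NAV generates), swapping the column order via SQLIndex amounts to the difference between these two indexes:

```sql
-- Index matching the NAV key "Document Type","No." directly:
CREATE INDEX [$1] ON [CRONUS$Sales Header] ([Document Type],[No_]);

-- Alternative index created when SQLIndex is set to "No.","Document Type".
-- Leading with the selective column makes the index far more useful to
-- the optimizer for lookups on [No_], and it can cover many filters
-- that begin with [No_]:
CREATE INDEX [$1] ON [CRONUS$Sales Header] ([No_],[Document Type]);
```

The two statements are alternatives for the same NAV key; the sorting seen by C/AL code stays the same in both cases.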
Problems: When you have a query which includes an "ORDER BY" clause, SQL Server has to return the records in the order specified by
that clause. If SQL Server doesn't have a matching index, it has to retrieve data using a different index and then do some
internal sorting to return data in the correct order. If there are no good indexes, then SQL Server may choose to
use the clustered index which can be bad enough. But when the query also has an index hint, then SQL Server is forced to use the
index specified by the hint, and this can lead to large amounts of reads.
Example: In a recent support case, the customer had generally bad performance. In this case, the "SELECT TOP 30" query from the post "Simple query to check the recent performance history" showed that out of the top 30 "worst" queries, 26 were similar to this one:
SELECT TOP 501 * FROM "Reservation Entry" WITH (UPDLOCK, INDEX("$1")) WHERE (("Source ID"=@P1)) AND (("Source Ref_
No_"=@P2)) AND (("Source Type"=@P3)) AND (("Source Subtype"=@P4)) AND (("Source Batch Name"=@P5)) AND (("Source Prod_ Order
Line"=@P6)) AND (("Reservation Status"=@P7)) ORDER BY "Source ID","Source Ref_ No_","Source Type","Source Subtype","Source
Batch Name","Source Prod_ Order Line","Reservation Status","Shipment Date","Expected Receipt Date","Entry No_","Positive"
The query itself looks good enough: WHERE-clause and "ORDER BY"-clause match each other, and there were no immediate
reasons why this query should cause more reads than the number of records in the table. But in the standard application, the SQLIndex property for this key was:
"Source ID","Entry No.",Positive
The idea with this SQLIndex is to have an index which can cover more situations, and in that way reduce the number of indexes that need to be maintained on SQL Server. And the SQLIndex is fine for the SELECT part of the query. The problem is that the index can't be used for the "ORDER BY" part of the query.
So what happens is:
SQL Server may have planned to use the clustered index to read all data and then do some internal sorting. But in this case, the index hint forces SQL Server away from that plan. The result is that SQL Server is forced into doing a very difficult task while being restricted by the index specified by the index hint.
In this case we designed the "Reservation Entry" table and removed the SQLIndex property from the key, and performance went back to normal.
Conclusion: When you see a query which causes many reads, even if SQL Server has a good index, then also consider whether the index is good for the "ORDER BY" part of the query. The "ORDER BY" part of the query depends on the key in NAV. But if the SQLIndex property has been set for this key, then by definition, the "ORDER BY" clause and the SQL index will not match.
You should still consider the use of the SQLIndex property as part of tuning performance of a system. But just be aware that it can also cause problems as described here.
Lars Lohndorf-Larsen (Lohndorf)
This post describes how you can automate importing and exporting of Microsoft Dynamics NAV objects with C/Front.
To make a basic C/Front project, see the post Beginners guide: How to connect to NAV from a .NET project, using C/Front. This post only describes the additional code needed to import or export NAV objects.
The line below will export Tables 1..10 to the file "C:\x\ExportFob.fob":
The filter has to be in the same syntax as used in the "DataItemTableView" in a report. The simplest way to create the filter is to make a new report in NAV, based on the Object Table. Then in the DataItem Properties, specify a filter in DataItemTableView, and copy that into your C/Front code.
The file that you export is a normal .fob file, which can be imported back in the usual way (File -> Import from Object Designer), or via C/Front as described next.
C/Front has a special datatype "NavisionImportMode" to control the actions when the objects you import conflict with existing objects. The possible settings are Overwrite, Skip and ThrowError. The code needed to re-import the object we exported before, is:
ImportMode = NavisionImportMode.Overwrite;
You can only export objects as binary .fob files. It is not possible to export objects as text.
In SP1 for NAV version 5, C/Front will no longer check for license permissions when importing files. So the import-functionality works just like importing objects manually, which means that the license file is not checked. In previous versions it was not possible to import new objects or tables with new fields unless the NAV license would allow the user to create those objects or fields manually.
Lars Lohndorf-Larsen (Lohndorf)Microsoft Dynamics UK
This post describes the simplest possible way to get started with creating a .NET project in Microsoft Visual Studio, and use C/Front in C# code to connect to a Microsoft Dynamics NAV database. The example here is based on the sample from the product CD, which will install to this folder: C:\Program Files\Microsoft Dynamics NAV\SDK\CFRONT\DotNetSampleApplication. More details can be seen from that sample, and I will also make further blog posts to describe further use of C/Front. For this post, all we want to do is to make a connection.
The example here was made in Visual Studio 2008. You can do the same in Visual Studio 2005. The NAV database in this example is running on SQL Server. To make it connect to a native NAV database you need to adjust NetType and DriverType.
Time to complete: Approximately 20 minutes.
C/Front is part of SDK (Software Developers Kit), which also includes the Communication Components. If you haven't installed it already, then install SDK from the NAV product CD. Then follow the steps below:
1) Open Microsoft Visual Studio 2005/2008.
2) Create a new project (File -> New -> Project). As project type, select "Visual C#", then select "Windows Forms Application". Type in a name and click OK.
3) Include C/Front in your project by right-clicking "References" on the right-hand side, under "Solution Explorer", and selecting "Add Reference". Go to the Browse tab and select Microsoft.Navision.CFront.CFrontDotNet.dll. By default this is installed in this folder: C:\Program Files\Microsoft Dynamics NAV\SDK\CFRONT\.
4) Now, add a button to your Form (View -> Toolbox, then double-click a button to add it). Also add a TextBox so we have a place to display data from C/Front. Double-click on the button to get to the code.
5) At the top of the code-window where Visual Studio has already added a number of "using"-commands, also add a line for C/Front: using Microsoft.Navision.CFront;
6) All the preparations done, we are now ready to program with C/Front. Go back down to the function "private void button1_Click", and copy in the code below. Just adjust the values NavisionPath, ServerName and CompanyName to match your system:
// Declare specific C/Front variables
NAVText = Data;
textBox1.Text = NAVText.ToString();
7) Save the project. By default it will save to this folder: C:\Users\[UserName]\Documents\Visual Studio 2008\Projects\CFrontSampleVS2008\[ProjectName]
8) Finally, copy the files CFront.dll and CFrontSQL.dll from the C/Front folder (default C:\Program Files\Microsoft Dynamics NAV\SDK\CFRONT) into the project sub-folder bin\Debug
Now try to run it. Build the project first (Build -> Build [ProjectName]), and check that it builds without errors. Then run it (Debug -> Start Debugging - F5). If successful, it will display the name of the first customer on the form.
Always set NavisionPath to point to a NAV client folder of the same version as C/Front. This is where C/Front will locate essential files such as the .dll, .flf and .stx files. If you don't set this path, you are likely to get error 2 in module 1 ("The system cannot find the file specified.").
C/Front requires granule 1,800 C/FRONT. It connects just like a normal client, so it will also consume a normal user session while connected.
Lars Lohndorf-Larsen (Lohndorf )
Microsoft Dynamics UK
Dynamics NAV 2009 contains a new subsystem for dealing with web services. This feature has been well received by partners and customers alike. Partners have expressed interest in having web services available for earlier versions of Dynamics NAV. This feedback resulted in a technology talk at Directions 2007 in Florida, where the topic was what could be done today. The conclusion of the talk was that everything we were intending to deliver was already possible today: yes, some code is needed, but strictly from a function/feature perspective all of it is possible, and it is not even all that ugly. Dynamics NAV 2009 will provide out-of-the-box programmatic web service access to the application and will therefore remove the need for the additional technology plumbing described here.
I have to say that the response to my talk has been tremendous. After the response to my talk on Web Services in NAV 5.0 and previous versions I decided to write this blog post and make the source files available.
This post is about how to bridge the gap between the need for web services now and the current platform. It will help you understand how you can provide web services directly from Dynamics NAV, in a "simple" and flexible way, already today.
To work with the samples in this post you will need Visual Studio 2005, Dynamics NAV 5.0 and .NET 3.0 installed on your system. This sample should work on Dynamics NAV 4.0 too, but has not been tested on that version.
The system we will build contains 4 different components/moving parts: Web Service Listener, Event Dispatcher, Codeunit Eventhandler and XMLPort for stream handling.
Any client that understands how to communicate with Web Services; like InfoPath, Visual Studio, SharePoint or any custom application written by you.
Is the physical communication port that the WCF listens to.
Defines the data contracts and service contracts for the Web Service, it also implements the concrete service and opens for listening in the WCF subsystem, it then delegates the requests to the COM Event Dispatcher component.
This component provides the hookups for Dynamics NAV, both to activate the service and to register event sinks. It defines 2 IDispatch interfaces the IServiceEvents and the IWebServiceListner, as well as the concrete implementation of the IWebServiceListner in the WebServiceListner class that provides the actual code for hooking up the WCF Web Service to Dynamics NAV.
We are using the CLR runtime for writing our Web Service component and our COM plugin. Some of this blog entry is about interop between Dynamics NAV and .NET through COM.
Is responsible for starting up the WCF web service through the COM interface; it then registers for events coming from the WCF web service component. The events are routed to the XMLPort for processing.
It deals with the actual business logic and data coming from or going to the Web Service.
The implementation is in 2 programming languages: C# and C/AL.
Please take a look at the provided code sample, for the rest of the information contained in the posting. It can be found here: http://code.msdn.microsoft.com/nav/Release/ProjectReleases.aspx?ReleaseId=896
I have included comments in the code that should explain what is going on. If you feel something is missing, first look at the WCF documentation, or post a comment to this post and I will try to answer it.
To deploy the sample you will first have to download and unpack it.
Then open it with Visual Studio and compile.
Then import the codeunit.txt and xmlport.txt into your NAV installation and compile those objects, starting with the XMLPort.
To run the service simply open the Object Designer in NAV, find the Codeunit that you just imported and press run.
There is no dependency on IIS or other external components. No further deployment steps should be needed.
In the Visual Studio solution there is a ConsoleTestApp project. After you have followed the steps above you can run that project; it will test whether your install was successful, as well as provide a sample of how to use the web service.
In this sample I'm using an XMLPort to handle the XML stream that is provided.
You can take many different approaches to this and still reuse large parts of the code provided in the sample.
To use the XMLPort as handler you will have to set the encoding property to UTF-8. This is due to a null termination bug in the stream handler in NAV.
With this approach you can, already today, incorporate web services in your projects in a straightforward way.
The appropriate usage is whenever you need to give an external application access to Dynamics NAV data or business processes.
For any questions or comments, please feel free to ask them in the comment section of this blog post. I will answer questions to the best of my ability in the comments section as well.
One last thing: this is sample code. It has not been tested; you should test it thoroughly before use.
Best regards,Kris Rafnsson
Bulk Insert is a new feature in NAV 5.0 SP1 which is designed to improve performance when inserting multiple records, by delaying the physical inserts on SQL Server.
Before this change, inserts happened in the order that the C/AL code ran. With Bulk Insert, the NAV client delays the actual inserts until the last possible moment in the transaction. This means that tables get locked later, which reduces the amount of time that a table is locked.
"The last possible moment" means that the inserts will take place just before COMMIT, or when you use the return value of the INSERT command (IF Rec.INSERT THEN;). NAV also has to do the inserts if you run any MODIFY, DELETE or FIND methods on the table. So, to take full advantage of this feature, you have to consider this when designing a NAV process.
Note: This feature has nothing to do with the Transact SQL command "BULK INSERT". The Bulk Insert all happens on the client. It is not using any special SQL Server features.
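Conceptually, the client-side buffering can be modelled like this. The sketch below is a toy model in Python, not NAV code: inserts are queued, and the queue is flushed just before a commit or before any read on the same table.

```python
class BulkInsertBuffer:
    """Toy model of NAV 5.0 SP1 client-side bulk inserts:
    INSERTs are buffered and only sent to the server when the
    transaction commits or when the table is read or modified."""

    def __init__(self):
        self.pending = []   # buffered INSERT statements
        self.sent = []      # statements actually sent to SQL Server

    def insert(self, record):
        self.pending.append(record)     # delayed: no SQL is sent yet

    def flush(self):
        self.sent.extend(self.pending)  # the physical inserts happen here
        self.pending.clear()

    def find(self):
        self.flush()                    # a FIND forces the buffered inserts
        return list(self.sent)

    def commit(self):
        self.flush()                    # "last possible moment": just before COMMIT

buf = BulkInsertBuffer()
buf.insert("G/L Entry 1")
buf.insert("G/L Entry 2")
assert buf.sent == []                   # nothing sent yet: no table lock from these
buf.commit()
assert buf.sent == ["G/L Entry 1", "G/L Entry 2"]
```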
The pseudo-code below illustrates how you will see the effect of Bulk Insert in a SQL Profiler trace. Without Bulk Insert, the SQL updates may look like this:
INSERT INTO "NAV5"."dbo"."CRONUS International Ltd_$G_L Entry"
SELECT TOP 1 FROM "NAV5"."dbo"."CRONUS International Ltd_$Cust_ Ledger Entry"
SELECT TOP 1 FROM "NAV5"."dbo"."CRONUS International Ltd_$G_L Register"
With Bulk Insert, the same C/AL code would look like this:
So, you will see exactly the same SQL updates. But with Bulk Insert, the inserts will be accumulated and run at the end of the process.
Lars Lohndorf-Larsen (Lohndorf),
I spent last week performing a Microsoft Dynamics NAV 5.0 Update 1 to Microsoft Dynamics NAV 5.0 SP1 database conversion for a customer. The process went very smoothly and the customer was very excited about the increase in performance we were able to achieve.
Just as an illustration of our success, the customer ships hundreds of packages a day. This process has been very painful in the past. When the conversion was completed and a little tuning was done, we were able to achieve the following improvements …
· Create a Shipment --> Reduced by 27%
· Create a Pick --> Reduced by 15%
· Register a Pick --> Reduced by 54%
· Post a Shipment --> Reduced by 53%
Now, I mentioned a little tuning was necessary so let me elaborate on that. SP1 seems to highlight poor filtering used for the FlowFields. The tuning that I had to do amounted to determining which CALCSUMS were not performing well. I did this by using the Client Monitor in the Performance Toolkit. I was then able to examine the filters that were applied and determine what a more appropriate key would be. I added the additional key to the table along with the appropriate SumIndexField(s). This made all the difference for this customer.
This customer was also plagued by a large number of locking/blocking issues. After applying SP1, with the Bulk Inserts and Indexed Views, the locking/blocking issues became almost non-existent.
Some other pieces of information that might be helpful:
· Log file size – during the conversion, the log file did NOT grow at all.
· Database Size – when the conversion was done, this particular customer was able to get back approximately 10 GB of free space in the database out of a 65 GB database.
· Conversion Time – it took between 40-60 seconds per GB, depending on what hardware was used.
· Indexed Views – do NOT add any additional indexes to the Indexed Views. When you design the table that the Indexed View is associated with, the view will be dropped and recreated, and any additional indexes that you might have added will be dropped as well.
Robert Miller (rmiller)
One of the major changes in Microsoft Dynamics NAV version 5 SP1 (in relation to performance on SQL Server) is a new way of sending queries to SQL Server. In previous versions of NAV, we sometimes saw SQL Server 2005 optimizing query plans for extreme parameter values, which - when re-used from cache for queries with other parameter values - could cause long response times. This behaviour is described in more detail in KB 935395 on PartnerSource (login required). Some of the updates for NAV version 4 introduced new features to give better control of the query plans that SQL Server makes, such as index hints and the Recompile option.
SP1 for NAV version 5 has restructured the way that queries are sent to SQL Server, with the aim that SQL Server will now make query plans that are optimized for average parameter-values rather than extreme parameter-values. It is also a method which lets SQL Server make the plan, without forcing it in a certain direction with index hints or recompile-option.
Before NAV SP1, a typical query could look like this:
exec sp_cursoropen @p1 output,N'SELECT * FROM "Demo Database NAV (5-0)"."dbo"."CRONUS International Ltd_$G_L Entry" WHERE (("G_L Account No_"=@P1)) AND "G_L Account No_"=@P2 AND "Posting Date"=@P3 AND "Entry No_">=@P4 ORDER BY "G_L Account No_","Posting Date","Entry No_" ',@p3 output,@p4 output,@p5 output,N'@P1 varchar(20),@P2 varchar(20),@P3 datetime,@P4 int','1105','','1753-01-01 00:00:00:000',0
Notice the parameter-values in bold ('1105','','1753-01-01 00:00:00:000'). SQL Server will make a query plan based on running the query with these parameter-values. It will then cache this plan, and use the same plan for other queries which are identical, but have different parameter-values.
In SP1, the query above will look like this:
declare @p1 int
declare @p5 int
declare @p6 int
exec sp_cursorprepare @p1 output,N'@P1 varchar(20),@P2 varchar(20)',N'SELECT * FROM "W1500SP1RTM"."dbo"."CRONUS International Ltd_$G_L Entry" WHERE (("G_L Account No_"=@P1)) AND "G_L Account No_">@P2 ORDER BY "G_L Account No_","Posting Date","Entry No_" ',1,@p5 output,@p6 output
select @p1, @p5, @p6
declare @p1 int
And then another query:
So before, we had one query. In SP1 we have two! So what's the benefit of that?
If you look at the first query from SP1, notice that it is a sp_cursorprepare statement, and not sp_cursoropen. So the actual query is not run at this point. More importantly, the first query does not contain the parameter-values. This is the query for which SQL Server makes the query plan. Not having the parameter-values, SQL Server makes the plan based on its statistics about the data in the table. Only after this, NAV then executes the statement in the second query (sp_cursorexecute).
This method guarantees that SQL Server's query plan will not be affected by the parameter values. It means that sometimes, SQL Server is prevented from making the optimum query plan for a certain set of parameter values. But remember that the query plan will be re-used for other parameter values. So at the expense of a few highly optimized queries, the method will give well-optimized queries with better consistency.
Another cost of this method, is of course that now NAV sends 2 queries instead of 1, requiring an extra roundtrip to SQL Server. But this only happens the first time the query is run. If the same query is run again, NAV will only run the second query (sp_cursorexecute).
One side effect of this is, that tracing a query in SQL Profiler becomes different. With SP1 you will see a lot of queries like the second one above, which does not show what NAV is actually doing. Take a look at the second query again:
exec sp_cursorexecute 1073741861,@p2 output,@p3 output,@p4 output,@p5 output,'1110',''
With SP1 you will see a lot of queries like this, and then wonder what the actual query is. To find out, you need to use the cursor ID, and then find the original sp_cursorprepare statement, which will contain this line:
and then the actual query.
In summary, this method is designed to give consistently good overall performance, and to avoid the sudden drops in performance that could result from cached query plans on SQL Server.
Lars Lohndorf-Larsen (Lohndorf)Escalation Engineer
This is a follow-up to the post "How to write a simple XML document". The other post contains a few more background details which I won't repeat here. So even if you only want to read XML documents, you may want to have a look there anyway.
But let's get straight to the point. This is how you can read an XML Document from Microsoft Dynamics NAV, using MSXML DOM:
1. Create a new codeunit.
2. Declare these 5 new global variables:
XMLDoc Type = Automation 'Microsoft XML, v4.0'.DOMDocument
DOMNode Type = Automation 'Microsoft XML, v4.0'.IXMLDOMNode
XMLNodeList Type = Automation 'Microsoft XML, v4.0'.IXMLDOMNodeList
CurrentElementName Type = Text 30
i Type = Integer
3. Initialize the DOM document:
CREATE(XMLDoc);
XMLDoc.async(FALSE);
XMLDoc.load('C:\XML\MyXML.xml');
4. Set up a loop to browse through the nodes in the document:
XMLNodeList := XMLDoc.childNodes;
FOR i := 0 TO XMLNodeList.length - 1 DO BEGIN
  DOMNode := XMLNodeList.item(i);
  ReadChildNodes(DOMNode);
END;
You also have to create ReadChildNodes as a function with one parameter: CurrentXMLNode, Type = Automation 'Microsoft XML, v4.0'.IXMLDOMNode.
The reason for using a function is to enable you to call it recursively. This is useful because you don't know how many child nodes there are, and there can easily be a child node inside another child node.
So create the function ReadChildNodes. The function has one parameter: CurrentXMLNode : Automation "'Microsoft XML, v4.0'.IXMLDOMNode"
And a number of local variables:
TempXMLNodeList Automation 'Microsoft XML, v4.0'.IXMLDOMNodeList
TempXMLAttributeList Automation 'Microsoft XML, v4.0'.IXMLDOMNamedNodeMap
j Integer
k Integer
The function checks to see what type of node it was called with. A node can be an element, text (data), attribute etc. The node property .nodeType tells you which it is. See the documentation (included in MSXMLSDK) for a complete list of possible nodeTypes.
This is the function:
ReadChildNodes(CurrentXMLNode : Automation "'Microsoft XML, v4.0'.IXMLDOMNode")
CASE FORMAT(CurrentXMLNode.nodeType) OF
  '1': // Element
    BEGIN
      // Global variable CurrentElementName to keep track of what node
      // we are currently processing
      CurrentElementName := CurrentXMLNode.nodeName;

      // Process attributes
      // If the element has attributes, then browse through those.
      TempXMLAttributeList := CurrentXMLNode.attributes;
      FOR k := 0 TO TempXMLAttributeList.length - 1 DO
        ReadChildNodes(TempXMLAttributeList.item(k));

      // Process child nodes
      TempXMLNodeList := CurrentXMLNode.childNodes;
      FOR j := 0 TO TempXMLNodeList.length - 1 DO
        ReadChildNodes(TempXMLNodeList.item(j));
    END;
  '2': // Attribute
    BEGIN
      MESSAGE(CurrentElementName + ' Attribute : ' + FORMAT(CurrentXMLNode.nodeName) + ' = ' + FORMAT(CurrentXMLNode.nodeValue));
    END;
  '3': // Text
    BEGIN
      MESSAGE(CurrentElementName + ' = ' + FORMAT(CurrentXMLNode.nodeValue));
    END;
END;
Now, try to run the codeunit against any XML document, and it will give a MESSAGE for each node and each attribute. This codeunit can be used for scanning any XML document from beginning to end.
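The same recursive walk can be expressed with Python's xml.dom.minidom, whose DOM API closely mirrors MSXML. This is a parallel sketch, not a translation of the codeunit; it collects the messages in a list instead of showing dialogs:

```python
from xml.dom.minidom import parseString

def read_child_nodes(node, element_name=""):
    """Recursively visit elements, attributes and text nodes,
    like the ReadChildNodes function in the codeunit."""
    lines = []
    if node.nodeType == node.ELEMENT_NODE:           # nodeType 1
        element_name = node.nodeName
        if node.attributes:                          # attributes first
            for i in range(node.attributes.length):
                lines += read_child_nodes(node.attributes.item(i), element_name)
        for child in node.childNodes:                # then child nodes
            lines += read_child_nodes(child, element_name)
    elif node.nodeType == node.ATTRIBUTE_NODE:       # nodeType 2
        lines.append(f"{element_name} Attribute : {node.nodeName} = {node.nodeValue}")
    elif node.nodeType == node.TEXT_NODE:            # nodeType 3
        lines.append(f"{element_name} = {node.nodeValue}")
    return lines

doc = parseString("<Order No='1001'><Amount>10.5</Amount></Order>")
for line in read_child_nodes(doc.documentElement):
    print(line)
```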
Now, let's say you are looking for specific nodes in an XML document. As an example, take this document:
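For reference, a minimal document with the structure the examples below assume (Order elements under a Document root, each with Line/Amount children; the exact values are illustrative):

```xml
<?xml version="1.0"?>
<Document>
  <Order No="1001">
    <Line><Amount>100.50</Amount></Line>
    <Line><Amount>25.00</Amount></Line>
  </Order>
  <Order No="1002">
    <Line><Amount>10.00</Amount></Line>
  </Order>
</Document>
```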
One of the good things in MSXML DOM is, that it reads the whole document into memory, so you can jump between nodes, and browse them, and select sub-sets based on node names. So you do not have to read a document from top to bottom if you know exactly which parts of the document you are interested in.
First, we can count how many orders the document contains (see how many "Order"-elements are under the root-element):
OnRun()
CREATE(XMLDoc);
XMLDoc.async(FALSE);
XMLDoc.load('C:\XML\Order.xml');
XMLNodeList := XMLDoc.getElementsByTagName('Document/Order');
MESSAGE('The document has %1 elements called Order',XMLNodeList.length);
This will open the XML document, then use the getElementsByTagName-function to get all elements that match the tag name. Note that the tag name you specify is case sensitive (like everything else in XML)! So in this case, it would not find elements called "order".
Once you have the XMLNodeList, then you can go through the nodes in that list, using the first example.
Finally, an example which will read the XML document shown above, and show an Amount-total for each order:
XMLNodeList := XMLDoc.getElementsByTagName('Document/Order');
FOR i := 0 TO XMLNodeList.length - 1 DO BEGIN
  NumberDecTotal := 0;
  DOMNode := XMLNodeList.item(i);
  XMLNodeList2 := DOMNode.selectNodes('Line/Amount');
  FOR j := 0 TO XMLNodeList2.length - 1 DO BEGIN
    DOMNode2 := XMLNodeList2.item(j);
    NumberAsText := FORMAT(DOMNode2.text);
    EVALUATE(NumberDec,NumberAsText);
    NumberDecTotal := NumberDecTotal + NumberDec;
  END;
  MESSAGE('Order has total amount of %1.',NumberDecTotal);
END;
This example uses DomNode.SelectNodes, which will select all matching nodes. If you are looking for an individual node, then you can use DomNode.selectSingleNode which will only return one node (the first that matches).
In an XML document, everything is Text. So you have to convert to numbers where needed. Final note: Make sure to make your code more robust than this! What if Amount contains a character? What if an element is missing? If you don't take these things into consideration, then your code can fail with very unfriendly errors.
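The defensive checks described above can be sketched in Python (an illustration using xml.dom.minidom rather than MSXML): skip missing elements and reject non-numeric text instead of failing with an unfriendly error.

```python
from xml.dom.minidom import parseString

def order_total(order):
    """Sum Line/Amount values defensively: missing or non-numeric
    amounts are skipped instead of raising an error."""
    total = 0.0
    for line in order.getElementsByTagName("Line"):
        amounts = line.getElementsByTagName("Amount")
        if not amounts or not amounts[0].firstChild:   # element missing or empty
            continue
        text = amounts[0].firstChild.nodeValue.strip()
        try:
            total += float(text)                       # everything in XML is text
        except ValueError:                             # e.g. Amount contains a character
            continue
    return total

doc = parseString(
    "<Order><Line><Amount>10.5</Amount></Line>"
    "<Line><Amount>oops</Amount></Line>"
    "<Line/></Order>")
print(order_total(doc.documentElement))   # 10.5: bad and missing amounts are skipped
```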
These postings are provided "AS IS" with no warranties and confer no rights. You assume all risk for your use.
This post is for when you need to export data in the form of XML from Microsoft Dynamics NAV. If you are not familiar with how to program with MSXML DOM, it can take a lot of time and work to create even a simple XML document. The purpose of this post is to show a simple example, and to get you started programming with MSXML DOM from NAV.
I have split this into two posts: "How to write" (this post) and "How to read" an XML document, since the methods and functions used are quite different.
Why not just use XML-Ports? You can, of course, but then you limit yourself to the functionality and limitations of XML-Ports. One example of a limitation is that XML-Ports do not support namespaces. If you think that one day you may need functionality that XML-Ports do not support, so that you would have to use the MSXML DOM anyway, then consider just using the MSXML DOM from the start. It is more complex to begin with, but once you have a set of functions to add, search for, and otherwise manipulate elements in an XML document, it does not have to be much more complex than using XML-Ports. Note that it is not possible to access an XML document via the MSXML DOM from an XML-Port while the XML-Port is accessing it. The two methods are mutually exclusive.
An XML document is organized as a hierarchy of elements, and everything (elements and data) gets added piece by piece to existing elements.
Let's get started with the simplest possible XML document:
Create a new codeunit with three variables:

Name         DataType    Subtype
XMLDoc       Automation  'Microsoft XML, v4.0'.DOMDocument
DOMNode      Automation  'Microsoft XML, v4.0'.IXMLDOMNode
DOMTextNode  Automation  'Microsoft XML, v4.0'.IXMLDOMText
In this example I use MSXML version 4, but you can use version 5 or 6 if you prefer. Version 4 is mostly used because it most likely contains all the functionality you will need, and it will be backward compatible to more machines - even old ones.
Initialize the document: CREATE(XMLDoc); XMLDoc.async := FALSE;
Unless you have specific reasons not to, set async = FALSE, especially when you are reading an XML document. It means that the whole document is loaded into memory before you start reading through it. If you don't set this property, you may begin to process the document before it is completely loaded into memory.
Now create a new node, and then add it to the document:
DOMNode := XMLDoc.createNode(1,'NodeName',''); XMLDoc.appendChild(DOMNode);
Then add some data. This is done in the same way as you add an element: create the element first, and then add it to the right place in the document:
DOMTextNode := XMLDoc.createTextNode('NodeValue'); DOMNode.appendChild(DOMTextNode);
You can add any additional elements like this: create the element of the type that you need (node, text, attribute, etc.), and then append it to the document.
To see the document, save it to disk: XMLDoc.save('C:\XML\MyXML.xml');
Note: If you run on Windows Vista, it might not allow you to save files in the root of the C: drive, so create a new folder first, for example C:\XML\
Then run this codeunit. You should get a very simple document like this: <NodeName>NodeValue</NodeName>
If you want to add further nodes to this document then repeat the steps above to create a new node, then append it to an existing node. For example, to add a new node:
Create a new variable: NewDOMNode : 'Microsoft XML, v4.0'.IXMLDOMNode
Initialize the new node: NewDOMNode := XMLDoc.createNode(1,'NewNode',''); DOMNode.appendChild(NewDOMNode);
And add text to the new node: DOMTextNode := XMLDoc.createTextNode('NewNodeValue'); NewDOMNode.appendChild(DOMTextNode);
Now your XML document will look like this: <NodeName>NodeValue<NewNode>NewNodeValue</NewNode></NodeName>
Adding namespace and Attributes:
You can add a name space either to the whole document, or to individual nodes. To add a name space to the document, go back to the relevant line above, and specify the 3rd parameter:
DOMNode := XMLDoc.createNode(1,'NodeName','MyNameSpace');
Finally, adding an attribute to a node is similar to adding a text element: Initialize the Attribute, then add it to the relevant node. Use the function SetNamedItem to add attributes. Let's add an attribute called ID with the value 10000:
Create a new Automation variable: XMLAttributeNode : 'Microsoft XML, v4.0'.IXMLDOMAttribute
Initialize the attribute, then add it to the node: XMLAttributeNode := XMLDoc.createAttribute('ID'); XMLAttributeNode.value := '10000'; DOMNode.attributes.setNamedItem(XMLAttributeNode);
The whole codeunit should now look like this:
// Initialize the document
CREATE(XMLDoc);

// Create a node and attach it to the document, using the name space MyNameSpace
DOMNode := XMLDoc.createNode(1,'NodeName','MyNameSpace');
XMLDoc.appendChild(DOMNode);

// Add data (text) to the node
DOMTextNode := XMLDoc.createTextNode('NodeValue');
DOMNode.appendChild(DOMTextNode);

// Add an attribute to the node
XMLAttributeNode := XMLDoc.createAttribute('ID');
XMLAttributeNode.value := '10000';
DOMNode.attributes.setNamedItem(XMLAttributeNode);

// Initialize a new node
NewDOMNode := XMLDoc.createNode(1,'NewNode','');
DOMNode.appendChild(NewDOMNode);

// Add text to the new node
DOMTextNode := XMLDoc.createTextNode('NewNodeValue');
NewDOMNode.appendChild(DOMTextNode);

// Finally, save the document for viewing
XMLDoc.save('C:\XML\MyXML.xml');
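For comparison, the whole build-and-save flow maps almost one-to-one onto Python's standard xml.dom.minidom. This is a hedged analogue for illustration only; the names and the namespace are the same examples used above:

```python
from xml.dom import minidom

# Initialize the document (the analogue of CREATE(XMLDoc)).
doc = minidom.Document()

# Create the root node and attach it to the document, declaring the
# name space MyNameSpace on it.
root = doc.createElement('NodeName')
root.setAttribute('xmlns', 'MyNameSpace')
doc.appendChild(root)

# Add data (text) to the node.
root.appendChild(doc.createTextNode('NodeValue'))

# Add an attribute called ID with the value 10000.
root.setAttribute('ID', '10000')

# Initialize a new node and add text to it.
child = doc.createElement('NewNode')
root.appendChild(child)
child.appendChild(doc.createTextNode('NewNodeValue'))

# Serialize; doc.writexml(file_handle) would save it to disk instead.
print(doc.toxml())
```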
MSXML is documented further in the MSXML Software Development Kit (MSXML SDK). I recommend downloading it so that you have online help available for all of the MSXML DOM functions.
These postings are provided "AS IS" with no warranties and confer no rights. You assume all risk for your use.
Microsoft Dynamics UK, Microsoft Customer Service and Support (CSS) EMEA
It is with great pleasure that I can announce the release of Microsoft Dynamics NAV 5.0 SP1. Today the first country versions become available on PartnerSource, and I would really like you to pay attention to the following two web sites on PartnerSource:
Microsoft Dynamics NAV 5.0 SP1 General information: https://mbs.microsoft.com/partnersource/products/navision/newsevents/news/50sp1.htm
Microsoft Dynamics NAV 5.0 SP1 Download Information: https://mbs.microsoft.com/partnersource/downloads/releases/MicrosoftDynamicsNAV50SP1.htm
March 28, 2008: Release in United States, Germany and Denmark. This is also the release date for the W1 version of Microsoft Dynamics NAV 5.0 SP1.
May 7, 2008: Release in Australia, New Zealand, Canada (EN+FR), France, India, Indonesia, Ireland, Italy, Malaysia, Netherlands, Philippines, Singapore, Spain, Thailand and United Kingdom.
June 17, 2008: Release in Austria, Belgium (NL+FR), Finland, Iceland, Mexico, Norway, Sweden, Russia, Czech Republic, Slovakia, Estonia, Hungary, Latvia, Lithuania, Poland, Portugal, Switzerland (FR+IT+DE), Bulgaria, Croatia, Romania and Slovenia.
SQL Server 2005 keeps (some) history of how often, and for how long, blocking occurred on indexes. This blog contains a query against this information to help identify the tables on which blocking happens most. The query shows accumulated numbers (no. of blocks, wait time, etc.) since SQL Server was last restarted.
The query does have some known inaccuracies. Also, the result needs to be interpreted: it may not point directly at which indexes need changing or disabling. See below for details about this.
The query uses Dynamic Management Views, which means it only works on SQL Server 2005 and later.
Make sure to run the query in the NAV database, otherwise you won't see the table names.
So, here it is:
--order by No_Of_Blocks desc
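For reference, a query of this shape can be written against sys.dm_db_index_operational_stats, the DMV that holds these counters. The sketch below is based on the documented DMV columns and is not necessarily the exact query from this post:

```sql
-- Sketch: accumulated blocking per index since SQL Server was last restarted.
-- Run this in the NAV database so OBJECT_NAME resolves to NAV table names.
SELECT
    OBJECT_NAME(os.[object_id])                       AS Table_Name,
    i.name                                            AS Index_Name,
    os.row_lock_wait_count + os.page_lock_wait_count  AS No_Of_Blocks,
    os.row_lock_wait_in_ms + os.page_lock_wait_in_ms  AS Block_Wait_Time
FROM sys.dm_db_index_operational_stats(DB_ID(), NULL, NULL, NULL) AS os
JOIN sys.indexes AS i
    ON i.[object_id] = os.[object_id] AND i.index_id = os.index_id
WHERE os.row_lock_wait_count + os.page_lock_wait_count > 0
ORDER BY No_Of_Blocks DESC;
```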
Inaccuracies: No_Of_Blocks is recorded accurately, but Block_Wait_Time is not. SQL Server only records Block_Wait_Time when the block is a clear page lock or row lock; it will not record wait time in the case of a range lock, which is common in NAV. Also, Block_Wait_Time only gets recorded when a transaction completes, so if a transaction is aborted after the block (for example by a lock timeout), the Block_Wait_Time for that transaction is not counted. This means that the real Block_Wait_Time is likely to be higher, and it may be distributed on different tables/indexes than the query shows. Still, I hope the query is accurate enough to give a good idea about where blocking occurs.
How to interpret the result: The query shows blocking per index, but you should not attach too much importance to the individual index that shows blocking. Instead, look for other indexes in that table which are not used. Remember, an update on index X may require an update on all other indexes in the table, so look at the whole table when deciding whether the table is over-indexed. For this, use the query "Index Usage", which shows you the usage of each index in a table.
Blocking is recorded on the first table on which a process is blocked. A process may begin by handling small tables while the real delay comes from later, long-running processing of other tables. In that case, the query will show the blocking on the first table, but not on the later tables, which may be where the real problem is.
I will be more than happy to receive any feedback on experiences with this query, and suggestions for how to improve it!
We get a couple of questions asked quite regularly: “How can I make my ERP system SEPA compliant” and a more basic one “What is SEPA? And how does this affect my daily work?” With this blog entry I’ll try to answer these questions.
SEPA is short for Single Euro Payments Area, which was set up by the European Commission as a political agreement to reinforce the Economic and Monetary Union. The goal is to form a single market for payment transactions within the European Economic Area. One of the ways to do this is to create a set of standards for electronic payments using the ISO 20022 XML format, together with a set of rules and guidelines for how euro payments must be handled. With SEPA fully implemented, all electronic payments in euro within the SEPA area will be regarded as domestic payments, even if they are cross-border payments.
With a common standard it will be easier to process payments, which will lead to more efficiency in the banking system and hopefully lower prices. It will also make it possible for companies that have business activities in many countries to only interact with one bank, since the cross-border payment is assumed to be the same as the domestic payment. The long term vision is that this standard can expand and also be used in other areas such as e-invoicing and e-reconciliation.
The SEPA implementation is still in its early stages. In January 2008, the SEPA instruments for Credit Transfer became available for use. The Credit Transfer standard will initially be used for interbank relationships with regard to cross-border euro payments within the SEPA countries. Banks that describe themselves as SEPA compliant are referring to this fact, meaning that they are able to create these kinds of transfers. It is important to note that, right now, banks are only required to use this format when dealing with each other; customers are not required to adhere to these standards in either customer-to-customer or customer-to-bank situations. It is also important to note that SEPA only covers euro transactions, not other currencies.
Looking forward, it is expected that national instruments for credit transfers, direct debits and cards will be replaced by the relevant SEPA instruments by 2010, and that customers will be able to use the standards for creating electronic payments.
This will affect Microsoft Dynamics NAV customers that have banking relationships within Europe. In the short term there will not be a big difference: customers will not be required to use the SEPA standard when they interact with their bank, and only a few banks can accept the format today. It will be possible to ask the bank for a SEPA Credit Transfer, but it will be the bank that creates the file in the proper format. It is anticipated that many of the banks within the SEPA region will be able to receive a SEPA Credit Transfer in the near future, but a transition period is expected in which they also accept the old electronic formats.
One thing to prepare for is that both the Bank Identifier Code (BIC) and the International Bank Account Number (IBAN) are a crucial part of the SEPA standard; these two identifiers are needed to process a SEPA Credit Transfer. This means they will have to be correct when an electronic payment is created, which is something customers can prepare for by requiring that these fields are filled in and updated when a customer or a supplier is created or updated in Microsoft Dynamics NAV.
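One practical way to catch bad IBANs early is the standard ISO 13616 mod-97 check digit validation, sketched below in Python. This is an illustrative helper, not part of Microsoft Dynamics NAV:

```python
def iban_is_valid(iban: str) -> bool:
    """ISO 13616 mod-97 check: a structurally valid IBAN passes this test."""
    iban = iban.replace(' ', '').upper()
    if not iban.isalnum() or len(iban) < 5:
        return False
    # Move the country code and check digits to the end...
    rearranged = iban[4:] + iban[:4]
    # ...convert letters to numbers (A=10 ... Z=35)...
    digits = ''.join(str(int(c, 36)) for c in rearranged)
    # ...and the IBAN is valid when the resulting number mod 97 equals 1.
    return int(digits) % 97 == 1

print(iban_is_valid('GB82 WEST 1234 5698 7654 32'))  # True: a known-valid example IBAN
```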
One question that we also hear and want to comment on is “If I want to make a SEPA payment, am I required to create an ISO 20022 compliant XML file”? The answer to this is both Yes and No:
There are many places on the internet where you can get more information on what SEPA is and what it means to businesses in the EU and to banks. Many banks are also aware of the changes to come and the effect they will have on their customers. I hope that the points above indicate that SEPA is an interesting new standard that can potentially make interactions between Microsoft Dynamics NAV and banks more effective. From our perspective, a common standard will make it possible for us to create bank integration that can serve more than one country or one bank, provided that the banks stick to one schema! In the short term, the thing to keep in mind is that you can prepare by updating the data on suppliers and customers, so that a SEPA compliant credit transfer can be created when it is required.
Some facts around SEPA
From the European Payment Council
From the European Central Bank
From the European Commission
- Rikke Lassen
Greetings from the Microsoft Dynamics NAV User Education team. We are the team responsible for application and platform documentation, including online Help, the Application Designer’s Guide, and installation and configuration manuals. You’ve made some complaints, and we’ve been listening. Your feedback says that it’s difficult to find the Help information you need, and that the content that’s there is “superficial” and “inadequate.” As the team’s new Content Architect, I have been working with UE writers, editors, and managers to make sure you have a better experience with the next release. In this post, I want to share a couple of the changes we’re making to the NAV docs for the next version: describing business and development processes, and establishing better discoverability of information.
How Do I…
Internally, we’ve been calling one new topic type, “How Do I” topics, because they’re written to answer the question, “How do I perform this process?” Here’s a draft of one of these topics, which basically answers the question, “How do I create new vendor accounts?”
The links at the top of the topic indicate where you are in the larger process – in this case, you are creating new vendor accounts as part of the section Configure Purchase Processes, in the Purchasing department. The content of the topic includes overview information on the general sequence of tasks you perform to achieve a business or development goal, and the table then describes and links to topics that help you complete that goal. For application documentation, the processes are based on the customer model (http://www.microsoft.com/dynamics/product/familiartoyourpeople.mspx).
Walkthrough topics provide end-to-end processes comprising two or more tasks. Using the CRONUS International Ltd. demo company, the walkthrough tasks enable you to learn the steps involved in a process before you perform them using your own data. Application walkthroughs provide the beginning-to-end steps for processes like tracking sales campaign results or calculating work in progress for a job. Development walkthroughs step you through processes like designing a customer sales order report.
In coming weeks, we will begin sharing specific versions of these topics with you, and hope you’ll give us feedback on how well they meet your needs, and the specific content you’d like to see us develop.
We’ve had a lot of feedback that finding information in the NAV documentation is difficult, so we’re working on improving discoverability from several angles. In previous versions, platform documentation has been delivered in nearly 20 different manuals. We’re compiling some of the most strategic of these manuals into online Help files for development, installation, and C/SIDE reference. These logical collections reduce the number of places you have to look for information, and since they’re Help files, you get the benefit of other discoverability aids like Search, an index, and a table of contents. In addition, platform documentation is going to be published in the MSDN library, for easy online access to topics.
Other discoverability improvements include:
Again, we welcome any suggestions, concerns, hopes, and fears you have regarding these plans or on the documentation as a whole. You can contact us directly at email@example.com. And if you are traveling to Convergence next week, please attend the User Education sessions we are running, on enabling partners to extend NAV Help, and on transforming forms for use in the RoleTailored client. Specific details are below.
NAV Help Extensibility Tuesday, March 11, 12:30-1:30 Room W311E
NAV 2009 Form Transformation Thursday, March 13, 12:30-1:30 Room W311E
- Michelle Fredette
Microsoft Dynamics NAV plans to support Microsoft SQL Server 2008 and Microsoft Windows Server 2008 once they are available. To ensure compatibility, Dynamics NAV development will start testing Dynamics NAV 4.0 SP3 and Dynamics NAV 5.0 SP1 on these two server platforms at the beginning of calendar year 2008.
See full announcement here.
Martin Nielander (martinni)Program Manager