Hi, everyone, I hope you are doing well and having a great day! Just this week, we cut our first release candidate (“RC1”) for NAV 2013, which means we’re very close to RTM and shipping the product to you. I can’t say enough about this product and how it’s come together. But I’ll certainly try, since it’s so much fun to talk about!
Dynamics NAV 2013 has powerful and diverse new application functionality, ranging from charting to cash flow analysis to time sheet registration to assembly to expanded out-of-the-box RoleCenters to Unicode. Have to catch my breath for a second … The user experience innovations range from the SharePoint and web clients to end-user productivity features on the rich client to improved integrations with OneNote and Excel. From all the feedback we’ve gotten, even diehard classic client fans will be on RoleCenters very soon! The runtime investments increase performance and scalability for transactions and reporting. They’re also built to run optimally on Azure as well as to improve partner-hosted performance. To improve manageability, we created a new admin console and added APIs for powerful scripting. The server also now complies with Microsoft’s rigorous Common Engineering Criteria (CEC) server “hygiene” standards. In response to your needs for developing on pages, we’ve enhanced tooling substantially around page design and debugging (and testing!). Finally, we’ve rethought rapid deployment through the new RapidStart module. That’s just off the top of my head! Perhaps more than anything, this release has been about quality: NAV 2013 exceeds every quality bar for any release of NAV we’ve had thus far. I guess that’s what I’m proudest of.
Together, this array of functionality and capability manifests our value proposition for the release and the product – Microsoft Dynamics NAV 2013 is “A Business Solution from Microsoft.”
I hope you go to PartnerSource later this month, download the product, and get to know it better. Then, go aggressively sell it to new customers and upgrade your existing customers!
Thanks to all the people who’ve made NAV 2013 happen, both externally as partners and customers providing feedback through our Technical Adopter and Beta programs and internally on the Microsoft development team. Collectively, you’ve all done an amazing job getting out a great product. It’s an exciting time to be in the NAV business and for the NAV product. From the tremendous energy at a sold-out Directions EMEA in Rome to the release of NAV 2013 this month to what I expect will happen at Directions US in Phoenix later this fall, I feel the momentum growing. Perhaps that’s why it’s such a bittersweet time for me personally. As some of you already know, after four years living in Denmark, my family and I moved back to the U.S. this summer to be closer to our extended family. As a result, after four and a half years as the General Manager of NAV R&D, the time has come to say farewell and thank you for all the incredible memories I’ve shared with you, our partners, with the Microsoft team in Vedbaek, and with the product itself. It’s been an amazing experience! Thank you all very, very much! My successor, Erik Tidén, is a person worthy of this great community, and I welcome him warmly. (I’ll let him tell his story in the next blog.)
Best of luck to you all, and I hope our paths cross again soon. Sincerely,
You might wonder what user credentials are used to run the NAS services on Microsoft Dynamics NAV 2013.
The NAS services start when Microsoft Dynamics NAV Server starts, regardless of whether any users log in to Microsoft Dynamics NAV Server.
The NAS services run under the same user account as Microsoft Dynamics NAV Server.
In a default installation the following user name is used: NT AUTHORITY\NETWORK SERVICE
Please note the security advice in the Help topic How to: Create a Microsoft Dynamics NAV Server Instance (http://msdn.microsoft.com/en-us/library/hh168936(v=nav.70).aspx): for security reasons, we recommend that you always run Microsoft Dynamics NAV Server instances under a dedicated domain user account in production installations.
The account that runs the NAS services must also be created as a Microsoft Dynamics NAV user, and that user must be assigned the permissions necessary to execute the C/AL code in the NAS services.
Example: Useful in Demo Installations
NT AUTHORITY\NETWORK SERVICE created as a NAV user and assigned the SUPER Permission set:
If you haven’t created the NAV user, Microsoft Dynamics NAV Server will exit with this information in the event log:
Server instance: DynamicsNAV70
Session type: NasSession
ID: 0
The Microsoft Dynamics NAV Server instance "DynamicsNAV70" did not start because the NAS session did not start. The user "NT AUTHORITY\NETWORK SERVICE" that is configured to start the NAS session was not identified as a user that has permissions for Microsoft Dynamics NAV. Set up a Microsoft Dynamics NAV user to run the NST, or create the NST user, such as NT AUTHORITY\NETWORK SERVICE, as a Microsoft Dynamics NAV user, and then start the NST again.
Type: Microsoft.Dynamics.Nav.Types.NavDatabasePasswordException
Message: The user ID and password are not valid. Please try again.
If the user does not have the needed permissions, Microsoft Dynamics NAV Server will exit with this information in the event log:
Session type: NasSession
ID: 1
The Microsoft Dynamics NAV Application Server session has failed. Reason:
Type: Microsoft.Dynamics.Nav.Types.NavPermissionException
Message: You do not have the following permissions on CodeUnit ApplicationManagement: Execute
I hope this helps you utilize the Microsoft Dynamics NAV 2013 NAS services. :-)
Enabling Tracing in Microsoft Dynamics NAV 2013
Microsoft Dynamics NAV 2013 has a feature that allows you to see the AL call stack for SQL commands. Here I am going to describe how it can be used to profile your application code.
There are multiple steps required to start tracing.
First, you need to open the Session List page, the same page you use to start the debugger. Start the development environment, and then go to Tools, Debugger, Debug Session. The Session List page opens.
This page contains Start Full SQL Tracing and Stop Full SQL Tracing buttons, as well as an editable SQL Tracing check box on each line.
Start Full SQL Tracing and Stop Full SQL Tracing enable or disable tracing for all new and existing sessions, while the SQL Tracing check box enables or disables tracing for an individual session.
Let’s assume we want to profile one of the sessions. Then we need to enable tracing for it, for example by clicking the check box.
Configuring SQL Profiler
The important part here is to select the appropriate events. In this case we are interested in seeing the text of the SQL statements. To achieve that, we need to enable the SP:StmtCompleted and SQL:BatchCompleted events. The setup should look like the following picture.
This allows you to see the SQL queries for all statements issued from AL.
After this if you do some operations in the client, for example open the Sales Orders page, you will see comments in SQL Server Profiler.
This is an example of what you can get.
All SQL statements between two consecutive comments correspond to the AL statement in the first comment. For example, in the previous screenshot, CALCFIELDS issues six SQL queries.
The SQL profiler will contain Get connection from the pool and Return connection to the pool comments.
These comments correspond to the events when a connection is retrieved from and returned to the Microsoft Dynamics NAV Server connection pool, respectively. They are needed to separate SQL queries issued from different clients on the same SQL connection. The SQL statements that carry these comments are issued by Microsoft Dynamics NAV Server itself and do not originate from AL.
Comments that contain only a user name also correspond to SQL statements that are issued by Microsoft Dynamics NAV Server but do not originate from AL. For example, Microsoft Dynamics NAV Server executes queries to calculate the calculated fields shown in FactBoxes. This type of comment is needed because Microsoft Dynamics NAV Server might execute such a query without returning the connection to the pool.
Filtering Your Statements
In the SQL Profiler you will see statements from different connections. This is because you can have multiple clients running, or you may have SQL Server Reporting Services or some other service enabled. It is important to filter out everything except what is coming from the client you are profiling.
To do that, for each SQL statement, find the first preceding comment with the same SPID. If this comment is Return connection to the pool, then the SQL statement did not originate from the AL code of the client being profiled.
The user name in the comment identifies the client that generated the SQL statement.
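The lookup rule above can be expressed directly in T-SQL. This is a minimal sketch that assumes the trace has already been saved into the [master].[dbo].[NAV7_Trace] table (as in the full script later in this post), that @Row is the RowNumber of the statement you are checking (the value 100 is a placeholder), and that NAV comments can be recognized as standalone /* ... */ batches:

```sql
-- For the statement in row @Row, find the nearest preceding NAV comment on the same SPID.
DECLARE @Row INT
DECLARE @Spid INT
SET @Row = 100 -- RowNumber of the SQL statement you are investigating (placeholder value)

SELECT @Spid = [SPID] FROM [master].[dbo].[NAV7_Trace] WHERE [RowNumber] = @Row

SELECT TOP 1 [RowNumber], CAST([TextData] AS NVARCHAR(4000)) AS [Preceding Comment]
FROM [master].[dbo].[NAV7_Trace]
WHERE [SPID] = @Spid
    AND [RowNumber] < @Row
    AND [TextData] LIKE '/*%' -- NAV comments arrive as standalone /* ... */ batches
ORDER BY [RowNumber] DESC
```

If the comment returned is a Return connection to the pool comment, the statement in @Row was not issued from the AL code of the client you are profiling and can be discarded.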
Collecting the Data and Analyzing
Before the profiler is started, the server should be "warmed up"; otherwise, the trace will contain a lot of queries for metadata reading. Any scenario that is going to be profiled should have been executed at least once beforehand.
After the SQL trace is collected, it should be saved into a SQL table.
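If you saved the trace to a .trc file, one way to get it into a table is SQL Server's built-in sys.fn_trace_gettable function. This is a sketch: the file path is a placeholder, the IDENTITY column approximates the RowNumber column that Profiler produces, and note that SQL Server Profiler can also save directly to a table via File > Save As > Trace Table:

```sql
-- Import a saved trace file into the table used by the analysis scripts.
-- N'C:\Traces\NAV7.trc' is a placeholder path; substitute the location of your trace file.
SELECT IDENTITY(INT, 1, 1) AS [RowNumber], *
INTO [master].[dbo].[NAV7_Trace]
FROM sys.fn_trace_gettable(N'C:\Traces\NAV7.trc', DEFAULT)
```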
Below is an example of a SQL script that finds the most expensive AL statements. In this example the trace was saved into the NAV7_Trace table in the master database.
SELECT * FROM [master].[dbo].[NAV7_Trace] --query the trace table content
DECLARE @ApplicationName NVARCHAR(100)
DECLARE @GetConnection NVARCHAR(100)
DECLARE @ReturnConnection NVARCHAR(100)
DECLARE @ContainsUserName NVARCHAR(100)
DECLARE @EmptyCallStack NVARCHAR(100)

SET @ApplicationName = 'Microsoft Dynamics NAV Service'
SET @GetConnection = '%Get connection%'
SET @ReturnConnection = '%Return connection%'
SET @ContainsUserName = '%User: Your user name%'
SET @EmptyCallStack = '/*__User: Your user name__*/'
IF OBJECT_ID('tempdb..#ProfilerData') IS NOT NULL DROP TABLE #ProfilerData
SELECT * INTO #ProfilerData FROM
(
    SELECT [RowNumber] AS [SqlStatement RowNumber], [TextData] AS [SQL Statement],
        [Reads], [Writes], [Duration], [StartTime], [EndTime], [SPID]
    FROM [master].[dbo].[NAV7_Trace]
    WHERE [ApplicationName] = @ApplicationName
        and [TextData] not like @ContainsUserName
        and [TextData] not like @GetConnection
        and [TextData] not like @ReturnConnection
        and [TextData] not like @EmptyCallStack
) SqlStatement
CROSS APPLY
(
    SELECT TOP 1 [RowNumber] AS [Stack RowNumber], [TextData] AS [Call Stack]
    FROM [master].[dbo].[NAV7_Trace]
    WHERE [SPID] = SqlStatement.[SPID]
        and [RowNumber] < SqlStatement.[SqlStatement RowNumber]
        and [ApplicationName] = @ApplicationName
        and [TextData] like @ContainsUserName
    ORDER BY [RowNumber] DESC
) AS Stack
SELECT * FROM #ProfilerData --this table contains mapping of SQL statements to the AL call stack
SELECT CAST([Call Stack] AS NVARCHAR(4000)) AS [Call Stack],
    SUM(Duration) AS [Sum Duration],
    AVG(Duration) AS [Average Duration],
    MIN(Duration) AS [Min Duration],
    MAX(Duration) AS [Max Duration],
    SUM(Reads) AS [Sum Reads],
    SUM(Writes) AS [Sum Writes]
FROM #ProfilerData
GROUP BY CAST([Call Stack] AS NVARCHAR(4000))
ORDER BY [Sum Duration] DESC -- this query finds the most expensive AL statements
The result of the previous query shows the most expensive AL calls. The second and fifth rows show the total time spent in SQL calls that were issued by the server and did not originate from AL.
You can also create a query that finds the SQL statements corresponding to a given call stack.
SELECT * FROM #ProfilerData
WHERE [Call Stack] like '%"Sales-Post"(CodeUnit 80).OnRun(Trigger) line 1556%'

SELECT * FROM #ProfilerData
WHERE [Call Stack] like '%"Gen. Jnl.-Post Line"(CodeUnit 12).InsertGLEntry line 59%'

SELECT * FROM #ProfilerData
WHERE [Call Stack] like '%"Item Jnl.-Post Line"(CodeUnit 22).ApplyItemLedgEntry line 252%'
It is also easy to create a query that counts the number of times the same call stack occurs in the trace.
SELECT COUNT(CAST([TextData] AS NVARCHAR(4000))) AS Count, CAST([TextData] AS NVARCHAR(4000))
FROM [master].[dbo].[NAV7_Trace]
WHERE [ApplicationName] = @ApplicationName
    and [TextData] like @ContainsUserName
    and [TextData] not like @GetConnection
    and [TextData] not like @ReturnConnection
    and [TextData] not like @EmptyCallStack
GROUP BY CAST([TextData] AS NVARCHAR(4000))
ORDER BY COUNT(CAST([TextData] AS NVARCHAR(4000))) DESC
Have you ever needed to replace some values in a list of values? For example, you need to change the postal/zip code for a number of your customers. Most of the time you would need to write a codeunit or report to accomplish that, or have the customer update each entry by hand. What if you needed to do the same with your vendors? That would require either modifying the previous codeunit/report, creating a new object to handle this, or again having the customer update each entry by hand.
Here is another option – a Replace page that you can add to any page with just a few lines of code.
Fig 1. Replace Page
How to implement the page:
If you import the attached object text file and follow these steps you will be able to add this to almost any page (at least where it makes sense). For this example I will be updating the Customer List (Page 22).
1. Import and compile the attached object “Replace NAV2013.txt”.
2. Design the Customer List page (Page 22).
3. Add a global variable named Replace of type Page, and point it to Page 50050 (or whatever number you may have changed it to prior to import).
4. Create a new action.
5. Go to the OnAction trigger for your new action and add the following code …
6. Compile your object and you should be ready to use it.
How it works:
Note: When you use this Replace page, keep in mind that TIME and DATETIME fields are very sensitive from a matching perspective. Even though a TIME may appear as HH:MM:SS on the page, it probably has a millisecond component as well. So, for the best results, copy the value of the cell that you want to replace and use that as the value in the “Find What” field. The same applies to DATETIME fields.
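To see why an exact match can fail, here is a quick T-SQL illustration of the same principle (the values are placeholders; the NAV page compares C/AL TIME and DATETIME values, but the precision pitfall is identical): a time value that displays as HH:MM:SS can still carry milliseconds that defeat an equality test.

```sql
DECLARE @Stored TIME(3)
SET @Stored = '10:30:15.123' -- what is actually stored (placeholder value)

-- The displayed form hides the milliseconds:
SELECT CONVERT(NVARCHAR(8), @Stored, 108) AS [Displayed]

-- An equality test against the displayed value fails because of the hidden milliseconds:
SELECT CASE WHEN @Stored = CAST('10:30:15' AS TIME(3)) THEN 'match' ELSE 'no match' END AS [Result]
```

Copying the cell value, as recommended above, carries the milliseconds along and avoids the mismatch.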
These postings are provided "AS IS" with no warranties and confer no rights. You assume all risk for your use.
Hello NAV community,
Please allow me to introduce myself: My name is Erik Tidén, and it is an honor and a privilege for me to take on the role of General Manager, Microsoft Dynamics NAV R&D. I am excited about the prospects of working with all of you who are involved with NAV. It is a great product with a community that enjoys an industry-wide reputation for passion and excellence. I consider myself fortunate to be joining this tradition and to have the opportunity to take NAV to even greater heights together with all of you.
What do I bring to the party?
You might ask what I will be bringing to the NAV community. Well, after 20 years of experience in business software solutions, I am passionate about building great software. My recent experience includes executive roles at HP and at SAP, where I was General Manager and Global Head of Custom Development. Prior to that, I held the position of Senior Vice President for SAP CRM Development, where I was responsible for the software development of SAP’s award winning CRM 7.0 product.
What do I want from the role?
Working closely with customers and partners is a big part of why I have found my past roles so rewarding. I especially enjoy working with small and medium-sized companies. They are every bit as sophisticated as large enterprises in their business requirements, yet they have the added challenges of simplicity and ultra-low TCO.
The heart of NAV
If Vedbaek is the home of NAV, then you could say that I have already entered the NAV community, having recently moved to Denmark with my wife. The heart of NAV, however, is not in one geographical location, but wherever the partners and customers work with the product around the world.
There will be many opportunities for me to meet with you, the people of the NAV community, soon, and I look forward to starting my journey at Directions US in Phoenix, Arizona, on October 27th-30th.
General Manager, Microsoft Dynamics NAV R&D
Ideally, NAV should never crash, but sometimes there are circumstances that cause one of the NAV processes or services to crash. For the user and the administrator, it is usually not clear why a component crashed, and most of the time the application event log does not show enough information to determine the root cause.
A "crash dump" provides very good information about the module and code which caused the crash. The crash dumps can be analyzed by Microsoft Dynamics NAV support and help a lot to find the problem.
With the Debug Diagnostic Tool v1.2, it is very easy to create crash dump files in case of a crash.
The tool can be downloaded from the following link: http://www.microsoft.com/en-us/download/details.aspx?id=26798
Note: If the operating system is not configured with "en-us" region and language settings, the following steps are necessary to install the Debug Diagnostic Tool v1.2:
After you install and run the application, you are welcomed by the "Rule Wizard."
For crash dump creation, choose Crash, and then choose the Next button. Next, select what you would like to monitor. For NAV, it is necessary to select "A specific process":
The next selection window shows all processes. Here, you can select, for example, "Microsoft.Dynamics.NAV.Client.exe".
With "Advanced Configuration" it is now possible to set the granularity of process monitoring.
The next screen shows the path where the crash dump is saved. You must select a hard disk with enough capacity. Crash dumps can be very large in some circumstances (e.g. if you selected "Full Userdump" in the "Advanced Configuration" screen).
In the last step, the rule must be activated:
Once the rule is activated, "Debug Diag" monitors the configured process.
In case of a crash, "Debug Diag" then creates the necessary .dmp files in the specified folder, plus additional log files in the "Debug Diag" installation folder (e.g. "C:\Program Files\DebugDiag\Logs").
I hope the described steps help you to create the important crash dump information.
Microsoft Dynamics Germany