With Microsoft Dynamics NAV 2009 R2 it is time to drill into the new features we have for you in the reporting area.
Enhanced connection with Visual Studio when editing layout (RDLC)
a. Easy refresh of dataset in Visual Studio
In NAV 2009 SP1 you had to close down Visual Studio when you made any modifications to the dataset (Section designer). With NAV 2009 R2 you can keep both the NAV Report designer and Visual Studio Layout designer open.
To see this feature in action:
1. Open a report in design mode and select “View/Layout”
2. With both the Report Designer and Visual Studio windows open, add a new field to the dataset (Section Designer)
3. Now, to activate the refresh action, you need to both save and compile. Only saving, or only compiling, the report will not activate the dataset refresh action in Visual Studio.
4. Navigate back to Visual Studio and you will see this message:
5. Select “Yes” to accept the Refresh
6. Lastly, right click ”Result” under ”DataSet”, and select Refresh
7. You will now see the newly added item in the dataset, and you can add it to your layout.
b. Better protection when closing Report Layout in Visual Studio
In NAV 2009 SP1, it was possible to accidentally close down the Report Designer in NAV and thereby leave Visual Studio with the layout open in an unsaved state. With NAV 2009 R2 it is no longer possible to close down the Report Designer in NAV without closing Visual Studio first.
To see this feature in action:
1. Open a report in design mode and select “View/Layout”
2. With both the Report Designer and Visual Studio windows open, try closing the NAV Report Designer window. You will see this message:
c. Better protection when opening report layout for design in Visual Studio
In NAV 2009 SP1 it was possible to accidentally open several versions of the same report layout in Visual Studio. This could easily cause confusion, so with NAV 2009 R2 it is no longer possible to open several versions of the layout of a report.
1. Open a report in design mode and select “View/Layout”
2. With both the Report Designer and Visual Studio windows open, select “View/Layout” again and you will see this message:
Printer Selections now available in Role Tailored client
In NAV 2009 SP1, the only way to define which reports should be printed to which printers was the Printer Selections form, as Printer Selections did not work from the Role Tailored client. So you had to have the Classic client installed to configure Printer Selections in NAV 2009 SP1. You could also have used the workaround described here: http://blogs.msdn.com/b/nav-reporting/archive/2009/10/19/printer-selections-in-role-tailored-client.aspx
With NAV 2009 R2, Printer Selections now works from the Role Tailored client.
1. Open Role Tailored client and type “Printer” in the search box
2. Select Printer Selections and you get this page:
3. To edit the list or create a new entry, select “New”. You can then create a new printer selection or edit what you have already defined:
New action images for PDF
You might have seen my blog post on how to send an e-mail with a report attached as a PDF file. If you have not seen this here is the link: http://blogs.msdn.com/b/nav/archive/2009/10/08/send-email-with-pdf-attachment-in-nav-2009.aspx
In NAV 2009 R2 we have some new icons which we can use.
So if you want to send an e-mail with a report attached as a PDF file in NAV 2009 R2, you might want to consider using the image called: “SendEmailPDF”
And if you want to save a report as a PDF file from the RoleTailored client, you might want to use the image called “SendAsPDF”
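For the save-as-PDF scenario itself, the underlying C/AL call in the RoleTailored client is REPORT.SAVEASPDF. A minimal sketch; the report ID 206 and the document number and file path are example assumptions, not fixed values:

```
// sketch: save a posted sales invoice report as a PDF file
// report ID 206, document no. and target path are examples only
SalesInvHeader.SETRANGE("No.",'103001');
REPORT.SAVEASPDF(206,'c:\Temp\Invoice.pdf',SalesInvHeader);
```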
And yes, it would be great if you could one day add your own icons to the RoleTailored client. Hopefully we will have this feature in a future version of NAV.
Recently Microsoft hosted a Hot Topic session that included the reporting features discussed in this post. It is called "Microsoft Dynamics NAV 2009 R2 Hot Topic: What's New for Developers." A recorded version of the session can be seen at the Partner Learning Center.
This is what I had to share today; I hope you will enjoy the new reporting features available in NAV 2009 R2. As always, I'm happy to get feedback about reporting features you would like us to implement in future releases. Use the Contact Form to write directly to the core reporting team, or use MS Connect to give suggestions: https://connect.microsoft.com/dynamicssuggestions
Thanks, Claus Lundstrøm, Program Manager, Microsoft Dynamics NAV
When you start a classic report from RTC (a report with no layout defined), it starts the report engine from the classic client. It can happen that after updating RTC, you end up with a version of RTC which is not compatible with the default classic client. In this case you will typically get this error message when trying to run the report:
Microsoft Dynamics NAV Classic client was opened from an untrustworthy component. Contact your system administrator.
Or if you have a test machine with multiple versions of RTC and classic you may want to start the classic client from another folder.
RTC finds the location of Finsql.exe at the following location in the registry:
So if you have any of the issues above, make sure that the classic client in this folder matches the version of RTC.
Microsoft Dynamics UK
Microsoft Customer Service and Support (CSS) EMEA
The C/AL commands DOWNLOADFROMSTREAM and UPLOADINTOSTREAM are for sending files between RTC and the NAV Server. A few times now, we have had the question: how can we use these functions without displaying the dialog box to select a file and folder name?
This is how you can automatically download and upload files without any user interactions:
The trick is to use MagicPath, as in codeunit 419 "3-Tier Automation Mgt.". MagicPath is initiated by setting the folder name to '<TEMP>' like this:

DOWNLOADFROMSTREAM(IStream,'','<TEMP>','',MagicPath);
The code example below will copy a specific file from the NAV Server to the RTC machine with no questions asked about folder or file name or anything else:
IF NOT ISSERVICETIER THEN
  EXIT;
FileToDownload := 'c:\Temp\ServerFile.txt';
FileVar.OPEN(FileToDownload);
FileVar.CREATEINSTREAM(IStream);
DOWNLOADFROMSTREAM(IStream,'','<TEMP>','',MagicPath);
MESSAGE('Path = ' + MagicPath);
Now we have the file on the RTC machine, and MagicPath tells us its location. The location will be something like this:

C:\Users\[UserName]\AppData\Local\Temp\Microsoft Dynamics NAV\4612\__TEMP__ff7c5a286cfd463f9f7d92ae5b4757e2
The number 4612 in the MagicPath comes from the Process ID of RTC.
So, what if we wanted to rename it to a specific name? We have the FILE object in C/AL, but because C/AL runs on the NAV Server and not on RTC, this won't work; the purpose of the above is exactly to copy the file to the client machine. Instead, use this automation:
'Microsoft Scripting Runtime'.FileSystemObject
Then create an instance client-side:

CREATE(FileSystemObject,TRUE,TRUE);
So, if you wanted to continue the code above and place and name the file to something specific on the client's machine, add these lines:
CREATE(FileSystemObject,TRUE,TRUE);
DestinationFileName := 'c:\Temp\newfile.txt';
IF FileSystemObject.FileExists(DestinationFileName) THEN
  FileSystemObject.DeleteFile(DestinationFileName,TRUE);
FileSystemObject.CopyFile(MagicPath,DestinationFileName);
FileSystemObject.DeleteFile(MagicPath,TRUE);
MagicPath works both ways, but with DOWNLOADFROMSTREAM it creates MagicPath for you and tells you where it is, while with UPLOADINTOSTREAM you need to know it in advance. Remember that the MagicPath location above includes the Process ID of RTC. One way could be to work that out somehow, but what I would suggest instead is to download a temporary test file first and see where MagicPath downloads it to. The path for upload will be the same:
// download a temp file to get MagicPath
FileVar.CREATETEMPFILE;
FileVar.CREATEINSTREAM(IStream);
DOWNLOADFROMSTREAM(IStream,'','<TEMP>','',MagicPath);
FileVar.CLOSE;
MESSAGE(MagicPath);
Then extract the folder name from MagicPath:
FOR i := STRLEN(MagicPath) DOWNTO 1 DO BEGIN
  IF MagicPath[i] = '\' THEN BEGIN
    MagicPath := COPYSTR(MagicPath,1,i);
    i := 1;
  END;
END;
Once you know the location of MagicPath, the next step is to copy the file you want to upload into that folder:
FileToUpload := 'newfile.txt';
FolderName := 'c:\Temp\';
IF ISCLEAR(FileSystemObject) THEN
  CREATE(FileSystemObject,TRUE,TRUE);
FileSystemObject.CopyFile(FolderName + '\' + FileToUpload,MagicPath + '\' + FileToUpload);
Then use UPLOADINTOSTREAM to upload the file from MagicPath to the NAV Server:

UPLOADINTOSTREAM('','<TEMP>','',FileToUpload,IStream);
And finally, save the InStream to a file on the server:
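That last step can be sketched like this. This is a minimal sketch assuming a File variable FileVar, an OutStream variable OStream, and the use of COPYSTREAM to copy the stream; the target file name is taken from the example result further below:

```
// sketch: write the uploaded InStream to a file on the NAV Server
FileVar.CREATE('c:\Temp\OnServer.txt');
FileVar.CREATEOUTSTREAM(OStream);
COPYSTREAM(OStream,IStream);
FileVar.CLOSE;
```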
So, put all this together and the end result is:
The file c:\Temp\ServerFile.txt gets downloaded to C:\Temp\NewFile.txt, and then uploaded back to the server as C:\Temp\OnServer.txt.
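Assembled from the snippets above, the whole round trip might look like the following sketch (variable declarations omitted; the COPYSTREAM-based save at the end is an assumption):

```
// 1. download from server to client; MagicPath tells us where it landed
IF NOT ISSERVICETIER THEN
  EXIT;
FileVar.OPEN('c:\Temp\ServerFile.txt');
FileVar.CREATEINSTREAM(IStream);
DOWNLOADFROMSTREAM(IStream,'','<TEMP>','',MagicPath);
FileVar.CLOSE;

// 2. copy/rename the downloaded file on the client (client-side automation)
CREATE(FileSystemObject,TRUE,TRUE);
FileSystemObject.CopyFile(MagicPath,'c:\Temp\newfile.txt');

// 3. reduce MagicPath to its folder, stage the file there, and upload it
FOR i := STRLEN(MagicPath) DOWNTO 1 DO BEGIN
  IF MagicPath[i] = '\' THEN BEGIN
    MagicPath := COPYSTR(MagicPath,1,i);
    i := 1;
  END;
END;
FileSystemObject.CopyFile('c:\Temp\newfile.txt',MagicPath + 'newfile.txt');
UPLOADINTOSTREAM('','<TEMP>','','newfile.txt',IStream);

// 4. save the stream to a file on the server
FileVar.CREATE('c:\Temp\OnServer.txt');
FileVar.CREATEOUTSTREAM(OStream);
COPYSTREAM(OStream,IStream);
FileVar.CLOSE;
```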
Dynamics NAV Support EMEA
This post discusses the use of test data when developing automated tests based on the testability features that were released with Microsoft Dynamics NAV 2009 SP1. The practices outlined here have been applied during the development of the tests included in the Application Test Toolset for Microsoft Dynamics NAV 2009 SP1.
Overall, the execution of automated tests proceeds according to a four-phased pattern: setup, exercise, verify, teardown. The setup phase is used to place the system into a state in which it can exhibit the behavior that is the target of a test. For data-intensive systems such as Dynamics NAV, the test data is an important part of setting system state. To test the award of discounts, for instance, the setup phase may include the creation of a customer and setting up a discount.
One of the biggest challenges of testing software systems is deciding what test data to use and how and when to create it. In software testing, the term fixture describes the state in which the system under test must be to expect a particular outcome when exercised. The fixture is created in the setup phase. In the case of a NAV application, that state is mostly determined by the values of all fields in all records in the database. Essentially, there are two options for how to create a test fixture: create it from within the test itself, or use a prebuilt fixture that is loaded before the test runs.
The advantage of creating the fixture in the test is that the test developer has as much control over the fixture as possible when tests execute. On the other hand, completely creating the test fixture from within a test might not be feasible in terms of development effort and performance. For example, consider all the records that need to be created for a simple test that posts a sales order: G/L Setup, G/L Account, General Business Posting Group, General Product Posting Group, General Posting Setup, VAT Business Posting Group, VAT Product Posting Group, Customer, Item, and so forth.
The advantage of using a prebuilt test fixture is that most data required to start executing test scenarios is already present. In the NAV test team at Microsoft, for instance, much of the test automation is executed against the same demonstration data that is installed when the Install Demo option is selected in the product installer (i.e., CRONUS). That data is reloaded when necessary as tests execute to ensure a consistent starting state.
In practice, a hybrid approach is often used: a common set of test data is loaded before each test executes and each test also creates additional data specific to its particular purpose.
To reset the common set of test data (the default fixture), one can either execute code that (re)creates that data or restore a previously created backup of that default fixture. The Application Test Toolset contains a codeunit named Backup Management, which implements a backup-restore mechanism at the application level. It can be used to back up and restore individual tables, sets of tables, or an entire company. Table 1 lists some of the function triggers available in the Backup Management codeunit. The DefaultFixture function trigger is particularly useful for recreating a fixture.
Table 1 Function triggers in the Backup Management codeunit
- DefaultFixture: When executed the first time, creates a special backup of all the records in the current company; any subsequent time it is executed, restores that backup in the current company.
- BackupSharedFixture: Creates a special backup of all tables included in the filter.
- RestoreSharedFixture: Restores the special backup that was created earlier with BackupSharedFixture.
- Creates a named backup of the current company.
- Restores the named backup in the current company.
- Creates a backup of a table in a named backup.
- RestoreTable(name, table id): Restores a table from a named backup.
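Used from a test, this might look like the following sketch. The variable name BackupMgt and the named backup 'SalesTests' are assumptions; DefaultFixture and RestoreTable come from Table 1:

```
// reset the default fixture (first call creates the backup, later calls restore it)
BackupMgt.DefaultFixture;

// ... exercise and verify a scenario that modifies the Customer table ...

// restore just the Customer table from a previously created named backup
BackupMgt.RestoreTable('SalesTests',DATABASE::Customer);
```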
There are both maintainability and performance perspectives on the creation of fixtures. First there is the code that creates the fixture. When multiple tests use the same fixture it makes sense to prevent code duplication and share that code by modularizing it in separate (creation) functions and codeunits.
From a performance perspective, there is the time required to create a fixture. For a large fixture, this time is likely to be a significant part of the total execution time of a test. In these cases, one could consider sharing not only the code that creates the fixture but also the fixture itself (i.e., the instance) by running multiple tests without restoring the default fixture. A shared fixture introduces the risk of dependencies between tests, which may result in hard-to-debug test failures. This problem can be mitigated by applying test patterns that minimize data sensitivity.
A test fixture strategy should clarify how much data is included in the default fixture and how often it is restored. Deciding how often to restore the fixture is a balance between performance and reliability of tests. In the NAV test team we have tried the following strategies for restoring the fixture: before each test function, once per test codeunit, and only when a test fails.
With the NAV test features there are two ways of implementing the fixture resetting strategy. In all cases the fixture (re)creation code is modularized. The difference between the approaches is where the fixture creation code is called.
The fixture can be reset from within each test function or, when a runner is used, from the OnBeforeTestRun trigger. The method of frequent fixture resets overcomes the problem of interdependent tests, but is really only suitable for very small test suites or when the default fixture can be restored very quickly:
// test code
An alternative strategy is to recreate the fixture only once per test codeunit. The obvious advantage is the reduced execution time. The risk of interdependent tests is limited to the boundaries of a single test codeunit. As long as test codeunits do not contain too many test functions and they are owned by a single tester this should not give too many problems. This strategy may be implemented by the test runner, the test codeunit's OnRun trigger, or using the Lazy Setup pattern.
With the Lazy Setup pattern, an initialization function trigger is called from each test in a test codeunit. Only the first time it is executed will it restore the default fixture. As such, the fixture is only restored once per test codeunit. This pattern works even if not all tests in a test codeunit are executed or when they are executed in a different order. The Lazy Setup may be implemented like:
LOCAL PROCEDURE Initialize();
BEGIN
  IF Initialized THEN
    EXIT;
  BackupMgt.DefaultFixture; // restore the default fixture (variable name assumed)
  // additional fixture setup code
  Initialized := TRUE;
END;
// each test function starts by calling Initialize:
Initialize;
// test code scenario A ...

Initialize;
// test code scenario B ...
As the number of test codeunits grows the overhead of recreating the test fixture for each test codeunit may still get too large. For a large number of tests, resetting the fixture once per codeunit will only work when the tests are completely insensitive to any change in test data. For most tests this will not be the case.
The last strategy only recreates the test fixture when a test fails. To detect failures caused by (hidden) data sensitivities, the failing test is then rerun against the newly created fixture. As such, some type of false positives (i.e., test failures not caused by a product defect) can be avoided. To implement this strategy the test results need to be recorded in the OnAfterTestRun trigger of the test runner to a dedicated table. When the execution of a test codeunit is finished, the test results can be examined to determine whether a test failed and the test codeunit should be run again.
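Recording results in a test runner might be sketched like this. The trigger signature is an approximation of the NAV 2009 SP1 test runner trigger, and the TestResult table and its fields are assumptions:

```
// sketch: test runner trigger that records each test outcome
TRIGGER OnAfterTestRun(CodeunitID : Integer;FunctionName : Text[1024];Success : Boolean);
BEGIN
  TestResult.INIT;
  TestResult."Codeunit ID" := CodeunitID;
  TestResult."Function Name" := FunctionName;
  TestResult.Success := Success;
  TestResult.INSERT;
END;
```

When the codeunit finishes, these records can be examined to decide whether the fixture must be recreated and the codeunit rerun.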
For the implementation of each of these fixture strategies it is important to consider any dependencies that are introduced between test codeunits and the test execution infrastructure (e.g., test runners). The consequence of implementing the setup of the default fixture in the test runner codeunit may be that it becomes more difficult to execute tests in other contexts. On the other hand, if test codeunits are completely self-contained it becomes really easy to import and execute them in other environments (e.g., at a customer's site).
I am happy to announce that due to popular demand, we have recently posted a whitepaper covering SEPA on PartnerSource and CustomerSource.
To give you a little bit of background: in 2002, key European banks decided to create a standardized payment strategy called the Single Euro Payments Area (SEPA) to allow payments to be transferred electronically with ease. SEPA is the framework that manages the necessary infrastructure and standards for the euro. SEPA consists of two core concepts: credit transfers and direct debits. SEPA Credit Transfers (SCT) are payments that a payer initiates: the payer sends payment instructions to a bank, and the bank transfers the funds from the payer's account to the beneficiary's account. These are transactions between two accounts held by financial institutions in the European Economic Area (EEA). SCT is currently supported in the following localized versions of Microsoft Dynamics NAV.
SEPA Credit Transfers
Released in Microsoft Dynamics NAV version
4.0 SP3, 5.0 SP1, 6.0 SP1
5.0 SP1, 6.0 SP1
Development is currently underway for SEPA Credit Transfer XML payment format for the following countries.
Expected release date
(This list is subject to change.)
SEPA Direct Debits (SDD) is the other part of SEPA. SDD are different from SCT because they are initiated by a creditor. We currently do not have any concrete plans on releasing support for SDD but are keeping up to date with business, market, and legal requirements and will continue to support direct debits in countries where the functionality already exists as a localization of Microsoft Dynamics NAV.
For those of you who would like to know more about SEPA and how it affects our customers and products, please take a look at the whitepaper here:
- Selena Laccinole Jensen
Michael De Voe, a Senior Premier Field Engineer at Microsoft, has compiled a set of recommendations for SQL Server configuration to improve performance when running Microsoft Dynamics NAV 5.0 and later versions with one of the following versions of SQL Server:
The attached document contains Michael's recommendations, including the following options and parameters:
These postings are provided "AS IS" with no warranties, and confer no rights. You assume all risk for your use.
You can add and delete buttons and lists in the Navigation Pane.
1. In the top right corner of the Role Center, click the Customize button.
2. Click Customize Navigation Pane.
3. In the Customize Navigation Pane window, click New to create a new button.
4. Type a name for the new button.
5. Choose an icon.
6. Click OK.
7. Use Move Up, Move Down, and Rename to edit the position and name of the new button.
8. Click Add to add new lists.
9. Browse to the list you want to add.
10. Click OK.
11. Repeat steps 8 through 10 for all the lists you want to add.
12. Use Move Up and Move Down to edit the position of the lists.
13. Use Move To and Copy To to move lists from one navigation pane button to another.
14. Click OK.
15. Click Yes to restart the Role Tailored Client.
After restart, the changes will be applied to the Navigation Pane.
For more information about usability and the RoleTailored client, see the blog post Useful, Usable, and Desirable.
With the release of Dynamics NAV 2009 R2 we have spent some time trying to understand the issues related to hosting the product. Based on these discussions we have added some things to the R2 release and will be adding things to the next release as well to accommodate hosting of NAV even further. The features we have added related to hosting the product are:
To begin with let us have a look at the environment we are adding to the mix with these features.
In current releases this would be possible by adding a VPN connection to the mix with the performance overhead and administrative cost that it adds.
To mitigate a possible man-in-the-middle attack, we have added the ability to install a certificate on the server side of the setup. This helps ensure that the server the client is connecting to is actually the one it should be connecting to.
As most hosting scenarios require any client to log in to a domain run by the hosting company, we have added the ability to show a login dialog when opening the RoleTailored client, allowing a NAV user to provide those credentials to the remote site easily and securely. This happens over an encrypted channel to the SSL-verified server at the other end.
After the connection is established the communication between the RoleTailored client and the server is also encrypted to ensure that if someone is listening in on the communication it would be garbled while being transmitted over the internet.
The natural question to ask is then "what does this mean in terms of performance?" We have spent quite a while looking into that to be able to come up with some guidelines about which requirements should be put upon the network connectivity between the hosting site and the customer site.
We have focused our tests around two factors - latency and bandwidth. The way we have tested this is in a simulated environment where we were able to throttle both bandwidth and latency to be able to simulate different types of connectivity. The tests we performed were 10 concurrent users posting 10 one-line sales orders in an automated fashion through the User Interface of the RoleTailored client. An important note here is that this type of measurement is not entirely realistic as the actual entering of the Sales Orders are fast to the point that the UI doesn't render before the Post button is pressed. This also means that if this is the "benchmark load" any real life load with similar operations will be slower.
We tested bandwidths ranging from a 10/1 mbit line up to a 300/300 mbit line which would be somewhat similar to a LAN. Bandwidths are setup so a 10/1 line would be 10 mbit download speed to the client site and 1 mbit upload speed from the client site.
Latencies were tested between 0 ms to 600 ms, which is ranging between a fast LAN connection to a slow ADSL connection or even a fast satellite connection, which would be between 500-1000 ms.
The graph below shows the response times as the red line, the maximum kilobytes received per second as well as the average kilobytes received per second, the maximum kilobytes sent per second and the average kilobytes sent per second.
The x-axis signifies the round-trip time added to the connection in milliseconds.
Looking at the graph it shows that latency linearly impacts the response time for obvious reasons. It also shows that a higher latency impacts the ability to utilize the available bandwidth and that the sweet spot/elbow is between a latency of 100ms and 150ms.
The graph for the bandwidth scenario is somewhat less complicated. It shows the bandwidth per user on the x-axis and the response time for the 10 sales orders on the y-axis. Note that bandwidth per user means that 5/1 is equivalent to a 50/10 mbit connection, and 2/0.5 is a 20/5 mbit connection.
The graph shows that the elbow is somewhere between 1.5/0.3 mbit per user (a 15/3 mbit line) and 2/0.1 mbit per user (a 20/1 mbit line). Additional studies also show that the determining factor for these connections is the upload speed rather than the download speed, and that the elbow is between 0.1 and 0.3 mbit per user for the tested scenarios.
As this is targeted towards limited bandwidth scenarios it is worth noting that for any type of connection it will be possible for a single session to use it all if transferring a large file or even running a large report.
Together with this release we will provide documentation to help configure the network infrastructure that is needed for the RoleTailored client to be able to communicate with the NAV Server over the WAN.
Recently Microsoft hosted a Hot Topic Session called "Microsoft Dynamics NAV 2009 R2 Hot Topic: RoleTailored Client for remote and roaming users". A recorded version of the session can be seen at the Partner Learning Center.
- Claus Busk Andersen
In order for the RTC (Role Tailored Client) to export to a shared location using constrained delegation, you need to set up delegation from the account that is running the NST service to the machine that is hosting the share. Here are the instructions for setting up delegation:
1. In AD (Active Directory), open the user account that is running the NAV Server service and go to the Delegation tab
2. Click on 'Add'
3. Select 'Users or Computers'
4. Enter in the name of the machine that is hosting the Shared Folder
5. Click 'Check Names'; the server name should now be shown with an underline
6. Click 'OK'
7. Now you should see a list of Services for this machine that contains the shared folder
8. Click on the Service called 'cifs'
9. Click 'OK'
10. The end result should now look like this:
Nick Haman, North America Escalation Engineer
In fields that take input, such as Sell-to Customer No., Location Code, or Address, as you start entering characters, a drop-down list shows possible field values that match the characters you have typed. Typically, Microsoft Dynamics NAV sets the default filter to the number value in number fields such as Customer No., and to the text value in fields such as Address.
But you can change the default field you use as a filter. In this example, you can change the filter from No. to Name.
Now, as you type in the Sell-to Customer No. field, the filter is set to look in the Name column rather than the No. column.
You can find any page, report, or Departments view in your installation by using the Search box to the right of the Address box. In this way you reduce the amount of time you spend searching.
When you start typing characters in the Search box, a drop-down list shows page and report names containing the character(s) you type. The drop-down list changes as you type more characters, and you can select the correct page or report from the list when it is displayed. The second column in the drop-down list shows the navigation paths to the found pages and reports. You can select the path in the second column to navigate to the Departments page where the page or report exists.
Here are the detailed steps for searching for pages, reports, and Departments views from the Role Center.
1. In the Search field in the top right corner, type "sales".
2. Continue typing with "invoices".
3. Select either the page or report in the first column of the search results or the path in the second column of the search results.
For more information about usability and the RoleTailored client, see the blog post Useful, Usable, and Desirable.
Some time ago I promised to publish the numbers used in NAV for the data types. These numbers are used all over the system, but are most visible when encoding record links (please refer to my post about encoding record links). This list contains the data types that are available in tables in NAV 2009 SP1.
Date (0x2E ‘+’ 0)
Time (0x2E ‘+’ 1)
(0x2E ‘+’ 22)
(0x13 ‘+’ 125)
Internally, some types are “extensions” of other types and are therefore composed of a base type and a subtype ID. For example, Time (0x2E ‘+’ 1) is a “subtype” of Date (0x2E ‘+’ 0).
I have collected a few tips that might be useful when developing for NAV 2009. Don't blame me if some of these look too easy... :)
With the release of Office 2010 and SharePoint 2010, the relationship between internal line-of-business applications and business productivity software is stronger than ever. Many exciting new features have been added that will bring value to many customers, including those who are using Microsoft Dynamics NAV today.
The user interface (UI) is the "face" of a software application - A good user interface is intuitive, familiar, and easy to use. It improves productivity by minimizing the number of clicks required to get a task done. This is what we accomplished with the release of the RoleTailored client in Microsoft Dynamics NAV 2009. The Fluent UI is now used by all Microsoft Office programs as well as SharePoint Server 2010, and does away with menus, which were growing increasingly complex, replacing them with a clear set of icons that are relevant to the task being performed.
With the 2010 release, Microsoft Office, Microsoft SharePoint Server, and Microsoft Dynamics now share this strong "facial" resemblance, making them more consistent to use and easier to adopt.
Just as beauty is more than skin deep, so the ties between Microsoft Dynamics and Microsoft's business productivity infrastructure run deeper than just the UI. Business Connectivity Services (BCS) is a new technology that crosses Microsoft Office 2010 and Microsoft SharePoint Server 2010, and can be thought of as "plumbing" for connecting business applications through Web Services in Microsoft Dynamics NAV 2009 with SharePoint and Office. This is no ordinary plumbing, though, as it enables some powerful new scenarios for Microsoft Dynamics NAV customers, including the ability to update information stored in a Microsoft Dynamics NAV database directly from a SharePoint site, and making it easier to take Microsoft Dynamics NAV information offline through either Outlook 2010 or SharePoint Workspace 2010.
The majority of Microsoft Dynamics customers use Microsoft Excel to analyze their business information. PowerPivot for Microsoft Excel 2010 offers the ability to quickly create PivotTables or PivotCharts that are pulling in data from Microsoft Dynamics ERP or CRM in real time. New Excel 2010 features such as Slicers and Sparklines can then be added to bring the numbers to life and gain deeper insights into what's happening in the business.
Since Microsoft Dynamics NAV has always had strong integration with the Office and SharePoint products, we are proud to announce that Microsoft Dynamics NAV 2009 SP1 and Microsoft Dynamics NAV 5.0 SP1 Update 2 are compatible with Microsoft Office 2010 and Microsoft SharePoint 2010!
The details of the support for the different Office and SharePoint integrations are listed below. Please note that Office 2010 is available in both a 32-bit version and a 64-bit version, but some NAV areas are currently not supported in the 64-bit version. The recommended version of Office 2010 in combination with NAV is therefore the 32-bit version. Further reading on the difference between the two versions can be found here: http://blogs.technet.com/b/office2010/archive/2010/02/23/understanding-64-bit-office.aspx.
Announcing the next installment of new videos on MSDN. http://msdn.microsoft.com/bb629407.aspx
These videos target the developer audience for Microsoft Dynamics NAV 2009. The new offerings are:
More videos are in the works and will target both the platform and the application, so check back often to see what's been added. All videos are in English.