New Data Tools Features in Visual Studio Orcas


Here's a list of the new data tools features in Visual Studio Orcas. I will be discussing each one in more detail in upcoming posts.

Object Relational Designer

Object relational mapping technology lets you map relational databases to objects. Once the mapping is done, you can manipulate the mapped objects like normal objects and easily submit changes back to the relational database without writing complex data access logic. It is the latest enhancement in data access technology, and there are many different implementations out there today. However, no O/R implementation comes with a graphical designer that makes it easy to create and modify mappings.

The Object Relational Designer in Visual Studio is Microsoft's answer to how O/R mapping should be done. It's a graphical designer that lets you easily map database objects such as tables and stored procedures to LINQ to SQL classes and methods. You can simply drag database objects from the Database Explorer onto the designer surface. The designer then takes care of creating the mapping and generating the proper LINQ to SQL code. You can also modify different aspects of the mapping through the designer and let it update the code automatically. Association and inheritance relationships can be created just as easily. The stored procedure support is also unique: instead of letting LINQ to SQL generate INSERT, UPDATE and DELETE SQL statements, you can map methods created from your stored procedures to each of those behaviors.

If you've tried the May CTP of LINQ, you probably remember the version of the O/R Designer we included. It was called the DLinq Designer. What you see in the March CTP of Visual Studio Orcas is the next generation of that designer, now called the O/R Designer.
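To give a feel for the result, here's a small sketch of working with the designer-generated classes. NorthwindDataContext and its Customers table are hypothetical names; the actual class names depend on what you drag onto the designer.

```csharp
// Sketch: assumes the O/R Designer generated a NorthwindDataContext
// with a Customers table mapped from the Northwind sample database.
using (NorthwindDataContext db = new NorthwindDataContext())
{
    // Query the mapped objects with LINQ instead of hand-written SQL.
    var londonCustomers = from c in db.Customers
                          where c.City == "London"
                          select c;

    // Manipulate them like normal objects...
    foreach (var customer in londonCustomers)
        customer.ContactName = customer.ContactName.Trim();

    // ...and submit the changes back. The generated mapping decides
    // whether this becomes UPDATE statements or calls to the stored
    // procedures you mapped to each behavior.
    db.SubmitChanges();
}
```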


Hierarchical Update in Typed Dataset

Keeping track of all inserts, updates and deletes across multiple related datatables, and sending those changes back to the server in the right order, is not an easy task. How do you make sure that new orders for a customer are added correctly while you're also updating the same customer's shipping address and deleting an old order that was cancelled, all at the same time?

With hierarchical update support in Typed Dataset, all you need to do is call the UpdateAll() method of the new TableAdapterManager component we've added. It takes care of collecting all the changes and sending them back to the server in the right order. Of course, everything is wrapped in a transaction.
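In code, that looks roughly like the sketch below. NorthwindDataSet, the two table adapters and the TableAdapterManager wiring stand in for whatever the Dataset Designer generates in your project.

```csharp
// Sketch: NorthwindDataSet, CustomersTableAdapter, OrdersTableAdapter
// and TableAdapterManager are designer-generated classes; the names
// here are hypothetical.
NorthwindDataSet dataSet = new NorthwindDataSet();

CustomersTableAdapter customersAdapter = new CustomersTableAdapter();
OrdersTableAdapter ordersAdapter = new OrdersTableAdapter();
customersAdapter.Fill(dataSet.Customers);
ordersAdapter.Fill(dataSet.Orders);

// ... add new orders, update the shipping address,
//     delete a cancelled order ...

TableAdapterManager manager = new TableAdapterManager();
manager.CustomersTableAdapter = customersAdapter;
manager.OrdersTableAdapter = ordersAdapter;

// One call sends every insert, update and delete back to the server
// in the right order, wrapped in a transaction.
manager.UpdateAll(dataSet);
```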

We believe this will significantly improve the productivity of developers using Typed Dataset to create data applications. You can try this new feature in the March CTP of Visual Studio Orcas.


N-Tier Support in Typed Dataset

I admit it: the Typed Dataset code we generate in Visual Studio 2005 is not N-Tier ready. If you open up the code file, you will notice that we've done a good job separating types from data access logic: the dataset and typed datatables are declared in one Typed Dataset class, and all the TableAdapter classes are declared under a separate namespace. But we generate both sets in one code file. For a lot of applications this is not an issue, but when you are building multi-tiered applications, having them in one file doesn't really help. This exact problem is discussed in detail in Steve Lasker's blog entry, Splitting Typed Datasets from TableAdapters. The solution proposed there was to open up the generated code and manually copy and paste the type declarations out into a new class file. Although this did let you use Typed Dataset in an N-Tier scenario, it had one major problem: since you were manually modifying generated code, any time you made a change to the Typed Dataset via the Dataset Designer, you had to remember to copy and paste the code again. That was painful.

In Orcas, you can instruct the designer to generate the Dataset portion of the code into another project in your solution. No more manual copy/paste. You can even make changes to the Typed Dataset from the designer, and it will make sure the updated code is generated into the specified project. You can still choose to stay in the 2-Tier model, in which case the designer will continue to generate everything in a single file. When you are ready to take your 2-Tier application to N-Tier, just tell the designer which project is the Dataset project and you are done. It's that simple. A preview of this feature is in the March CTP of Visual Studio Orcas, and you will be able to try the polished version in Beta 1.


Local Data Cache with SQL Compact Edition

SQL Compact Edition enables many exciting scenarios for developers. The most interesting is using a SQL Compact Edition database file as a local cache of data that does not change often. For instance, your application might keep a list of products in the local cache while exchanging order information with a remote server. Once in a while, you can sync the product list from the remote server, but you will mostly read from the local cache.

Since SQL CE is a lightweight database with very little overhead, it's the perfect candidate for the local cache store. You can use Sync Services for ADO.NET to synchronize data between the remote database server and the local SQL CE database file. Sounds great, right? But how do you set everything up?

Visual Studio Orcas includes a new project item template called "Local Data Cache". Adding a Local Data Cache to your project creates a .sync file, an XML file that describes what gets synched and how. This file comes with a designer that lets you easily configure different aspects of synchronization. The designer also creates the SQL CE .sdf database file used as the local cache store, as well as the synchronization code necessary to interact with Sync Services. A preview of this feature is in the March CTP of Visual Studio Orcas, and you will be able to try the polished version in Beta 1.
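Once the designer has generated the synchronization code, kicking off a sync is roughly a one-liner. ProductsCacheSyncAgent is a hypothetical name for the sync agent class generated from the .sync file; the real name follows whatever you called your Local Data Cache item.

```csharp
// Sketch: ProductsCacheSyncAgent stands in for the sync agent class
// the Local Data Cache designer generates from your .sync file.
ProductsCacheSyncAgent syncAgent = new ProductsCacheSyncAgent();

// Pull changes from the remote server into the local .sdf cache.
Microsoft.Synchronization.Data.SyncStatistics stats =
    syncAgent.Synchronize();

Console.WriteLine("Changes downloaded: {0}",
    stats.TotalChangesDownloaded);
```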


Other Enhancements to Data Tools

The four features above are the major data tools features in Visual Studio Orcas, but there are tons of other enhancements we've made to the existing data tools. I will dedicate a separate post to the other important enhancements.
