Sharing the goodness…
Beth Massi is a Senior Program Manager on the Visual Studio team at Microsoft and a community champion for .NET developers. Learn more about Beth.
In this interview Saaid Kahn, a Program Manager on the Visual Studio Pro Tools team (and former member of the VB Team), shows us how to create an n-tier application against a database using ADO.NET Data Services (Astoria) and an Entity Data Model, both now available in Visual Studio 2008 Service Pack 1.
ADO.NET Data Services is built on WCF RESTful services and provides all the plumbing, so you can focus on your program logic by coding against a service proxy. ADO.NET Data Services lets you easily create data services exposed on the web, using URIs to point to pieces of data and simple, well-known formats to represent that data.
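To give a feel for the URI addressing, here is a sketch of the conventions against a hypothetical Northwind service (the service name and entity sets are assumptions, not from the interview):

```
http://localhost/Northwind.svc/Customers                        -- all Customer entities (Atom feed)
http://localhost/Northwind.svc/Customers('ALFKI')               -- a single customer, addressed by key
http://localhost/Northwind.svc/Customers('ALFKI')/Orders        -- navigate an association to related orders
http://localhost/Northwind.svc/Customers?$filter=City eq 'London'  -- a query option that filters the set
```

Every piece of data gets its own address, which is what makes the service consumable from any HTTP client.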
Saaid shows us how to create a simple service and then consume it using a Windows client via the "Add Service Reference" dialog in Visual Studio. He also walks through the client proxy methods that work with the data service.
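In rough outline, consuming the service from a Windows client looks something like this. This is a hedged sketch: the `NorthwindEntities` proxy class and its properties are hypothetical stand-ins for whatever "Add Service Reference" generates for your own model.

```csharp
using System;
using System.Data.Services.Client; // client library shipped in VS 2008 SP1
using System.Linq;

class Program
{
    static void Main()
    {
        // The generated proxy derives from DataServiceContext and is
        // pointed at the service root URI. (NorthwindEntities is assumed.)
        NorthwindEntities ctx =
            new NorthwindEntities(new Uri("http://localhost/Northwind.svc"));

        // LINQ queries against the proxy are translated into service URIs
        // (this one becomes .../Customers?$filter=City eq 'London').
        var londonCustomers = from c in ctx.Customers
                              where c.City == "London"
                              select c;

        foreach (var customer in londonCustomers)
        {
            customer.ContactName = customer.ContactName.Trim();
            ctx.UpdateObject(customer);   // mark the entity as modified
        }

        ctx.SaveChanges();                // push the changes back over HTTP
    }
}
```

The key point is that the client never deals with HTTP or Atom payloads directly; the context tracks changes and replays them against the service.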
ADO.NET Data Services (Astoria) in Visual Studio 2008 SP1
Since the release of Astoria, I have been looking for something meaningful (like what's discussed in this interview) on how to use it in Windows clients.
I followed the entire discussion, and it's fantastic from a concept-learning point of view.
A real-world data model, however, will have more than 3-4 tables. It will likely have tables in the hundreds and even more stored procedures. In situations like these, what is the recommended solution?
Should there be more than one data model, each consisting of the relevant tables needed for one WinForms/WPF screen?
If we have more than one such data model, does that imply we will have an equal number of ADO.NET Data Services as well?
..and will all of these services be hosted inside the same single web project, as shown in the interview?
Lots of questions... though I am sure the answers will help many others adopt this great platform.
The trade-off is between flexibility and scalability. On one hand, entities in the same model can be queried together, updated together, related to form associations, etc. On the other hand, as your question indicates, this may not be the most practical option. Sometimes splitting the data into a few models makes things much easier. There is no hard line on when to start splitting the model; that varies a lot based on the nature of the application and the data you're tracking.
I wouldn't go with a model-per-screen in most cases (I'm sure there is some extreme case where it makes sense, but generalizing…). If you have several hundred tables, you probably want to split the model into areas of concern, and then most screens should map more or less well to those. For the few exceptions that may come up, having to use a couple of services isn't that bad if it happens just once or twice in the app.
And yes, if the model grows so large that you need to split it into parts, you'll want to split the data service as well. You can host all the services in a single web project. In most cases this shouldn't be a problem at all. In extreme scalability cases you might need to spread the services across different processes (using IIS application pooling), but that shouldn't be the default path… it's more something you'd change when you see your application growing too much too fast after it's been deployed for a while.
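For what it's worth, hosting several split services in one web project just means adding several .svc endpoints, each exposing its own entity model. A sketch (the `SalesEntities` and `InventoryEntities` context names are made up for illustration):

```csharp
using System.Data.Services;

// Sales.svc -- exposes the sales-area entity model.
public class SalesService : DataService<SalesEntities>
{
    public static void InitializeService(IDataServiceConfiguration config)
    {
        // Open all entity sets in this model for read access.
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
    }
}

// Inventory.svc -- exposes the inventory-area entity model, side by side
// in the same web project.
public class InventoryService : DataService<InventoryEntities>
{
    public static void InitializeService(IDataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
    }
}
```

Each endpoint gets its own root URI (e.g. /Sales.svc and /Inventory.svc), so clients address the models independently even though they share one host.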