Diego Vega

Entity Framework news and sporadic epiphanies

Posts
  • Diego Vega

    EntityDataSource: To wrap or not to wrap

    • 6 Comments

    Note: Somehow the <TEntity> generic argument had disappeared from the extension method definition. I am fixing it today after several months. Sorry for the inconvenience!

    Fresh from the forums today: A customer asks how to get the real entity object in the RowDataBound event of a GridView.

    We made some complex design decisions during the development of the EntityDataSource, and I guess that will give us plenty of material for blogging :)

    With the EntityDataSource, we faced the interesting problem of building a bridge between the existing ASP.NET databinding infrastructure and the world of EDM. One of the techniques we use to solve the problem is wrapping entities in smart databinding objects. As Colin explains it in his answer posted to the forums:

    Why are you seeing a wrapper instead of the entity? Some of the unique features of the Entity Data Model prevent us from directly binding the entity. For instance, when inserting a Product I also need to insert a relationship to a Category. The wrapper adds the relationship to the entity, basically as a foreign key value.

    The wrapper object implements the ICustomTypeDescriptor interface, which makes it work well with databinding, but when you try to get your original entity object, for instance from the arguments of the RowDataBound event, you will get a wrapper object instead of the entity object you are expecting.

    I don't want to make things much more complicated than they need to be, but I think the generic solution may be useful for some customers.

    First of all, here are the rules for wrapping:

    • The wrapping mechanism only takes place if you initialize your EntityDataSource using EntitySetName.

    • When you instead set CommandText to a query that returns entities (i.e. "SELECT VALUE c FROM Northwind.Customers AS c"), then you get normal entities.

    • When you instead set CommandText to a query that returns a projection of properties (i.e. "SELECT c.CustomerID, c.CustomerName FROM Northwind.Customers AS c"), then you get a DbDataRecord.

    • Finally, if you set the Select property to do a projection (i.e. "it.CustomerID, it.CustomerName"), you get a DbDataRecord regardless of how you start your query.

    If you use the RowDataBound event very often in your code, I would suggest keeping some code similar to this around (thanks to David for coming up with this code first):

    static class EntityDataSourceExtensions
    {
        public static TEntity GetItemObject<TEntity>(object dataItem)
            where TEntity : class
        {
            // The data item may already be the entity itself...
            var entity = dataItem as TEntity;
            if (entity != null)
            {
                return entity;
            }
            // ...or an EntityDataSource wrapper that exposes the entity
            // through ICustomTypeDescriptor.GetPropertyOwner.
            var td = dataItem as ICustomTypeDescriptor;
            if (td != null)
            {
                return (TEntity)td.GetPropertyOwner(null);
            }
            return null;
        }
    }

    And this is the usage:

    protected void GridView_RowDataBound(object sender, GridViewRowEventArgs e)
    {
        var entity = EntityDataSourceExtensions.GetItemObject<Product>(e.Row.DataItem);
        //...
    }

     

    I hope this will help some.

  • Diego Vega

    New EntityDataSource Control for ASP.NET

    • 3 Comments

    As it was announced today, the EntityDataSource is now part of Entity Framework, and a first beta version of it is available in .NET 3.5 SP1 Beta 1 and Visual Studio SP1 Beta 1.

    I have been meaning to answer Julie's post "Thinking about the EntityDataSource" for several months, but I always thought it was risky to talk about stuff that hadn't been released.

    Now that customers can actually smell and touch it, it is about time:

    Some history and acknowledgments

    EntityDataSource was developed by a small subset of the Entity Framework Team in less than nine months, which is a short period for most software projects at Microsoft. It is amazing to see what this small team has produced.

    But bringing it to its current state would not have been possible without the help of several members of the ASP.NET team, who were always helpful and devoted their time and knowledge to the project. They helped us not only in making it suit the needs of “classic” ASP.NET developers, but also those of ASP.NET Dynamic Data.

    Why a new DataSource?

    You may have already learned that LinqDataSource is an excellent alternative for displaying the results of any LINQ query (including LINQ to Entities) in ASP.NET, although, to update the database, you need a LINQ to SQL backend. Also, it is possible to write some code and get ObjectDataSource to perform 2-way databinding against Entity Framework classes, but that dilutes one of the design goals for DataSources: that it should be possible to implement most common scenarios in a declarative-only fashion.

    Entity Framework includes a text-based query language called EntitySQL, which in v1 is the only way to get access to the full query capabilities of EF. The language is very “composable”, which enables us to create queries using builder methods (similar to LINQ sequence operators, but text based). On the other hand, DataSources, like all ASP.NET controls, can be initialized using string properties in markup.

    It is easy to conclude that using EntitySQL, it should be possible to create a DataSource control that does 2-way databinding using only markup.

    In fact, this simple markup produces a fully functional CRUD form for the Customers EntitySet:

        <asp:DetailsView ID="CustomerDetailsView"
            runat="server"

            AllowPaging
    ="True"
            AutoGenerateRows="True"
            DataKeyNames="CustomerID"
            DataSourceID="CustomersDataSource">
        </asp:DetailsView>
        <asp:EntityDataSource ID="CustomersDataSource"    
            runat
    ="server"

            ConnectionString
    ="name=NorthwindEntities"
            DefaultContainerName="NorthwindEntities"
            EnableDelete="True"
            EnableInsert="True"
            EnableUpdate="True"
            EntitySetName="Customers">
        </asp:EntityDataSource>

    More to follow...

  • Diego Vega

    Lazy loading in Entity Framework

    • 1 Comments

    Recently, I wrote this little article that got published in the new Insights sidebar in MSDN Magazine. In it, I mention one of the fundamental tenets of ADO.NET: 

    *Network roundtrips should not be hidden from the developer*

    But guess what... It is not always the case that there is a network (or even a process boundary) between your application and your database. Also, there are many scenarios in which you know that most of your data seldom changes, or that you don't care if things change a bit while your application is running (think of a cache). In those circumstances, implicit lazy loading just makes sense.

    We have been sending out the message that you can get implicit lazy loading by changing the standard code generation process in Entity Framework.
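    For reference, here is a minimal sketch of the kind of property such modified code generation could emit. It assumes a code-generated Product entity with a Category relationship (as in the Northwind examples used elsewhere on this blog); the property name is purely illustrative.

    // A hedged sketch only: a partial class layered on a hypothetical code-generated
    // EF v1 Product entity. The property loads the related Category on first access,
    // so callers get lazy-loading behavior without writing explicit Load() calls.
    public partial class Product
    {
        public Category CategoryLazy
        {
            get
            {
                // EF v1 exposes explicit loading through the generated
                // CategoryReference (an EntityReference<Category>).
                if (!this.CategoryReference.IsLoaded)
                {
                    this.CategoryReference.Load();
                }
                return this.Category;
            }
        }
    }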

    My colleague Jarek went even further and created an experimental set of entity classes that completely replaces the default code-generated classes in EF. And his implementation actually includes some very cool ideas that go beyond lazy loading.

    Take a look at his post. And you can find the bits (compatible with the just released Visual Studio 2008 SP1 Beta) from our portal in Code Gallery.

    Update: Just wanted to add some relevant links for customers asking for lazy loading in EF:

  • Diego Vega

    Colin explains a simple LINQ to Relational materializer

    • 0 Comments

    Just a short note about this: You can find his article here. I had the chance to see his presentation before he went to DevConnections in Orlando. Very much recommended stuff!

  • Diego Vega

    Entity Framework Extensions (EFExtensions) Project available in CodeGallery

    • 1 Comments

    When I announced the start of the Entity Framework Toolkits & Extensions section in CodeGallery, Colin already had a big chunk of what he is now making available in the works. And so I had it in my mind when I defined the Entity Framework Toolkits & Extensions as a collection of source code and tools to augment EF's capabilities and extend its reach to new scenarios.

    It took Colin 2 months to get some free time (readying a product for release is no easy task), to write some extra functionality (the custom materializer was first introduced a couple of weeks ago) and to get his code properly reviewed, etc.

    I recommend you go and download EFExtensions from the project page at CodeGallery, and then enjoy Colin's first blog post explaining some of the stuff the current EFExtensions are good for.

    By the way, one of my favorite parts of the project is the EntitySet class and its GetTrackedEntities() method :)

    Alex already introduced Colin as one super smart colleague. In fact, I cannot stress enough how smart Colin is. He is the uber developer. But I must add that intelligence is not his only quality!

    Please, send us feedback on this. The most straightforward way is to use the Discussion tool in CodeGallery, but feel free to use the email links in our blogs.

    And expect some really cool new EF Tools & Extensions from Colin and other members of the team. I know what I am talking about! :)

  • Diego Vega

    A different kind of sample

    • 1 Comments

    Samir, a developer in the Data Programmability Team, started blogging today.

    He also published a sample application with a unique feature: It can switch between Entity Framework and LINQ to SQL for persistence. He actually uses a Strategy Pattern (my beloved one) to isolate the business logic from the persistence concern. He describes how it works here.

    If that is not novel enough, his application is a graphics editor named SketchPad...

    If you haven't already, look for other samples and extensions in the ADO.NET Entity Framework and LINQ to Relational Data Portal in CodeGallery.

    The people I work with never cease to amaze me!

  • Diego Vega

    Then, should I write a data access layer or not?

    • 1 Comments

    Danny and I appear to be giving inconsistent advice in this regard in our recent weekend posts:

    In reality, I think we had different scenarios in mind.

    Danny is talking about the general case, and he is absolutely right that the benefits of creating an encapsulated data access layer have diminished dramatically because of the Entity Framework. EF now provides a complete abstraction layer that isolates application code from the store and from schema differences.

    For many applications, ObjectContext is going to be the data access layer.

    But in my opinion, the benefits of the TDD approach and of Persistence Ignorance are reasons you may still want to go the extra mile. Also, there is an argument for avoiding having code that depends on a certain persistence technology all over the place.

    As to whether the guidelines I am suggesting are enough, I would say they are a work in progress. Roger already noticed some inconsistencies in them (I wish I knew what those inconsistencies are in his opinion!).

    Moreover, Danny and I have participated in some conversations on ways to "have your cake and eat it too" when it comes to seamless use of TDD on EF.

    Edit: Added some context and corrections.

  • Diego Vega

    Unit Testing Your Entity Framework Domain Classes

    • 10 Comments

    One interesting question that customers who are TDD practitioners usually ask is how to do unit testing with the Entity Framework using mock objects. That is, testing only the domain logic part of the model, without ever touching persistence logic or round-tripping to the database. Usual reasons you want to do this include:

    • Test performance
    • Size of the database
    • Avoid test side effects

    The saving grace for this approach is that persistence is a separate concern from the business logic in your domain, and so it should be tested separately.

    Also, we test the Entity Framework a lot here at Microsoft. So, for customers using our code, it should be more cost effective to test their own code :)

    How easy it is to apply this practice to the Entity Framework depends heavily on how your code is factored. There are a few things to consider:

    Explicitly separate concerns

    If you want to unit test your domain classes (either IPOCO or code-generated classes fleshed out with domain logic, since EF v1 does not support pure POCO classes), the first step is to push out of the picture all the code paths that define and execute queries against the database.

    That means that all code that deals with IQueryable<T>, ObjectQuery<T>, and IRelatedEnd.Load() needs to be encapsulated in a separate DAL component.

    I can envision a pattern in which such a component exposes fixed-function methods that produce entire object graphs based on specific parameters.

    As a simple example, we can specify an interface with all the necessary methods to get Northwind entities:

    Edit: I changed the name of the interface from INorthwindContext to INorthwindStore to show that it is not necessarily something you implement in your typed ObjectContext.

        interface INorthwindStore
        {
            IEnumerable<Product> GetProducts(int? productID, int? categoryID);
            IEnumerable<Customer> GetCustomers(string customerID, string customerName);
            IEnumerable<Order> GetOrdersWithDetailsAndProducts(int orderID);
            ...
        }

    Once defined, the interface can be implemented as methods that hydrate the object graphs from the database, but also as a mock that hydrates pre-built object graphs for your tests.
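    As a rough illustration, a mock implementation might look something like the sketch below. The pre-built lists and the exact filtering rules are assumptions made for the sake of the example.

    // A hedged sketch only: a hand-rolled mock of the INorthwindStore interface
    // above that serves pre-built, in-memory object graphs instead of querying.
    using System.Collections.Generic;
    using System.Linq;

    class MockNorthwindStore : INorthwindStore
    {
        // Populate these lists with fully-built graphs in your test setup.
        private readonly List<Product> products = new List<Product>();
        private readonly List<Customer> customers = new List<Customer>();
        private readonly List<Order> orders = new List<Order>();

        public IEnumerable<Product> GetProducts(int? productID, int? categoryID)
        {
            // categoryID filtering omitted for brevity; a real mock would match
            // it against the pre-built Category graph as well.
            return products.Where(p => productID == null || p.ProductID == productID);
        }

        public IEnumerable<Customer> GetCustomers(string customerID, string customerName)
        {
            return customers.Where(c => customerID == null || c.CustomerID == customerID);
        }

        public IEnumerable<Order> GetOrdersWithDetailsAndProducts(int orderID)
        {
            // The pre-built orders already carry their details and products,
            // so there is nothing to load here.
            return orders.Where(o => o.OrderID == orderID);
        }
    }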

    Why not IQueryable<T> properties?

    There is a case for exposing IQueryable<T> properties directly (or ObjectQuery<T> properties as typed ObjectContexts do) instead of fixed function methods: The ability to compose queries in LINQ comprehensions gives much flexibility and is very attractive.

    However, not all IQueryable implementations are made equal, and the differences among them are only apparent at runtime.

    There are a number of functions that LINQ to Objects supports that LINQ to Entities doesn’t. Also, there are some query capabilities that in EF v1 are only available in ESQL and not in LINQ.

    Moreover, there is no way to execute ESQL queries against in-memory objects.

    Finally, query span behavior (i.e. ObjectQuery<T>.Include(string path) method) would be too difficult to reproduce for in-memory queries.

    By implementing our query method as fixed function points, we are drawing a definite boundary at a more appropriate level of abstraction.

    The good news is that it is relatively easy to get IEnumerable<T> results from either a LINQ or an ESQL query, and doing so does not imply losing the streaming behavior of IQueryable<T>.

    You can simply return query.AsEnumerable() or (in C#) write a foreach loop that “yield returns” each element.
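    As a sketch only (assuming a generated NorthwindEntities context held in a field by the data access component), the two styles look roughly like this:

    // Option 1: hand the LINQ query back as a plain IEnumerable<T>; enumeration
    // still streams results from the underlying ObjectQuery<T>.
    public IEnumerable<Product> GetProducts(int? productID, int? categoryID)
    {
        IQueryable<Product> query = context.Products;
        if (productID != null)
        {
            query = query.Where(p => p.ProductID == productID.Value);
        }
        return query.AsEnumerable();
    }

    // Option 2: "yield return" each element as it streams in
    // (the filter here runs in memory, purely to keep the example short).
    public IEnumerable<Customer> GetCustomers(string customerID, string customerName)
    {
        foreach (Customer c in context.Customers)
        {
            if (customerID == null || c.CustomerID == customerID)
            {
                yield return c;
            }
        }
    }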

    What happens with lazy loading?

    When I say that a method must produce entire graphs, the real constraint is that once the method is invoked, client code should be safe to assume that all necessary objects are going to be available. In theory, that constraint can be satisfied with either eager or automatic lazy loading.

    EF v1 codegen classes only support explicit loading, but if you implement your own IPOCO classes or you manipulate code generation, you can get automatic lazy loading working.

    Still, the mock implementation is better off populating full graphs in one shot.

    Edit: All this is said assuming you know that you want lazy loading even if this is at the risk of in-memory inconsistencies and extra round-trips. See here for an implementation of transparent lazy loading for Entity Framework.

    How to deal with ObjectContext?

    As Danny explains in a recent post, ObjectContext provides a number of important services to entity instances through their lifecycle, and so it is generally a good idea to keep a living ObjectContext around and to keep your entity instances (at least the ones you expect to change) attached to it.

    There are few approaches that would work:

    1. Encapsulate ObjectContext in your DAL component.
    2. Pass an ObjectContext instance in each method invocation to your DAL component.
    3. Maintain some kind of singleton instance available that all the code can share.

    For the mocking implementation, it is possible to initialize a context with a “metadata-only” EntityConnection:

    var conn = new EntityConnection(
        @"metadata=NW.csdl|NW.ssdl|NW.msl;
        provider=System.Data.SqlClient;");
    var context = new NorthwindEntities(conn);

    This will provide enough information for all but the persistence related functions of ObjectContext to work.

    One common concern about keeping an ObjectContext around is that it will keep a database connection alive too. However, ObjectContext contains connection management logic that automatically opens the connection when it is needed and then closes it as soon as it is not being used.

    What about CUD operations and SaveChanges()?

    Besides providing a launch point for queries, ObjectContext implements the Unit of Work pattern for EF. Most of the behavioral differences resulting from having your entities attached to an ObjectContext only take place at the time you perform CUD operations (Insert, Update or Delete) or invoke the SaveChanges() method. This is when changes are tracked and saved, and when concurrency control is enforced.

    Invoking AddObject(), Delete() or changing property values on your entities from within your test cases should work without changes.

    In order for the mock DAL component not to hit the database every time SaveChanges() is invoked, we should redirect SaveChanges() to AcceptAllChanges().
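    One way to express that (a sketch only, assuming the DAL component owns the ObjectContext and exposes its own Save method) is:

    // A hedged sketch: the real store forwards Save() to SaveChanges(), while the
    // mock accepts the pending changes in memory, so no database round trip occurs.
    public void Save()
    {
        // Real implementation:
        // context.SaveChanges();

        // Mock implementation (over a "metadata-only" context):
        context.AcceptAllChanges();
    }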

    Most operations will work as expected whether the ObjectContext is fully connected or “metadata-only”. But to make things more complicated, there are some additional side effects we need to take care of:

    • SaveChanges() may trigger the refresh of store generated values.
    • EntityKeys on entities and EntityReferences may have different values after SaveChanges().

    To mitigate these issues, no code outside your persistence layer should rely on those side effects. A simple rule of thumb that satisfies this requirement is to start anew with a fresh ObjectContext every time you finish your unit of work.

    Also, EntityKeys should be dealt with only in persistence code or serialization code, not in business logic.

    Conclusion?

    It is actually premature to use the word “conclusion”. Mixing EF and TDD in the same pan is something I am only starting to think about. This is a set of scenarios that I want to see among our priorities for future versions.

    In order to come to a real conclusion, I need to at least develop a sample application in which I apply and distill the approaches I am suggesting in this post. I hope I will find the time to do it soon.

  • Diego Vega

    EFContrib: An Entity Framework Community Contribution Project

    • 1 Comments

    I got the news today that Ruurd Boeke, a member of the developer community, has created an Entity Framework Contrib project in CodePlex. The project home is:

    http://www.codeplex.com/efcontrib

    The initial goal sounds like a good idea, but overall I am just very happy to see an EF Contrib project starting and I hope it will be very successful and why not, famous :)

    Evidently, I cannot speak for the owner of the project, but since in his own blog post he invites people to contact him or leave comments: if you are a member of the community and you have planned or wished to contribute your ideas on how to extend the Entity Framework's capabilities in such a community project, I encourage you to check it out, contact Ruurd, and see what happens.

  • Diego Vega

    Welcome to the Entity Framework Toolkits &amp; Extensions

    • 2 Comments

    Today we went live with something very dear to me: The Entity Framework Toolkits & Extensions. This is a collection of source code libraries and design time tools created to augment the ways you can use the Entity Framework.

    We distribute them under the Microsoft Public License, and they live in the ADO.NET Entity Framework & LINQ to Relational Data Portal at the MSDN Code Gallery.

    The purpose of putting this collection together is to showcase the ways to address new scenarios with the Entity Framework, but also to create a very tight feedback loop between the members of our team and the Developer Community.

    You can think of our Tools and Extensions as things we just couldn't resist building, but also stuff we hope you will find super useful.

    Right now the list is short (but certainly not modest):

    And on the samples section:

    Of course, we have more in the works!

    Update: I first took the Sample EDMX Code Generation for something Asad had been working on. Oops! You can read more about Sanjay's sample here and here.

  • Diego Vega

    Some differences between ESQL and LINQ to Entities capabilities

    • 2 Comments

    John Papa asks in my comments about some differences among the two query languages.

    Let's start from the beginning:

    What is almost the same

    Updatable queries: Neither LINQ to Entities nor ESQL currently includes a real Data Manipulation Language. However, both can return tracked entities that you can then update and send back to the store. To be clear, that is the case with ESQL only when you run your queries against Object Services (ObjectContext + ObjectQuery<T>), and not when you use it directly on top of Entity Services (EntityClient).

    What is somewhat different

    Dynamic queries: Being a text-based language, ESQL can be really very dynamic. You can get whatever you want by manipulating strings (which in my opinion can become dirty business :)). However, you had better know what type the query will return in order to use the result appropriately. I have seen some blog posts describing very smart ways to build dynamic LINQ queries without using any text-based query languages. Those functional-programming-inspired techniques should be applicable in general to LINQ to Entities.

    What is way different

    Access to functions: While in ESQL you have full access to EDM and store functions, in our LINQ implementation your options are currently limited to some CLR methods that we have mapped. For this we chose only methods that we can guarantee will behave consistently in a database store.

    Association navigation: In ESQL you can always navigate an association using NAVIGATE, even if there is no navigation property in the conceptual model. In LINQ if you don’t have a navigation property, you need to fall back to a JOIN.

    Equal comparability: In ESQL, entities are equality-comparable, but they are not in LINQ to Entities. This affects the way you write GROUP BY and JOIN operations in LINQ (not that you need them a lot if you have navigation properties).
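    To illustrate, here is a hedged sketch (names assume the Northwind-style model used elsewhere on this blog) of grouping by a scalar key instead of by the entity itself:

    // Because entities are not equality-comparable in LINQ to Entities, group by
    // a scalar key (or an anonymous type of scalars) rather than by the entity.
    var orderCountsByCustomer =
        from o in context.Orders
        group o by o.Customer.CustomerID into g
        select new { CustomerID = g.Key, OrderCount = g.Count() };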

    Syntax: In LINQ you basically get the nice query comprehension syntax, including very nice things like FROM first, LET, etc. In ESQL you get the nice SQL-like syntax, including SELECT first, and what else? ;)

    Result types: You can get entity types, complex types, scalars, or IEnumerables of entity types from either ESQL or LINQ to Entities. Only LINQ returns anonymous types and IGroupings, and only ESQL queries can return DbDataRecords.

    Conclusion

    This is not a comprehensive list, just what I can recall off the top of my head today. In general, very few of these differences are related to hard constraints. We actually have some designs to overcome many of the limitations of our LINQ implementation, and even some things we are thinking of borrowing from LINQ to improve ESQL.

    But it is very difficult to tell at this stage if any of those will make it for v1.

  • Diego Vega

    Entity Framework Beta 3 and Entity Designer CTP2 are out!

    • 2 Comments

    Today we made a huge step towards RTM. I know this is no news anymore, but well, it has been one of those days ;)

    Now please, go get the bits, start creating your models with the designer. Then tweak them with some nice inheritance. Why not, taste a few EntityReaders. Generate some classes and take a look at the generated code. Get familiar with our objects and namespaces! Surround an ObjectContext in a Using statement. Plunk some ObjectQuery<T> inside. Indulge yourself with some LINQ to Entities queries. Change values here and there, add some Entities. Call SaveChanges()...
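    If it helps to see those steps in one place, here is a minimal sketch of the kind of code they describe (assuming a designer-generated NorthwindEntities context; the entity and property names are illustrative only):

    // A hedged sketch of the workflow above: query, change, add, save.
    using (var context = new NorthwindEntities())
    {
        // Indulge yourself with a LINQ to Entities query.
        var customer = (from c in context.Customers
                        where c.CustomerID == "ALFKI"
                        select c).First();

        // Change values here and there.
        customer.ContactName = "Maria Anders";

        // Add some entities.
        context.AddToCustomers(new Customer
        {
            CustomerID = "DEMO1",
            CompanyName = "Demo Company"
        });

        // Call SaveChanges()...
        context.SaveChanges();
    }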

    If you have been waiting to get familiar with the Entity Framework, there is no way and no reason not to start now.

    BuildYourSkills() and most of all, SendYourFeedback()! :)

  • Diego Vega

    Entity SQL Non-Quirkiness

    • 8 Comments

    Zlatko has been posting about a new LINQ to Entities feature included in the upcoming Beta 3, so I decided to take revenge and make a 100% Entity SQL post. Here is something I ran into the other day:

    Let's assume we need to retrieve the Product with the maximum ProductID, which is a really awful way to get the ID of the product you just inserted! :)

    In your everyday store-specific SQL, you can use a MAX() aggregate function in a subquery as a WHERE predicate. In Transact SQL, it should look like this:

    SELECT *
    FROM   Products AS p
    WHERE  p.ProductID = (SELECT MAX(p2.ProductID)
                          FROM   Products AS p2);

    So far, so good. If you have been playing a little with Entity SQL, you will probably guess how the equivalent Entity SQL query would look:

    SELECT VALUE p
    FROM   Products AS p
    WHERE  p.ProductID = (SELECT MAX(p2.ProductID)
                          FROM   Products AS p2);

    But if you run this query, what you get is this interesting exception:

    System.Data.QueryException: Argument types 'Edm.Int32' and 'Transient.collection[Transient.rowtype[(_##groupAggMax2,Edm.Int32(Nullable=True,DefaultValue=))](Nullable=True,DefaultValue=)]' are incompatible for this operation, near WHERE predicate, line 1, column 60.

    The subquery is actually returning a Transient.collection of a Transient.rowtype... Those are internal things, so for illustration purposes, let's turn to the LINQ perspective of life:

    var query = from p in context.Products
                select new { p.ProductID };

    int productID = query;

     

    (Argh, this post is no longer 100% Entity SQL!)

    Not surprisingly, what you get is a compile-time error:

    Cannot implicitly convert type 'System.Linq.IQueryable<AnonymousType#1>' to 'int'.

    Both exceptions are homologous, and for a text-based query language, Entity SQL happens to be very type-safe at its core. Standard SQL makes the basic assumption that it is ok to implicitly convert single-item collections of single-column projections to discrete scalars. We don't.

    The basic theme in Version 1.0 of the Entity Framework is to build a solid foundation for the future. As a consequence, one thing we avoid doing is "magic" type conversions except when they make perfect sense (think union of projection queries with exactly the same shape). The motive: magic conversions tend to undermine the stability and composability of the language.

    That said, this buys us freedom to hand-pick certain implicit behavior in the future, if we find enough feedback and proof that it makes sense.

    That's enough on the rationale. Now, how do I make it work? There are two approaches.

    First:

    SELECT VALUE p
    FROM   Products AS p
    WHERE  p.ProductID = MAX(SELECT VALUE p2.ProductID
                             FROM   Products AS p2);

    This one works because:

    a) The SELECT VALUE returns the scalar itself, instead of a projection (rowtype) of the scalar.

    b) MAX() operates on the collection of scalars returned by the subquery, returning a single maximum value that will be directly comparable (same type) with ProductID.

    Second:

    SELECT VALUE p
    FROM   Products AS p
    WHERE  p.ProductID = ANYELEMENT(
               SELECT VALUE MAX(p2.ProductID)
               FROM   Products AS p2);

    This works because:

    a) The subquery will return a single-item collection of a scalar value.

    b) ANYELEMENT will retrieve a single element (in this case, the only one) contained in the collection. That element will be directly comparable with ProductID.

    In case you are wondering now how efficient this is, don't worry. Entity SQL is still a functional language. So, while understanding the type reasoning is interesting and useful, these queries still express "what you want to get" rather than "how you want the job done".

    As a matter of fact, with our current SqlClient implementation, these queries will be translated to some simple, yet unexpected Transact-SQL. But I'll leave that to you as an exercise...

  • Diego Vega

    Choosing an Entity Framework API

    • 1 Comments

    Last month, a question was asked in the ADO.NET Prerelease forum that went more or less like this:

    Considering that there are many APIs you can use (Entity SQL, ObjectQuery<T>, LINQ to Entities), is there any guidance that could help me decide when to use each?

    The best I could do based on my knowledge at the time:

    It is a matter of taste.

    While my answer was partially correct and had the great quality of being easy to look at, I immediately realized I should do a better job in helping people choose the appropriate API for each of their scenarios.

    I won’t pretend here to give the definitive and detailed answer, just a head start. You will find more information in our docs and I am sure this topic alone will easily fill a few chapters in upcoming books about the product.

    Service Layers and Query languages

    We basically support two distinct programming layers and two different query languages your applications can use:

    Service layers and query languages supported:

    Service layer        Entity SQL    LINQ queries
    Entity Services      Yes           No
    Object Services      Yes           Yes

    For those coming from the Object/Relational Mapping world, one easy way to look at our stack is to understand that we have two mapping tools layered one on top of the other:

    1. An Entity/Relational Mapper known as Entity Services.
    2. An Object/Entity Mapper named Object Services.

    Of course, once you have mapped your relational tables to entities and your entities to objects, what you get is a fully functional O/R Mapper.

    But as it is usual in our profession, adding a level of indirection uncovers a lot of power and flexibility :)

    First Service Layer: Entity Services

    The public surface of this layer is the EntityClient component, which is a new type of ADO.NET provider that gives you access to a store agnostic entity-relationship model of your data called Entity Data Model (EDM), and decouples your code from the store specific relational model that lives underneath.

    Besides a pair of new classes, the EntityClient contains most of the same types as previous providers: Connection, Command, DataReader, Parameter, Adapter, Transactions and a ProviderFactory.

    To be able to use this layer, you typically need three elements:

    1. An ADO.NET provider that is specific to your database engine and has been extended to work with the Entity Framework. Basically, the extensions involve the inclusion of a detailed provider manifest, support for command objects consisting of command trees, and the ability to generate store-specific SQL from those command trees. An appropriate provider for SQL Server will be included with the Entity Framework, and various provider writers are working right now to give you access to non-Microsoft relational databases.
    2. Mapping information in the form of SSDL, CSDL, and MSL files that describe your storage model, your application’s conceptual model and the mapping between the two. More recently we have added EDMX, a format that packages all the mapping information in a single file at design time.
    3. Queries expressed in Entity SQL (eSQL), a new dialect of SQL that delivers the power of the Entity Framework. Typically, the EntityClient will take a string containing eSQL everywhere your regular provider would accept a string containing store-specific SQL.

    One advantage of programming against this layer is that being the first public surface intended for application development, it is also the most lightweight.

    Moreover, at this level you use full eSQL queries to obtain data readers and not actual entity classes. For this reason, we call EntityClient our “value” oriented programming interface. Neither the columns included in your rows, nor the source of your rows, nor the filtering, grouping or sorting criteria, are fixed at compile time. The query is just a string that we parse at run-time, and the results are just EntityDataReaders.
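    For a concrete picture (a sketch only; the connection string name and the query shape are assumptions), querying through EntityClient looks roughly like this:

    // A hedged sketch: executing an eSQL query through the EntityClient provider.
    // Requires the System.Data and System.Data.EntityClient namespaces.
    using (var connection = new EntityConnection("name=NorthwindEntities"))
    {
        connection.Open();
        using (EntityCommand command = connection.CreateCommand())
        {
            command.CommandText =
                "SELECT c.CustomerID, c.CompanyName " +
                "FROM NorthwindEntities.Customers AS c";

            // EntityClient readers require SequentialAccess.
            using (EntityDataReader reader =
                       command.ExecuteReader(CommandBehavior.SequentialAccess))
            {
                while (reader.Read())
                {
                    string id = reader.GetString(0);
                    string name = reader.GetString(1);
                }
            }
        }
    }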

    All this makes Entity Services suitable for applications that today typically exploit the flexibility of writing dynamic SQL queries, like reporting, ad-hoc querying, etc.

    Notice, however, that even though the EntityClient closely follows the traditional ADO.NET connected object model, you cannot get an ADO.NET DataSet on top. There are two main reasons for this:

    1. The DataSet does not have the necessary constructs to represent the variety of relationships the EDM can support.
    2. The EntityClient does not support the metadata protocols used to create the DataSet schema.

    Moreover, the Entity Framework currently lacks a string-based data manipulation language, so you cannot directly express UPDATE, INSERT and DELETE operations in eSQL. Given this, our EntityAdapter bears little resemblance to the previous DataAdapters. We do not even derive it from the DbDataAdapter class!

    Second Service Layer: Object Services

    Object Services lives immediately on top of the EntityClient, and provides your application an object-oriented view of your data. Many public classes live in this space, but the two most important are ObjectContext and ObjectQuery<T>.

    ObjectContext

    This object’s main role is to encapsulate the underlying EntityConnection, and serve as a porthole for objects performing CRUD operations.

    When you choose to use our code generation, you get a type-safe ObjectContext that incorporates some methods specific to your data model.

    ObjectQuery<T>

    ObjectQuery<T> and its builder methods let you create queries in a completely object-oriented way. It also provides a type-safe way to create queries. Most of the time, the shape and source of your data and the filtering, grouping and sorting criteria are known at compile time. So we call this our object-oriented programming interface.

    You can still use fragments of eSQL with many builder methods, but the idea here is that you typically use ObjectQuery<T> in an early-bound manner to build queries that get compiled in your application. Even more important, the results of those queries can be full entity classes or new types created for projections.
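    As an illustration only (the context, entity, and parameter names are assumptions), a builder-method query that embeds an eSQL fragment looks roughly like this:

    // A hedged sketch: ObjectQuery<T> builder methods accept eSQL fragments, with
    // "it" referring to the element being filtered or sorted.
    // ObjectQuery<T> and ObjectParameter live in System.Data.Objects.
    using (var context = new NorthwindEntities())
    {
        ObjectQuery<Customer> query = context.Customers
            .Where("it.Country = @country", new ObjectParameter("country", "Spain"))
            .OrderBy("it.CompanyName");

        foreach (Customer c in query)
        {
            // Results are full entity objects attached to the context.
        }
    }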

    First Query Language: Entity SQL

    Entity-SQL is a text based query language that currently gives you the most expressiveness over the Entity Framework stack on late-bound scenarios. You can use Entity-SQL to get collections of rows in the Entity Services layer, but also instances of entity classes, when used with Object Services.

    I highly recommend reading Zlatko Michailov’s Entity SQL post for a head start on the language and on its main differences with traditional SQL.

    Second Query Language: LINQ Query Comprehensions

    Language Integrated Query (LINQ) is a set of strategic language extensions Microsoft is including in both C# and VB that facilitate the creation of query expressions using a terse syntax familiar to anyone who has used SQL.

    LINQ is very powerful, and it is broadly applicable since it aims to solve the problem of querying any data source, including objects in memory, databases and XML files while maintaining a consistent, object-oriented and type-safe programming interface.

    For the Entity Framework, ObjectQuery<T> is the center of our LINQ implementation. This class implements the necessary interfaces to fully support the creation and deferred execution of query comprehensions against our stack.
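    For example (a sketch, assuming the same kind of generated NorthwindEntities context used above), a LINQ to Entities query is written and deferred like any other LINQ query:

    // A hedged sketch: the comprehension builds an ObjectQuery<T> under the covers
    // and only executes against the store when it is enumerated.
    using (var context = new NorthwindEntities())
    {
        var expensiveProducts =
            from p in context.Products
            where p.UnitPrice > 50
            orderby p.ProductName
            select new { p.ProductName, p.UnitPrice };

        foreach (var item in expensiveProducts)
        {
            Console.WriteLine("{0}: {1}", item.ProductName, item.UnitPrice);
        }
    }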

    We have invested a great amount of work in correctly mapping CLR features that can be useful in queries to our EDM and query capabilities. Still, LINQ and the Entity Framework are built and optimized against different goals and assumptions, and some concepts of LINQ and the Entity Framework simply do not map one-to-one.

    We certainly plan to continue investing in better alignment. But right now the reality is that there are some things you can do with Entity SQL that still cannot be expressed in LINQ, and there are a few things you can do with LINQ that still cannot be translated or composed over in our LINQ implementation.

    Conclusion

    My original answer stays correct: using one or the other API to create your applications also has to do with taste. This is especially true thanks to the flexibility of ObjectQuery<T>, which allows you to start either with query builder methods that take eSQL fragments or with LINQ queries. Just be aware that you could run into some corner scenarios in which we cannot completely go from one model to the other and back.

    Edit: The assertion that you can mix and match LINQ and ESQL was incorrect. Once you start one way, you have to stay on that route within ObjectQuery<T>.

  • Diego Vega

    Entity Framework FAQ

    • 2 Comments

    One of the best learning resources I have found since I joined my team at Microsoft is this page Danny Simmons has just published. Look here for his post explaining it. If you have any further questions that are not covered, or you feel something is not clear enough, feel free to add your questions in the comments here or even email me.

    Danny already answers lots of questions every day in the forums, and I like "learning by explaining", so give me a chance :)

  • Diego Vega

    Stretching myself on the wrong axis

    • 1 Comments

    Something you need to learn as a Program Manager at Microsoft is how to scale. This means that you need to drive issues, multitask, excel at doing it, choose your battles, etc.

    Last week I tried a different approach that kind of worked when I was younger: stretching on the time axis. I found that it doesn't work for me as well as it used to do.

    So, this is my word of advice: if you have built a big backlog and you are executing below your expectations, don't stop sleeping. Two reasons:

    1. The more you sleep, the clearer your mind is when you are awake.

    2. The problem is that you are doing something wrong: either you are spending too much time solving the wrong problems or the expectations are too high.

    So, if you are lucky enough to work in a place like Microsoft (with thousands of talented people around), do yourself a favor: Raise your hand, ask for help.

  • Diego Vega

    This is what happened since 10/23

    • 1 Comments

    I was in the kitchen close to my office, having a coffee and mulling over what great thing I could do next with the blog. Then I came back here and saw this comment by Guillaume (a good friend in disguise), which, translated from Spanish, means "so, what has happened since 10/23?". I guess I will try to lower the bar a bit and just tell you what is happening:

    Well, I have been mostly learning what it means to be a PM at Microsoft, and in particular in Data Programmability. I have been sharing my days with this group of super smart people that is working on shipping this great product (we are close to Beta 3). The atmosphere is very interesting and it is easy to be overwhelmed with the heaps of information I get exposed to every day. Not only am I learning about the Entity Framework stack, but also about our LINQ implementation, our procedures, our branching strategies, threat modeling, driving features as a PM, filing bugs, status reports, etc.

    I am waiting until I feel more confident with the stack before I do some more technical posting. I am ok with going with the basics, but I just don't want to post any inaccurate information.

    In the meantime, life here is nice. The family is adapting very well. We have most of our stuff sorted out, have a car, and rent an apartment on the east side. I am riding The Connector to work every day, so I can start early and finish late with email... Which is a good thing, isn't it?

    By the way, I see in Outlook that right now I have 1223 messages in my inbox and 443 in my sent items folder. Seems a little unbalanced :)

  • Diego Vega

    Hello Data

    • 1 Comments

    My name is Diego Vega and I am a new Program Manager in the Data Programmability team.

    This is my first post as a Microsoft employee!

    I came here from the ISV/small software company world: a world in which data access is as important as breathing, and yet it is seldom done unconsciously or even comfortably.

    At Microsoft, they (oh, should I now say "we"?) have been investing in this space for many years, trying to come up with easier approaches, more suitable abstractions, and plain better tools.

    I personally think we have largely succeeded in improving our offerings and in simplifying our customers' jobs each time. From ODBC to ADO.NET and, more recently, LINQ, Microsoft's contribution has been very influential.

    The latest fruit of this investment is the Entity Framework, a piece of technology that I think could affect the way we do and perceive data access in a magnitude comparable to what relational databases did more than 20 years ago.

    I am very excited to be part of the team that is building this technology and I can hardly think of a better place to be at Microsoft at this time!

    But as I have said, "we" (as in us, nous, nosotros) have been building this for years and "I" (as in me, je, yo) have just arrived. Just grokking it well (and simultaneously learning how to be a good PM) will be a great challenge.

    Therefore, I intend to use this blog as a learning tool for myself, by compiling my impressions, code samples, and whatever I find pivotal in understanding the technology. In my experience, you learn more when you explain things. Can't explain why :)

    Hopefully, in sharing my learning progress, I will also make a small contribution in simplifying your job.
