MVP Summit Recapped: Linq for Entities, MonoRail, and Shameless Name Dropping

Just to get this out of the way, here is my recap from the MVP summit.  You will find something to argue about here, and that’s okay.

LINQ for Entities is NOT the O/R Mapper I want today, but might be if and when they…

As it stands right now, I would still choose NHibernate (or WilsonORMapper) over Linq for Entities as an O/R Mapping solution, especially since it looks very likely that we’ll have Linq support for NHibernate as well.  That’s a little disappointing because there is a lot of promise to Linq for Entities.

At the MVP Summit, the Microsoft team building Linq for Entities very graciously spent some time with several of us to talk over some of the details around making LINQ for Entities more suitable for Domain Driven Design and evolutionary design.  Specifically, we were largely concerned with the intrusiveness of Linq for Entities into your Domain Model classes, the general clumsiness of the configuration model as it is right now, and the mechanisms for tracking object state.  There are definitely some very cool things in Linq for Entities, but it’s a shame that the usability isn’t there yet.

 

What I don’t like:

  • It does not support a Persistence Ignorant approach.  This is fairly significant to me.  I’m in the camp that really doesn’t like any infrastructure code in my business logic classes.  It’s also a Signal to Noise problem: the only signal I want to care about is the business logic.  Anything else is noise code.
  • The configuration is too complex because it exposes that 3rd conceptual model. 
  • The “changed” state of the persisted objects is tracked in the objects themselves with a marker interface somewhat like the INotifyPropertyChanged interface.  I really, really don’t like this.  I don’t like the implicit black magic idea of managing the change state inside of the objects themselves.  I think it adds noise code and makes the transaction boundaries less clear.
  • As of now, Linq for Entities is optimized out of the box for a datacentric approach that calls for designing the database model upfront and then codegen’ing the object model from the database.  This isn’t the way I want to work because this approach almost forces you into a heavier upfront design.  We’ve already got a dozen+ workable solutions for datacentric application building.  I wish they’d delivered a solution upfront for Domain Driven Design to differentiate it more from Linq to SQL.
  • Attributes in the domain model classes.  I’m not sure I have a hard opinion on using attributes for the mapping, but the attributes for Linq for Entities are a duplication of the Xml configuration.  It’s a violation of the DRY principle of good design.

What I want:

All of these “wants” were promised to us in a post-Orcas release.  I’d like to lay these out here to get more visibility for these “wants” to make sure they get a better place on the Linq to Entities roadmap.

The first thing I want is support for a pure Persistence Ignorant approach.  No marker interfaces, no codegen, no partial classes.  Just plain “POO.”
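To make that concrete, a persistence ignorant domain class is nothing but plain C#.  This is a hypothetical sketch (Invoice and InvoiceLine are invented names, not anything from Linq for Entities):

```csharp
using System.Collections.Generic;

// A persistence ignorant domain class: no mapping attributes, no marker
// interfaces, no codegen'd partial class.  Just state and business logic.
// (Hypothetical example; the types and names are invented for illustration.)
public class Invoice
{
    private readonly IList<InvoiceLine> _lines = new List<InvoiceLine>();

    public string InvoiceNumber { get; set; }

    public void AddLine(InvoiceLine line)
    {
        _lines.Add(line);
    }

    public decimal TotalAmount()
    {
        decimal total = 0;
        foreach (InvoiceLine line in _lines)
        {
            total += line.Amount;
        }
        return total;
    }
}

public class InvoiceLine
{
    public decimal Amount { get; set; }
}
```

Nothing in that class knows or cares how, or even whether, it gets persisted.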

The next thing is support for the Unit of Work pattern.  Transaction boundaries are important details.  I want the contents of a transaction explicitly defined, and expressed in a way that is easy to test.  Enter the Unit of Work:

    public interface IUnitOfWork
    {
        void Added(object target);
        void Deleted(object target);
        void Updated(object target);
        void SubmitChanges();
    }
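A sketch of how application code might consume that interface (OrderService and Order are invented names).  The point is that the transaction boundary is stated explicitly, and the interface is trivial to mock in a unit test:

```csharp
// Hypothetical consumer of the IUnitOfWork interface above.  The
// transaction boundary is explicit in code instead of being implicit
// black magic inside the persisted objects themselves.
public class Order
{
    public string Status { get; set; }
}

public class OrderService
{
    private readonly IUnitOfWork _unitOfWork;

    public OrderService(IUnitOfWork unitOfWork)
    {
        _unitOfWork = unitOfWork;
    }

    public void PlaceOrder(Order order)
    {
        order.Status = "Placed";

        _unitOfWork.Added(order);
        _unitOfWork.SubmitChanges();   // the explicit transaction boundary
    }
}
```

In a test you hand OrderService a mock IUnitOfWork and simply assert that Added() and SubmitChanges() were called with the right object.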

The configuration model is somewhat divorced from the configuration format already (this was a huge lesson I learned with StructureMap), so we should in theory begin to write alternative configuration formats.  The first thing I would do is create a simplified format that completely hides the conceptual model from the user.  Next, maybe think about a “convention over configuration” approach ala ActiveRecord in Ruby.  I think you could also find a way to generate DDL from an object model the way that you can with NHibernate. 

Evolutionary or Continuous Design

I’m a practicing XP’er and a believer in the benefits of using Continuous Design, but it doesn’t just come for free.  The obvious objection to Continuous Design is the risk of churn and the cost of changing code, and we try to beat this by purposely choosing tooling that enables easy evolution of a software design.  I’ll say this and let the arguing start:  it’s significantly easier to evolve the design of the object model first, then let the database drop out from the object model (and then optimize it!) only when the object model is solidified.  I want to be able to quickly add properties, rename properties, and add methods to evolve little by little.  I can lean very heavily on refactoring tools and fast running unit tests to make small, evolutionary changes in middle tier code.  C# is soft.  Even with the Agile Database Techniques described by Scott Ambler, the database structure is still more work and effort to change in comparison.

I don’t want to codegen my domain model classes.  Having your classes split into a pair of partial classes or an abstract class with data elements and a subclass with logic is clumsy to me.  I think that code is harder to understand because of the “CTRL-TAB” factor switching back and forth.  I also don’t like having to fire up a modeling tool to make a change, then depend on the compiler to find all the other code I’ve busted by changing the signature of the model.  Again, I want to lean on ReSharper and my unit tests for little changes.

What’s Cool about Linq for Entities?

As David Laribee pointed out, Linq for Entities is much more than an O/R Mapper.  It potentially provides us with a unified data access strategy over heterogeneous data sources (web services, xml, non relational databases, etc).  That’s great if it succeeds, but that strategy, IMO, has added quite a bit of complexity that’s fully exposed to the end users in the form of 3 way mapping (object model, conceptual model, and relational model).  The conceptual model only adds value for mapping non-relational data (I’d say it’s Gregor’s Canonical Data Model pattern).  If it succeeds and they address the ease of use and POO issues, Linq to Entities could be really good.

Other things:

  • The underlying database mappers are emitted rather than using reflection
  • It supported every reasonable mapping scenario I could think of to ask about
  • I love the Linq query language to express queries in terms of the Domain Model with Intellisense
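On that last point, a query phrased in terms of the Domain Model reads something like this (a sketch only; "db", the Customer properties, and cutoffDate are placeholders standing in for whatever the real query source would be):

```csharp
// A Linq query expressed against the Domain Model, with Intellisense
// and compile-time checking on the property names.
// (Illustrative sketch; "db" and the Customer shape are placeholders.)
var overdueCustomers =
    from customer in db.Customers
    where customer.Balance > 0 && customer.LastPaymentDate < cutoffDate
    orderby customer.Name
    select customer;
```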

If Linq for Entities should fail, it’s going to be an object lesson.  Instead of choosing one major use case first (O/R Mapping) or focusing on the user experience, they put a huge chunk of work into the foundational architecture.  Fine, but at this point Linq for Entities is like a giant big-block V8 engine mounted to a weak transmission.  The raw power is there, but it’s clumsy to use in my opinion.

What about LINQ for SQL/DLINQ? 

By and large I actually like DLINQ, but it’s not really aimed at the same scenarios as a full-blown O/R Mapper.  In my mind, DLINQ looks like an evolutionary improvement over strongly typed datasets, and applies to basically the same scenarios where a DataSet would be acceptable.  DLINQ is specifically targeted for very data centric development, so I don’t think you would want to try a rich domain model approach with it.  I would consider using it for reporting applications, simple CRUD applications, data services maybe, and most immediately for writing automated tests against a database with FIT.  In the end, I think I would describe DLINQ as putting Intellisense on top of the database model.

Mapping to Stored Procedures

Yes, you will be able to map entities to stored procedures, and I’m seeing people working through how to do this.  I’m going to make a fearless prediction:  mapping to stored procedures will take significantly more mechanical work than mapping entities directly to database tables and columns.  I don’t want to rehash the sproc arguments yet again, but keep in mind the mechanical cost of using sprocs in this case balanced against any kind of perceived value.

 

MonoRail is a Watershed Moment for the .Net Community

I think MonoRail is a huge, huge deal for the .Net community.  Finally, we have an application framework that comes from the community that provides a great deal of value and enables a development style that Microsoft does not.  If you’re not already familiar with it, MonoRail is an open source Model View Controller framework for web development.  As you can probably tell, it’s somewhat influenced by Ruby on Rails.

First, let’s look at the simplified page cycle in MonoRail (and Rails):

  1. A web request is made.  A Front Controller takes the request first, then using the url of the request, chooses the appropriate action method on a controller class to call.
  2. Call the action method on a controller
  3. The controller makes any necessary updates
  4. The controller builds a model of some sort
  5. The model is passed into a view for rendering

There are a lot more details, and I’ve oversimplified a lot, but compare and contrast this to the WebForms page cycle of events.  The MVC nature of MonoRail encourages and even enforces a consistent separation of concerns between the controller and view templates.  MonoRail is also much, much easier to unit test than the equivalent pages in a WebForms application – even with a Model View Presenter structure.  You can test a MonoRail application with less friction because it is much more decoupled from the runtime.  The lifecycle of a MonoRail page is much more in tune with the reality of a web page, so there’s much less “leak” in its abstractions.  That in turn makes TDD more valuable as a way to model the actual behavior of the code.  You can MVP the life out of a WebForms page, but you still end up twiddling with the page events to get everything just right.

More, in no particular order:

  • As an architecture, ASP.Net WebForms has some serious weaknesses in regards to testability and maintainability.  The attempt to abstract web development as a stateful forms based model simply does not work very well.  I thought WebForms was a brilliant idea in 2001, but in use I think it adds far too much heft to development tasks that used to be easy in ASP classic.  It’s not just me that feels this way either; a number of the people I spoke to at the MVP Summit shared the same general feeling that the WebForms model just isn’t the right direction.
  • It’s community driven.  It’s being built by the very .Net developers that use it, without any assistance from Microsoft.  Read that again.  .Net developers, in the field, are building this thing to suit the way they want to work.  We do not have to wait for Microsoft to do everything for us.  I met a lot of smart people at Microsoft, but they’re just as human as you and I, and there’s not an infinite number of developers at Microsoft to build everything we could possibly want.  MonoRail is not bound to the Orcas release cycle, so it can move at a much faster rate than something bundled up into the official .Net platform.
  • It represents innovation from outside of Redmond, and we can always use more innovation in the .Net community.
  • My esteemed CodeBetter colleague, David Hayden, recently compared and contrasted MonoRail to the new Web Client Software Factory.  The WCSF might reduce the mechanical cost of generating the initial code with WebForms and add some better practices like MVP and DI, but MonoRail will still have a potentially large advantage in terms of any type of architecture with WebForms:  Testability & Maintainability.  Over any length of time, these two “ilities” lead to lower Total Cost of Ownership and a better Return on Investment.  The software factory codegen features are cool, but ROI and TCO are sexy to management.
  • As you might have read on Jeffrey’s blog post, Scott Guthrie is working on a new concept for a true MVC framework for ASP.Net.  I liked what ScottGu demonstrated, but there’s absolutely nothing concrete planned at the moment.  No expected dates, no commitment.  What I’m getting at here is that there is no reason to bypass MonoRail in the near future if you want an MVC framework that provides a high degree of testability and productivity.  Besides, would it really hurt to have some serious diversity in the tooling and approaches you can take for building dynamic websites in .Net?
  • In comparing WebForms to MonoRail think on this.  MonoRail might need some things added to it (documentation, extra features, etc.) to catch up in some spots, but WebForms needs complexity ripped out.  Guess which option is easier – adding or removing complexity?

 

The latest Hanselminutes podcast is an introduction to MonoRail with the Eleutian guys.  I would highly recommend you give it a listen for some background (I was in the room while they recorded it.  You can blame any background noise on me).

There, are you happy Hamilton?   Anything big I missed?

 

Other Observations

  • Big UML is Dead!  Long live Little UML! – Nobody was talking about large scale UML designs anymore.  Executable UML didn’t even come up in the talks that were skirting on Model Driven Architecture.  I still think UML is useful, but only in a lightweight whiteboard modeling sense or as documentation after the fact.  One thing that Sam said that I heartily agree with is to be somewhat precise about UML notation when you do use it to avoid misunderstandings.  If you want to go fast, you need to be clear.  I still think I can teach another developer everything they really need to know about UML in 15 minutes.
  • Domain Specific Languages – This was a huge topic all week.  I liked the talk we saw from Don Box on this subject.  One thing he made clear was that DSL’s represent an attempt to raise the abstraction level for very specific problems.  Another point I wish he’d made louder is that DSL’s do not automatically equate to new modeling dialects or custom Xml formats (coding in Xml, been there, done that).  A lexical DSL written in near English is arguably easier to understand, and probably to write.  I didn’t get a chance to talk to him much, but Neal Ford was there representing the Ruby angle on DSL’s.  Look for a book from him and some other fatbrain Thoughtworkers on embedding your own DSL’s in Ruby soon.
  • Queries are a Business Concern – Ayende said it first, and many people I spoke to thought so too.  There are a lot of business rules embedded into “where” clauses.  I really like the idea of moving this business logic to business logic classes.  I think it reduces the intellectual overhead of understanding a system by gathering related logic into a single place.
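One way to pull that business logic out of scattered “where” clauses is a specification style class that names the rule in a single place.  A hypothetical sketch (Customer and the rule itself are invented for illustration):

```csharp
using System;

// The business rule for "a delinquent account," defined once instead of
// duplicated across ad hoc "where" clauses throughout the codebase.
// (Hypothetical sketch; the Customer shape and the rule are invented.)
public class Customer
{
    public decimal Balance { get; set; }
    public DateTime LastPaymentDate { get; set; }
}

public class DelinquentAccountSpecification
{
    private readonly DateTime _cutoffDate;

    public DelinquentAccountSpecification(DateTime cutoffDate)
    {
        _cutoffDate = cutoffDate;
    }

    public bool Matches(Customer customer)
    {
        return customer.Balance > 0 && customer.LastPaymentDate < _cutoffDate;
    }
}
```

Anyone looking for the definition of “delinquent” now has exactly one place to read, test, and change it.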

Name Dropping

I wasn’t gonna do this, but everyone else is, so why not?  The MVP Summit was a great experience.  The official content was so-so, but the people I got to interact with were tremendous.  In no particular order, and I’m sure I left someone out:

I finally met more of the CodeBetter gang – Karl Seguin, Darrell Norton, Raymond Lewellan, Jeff Lynch, and Greg Young.  Plus new CodeBetter addition Jean Paul Boodhoo.  My old Austin friends Scott and Jeffrey were there, as was CodeBetter dean Sam Gentile.  I spent a lot of time around the flower of Des Moines, Iowa development Nick Parker, Tim Gifford, and Javier Lozano.  Somebody let a Google guy in.  I finally met Scott Allen of OdeToCode fame.  Ian Cooper came over from the UK.  I had to go all the way to Seattle to finally meet Don Demsak and David Laribee.  I spoke quite a while with TShak, and Mario Cardinal is a life long friend for asking me about StructureMap.

About Jeremy Miller

Jeremy is the Chief Software Architect at Dovetail Software, the coolest ISV in Austin. Jeremy began his IT career writing "Shadow IT" applications to automate his engineering documentation, then wandered into software development because it looked like more fun. Jeremy is the author of the open source StructureMap tool for Dependency Injection with .Net, StoryTeller for supercharged acceptance testing in .Net, and one of the principal developers behind FubuMVC. Jeremy's thoughts on all things software can be found at The Shade Tree Developer at http://codebetter.com/jeremymiller.
  • http://triala.net/ Brain

    Thanks! good post.

  • http://www.bellware.net ScottBellware

    I forgot to mention that the entity classes are also instrumented with Entity Framework attributes as well…

  • Brian Dawson – MSFT

    Thanks Scott & Jeremy,

    That makes sense to me. Keeping things simple has a balance, and we definitely want to keep the framework easy to use, but have incredible power. We’ll keep working on the balance.

    I see your point about having three layers of abstraction. From the dba standpoint, they want the same thing but from the other direction. I’m not quite sure of the perfect answer as we have so many requirements.

    Something we heard was that no matter how many layers, you want to definitely create OOD first without the persistence knowledge, then plug in the persistence based on the shape of the design.

    Keep giving feedback.

    -Brian

  • http://codebetter.com/blogs/jeremy.miller jmiller

    Brian,

    1.) What Scott said
    2.) I still think it would be sweet if the configuration model is programmable itself so you could create your own configuration schema or a programmatic API to hide a lot of the details of the underlying model

  • http://www.bellware.net ScottBellware

    Brian,

    The Entity Framework’s ORM is two ORM layers with two separate sets of mapping files. NHibernate maps objects to the relational store – it uses a single set of mapping files. The Entity Framework’s mapping schema itself is also more complex than NHibernate’s schema. The volume of the XML in the Entity Framework metadata itself is heavier than NHibernate’s mapping and some of the element names are confusing and don’t seem to have any association to common ORM verbiage.

  • http://iancooper.spaces.live.com/ Ian Cooper

    On OSS. Sara Ford spoke to us at the Summit about trying to champion OSS in MS. http://blogs.msdn.com/saraford/default.aspx. I’d hope that we can make her feel appreciated.

  • http://iancooper.spaces.live.com/ Ian Cooper

    >By and large I actually like DLINQ, but it’s not really aimed at the same scenarios as a >fullblown O/R Mapper. In my mind, DLINQ looks like an evolutionary improvement over >strongly typed datasets, and applies to basically the same scenarios where a DataSet >would be acceptable.

    I’m not sure that this is not doing LINQ to SQL a disservice, in that I think it covers a lot of the same functionality as, say, Wilson O/R Mapper. I’m trying to catalogue what I see as the LINQ mapping to Fowler’s patterns right now, so we can try to identify what’s there, what’s missing, and how our existing experience might apply. I’ll try to get a blog up in the next week or so and you guys can find the holes in my thinking here.

    You can certainly deliver a persistence ignorant solution with LINQ to SQL right now as far as I can tell, which doesn’t seem to be the case for LINQ to Entities right now. MS is not really documenting the approach to do this, so that does make it harder to see right now. For me though the gap seems to be around legacy data schemas. Perversely, I think LINQ to SQL works better from a domain driven angle if you use the appropriate feature set, because a lot of the lightweight ORMs struggle with schemas that don’t meet their expectations, particularly around inheritance. Certainly applying an ORM to an existing schema has been one of my main sources of grief in introducing them to enterprises that have not used them (along with the DBA resistance to anything that uses dynamic SQL).

    I’m less convinced than others on the wins for LINQ to Entities over LINQ to SQL outside the space of mapping to complex schemas. I prefer the simplicity of LINQ to SQL unless I can justify the cost of LINQ to Entities around multiple mapping files, EQL, and active record. However, I wasn’t in that meeting with you guys so I may be missing parts of the story around LINQ to Entities here.

  • Brian Dawson – MSFT

    Jeremy,

    Great write up. Much better than my internal notes. Question for you regarding one of the bullet points you made.

    “The configuration is too complex because it exposes that 3rd conceptual model.” Why? Can you explain more?

    We’ve talked internally almost as much as you’ve blogged.

    Thanks,
    Brian

  • sergiopereira

    > * The Microsoft guys were all feeling a bit persecuted when the topic of
    > OSS came up. On one hand they’re asked to build equivalents by
    > one community, and criticized by another for competing with OSS tools.
    > I feel for them a little bit on that one.

    I feel for them too. Isn’t there an alternative that makes everyone happy? Or at least most of the people?
    Would it be the end of the world if MS developers started participating more in these high-visibility OSS projects like NHibernate, *Unit, Castle Project, etc? So they could maintain a list of solutions that they endorse to customers and maybe even offer some level of support. Support is probably the keyword here. Embracing OSS components would require some changes in the way support is charged and offered.

  • http://davidhayden.com/blog/dave/ dhayden

    You are a class act my friend. It is a real joy and learning experience reading your posts :)

    I think MonoRail is a huge deal for the .NET Community as well and am loving it more and more each day. A lot of great functionality, but it is a little hard to find at times and some of it is only available in the repository as opposed to available for download. I look forward to a new release of MonoRail as well as just a fresh release ( RC3 ) of all the products. I wonder whatever happened to their nightly build idea?

    There are some things in WCSF that I would definitely like to see in MonoRail. They have a page flow application block to make workflow child’s play in ASP.NET. They have some really nice Site Map and Security Services that I would like to see in MonoRail.

    As far as things proposed in WCSF, Hammett has already done some nice things with a new validator component that I was hoping would be available for download soon. Windsor already has AOP ( learned that one :) so all the Policy Injection Application Block stuff is a moot point.

    Overall, I would say that MonoRail is definitely standing out really well here. I am still a little ignorant on the different view engines, however, and wish I could leverage what I know about Webforms a little more easily. But, again, maybe I can. I am continually learning it and things are getting easier.

    Good stuff, Jeremy!

  • http://hammett.castleproject.org hammett

    No, you pretty much nailed it :-D

  • http://turtle.net.nz/blog Jeremy Boyd

    Hey Jeremy,

    Good post – but why is MonoRail such a watershed? Ive put some thoughts up about my experiences with using MonoRail: http://turtle.net.nz/blog/post/ThoughtsOnMonoRail – my 2c is I wish we could get some better integration with the ASPX view engine (from the ASP.NET team perhaps).

    I agree with your thoughts that it would be nice to get better support of community projects coming from the product teams. I think they are listening a lot more these days though! :)

  • http://www.ayende.com/Blog Ayende Rahien

    > We didn’t see any examples of to many relationships.

    There is a part of the sentence missing here, no?

  • http://www.ayende.com/Blog Ayende Rahien

    >You know that I think we’d be better off if we could somehow make OSS acceptable in the .Net community at large instead of just a niche thing.

    If you have any suggestions, I am listening.

  • http://codebetter.com/blogs/jeremy.miller Jeremy

    Ayende,

    I didn’t see anything either way. Sorry. We didn’t see any examples of to many relationships.

    Jeremy

  • http://codebetter.com/blogs/jeremy.miller Jeremy

    I missed a couple of points related to OSS,

    * The Microsoft guys were all feeling a bit persecuted when the topic of OSS came up. On one hand they’re asked to build equivalents by one community, and criticized by another for competing with OSS tools. I feel for them a little bit on that one.

    * OSS just doesn’t seem to be accepted as a mainstream thing in .Net yet. Several of the consultants I spoke with consistently had trouble getting clients to adopt OSS tools that they felt were superior to commercial offerings. You know that I think we’d be better off if we could somehow make OSS acceptable in the .Net community at large instead of just a niche thing.

    I’m writing this on my wife’s Mac. I gotta say that I instantly feel smarter and more creative.

    Jeremy

  • http://www.ayende.com/Blog Ayende Rahien

    Jeremy,
    What about custom types, especially custom collections?
    This is an important issue to me, since I work on temporal systems, where basically anything was accessible on the time zone, and the collections had to support this.
    I was able to go a long way with NH in this regard, but the answers I got from MS in the past (admittedly) long ago, was that it wasn’t going to be supported.