Working faster with fewer mapping errors in NHibernate

EDIT:  To access the codebase below, the user name is “guest” and the password is blank.



David Laribee related some problems he experienced with refactorings in his
domain model breaking his NHibernate mappings.  Specifically, the issues are:

  1. Changing the property names of a domain model can break the NHibernate mapping
  2. Changing the database fields can break the NHibernate mappings

David went on to bemoan the absence of a quick way to validate NHibernate
mappings.  Ayende popped in with the suggestion that the presence of
integrated tests around the NHibernate usage would spot mapping problems. 
Other folks mentioned that there’s a new ReSharper plugin to validate and
refactor NHibernate mappings.  I’ll circle back to the refactoring plugin
in a while. 

First I want to talk about quick ways to validate NHibernate
mappings.  Ayende is right of course about the integrated tests against the
individual queries, but I’m going to suggest that that isn’t the most efficient
answer to the question of validating the mappings.  The bigger integration tests will tell you that something is
wrong, but from experience they’ll be harder to diagnose because there is so
much more going on than simple property checking, and all of that extra
activity makes for a slow feedback cycle.  What would be nice
is a way to walk right up to a mapping and specify which properties on a class
are supposed to be persisted and how.

I thought I would come out of my blogging retirement and show an example of the
Chad and Jeremy
approach to testing NHibernate mappings:


        [SetUp]
        public void SetUp()
        {
            // Ensure that the StructureMap configuration is bootstrapped.
            // In our case, this includes everything we need set up to
            // execute NHibernate (mappings + ISessionFactory configuration).
            // This will be pretty application specific.
        }

        [Test]
        public void SaveAndLoadCustomerContact()
        {
            new PersistenceSpecification<CustomerContact>()
                .CheckProperty(x => x.Name, "Frank")
                .CheckProperty(x => x.Email, "Email")
                .CheckProperty(x => x.Extension, 123)
                .CheckProperty(x => x.FaxNumber, "111-111-1111")
                .CheckProperty(x => x.TelephoneNumber, "222-222-2222")
                .CheckProperty(x => x.Title, "Mr.")
                .VerifyTheMappings();
        }

All this test does is ensure that the 6 properties of the CustomerContact class
(Name, Email, Extension, FaxNumber, TelephoneNumber, Title) are mapped in
NHibernate.  We have some other methods for checking to-many and
many-to-one relationships. 
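For illustration, a relationship check could follow the same pattern as `CheckProperty` above. This is a hypothetical sketch only: the `CheckReference` method name and the test body are assumptions, not the actual ShadeTree API.

```csharp
// Hypothetical sketch: CheckReference is an assumed method name for
// verifying a many-to-one mapping, following the CheckProperty pattern.
[Test]
public void SaveAndLoadCustomerContactWithItsCustomer()
{
    // An already-persisted parent for the many-to-one side (assumption)
    var customer = new Customer { Name = "Acme" };

    new PersistenceSpecification<CustomerContact>()
        .CheckProperty(x => x.Name, "Frank")
        .CheckReference(x => x.Customer, customer) // many-to-one round trip
        .VerifyTheMappings();
}
```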

Behind the scenes the PersistenceSpecification<T> class:

  1. Creates a new instance of T
  2. Uses the lambda expressions and suggested values in the calls to CheckProperty
    to load values into the new instance of T
  3. Grabs our IRepository out of StructureMap (of course), and saves the object
  4. Grabs a second IRepository out of StructureMap
  5. Fetches a second copy of the same T out of the second IRepository
  6. Verifies that all of the specified properties in the specification were saved
    and loaded.  If any single property does not match between the first T and
    the second T, the test will fail.
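Steps 2 and 6 hinge on capturing a property from a lambda expression. Here is a simplified, self-contained sketch of that per-property plumbing; the `PropertyValue<T>` class name and all details are hypothetical, not the actual ShadeTree code.

```csharp
// Hypothetical, simplified sketch (not the actual ShadeTree code) of the
// per-property plumbing behind steps 2 and 6: capture the property from a
// lambda, push the suggested value into one instance, then compare the
// reloaded instance against it.
using System;
using System.Linq.Expressions;
using System.Reflection;

public class PropertyValue<T>
{
    private readonly PropertyInfo _property;
    private readonly object _suggestedValue;

    public PropertyValue(Expression<Func<T, object>> expression, object suggestedValue)
    {
        // Value-type properties arrive wrapped in a boxing Convert node
        var body = expression.Body is UnaryExpression unary
            ? (MemberExpression)unary.Operand
            : (MemberExpression)expression.Body;

        _property = (PropertyInfo)body.Member;
        _suggestedValue = suggestedValue;
    }

    // Step 2: load the suggested value into the new instance
    public void SetValue(T target) => _property.SetValue(target, _suggestedValue, null);

    // Step 6: verify the value survived the save/load round trip
    public void CheckValue(T reloaded)
    {
        var actual = _property.GetValue(reloaded, null);
        if (!Equals(actual, _suggestedValue))
            throw new ApplicationException(
                $"Property {_property.Name}: expected {_suggestedValue}, got {actual}");
    }
}
```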

Here’s the implementation of the PersistenceSpecification.VerifyTheMappings() method:

        public void VerifyTheMappings()
        {
            // Create the initial copy
            var first = new T();

            // Set the "suggested" properties, including references
            // to other entities and possibly collections
            _allProperties.ForEach(p => p.SetValue(first));

            // Save the first copy
            var firstRepository = createRepository();
            firstRepository.Save(first);

            // Get a completely different IRepository
            var secondRepository = createRepository();

            // "Find" the same entity from the second IRepository
            var second = secondRepository.Find<T>(first.Id);

            // Validate that each specified property and value
            // made the round trip.
            // It's a bit naive right now because it fails on the first failure.
            _allProperties.ForEach(p => p.CheckValue(second));
        }

The advantage of this style of testing is that it gives a (relatively) fast feedback cycle
focused specifically on the mappings.  Tools that check the hbm.xml
mappings can only verify that what’s there is well formed.  The
mapping tests above will catch missing mappings and verify the desired behavior.  At a
bare minimum, you really should have at least one smoke test in your CI build
that simply tries to create an NHibernate ISession object from your
configuration.  Let that test run, and possibly fail the build, before wasting any time on integrated
tests that can’t succeed.
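That smoke test can be as small as building the ISessionFactory. A minimal sketch, assuming the standard hibernate.cfg.xml configuration and NUnit (the fixture and test names here are illustrative):

```csharp
// A minimal "can NHibernate even start?" smoke test. If any mapping fails
// to parse, BuildSessionFactory() throws and the build fails fast, before
// any slower integrated tests run.
using NHibernate;
using NHibernate.Cfg;
using NUnit.Framework;

[TestFixture]
public class NHibernateSmokeTest
{
    [Test]
    public void CanBuildTheSessionFactoryAndOpenASession()
    {
        ISessionFactory factory = new Configuration()
            .Configure()              // reads hibernate.cfg.xml
            .BuildSessionFactory();   // compiles every mapping

        using (ISession session = factory.OpenSession())
        {
            Assert.IsTrue(session.IsOpen);
        }
    }
}
```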

Now, the ReSharper plugin for NHibernate sounds pretty cool.  I definitely
want little or no friction in renaming or adding properties in my Domain Model
classes (which is why I absolutely despise codegen-your-business-objects solutions). 
We beat the refactoring problem by eliminating HBM.XML.  As part of my New
Year’s resolution to eliminate my exposure to angle bracket hell, we’ve created
the beginning of a Fluent Interface API to express NHibernate mappings. 
Using copious amounts of Generics (I guess .Net code just “wants” to have lots
of angle brackets) and lambda expressions, we’re able to express NHibernate
mappings in a completely compiler-checked, ReSharper-able way.  Since we
switched to the FI, we’ve experienced far less trouble with mapping problems. 
Here’s a couple of examples:

    // Simple class with properties and a single "many-to-one" relationship
    public class CustomerContactMap : ClassMap<CustomerContact>
    {
        public CustomerContactMap()
        {
            Map(x => x.Name);
            Map(x => x.Email);
            Map(x => x.Extension);
            Map(x => x.FaxNumber);
            Map(x => x.TelephoneNumber);
            Map(x => x.Title);
            References(x => x.Customer);
        }
    }

    // Class with a "Component"
    public class CustomerDeliveryAddressMap : ClassMap<CustomerDeliveryAddress>
    {
        public CustomerDeliveryAddressMap()
        {
            Map(x => x.Name);
            References(x => x.Customer);
            Component<Address>(x => x.Address, m =>
            {
                m.Map(x => x.AddressLine1);
                m.Map(x => x.AddressLine2);
                m.Map(x => x.AddressLine3);
                m.Map(x => x.CityName);
                m.Map(x => x.CountryName);
                m.References(x => x.State);
                m.References(x => x.PostalCode);
            });
        }
    }

    // Class with some "has many" relationships
    public class CustomerMap : ClassMap<Customer>
    {
        public CustomerMap()
        {
            HasMany<CustomerContact>(x => x.Contacts).CascadeAll();
            HasMany<CustomerJob>(x => x.Jobs).CascadeAll();
            HasMany<CustomerDeliveryAddress>(x => x.DeliveryAddresses).CascadeAll();

            Map(x => x.Name);
            Map(x => x.LookupName);
            Map(x => x.IsGeneric);
            Map(x => x.RequiresPurchaseOrder);
            Map(x => x.Retired);
        }
    }

            Map(x => x.Name);
            Map(x => x.LookupName);
            Map(x => x.IsGeneric);
            Map(x => x.RequiresPurchaseOrder);
            Map(x => x.Retired);

With this approach, backed up with the little PersistenceSpecification tests,
we can happily change class names and property names with relative confidence. 
Besides, the simple usage of Intellisense plus compiler-safe code cuts down on
the number of mapping errors.  We’re more or less greenfield at the moment, so we can get away with generating
the database from our NHibernate mappings on demand, but you can specify
explicit table and column names in the language above for brownfield scenarios. 
I’d very confidently say that we’re faster with this approach than we would be
with HBM.XML.
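For the brownfield case, the overrides might read like this sketch; the `WithTable` and `TheColumnNameIs` names are illustrative assumptions about the mapping language, not necessarily the real methods:

```csharp
// Hypothetical sketch of mapping onto an existing (brownfield) schema:
// override the table and column names instead of relying on conventions.
// Method names here are assumptions, not the actual ShadeTree API.
public class CustomerMap : ClassMap<Customer>
{
    public CustomerMap()
    {
        WithTable("TBL_CUSTOMER");                     // legacy table name
        Map(x => x.Name).TheColumnNameIs("CUST_NAME"); // legacy column name
        Map(x => x.LookupName);                        // conventional name
    }
}
```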

If you’re interested, the complete code for everything I talked about is in the ShadeTree.DomainModel project in the StoryTeller codebase (effectively released under the Apache 2.0 license). Use src\ShadeTree.sln for this stuff. I don’t know that we’ll ever get around to a fully supported release, but I wanted to throw out the idea anyway. At this point I’m only extending this code when we need something new for our current project. 
A lot of the advantages of this approach are tied to application-specific
conventions and also to tying the forward generation of the database structure
to validation rules.


As for IoC container testing…

I’ll overlook the fact that my friend David also implied that an
IoC container not named StructureMap was the de facto standard.
Bil Simser posted a little snippet of code to smoke test the configuration of one of those other IoC containers.  StructureMap has had quite a bit
of diagnostic support since version 0.85 (StructureMapDoctor.exe), but StructureMap 2.5 will add the
ability to do this:

        [Test]
        public void SmokeTestStructureMapConfiguration()
        {
            ObjectFactory.AssertConfigurationIsValid();
        }

This will attempt to build every possible configured instance in StructureMap, perform any designated
environment tests (like trying to connect to a database), and generate a complete report of all errors encountered by StructureMap. 
If you’re aggressive about managing all external services and configuration through your IoC container, this diagnostic
test can go a long way toward detecting environmental and configuration errors in a code installation.

About Jeremy Miller

Jeremy is the Chief Software Architect at Dovetail Software, the coolest ISV in Austin. Jeremy began his IT career writing "Shadow IT" applications to automate his engineering documentation, then wandered into software development because it looked like more fun. Jeremy is the author of the open source StructureMap tool for Dependency Injection with .Net, StoryTeller for supercharged acceptance testing in .Net, and one of the principal developers behind FubuMVC. Jeremy's thoughts on all things software can be found at The Shade Tree Developer.
This entry was posted in Database and Persistence, StructureMap.
  • Ben

    This is great! Old article, but very useful.

  • Jeremy D. Miller


    The short answer is yes. We do exactly what you describe in our project with the progenitor of Fluent NHibernate

  • Frederik Gheysels


    This might be a comment that is a bit late, but anyway …

    This fluent interface to NHibernate seems really cool and I wonder whether it would be possible to inherit ‘Classmapping’ types ?

    For instance, I have an EntityBase class which has some properties that all my Entities have.

    So, my entities inherit from EntityBase.
    Right now, I’m using XML files to describe my mappings, and it is a bit tedious if I have to map the properties that have been declared in my EntityBase class every time again in each hbm.xml file.
    So, it would maybe be cool if I could create a ClassMapping type for my EntityBase class, which would then contain mapping information for these base-properties.
    Then, I could have a ClassMapping which inherits the EntityBaseClassMapping, so that I do not have to define mapping information for those common properties.

    Of course, this would mean that, in the EntityBaseClassMapping one cannot define a ‘table’ to where the class must be mapped onto, and, this also means that my common properties (declared in EntityBase) should always have to be mapped to certain columns. I mean: by defining this mapping in the Base-mapping, it requires that all those columns in the DB have the same name across all tables.

  • Colin Jack

    We do that too, sometimes they are also overkill (also covered in the book). Even if you do use a factory it’s quite nice to see absolutely required (and unchanging) dependencies on the constructor.

    For example an Account may need to be told what kind of Account it is by being passed an AccountKind (assuming that’s the design we have). We could use a factory, but we may choose to pass that value to the constructor as it’s not ever valid for it to be null or for it to change.

    Also one question mark does the job :)

    Tried using ObjectFactory.AssertConfigurationIsValid() when I knew I had invalid mappings. It did indeed cause an exception to be thrown:

    “No Default Instance defined for PluginFamily…”

    Unfortunately, after travelling through 3 other catch blocks it got to a catch block in ValidationBuildSession which appeared to swallow the exception, the code carried on and in the next iteration of a foreach in PipelineGraph.Visits you get an InvalidOperationException because “Collection was modified; enumeration operation may not execute.”

    We’re using 2.5 (downloaded from SVN this morning) and although this doesn’t really matter I thought I should report it.

  • Brian Chiasson


    The dependency on NHibernate is fixed, but now it is complaining about the ShadeTree.WinFormsTesting reference. It looks like the reference in the ShadeTree.Testing library is relying on StoryTeller being compiled ahead of time. It’s probably worth adding the ShadeTree.WinFormsTesting project to the ShadeTree solution.

    (FYI – I am using the VS 2008 solutions.)

  • Jeremy D. Miller


    Fixed, and thanks for letting me know.

    And of course Ayende wrote Linq for NHibernate. He’s a machine.

  • Brian Chiasson


    I checked out the ShadeTree solution from the above repository, but the code does not compile (no check out and go). It is missing some NHibernate references that are not in the StoryTeller repository. I downloaded the latest builds and the latest releases from NHibernate, but they do not contain the NHibernate.Linq library – did Ayende write this?

    Just wanted to let you know.

  • Sheraz

    “Interesting on the default constructors, my first focus is on getting the correct model and I often find default constructors don’t cut it.”
    Why so????? Why not use Factories if you want to inject values into your model??? If I recall, Jimmy Nilsson also discusses this approach in his DDD book.

  • Colin Jack

    The only other pattern I mentioned was aggregate though and if I didn’t bring it up my DDD badge would be taken off me.

    Interesting on the default constructors, my first focus is on getting the correct model and I often find default constructors don’t cut it. One of the guys I worked with…oh hold on I thought ahead to what I was going to say and I realized it was definitely gonna go into DDD jargon land so I’ll stop there :)

  • Jeremy D. Miller


    Someday you and I are going to have a long talk about fitting a mouthful of DDD jargon into every sentence;-)

    Yes, I’m assuming a default constructor on each and every domain model object. It makes a lot of testing automation scenarios easier. Easier testing == productivity.

  • Colin Jack

    Actually one other thing I noticed, you new up the aggregate using “new T()”. Does this mean your prefer all your aggregate roots to have default constructors or is it just for that example?

  • Colin Jack

    “Using two repositories was just a way to make sure we were pulling the persisted data back out of the database.”

    Sure, I just presumed you’re two repositories must be using different ISessions (unless your clearing the session somewhere). Just wondered, thought something interesting might be happening.

    “Yes, the ObjectMother approach would largely eliminate the need for this, but is that really a big deal?”

    I think for us it would be, but I guess it depends. Seeing the code to populate an entire aggregate (for the tests that do that) inside the method that performs the test is going to get kinda long. Plus that code isn’t reusable. Shouldn’t have brought it up though, not a big thing.

    “You go ahead and do all that copy and pasting of type names and property names to the xml. I’ll happily use Intellisense;)”

    Maybe you’re right, will definitely give it a shot anyway.

  • Jeremy D. Miller


    I fixed the formatting. CS decided to “help” me format the code. I just had some nasty flashbacks to coding ASP Classic with Frontpage 98.

    “One thing I think that I don’t get is the way you use the builder. Surely the specific values don’t matter, for example it doesn’t matter you used 123 as the extension rather than 456”

    I just need some sort of unique value to set the properties. I tried once upon a time to create a data binding tester that would just use made up values behind the scenes to set and check. That was such a PITA that I just made the “suggested” values explicit. Yes, the ObjectMother approach would largely eliminate the need for this, but is that really a big deal?

    Using two repositories was just a way to make sure we were pulling the persisted data back out of the database.

    There’s more to the FI than a one to one replacement to the HBM.XML. We’re also sinking in some conventions to create “auto-mapping.” At a start, you’ll notice that there isn’t any mention of identity columns in the sample above because that’s done behind the scenes by the mapping language.

    You go ahead and do all that copy and pasting of type names and property names to the xml. I’ll happily use Intellisense;)

  • Colin Jack

    First off, dunno why, but although they initially showed up fine, the code snippets now won’t display properly in Firefox or IE. In FF each code snippet acts like it is just one line with stray markup tags in it, not sure why.

    * Mapping Verification *
    On the mappings verification stuff, nice.

    Our approach is a little different. You inherit from an abstract AggregateRootPersistenceTestBase and have to override a few methods such as CreateAggregateRoot, ModifyAggregateRoot and CreateRepository. The base class does everything else so for free you get create/update and concurrency tests which works nicely and isn’t that different.

    One thing I think that I don’t get is the way you use the builder. Surely the specific values don’t matter, for example it doesn’t matter you used 123 as the extension rather than 456 (though a separate test that used null might be required). If I’m right and the values usually don’t matter then I actually think seeing them in the tests is something I’d usually avoid and I’d use an object mother. We do this anyway because the object mothers can be useful in other tests, so for example we might want to automatically tests cascading or cross-aggregate mappings.

    On the VerifyTheMappings, we do something similar as we just have an ObjectHierarchyComparer that trundles down the entire hierarchy below an object and verifies ALL of the properties match the originals (other than ones you specifically said to ignore).

    In your case what does CheckValue do though and are the two repositories just working with different units of work?

    * Fluent Interface *
    Looking forward to giving it a shot but for me the XML approach with NHibernate is pretty good.

  • Jeremy D. Miller


    PWD is blank

  • Marc Brooks

    That SVN repository is asking for credentials. Got any guest ones?

  • Mike


    Good work, very sweet…

    What do you think about being able to generate the database by extending the Map to support the DbType such as nvarchar(50) NOT NULL?

  • Nate Kohari

    I love it! The primary reason I use Castle’s ActiveRecord is for the attribute-based mapping, because I can’t stand XML. I’ve tinkered with the idea of a fluent NHibernate mapping interface, but haven’t actually implemented one.

    You should really consider making that into a full-fledged open-source project. If you’re sure you’re not interested, let me know, and I might take what you’ve done so far and run with it… :)

  • Rob

    Awesome! I just started working on a fluent interface for NHibernate as well. The result looks almost identical to what you’ve got, which is cool. Here’s a sample of my DSL:

    public class TodoMapping : ClassMapping<Todo>
    {
        public override void Map()
        {
            id(c => c.ID).Generator.Assigned();

            property(c => c.Description);
            property(c => c.CreatedOn);
            property(c => c.CompletedOn);

            many_to_one(c => c.Owner);
        }
    }