I have been investigating nHibernate recently. I am writing some web service interfaces for our ERP product and would like a painless way of talking to the database.
In general, nHibernate looks like a clean solution that is easy to get going with and yet covers all the bases. It definitely seems like a mature product. There was just one issue bugging me: given that entity classes do not need to derive from some base class and data is stored in plain fields, I was perplexed as to how nHibernate determines which fields are dirty when doing a database update.
I searched the web for quite a while and came up empty, apart from people mentioning Hibernate's "automatic dirty checking" capability. There were no explanations of how it actually works.
So I rolled up my sleeves and dug into the source. What I found scared the crap out of me. From what I can tell, nHibernate keeps a snapshot of each entity's state as it is loaded and compares every property against that snapshot when you flush.
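To check my own understanding, here is a minimal sketch of that idea in Java (the original Hibernate's language). This is not NHibernate code; ToySession, Customer, and every name in it are invented purely for illustration of the snapshot-and-compare approach:

import java.util.IdentityHashMap;
import java.util.Map;
import java.util.Objects;

// A deliberately tiny illustration of snapshot-based ("automatic") dirty checking.
// On load, the session records a copy of the entity's property values; on flush,
// it compares the current values against that copy.
public class ToySession {

    static class Customer {
        Long id;
        String name;
        String city;
    }

    // Snapshot of each loaded entity's property values, keyed by object identity.
    private final Map<Customer, Object[]> snapshots = new IdentityHashMap<>();

    // "Loading" an entity records a copy of its state as read from the database.
    Customer load(Customer fromDatabase) {
        snapshots.put(fromDatabase, new Object[] { fromDatabase.name, fromDatabase.city });
        return fromDatabase;
    }

    // Flush walks every tracked entity and compares each property to its snapshot,
    // regardless of whether the caller touched the object at all.
    void flush() {
        for (Map.Entry<Customer, Object[]> entry : snapshots.entrySet()) {
            Customer entity = entry.getKey();
            Object[] loaded = entry.getValue();
            Object[] current = { entity.name, entity.city };
            for (int i = 0; i < loaded.length; i++) {
                if (!Objects.equals(loaded[i], current[i])) {
                    System.out.println("UPDATE needed for customer " + entity.id);
                    break;
                }
            }
        }
    }

    public static void main(String[] args) {
        ToySession session = new ToySession();
        Customer c = new Customer();
        c.id = 1L;
        c.name = "Acme";
        c.city = "Oslo";
        session.load(c);

        c.city = "Bergen";  // an ordinary field assignment, no ORM base class involved
        session.flush();    // the comparison against the snapshot detects the change
    }
}

Notice that the entity itself stays a plain object with ordinary fields; all the bookkeeping lives in the session, which is what lets the ORM avoid forcing a common base class on you.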
From my perspective there are two major drawbacks to this approach. First, since the data you are manipulating probably consumes the majority of the application's memory, you roughly double the memory requirements for writable objects. I assume you only take this hit for objects retrieved as writable, but even so, it seems unreasonable, especially given that many people run Hibernate on a server that is a shared resource!
The second problem is that a flush has to compare each and every property of each and every loaded entity against its snapshot, even if no changes have been made at all.
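To make that second point concrete, here is a rough back-of-the-envelope sketch (again with invented names, not NHibernate code) showing that a flush over entities nobody modified still pays for one comparison per property per entity:

import java.util.ArrayList;
import java.util.List;
import java.util.Objects;

// With snapshot-based dirty checking, a flush over N unchanged entities with
// M properties each still performs roughly N * M equality comparisons.
public class FlushCostDemo {

    public static void main(String[] args) {
        int entityCount = 10_000;
        int propertyCount = 20;

        // Simulated snapshots and current states for entities that were never modified.
        List<Object[]> snapshots = new ArrayList<>();
        List<Object[]> currentStates = new ArrayList<>();
        for (int e = 0; e < entityCount; e++) {
            Object[] state = new Object[propertyCount];
            for (int p = 0; p < propertyCount; p++) {
                state[p] = "value-" + e + "-" + p;
            }
            snapshots.add(state);
            currentStates.add(state.clone()); // nothing changed, values are identical
        }

        // A flush still has to compare every property of every entity.
        long comparisons = 0;
        for (int e = 0; e < entityCount; e++) {
            Object[] loaded = snapshots.get(e);
            Object[] current = currentStates.get(e);
            for (int p = 0; p < propertyCount; p++) {
                comparisons++;
                if (!Objects.equals(loaded[p], current[p])) {
                    // would schedule an UPDATE here; never reached in this demo
                }
            }
        }
        System.out.println(comparisons + " comparisons for zero actual changes");
        // Prints: 200000 comparisons for zero actual changes
    }
}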
I guess what surprised me most is that I could find no discussion of the relative merits of this approach. Does nobody else see this as a problem?