When an architect is involved in these talks, the good ones always bring the discussion up to a helicopter view, since the conversation inevitably drifts towards the best model and towards comparisons against the 'what would be perfect'.
Now seriously, is there any perfect framework model?
The purists' attacks against Microsoft's EF always claim that "EF is neither an independent layer nor multiplatform oriented", so it cannot easily be reused and integrated across the other systems in the enterprise, which would be the dream of any architect.
Since the introduction of shared folders you could, for example, easily place a text file at some URI and share it across many systems. Obviously, the data would not be cached, and few mechanisms were available to notify the data holders about an update. EF is not supposed to accomplish that task; it is a framework for accessing data through objects.
So here comes another point: should we put a data access layer above it?
Remember, Microsoft's data access methodologies have changed dramatically over the last five years, which hints that in the near future they will change again. Legacy systems will always be a reality of the fast-paced IT market. The argument against an extra layer is that having so many layers in an application over-engineers the problem and hurts performance, and perhaps the final costs, more than a simple refactoring would. Remember the discussion about table normalization and denormalization? It is pretty much the same here. These are the same people who advocate reaching the application's ROI before any major refactoring. And honestly, five years is not enough time for many applications to break even on their ROI.
Here's a scenario for discussion: a programmer decides to use EF and maps a common class against the database. He notices that it is very simple IF ONLY he follows the yellow-brick road. From what I have seen, he must follow the EF rules: he must inherit from and implement the mandatory base classes and interfaces dictated by EF so that everything falls into place. If not, EF is just one big, expensive fancy feature. Yes, there is a term for what is missing here: persistence ignorance. At a glance, EF does not adapt to the model; you have to make the model adapt to EF.
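To make the contrast concrete, here is a sketch, with a hypothetical `Customer` entity, of what the first EF version expects (a class derived from the framework's `EntityObject` base type, from `System.Data.Objects.DataClasses`) next to the plain, persistence-ignorant class an architect would rather write:

```csharp
using System.Data.Objects.DataClasses;

// What EF v1 wants: the entity derives from a framework base class and
// carries mapping attributes, so persistence concerns leak into the model.
[EdmEntityType(NamespaceName = "Model", Name = "Customer")]
public class CustomerEntity : EntityObject
{
    [EdmScalarProperty(EntityKeyProperty = true, IsNullable = false)]
    public int Id { get; set; }

    [EdmScalarProperty()]
    public string Name { get; set; }
}

// What a persistence-ignorant (POCO) model would look like instead:
// a plain class with no reference to any data access framework.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}
```

The second class can be compiled, unit tested and reused anywhere; the first one drags an EF assembly reference into every layer that touches it.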
A model should be persistence ignorant, especially nowadays when test-driven development is becoming common in companies: you can actually test business rules at a higher level. That's why we start to see things like LINQ to SQL and binding interfaces; they try to cover the gap left by the lack of persistence ignorance.
The consequence is that more and more people are using EF as a data access tool and leaving LINQ to the business layer. The business layers then return result sets through structures called ObjectContext. And good news: the ObjectContext is transaction-aware, meaning that you can use System.Transactions to preserve the data update rules. EF, LINQ and ObjectContext: is a new implementation being born? Only time will tell, but at least they are simple to use, perform well and give the programmer good delivery times... long gone are the days when developers wanted to stay long hours at the office.
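The System.Transactions point can be sketched in a few lines. `TransactionScope` establishes an ambient transaction, and `ObjectContext.SaveChanges` enlists in it; the `NorthwindEntities` context and the `Customers` set here are hypothetical, standing in for whatever your generated model provides:

```csharp
using System.Linq;
using System.Transactions;

// Sketch only: "NorthwindEntities" is a hypothetical generated ObjectContext.
using (var scope = new TransactionScope())
using (var context = new NorthwindEntities())
{
    var customer = context.Customers.First(c => c.Id == 42);
    customer.Name = "Updated name";

    context.SaveChanges(); // enlists in the ambient System.Transactions transaction
    scope.Complete();      // commit; disposing the scope without Complete() rolls back
}
```

Because the transaction is ambient, other resource managers (another context, a message queue) used inside the same scope commit or roll back together with the EF update.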
Strong points for the EF:
- All query results are objects and you can parse and traverse them in memory without additional database round-trips;
- There is an embedded conceptual layer where you can do things like denormalize the data structure without much impact on the application;
- LINQ can be restricted to being used ONLY when needed; LINQ is great, but it is no silver bullet, and people are tempted to overuse it.
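The first point can be sketched as follows, again assuming a hypothetical `NorthwindEntities` context with an `Orders` set: the LINQ query is translated to SQL and hits the database once, and after `ToList()` everything else is plain object traversal in memory.

```csharp
using System.Collections.Generic;
using System.Linq;

// Sketch only: "NorthwindEntities" and "Order" are hypothetical model types.
using (var context = new NorthwindEntities())
{
    // The query executes in the database a single time...
    List<Order> orders = context.Orders
                                .Where(o => o.CustomerId == 42)
                                .ToList();

    // ...and from here on the results are ordinary objects: traversing,
    // filtering or summing them needs no further round-trips.
    decimal total = orders.Sum(o => o.Amount);
}
```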
But is it perfect? IMHO, it is not... and after all, what is perfection anyway?
By Edge Pereira