I need some advice on how to decouple nHibernate dependencies from the presentation layer. Currently we have a three-tier C# WinForms application consisting (simplified) of the following layers:
User Interface (UI)
Business Logic (BAL)
Data Access Logic (DAL)
We are migrating this application to an ORM (nHibernate) and would ideally like to have only the DAL referencing nHibernate. We also want to employ the "Unit of Work" functionality which is included in nHibernate, adopting a "Session per conversation" methodology.
To achieve this we need to create and open a session in the UI and pass the session through the BAL to the DAL; however, we cannot do this without creating a dependency on nHibernate in both the UI and BAL.
Any advice would be appreciated. How should we structure the architecture to avoid any references to nHibernate in the UI and BAL?
I must also add that we do not want the UI to have a reference to the DAL either.
UI => BAL => DAL
Impossible to do it that way, since the UnitOfWork pattern is implemented by NHibernate's Session object.
However, you only want to reference NHibernate from your DAL, which is quite useless, since your DAL doesn't know anything about the application's context, and this context is necessary in order to use the UnitOfWork.
Take a look at the NHibernate 3.0 Cookbook; I found it quite useful for getting to grips with NHibernate.
You will need to extract your entities and create POCOs (Plain Old CLR Objects). Your UI will not require any knowledge of NHibernate. You will create methods in your Data Layer to manipulate your data.
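To make that concrete, here is a minimal sketch (all type names are illustrative, not from the question) of a POCO entity plus a DAL repository that hides NHibernate behind an interface. The entity and the interface live in a shared project that both the UI and BAL can reference; only the repository implementation, inside the DAL, references NHibernate:

```csharp
// POCO entity -- no NHibernate references at all.
public class Customer
{
    public virtual int Id { get; set; }      // virtual so NHibernate can proxy it
    public virtual string Name { get; set; }
}

// Interface the BAL and UI program against; lives in a shared project.
public interface ICustomerRepository
{
    Customer GetById(int id);
    void Save(Customer customer);
}

// Only this implementation, inside the DAL, references NHibernate.
public class CustomerRepository : ICustomerRepository
{
    private readonly NHibernate.ISession _session;

    public CustomerRepository(NHibernate.ISession session)
    {
        _session = session;
    }

    public Customer GetById(int id) => _session.Get<Customer>(id);
    public void Save(Customer customer) => _session.SaveOrUpdate(customer);
}
```

The UI and BAL then work only with Customer and ICustomerRepository; the ISession never leaves the DAL assembly.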
Configure your IoC container in a separate class library, for example by using the "GuyWire" pattern described here:
http://nhforge.org/blogs/nhibernate/archive/2009/11/07/nhibernate-and-wpf-the-guywire.aspx
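Roughly, the "GuyWire" is the single assembly that knows about both the abstractions and the NHibernate-backed implementations; the application calls it once at startup. A sketch, with an invented container API (the article shows the real wiring):

```csharp
// The only assembly that references both the IoC container and the
// concrete DAL types. Neither the UI nor the BAL knows what gets wired.
public interface IGuyWire
{
    void Wire();
    void Dewire();
}

public class ApplicationGuyWire : IGuyWire
{
    public void Wire()
    {
        // Register NHibernate-backed implementations behind the
        // persistence-agnostic interfaces. "Container" here stands in
        // for whatever IoC container you choose.
        Container.Register<ICustomerRepository, CustomerRepository>();
        Container.Register<ISessionManager, NHibernateSessionManager>();
    }

    public void Dewire()
    {
        // Release container resources at application shutdown.
    }
}
```

This keeps the composition root out of the UI project entirely, which is exactly what the question asks for.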
I recently built an example of a decoupled architecture using NHibernate in an ASP.NET MVC application. It uses a repository pattern and a separate unit of work. Most of these concepts should be reusable in a thick client as well. Here is a search link on my blog with posts that may be interesting:
http://blog.bobcravens.com/?s=Nhibernate
Hope this gets you started. Let me know if you have questions.
Bob
I am creating my first standalone desktop WPF application using Entity Framework. Do I need a WCF layer to access the database? Is it bad practice to call the DbContext directly from viewmodels?
TL;DR
The short answer is: it depends!
The long answer
It depends on the use case you need to implement. If you need to add another layer of abstraction (the WCF layer) to hide your OR/M away, you can do it. However, if your scenario is simple enough, like a standalone WPF application, I wouldn't bother making a WCF layer. You can simply access the DbContext in your application, but take care not to couple your viewmodels tightly to EF.
It's always worth trying to keep the concerns separate!
And these concerns are:
Data or persistence (EF) models that map your database tables to your OO model
ViewModels that supply your views with the data to show
Mapping between your persistence models and viewmodels
This way you can achieve a lightweight setup that aims for better separation and better testability.
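As a sketch of that separation (ShopContext, the Orders set and the property names are assumptions, not from the question), the viewmodel side depends only on an interface, while the EF-backed reader does the mapping:

```csharp
using System.Collections.Generic;
using System.Linq;

// Abstraction the WPF side depends on -- no EF types leak through it.
public interface IOrderReader
{
    IReadOnlyList<OrderViewModel> GetOpenOrders();
}

// ViewModel: data for the view, nothing else.
public class OrderViewModel
{
    public int Id { get; set; }
    public string CustomerName { get; set; }
}

// EF-backed implementation in the data project; maps persistence
// models to viewmodels (here by hand, AutoMapper would also work).
public class EfOrderReader : IOrderReader
{
    public IReadOnlyList<OrderViewModel> GetOpenOrders()
    {
        using (var db = new ShopContext())   // ShopContext : DbContext (assumed)
        {
            return db.Orders
                     .Where(o => o.IsOpen)
                     .Select(o => new OrderViewModel
                     {
                         Id = o.Id,
                         CustomerName = o.Customer.Name
                     })
                     .ToList();
        }
    }
}
```

The viewmodels can then be unit-tested against a fake IOrderReader without any database.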
Further Extensibility
Later on your development path, when you arrive at a point where you need to add an infrastructural concern like a WCF layer (which could serve as a public API, or an entry point, to your shared database access), you can easily add it as a new project. Put its classes behind interfaces (those interfaces are the only things you will add as a reference to your WPF project) and let the new project contain the actual implementations.
I want to understand the right concept here. If I have an MVC application with the repository pattern, where should the BL be?
Should it be inside the model? Should the model have all the business logic before calling the UnitOfWork to decide whether or not to insert the data into the database?
Should it be in the controller, before calling the model?
Should I have a service layer that does the business logic and decides whether to call the model, which calls the UnitOfWork, to save the data?
A good explanation will help a lot too.
The short answer: it depends. If it's a fairly complex or sizable application, I like to create a service layer project with the repositories as dependencies. If it's a small application, I'll put the logic in the controller. In my opinion, if it takes more time and effort to create the service layer than to create the application itself (i.e. one or two controllers), then it doesn't make sense to go that route. You also need to consider the likelihood that the application will grow. What starts small could grow into something much bigger, and in that case, again, it might be more beneficial to create the separate service layer.
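A minimal sketch of the service-layer option, with the repository as an injected dependency (all names here are illustrative):

```csharp
public enum OrderStatus { New, Placed }

public class Order
{
    public int Id { get; set; }
    public OrderStatus Status { get; set; }
}

public interface IOrderRepository
{
    Order GetById(int id);
    void Save(Order order);
}

public interface IOrderService
{
    void PlaceOrder(int orderId);
}

// Business logic lives here; the MVC controller depends only on IOrderService.
public class OrderService : IOrderService
{
    private readonly IOrderRepository _orders;

    public OrderService(IOrderRepository orders)   // repository injected as a dependency
    {
        _orders = orders;
    }

    public void PlaceOrder(int orderId)
    {
        var order = _orders.GetById(orderId);
        // ...business rules go here, not in the controller...
        order.Status = OrderStatus.Placed;
        _orders.Save(order);
    }
}
```

In the small-application case the body of PlaceOrder would simply sit in the controller action instead.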
The third one... and then some.
Your application structure could look like this (each in different projects):
Data storage layer (e.g. SQL database)
ORM (e.g. NHibernate or Entity Framework)
Domain (including abstract repositories and entities)
Service layer (and optionally business)
MVC application (which has its own models relating to the entities)
but there are many ways to go about this depending on the complexity and size of your application.
There is no "correct" answer to this question, it is primarily opinion-based. You can read about my opinion in the following project wiki:
https://github.com/danludwig/tripod/wiki/Why-Tripod%3F
https://github.com/danludwig/tripod/wiki/Dependency-and-Control-Inversion
https://github.com/danludwig/tripod/wiki/Tripod-101
https://github.com/danludwig/tripod/wiki/Testing,-Testing,-1-2-3
https://github.com/danludwig/tripod/wiki/Command-Query-Responsibility-Segregation-(CQRS)
Another piece of advice I would like to offer is never put any business logic in viewmodels or entities. These classes should not have methods, only properties to contain data. Separate your data from behavior. Use models for data, and other types for behavior (methods).
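For example, keeping data and behavior in separate types might look like this (illustrative names):

```csharp
// Data: a model with properties only, no methods.
public class InvoiceModel
{
    public decimal Subtotal { get; set; }
    public decimal TaxRate { get; set; }
    public decimal Total { get; set; }
}

// Behavior: a separate type that operates on the data.
public class InvoiceCalculator
{
    public void ComputeTotal(InvoiceModel invoice)
    {
        invoice.Total = invoice.Subtotal * (1 + invoice.TaxRate);
    }
}
```

Whether this strict separation is desirable is, as the answer says, a matter of opinion; domain-driven design advocates would argue the opposite.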
I have been using Entity Framework and the repository pattern for some time now.
I was asked the other day to write a data layer without using Entity Framework, just plain old ADO.NET. I was wondering what would be the best approach for this? Do I also use a repository pattern for my CRUD operations using plain old ADO.NET?
If I go to Codeplex and search for repository pattern then 99.9% of all the sample projects use Entity Framework. Is there a different pattern that needs to be used if I use plain ADO.NET with stored procedures?
No, the repository pattern is used extensively outside of Entity Framework, and is an all-round useful way of handling data access.
From MSDN
It centralizes the data logic or Web service access logic.
It provides a substitution point for the unit tests.
It provides a flexible architecture that can be adapted as the overall design of the application evolves.
http://msdn.microsoft.com/en-us/library/ff649690.aspx
Other benefits:
Simple to add logic in the repository, such as caching results over a web request
Common queries can be added, such as userRepository.FindByEmailAddress(emailAddress);
The repository can be swapped for another, such as switching a database to a web service, with minimal effort
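For instance, a repository backed by plain ADO.NET and a stored procedure could look roughly like this (the interface, procedure and column names are assumptions for illustration):

```csharp
using System.Data;
using System.Data.SqlClient;

public class User
{
    public int Id { get; set; }
    public string Email { get; set; }
}

public interface IUserRepository
{
    User FindByEmailAddress(string emailAddress);
}

// Same interface as an EF-backed repository would implement, but the
// implementation uses a SqlCommand and a stored procedure directly.
public class AdoUserRepository : IUserRepository
{
    private readonly string _connectionString;

    public AdoUserRepository(string connectionString)
    {
        _connectionString = connectionString;
    }

    public User FindByEmailAddress(string emailAddress)
    {
        using (var connection = new SqlConnection(_connectionString))
        using (var command = new SqlCommand("usp_FindUserByEmail", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@Email", emailAddress);
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                if (!reader.Read()) return null;
                return new User
                {
                    Id = reader.GetInt32(reader.GetOrdinal("Id")),
                    Email = reader.GetString(reader.GetOrdinal("Email"))
                };
            }
        }
    }
}
```

Callers depend only on IUserRepository, so swapping this for an EF-backed version later changes nothing above the data layer.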
I don't think this is always the right way. It depends on some assumptions:
Adding a repository pattern on top of EF code distances you from the features of your ORM. The Entity Framework is already an abstraction layer over your database.
If you want to use dependency injection and test-driven development over EF, then follow the repository pattern. By using it, your code becomes testable, injectable and maintainable.
Out of the box EF is not very testable, but it's quite easy to make a mockable version of the EF data context with an interface that can be injected.
If you don't need your code to be testable or injectable, then just don't use the repository pattern.
I saw a blog post: http://www.nogginbox.co.uk/blog/do-we-need-the-repository-pattern
Martin Fowler's "Patterns of Enterprise Application Architecture" provides the following definition for a repository:
Mediates between the domain and data mapping layers using a collection-like interface for accessing domain objects.
A common way to implement this in C# is a generic Repository&lt;T&gt; class, where T is a persistent type, that implements IQueryable&lt;T&gt; and provides additional methods like Add(entity) and Remove(entity).
This would be very difficult to implement without an ORM. You could make a simpler repository that takes SQL statements as WHERE conditions, but it can get messy.
Numerous examples use concrete repository classes for each type, with different persistence methods. But those are just DAO classes in disguise.
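A sketch of that shape in C# (backed here by an in-memory list so the example is self-contained; in practice the inner IQueryable would come from the ORM session):

```csharp
using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;

// Fowler-style generic repository: collection-like and queryable.
public class Repository<T> : IQueryable<T> where T : class
{
    private readonly List<T> _items = new List<T>();   // stand-in for the ORM-backed store

    public void Add(T entity) => _items.Add(entity);
    public void Remove(T entity) => _items.Remove(entity);

    // IQueryable<T> plumbing, delegated to the underlying query.
    private IQueryable<T> Query => _items.AsQueryable();
    public Type ElementType => Query.ElementType;
    public Expression Expression => Query.Expression;
    public IQueryProvider Provider => Query.Provider;
    public IEnumerator<T> GetEnumerator() => Query.GetEnumerator();
    IEnumerator IEnumerable.GetEnumerator() => GetEnumerator();
}
```

Because the repository is itself IQueryable&lt;T&gt;, callers can write `repo.Where(x => ...)` directly, which is what gives it the "collection-like interface" from Fowler's definition.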
Here is the situation:
I have an ASP.NET 3.5 website using a BLL project with object classes containing methods which call a DAL, which calls stored procedures (old school).
I am creating a new website (Asp.Net 4 and MVC 3) using the same Database and objects. I will also add a bunch of classes and new tables (a dozen I guess) which are related to the existing classes and tables.
My problem is: how can I make all of this work together? I really want to use MVC for the new project, and to do everything as properly as I can.
The old project is a complete mess, but it is so huge that we can't touch it.
So how should I handle the new objects? Use the same old method with classes and stored procedures?
Or create a repository pattern for the new objects and classes?
etc...
Thank you very much
ps: I don't know if I am being clear, but I can give you any other details if needed.
ps2: I am not looking for coding solutions, but for an indication of the best method to follow for a smooth integration.
I am currently finishing the data access layer on a project with these EXACT issues, and this is how I have solved it:
I've used the repository pattern to insulate my "new code" from knowing anything about the old code. In the repository I call into the existing logic to generate any of the "legacy" entities, and use Entity Framework to manage any new ones.

Some entities require a "combined" approach to the CRUD operations. For example, when I create a new User entity I need to call two different objects from the legacy system: the first returns a PIN that I use to call into another legacy object to create a user instance and return an Id, and that Id is finally used with Entity Framework to manage the "new user" part of the object.

This data access layer is then exposed to the MVC project, which only knows about the entities as the new system sees them and has no idea what the underlying legacy structure is. As such, our MVC project (and even the business logic layer, in our case) is only aware of repositories, and if we decide to move the legacy logic/CRUD operations to the new system, nothing will change but the actual repository. This makes for some rather "bulky" repositories in some cases (the most complex is the one I mentioned about users), but gives a clean separation of new and legacy code.
I know that this is injecting some business logic into the data access layer, but for our project this was a deliberate choice due to other factors. You can have your business logic layer make the calls into the legacy layer if you want to.
Hope this has helped; if anything is unclear just let me know and I will explain further, as it might not be as clear here as it is in my head :)
You don't need to use the repository pattern, or EF, or DI, to use MVC. If your BLL is what you use to get your business objects (CRUD operations), there's no reason you can't use that directly in your controllers.
As far as adding new classes/tables, it depends on how tightly coupled your BLL is. If it's loosely coupled using interfaces, it should be possible for you to use a different pattern for some DAOs, but if the DAOs reference each other directly it may be more trouble than it's worth.
If you want to go the extra mile, a good first step would be to define your CRUD operations in interfaces (if there aren't interfaces already) and have your BLL classes implement them, so you can use DI and make it less painful when you do switch to a different DL strategy.
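That first step might look like this (the type and method names are invented for illustration):

```csharp
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Define the CRUD operations as an interface...
public interface IProductStore
{
    Product Get(int id);
    void Create(Product product);
    void Update(Product product);
    void Delete(int id);
}

// ...and have the existing legacy BLL class implement it; its body is unchanged.
public class ProductManager : IProductStore
{
    public Product Get(int id) { /* existing stored-procedure call */ return null; }
    public void Create(Product product) { /* existing code */ }
    public void Update(Product product) { /* existing code */ }
    public void Delete(int id) { /* existing code */ }
}
```

The new MVC controllers then take an IProductStore, so a future EF- or repository-based implementation can be swapped in via DI without touching the controllers.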
I'm implementing a DAL using Entity Framework. In our application we have three layers (DAL, business layer and presentation). This is a web app. When we began implementing the DAL, our team thought that the DAL should have classes whose methods receive an ObjectContext given by services in the business layer and operate on it. The rationale behind this decision is that different ObjectContexts see different DB states, so some operations can be rejected due to problems with foreign key matching and other inconsistencies.
We noticed that generating and propagating an object context from the services layer creates high coupling between layers. Therefore we decided to use DTOs mapped by AutoMapper (rather than plain or self-tracking entities, which we rejected because of high coupling, exposure of entities to upper layers and low efficiency) together with the Unit of Work pattern. So, here are my questions:
Is this the correct approach to design a web application's DAL? Why?
If you answered "yes" to 1., how is the concept of DTOs to be reconciled with the Unit of Work pattern?
If you answered "no" to 1., what would be a correct approach to designing a DAL for a web application?
Please, if possible, give references supporting your answer.
About the current design:
The application has been planned to be developed in three layers: presentation, business and DAL. The business layer has both facades and services.
There is an interface called ITransaction (with only two methods, to dispose and to save changes) visible only to the services. To manage a transaction, there is a class Transaction that extends ObjectContext and implements ITransaction. We designed it this way because, at the business layer, we do not want other ObjectContext methods to be accessible.
In the DAL, we created an abstract repository using two generic types (one for the entity and the other for its associated DTO). This repository has CRUD methods implemented generically, plus two generic methods to map the DTOs and entities of the repository with AutoMapper. The abstract repository's constructor takes an ITransaction as an argument and expects it to be an ObjectContext, in order to assign it to its protected ObjectContext property.
The concrete repositories should only receive and return .NET types and DTOs.
We are now facing this problem: the generic create method does not generate a temporary or persistent id for attached entities until we call SaveChanges() (which would break the transactionality we want); this implies that service methods cannot use it to associate DTOs in the BL.
There are a number of things going on here... The assumption I'll make is that you're using a 3-tier architecture. That said, I'm unclear on a few design decisions you've made and what the motivations were behind making them.

In general, I would say that your ObjectContext should not be passed around in your classes. There should be some sort of manager or repository class which handles the connection management. This solves your DB state management issue. I find that a repository pattern works really well here. From there, you should be able to implement the unit of work pattern fairly easily, since your connection management will be handled in one place.

Given what I know about your architecture, I would say that you should be using a POCO strategy. Using POCOs does not tightly couple you to any ORM provider. The advantage is that your POCOs will be able to interact with your ObjectContext (probably via a repository of some sort) and this will give you visibility into change tracking. Again, from there you will be able to implement the Unit of Work (transaction) pattern to give you full control over how your business transactions should behave.

I find this an incredibly useful article for explaining how all this fits together. The code is buggy but accurately illustrates best practices for the type of architecture you're describing: Repository, Specification and Unit of Work Implementation
The short version of my answer to question number 1 is "no". The above link provides what I believe to be a better approach for you.
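The overall shape that article describes can be sketched like this (the names are illustrative, not the article's exact API):

```csharp
using System;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public interface IRepository<T> where T : class
{
    T Find(int id);
    void Add(T entity);
    void Remove(T entity);
}

// The unit of work owns the ObjectContext and hands out repositories;
// Commit maps to SaveChanges on the underlying context.
public interface IUnitOfWork : IDisposable
{
    IRepository<T> Repository<T>() where T : class;
    void Commit();
}

// Business code then never touches the ObjectContext directly:
public class CustomerService
{
    public void RenameCustomer(IUnitOfWork uow, int id, string newName)
    {
        var customer = uow.Repository<Customer>().Find(id);
        customer.Name = newName;
        uow.Commit();   // one transactional boundary for the whole operation
    }
}
```

Because connection management is centralized in the unit of work, the DB-state concerns from the question are handled in one place.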
I always believed that, for programmers, code can explain things better than words. And this is especially true for this topic. That's why I suggest you look at a great sample application in which all the concepts you're expecting are implemented.
The project is called Sharp Architecture. It is centered around MVC and NHibernate, but you can use the same approaches, just replacing the NHibernate parts with EF ones where you need to. The purpose of the project is to provide an application template embodying the community's best practices for building web applications.
It covers all the common, and most of the uncommon, topics around using ORMs: managing transactions, managing dependencies with IoC containers, use of DTOs, etc.
And here is a sample application.
I urge you to read and try this; it will be a real treasure for you, as it was for me.
You should take a look at what dependency injection, and inversion of control in general, mean. That would provide the ability to control the life cycle of the ObjectContext "from outside". You could ensure that only one instance of the object context is used per HTTP request. To avoid managing dependencies manually, I would recommend using StructureMap as a container.
Another useful (but quite tricky and hard to get right) technique is abstraction of persistence. Instead of using the ObjectContext directly, you would use a so-called Repository, which is responsible for providing a collection-like API for your data store. This provides a useful seam which you can use to switch the underlying data storage mechanism, or to mock out persistence completely for tests.
As Jason suggested already, you should also use POCOs (plain old CLR objects). Even though there would still be implicit coupling with Entity Framework that you should be aware of, it's much better than using generated classes.
Things you might not find elsewhere fast enough:
Try to avoid using the unit of work pattern. Your model should define the transactional boundaries.
Try to avoid generic repositories (and note the point about IQueryable too).
It's not mandatory to spam your code with the repository pattern name.
Also, you might enjoy reading about domain-driven design. It helps with complex business logic and gives great guidelines for making code less procedural and more object-oriented.
I'll focus on your current issues: to be honest, I don't think you should be passing around your ObjectContext. I think that is going to lead to problems. I'm assuming that a controller or a business service will be passing the ObjectContext/ITransaction to the repository. How will you ensure that your ObjectContext is disposed of properly downstream? What happens when you use nested transactions? What manages the rollbacks for transactions downstream?
I think your best bet lies in putting some more definition around how you expect to manage transactions in your architecture. Using TransactionScope in your controller/service is a good start since the ObjectContext respects it. Of course you may need to take into account that controllers/services may make calls to other controllers/services which have transactions in them. In order to allow for scenarios where you want full control over your business transactions and the subsequent database calls, you'll need to create some sort of TransactionManager class which enlists, and generally manages transactions up and down your stack. I've found that NCommon does an extraordinary job at both abstracting and managing transactions. Take a look at UnitOfWorkScope and TransactionManager classes in there. Although I disagree with NCommon's approach of forcing the Repository to rely on the UnitOfWork, that could easily be refactored out if you wanted.
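For example, a service method using TransactionScope might look like this (the repository field and its methods are hypothetical):

```csharp
using System.Transactions;

public class TransferService
{
    private readonly IAccountRepository _accounts;   // hypothetical repository

    public TransferService(IAccountRepository accounts)
    {
        _accounts = accounts;
    }

    public void TransferFunds(int fromId, int toId, decimal amount)
    {
        // The ObjectContext enlists in this ambient transaction, so both
        // repository calls commit or roll back together.
        using (var scope = new TransactionScope())
        {
            _accounts.Withdraw(fromId, amount);
            _accounts.Deposit(toId, amount);
            scope.Complete();   // commit; disposal without Complete() rolls back
        }
    }
}
```

Nested calls that open their own TransactionScope join the ambient transaction by default, which is what makes service-to-service calls composable.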
As far as your persistent-ID issue goes, check this out