I'm looking into implementing the IRepository pattern using NHibernate, and I have a question that I've not been able to answer by searching the net.
Assume I have 3 repositories: PersonRepository, PersonAddressRepository and PersonAccountRepository. Now assume that business logic dictates that there be a "Deactivate Person" process that calls PersonRepository.Deactivate(), PersonAddressRepository.Deactivate() and PersonAccountRepository.Deactivate().
I want to be able to do something along the lines of:
using (ITransaction transaction = session.BeginTransaction()) {
    session.Update(person);
    session.Update(personAddress);
    session.Update(personAccount);
    transaction.Commit();
}
So that if any of those updates fail, the entire process rolls back within the database. Now, at the moment, my understanding of NHibernate is that you can only create a Session per object, so:
var cfg = new Configuration();
cfg.Configure();
cfg.AddAssembly(typeof(Person).Assembly);
ISessionFactory sessionFactory = cfg.BuildSessionFactory();

using (ISession session = sessionFactory.OpenSession()) {
    using (ITransaction transaction = session.BeginTransaction()) {
        session.Save(person);
        transaction.Commit();
    }
}
Is this correct, or am I mistaken? What are the best practices for transactions in NHibernate, particularly with regard to multi-table updates?
Thanks in advance.
You should not create transactions in the repositories or anywhere else "below". Transactions are defined by the application logic. This is one of the most common mistakes I see in transaction handling.
I wrote a transaction service which manages the transactions:
using (TransactionService.CreateTransactionScope())
{
    repositoryA.DoX();
    repositoryB.DoY();
    TransactionService.Commit();
}
The repository gets the session with an open transaction from the service:
TransactionService.Session.CreateQuery("...");
Depending on your environment, you may need to make it a bit more complicated. For instance, the session may not be visible to the business logic and should be put behind another interface, etc.
I thought NHibernate understands the System.Transactions.TransactionScope class. Why wouldn't you use that?
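For reference, here is a minimal sketch of that approach (the repository variables are the ones from the question; NHibernate sessions opened inside the scope can enlist in the ambient System.Transactions transaction, depending on the driver):

```csharp
using System.Transactions;

// Ambient transaction: work done through enlisted sessions/connections
// inside this block commits or rolls back as one unit.
using (var scope = new TransactionScope())
{
    personRepository.Deactivate(person);
    personAddressRepository.Deactivate(address);
    personAccountRepository.Deactivate(account);

    scope.Complete(); // if this is never called, Dispose() rolls back
}
```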
One thing you can do -this is how I do it right now- is pass the ISession instance that should be used to your repository instances.
What I will do, in the future, is this:
I have a UnitOfWork class, which is quite generic and is a wrapper around NHibernate's ISession object. This UnitOfWork class contains no application- or domain-specific methods.
In a project which uses NHibernate (and my UnitOfWork wrapper), I'll create a set of extension methods on the UnitOfWork class, that look like this:
public static class UnitOfWorkExtensions
{
    public static IPersonRepository GetPersonRepository(this UnitOfWork uow)
    {
        return new PersonRepository(uow);
    }

    public static IAccountRepository GetAccountRepository(this UnitOfWork uow)
    {
        return new AccountRepository(uow);
    }
}
Then, this would allow me to do this, for instance:
using (var uow = unitOfWorkFactory.CreateUnitOfWork())
{
    var person = uow.GetPersonRepository().GetPerson(1);
    var accounts = uow.GetAccountRepository().GetAccountsForPerson(person);
}
But, looking at your example, I'm wondering whether you should have a repository for 'PersonAddress' and 'PersonAccount'.
In my humble opinion, Person is an 'aggregate root' which, in your example, consists of a PersonAddress and a PersonAccount, and there should be a single PersonRepository which handles the Person aggregate root (including the PersonAddress and PersonAccount objects, which are, in fact, not entities but value objects, as I see it).
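Under that view, deactivation becomes a single call on the aggregate's repository. A rough sketch (the names are illustrative, and the mapping is assumed to cascade from Person to its address/account value objects):

```csharp
// Hypothetical aggregate-root repository: callers never touch
// PersonAddress or PersonAccount persistence directly.
public class PersonRepository
{
    private readonly ISession _session;

    public PersonRepository(ISession session)
    {
        _session = session;
    }

    public void Deactivate(Person person)
    {
        person.Deactivate();      // domain method flips the flags on the whole aggregate
        _session.Update(person);  // cascade in the mapping persists address/account too
    }
}
```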
Related
We currently use Entity Framework to access our DB. We had one large class, thousands of lines long, full of methods that queried the database.
void saveFirstThing() {
    using (var dbContext = new DatabaseContext()) {
        ...
    }
}

AnotherThing returnAnotherThing() {
    using (var dbContext = new DatabaseContext()) {
        ...
    }
}
...
But if we wanted to change database provider (or test our application using fake values), it would be pretty annoying to do. So I created:
- an interface IRepository<T>, which provides actions on T - Get(int id), Add(T entity), GetAll(), ...
- an implementation of the interface, whose methods call EF and return IQueryable.
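The interface described above would look roughly like this (a sketch using only the method names given in the question):

```csharp
using System.Linq;

public interface IRepository<T> where T : class
{
    T Get(int id);
    void Add(T entity);
    IQueryable<T> GetAll(); // deferred query: the underlying context must outlive enumeration
}
```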
public class AnotherThingRepository : IRepository<AnotherThing> {
    ...
    public IQueryable<AnotherThing> GetAll() {
        ...
    }
    ...
}
public class Something {
    AnotherThing returnAnotherThing() {
        return anotherThingRepository.GetAll().Where(...).Single();
    }
}
However, I have a problem. We have methods that require joins over multiple tables, so I cannot use using(dbContext) inside the implementation of IRepository: it would dispose the dbContext before we finish querying the result (of type IQueryable). I cannot return IEnumerable or List either, since that won't load all the joined fields (and there might be a lot of them). And if I resorted to some messy reflection, it would be slow and full of circular references that I would need to resolve.
What is the best solution? The only thing that works now is keeping the dbContext around forever and sharing it for all queries in a given repository implementation. However, that stops working when I need to update something in the DB, since the dbContext caches everything; and regardless, a dbContext shouldn't be used the way I'm using it right now.
The only thing that works now is keeping dbContext forever, and share it for all queries in given repository implementation.
No. The DbContext should be scoped to the unit of work, which might involve multiple repository instances.
A simple implementation would inject the DbContext into the constructor of each of your repository types, e.g.:
using (var db = new MyDatabase())
{
    var things = new ThingRepository(db);
    var anotherThings = new AnotherThingRepository(db);
    // ...
    db.SaveChanges();
}
Apparently (and quite possibly) there's a flaw in my current UnitOfWork implementation, because I get connection errors when making many calls at once.
Exception:
The underlying provider failed on Open.
Inner Exception:
The connection was not closed. The connection's current state is
connecting.
This results in an HTTP 500 response on the client side.
UnitOfWork implementation
public class ScopedUnitOfWork : IUnitOfWork
{
    public Entities Context { get; set; }
    public UnitOfWorkState State { get; set; }

    public ScopedUnitOfWork(IEnvironmentInformationProvider environmentInformationProvider)
    {
        this.Context = new Entities(environmentInformationProvider.ConnectionString);
        this.State = UnitOfWorkState.Initialized;
    }

    public UowScope GetScope()
    {
        this.State = UnitOfWorkState.Working;
        return new UowScope(this);
    }

    public SaveResult Save()
    {
        if (this.State != UnitOfWorkState.Working)
            throw new InvalidOperationException("Not allowed to save out of scope. Request a UowScope instance by calling GetScope().");

        this.Context.SaveChanges();
        this.State = UnitOfWorkState.Finished;
        return new SaveResult(ResultCodes.Ok);
    }
}
Working on a single UowScope would solve the issue, but that's not possible given the current circumstances, because each request is completely separate. De facto, each request IS using a UowScope, but apparently it goes wrong when the UoW receives many calls at once.
The UoW is injected through Unity IoC, so I suppose it's a singleton in effect.
The question
Is there a way to adapt the UoW so that separate high-frequency requests are not an issue?
Preferably I'd solve this server side, not client side, any tips? Thanks!
Disclaimer
I don't claim I fully understand UoW, so my implementation may need improvement, be gentle :). Any improvements on that are certainly welcome!
UPDATE
I -know- the EF context is a UoW; I use mine at the domain level to enable transactional processing of functionally related data. It's also by customer demand; I have no choice.
The issue you have is that the unit-of-work object is effectively a singleton, as your IoC framework keeps it around for the duration of your application. This means your context is also being kept as a singleton, since it lives inside the UoW. So you will almost certainly get multiple concurrent calls to your context, which will throw exceptions.
However, I think you are misusing the concept of what a UoW is supposed to do. A UoW is there to provide a container for a group of transactions. For example, let's say you have an eCommerce platform. When you create an order, you insert a row into the orders table, then as part of the same transaction you also insert rows into the order-items table, update a user's loyalty points, etc. You should do all of this inside a single unit of work, commit it, then destroy it. Let the IoC framework (Unity in this case) create a unit of work for each session.
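With Unity specifically, one way to avoid the accidental singleton is a lifetime manager scoped narrower than the root container. This is only a sketch, but HierarchicalLifetimeManager, for example, gives each child container its own instance:

```csharp
var container = new UnityContainer();

// One ScopedUnitOfWork per child container instead of one per application.
container.RegisterType<IUnitOfWork, ScopedUnitOfWork>(
    new HierarchicalLifetimeManager());

// e.g. one scope per web request or per logical operation:
using (var scope = container.CreateChildContainer())
{
    var uow = scope.Resolve<IUnitOfWork>();
    // ... do the work and Save(); disposing the scope disposes the UoW
}
```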
I'm getting stuck into NHibernate at the moment and I'm trying to work out how best to correctly design my Repositories whilst correctly managing the lifetime of the Session.
From the examples I've seen, it seems common practice to inject an ISession into each of the repositories as follows:
public class SomeRepository : IRepository
{
    private readonly ISession _session;

    public SomeRepository(ISession session)
    {
        _session = session;
    }

    public IList<T> LoadSomeData<T>()
    {
        return _session.Query<T>().Where(...).ToList();
    }
}
So this is fine, but I can see a couple of cases where there will be problems:
1) There may not be 1 Session for the lifetime of the app.
2) When accessing NHibernate we should always wrap our calls in a transaction; if the transaction faults, we must roll back the transaction and close the Session.
In both these cases the Repository would be defunct as it references a closed Session.
My particular application is a long-running process which will only occasionally call into NHibernate, so I would expect a high turnover of Sessions instead of keeping one Session open for the lifetime of the app.
I am therefore wondering what the established pattern is for dealing with this particular situation. I see several potential solutions, but it's hard to tell which is best practice:
1) Rebuild the repositories (and create a new Session) every time the process needs to do some DB work.
2) Inject the Repositories with a SessionFactory. The Repository then exposes this SessionFactory. The consumer of each repository then Opens a new Session at the same time as starting a transaction.
3) Create a UnitOfWork class which is injected into the repositories and is responsible for managing Session lifetimes and transactions. Each time a call into a repository is made, it calls into the UnitOfWork, which creates a brand-new Session, and the call is executed within a transaction. The repositories therefore have no knowledge of the Sessions/Transactions.
To me, 3) seems like the best solution, and I've seen a few examples of this online, but in all of the examples they only create one Session within the UnitOfWork, and they don't recreate it if the transaction is rolled back.
In addition, a limitation with 3) is that the UnitOfWork is tied to a particular repository, and so you couldn't have a transaction which made calls into different Repositories.
Hope that makes sense and would appreciate any guidance.
Thanks
Actually, the Session in NHibernate can already be used as a UnitOfWork. But if you want to use this pattern anyway, here's an implementation: http://nhibernate.info/doc/patternsandpractices/nhibernate-and-the-unit-of-work-pattern.html.
However, the best solution in your list is 1): create the session and repositories every time you need to do some work with the DB.
Using the UnitOfWork, it would look like:
using (var uow = new UnitOfWork())
{
    var rep1 = new SomeRepository1(uow);
    rep1.DoSomeJob();

    var rep2 = new SomeRepository2(uow);
    rep2.DoSomeOtherJob();
}
Using the native ISession:
using (var session = SessionFactory.OpenSession())
using (var tr = session.BeginTransaction())
{
    var rep1 = new SomeRepository1(session);
    rep1.DoSomeJob();

    var rep2 = new SomeRepository2(session);
    rep2.DoSomeOtherJob();

    tr.Commit();
}
Also, you can use the power of DI/IoC containers to manage session lifetime. If you're interested in that option, I can show you an example too.
I'm creating a repository for my project (actually a subset of it) with C#, POCOs and plain ADO.NET (no ORMs).
I've several entities and my repository will CRUD those through a DAL.
My DAL is IDisposable, so I open a connection to the database when instantiating it and close that connection in the Dispose method.
Now I want to know: should my repository have one class per entity, or one huge class for all of them?
The second approach would allow me to open a connection, retrieve as many entities as I want, and then close it. With the first one, I have to open and close one connection per entity, like this:
// One big class for all entities; opens one connection
using (RepositoryForAll rfa = new RepositoryForAll())
{
    Customer c = rfa.GetCustomer(customerId);
    Order o = rfa.GetOrder(orderId);
}
// closes the connection

// One repository class per entity; also one connection each
using (RepositoryCustomer customers = new RepositoryCustomer())
{
    var c = customers.Get(customerId);
}
// closes the connection for RepositoryCustomer

using (RepositoryOrder orders = new RepositoryOrder())
{
    var o = orders.Get(orderId);
}
// closes the connection for RepositoryOrder
Does this make sense?
I read about AggregateRoot in some book, which suggests another approach.
This is a fairly simple example, though, and my repository does not have to be that sophisticated.
How about a different solution? Create your connection (or transaction, or unit of work) outside of the repository, and pass it to the repository, like so:
using (var tx = new Transaction())
{
    RepositoryCustomer customers = new RepositoryCustomer(tx);
    RepositoryOrder orders = new RepositoryOrder(tx);

    var c = customers.Get(CustomerId);
    var o = orders.Get(OrderId);
}
(This is just a simple example, of course; I would recommend using some sort of IoC mechanism instead of instantiating objects yourself. Also, you might want to read a little about the unit-of-work concept, which may apply here.)
Organize classes in manageable groups.
In your case, I think Order and Customer should belong to an OrderRepository.
A Product would go into a CatalogueRepository.
If classes need to be mocked when unit tested, they go into their own repository; that might be a PaymentRepository.
I am learning EF and have seen many examples, and during my learning I came across the repository and unit-of-work patterns. I get why to use a repository, but I don't really understand what a unit of work is.
Having no understanding of it is making the DAL difficult to follow. Kindly guide me.
Thanks
The DataContext or ObjectContext is the Unit of Work.
So, your DAL will save, delete and retrieve objects and your DataContext/ObjectContext will keep track of your objects, manage transactions and apply changes.
This is an example just to illustrate the idea of the solution.
using (var context = new ObjectContext()) { // Unit of Work
    var repo = new ProductRepository(context);
    var product = repo.GetXXXXXXX(...);
    ...
    // Do whatever tracking you want to do with the object context. For instance:
    // if (error == false) {
    //     context.DetectChanges();
    //     context.SaveChanges(SaveOptions.AcceptAllChangesAfterSave);
    // }
}
And your repository will look like:
public abstract class Repository {
    public Repository(ObjectContext context) {
        CurrentContext = context;
    }

    protected ObjectContext CurrentContext { get; private set; }
}

public class ProductRepository : Repository {
    public ProductRepository(ObjectContext context) : base(context) {
    }

    public Product GetXXXXXX(...) {
        return CurrentContext... ; // Do something with the context
    }
}
Another way is to make the unit of work (the object context) globally available.
You need to define what the scope of your unit of work will be; for this example, it will be a web request. In a real-world implementation, I'd use dependency injection for that.
public static class ContextProvider {
    public static ObjectContext CurrentContext {
        get { return (ObjectContext)HttpContext.Current.Items["CurrentObjectContext"]; }
    }

    public static void OpenNew() {
        var context = new ObjectContext();
        HttpContext.Current.Items["CurrentObjectContext"] = context;
    }

    public static void CloseCurrent() {
        var context = CurrentContext;
        HttpContext.Current.Items["CurrentObjectContext"] = null;
        // Do whatever tracking you want to do with the object context. For instance:
        // if (error == false) {
        //     context.DetectChanges();
        //     context.SaveChanges(SaveOptions.AcceptAllChangesAfterSave);
        // }
        context.Dispose();
    }
}
In this example, the ObjectContext is the unit of work, and it lives for the current request. In your Global.asax you could add:
protected void Application_BeginRequest(object sender, EventArgs e) {
    ContextProvider.OpenNew();
}

protected void Application_EndRequest(object sender, EventArgs e) {
    ContextProvider.CloseCurrent();
}
In your repositories, you just call ContextProvider.CurrentContext.
One of the most common design patterns in enterprise software development is the Unit of Work. According to Martin Fowler, the Unit of Work pattern "maintains a list of objects affected by a business transaction and coordinates the writing out of changes and the resolution of concurrency problems."
The Unit of Work pattern isn't necessarily something that you will explicitly build yourself, but the pattern shows up in almost every persistence tool that I'm aware of. The ITransaction interface in NHibernate, the DataContext class in LINQ to SQL, and the ObjectContext class in the Entity Framework are all examples of a Unit of Work. For that matter, the venerable DataSet can be used as a Unit of Work.
For more detailed info, please click here to read this article; it's a good one.
For a tutorial on implementing the repository and unit-of-work patterns in an ASP.NET MVC (MVC 4 and EF 5) application (9 of 10), please click here.
For the EF 6 and MVC 5 tutorial, please click here.
I hope this will help, it helped me!
Unit of Work
Maintains a list of objects affected by a business transaction and coordinates the writing out of changes and the resolution of concurrency problems.
When you're pulling data in and out of a database, it's important to keep track of what you've changed; otherwise, that data won't be written back into the database. Similarly, you have to insert new objects you create and remove any objects you delete.

You can change the database with each change to your object model, but this can lead to lots of very small database calls, which ends up being very slow. Furthermore, it requires you to have a transaction open for the whole interaction, which is impractical if you have a business transaction that spans multiple requests. The situation is even worse if you need to keep track of the objects you've read so you can avoid inconsistent reads.

A Unit of Work keeps track of everything you do during a business transaction that can affect the database. When you're done, it figures out everything that needs to be done to alter the database as a result of your work.
http://martinfowler.com/eaaCatalog/unitOfWork.html
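To make the quote concrete, here is a minimal hand-rolled sketch of the pattern (all names are illustrative; real tools such as NHibernate's ISession or EF's context do this bookkeeping for you):

```csharp
using System;
using System.Collections.Generic;

// Minimal Unit of Work sketch: it tracks new, changed and removed objects
// during a business transaction and writes everything out in one Commit().
public class UnitOfWork
{
    private readonly List<object> _new = new List<object>();
    private readonly List<object> _dirty = new List<object>();
    private readonly List<object> _removed = new List<object>();

    public void RegisterNew(object entity) { _new.Add(entity); }

    public void RegisterDirty(object entity)
    {
        // Registering the same object twice still yields a single write.
        if (!_dirty.Contains(entity)) _dirty.Add(entity);
    }

    public void RegisterRemoved(object entity) { _removed.Add(entity); }

    // In a real implementation this would emit INSERT/UPDATE/DELETE
    // statements inside a single database transaction.
    public int Commit()
    {
        int writes = _new.Count + _dirty.Count + _removed.Count;
        _new.Clear();
        _dirty.Clear();
        _removed.Clear();
        return writes; // number of writes flushed in one batch
    }
}
```

Because changes are batched until Commit(), many small object-level changes turn into one coordinated write, which is exactly the benefit Fowler describes.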