I defined my interfaces in the infrastructure layer to use Dependency Injection, but now I have a problem: how can I resolve the dependency on the DbContext through an interface, without adding a reference to the EF DLL in the infrastructure layer and the service layer?
If you need to hide EF completely from your application, you will need to use the repository pattern, hide EF behind your repositories and generate (or write) POCO entities.
If you're more pragmatic, you can use generic repositories with IQueryable support, which allows a great development and unit testing experience, but what to choose is up to you.
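As a rough sketch of that second option (type names invented, EF 4.1+ assumed), a thin generic repository can simply hand back IQueryable from the underlying set, while only the implementation references the EF assembly:

    using System.Data.Entity;
    using System.Linq;

    // The interface only depends on IQueryable (System.Linq), so the layers
    // consuming it never need a reference to the EF assembly.
    public interface IRepository<T> where T : class
    {
        IQueryable<T> Query();
        void Add(T entity);
        void Remove(T entity);
    }

    // EF-backed implementation, living in the data access assembly.
    public class EfRepository<T> : IRepository<T> where T : class
    {
        private readonly DbContext _context;

        public EfRepository(DbContext context)
        {
            _context = context;
        }

        public IQueryable<T> Query()  { return _context.Set<T>(); }
        public void Add(T entity)     { _context.Set<T>().Add(entity); }
        public void Remove(T entity)  { _context.Set<T>().Remove(entity); }
    }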
You can modify the T4 templates (the .tt files) to generate interfaces along with the context, and even split them into a separate T4 file for each, so that you can place them in different assemblies. You can also make the context return IQueryable instead of ObjectQuery. However...
In order to write optimized queries that run on the database and not in memory, the queries must take into account the technology beneath them. You cannot write generic queries, unit test them against an in-memory list, and then expect them to translate correctly to SQL and run efficiently and without exceptions.
- You will have to test your queries against a real database (with demo data).
What you should do is implement services which hide the DAL technology from the layers above them, yet inside their implementation use the full power of EF to work as efficiently as possible.
These services can be mocked in order to test the layers above them, and the services themselves can be tested together with their usage of EF against a test DB (e.g. a LocalDB instance created and started by the test class).
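A minimal sketch of such a service, with invented names (IOrderService, OrderSummary, ShopContext); the interface is all the upper layers ever see, while the implementation is free to use EF as aggressively as it likes:

    using System.Collections.Generic;
    using System.Linq;

    // Hypothetical service contract - the layers above see only this interface,
    // so they can be tested against a mock or fake implementation.
    public interface IOrderService
    {
        IList<OrderSummary> GetOpenOrders(int customerId);
    }

    // EF-specific implementation kept in the DAL; it can use projections,
    // Include(), etc. so the query runs efficiently on the database.
    public class OrderService : IOrderService
    {
        public IList<OrderSummary> GetOpenOrders(int customerId)
        {
            using (var db = new ShopContext())   // ShopContext : DbContext (invented name)
            {
                return db.Orders
                         .Where(o => o.CustomerId == customerId && !o.IsShipped)
                         .Select(o => new OrderSummary { OrderId = o.Id, Total = o.Total })
                         .ToList();
            }
        }
    }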
A few of the many relevant links:
Generic Repository With EF 4.1 what is the point
ASP.NET MVC3 and Entity Framework Code first architecture
Is UnitOfWork and GenericRepository Pattern redundant In EF 4.1 code first?
https://softwareengineering.stackexchange.com/questions/133448/unit-integration-testing-my-dal
Background
At the company I work for I have been ordered to update an old MVC app and implement a repository pattern for a SQL database. I have created the context of the database using Entity Framework Database-First and got 23 entities.
The first question
Do I need to create a repository for each entity or implement a generic repository for the context? I'm asking this because I have found the following while searching the internet:
One repository per domain
You should think of a repository as a collection of domain objects in memory. If you’re building an application called Vega, you shouldn’t have a repository like the following:
public class VegaRepository {}
Instead, you should have a separate repository per domain class, like OrderRepository, ShippingRepository and ProductRepository.
Source: Programming with Mosh: 4 Common Mistakes with the Repository Pattern
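For illustration only, the kind of per-domain split that quote describes might look roughly like this (Order and Product are placeholder entity types):

    using System.Collections.Generic;

    // One repository per aggregate, each with its own meaningful operations,
    // instead of a single catch-all VegaRepository.
    public interface IOrderRepository
    {
        Order GetById(int id);
        IEnumerable<Order> GetOrdersForCustomer(int customerId);
        void Add(Order order);
    }

    public interface IProductRepository
    {
        Product GetById(int id);
        IEnumerable<Product> GetTopSelling(int count);
    }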
The second question
Does a generic repository work for Entity Framework Database-First? This is because I have found the following while searching the internet:
Entity framework
Do note that the repository pattern is only useful if you have POCOs which are mapped using code first. Otherwise you’ll just break the abstraction with the entities instead (= the repository pattern isn’t very useful then). You can follow this article if you want to get a foundation generated for you.
Source: CodeProject: Repository pattern, done right
To begin with, if you are using a full ORM like Entity Framework or NHibernate, you should avoid implementing an additional layer of Repository and Unit Of Work.
This is because the ORM itself already exposes both a Generic Repository and a Unit Of Work.
In the case of EF, your DbContext is the Unit Of Work and each DbSet is a Generic Repository. In the case of NHibernate, the ISession plays both roles.
Building a new Generic Repository wrapper over the existing one is duplicated work. Why reinvent the wheel?
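For example, calling code can already use the DbContext and its DbSets exactly the way a hand-written unit of work and generic repository would be used (ShopContext, Order and OrderStatus are invented names):

    public class OrderWorkflow
    {
        public void ShipOrderAndQueueReplacement(int orderId)
        {
            using (var db = new ShopContext())               // ShopContext : DbContext
            {
                var order = db.Orders.Find(orderId);          // DbSet<T>.Find ~ generic GetById
                order.Status = OrderStatus.Shipped;

                db.Orders.Add(new Order { CustomerId = order.CustomerId });  // generic Add

                db.SaveChanges();   // DbContext commits all the changes as one unit of work
            }
        }
    }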
But some argue that using the ORM directly in calling code has the following issues:
It makes the code a little more complicated due to the lack of separation of concerns.
Data access code is mixed into the business logic. As a result, redundant, complex query logic spreads across multiple places and becomes hard to manage.
As many ORM objects are used inline in calling code, it is very hard to unit test the code.
As the ORM only exposes a Generic Repository, it causes many of the issues mentioned below.
Apart from all of the above, another issue generally discussed is "what if we decide to change the ORM in the future". This should not be a key factor in the decision, because:
You rarely change the ORM, mostly NEVER – YAGNI.
If you do change the ORM, you will have to make huge changes anyway. You may minimize the effort by encapsulating the complete data access code (NOT just the ORM) inside something. We will discuss that something below.
Considering the four issues mentioned above, it may be necessary to create Repositories even though you are using a full ORM – though this is a per-case decision.
Even in that case, Generic Repository must be avoided. It is considered an anti-pattern.
Why is a generic repository an anti-pattern?
A repository is a part of the domain being modeled, and that domain is not generic.
Not every entity can be deleted.
Not every entity can be added.
Not every entity has a repository.
Queries vary wildly; the repository API becomes as unique as the entity itself.
For GetById(), identifier types may be different.
Updating specific fields (DML) is not possible.
A generic query mechanism is the responsibility of the ORM.
Most ORMs expose an implementation that closely resembles a Generic Repository.
Repositories should implement the SPECIFIC queries for entities by using the generic query mechanism exposed by the ORM.
Working with composite keys is not possible.
It leaks DAL logic into Services anyway.
If you accept predicate criteria as a parameter, they need to be provided by the Service layer. If they are ORM-specific classes, this leaks the ORM into the Services.
I suggest you read these articles (1, 2, 3, 4, 5) explaining why a generic repository is an anti-pattern. This other answer discusses the Repository pattern in general.
So, I will suggest:
Do NOT use a repository at all; use the ORM directly in your calling code.
If you have to use a repository, then do not try to implement everything with a Generic Repository.
Instead, optionally create a very simple and small Generic Repository as an abstract base class, or use the Generic Repository exposed by your ORM as the base repository if the ORM allows it (a sketch follows below).
Implement concrete repositories as per your needs, derive them all from the Generic Repository, and expose the concrete repositories to calling code.
This way you get all the benefits of a generic repository while bypassing its drawbacks.
Even though it is very rare, this also helps when switching the ORM in the future, as the ORM code is cleanly abstracted away in the DAL/Repositories. Please understand that switching the ORM is not a primary objective of a Data Access Layer or Repository.
In any case, do not expose the Generic Repository to calling code.
Also, do not return IQueryable from concrete repositories. This violates the basic purpose of a Repository's existence: to abstract data access. Once IQueryable escapes the repository, many data access decisions leak into the calling code and the Repository loses control over them.
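A minimal sketch of the suggested layout, with invented entity names; note that the abstract base stays internal to the DAL and IQueryable never leaves the concrete repository:

    using System.Collections.Generic;
    using System.Data.Entity;
    using System.Linq;

    // Small abstract base with shared plumbing only - never handed to calling code.
    public abstract class RepositoryBase<TEntity> where TEntity : class
    {
        protected readonly DbContext Context;
        protected RepositoryBase(DbContext context) { Context = context; }

        protected IQueryable<TEntity> Query { get { return Context.Set<TEntity>(); } }
    }

    // Entity-specific contract - this is all the calling code ever sees.
    public interface IOrderRepository
    {
        Order GetById(int id);
        IReadOnlyList<Order> GetPendingForCustomer(int customerId);
    }

    public class OrderRepository : RepositoryBase<Order>, IOrderRepository
    {
        public OrderRepository(DbContext context) : base(context) { }

        public Order GetById(int id)
        {
            return Query.SingleOrDefault(o => o.Id == id);
        }

        public IReadOnlyList<Order> GetPendingForCustomer(int customerId)
        {
            return Query
                .Where(o => o.CustomerId == customerId && o.Status == OrderStatus.Pending)
                .ToList();   // materialized here - IQueryable never leaves the repository
        }
    }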
Do I need to create a repository for each entity or implement a generic repository for the context?
As suggested above, creating a repository for each entity is the better approach. Note that a Repository should ideally return a Domain Model rather than an Entity, but that is a different topic of discussion.
Does a generic repository work for EF Database-First?
As suggested above, EF itself exposes a Generic Repository; building one more layer on top of it is pointless.
According to MSDN, the DbSet:
DbSet<TEntity> Class
A DbSet represents the collection of all entities in the context, or that can be queried from the database, of a given type. DbSet objects are created from a DbContext using the DbContext.Set method.
And according to MSDN, the DbContext:
DbContext Class
A DbContext instance represents a combination of the Unit Of Work and Repository patterns such that it can be used to query from a database and group together changes that will then be written back to the store as a unit. DbContext is conceptually similar to ObjectContext.
So EF already uses the Repository pattern and the Unit of Work internally:
DbSet <----> Repository
DbContext <----> Unit Of Work
Why should I build a repository pattern with a unit of work on the top of my EF?
Depends on how you want to manage your dependencies.
If Entity Framework is your abstraction layer and the database itself is the dependency, then Entity Framework does indeed already provide your repositories and unit of work. The trade-off is that your domain relies on Entity Framework. As long as that dependency is acceptable, you're good.
If, on the other hand, you want to treat Entity Framework itself as a dependency that can potentially be swapped out without making changes to domain code, then you'd want to create an abstraction as a wrapper around that.
Basically, it all comes down to where you draw the line of what is or is not an "external dependency". For some projects it doesn't matter, for some it's the physical database, for some it's the data access framework, etc.
Why should I build a repository pattern with a unit of work on the top of my EF?
Because of the Interface Segregation Principle. The method signatures in DbSet and DbContext are basically a big low-level mess, there's a huge mismatch between them and what is typically expected in a Repository and a Unit of Work. In other words, if you use DbSet and DbContext directly, your Application Services code will suffer from leaky abstractions.
In your Application layer, you need to manipulate appropriate semantics. The code in that layer only needs to speak in terms of business transactions and large collections where you can fetch and store stuff. These are very high-level, minimalist abstract concepts. Entity Framework lingo is just too fuzzy and low-level for that, so you need to introduce other idioms - Repository and UoW.
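As an illustration (names invented, not tied to any particular library), the contracts an Application Service depends on can stay at exactly that level of abstraction:

    using System;

    // Minimal, intention-revealing contracts the Application layer depends on.
    public interface IOrderRepository
    {
        Order GetById(Guid id);
        void Add(Order order);
    }

    public interface IUnitOfWork
    {
        void Commit();   // one business transaction
    }

    // The application service speaks only in these idioms - no DbSet/DbContext noise.
    public class PlaceOrderService
    {
        private readonly IOrderRepository _orders;
        private readonly IUnitOfWork _unitOfWork;

        public PlaceOrderService(IOrderRepository orders, IUnitOfWork unitOfWork)
        {
            _orders = orders;
            _unitOfWork = unitOfWork;
        }

        public void PlaceOrder(Order order)
        {
            _orders.Add(order);
            _unitOfWork.Commit();
        }
    }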
I'm wondering about the usefulness of creating a DAL layer with EF.
Why not call EF directly in the business layer, considering that the EF DbContext is a unit of work and the DbSet properties are repositories?
So why add an extra DAL layer, which is ultimately just a facade?
The only advantage I see is the case where we have to change the data access implementation, e.g. replace EF with Hibernate or something else. But honestly, I've never seen that happen.
Actually, with a data mapper, developing a DAL is plainly unnecessary, because it would contain zero lines of code.
Everything on top of a data mapper isn't a data access layer but the actual domain, because a data mapper implementation such as an OR/M translates your objects into the underlying relational data and vice versa; what you build on top of it is your domain, and you are spared the pain of the object-relational impedance mismatch.
The point of introducing the repository pattern on top of a data mapper is that you want to be able to switch the underlying data store, even to a non-relational one, in the long run (or switch from NoSQL to SQL, who knows!). There is also another definitive reason to introduce the repository layer in your software: you want to be able to mock the data store with fakes in order to unit test your domain.
Finally, even though Entity Framework implements the unit of work and other patterns, sometimes its implementation may not suit your own domain requirements, and you need to wrap it so it fits your domain better.
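For example, a hand-rolled in-memory fake of a hypothetical IProductRepository lets the domain be unit tested without any database (a mocking library such as Moq would serve the same purpose):

    using System.Collections.Generic;
    using System.Linq;

    // Hypothetical repository contract owned by the domain.
    public interface IProductRepository
    {
        void Add(Product product);
        IEnumerable<Product> GetDiscontinued();
    }

    // In-memory fake used by unit tests in place of the real EF-backed implementation,
    // so the domain logic can be exercised without touching a database.
    public class FakeProductRepository : IProductRepository
    {
        private readonly List<Product> _products = new List<Product>();

        public void Add(Product product) { _products.Add(product); }

        public IEnumerable<Product> GetDiscontinued()
        {
            return _products.Where(p => p.IsDiscontinued);
        }
    }

The test then simply news up the fake, seeds it with known data, and passes it to the domain class under test.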
I am trying to create a system that allows you to switch multiple data sources, e.g. switching from Entity Framework to Dapper. I am trying to find the best approach to do this.
At the moment I have different projects for different data layers, e.g. Data.EF for Entity Framework and Data.Dapper for Dapper. I have used a database-first approach, but when it creates the models the generated code is tightly coupled and not easy to refactor, e.g. to separate the models.
I have a project called Models, which holds domain and view models, and I was thinking of creating Data.Core and following the repository pattern. But doing this will add an extra layer, so I would have Presentation / Business / Repository / Data.
I would like to know the best structure for this approach. Should I also use a code-first approach to create my database? That would help separate concerns and improve abstraction. This is quite a big application, so getting the structure right is essential.
I'd suggest factoring your data interfaces out, either into the model as repository interfaces for your entities, or into an infrastructure project. (I think the latter was your rationale behind creating a Data.Core project.)
Each data source will then implement the very same set of interfaces, and you can easily switch between them, even dynamically using dependency injection.
For instance, using repositories:
Model
\_ Entities
Entity
\_ Repositories
IEntityRepository
Data.EF
EntityRepository : Model.IEntityRepository
Data.Dapper
EntityRepository : Model.IEntityRepository
Then in your business you won't need to even reference Data.EF or Data.Dapper: you can work with IEntityRepository and have that reference injected dynamically.
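A rough sketch of that wiring, with hypothetical type names and constructors; in practice a DI container would perform this registration, but the idea is the same:

    // Shared contract in the Model (or Data.Core) project; Entity is an invented domain type.
    public interface IEntityRepository
    {
        Entity GetById(int id);
        void Add(Entity entity);
    }

    // Composition root - the only place that knows which concrete data layer is in use.
    public static class DataLayerFactory
    {
        public static IEntityRepository Create(bool useDapper, string connectionString)
        {
            if (useDapper)
                return new Data.Dapper.EntityRepository(connectionString);   // hypothetical ctor

            return new Data.EF.EntityRepository(connectionString);           // hypothetical ctor
        }
    }

Swapping providers is then a one-line (or one-config-entry) decision; the business layer only ever sees IEntityRepository.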
I think your approach is correct. I'd say Presentation / Business / Repository / Data is pretty standard these days.
I'd say the code-first approach using POCOs is the preferred option in the industry today. I would advise starting with a project containing your POCO data structures, together with any logic that belongs to them, and taking it from there. The advantage of this is that your objects model the domain more naturally. If you start with a DB-centric approach, the problem is that, if you are not careful, you may end up with objects more akin to SQL relational tables than to the real model. This was painfully evident in the first versions of .NET, where the use of DataSets tightly coupled to the DB was encouraged, and that often caused problems when working with them in the business layer.
If needed you can do any complex mapping between the business objects and the db objects in the repository layer. You can use a proxy and/or a unit of work if you need to.
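A minimal code-first sketch of such a starting point (entity and context names invented):

    using System;
    using System.Collections.Generic;
    using System.Data.Entity;

    // Plain POCOs - no EF base class, no designer file, just the domain model.
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public virtual ICollection<Order> Orders { get; set; }
    }

    public class Order
    {
        public int Id { get; set; }
        public DateTime PlacedOn { get; set; }
        public int CustomerId { get; set; }
        public virtual Customer Customer { get; set; }
    }

    // The EF-specific part stays at the edge, in the data project.
    public class ShopContext : DbContext
    {
        public DbSet<Customer> Customers { get; set; }
        public DbSet<Order> Orders { get; set; }
    }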
I would suggest you create your domain objects, use the code-first approach, and also apply the repository pattern.
Yes, the repository pattern does bring in an extra layer. Have a look at this post for more detailed information: Difference between Repository and Service Layer?
RE: code-first approach to create my database
It doesn't matter how big your application is; it is a question of what else you intend to use the database for. If this database is simply a repository for this application, then using code-first is fine, as you are simply storing your code objects. However, if you are using this database as an integration point between applications, then you may wish to design the database separately from the application models.
I've been reading about POCOs (Plain Old CLR Objects) for a while, but I still can't find the real added value of using them instead of the auto-generated partial classes of Entity Framework.
One more thing: is it better to use Entity Framework directly from the presentation layer, or would creating a BLL be better?
The main benefit of a POCO is that you can pass it to a class, library or assembly that doesn't need to know anything about Entity Framework to do its job.
Remember the Single Responsibility Principle - your DAL should know about EF, but your Domain should not, and neither should your presentation layer, because EF deals with data access and those layers do not. Passing EF generated class objects up to those layers generally means you need to make those layers aware of EF, breaking the SRP and making it harder to unit test those layers in isolation.
In response to Ali's further query in the comments below, here is an expanded explanation.
In more complex applications, you will want to split the logic up into separate concerns - data access, business logic, presentation (and perhaps a lot more).
As Entity Framework deals with data access, it resides in the data access layer - here, you will see little difference between POCOs and EF-generated classes. This is fine, because this layer already knows about Entity Framework.
However, you may want to pass data up to the business logic layer. If you do this with EF-generated classes, then your business logic layer must also know about Entity Framework, because the EF classes rely on a lot of things EF provides specially. This removes the isolation that your business logic layer should have - you need that isolation so you can unit test it correctly by injecting known data into the class from a fake data access layer, which is incredibly hard to do if the business logic layer knows about EF.
With unit testing, you should be testing the layer's functionality, not the functionality of third-party libraries - but with EF you end up testing a lot of EF's functionality, or your own functionality which relies very heavily on EF's. This isn't good, and it can mask errors or issues.
Removing the business logic's dependency on EF-generated classes also allows you to move that layer to as remote a location as you like from the data access layer - you could even stick it behind a web service and it would be completely happy. But you can only do this with POCOs; you cannot do it with EF-generated classes.
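As a rough illustration (all names invented), a business-layer class that only sees POCOs and a small data-access interface can be tested with an in-memory fake, or hosted behind a web service, without ever referencing EF:

    using System.Collections.Generic;
    using System.Linq;

    // Hypothetical data access abstraction - the business layer references this, never EF.
    public interface ICustomerData
    {
        IEnumerable<Customer> GetCustomersWithOverdueInvoices();
    }

    // Business logic works purely with POCOs (Customer here is a plain class with an
    // Email property), so it can be unit tested by handing it a fake ICustomerData.
    public class DunningService
    {
        private readonly ICustomerData _data;

        public DunningService(ICustomerData data) { _data = data; }

        public IEnumerable<string> BuildReminderEmailList()
        {
            return _data.GetCustomersWithOverdueInvoices()
                        .Select(c => c.Email)
                        .Distinct();
        }
    }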
POCOs really come into their own in large, complex, multi-layered applications - if you aren't layering your app, then you won't see a whole load of benefits, imho.
All of this is my opinion, and I'm just a coder with experience - I'm not a coding rockstar, so some other commenters may like to further expand my answers...
The real benefit of POCOs is that you can use code-first and EF Migrations. If you are not going to use code-first, you can stick with the designer-generated classes.
If you have a large application you should create a separate BLL, but if your application is very small you can probably use the EF classes directly in the presentation layer.
Using POCO classes with an ORM allows you to create tests for that code more easily. It also gives you a layer of abstraction between your model objects (the POCO classes) and the data access code, so that if you need to, you can swap the data access code (EF for NHibernate, for instance).
I've worked with the POCO model in the past and I can tell you that it's useful for big enterprise projects and large teams of developers where changes to the model happen often and where the monolithic file model used by default by EF does not scale well. The benefits on small projects or in rapid application development are hard to see.
TLDR version: If you're asking yourself what the benefits of POCO and code first are, you probably won't gain anything from using them.