Suppose there is a class that works as a data access layer and provides CRUD operations for entities. Which version is the best practice when considering performance and a multi-threaded environment (i.e. the methods of this class are called by multiple threads simultaneously)?
Version 1:
DbContext created at class level and shared by all the methods:
class EmployeeService
{
    private DbContext db = new DbContext();

    public Employee GetEmployee(int id)
    {
        return db.Employees.Find(id);
    }

    public void AddEmployee(Employee employee)
    {
        db.Employees.Add(employee);
        db.SaveChanges();
    }
}
Version 2:
A DbContext created per method call:
class EmployeeService
{
    public Employee GetEmployee(int id)
    {
        using (DbContext db = new DbContext())
        {
            return db.Employees.Find(id);
        }
    }

    public void AddEmployee(Employee employee)
    {
        using (DbContext db = new DbContext())
        {
            db.Employees.Add(employee);
            db.SaveChanges();
        }
    }
}
UPDATE:
Maybe the question as posted is too generic in scope, which leads to several points to consider.
The point of interest is the cost of instantiating a DbContext object. Can it be created per request (Version 2), or is it a heavyweight object that is better to create as a few instances and share them across different calls (Version 1)?
There's even a third approach based on manual or automatic dependency injection:
public interface ISomeService
{
    // Interface members
}

public class SomeService : ISomeService
{
    public SomeService(DbContext dbContext)
    {
        DbContext = dbContext;
    }

    private DbContext DbContext { get; }
}
Then SomeService won't be responsible for defining the lifetime of the injected DbContext; an external class will do that.
That way, your service focuses on doing just what it's meant to do (working with the domain and reading/writing data).
Depending on the execution environment, you'll want different DbContext lifestyles: per service instance, per request, per thread... There are a lot of choices here depending on the case.
Perhaps you're not considering another scenario: a shared transaction between two or more services. You would need to transfer the responsibility of instantiating the DbContext to an upper layer, then inject the same DbContext into all participating services, and confirm or abandon the whole transaction globally.
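For illustration, a rough sketch of that last scenario might look like the following (the service names, their constructors, and the entity are hypothetical; Database.BeginTransaction() assumes EF6 or later):
// Sketch only: the composition layer owns the DbContext and the transaction.
// Both services receive the same context instance, and the commit happens once at the end.
using (var dbContext = new DbContext())
using (var transaction = dbContext.Database.BeginTransaction())
{
    var orderService = new OrderService(dbContext);     // hypothetical services that follow the
    var invoiceService = new InvoiceService(dbContext); // constructor-injection pattern shown above

    var order = new Order();                             // hypothetical entity and domain calls
    orderService.PlaceOrder(order);
    invoiceService.CreateInvoiceFor(order);

    dbContext.SaveChanges();
    transaction.Commit(); // confirm the whole transaction globally,
}                         // or let disposal roll it back if an exception escapes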
The performance of EF is made up of several factors, the scope of the DbContext being one of them.
Some background information regarding the scope is found here: https://msdn.microsoft.com/en-us/data/jj729737.aspx
Scope is not only about performance, it's also about the objects returned. If lazy loading is applied and the DbContext is disposed before you access some navigation properties, you will get an exception. See http://www.entityframeworktutorial.net/EntityFramework4.3/lazy-loading-with-dbcontext.aspx
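As a quick sketch of that pitfall (assuming Department is a lazy-loaded navigation property on Employee):
Employee employee;
using (var db = new DbContext())
{
    employee = db.Employees.Find(id);
}
// The context that loaded the employee is gone, so this lazy load
// throws an ObjectDisposedException instead of fetching the department.
var departmentName = employee.Department.Name;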
To avoid that, you could write code like this:
public class EmployeeService
{
    public EmployeeDto GetEmployee(int id)
    {
        using (DbContext db = new DbContext())
        {
            return db.Employees.Select(e =>
                new EmployeeDto
                {
                    Id = e.Id,
                    Name = e.Name,
                    Department = e.Department.Name
                }).First(e => e.Id == id);
        }
    }
}

public class EmployeeDto
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Department { get; set; }
}
So rather than returning the whole object, you can use projection to limit the set of data that is returned. This reduces load on the database server, since the queries select only the columns you need, and it also helps to load all the required data before disposing the context. See http://www.entityframeworktutorial.net/querying-entity-graph-in-entity-framework.aspx for some examples of generated queries.
So my advice is to limit the scope of the DbContext. You could inject the DbContext, but then you will have no control over it, which can lead to errors like this: http://wallacekelly.blogspot.nl/2012/01/linq-to-entities-objectdisposedexceptio.html
But it all depends on your needs and on what kind of service it is you are building.
Related
I'm working on a classic .Net Framework Web API solution.
I have 3 layers. Let's call them
MVC - with POST, GET, UPDATE, DELETE controllers.
BIZZ - for business, with my service classes. My service classes are a kind of repository with CREATE, READ, UPDATE, DELETE and specific methods.
DATA - with POCO and definition of DB context.
I will not go into detail on the EF layer; it is a classic Entity Framework project with POCOs. Here is a sample of a service, along with the base Service class:
public abstract class Service : IDisposable
{
    protected DbContext dbContext = new DbContext();

    public void Dispose()
    {
        dbContext.Dispose();
    }
}
Then I have a cart service and an order service. They are similar in structure, so I will only write the code useful for this example.
public class CartService : Service
{
    public Cart Create(Cart cart)
    {
        // Create the cart
    }

    public Cart Read(Guid id)
    {
        // Read
    }

    public Cart Update(Cart cart)
    {
        // I do some check first then
    }

    public void Delete(Cart cart)
    {
        // Delete
    }

    public void Checkout(Cart cart)
    {
        // Validation of cart removed in this example
        dbContext.Cart.Attach(cart);
        cart.DateCheckout = DateTime.UtcNow;
        dbContext.Entry(cart).State = EntityState.Modified; // I think this line can be removed
        dbContext.SaveChanges();

        using (var orderService = new OrderService())
        {
            foreach (var order in cart.Orders)
            {
                order.DateCheckout = cart.DateCheckout;
                order.Status = OrderStatus.PD; // pending
                orderService.Update(order);
            }
        }
    }
}
public class OrderService : Service
{
    public Order Create(Order order)
    {
        // Create the order
    }

    public Order Read(Guid id)
    {
        // Read
    }

    public Order Update(Order order)
    {
        dbContext.Entry(order).State = EntityState.Modified;
        dbContext.SaveChanges();
        // More process here...
        return order;
    }

    public void Delete(Order order)
    {
        // Delete
    }
}
So, I have a service (the cart service) that calls another service (the order service). I must work like this because I cannot simply accept the cart and all the orders in it as they are. When I save a new order or update an existing one, I must also create records in other tables in other databases (that code is not in my example). So, I repeat: I have a service that calls another service, and therefore I have two dbContexts. At best this just creates two contexts in memory; at worst it causes exceptions, such as "you cannot attach an entity to two contexts" or "this entity is not in the context".
Well, I would like all my services to use the same context. I suppose you will all tell me to use dependency injection. Yes, OK, but I don't want to have to pass the context every time I create a new service. I don't want to have to do this:
public void Checkout(Cart cart)
{
    // ...
    using (var orderService = new OrderService(dbContext))
    {
        // ...
    }
}
I would like to do something that impacts only my base service, if possible. A singleton maybe... At this point I can see your face. Yes, I know singletons are so bad. But I'm building an IIS Web API; each request is a new instance, so I don't care about the impact of the singleton. And I can switch databases by changing the connection string in the config file, so that benefit of DI is there already. Well, I also know it is possible to have a singleton with DI, I just don't know how.
So, what can I do to be sure I share my dbContext with all my services?
Disclaimer: This example is not intended to be a "good" one and certainly does not follow best practices, but faced with an existing legacy code base which, from your example, already suffers from a number of questionable practices, this should get you past the multiple-context issues.
Essentially, if you're not already using an IoC container to perform dependency injection, then what you need is to introduce a unit of work to manage the scope of a DbContext, where your base Service class exposes a DbContext supplied by the unit of work (essentially a DbContext registry).
For the unit of work, and assuming EF6, I would recommend Mehdime's DbContextScope, which is available as a NuGet package. Alternatively, you can find the source code on GitHub and implement something similar without too much trouble. I like this pattern because it leverages the CallContext to serve as the communication layer between the ContextScope (unit of work) created by the DbContextScopeFactory and the AmbientDbContextLocator. This will probably take a little time to get your head around, but it fits very nicely into legacy applications where you want to leverage the unit of work and don't have dependency injection.
What it would look like:
In your Service class you would introduce the AmbientDbContextLocator to resolve your DbContext:
private readonly IAmbientDbContextLocator _contextLocator = new AmbientDbContextLocator();

protected DbContext DbContext
{
    get { return _contextLocator.Get<DbContext>(); }
}
And that's it. Later, as you refactor to accommodate dependency injection, you just inject the AmbientDbContextLocator instead of 'new'ing it up.
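A sketch of what that refactored base class might look like once constructor injection is available (an assumption on my part, not part of the original answer):
public abstract class Service
{
    private readonly IAmbientDbContextLocator _contextLocator;

    // The locator is injected instead of being new'ed up inside the class.
    protected Service(IAmbientDbContextLocator contextLocator)
    {
        _contextLocator = contextLocator;
    }

    protected DbContext DbContext
    {
        get { return _contextLocator.Get<DbContext>(); }
    }
}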
Then, in your Web API controllers where you are using your services (not in the services themselves), you need to add a DbContextScopeFactory instance:
private readonly IDbContextScopeFactory _contextScopeFactory = new DbContextScopeFactory();
Lastly, in your API methods, when you want to call your services, you simply use the ContextScopeFactory to create a context scope. The AmbientDbContextLocators will retrieve the DbContext from this context scope. The context scope is created in a using block to ensure your contexts are disposed. Using your Checkout method as an example, it would look like this:
In your Web API [HttpPost] Checkout() method:
using (var contextScope = _contextScopeFactory.Create())
{
    using (var service = new CartService())
    {
        service.Checkout(cart);
    }
    contextScope.SaveChanges();
}
Your cart service's Checkout method would remain relatively unchanged, only instead of accessing dbContext as a field (new DbContext()) it would access the DbContext property, which gets the context through the context locator.
The services can continue to call DbContext.SaveChanges(), but this isn't necessary, and the changes will not be committed to the DB until contextScope.SaveChanges() is called. Each service will have its own instance of the context locator rather than of the DbContext, and these depend on you defining a ContextScope in order to function. If you call a service method that tries to access the DbContext without being within a using (var contextScope = _contextScopeFactory.Create()) block, you will receive an error. This way all of your service calls, even nested service calls (CartService calls OrderService), will interact with the same DbContext instance.
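As a sketch, the Checkout method from the question might end up looking like this (Set&lt;Cart&gt;() is used because the locator in this example resolves the base DbContext type; in practice you would resolve your concrete context):
public void Checkout(Cart cart)
{
    // Validation of cart removed in this example
    DbContext.Set<Cart>().Attach(cart);
    cart.DateCheckout = DateTime.UtcNow;
    DbContext.Entry(cart).State = EntityState.Modified;

    using (var orderService = new OrderService())
    {
        foreach (var order in cart.Orders)
        {
            order.DateCheckout = cart.DateCheckout;
            order.Status = OrderStatus.PD; // pending
            orderService.Update(order); // resolves the same ambient DbContext
        }
    }
    // DbContext.SaveChanges() is optional here; the commit happens
    // when the controller calls contextScope.SaveChanges().
}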
Even if you just want to read data, you can leverage a slightly faster DbContext by using _contextScopeFactory.CreateReadOnly(), which will help guard against unexpected/disallowed calls to SaveChanges().
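For example (a sketch; CreateReadOnly() comes from the same DbContextScope package):
using (_contextScopeFactory.CreateReadOnly())
{
    using (var service = new CartService())
    {
        var cart = service.Read(id); // reads resolve the ambient context as usual
    }
    // A read-only scope exposes no SaveChanges, so nothing can be committed by accident.
}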
When using the ASP.NET Core stack, the tutorial for using EF with it defaults to using DI to provide your DB context, just not with a service layer. That said, it actually does the right thing for this out of the box. I'll give a brief rundown of the bare minimum necessary for this to work, using whatever the latest versions of ASP.NET Core Web API and EF Core were on NuGet at the time of writing.
First, let's get the boilerplate out of the way, starting with the model:
Models.cs
public class ShopContext : DbContext
{
    public ShopContext(DbContextOptions options) : base(options) {}

    // We add a GUID here so we're able to tell it's the same object later.
    public string Id { get; } = Guid.NewGuid().ToString();

    public DbSet<Cart> Carts { get; set; }
    public DbSet<Order> Orders { get; set; }
}

public class Cart
{
    public string Id { get; set; }
    public string Name { get; set; }
}

public class Order
{
    public string Id { get; set; }
    public string Name { get; set; }
}
Then some bare-bones services:
Services.cs
public class CartService
{
    ShopContext _ctx;

    public CartService(ShopContext ctx)
    {
        _ctx = ctx;
        Console.WriteLine($"Context in CartService: {ctx.Id}");
    }

    public async Task<List<Cart>> List() => await _ctx.Carts.ToListAsync();

    public async Task<Cart> Create(string name)
    {
        return (await _ctx.Carts.AddAsync(new Cart {Name = name})).Entity;
    }
}

public class OrderService
{
    ShopContext _ctx;

    public OrderService(ShopContext ctx)
    {
        _ctx = ctx;
        Console.WriteLine($"Context in OrderService: {ctx.Id}");
    }

    public async Task<List<Order>> List() => await _ctx.Orders.ToListAsync();

    public async Task<Order> Create(string name)
    {
        return (await _ctx.Orders.AddAsync(new Order {Name = name})).Entity;
    }
}
The only notable things here are: the context comes in as a constructor parameter as God intended, and we log the ID of the context to verify when it gets created with what.
Then our controller:
ShopController.cs
[ApiController]
[Route("[controller]")]
public class ShopController : ControllerBase
{
    ShopContext _ctx;
    CartService _cart;
    OrderService _order;

    public ShopController(ShopContext ctx, CartService cart, OrderService order)
    {
        Console.WriteLine($"Context in ShopController: {ctx.Id}");
        _ctx = ctx;
        _cart = cart;
        _order = order;
    }

    [HttpGet]
    public async Task<IEnumerable<string>> Get()
    {
        var carts = await _cart.List();
        var orders = await _order.List();
        return (from c in carts select c.Name).Concat(from o in orders select o.Name);
    }

    [HttpPost]
    public async Task Post(string name)
    {
        await _cart.Create(name);
        await _order.Create(name);
        await _ctx.SaveChangesAsync();
    }
}
As above, we take the context as a constructor parameter to triple-check it's what it should be; we also need it to call SaveChanges at the end of an operation. (You can refactor this out of controllers if you want to, but they'll work just fine as units of work for now.)
The part that ties this together is the DI configuration:
Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers();

    // Use whichever provider you have here, this is where you grab a connection string from the app configuration.
    services.AddDbContext<ShopContext>(options =>
        options.UseInMemoryDatabase("Initrode"));

    services.AddScoped<CartService>();
    services.AddScoped<OrderService>();
}
AddDbContext() defaults to registering the DbContext to be created per request by the container. Web API provides the AddControllers method that puts the controllers into the DI container, and we also register our services manually.
The rest of Startup.cs I've left as-is.
Starting this up and opening https://localhost:5001/shop should log something like:
Context in CartService: b213966e-35f2-4cc9-83d1-98a5614742a3
Context in OrderService: b213966e-35f2-4cc9-83d1-98a5614742a3
Context in ShopController: b213966e-35f2-4cc9-83d1-98a5614742a3
with the same GUID for all three lines in a request, but a different GUID between requests.
A little additional explanation of what goes on above:
Registering a component in a container (using Add() and such above) means telling the container those components exist and that it should create them for you when asked, as well as what identifiers they're available under and how to create them. The defaults for this are more or less "make the component available as its class, and create it by calling its one public constructor, passing other registered components into it" - the container looks at the constructor signature to figure this out.
"Scoped" in an ASP.NET Core app means "per request." I think in this case one could also use services with a transient lifetime - a new one is created every time it's needed, but they'll still get the same DbContext as long as they're created while handling the same request. Which one to choose is a design consideration; the main constraint is that you can't inject shorter-lived components into longer-lived components without resorting to more complex techniques, which is why I favour making all components as short-lived as possible. In other words, I only make things longer-lived when they actually hold some state that needs to live for that time, and I do that as sparingly as possible, because state is bad. (Just recently I had to refactor an unfortunate design where my services were singletons, but I wanted my repositories to be per-request so as to be able to inject the currently logged-in user's information into the repository and automatically fill in the "created by" and "updated by" fields.)
You'll note that with support for doing things this way being built-in to both ASP.NET Core and EF Core, there's actually very little extra code involved. Also, the only thing needed to go from "injecting a context into your controllers" (as the tutorial does) to "injecting a context into services that you use from your controllers" is adding the services into DI - since the controller and context are already under DI, anything new you add can be injected into them and vice versa.
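For instance, going back to the registration above, the services could be made transient while the context stays scoped, and the logging would still print the same context ID within one request (a sketch, not what the sample above actually uses):
services.AddDbContext<ShopContext>(options => options.UseInMemoryDatabase("Initrode"));

// Each resolution of these creates a new service object, but within a single
// request they all still receive the same scoped ShopContext instance.
services.AddTransient<CartService>();
services.AddTransient<OrderService>();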
This should give you a quick introduction to how to make things "just work" and shows you the basic use case of a DI container: you declaratively tell it, or it infers, "this is an X", "this is a Y", "this is a Z and it needs to be created using an X and a Y"; then when you ask the container to give you a Z, it will automagically first create an X and a Y, then create Z with them. The container also manages the scope and lifetime of these objects, e.g. only creating one of a type per API request. Beyond that it's a question of experience and familiarity with a given container - Ninject and Autofac, say, are much more powerful than the built-in one - but it's all variations on the same idea of declaratively describing how to create an object, possibly using other objects (its dependencies), and having the container "figure out" how to wire things together.
I am interested in the architectural solution to the following problem.
I have:
public class GenericRepository<T> : IDisposable
{
    private readonly ISession _session;

    public GenericRepository(ISession session)
    {
        _session = session;
    }

    public T InsertAsync(T entity) {...}
    public IQueryable<T> Read() {...}
    public T UpdateAsync(T entity) {...}
    public void DeleteAsync(T entity) {...}

    public Task Commit()
    {
        return _session.Transaction.CommitAsync();
    }

    public void Dispose()
    {
        if (_session.Transaction.IsActive)
        {
            _session.Transaction.Rollback();
        }
    }
}

public class UserService
{
    public UserService(GenericRepository<User> repository) {...}

    public long CreateUser(string userName)
    {
        ...
        _repository.Commit(); // [1]
    }
}

public class OrganizationService
{
    public OrganizationService(GenericRepository<Organization> repository) {...}

    public int CreateOrganization(string code)
    {
        ...
        _repository.Commit(); // [2]
    }
}
The following registration is used:
services.AddScoped<ISession>(x => x.GetRequiredService<NHSessionProvider>().OpenSession());
services.AddScoped(typeof(GenericRepository<>));
services.AddScoped<UserService>();
services.AddScoped<OrganizationService>();
These CreateOrganization and CreateUser methods can be used independently in any part of the code:
public IActionResult Post([FromServices] OrganizationService service, [FromBody] string code)
{
    service.CreateOrganization(code);
    return Ok();
}

public IActionResult Post([FromServices] UserService service, [FromBody] string userName)
{
    service.CreateUser(userName);
    return Ok();
}
However, now I have a new service:
public class MyBillingService
{
    public MyBillingService(GenericRepository<Contractor> repository, OrganizationService organizationService, UserService userService) {...}

    public int CreateNewContractor(string organizationCode, string userName)
    {
        ...
        _organizationService.CreateOrganization(organizationCode);
        ...
        _userService.CreateUser(userName); // [3]
        ...
        _repository.Commit(); // [4]
    }
}
In this implementation, CreateOrganization and CreateUser have their own transactions, and if [3] throws an exception, the organization will be created anyway.
OK, because ISession is registered as Scoped, I can delete _repository.Commit from CreateOrganization and CreateUser ([1] and [2]). In that case, [4] will be responsible for committing all changes.
But what to do when OrganizationService and UserService are used independently? After all, they have now become non-independent services and cannot save data without delegating the commit of changes to some other service:
public IActionResult Post([FromServices] UserService service, [FromServices] TransactionService transaction, [FromBody] string userName)
{
    service.CreateUser(userName);
    transaction.Commit();
    return Ok();
}
How good a decision is this?
Transactions require a unit of work. There is no other way to coordinate repositories. The reason you're facing issues here is that your entire design is wrong.
First and foremost, you should not have these repositories at all. You're using EF Core, which is an ORM, and already implements the repository and unit of work patterns. Using an ORM is opting to use a third-party library for your DAL. Wrapping your own DAL layer around that is pointless and imposes needless maintenance and testing costs on your application with zero benefit. Your services should depend on your context directly.
Then, services should be self-contained units of functionality. If they depend on other services, you're doing it wrong. A service should correspond to a particular subdomain of your application. If users and organizations need to be managed together transactionally, then you should have one service that encompasses both.
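As a rough sketch of that idea (the names, the context type, and the entity properties are assumptions for illustration, not taken from the question):
// One service owns the whole use case and depends on the context directly,
// so a single SaveChanges commits both inserts atomically.
public class RegistrationService
{
    private readonly AppDbContext _context; // hypothetical context exposing Users and Organizations

    public RegistrationService(AppDbContext context)
    {
        _context = context;
    }

    public async Task RegisterAsync(string organizationCode, string userName)
    {
        _context.Organizations.Add(new Organization { Code = organizationCode });
        _context.Users.Add(new User { Name = userName });

        // If either insert fails, neither row is saved.
        await _context.SaveChangesAsync();
    }
}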
Alternatively, if you want/need to keep the two separate, then you would need to incorporate the concept of sagas.
So I've started to move more towards what Chris mentioned in his answer and use the ISession directly, but I have used a generic repository in the past. Your repositories can't correctly handle transactions that have already been started.
So my generic repo has a couple of methods
protected virtual TResult Transact<TResult>(Func<TResult> func)
{
    if (_session.Transaction.IsActive)
        return func.Invoke();

    TResult result;
    using (var tx = _session.BeginTransaction(IsolationLevel.ReadCommitted))
    {
        result = func.Invoke();
        tx.Commit();
    }
    return result;
}

protected virtual void Transact(System.Action action)
{
    Transact(() =>
    {
        action.Invoke();
        return false;
    });
}
Then the methods that are implementing the repo functionality look like this
public bool Remove(T item)
{
    Transact(() => _session.Delete(item));
    return true;
}
This allows the method to use an existing transaction if one has already been started, and otherwise to create its own transaction for this piece of work.
You also should not have a Dispose in your repository, since you don't own the reference to ISession. Its life cycle should be handled by whoever created that instance.
The generic repository also shouldn't have commit functionality except when it explicitly started a new transaction itself. So now you need something that handles starting and committing the transaction. In a web scenario you are typically in a session-per-request setup: you create your session in BeginRequest and dispose of it in EndRequest. I then use a transaction attribute to manage creating transactions prior to executing the controller action, and to commit/rollback after the execution of the controller method.
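A rough sketch of such a filter in ASP.NET Core terms (the answer describes a classic ASP.NET BeginRequest/EndRequest setup, but the idea carries over; names and registration details are illustrative):
public class TransactionAttribute : ActionFilterAttribute
{
    public override async Task OnActionExecutionAsync(
        ActionExecutingContext context, ActionExecutionDelegate next)
    {
        // Resolve the scoped NHibernate ISession registered in ConfigureServices.
        var session = context.HttpContext.RequestServices
            .GetRequiredService<NHibernate.ISession>();

        using (var tx = session.BeginTransaction(IsolationLevel.ReadCommitted))
        {
            var executed = await next(); // run the controller action

            if (executed.Exception == null || executed.ExceptionHandled)
                tx.Commit();
            else
                tx.Rollback();
        }
    }
}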
I'm having an issue in Entity Framework 6 where an exception is consistently thrown. For the most part the application works perfectly fine until I try adding a user to a role via a linking table.
The error being thrown is the following:
The relationship between the two objects cannot be defined because they are attached to different ObjectContext objects.
The functionality will happily add the user to the role in memory, but as soon as SaveChanges() is called the process falls over.
I'm aware of the how and why of the above error, and after doing some research it appears to be due to the contexts not being disposed of correctly. Following on from that and looking into the DbContext setup, I've realised IDisposable wasn't added to the configuration. Unfortunately, no matter what I've tried, incorporating IDisposable at any point within the application still doesn't dispose of the contexts correctly.
So after spending a fair bit of time and having no luck via Google, I'm wondering if any of you have a solution or are able to point me in the right direction.
The following is a cutdown version of the Data Layer classes I've implemented:
public class GenericRepository<T> : WebsiteContext, IGenericRepository<T> where T : class
{
    public virtual void Commit()
    {
        SaveChanges();
    }

    public virtual void Delete(int id)
    {
        var record = Set<T>().Find(id);

        if (record == null)
            throw new Exception("Some Message");

        Set<T>().Remove(record);
    }

    // ... ETC
}

public interface IGenericRepository<T> where T : class
{
    void Commit();

    // ... ETC
}
public class WebsiteContext : DbContext, IWebsiteContext
{
    static WebsiteContext()
    {
        Database.SetInitializer<WebsiteContext>(null);
    }

    public WebsiteContext() : base("Name=WebsiteContext") { }

    public IDbSet<User> Users { get; set; }

    // ... ETC

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // ... ETC
    }
}
This implementation is loosely based around the following Stackoverflow question.
Entity Framework 6 Code First - Is Repository Implementation a Good One?
The following is a condensed version of the Service Layer class and method which is causing the issue.
private IGenericRepository<User> _userRepository;
private IGenericRepository<ApplicationUserSetting> _userSettingRepository;
private IGenericRepository<ApplicationRole> _roleRepository;

public UserManagementService()
{
    _userRepository = new GenericRepository<User>();
    _roleRepository = new GenericRepository<ApplicationRole>();
    _userSettingRepository = new GenericRepository<ApplicationUserSetting>();
}

public void AssignUserRole(AssignRoleModel model)
{
    var user = _userRepository.GetById(model.UserId);
    if (user == null)
        return;

    var role = _roleRepository.GetById(model.RoleId);
    if (role == null)
        return;

    user.Roles.Add(role);

    _userRepository.Commit();
}
The issue, just like the error states, is that you have multiple DbContext instances fetching your entities for you. Each fetched entity is associated with the DbContext instance that retrieved it. If you want to persist changes to these entities, it has to happen on the DbContext instance the entity is associated with, OR you have to attach it to the DbContext instance it is not associated with.
If you are trying to keep it simple, I recommend you implement a DI framework like Autofac. You can then have a single DbContext instance created per request and have it injected everywhere you need it. It will allow you to keep your existing structure (I am not going to comment on that, as I consider it out of scope for this question); the end result would be that each injected GenericRepository instance has an injected WebsiteContext instance, but the WebsiteContext instances are shared (all the same instance). The upside is no more error, but the downside is that you do have to be aware that any changes to any entities will be persisted as soon as you execute the Save functionality.
Using multiple repositories causes the issue. Just use one repository (= one db context) and have different methods for getting the individual types.
E.g. _repository.Get(id)
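A minimal sketch of that idea (shown here with a generic getter rather than one method per type, and assuming the repository wraps a single WebsiteContext):
public class Repository : IDisposable
{
    private readonly WebsiteContext _context = new WebsiteContext();

    public T Get<T>(int id) where T : class
    {
        return _context.Set<T>().Find(id);
    }

    public void Commit()
    {
        _context.SaveChanges();
    }

    public void Dispose()
    {
        _context.Dispose();
    }
}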
It's way out of scope to point out how your current implementation could be made to work, but if you did want to use more than one context, you can, despite what others have said.
If you do, you will have to detach the entity from the previous context first.
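Purely to illustrate the mechanics (using the WebsiteContext from the question directly rather than the repositories, inside something like AssignUserRole), moving an entity between two EF6 contexts might look like this:
ApplicationRole role;

using (var readContext = new WebsiteContext())
{
    role = readContext.Set<ApplicationRole>().Find(model.RoleId);

    // Detach the role from the context that loaded it.
    readContext.Entry(role).State = EntityState.Detached;
}

using (var writeContext = new WebsiteContext())
{
    var user = writeContext.Set<User>().Find(model.UserId);

    // Attach the role to the context that will save the relationship.
    writeContext.Set<ApplicationRole>().Attach(role);
    user.Roles.Add(role);

    writeContext.SaveChanges();
}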
I'm starting a new small project with ASP.NET MVC and Entity Framework. (SQL Server - around 20 DB tables)
In past projects I’ve used Linq2SQL but it seems to be obsolete.
I've read a lot of posts on using the repository pattern with EF (pros and cons), and to me it seems better/simpler to code without the repository pattern.
I created the following project architecture :
namespace MySite.Models
{
    public class User
    {
        public Int32 ID { get; set; }
        public String Email { get; set; }
        public String Password { get; set; }
        public String Name { get; set; }
        public Int32 Gender { get; set; }
    }
}

namespace MySite.DAL
{
    public class Users
    {
        public static IEnumerable<User> GetUsers()
        {
            using (var context = new DatingSiteContext())
            {
                return context.Users.ToList();
            }
        }

        public static User GetUserByID(int id)
        {
            using (var context = new DatingSiteContext())
            {
                return context.Users.Find(id);
            }
        }
    }
}
namespace MySite.Controllers
{
    public class HomeController : Controller
    {
        public ActionResult Index()
        {
            ViewBag.Message = "Modify this template to jump-start your ASP.NET MVC application.";

            var users = DAL.Users.GetUsers();

            return View(users);
        }
    }
}
What are the disadvantages of using EF like this (except the lack of unit testing support)?
Is it wrong to create a new DbContext on each call to the DAL? Any performance hit?
Any other recommended structure for using EF? Examples? :)
Would you use Linq2SQL in a new project?
Thank you.
Edit:
The code inside GetUsers() and GetUserByID() is just an example; I understand it's bad practice to return all records from the db (paging or filtering is missing).
You actually just created a repository, only you call it a 'data access layer', which is, in my opinion, not a good name, since Entity Framework is the data access layer. A repository is an abstraction on top of a data access layer, Entity Framework in this case.
Is it wrong to create a new DbContext on each call to DAL ? Any Performance hit?
Nope, it's just fine, but it might cause trouble when you fetch an entity with one instance of DbContext and try to update it with another instance.
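For example, an update method along these lines (a sketch using the DatingSiteContext from the question) would have to attach the entity and mark it as modified, because the new context has never seen it:
public static void UpdateUser(User user)
{
    using (var context = new DatingSiteContext())
    {
        // The entity was loaded by a different, already disposed context,
        // so attach it here and tell EF to issue an UPDATE for it.
        context.Users.Attach(user);
        context.Entry(user).State = EntityState.Modified;
        context.SaveChanges();
    }
}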
Would you use Linq2SQL in a new project ?
Nope. Microsoft proposed Entity Framework as the successor of L2SQL, and active development of it has stopped.
Any other recommended structure for using EF ? Examples? :)
The approach you use, specific repositories, will result in a lot of redundant code. You could instead create a generic repository implementing an interface:
public interface IRepository<TEntity>
    where TEntity : class, new()
{
    IEnumerable<TEntity> GetAll();

    TEntity GetById(int id);

    IQueryable<TEntity> Table { get; }
}
And an implementation of this:
public class EfRepository<TEntity> : IRepository<TEntity>
    where TEntity : class, new()
{
    private readonly DatingSiteContext _context;

    public EfRepository()
    {
        _context = new DatingSiteContext();
    }

    private IDbSet<TEntity> Entities
    {
        get
        {
            return _context.Set<TEntity>();
        }
    }

    public IEnumerable<TEntity> GetAll()
    {
        return Entities.ToList();
    }

    public TEntity GetById(int id)
    {
        return Entities.Find(id);
    }

    public IQueryable<TEntity> Table
    {
        get { return Entities; }
    }
}
You can use this repository in your controller like this:
public class HomeController : Controller
{
    private readonly IRepository<User> _userRepository;

    public HomeController()
    {
        _userRepository = new EfRepository<User>();
    }

    public ActionResult Index()
    {
        var users = _userRepository.GetAll();
        var inactiveUsers = _userRepository.Table.Where(u => !u.Active).ToList();
    }
}
This generic repository allows you to create mocked repositories:
public class FakeUserRepository : IRepository<User>
{
    // ...
}
This approach might seem like a lot of code, but as the number of entity types grows it will save you a lot of work, since all you have to do is create an IRepository<> field in a controller. Yet you still have a lot of flexibility through the IQueryable<> property, which allows deferred execution.
I'm not saying this is the best method, just one I use regularly in projects. I have to say that I usually write a business (service) layer between the controller and the repositories; I keep my business logic and complex LINQ queries (and their execution) there. I also use an IoC container which handles the lifetime of my objects (instances of DbContext and services, for example). See this question for more information about that.
My thoughts
What are the disadvantages:
You can't really unit test anywhere that uses the static methods you have defined in your DAL.
They are also strongly coupled, making them more difficult to swap out at runtime if that became a requirement.
You may start to get additional complications if you need to commit several updates in a single transaction.
Is it wrong to create a new DbContext on each call?
No, this is fine. The DbContext is lightweight and meant to be used this way.
Other patterns
You already mentioned the repository pattern, which is pretty solid, especially when used with a unit of work pattern.
Would you use Linqtosql
No - Linq2SQL is pretty much done with; Entity Framework provides a more complete and generally better solution to this problem.
I would rethink how you implemented GetUsers(). You are calling ToList(), which will cause all the rows from the underlying table to be returned and stored in memory. If the table grows large enough, you will run into performance issues. It's better to return an IQueryable<User> instead and have your method return context.Users.
Of course, you'll then run into the problem that the context has already been disposed by the time you execute the IQueryable<>, so you'll need to handle the life cycle of the context in a different way.
If the project is small enough, you can just store an instance of the context at the controller level and dispose of it when the controller is being disposed. If you do that, make sure you don't do anything in your views that would cause additional queries to be executed (e.g. accessing a lazy-loaded collection on User, if one exists), or else that will error out.
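A sketch of that controller-scoped approach (hypothetical, kept deliberately small):
public class HomeController : Controller
{
    private readonly DatingSiteContext _context = new DatingSiteContext();

    public ActionResult Index()
    {
        // Deferred execution: the query runs when the view enumerates it,
        // which is fine because the context lives as long as the controller.
        var users = _context.Users.Where(u => u.Gender == 1);
        return View(users);
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            _context.Dispose();
        }
        base.Dispose(disposing);
    }
}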
I've encountered what seems to be a common problem: I am updating values in my database, but EF is using its original in-memory copy of the object and these changed values are not reflected in the displayed data. I understand why this is, but I can't figure out a way around it.
The most common solution seems to be to set MergeOption.NoTracking to turn off change tracking completely (or to use the AsNoTracking() extension method when querying) and force a refresh every time the object is accessed, which is fine for my purposes.
I've got a generic base repository which my other repositories inherit from:
public abstract class RepositoryBase<T> where T : class
{
    private readonly IDbSet<T> _dbset;
    private readonly IUnitOfWork _unitOfWork;

    protected RepositoryBase(IUnitOfWork unitOfWork)
    {
        _unitOfWork = unitOfWork;
        _dbset = _unitOfWork.Database.Set<T>();
    }

    public virtual IQueryable<T> All()
    {
        return _dbset;
    }

    // Some other IQueryable methods here (Query, GetByProductCode etc)

    public virtual T Get(long id)
    {
        return _dbset.Find(id);
    }
}
And a DbContext like this:
public class Db : DbContext
{
    private IDbSet<Product> _products;

    public IDbSet<Product> Products
    {
        get { return _products ?? (_products = DbSet<Product>()); }
    }

    public virtual IDbSet<T> DbSet<T>() where T : class
    {
        return Set<T>();
    }

    public virtual void Commit()
    {
        base.SaveChanges();
    }
}
If I change the All() method of my repository thus:
public virtual IQueryable<T> All()
{
    return _dbset.AsNoTracking();
}
I get the desired result - an update in the database is reflected when the page displaying the products is refreshed. However, I can't do this in the Get() method, as that extension method only works on an IQueryable.
Ideally I'd like to turn this off at the DbContext level as I will never need change tracking, but there doesn't seem to be an obvious way to do this, and there is pretty much zero documentation on the subject (unless someone can point me to some? Please!).
I tried adding a constructor to the DbContext with these configuration options disabled:
public Db()
{
    base.Configuration.ProxyCreationEnabled = false;
    base.Configuration.AutoDetectChangesEnabled = false;
}
But I must admit I'm only guessing as to what they really do (I only found them through looking at the source code), and they don't seem to have any effect anyway.
Any help would be greatly appreciated. If more info/code would help, please let me know.
If you want to force the context to get fresh data each time, you don't want to use the Find method. The Find method always queries the internal storage first. Use this instead:
public virtual T Get(long id)
{
    return All().SingleOrDefault(e => e.Id == id);
}
But I don't understand why you need this. What do you mean by:
an update in the database is reflected when the page displaying the products is refreshed
The context is a unit of work and should be used as one: in a web application or web service that means creating a new context instance per request; in a WinForms / WPF application it means using a context per logical block (per presenter, etc.). Because of that, you should need this only in very specific scenarios, yet you want it globally. Your description makes it sound like you are reusing the context across requests, which is a completely bad solution. There is no performance cost in recreating the context for each request.