Just wondering if I dispose of my DbContext object correctly here, or should I be using a using block instead?
public class RepoBankAccount : IBankAccount
{
    private AppDbContext db = null;

    public RepoBankAccount()
    {
        this.db = new AppDbContext();
    }

    public RepoBankAccount(AppDbContext db)
    {
        this.db = db;
    }

    public IEnumerable<BankAccount> ViewAllBankAccount()
    {
        return db.BankAccounts.ToList();
    }

    public BankAccount ViewBankAccount(long accountNumber)
    {
        return db.BankAccounts.Where(b => b.AccountNumber.Equals(accountNumber)).SingleOrDefault();
    }

    public void DeleteBankAccount(BankAccount bankAccount)
    {
        db.BankAccounts.Remove(bankAccount);
        Save();
    }

    public void InsertBankAccount(BankAccount bankAccount)
    {
        db.BankAccounts.Add(bankAccount);
        Save();
    }

    public void Save()
    {
        try
        {
            db.SaveChanges();
        }
        catch (Exception ex)
        {
            System.Console.WriteLine("Error:" + ex.Message);
        }
        finally
        {
            if (db != null)
                db.Dispose();
        }
    }
}
I read that I should not be calling Dispose manually, per
https://softwareengineering.stackexchange.com/questions/359667/is-it-ok-to-create-an-entity-framework-datacontext-object-and-dispose-it-in-a-us
But in some sample code I also notice this scaffolding, and it's not clear to me how it does the job on its own.
protected override void Dispose(bool disposing)
{
    if (disposing)
    {
        db.Dispose();
    }
    base.Dispose(disposing);
}
DbContexts are designed to be short-lived. The very first initialization and use of a DbContext carries a spin-up cost to resolve the entity mappings, but aside from that the context can be scoped to individual calls or sets of calls. Your code will work fine, and as long as your repo is disposed, the DbContext will be cleaned up. There are pitfalls with this approach, though: as the product matures it is easy to forget to dispose something, and these DbContexts can soak up a fair bit of memory if they are long-lived.
To avoid issues with entities that become disconnected from their DbContext, an entity should never leave the scope of its DbContext. If it does, you can run into errors when, for example, a lazy load gets triggered.
For instance, let's say I have a method in a Controller or such that does something like this:
(Note: I don't advocate ever returning Entities to a view, but for example's sake...)
public ActionResult View(long accountNumber)
{
    BankAccount bankAccount;
    using (var repo = new RepoBankAccount())
    {
        bankAccount = repo.ViewBankAccount(accountNumber);
    }
    return View(bankAccount);
}
The repo will be disposed, and if the bank account either has no references or all of its references are eager loaded, this call will work just fine. However, if there is a lazy-load call, the controller method will fail because the DbContext associated with the BankAccount was disposed.
This can be compensated for by ensuring the return occurs inside the scope of the using block:
public ActionResult View(long accountNumber)
{
    using (var repo = new RepoBankAccount())
    {
        BankAccount bankAccount = repo.ViewBankAccount(accountNumber);
        return View(bankAccount);
    }
}
To help avoid issues like this, it is generally a better idea to create POCO view model classes, populate them from the entities within the scope of the DbContext, and then return those view models. No surprise lazy-load hits, etc.
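As a rough sketch of that idea (BankAccountViewModel and its Balance property are hypothetical here, not part of the code above):
public class BankAccountViewModel
{
    public long AccountNumber { get; set; }
    public decimal Balance { get; set; }
}

public ActionResult View(long accountNumber)
{
    using (var db = new AppDbContext())
    {
        // Project to the view model while the context is still alive,
        // so nothing can trigger a lazy load after disposal.
        BankAccountViewModel model = db.BankAccounts
            .Where(b => b.AccountNumber == accountNumber)
            .Select(b => new BankAccountViewModel
            {
                AccountNumber = b.AccountNumber,
                Balance = b.Balance
            })
            .SingleOrDefault();
        return View(model);
    }
}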
Where this really starts to crumble is when you want to coordinate updates across entities and ensure they are committed or rolled back together. Each of your repo classes is going to have a separate DbContext instance.
The first approach to get familiar with in order to address this is Dependency Injection and Inversion of Control, particularly an IoC container such as Autofac, Unity, Ninject, or Castle Windsor. Using these, you can have your repository classes accept a DbContext as a dependency, and the container can scope a single instance of that dependency to a lifetime (such as per HTTP request). In this way, all of your repositories within a single request will be provided the same DbContext instance, and a call to SaveChanges() will attempt to commit all pending changes.
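For illustration, a minimal Autofac (MVC integration) sketch of that registration; AppDbContext, RepoBankAccount and IBankAccount are the types from the question, and the rest of the MVC bootstrapping (controller registration, etc.) is omitted:
var builder = new ContainerBuilder();

// One AppDbContext per HTTP request, shared by everything that asks for it.
builder.RegisterType<AppDbContext>().AsSelf().InstancePerRequest();

// The repository receives the context through its existing constructor overload.
builder.RegisterType<RepoBankAccount>().As<IBankAccount>().InstancePerRequest();

var container = builder.Build();
DependencyResolver.SetResolver(new AutofacDependencyResolver(container));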
A better pattern is the Unit of Work pattern, where the scope of the DbContext is moved outside of the repository and each repository is either provided a reference to the DbContext or can locate it (similar to how the IoC pattern works). The advantage of UoW patterns is that you can move control of the commit/rollback out to the consumer of the repositories. I promote the use of Mehdime's DbContextScope, since it negates the need to pass around references to the UoW/DbContext.
Mehdime DbContextScope (original EF6 GitHub repository)
DbContextScope EF Core supported port
I'm working on a classic .Net Framework Web API solution.
I have 3 layers. Let's call them
MVC - with POST, GET, UPDATE, DELETE controllers.
BIZZ - for business, with my service classes. My service classes are kind of repositories with CREATE, READ, UPDATE, DELETE and specific methods.
DATA - with POCO and definition of DB context.
I will not go into the EF layer; it is a classic Entity Framework project with POCOs. Here is a sample of a Service, starting with the BaseService class:
public abstract class Service : IDisposable
{
    protected DbContext dbContext = new DbContext();

    public void Dispose()
    {
        dbContext.Dispose();
    }
}
Then I have a cart service and an order service. They are similar in structure, so I will only write the code useful for this example.
public class CartService : Service
{
    public Cart Create(Cart cart)
    {
        // Create the cart
    }
    public Cart Read(Guid id)
    {
        // Read
    }
    public Cart Update(Cart cart)
    {
        // I do some check first then
    }
    public void Delete(Cart cart)
    {
        // Delete
    }

    public void Checkout(Cart cart)
    {
        // Validation of cart removed in this example
        dbContext.Cart.Attach(cart);
        cart.DateCheckout = DateTime.UtcNow;
        dbContext.Entry(cart).State = EntityState.Modified; // I think this line can be removed
        dbContext.SaveChanges();

        using (var orderService = new OrderService())
        {
            foreach (var order in cart.Orders)
            {
                order.DateCheckout = cart.DateCheckout;
                order.Status = OrderStatus.PD; // pending
                orderService.Update(order);
            }
        }
    }
}
public class OrderService : Service
{
    public Order Create(Order order)
    {
        // Create the order
    }
    public Order Read(Guid id)
    {
        // Read
    }
    public Order Update(Order order)
    {
        dbContext.Entry(order).State = EntityState.Modified;
        dbContext.SaveChanges();
        // More process here...
        return order;
    }
    public void Delete(Order order)
    {
        // Delete
    }
}
So, I have a service, the cart service, that calls another service, the order service. I must work like this because I cannot simply accept the cart and all the orders in it as they are. When I save a new order or update an existing order, I must create records in other tables in other databases (that code is not in my example). So, I repeat: I have a service that calls another service, and therefore I have 2 DbContexts. At best this just creates 2 contexts in memory; at worst it creates exceptions, like "you cannot attach an entity to 2 contexts" or "this entity is not in the context".
Well, I would like all my services to use the same context. I suppose you will all tell me to use Dependency Injection. Yes, well, OK, but I don't want to have to pass the context every time I create a new service. I don't want to have to do this:
public void Checkout(Cart cart)
{
// ...
using (var orderService = new OrderService(dbContext))
{
// ...
}
}
I would like to do something that impacts my base service only, if possible. A singleton maybe... At this point I can see your face. Yes, I know singletons are so bad. But I'm doing an IIS Web API; each request is a new instance, so I don't care about the impact of the singleton. And I can swap my database by changing the connection string in the config file, so that benefit of DI is there already. Well, I also know it is possible to have a singleton with DI. I just don't know how.
So, what can I do to be sure I share my dbContext with all my services?
Disclaimer: This example is not intended to be a "good" one and certainly does not follow best practices, but faced with an existing legacy code base which, judging from your example, already suffers from a number of questionable practices, this should get you past the multiple-context issue.
Essentially, if you're not already using an IoC container to perform dependency injection, then what you need is to introduce a unit of work to manage the scope of the DbContext, where your base Service class exposes a DbContext provided by the unit of work (essentially a DbContext registry).
For the unit of work, and assuming EF6, I would recommend Mehdime's DbContextScope, which is available as a NuGet package. Alternatively, you can find the source code on GitHub and implement something similar without too much trouble. I like this pattern because it leverages the CallContext to serve as the communication layer between the DbContextScope (unit of work) created by the DbContextScopeFactory and the AmbientDbContextLocator. This will probably take a little time to get your head around, but it injects very nicely into legacy applications where you want to leverage a unit of work and don't have dependency injection.
What it would look like:
In your Service class you would introduce the AmbientDbContextLocator to resolve your DbContext:
private readonly IAmbientDbContextLocator _contextLocator = new AmbientDbContextLocator();

protected DbContext DbContext
{
    get { return _contextLocator.Get<DbContext>(); }
}
And that's it. Later as you refactor to accommodate Dependency injection, just inject the AmbientDbContextLocator instead of 'new'ing it up.
Then, in your Web API controllers where you are using your services (not in the services themselves), you add the DbContextScopeFactory instance:
private readonly IDbContextScopeFactory _contextScopeFactory = new DbContextScopeFactory();
Lastly, in your API methods, when you want to call your services, you need to simply use the ContextScopeFactory to create a context scope. The AmbientDbContextLocators will retrieve the DbContext from this context scope. The context scope you create with the factory will be done in a using block to ensure your contexts are disposed. So, using your Checkout method as an example, it would look like:
In your Web API [HttpPost] Checkout() method:
using (var contextScope = _contextScopeFactory.Create())
{
    using (var service = new CartService())
    {
        service.Checkout(cart);
    }
    contextScope.SaveChanges();
}
Your cart service Checkout method would remain relatively unchanged; only instead of accessing dbContext as a field (new DbContext()), it will access the DbContext property, which gets the context through the context locator.
The Services can continue to call DbContext.SaveChanges(), but this isn't necessary and the changes will not be committed to the DB until the contextScope.SaveChanges() is called. Each service will have its own instance of the Context Locator rather than the DbContext and these will be dependent on you defining a ContextScope to function. If you call a Service method that tries to access the DbContext without being within a using (var contextScope = _contextScopeFactory.Create()) block you will receive an error. This way all of your service calls, even nested service calls (CartService calls OrderService) will be interacting with the same DbContext instance.
Even if you just want to read data, you can leverage a slightly faster DbContext using _contextScopeFactory.CreateReadOnly() which will help guard against unexpected/disallowed calls to SaveChanges().
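A small sketch of the read-only variant; CreateReadOnly() is part of the DbContextScope library, CartService is the example service from above, and cartId is assumed to be in scope:
using (_contextScopeFactory.CreateReadOnly())
{
    using (var service = new CartService())
    {
        // Reads resolve through the ambient context exactly as before...
        var cart = service.Read(cartId);
    }
    // ...but a read-only scope exposes no SaveChanges(), so nothing can be committed here.
}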
When using the ASP.NET Core stack, the tutorial for using EF with it defaults to using DI to provide your DB context, just not with a service layer. That said, it actually does the right thing for this out of the box. I'll give a brief rundown of the bare minimum necessary for this to work, using whatever the latest versions of ASP.NET Core Web API and EF Core were on NuGet at the time of writing.
First, let's get the boilerplate out of the way, starting with the model:
Models.cs
public class ShopContext : DbContext
{
    public ShopContext(DbContextOptions options) : base(options) {}

    // We add a GUID here so we're able to tell it's the same object later.
    public string Id { get; } = Guid.NewGuid().ToString();

    public DbSet<Cart> Carts { get; set; }
    public DbSet<Order> Orders { get; set; }
}

public class Cart
{
    public string Id { get; set; }
    public string Name { get; set; }
}

public class Order
{
    public string Id { get; set; }
    public string Name { get; set; }
}
Then some bare-bones services:
Services.cs
public class CartService
{
    ShopContext _ctx;

    public CartService(ShopContext ctx)
    {
        _ctx = ctx;
        Console.WriteLine($"Context in CartService: {ctx.Id}");
    }

    public async Task<List<Cart>> List() => await _ctx.Carts.ToListAsync();

    public async Task<Cart> Create(string name)
    {
        return (await _ctx.Carts.AddAsync(new Cart {Name = name})).Entity;
    }
}

public class OrderService
{
    ShopContext _ctx;

    public OrderService(ShopContext ctx)
    {
        _ctx = ctx;
        Console.WriteLine($"Context in OrderService: {ctx.Id}");
    }

    public async Task<List<Order>> List() => await _ctx.Orders.ToListAsync();

    public async Task<Order> Create(string name)
    {
        return (await _ctx.Orders.AddAsync(new Order {Name = name})).Entity;
    }
}
The only notable things here are: the context comes in as a constructor parameter as God intended, and we log the ID of the context so we can verify when it gets created and which instance each class receives.
Then our controller:
ShopController.cs
[ApiController]
[Route("[controller]")]
public class ShopController : ControllerBase
{
    ShopContext _ctx;
    CartService _cart;
    OrderService _order;

    public ShopController(ShopContext ctx, CartService cart, OrderService order)
    {
        Console.WriteLine($"Context in ShopController: {ctx.Id}");
        _ctx = ctx;
        _cart = cart;
        _order = order;
    }

    [HttpGet]
    public async Task<IEnumerable<string>> Get()
    {
        var carts = await _cart.List();
        var orders = await _order.List();
        return (from c in carts select c.Name).Concat(from o in orders select o.Name);
    }

    [HttpPost]
    public async Task Post(string name)
    {
        await _cart.Create(name);
        await _order.Create(name);
        await _ctx.SaveChangesAsync();
    }
}
As above, we take the context as a constructor parameter to triple-check it's what it should be; we also need it to call SaveChanges at the end of an operation. (You can refactor this out of controllers if you want to, but they'll work just fine as units of work for now.)
The part that ties this together is the DI configuration:
Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers();

    // Use whichever provider you have here, this is where you grab a connection string from the app configuration.
    services.AddDbContext<ShopContext>(options =>
        options.UseInMemoryDatabase("Initrode"));

    services.AddScoped<CartService>();
    services.AddScoped<OrderService>();
}
AddDbContext() defaults to registering the DbContext to be created per request by the container. ASP.NET Core provides the AddControllers method that puts the controllers into the DI container, and we also register our services manually.
The rest of Startup.cs I've left as-is.
Starting this up and opening https://localhost:5001/shop should log something like:
Context in CartService: b213966e-35f2-4cc9-83d1-98a5614742a3
Context in OrderService: b213966e-35f2-4cc9-83d1-98a5614742a3
Context in ShopController: b213966e-35f2-4cc9-83d1-98a5614742a3
with the same GUID for all three lines in a request, but a different GUID between requests.
A little additional explanation of what goes on above:
Registering a component in a container (using Add() and such above) means telling the container those components exist and that it should create them for you when asked, as well as what identifiers they're available under and how to create them. The defaults for this are more or less "make the component available as its class, and create it by calling its one public constructor, passing other registered components into it" - the container looks at the constructor signature to figure this out.
"Scoped" in an ASP.NET Core app means "per-request." I think in this case one could also use services with a transient lifetime - a new one created every time it's needed, but they'll still get the same DbContext as long as they're created while handling the same request. Which one to do is a design consideration; the main constraint is that you can't inject shorter-lived components into longer-lived components without having to use more complex techniques, which is why I favour having all components as short-lived as possible. In other words, I only make things longer-lived when they actually hold some state that needs to live for that time, while also doing that as sparingly as possible because state bad. (Just recently I had to refactor an unfortunate design where my services were singletons, but I wanted my repositories to be per-request so as to be able to inject the currently logged in user's information into the repository to be able to automatically add the "created by" and "updated by" fields.)
You'll note that with support for doing things this way being built-in to both ASP.NET Core and EF Core, there's actually very little extra code involved. Also, the only thing needed to go from "injecting a context into your controllers" (as the tutorial does) to "injecting a context into services that you use from your controllers" is adding the services into DI - since the controller and context are already under DI, anything new you add can be injected into them and vice versa.
This should give you a quick introduction into how to make things "just work" and shows you the basic use case of a DI container: you declaratively tell it or it infers "this is an X", "this is an Y", "this is a Z and it needs to be created using an X and a Y"; then when you ask the container to give you a Z, it will automagically first create an X and Y, then create Z with them. They also manage the scope and lifetime of these objects, i.e. only create one of a type for an API request. Beyond that it's a question of experience with them and familiarity with a given container - say Ninject and Autofac are much more powerful than the built-in one - but it's variations on the same idea of declaratively describing how to create an object possibly using other objects (its dependencies) and having the container "figure out" how to wire things together.
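To make that concrete, here is roughly what the container ends up doing for each request in the example above, written out by hand (a conceptual sketch only, not code you would add to the project):
// One "X" (the context) per request...
var options = new DbContextOptionsBuilder<ShopContext>()
    .UseInMemoryDatabase("Initrode")
    .Options;
var ctx = new ShopContext(options);

// ..."Y" components built from the same X...
var cart = new CartService(ctx);
var order = new OrderService(ctx);

// ...and finally "Z", wired up from all of the above.
var controller = new ShopController(ctx, cart, order);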
I'm trying to wrap a transaction around 2 or more database operations which occur in different repository classes. Each repository class uses a DbContext instance, using Dependency Injection. I'm using Entity Framework Core 2.1.
public PizzaService(IPizzaRepo pizzaRepo, IPizzaIngredientsRepo ingredientRepo)
{
    _pizzaRepo = pizzaRepo;
    _ingredientRepo = ingredientRepo;
}

public async Task SavePizza(PizzaViewModel pizza)
{
    using (var scope = new TransactionScope(TransactionScopeOption.Required,
        new TransactionOptions { IsolationLevel = IsolationLevel.ReadCommitted }))
    {
        int pizzaRows = await _pizzaRepo.AddEntityAsync(pizza.Pizza);
        int ingredientRows = await _ingredientRepo.PutIngredientsOnPizza(
            pizza.Pizza.PizzaId,
            pizza.Ingredients.Select(x => x.IngredientId).ToArray());
        scope.Complete();
    }
}
Obviously, if one of the operations fails, I want to roll back the entire thing.
Will this transaction scope be enough to roll back, or should the repository classes have transactions of their own?
Even if the above method works, are there better ways to implement transactions?
Repository patterns are great for enabling testing, but don't have each repository new up its own DbContext; share the context across repositories.
As a bare-bones example (assuming you are using DI/IoC)
The DbContext is registered with your IoC container with a lifetime scope of Per Request. So at the onset of the service call:
public PizzaService(PizzaDbContext pizzaContext, IPizzaRepo pizzaRepo, IPizzaIngredientsRepo ingredientRepo)
{
    _context = pizzaContext;
    _pizzaRepo = pizzaRepo;
    _ingredientRepo = ingredientRepo;
}

public async Task SavePizza(PizzaViewModel pizza)
{
    int pizzaRows = await _pizzaRepo.AddEntityAsync(pizza.Pizza);
    int ingredientRows = await _ingredientRepo.PutIngredientsOnPizza(
        pizza.Pizza.PizzaId,
        pizza.Ingredients.Select(x => x.IngredientId).ToArray());
    _context.SaveChanges();
}
Then in the repositories:
public class PizzaRepository : IPizzaRepository
{
    private readonly PizzaDbContext _pizzaDbContext = null;

    public PizzaRepository(PizzaDbContext pizzaDbContext)
    {
        _pizzaDbContext = pizzaDbContext;
    }

    public async Task<int> AddEntityAsync( /* params */ )
    {
        _pizzaDbContext.Pizzas.Add( /* pizza */ );
        // ...
    }
}
The trouble I have with this pattern is that it restricts the unit of work to the request, and only the request. You have to be aware of when and where SaveChanges occurs. You don't want repositories, for example, to call SaveChanges, as that could have side effects depending on what else had been changed in the context before that call.
As a result, I use a Unit of Work pattern to manage the lifetime scope of the DbContext(s), where repositories no longer get injected with a DbContext; they instead get a locator, and the services get a context scope factory (the unit of work). The implementation I use for EF6 is Mehdime's DbContextScope (https://github.com/mehdime/DbContextScope), and there are forks available for EF Core (https://www.nuget.org/packages/DbContextScope.EfCore/). With the DbContextScope, the service call looks more like:
public PizzaService(IDbContextScopeFactory contextScopeFactory, IPizzaRepo pizzaRepo, IPizzaIngredientsRepo ingredientRepo)
{
    _contextScopeFactory = contextScopeFactory;
    _pizzaRepo = pizzaRepo;
    _ingredientRepo = ingredientRepo;
}

public async Task SavePizza(PizzaViewModel pizza)
{
    using (var contextScope = _contextScopeFactory.Create())
    {
        int pizzaRows = await _pizzaRepo.AddEntityAsync(pizza.Pizza);
        int ingredientRows = await _ingredientRepo.PutIngredientsOnPizza(
            pizza.Pizza.PizzaId,
            pizza.Ingredients.Select(x => x.IngredientId).ToArray());
        contextScope.SaveChanges();
    }
}
Then in the repositories:
public class PizzaRepository : IPizzaRepository
{
    private readonly IAmbientDbContextLocator _contextLocator = null;

    private PizzaContext PizzaContext
    {
        get { return _contextLocator.Get<PizzaContext>(); }
    }

    public PizzaRepository(IAmbientDbContextLocator contextLocator)
    {
        _contextLocator = contextLocator;
    }

    public async Task<int> AddEntityAsync( /* params */ )
    {
        PizzaContext.Pizzas.Add( /* pizza */ );
        // ...
    }
}
This gives you a few benefits:
The control of the unit of work scope remains clearly in the service. You can call any number of repositories and the changes will be committed, or rolled back based on the determination of the service. (inspecting results, catching exceptions, etc.)
This model works extremely well with bounded contexts. In larger systems you may split different concerns across multiple DbContexts. The context locator serves as one dependency for a repository and can access any/all DbContexts. (Think logging, auditing, etc.)
There is also a slight performance/safety option for Read-based operations using the CreateReadOnly() scope creation in the factory. This creates a context scope that cannot be saved so it guarantees no write operations get committed to the database.
The IDbContextScopeFactory and IDbContextScope are easily mock-able, so your service unit tests can validate whether a transaction is committed or not. (Mock an IDbContextScope to assert SaveChanges, and mock an IDbContextScopeFactory to expect a Create and return the DbContextScope mock.) Between that and the repository pattern, there is no messy mocking of DbContexts.
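As a rough illustration of that last point, assuming Moq and xUnit, the PizzaService shape from above, int ids on the repo methods, and a hypothetical CreateSamplePizzaViewModel() helper:
[Fact]
public async Task SavePizza_commits_the_unit_of_work()
{
    var contextScope = new Mock<IDbContextScope>();
    var contextScopeFactory = new Mock<IDbContextScopeFactory>();
    contextScopeFactory
        .Setup(f => f.Create(It.IsAny<DbContextScopeOption>()))
        .Returns(contextScope.Object);

    var pizzaRepo = new Mock<IPizzaRepo>();
    pizzaRepo.Setup(r => r.AddEntityAsync(It.IsAny<Pizza>())).ReturnsAsync(1);
    var ingredientRepo = new Mock<IPizzaIngredientsRepo>();
    ingredientRepo
        .Setup(r => r.PutIngredientsOnPizza(It.IsAny<int>(), It.IsAny<int[]>()))
        .ReturnsAsync(1);

    var service = new PizzaService(contextScopeFactory.Object, pizzaRepo.Object, ingredientRepo.Object);

    await service.SavePizza(CreateSamplePizzaViewModel());

    // The service, not the repositories, owns the commit.
    contextScope.Verify(s => s.SaveChanges(), Times.Once());
}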
One caution that I see in your example is that your view model appears to be serving as a wrapper for your entity (PizzaViewModel.Pizza). I'd advise against ever passing an entity to the client; rather, let the view model represent just the data that is needed for the view. I outline the reasons for this here.
I have been trying to implement the Repository pattern along with dependency injection, but it looks like I am missing some simple step. Here is my code:
public class HomeController
{
    private readonly ILoggingRepository _loggingRepository;

    public HomeController(ILoggingRepository loggingRepository)
    {
        _loggingRepository = loggingRepository;
    }

    public void MyMethod()
    {
        string message = "MyMessage Called";
        _loggingRepository.LogMessage(message);
    }
}

// ILoggingRepository.cs
public interface ILoggingRepository
{
    void LogMessage(string message);
}

// LoggingRepository.cs
public class LoggingRepository : ILoggingRepository
{
    public void LogMessage(string message)
    {
        using (var dbContext = new DbContext())
        {
            var serviceLog = new Log() { Message = message, Logged = DateTime.UtcNow };
            dbContext.Logs.Add(serviceLog);
            dbContext.SaveChanges();
        }
    }
}
This works perfectly all right so far, but the problem arises when I make more than one repository call.
Now, I know that Entity Framework 6.0 has a built-in unit-of-work representation, so I didn't create a UnitOfWork interface or class.
But the problem appears when I do something like this in two different transactions. Let's say:
Area area = _areaRepository.GetArea(); // Line 1
area.Name = "NewArea"; // Line 2
_areaRepository.SaveArea(area); // Line 3
Now, because _areaRepository creates a new DbContext in Line 3, it doesn't change the name of the area, since it doesn't consider the entity to be EntityState.Modified.
I would have to set that explicitly, which doesn't feel correct.
So I guess I need to do all this in a single transaction. Where am I going wrong here?
What is the correct and best way to achieve this? Should I also inject my DbContext into the repository?
This is how I do it every time:
I don't use Repository or Unit of Work layers, because the Entity Framework DbContext already implements those patterns. So, I only have a Service layer:
public interface IBaseService<VO, ENT>
{
    IQueryable<VO> GetAll();
    VO Get(object id);
}

public abstract class BaseService<VO, ENT> : IBaseService<VO, ENT> where ENT : class
{
    MyContext db;

    public BaseService(MyContext db)
    {
        this.db = db;
    }

    public IQueryable<VO> GetAll()
    {
        return db.Set<ENT>().ProjectTo<VO>();
    }

    public abstract VO Get(object id);
}
A service class has a dbContext injected in the constructor. These classes are located in a Service library. Then, how the dbContext and the service are resolved is a problem for the project that will be using them. The ProjectTo method is an extension for IQueryable from the AutoMapper NuGet package. For example:
A Windows service needs all service instances in the same thread to share the same dbContext. So, in the Windows service project, I use Ninject (https://www.nuget.org/packages/Ninject/4.0.0-beta-0134). This library is a dependency resolver, which I use to configure how dependencies are built by creating a kernel, like this:
var kernel = new StandardKernel();
kernel.Bind<MyContext>().ToSelf().InThreadScope();
kernel.Bind<IServiceImplInterface>().To<ServiceImplClass>().InThreadScope();
If you are creating a web project, you will need to install additional NuGet packages (Ninject.Web.Common, Ninject.Web.Common.WebHost, Ninject.MVC5) that provide an .InRequestScope() method for the binding configuration, like this:
var kernel = new StandardKernel();
kernel.Bind<MyContext>().ToSelf().InRequestScope();
kernel.Bind<IServiceImplInterface>().To<ServiceImplClass>().InRequestScope();
You need to set up the kernel when the app starts up. In a web project that is in Global.asax; in a Windows service project it should be in the service constructor:
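A rough sketch only: the Ninject.MVC5 package normally generates an App_Start/NinjectWebCommon.cs bootstrapper that does most of this for you, and the exact wiring differs per project type.
public class MvcApplication : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        var kernel = new StandardKernel();
        kernel.Bind<MyContext>().ToSelf().InRequestScope();
        kernel.Bind<IServiceImplInterface>().To<ServiceImplClass>().InRequestScope();

        // Let MVC build controllers (and therefore services) through Ninject.
        DependencyResolver.SetResolver(new NinjectDependencyResolver(kernel));
    }
}

// In a Windows service you would instead keep the kernel as a field and resolve
// the root service from it, e.g. var service = kernel.Get<IServiceImplInterface>();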
You can visit www.ninject.org/learn.html to learn more about Ninject. But there are others like Autofac or Castle Windsor; it is up to you. If you would like to keep using the repository pattern, just use Ninject to inject the repositories into the service layer, like I did with the dbContext.
The best approach is to have one instance of the DbContext, injecting it into each repository implementation. That way you have a single instance of the database context, so EF is able to track changes on the entity objects.
If you need to use isolated DbContexts as in your example, then you need to explicitly set the state of the object to Modified.
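A minimal sketch of what SaveArea would then have to do when the entity was loaded by a different, already-disposed context (the Areas DbSet name is an assumption; this mirrors the question's repository style):
public void SaveArea(Area area)
{
    using (var dbContext = new DbContext())
    {
        // Setting the state attaches the detached entity and marks every property as changed.
        dbContext.Entry(area).State = EntityState.Modified;
        dbContext.SaveChanges();
    }
}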
Depending on the type of project, you should give the context a specific scope. For example, for web applications one option is to use an instance per web request (per lifetime scope). Check this URL, where you can see a good explanation of the different instance scopes.
The using statement simply creates a new scope, executing the Dispose() method after the code block. EF does a lot in the background to maintain the unit of work and the state of the objects, but in your case, with the using, you are not taking advantage of this feature.
First, a DbContext is a repository. If you want to wrap it in a custom repository, they should have the same lifecycle.
Second, your unit of work is your controller. The repository should be scoped to the unit of work.
This means that your repository needs to be Disposable, since the DbContext is.
So something like:
public interface ILoggingRepository : IDisposable
{
    void LogMessage(string message);
}

// LoggingRepository.cs
public class LoggingRepository : ILoggingRepository
{
    MyDbContext db;

    public LoggingRepository(MyDbContext db)
    {
        this.db = db;
    }

    public void Dispose()
    {
        db.Dispose();
    }

    public void LogMessage(string message)
    {
        var serviceLog = new MonitoringServiceLog() { Message = message, Logged = DateTime.UtcNow };
        db.MonitoringServiceLogs.Add(serviceLog);
        db.SaveChanges();
    }
}
If your ILoggingRepository weren't backed by a database, it might be a file or something else that is expensive to create or open and needs to be closed.
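To make the "controller as unit of work" part concrete, this is roughly what the consuming side could look like in MVC 5 (a sketch; a DI container would normally create the repository per request, and the Dispose override is the same scaffolding pattern shown earlier):
public class HomeController : Controller
{
    private readonly ILoggingRepository _loggingRepository;

    public HomeController(ILoggingRepository loggingRepository)
    {
        _loggingRepository = loggingRepository;
    }

    public void MyMethod()
    {
        _loggingRepository.LogMessage("MyMessage Called");
    }

    // The controller is the unit of work: when MVC disposes it at the end of the
    // request, the repository (and therefore the DbContext) is disposed with it.
    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            _loggingRepository.Dispose();
        }
        base.Dispose(disposing);
    }
}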
If there is a class which works as a data access layer and provides CRUD operations for entities, which version is best practice when considering performance and a multi-threaded environment (i.e. methods of this class are called by multiple threads simultaneously)?
Version 1:
DbContext created at class level, shared by all the methods:
class EmployeeService
{
    private DbContext db = new DbContext();

    public Employee GetEmployee(int id)
    {
        return db.Employees.Find(id);
    }

    public void AddEmployee(Employee employee)
    {
        db.Employees.Add(employee);
        db.SaveChanges();
    }
}
Version 2:
DbContext created per method call:
class EmployeeService
{
    public Employee GetEmployee(int id)
    {
        using (DbContext db = new DbContext())
        {
            return db.Employees.Find(id);
        }
    }

    public void AddEmployee(Employee employee)
    {
        using (DbContext db = new DbContext())
        {
            db.Employees.Add(employee);
            db.SaveChanges();
        }
    }
}
UPDATE:
Maybe the question posted is too generic in scope, which leads to several points to consider.
The point of interest is the cost of instantiating a DbContext object. Can it be created per call (Version 2), or is it a heavyweight object that is better created as a few instances and shared across different calls (Version 1)?
There's even a third approach based on manual or automatic dependency injection:
public interface ISomeService
{
    // Interface members
}

public class SomeService : ISomeService
{
    public SomeService(DbContext dbContext)
    {
        DbContext = dbContext;
    }

    private DbContext DbContext { get; }
}
Then SomeService won't be responsible for defining the lifetime of the injected DbContext; an external class decides that instead.
That way, your service focuses on doing just what it's meant to do (working with the domain and reading/writing data).
Depending on the execution environment, you'll want different DbContext lifestyles: per service instance, per request, per thread... There are a lot of choices here, depending on the case.
Perhaps you're not considering another scenario: a shared transaction between two or more services. You would need to transfer the responsibility of instantiating the DbContext to an upper layer, then inject the same DbContext into all participating services, and commit or abandon the whole transaction globally.
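A rough sketch of that last scenario; AnotherService and the two Do... methods are hypothetical, DbContext stands for your derived context, and only the shape of the coordination matters:
using (var dbContext = new DbContext())
using (var transaction = dbContext.Database.BeginTransaction())
{
    var first = new SomeService(dbContext);
    var second = new AnotherService(dbContext);   // both share the same context

    first.DoFirstPartOfTheWork();
    second.DoSecondPartOfTheWork();

    dbContext.SaveChanges();
    transaction.Commit();   // disposing without Commit() rolls the whole thing back
}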
Performance of EF is made up of several factors, scope of the DbContext being one of them.
Some background information regarding the scope is found here: https://msdn.microsoft.com/en-us/data/jj729737.aspx
Scope is not only about performance; it's also about the objects returned. If lazy loading is applied and the DbContext is disposed before you access some navigation properties, it will throw an exception. See http://www.entityframeworktutorial.net/EntityFramework4.3/lazy-loading-with-dbcontext.aspx
You could write code like this:
public class EmployeeService
{
    public EmployeeDto GetEmployee(int id)
    {
        using (DbContext db = new DbContext())
        {
            return db.Employees.Select(e =>
                new EmployeeDto
                {
                    Id = e.Id,
                    Name = e.Name,
                    Department = e.Department.Name
                }).First(e => e.Id == id);
        }
    }
}

public class EmployeeDto
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Department { get; set; }
}
So rather than returning the whole entity, you can use projection to limit the set of data that is returned. You can use this to reduce load on the database server, since the queries will be less verbose, and it also helps to load all the required data before disposing the context. See http://www.entityframeworktutorial.net/querying-entity-graph-in-entity-framework.aspx for some examples of generated queries.
So my advice is to limit the scope of the DbContext. You can inject the DbContext, but then you will have no control over its lifetime, which can lead to errors like this: http://wallacekelly.blogspot.nl/2012/01/linq-to-entities-objectdisposedexceptio.html
But it all depends on your needs and on what kind of service it is you are building.
I'm having an issue in Entity Framework 6 where an exception is consistently thrown. For the most part the application works perfectly fine until I try adding a user to a role via a linking table.
The error being thrown is the following:
The relationship between the two objects cannot be defined because they are attached to different ObjectContext objects.
The functionality will happily add the user to the role virtually but as soon as SaveChanges() is called the process falls over.
I'm aware of the how and why of the above error, and after doing some research it's due to the context not being disposed of correctly. So, following on from that and looking into the DbContext setup, I've realised IDisposable wasn't added to the configuration. Unfortunately, no matter what I've tried, incorporating IDisposable at any point within the application still doesn't dispose of the contexts correctly.
So after spending a fair bit of time and having no luck via Google I'm wondering if any of you have a solution or are able to point me in the right direction.
The following is a cutdown version of the Data Layer classes I've implemented:
public class GenericRepository<T> : WebsiteContext, IGenericRepository<T> where T : class
{
    public virtual void Commit()
    {
        SaveChanges();
    }

    public virtual void Delete(int id)
    {
        var record = Set<T>().Find(id);
        if (record == null)
            throw new Exception("Some Message");
        Set<T>().Remove(record);
    }

    // ... ETC
}

public interface IGenericRepository<T> where T : class
{
    void Commit();
    // ... ETC
}

public class WebsiteContext : DbContext, IWebsiteContext
{
    static WebsiteContext()
    {
        Database.SetInitializer<WebsiteContext>(null);
    }

    public WebsiteContext() : base("Name=WebsiteContext") { }

    public IDbSet<User> Users { get; set; }
    // ... ETC

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // ... ETC
    }
}
This implementation is loosely based around the following Stackoverflow question.
Entity Framework 6 Code First - Is Repository Implementation a Good One?
The following is a condensed version of the Service Layer class and method which is causing the issue.
private IGenericRepository<User> _userRepository;
private IGenericRepository<ApplicationUserSetting> _userSettingRepository;
private IGenericRepository<ApplicationRole> _roleRepository;

public UserManagementService()
{
    _userRepository = new GenericRepository<User>();
    _roleRepository = new GenericRepository<ApplicationRole>();
    _userSettingRepository = new GenericRepository<ApplicationUserSetting>();
}

public void AssignUserRole(AssignRoleModel model)
{
    var user = _userRepository.GetById(model.UserId);
    if (user == null)
        return;

    var role = _roleRepository.GetById(model.RoleId);
    if (role == null)
        return;

    user.Roles.Add(role);
    _userRepository.Commit();
}
The issue, just like the error states, is that you have multiple DbContext instances fetching your entities for you. Each fetched entity is then associated with the DbContext instance that retrieved it. If you want to persist changes to these entities, it has to occur on the DbContext instance each one is associated with, OR you have to attach the entity to the DbContext instance it is not currently associated with.
If you are trying to keep it simple, I recommend you implement a DI framework like Autofac. You can then have a single DbContext instance created per request and have it injected everywhere you need it. It will allow you to keep your existing structure (I am not going to comment on that, as I consider it out of scope for this question); the end result would be that each injected GenericRepository instance has an injected WebsiteContext instance, but the WebsiteContext instances are shared (all the same instance). The upside is no more error; the downside is that you do have to be aware that any changes to any entities will be persisted as soon as you execute the Save functionality.
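A rough sketch of that suggested shape: the repository takes the shared context instead of being one. GetById is illustrative only, and the Autofac calls assume the MVC integration package:
public class GenericRepository<T> : IGenericRepository<T> where T : class
{
    private readonly WebsiteContext _context;

    public GenericRepository(WebsiteContext context)
    {
        _context = context;
    }

    public virtual void Commit()
    {
        _context.SaveChanges();
    }

    public virtual T GetById(int id)
    {
        return _context.Set<T>().Find(id);
    }
}

// Registration: one WebsiteContext per request, shared by every repository
// resolved during that request.
var builder = new ContainerBuilder();
builder.RegisterType<WebsiteContext>().AsSelf().InstancePerRequest();
builder.RegisterGeneric(typeof(GenericRepository<>))
       .As(typeof(IGenericRepository<>))
       .InstancePerRequest();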
Using multiple repositories causes the issue. Just use one repository (= one db context) and have different methods for getting the individual types.
E.g. _repository.Get(id)
It's way out of scope to point out how your current implementation could be made to work, but if you did want to use more than one context, you can, despite what others have said.
If you do, you will have to detach the entity from the previous context first.
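A rough sketch of what that looks like with the entities from the question, assuming roleContext and userContext stand for the two underlying WebsiteContext instances and that user was loaded by userContext:
// Detach the role from the context that loaded it...
roleContext.Entry(role).State = EntityState.Detached;

// ...attach it to the context that is tracking the user, then save there.
userContext.Set<ApplicationRole>().Attach(role);
user.Roles.Add(role);
userContext.SaveChanges();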