In this question I asked about NHibernate session lifetime. I'm using a desktop application, but with client/server separation, so the conclusion is that I will use one session per server request, as the server side is where all the NHibernate magic happens.
My problem now is how to handle the session's lifetime. I've had trouble before with referenced data not loading when the session is closed prematurely. When debugging I see the following on my referenced classes, so the referenced data hasn't been loaded yet:
base {NHibernate.HibernateException} = {"Initializing[MyNamespace.Foo#14]-failed to lazily initialize a collection of role: MyNamespace.Foo.Bars, no session or session was closed"}
From what I understand, not everything is loaded even though I commit the transaction. So I've learned that I need to keep my session open for a while, but for how long?
My question is basically whether I'm handling the lifetime properly, or what I should change to be on the right track. Honestly I can't see how this can be wrong, so what I'd really like is a function call to ensure that the referenced data is fetched. I'm not using lazy loading, so I thought the references would be loaded immediately?
Current architecture: a "service behavior" class handles the transaction. It is IDisposable, so the service itself wraps it in a using clause. The NhibernateSessionFactory provides a static factory, which will hence be reused.
// This is the service - the function called "directly" through my WCF service.
public IList<Foo> SearchForFoo(string searchString)
{
    using (var serviceBehavior = new FooServiceBehavior(new NhibernateSessionFactory()))
    {
        return serviceBehavior.SearchForFoo(searchString);
    }
}
public class FooServiceBehavior : IDisposable
{
    private readonly ISession _session;

    public FooServiceBehavior(INhibernateSessionFactory sessionFactory)
    {
        _session = sessionFactory.OpenSession();
    }

    public void Dispose()
    {
        _session.Dispose();
    }

    public IList<Foo> SearchForFoo(string searchString)
    {
        using (var tx = _session.BeginTransaction())
        {
            var result = _session.CreateQuery("from Foo where Name=:name").SetString("name", searchString).List<Foo>();
            tx.Commit();
            return result;
        }
    }
}
It turns out I'm doing lazy loading after all. I had the following mapping:
public class FooMapping : ClassMap<Foo>
{
    public FooMapping()
    {
        Not.LazyLoad();
        Id(c => c.Id).GeneratedBy.HiLo("1");
        Map(c => c.Name).Not.Nullable().Length(100);
        HasMany(x => x.Bars).Cascade.All();
    }
}
I assumed Not.LazyLoad() disabled lazy loading altogether, but apparently not for referenced collections. I disabled lazy loading on the reference as well, and this seems to have fixed the issue.
public class FooMapping : ClassMap<Foo>
{
    public FooMapping()
    {
        Not.LazyLoad();
        Id(c => c.Id).GeneratedBy.HiLo("1");
        Map(c => c.Name).Not.Nullable().Length(100);
        HasMany(x => x.Bars).Cascade.All().Not.LazyLoad(); // <----------
    }
}
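For completeness, the same effect can also be achieved per query instead of in the mapping, by eagerly fetching the collection with HQL's standard "left join fetch" (a sketch against the query from the service above, with the mapping left lazy):

var result = _session
    .CreateQuery("from Foo f left join fetch f.Bars where f.Name = :name")
    .SetString("name", searchString)
    .List<Foo>();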
Thanks for your time, and I'll still be glad to see your opinions to whether my given structure is reasonable.
If XML files are used for mapping, you can set lazy="false" on the bag.
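For example, the relevant fragment of an .hbm.xml mapping might look like this (a sketch; the key column name is an assumption):

<bag name="Bars" cascade="all" lazy="false">
  <key column="FooId" />
  <one-to-many class="MyNamespace.Bar" />
</bag>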
Suppose there is a class which works as a data access layer and provides CRUD operations for entities. Which version is the best practice when considering performance and a multi-threaded environment (i.e. the methods of this class are called by multiple threads simultaneously)?
Version 1:
DbContext created at class level, shared by all the methods:
class EmployeeService
{
    private DbContext db = new DbContext();

    public Employee GetEmployee(int id)
    {
        return db.Employees.Find(id);
    }

    public void AddEmployee(Employee employee)
    {
        db.Employees.Add(employee);
        db.SaveChanges();
    }
}
Version 2:
DbContext created per method call:
class EmployeeService
{
    public Employee GetEmployee(int id)
    {
        using (DbContext db = new DbContext())
        {
            return db.Employees.Find(id);
        }
    }

    public void AddEmployee(Employee employee)
    {
        using (DbContext db = new DbContext())
        {
            db.Employees.Add(employee);
            db.SaveChanges();
        }
    }
}
UPDATE:
Maybe the question as posted is too generic in scope, which leads to several points to consider.
The point of interest is the cost of instantiating a DbContext object. Can it be created per request (Version 2), or is it a heavyweight object that is better created as a few instances and shared across different calls (Version 1)?
There's even a third approach based on manual or automatic dependency injection:
public interface ISomeService
{
    // Interface members
}

public class SomeService : ISomeService
{
    public SomeService(DbContext dbContext)
    {
        DbContext = dbContext;
    }

    private DbContext DbContext { get; }
}
Then SomeService won't be responsible for defining the lifetime of the injected DbContext; an external class will do that.
That way, your service focuses on doing just what it's meant to do (working with the domain and reading/writing data).
Depending on the execution environment, you'll want different DbContext lifestyles: per service instance, per request, per thread... There are a lot of choices here depending on the case.
Perhaps you're not considering another scenario: a shared transaction between two or more services. You would need to transfer the responsibility of instantiating the DbContext to an upper layer, then inject the same DbContext into all participating services, and confirm or abandon the whole transaction globally.
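A minimal sketch of that shared-transaction idea (SecondService is a hypothetical service that takes a DbContext just like SomeService above; the connection string name is illustrative):

using (var dbContext = new DbContext("name=MyConnection"))
using (var transaction = dbContext.Database.BeginTransaction())
{
    try
    {
        var first = new SomeService(dbContext);
        var second = new SecondService(dbContext); // hypothetical, same pattern as SomeService

        // ... both services read and write through the same shared context ...

        dbContext.SaveChanges();
        transaction.Commit();   // confirm the whole unit of work
    }
    catch
    {
        transaction.Rollback(); // or abandon it globally
        throw;
    }
}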
Performance of EF is made up of several factors, scope of the DbContext being one of them.
Some background information regarding the scope is found here: https://msdn.microsoft.com/en-us/data/jj729737.aspx
Scope is not only about performance; it's also about the objects returned. If lazy loading is applied and the DbContext is disposed before you access some navigation properties, you will get an exception. See http://www.entityframeworktutorial.net/EntityFramework4.3/lazy-loading-with-dbcontext.aspx
You could write code like this:
public class EmployeeService
{
    public EmployeeDto GetEmployee(int id)
    {
        using (DbContext db = new DbContext())
        {
            return db.Employees.Select(e =>
                new EmployeeDto
                {
                    Id = e.Id,
                    Name = e.Name,
                    Department = e.Department.Name
                }).First(e => e.Id == id);
        }
    }
}
public class EmployeeDto
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Department { get; set; }
}
So rather than returning the whole object, you can use projection to limit the set of data that is returned. You can use this to reduce load on the database server, as the queries will return only the required columns, and it also ensures all the required data is loaded before the context is disposed. See http://www.entityframeworktutorial.net/querying-entity-graph-in-entity-framework.aspx for some examples of generated queries.
So my advice is to limit the scope of the DbContext. You can inject the DbContext, but then you will have no control over it, which can lead to errors like this: http://wallacekelly.blogspot.nl/2012/01/linq-to-entities-objectdisposedexceptio.html
But it all depends on your needs and on what kind of service it is you are building.
I'm having an issue in Entity Framework 6 where an exception is consistently thrown. For the most part the application works perfectly fine until I try adding a user to a role via a linking table.
The error being thrown is the following:
The relationship between the two objects cannot be defined because they are attached to different ObjectContext objects.
The code will happily add the user to the role in memory, but as soon as SaveChanges() is called the process falls over.
I'm aware of the how and why of the above error; after doing some research, it's due to the context not being disposed of correctly. Following on from that and looking into the DbContext setup, I've realised IDisposable wasn't added to the configuration. Unfortunately, no matter what I've tried, incorporating IDisposable at any point within the application still doesn't dispose of the contexts correctly.
So after spending a fair bit of time and having no luck via Google I'm wondering if any of you have a solution or are able to point me in the right direction.
The following is a cutdown version of the Data Layer classes I've implemented:
public class GenericRepository<T> : WebsiteContext, IGenericRepository<T> where T : class
{
    public virtual void Commit()
    {
        SaveChanges();
    }

    public virtual void Delete(int id)
    {
        var record = Set<T>().Find(id);

        if (record == null)
            throw new Exception("Some Message");

        Set<T>().Remove(record);
    }

    // ... ETC
}

public interface IGenericRepository<T> where T : class
{
    void Commit();

    // ... ETC
}
public class WebsiteContext : DbContext, IWebsiteContext
{
    static WebsiteContext()
    {
        Database.SetInitializer<WebsiteContext>(null);
    }

    public WebsiteContext() : base("Name=WebsiteContext") { }

    public IDbSet<User> Users { get; set; }

    // ... ETC

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // ... ETC
    }
}
This implementation is loosely based on the following Stack Overflow question:
Entity Framework 6 Code First - Is Repository Implementation a Good One?
The following is a condensed version of the Service Layer class and method which is causing the issue.
private IGenericRepository<User> _userRepository;
private IGenericRepository<ApplicationUserSetting> _userSettingRepository;
private IGenericRepository<ApplicationRole> _roleRepository;

public UserManagementService()
{
    _userRepository = new GenericRepository<User>();
    _roleRepository = new GenericRepository<ApplicationRole>();
    _userSettingRepository = new GenericRepository<ApplicationUserSetting>();
}

public void AssignUserRole(AssignRoleModel model)
{
    var user = _userRepository.GetById(model.UserId);

    if (user == null)
        return;

    var role = _roleRepository.GetById(model.RoleId);

    if (role == null)
        return;

    user.Roles.Add(role);

    _userRepository.Commit();
}
The issue, just like the error states, is that you have multiple DbContext instances fetching your entities. Each fetched entity is associated with the DbContext instance that retrieved it. If you want to persist changes to these entities, it has to happen on the DbContext instance they are associated with, or you have to attach them to the other DbContext instance first.
If you are trying to keep it simple, I recommend you use a DI framework like Autofac. You can then have a single DbContext instance created per request and have it injected everywhere you need it. It will allow you to keep your existing structure (I am not going to comment on that, as I consider it out of scope for this question); the end result is that each injected GenericRepository instance has an injected WebsiteContext instance, but the WebsiteContext instances are shared (all the same instance). The upside is no more errors; the downside is that you have to be aware that any changes to any entities will be persisted as soon as you execute the Save functionality.
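A minimal Autofac registration sketch of that idea for ASP.NET MVC (this assumes GenericRepository is refactored to receive the WebsiteContext through its constructor instead of inheriting from it):

// In the container setup, e.g. Application_Start
var builder = new ContainerBuilder();

// One WebsiteContext per HTTP request, shared by every repository resolved in that request
builder.RegisterType<WebsiteContext>().AsSelf().InstancePerRequest();
builder.RegisterGeneric(typeof(GenericRepository<>))
    .As(typeof(IGenericRepository<>))
    .InstancePerRequest();

var container = builder.Build();
DependencyResolver.SetResolver(new AutofacDependencyResolver(container));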
Using multiple repositories causes the issue. Just use one repository (= one db context) and have different methods for getting the individual types.
E.g. _repository.GetUser(id), or a generic _repository.Get<User>(id)
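A minimal sketch of that single-repository idea, reusing the WebsiteContext from the question (the generic Get method is one way to expose the individual types):

public class Repository : IDisposable
{
    private readonly WebsiteContext _db = new WebsiteContext();

    public T Get<T>(int id) where T : class
    {
        return _db.Set<T>().Find(id);
    }

    public void Commit()
    {
        _db.SaveChanges();
    }

    public void Dispose()
    {
        _db.Dispose();
    }
}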
It's way out of scope to point out how your current implementation could be made to work, but if you did want to use more than one context, you can, despite what others have said.
If you do, you will have to detach the entity from the previous context first.
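A sketch of what that looks like with the EF6 DbContext API (reusing the question's WebsiteContext; roleId is illustrative, and EntityState lives in System.Data.Entity):

using (var source = new WebsiteContext())
using (var target = new WebsiteContext())
{
    var role = source.Set<ApplicationRole>().Find(roleId);

    // Detach the entity from the context that loaded it...
    source.Entry(role).State = EntityState.Detached;

    // ...then attach it to the context that will track and save it
    target.Set<ApplicationRole>().Attach(role);
}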
I'm starting a new small project with ASP.NET MVC and Entity Framework. (SQL Server - around 20 DB tables)
In past projects I’ve used Linq2SQL but it seems to be obsolete.
I've read a lot of posts on using the repository pattern with EF (pros and cons), but for me it seems better/simpler to code without the repository pattern.
I created the following project architecture:
namespace MySite.Models
{
    public class User
    {
        public Int32 ID { get; set; }
        public String Email { get; set; }
        public String Password { get; set; }
        public String Name { get; set; }
        public Int32 Gender { get; set; }
    }
}
namespace MySite.DAL
{
    public class Users
    {
        public static IEnumerable<User> GetUsers()
        {
            using (var context = new DatingSiteContext())
            {
                return context.Users.ToList();
            }
        }

        public static User GetUserByID(int id)
        {
            using (var context = new DatingSiteContext())
            {
                return context.Users.Find(id);
            }
        }
    }
}
namespace MySite.Controllers
{
    public class HomeController : Controller
    {
        public ActionResult Index()
        {
            ViewBag.Message = "Modify this template to jump-start your ASP.NET MVC application.";

            var users = DAL.Users.GetUsers();

            return View(users);
        }
    }
}
What are the disadvantages of using EF like this? (Except the lack of unit-testing support.)
Is it wrong to create a new DbContext on each call to the DAL? Any performance hit?
Any other recommended structure for using EF? Examples? :)
Would you use Linq2SQL in a new project?
Thank you.
Edit:
The code inside GetUsers() and GetUserByID() is just an example; I understand it's bad practice to return all records from the db (paging or filtering is missing).
You actually just created a repository, only you call it a 'data access layer', which is, in my opinion, not a good name, since Entity Framework is the data access layer. A repository is an abstraction on top of a data access layer, Entity Framework in this case.
Is it wrong to create a new DbContext on each call to the DAL? Any performance hit?
Nope, it's just fine, but it might cause trouble when you fetch an entity in one instance of DbContext, and try to update it in another instance.
Would you use Linq2SQL in a new project?
Nope, Microsoft proposed Entity Framework as the successor to Linq2SQL, and active development of Linq2SQL has stopped.
Any other recommended structure for using EF? Examples? :)
The approach you use, specific repositories, will result in a lot of redundant code. You could create a generic repository implementing an interface:
public interface IRepository<TEntity>
    where TEntity : class, new()
{
    IEnumerable<TEntity> GetAll();
    TEntity GetById(int id);
    IQueryable<TEntity> Table { get; }
}
And an implementation of this:
public class EfRepository<TEntity> : IRepository<TEntity>
    where TEntity : class, new()
{
    private readonly DatingSiteContext _context;

    public EfRepository()
    {
        _context = new DatingSiteContext();
    }

    private IDbSet<TEntity> Entities
    {
        get
        {
            return _context.Set<TEntity>();
        }
    }

    public IEnumerable<TEntity> GetAll()
    {
        return Entities.ToList();
    }

    public TEntity GetById(int id)
    {
        return Entities.Find(id);
    }

    public IQueryable<TEntity> Table
    {
        get { return Entities; }
    }
}
You can use this repository in your controller like this:
public class HomeController : Controller
{
    private readonly IRepository<User> _userRepository;

    public HomeController()
    {
        _userRepository = new EfRepository<User>();
    }

    public ActionResult Index()
    {
        var users = _userRepository.GetAll();
        var inactiveUsers = _userRepository.Table.Where(u => !u.Active).ToList();

        return View(users);
    }
}
This generic repository allows you to create mocked repositories:
public class FakeUserRepository : IRepository<User>
{
    // ...
}
This approach might seem like a lot of code, but as the number of entity types grows, it will save you a lot of work, since all you have to do is create an IRepository<> field in a controller. Yet you have a lot of flexibility with the IQueryable<> property, which allows deferred execution.
I'm not saying this is the best method, just one I use regularly in projects. I have to say that I usually write a business (service) layer between the controllers and the repositories. I keep my business logic and complex LINQ queries (and their execution) there. I also use an IoC container which handles the lifetime of my objects (instances of DbContext and services, for example). See this question for more information about that.
My thoughts
What are the disadvantages?
You can't really unit test anywhere that uses the static methods you have defined in your DAL.
They are also strongly coupled, making them more difficult to swap out at runtime if that became a requirement.
You may start to get additional complications if you need to commit several updates in a transaction.
Is it wrong to create a new DbContext on each call?
No, this is fine. The DbContext is lightweight and meant to be used this way.
Other patterns
You already mentioned the repository pattern, which is pretty solid, especially when used with a unit of work pattern.
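A minimal shape of that combination might look like this (a sketch only; the exact members vary by implementation):

// The unit of work owns the context; repositories share it, and all
// changes are committed together in a single call.
public interface IUnitOfWork : IDisposable
{
    IRepository<TEntity> Repository<TEntity>() where TEntity : class;
    void Commit();
}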
Would you use Linq2SQL?
No. Linq2SQL is pretty much done with; Entity Framework provides a more complete and generally better solution to this problem.
I would rethink how you implemented GetUsers(). You are calling ToList(), which will cause all the rows from the underlying table to be returned and stored in memory. If the table grows large enough you will run into performance issues. It's better to return an IQueryable<User> instead and have your method return context.Users.
Of course you'll run into the problem that the context has already been disposed by the time you execute the IQueryable<>, so you'll need to handle the life cycle of the context in a different way.
If the project is small enough, then you can just store an instance of the context at the controller level, and dispose of it when the controller is being disposed. If you do that, make sure you don't do anything in your views that would cause additional queries to be executed (e.g. accessing a collection off of a User, if one exists), or else that will error out.
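A minimal sketch of that change, with the context lifetime owned by the caller (names follow the question's code; the constructor injection is an assumption):

public class Users
{
    private readonly DatingSiteContext _context;

    public Users(DatingSiteContext context)
    {
        _context = context;
    }

    // Deferred: no SQL runs until the caller enumerates the result
    public IQueryable<User> GetUsers()
    {
        return _context.Users;
    }
}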
I've encountered what seems to be a common problem: I am updating values in my database, but EF is using its original in-memory copy of the object and these changed values are not reflected in the displayed data. I understand why this is, but I can't figure out a way around it.
The most common solution seems to be to set MergeOption.NoTracking to turn off change tracking completely (or use the AsNoTracking() extension method when querying) and force a refresh every time the object is accessed, which is fine for my purposes.
I've got a generic base repository which my other repositories inherit from:
public abstract class RepositoryBase<T> where T : class
{
    private readonly IDbSet<T> _dbset;
    private readonly IUnitOfWork _unitOfWork;

    protected RepositoryBase(IUnitOfWork unitOfWork)
    {
        _unitOfWork = unitOfWork;
        _dbset = _unitOfWork.Database.Set<T>();
    }

    public virtual IQueryable<T> All()
    {
        return _dbset;
    }

    // Some other IQueryable methods here (Query, GetByProductCode etc)

    public virtual T Get(long id)
    {
        return _dbset.Find(id);
    }
}
And a DbContext like this:
public class Db : DbContext
{
    private IDbSet<Product> _products;

    public IDbSet<Product> Products
    {
        get { return _products ?? (_products = DbSet<Product>()); }
    }

    public virtual IDbSet<T> DbSet<T>() where T : class
    {
        return Set<T>();
    }

    public virtual void Commit()
    {
        base.SaveChanges();
    }
}
If I change the All() method of my repository thus:
public virtual IQueryable<T> All()
{
    return _dbset.AsNoTracking();
}
I get the desired result - an update in the database is reflected when the page displaying the products is refreshed. However, I can't do this in the Get() method, as that extension method only works on an IQueryable.
Ideally I'd like to turn this off at the DbContext level as I will never need change tracking, but there doesn't seem to be an obvious way to do this, and there is pretty much zero documentation on the subject (unless someone can point me to some? Please!).
I tried adding a constructor to the DbContext with these configuration options disabled:
public Db()
{
    base.Configuration.ProxyCreationEnabled = false;
    base.Configuration.AutoDetectChangesEnabled = false;
}
But I must admit I'm only guessing as to what they really do (I only found them through looking at the source code), and they don't seem to have any effect anyway.
Any help would be greatly appreciated. If more info/code would help, please let me know.
If you want to force the context to get fresh data each time, you don't want to use the Find method. Find always queries the internal storage first. Use this instead:
public virtual T Get(long id)
{
    return All().SingleOrDefault(e => e.Id == id);
}
But I don't understand why you need this. What do you mean by:

an update in the database is reflected when the page displaying the products is refreshed
The context is a unit of work. It should be used as a unit of work: in a web application or web service, that means creating a new context instance per request; in a WinForms / WPF application, it means using a context per logical block (per presenter etc). Because of that, you should need this only in very specific scenarios, yet you want it globally. Your description makes it seem like you are reusing the context among requests, which is a completely bad solution. There is no performance cost in recreating the context for each request.
I am using Entity Framework for the first time, and would like to know if I am using it according to best practices.
I have created a separate class in my business logic which will handle the entity context. The problem I have is that in all the videos I have seen, they usually wrap the context in a using statement to make sure it's closed, but obviously I can't do this in my business logic, as the context would be closed before I can actually use it.
So is this ok what I'm doing? A couple of examples:
public IEnumerable<Article> GetLatestArticles(bool Authorised)
{
    var ctx = new ArticleNetEntities();
    return ctx.Articles.Where(x => x.IsApproved == Authorised).OrderBy(x => x.ArticleDate);
}

public IEnumerable<Article> GetArticlesByMember(int MemberId, bool Authorised)
{
    var ctx = new ArticleNetEntities();
    return ctx.Articles.Where(x => x.MemberID == MemberId && x.IsApproved == Authorised).OrderBy(x => x.ArticleDate);
}
I just want to make sure I'm not building something that's going to die when a lot of people use it.
It really depends on how to want to expose your repository/data store.
Not sure what you mean by "the context will be closed, therefore I cannot do business logic". Do your business logic inside the using statement. Or if your business logic is in a different class, then let's continue. :)
Some people return concrete collections from their Repository, in which case you can wrap the context in the using statement:
public class ArticleRepository
{
    public List<Article> GetArticles()
    {
        List<Article> articles = null;

        using (var db = new ArticleNetEntities())
        {
            articles = db.Articles.Where(something).Take(some).ToList();
        }

        return articles;
    }
}
Advantage of that is satisfying the good practice with connections - open as late as you can, and close as early as you can.
You can encapsulate all your business logic inside the using statement.
The disadvantages: your Repository becomes aware of business logic, which I personally do not like, and you end up with a different method for each particular scenario.
The second option - new up a context as part of the Repository, and make it implement IDisposable.
public class ArticleRepository : IDisposable
{
    ArticleNetEntities db;

    public ArticleRepository()
    {
        db = new ArticleNetEntities();
    }

    public List<Article> GetArticles()
    {
        return db.Articles.Where(something).Take(some).ToList();
    }

    public void Dispose()
    {
        db.Dispose();
    }
}
And then:
using (var repository = new ArticleRepository())
{
    var articles = repository.GetArticles();
}
Or the third-option (my favourite), use dependency injection. Decouple all the context-work from your Repository, and let the DI container handle disposal of resources:
public class ArticleRepository
{
    private IObjectContext _ctx;

    public ArticleRepository(IObjectContext ctx)
    {
        _ctx = ctx;
    }

    public IQueryable<Article> Find()
    {
        return _ctx.Articles;
    }
}
Your chosen DI container will inject the concrete ObjectContext into the instantiation of the Repository, with a configured lifetime (Singleton, HttpContext, ThreadLocal, etc), and dispose of it based on that configuration.
I have it set up so each HTTP request gets given a new context. When the request is finished, my DI container will automatically dispose of the context.
I also use the Unit of Work pattern here to allow multiple Repositories to work with one Object Context.
You may have also noticed I prefer to return IQueryable from my Repository (as opposed to a concrete List). It's much more powerful (yet risky, if you don't understand the implications). My service layer performs the business logic on the IQueryable and then returns the concrete collection to the UI.
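For example, a service-layer method might shape the repository's IQueryable and execute it there (a sketch; _articleRepository is an assumed injected field, and the filter and paging values are illustrative):

public List<Article> GetApprovedArticles(int take)
{
    return _articleRepository.Find()
        .Where(a => a.IsApproved)
        .OrderByDescending(a => a.ArticleDate)
        .Take(take)
        .ToList(); // the query executes here, inside the service layer
}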
That is by far the most powerful option, as it allows a simple-as-heck Repository; the Unit of Work manages the context, the Service Layer manages the business logic, and the DI container handles the lifetime/disposal of resources/objects.
Let me know if you want more info on that - as there is quite a lot to it, even more than this surprisingly long answer. :)
I would have the context as a private variable within each class, create a new instance of it each time, and then dispose of it when finished.
public class ArticleService : IDisposable
{
    private ArticleEntities _ctx;

    public ArticleService()
    {
        _ctx = new ArticleEntities();
    }

    public IEnumerable<Article> GetLatestArticles(bool Authorised)
    {
        return _ctx.Articles.Where(x => x.IsApproved == Authorised).OrderBy(x => x.ArticleDate);
    }

    public IEnumerable<Article> GetArticlesByMember(int MemberId, bool Authorised)
    {
        return _ctx.Articles.Where(x => x.MemberID == MemberId && x.IsApproved == Authorised).OrderBy(x => x.ArticleDate);
    }

    public void Dispose()
    {
        _ctx.Dispose();
        _ctx = null;
    }
}
Then when calling this:

ArticleService articleService = new ArticleService();
IEnumerable<Article> article = articleService.GetLatestArticles(true);
articleService.Dispose(); // killing the connection
This way you can also add/update other objects within the same context and call a save method which saves any changes to the db through Entity Framework.
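A sketch of such a save method on the same class (assuming the ArticleService above):

public void Save()
{
    _ctx.SaveChanges(); // persists all pending changes tracked by this context
}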
In my experience this code is not good, because you lose the capacity to navigate relationships through navigation properties.
public List<Article> getArticles()
{
    using (var db = new ArticleNetEntities())
    {
        return db.Articles.Where(something).ToList();
    }
}
Using this approach you can't use the following code, because a.Members is always null (the db context is closed and can't load the data automatically).
var articles = Data.getArticles();

foreach (var a in articles)
{
    if (a.Members.Any(p => p.Name == "miki"))
    {
        ...
    }
    else
    {
        ...
    }
}
Using only a global db context is a bad idea, because you would need a function to discard pending changes. Say at one point in your application you do this, but don't save the changes and just close the window:
var article = globalcontext.getArticleByID(10);
article.Approved = true;
Then at another point in the application you do some other operation and save:
// ... something
globalcontext.SaveChanges();
In this case the Approved property of the earlier article was already marked as modified by Entity Framework, so when you save, Approved is set to true even though that change was never meant to be kept!
The best approach for me is to use one context per class.
You can pass the context to another external method if you need to.
class EditArticle
{
    private DbEntities de;
    private Article currentArticle;

    public EditArticle()
    {
        de = new DbEntities(); // initialize on new instance
    }

    public void LoadArticleToEdit(Article a)
    {
        // a is from another context, so reload it through this context
        currentArticle = de.Article.Single(p => p.IdArticle == a.IdArticle);
    }

    private void SaveChanges()
    {
        // ...
        de.SaveChanges();
    }
}
What you can also do is store your context at a higher level.
E.g., you can have a static class storing the current context:
class ContextManager
{
    [ThreadStatic]
    public static ArticleEntities CurrentContext;
}
Then, somewhere outside you do something like this:
using (ContextManager.CurrentContext = new ArticleEntities())
{
    IEnumerable<Article> article = articleService.GetLatestArticles(true);
}
Then, inside GetLatestArticles, you just use the same ContextManager.CurrentContext.
Of course, this is just the basic idea. You can make this a lot more workable by using service providers, IoC and such.
You can start preparing Entity Framework from the data access layer by creating a generic repository class for all required Entity Framework functions. Then you can use it in the business layer (encapsulated).
Here are the best practices that I have used for Entity Framework in data, business, and UI layers
Techniques used for this practice:
Applying SOLID architecture principles
Using Repository design pattern
Only one class to go (and you will find it ready)