I've seen some posts here that look like mine; however, I couldn't find the answer to my specific problem.
The thing is that I'm trying to insert an entity into the DB using a navigation property, but I'm constantly getting: "The ObjectContext instance has been disposed and can no longer be used for operations that require a connection."
using (EntitiesDatabase context = new EntitiesDatabase())
{
    if (UserHandler.Instance.User is Admin)
    {
        ((Admin)UserHandler.Instance.User).ProjectManagers.Add(
            new ProjectManager(firstNameTextBox.Text, lastNameTextBox.Text, usernameTextBox.Text, passwordTextBox.Text));
    }
    else if (UserHandler.Instance.User is ProjectManager)
    {
        ((ProjectManager)UserHandler.Instance.User).Developers.Add(
            new Developer(firstNameTextBox.Text, lastNameTextBox.Text, usernameTextBox.Text, passwordTextBox.Text));
    }
    context.SaveChanges();
}
The exception is thrown at lines 5-6 of the code, i.e. at the ProjectManagers.Add(new ProjectManager(...)) call.
For clarity:
UserHandler is a singleton that holds the reference to the logged-on user.
The user has a navigation property
public virtual ICollection<ProjectManager> ProjectManagers { get; set; }
That should insert the new project manager into the DB once I call SaveChanges() on the context.
I've done some simple tests on smaller projects (like the ones in the PluralSight tutorials) and it did work... for some reason these same lines do not work.
It really seems as if the new project manager I'm creating is being disposed before I try to insert it through the navigation property, because I never get to the SaveChanges() line...
From other answers I've seen, I guessed that the using statement for some reason disposes of things before it reaches the closing bracket. I don't get why...
Could someone point me in the right way?
BTW this is not homework guys, it's a project of mine in which I am trying to learn C#, EF and other technologies.
It looks like the User instance was fetched by a different context, which has since been disposed.
You have two options: either use the same context, or attach the entity to the new context using ObjectContext.Attach, like this:
using (EntitiesDatabase context = new EntitiesDatabase())
{
    context.Attach(UserHandler.Instance.User);

    if (UserHandler.Instance.User is Admin)
    {
        ((Admin)UserHandler.Instance.User).ProjectManagers.Add(
            new ProjectManager(firstNameTextBox.Text, lastNameTextBox.Text, usernameTextBox.Text, passwordTextBox.Text));
    }
    else if (UserHandler.Instance.User is ProjectManager)
    {
        ((ProjectManager)UserHandler.Instance.User).Developers.Add(
            new Developer(firstNameTextBox.Text, lastNameTextBox.Text, usernameTextBox.Text, passwordTextBox.Text));
    }
    context.SaveChanges();
}
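For the first option (loading the user with the same context that saves the change), a minimal sketch could look like the following; the Users set and the Id key property are assumptions about your model, so adjust the names to whatever your entities actually use:
using (EntitiesDatabase context = new EntitiesDatabase())
{
    // Re-load the logged-on user through this context instead of reusing the
    // instance that still belongs to the old, disposed context.
    var userId = UserHandler.Instance.User.Id;             // Id key property is assumed
    var user = context.Users.Single(u => u.Id == userId);  // Users set name is assumed

    if (user is Admin)
    {
        ((Admin)user).ProjectManagers.Add(
            new ProjectManager(firstNameTextBox.Text, lastNameTextBox.Text, usernameTextBox.Text, passwordTextBox.Text));
    }
    else if (user is ProjectManager)
    {
        ((ProjectManager)user).Developers.Add(
            new Developer(firstNameTextBox.Text, lastNameTextBox.Text, usernameTextBox.Text, passwordTextBox.Text));
    }

    context.SaveChanges();
}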
My guess would be that the data context from which UserHandler.Instance.User was loaded isn't alive in the current calling context, and that UserHandler.Instance.User isn't attached to the more recently instantiated context.
We are using the ExecutionStrategy and have this helper method in our db context:
public Task<T> ExecuteWithinTransactionAsync<T>(Func<IDbContextTransaction, Task<T>> operation, string operationInfo)
{
    int counter = 0;
    return Database.CreateExecutionStrategy().ExecuteAsync(RunOperationWithinTransaction);

    async Task<T> RunOperationWithinTransaction()
    {
        counter++;
        if (counter > 1)
        {
            Logger.Log(LogLevel.Warn, $"Executing ({counter}. time) transaction for {operationInfo}.");
            ClearChangeTracker();
        }
        using (var transaction = await Database.BeginTransactionAsync(IsolationLevel.Serializable))
        {
            return await operation.Invoke(transaction);
        }
    }
}
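For illustration, a call site might look roughly like the following; the Orders set, the Order entity, customerId, and the operationInfo text are hypothetical names, and the delegate is responsible for committing the transaction it receives:
var order = await context.ExecuteWithinTransactionAsync(async transaction =>
{
    var newOrder = new Order { CustomerId = customerId };
    context.Orders.Add(newOrder);
    await context.SaveChangesAsync();

    transaction.Commit();   // the delegate owns the commit; the helper only disposes
    return newOrder;
}, "creating order");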
We then use ExecuteWithinTransactionAsync when calling complex/fragile business logic which should be executed reliably in a serializable transaction. We are using Postgres, so it can happen that our transaction is aborted due to serialization issues. The execution strategy detects this and retries the operation. That works nicely, but EF still keeps the old cache from the previous execution. That's why we introduced ClearChangeTracker, which looks like this:
private void ClearChangeTracker()
{
    ChangeTracker.DetectChanges();
    foreach (var entity in ChangeTracker.Entries().ToList())
    {
        entity.State = EntityState.Detached;
    }
}
And this seemed to have worked properly, until we found a case where it didn't work anymore: when we add new entities to a navigation property list, these entities are not removed on the next retry. For instance:
var parent = context.Parents.FirstOrDefault(p => p.Id == 1);
if (parent.Children.Any())
{
throw new Exception("Parent already has a child"); // This exception is thrown on the second try
}
parent.Children.Add(new Child());
await context.SaveChangesAsync();
So if the last line, context.SaveChangesAsync(), fails and the whole operation is re-run, parent.Children already contains the new child added by parent.Children.Add(new Child()), and I didn't find any way to remove that item from EF.
However, if we remove the check (if (parent.Children.Any())) of whether the item already exists, and just add it a second time, it's only stored once in the DB.
I was trying to figure out how to clear the DbContext properly, but most of the time the answer was just to create a new DbContext. However, that's not an option, since the same DbContext is needed for the ExecutionStrategy. That's why I wanted to know: what's the suggested way to use the ExecutionStrategy while having a clean DbContext on every retry?
Further technical details
EF Core version: 1.1.2
Database Provider: Npgsql.EntityFrameworkCore.PostgreSQL (1.1.1)
Operating system: Windows 10, Dockerized in Linux
In EF Core 2.0.0, the new DbContext pooling feature was introduced. For it to work properly, DbContext instances are now able to reset their internal state, so they can be handed out as "new". The reset method can be called like so (inside your DbContext):
((IDbContextPoolable)this).ResetState();
So if you can upgrade to EF Core 2.0.0, go for it. Not only will you benefit from this new feature, it's also more mature in many ways.
Disclaimer: this method is intended for internal use, so the API may change in the future.
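As a rough sketch (assuming the upgrade to EF Core 2.0 and accepting the internal-API caveat above), the ClearChangeTracker method from the question could then be reduced to the reset call:
private void ClearChangeTracker()
{
    // Requires a using directive for the internal EF Core namespace that declares
    // IDbContextPoolable; this is internal API and may change between versions.
    ((IDbContextPoolable)this).ResetState();
}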
I am writing a Web API REST service that does operations on a distinct set of entities. I have broken them down like this:
db = new DBEntities();

using (var dbContextTransaction = db.Database.BeginTransaction())
{
    try
    {
        ProcessClient();
        ProcessClientPerson();
        ProcessGuardian();
        ProcessAddress();
        ProcessEmail();
        ProcessPhones();
        ProcessChildren();

        dbContextTransaction.Commit();
    }
    catch (Exception ex)
    {
        dbContextTransaction.Rollback();
        etc.
Following the advice that data contexts should be as short-lived as possible, each of these methods creates its own data context, calls SaveChanges(), and disposes of it at the end:
private void ProcessClient()
{
    db = new DBEntities();
    ....
This obviously does not work - a transaction context created this way is tied to the data context. If something goes wrong in one of the entity operations, only that operation is rolled back (implicitly), but the overarching transaction is not.
I found this approach for creating a transaction outside of EF, but I am wondering if I should follow it or if I should just let my data context live for the duration of the transaction and keep the transaction inside of EF!?
I am not looking for an opinion, but for data around stability, performance, etc.
There is no immediate need to keep contexts short-lived. You can do that but you don't have to.
Over time entities will accumulate in a context. If you risk running out of memory it can be necessary to let go of a context.
Otherwise, the usual procedure is to keep the context alive for the duration of the logical unit of work. Here, that UOW is all those methods in their entirety.
This also makes transaction management easier (as you already found out).
dbContextTransaction.Rollback();
This is an anti-pattern. Simply don't commit in case of error.
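A minimal sketch of that shape, reusing the method names from the question and assuming (hypothetically) that each Process... method accepts the shared context as a parameter instead of creating its own:
using (var db = new DBEntities())
using (var dbContextTransaction = db.Database.BeginTransaction())
{
    ProcessClient(db);
    ProcessClientPerson(db);
    ProcessGuardian(db);
    ProcessAddress(db);
    ProcessEmail(db);
    ProcessPhones(db);
    ProcessChildren(db);

    db.SaveChanges();
    dbContextTransaction.Commit();
    // No explicit Rollback: if an exception escapes, disposing the
    // uncommitted transaction rolls it back.
}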
I have mixed feelings about this. I am working against a legacy database that has no foreign key constraints, and I am inserting, updating, and deleting between 20 and 30 objects in one of these service calls.
The problem is that I need to call SaveChanges() frequently to get the identity column values that will become foreign keys.
On the other hand, I have to be able to roll back everything if there is a problem three layers down, so a single large transaction is needed.
For some reason that I have not been able to determine, calling SaveChanges repeatedly on the same data context resulted in errors saying the connection state was already open. So I ended up giving each method its own data context anyway:
var scope = new TransactionScope(TransactionScopeOption.RequiresNew,
    new TransactionOptions() { IsolationLevel = IsolationLevel.ReadUncommitted });
using (scope)
{
    try
    {
        ProcessClient();
        ProcessClientPerson();
        ProcessGuardian();
        ProcessAddress();
        ProcessEmail();
        ProcessPhones();
        ProcessChildren();

        scope.Complete();
    }
    catch (System.Data.Entity.Validation.DbEntityValidationException ex)
    {
        [...] handle validation errors etc [...]
    }
}
with each section doing basically this, once stripped down to the bare essentials:
private void ProcessClient()
{
    using (MyDBEntities db = new MyDBEntities())
    {
        [...] doing client stuff [...]

        aClient.LastUpdated = DateTime.Now;
        db.AddOrUpdate(db, db.Clients, aClient, aClient.ClientID);
        db.SaveChanges();

        ClientId = aClient.ClientID; // now I can use this to form FKs
    }
}
Mixed feelings about locking, because on my development VM the transaction runs for 1-2 seconds and this is a production database with office staff and online customers doing CRUD transactions through web applications at the same time.
Unrelated, but helpful for my AddOrUpdate method was this blog post.
I was following the MVC Music Store tutorial and came to the part where they use database connections (their DbConnection is a child class of DbContext). I was taught to create methods like this (wrapping with using):
public class StoreManagerController : Controller
{
    //
    // GET: /StoreManager/

    public ActionResult Index()
    {
        using (var db = new DbConnection())
        {
            var albums = db.Albums.Include(a => a.Genre).Include(a => a.Artist);
            return View(albums.ToList());
        }
    }
    ...
}
but Visual Studio generated a controller for me which looked like this:
public class StoreManagerController : Controller
{
    private DbConnection db = new DbConnection();

    //
    // GET: /StoreManager/

    public ActionResult Index()
    {
        var albums = db.Albums.Include(a => a.Genre).Include(a => a.Artist);
        return View(albums.ToList());
    }
    ...
}
I assume Visual Studio isn't wrong, but why was I told to wrap each method with using, to keep connections as short-lived as possible and to give each user a separate connection?
I assume, Visual Studio isn't wrong, but why was I told to wrap each method with using
using (var db = new DbConnection())
{
    var albums = db.Albums.Include(a => a.Genre).Include(a => a.Artist);
    return View(albums.ToList());
}
The scope of db remains only within the curly braces. This is perhaps another purpose the using keyword serves in C#: it defines the scope of a variable, which in the above case is your db object.
Now, if you debug the code that Visual Studio generated for you, you'll notice that a Dispose method is invoked every time an object of the controller class is created, in other words every time an action method of the corresponding controller is called.
The DbContext instance is always disposed because of the following:
As you load more objects and their references into memory, the memory consumption of the context may increase rapidly. This may cause performance issues.
If an exception causes the context to be in an unrecoverable state, the whole application may terminate.
The chances of running into concurrency-related issues increase as the gap between the time when the data is queried and updated grows.
For more info - Reference
This might depend on the usability of your app; whether or not you need a persistent connection, and the cost of creating one (and a myriad of other factors).
But for starters, you should always dispose of the connection (as in the first pattern, not the one suggested by Visual Studio) and then move to other patterns based on new requirements or performance-related issues.
The biggest issue I see with the Visual Studio-suggested option is that you have no way of controlling the lifetime of the DbConnection object, and are leaving it up to the garbage collector to eventually dispose of it. This could leave the connection's resources in use for a rather indeterminate period of time.
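For completeness, if you keep the field-style context, you can still control its lifetime by overriding Dispose on the controller (the fully scaffolded controllers usually include this override); a sketch under that assumption:
public class StoreManagerController : Controller
{
    private DbConnection db = new DbConnection();

    public ActionResult Index()
    {
        var albums = db.Albums.Include(a => a.Genre).Include(a => a.Artist);
        return View(albums.ToList());
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            // Dispose the context deterministically when MVC disposes the controller.
            db.Dispose();
        }
        base.Dispose(disposing);
    }
}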
I have an MVC3/.NET 4 application which uses Entity Framework (4.3.1 Code First)
I have wrapped EF into a Repository/UnitOfWork pattern as described here…
http://www.asp.net/mvc/tutorials/getting-started-with-ef-using-mvc/implementing-the-repository-and-unit-of-work-patterns-in-an-asp-net-mvc-application
Typically, as it explains in the article, when I require the creation of a new record I’ve been doing this…
public ActionResult Create(Course course)
{
    unitOfWork.CourseRepository.Add(course);
    unitOfWork.Save();

    return RedirectToAction("Index");
}
However, when more than simply saving a record to the database is required, I wrap the logic into what I've called an IService. For example…
private ICourseService courseService;

public ActionResult Create(Course course)
{
    courseService.ProcessNewCourse(course);
    return RedirectToAction("Index");
}
In one of my services I have something like the following…
public void ProcessNewCourse(Course course)
{
    // Save the course to the database…
    unitOfWork.CourseRepository.Add(course);
    unitOfWork.Save();

    // Generate a PDF and email some people about the new course being created,
    // which requires more use of the unitOfWork…
    var someInformation = unitOfWork.AnotherRepository.GetStuff();
    var myPdfCreator = new PdfCreator();

    IEnumerable<People> people = unitOfWork.PeopleRepository.GetAllThatWantNotifiying(course);

    foreach (var person in people)
    {
        var message = "Hi " + person.FullName;
        var attachment = myPdfCreator.CreatePdf();
        etc...
        smtpClient.Send();
    }
}
The above isn’t the actual code (my app has nothing to do with courses, I’m using view models, and I have separated the PDF creation and email message out into other classes) but the gist of what is going on is as above!
My problem is that the generation of the PDF and emailing it out is taking some time. The user just needs to know that the record has been saved to the database so I thought I would put the code below the unitOfWork.Save(); into an asynchronous method. The user can then be redirected and the server can happily take its time processing the emails, and attachments and whatever else I require it to do post save.
This is where I’m struggling.
I’ve tried a few things, the current being the following in ICourseService…
public class CourseService : ICourseService
{
    private delegate void NotifyDelegate(Course course);
    private NotifyDelegate notifyDelegate;

    public CourseService()
    {
        notifyDelegate = new NotifyDelegate(this.Notify);
    }

    public void ProcessNewCourse(Course course)
    {
        // Save the course to the database…
        unitOfWork.CourseRepository.Add(course);
        unitOfWork.Save();

        notifyDelegate.BeginInvoke(course, null, null);
    }

    private void Notify(Course course)
    {
        // All the stuff under unitOfWork.Save(); moved here.
    }
}
My Questions/Problems
I’m randomly getting the error: "There is already an open DataReader associated with this Command which must be closed first." in the Notify() method.
Is it something to do with the fact that I'm trying to share the unitOfWork, and therefore a dbContext, across threads?
If so, can someone be kind enough to explain why this is a problem?
Should I be giving a new instance of unitOfWork to the Notify method?
Am I using the right patterns/classes to invoke the method asynchronously? Or should I be using something along the lines of....
new System.Threading.Tasks.Task(() => { Notify(course); }).Start();
I must say I've become very confused with the terms asynchronous, parallel, and concurrent!!
Any links to articles (c# async for idiots) would be appreciated!!
Many thanks.
UPDATE:
A little more digging got me to this SO page: https://stackoverflow.com/a/5491978/192999 which says...
"Be aware though that EF contexts are not thread safe, i.e. you cannot use the same context in more than one thread."
...so am I trying to achieve the impossible? Does this mean I should be creating a new IUnitOfWork instance for my new thread?
You could create a polling background thread that does the lengthy operation separately from your main flow. This thread could scan the database for new items (or items marked for processing). This solution is pretty simple and ensures that jobs get done even if your application crashes (they will be picked up when the polling thread starts again).
You could also use a synchronised queue if it isn't terrible for a request to be 'lost', in case your application crashes after the doc is requested and before it's generated/sent.
One thing is almost sure - as rikitikitik said - you will need to use a new unit of work, which means a separate transaction.
You could also look at Best threading queue example / best practice .
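As a rough sketch of giving the background work its own unit of work (the concrete UnitOfWork constructor, the GetById repository method, and the CourseId property are assumptions for illustration, and Task.Factory.StartNew is used since the question targets .NET 4):
public void ProcessNewCourse(Course course)
{
    // Save the course on the request's unit of work, as before.
    unitOfWork.CourseRepository.Add(course);
    unitOfWork.Save();

    // Fire-and-forget on a worker thread; exceptions should be caught and
    // logged inside Notify so they don't tear down the process.
    Task.Factory.StartNew(() => Notify(course.CourseId));
}

private void Notify(int courseId)
{
    // Give the background work its own unit of work (and thus its own DbContext)
    // instead of sharing the request-scoped one across threads.
    using (var notifyUnitOfWork = new UnitOfWork())
    {
        var course = notifyUnitOfWork.CourseRepository.GetById(courseId);
        // ... build the PDF and send the emails using notifyUnitOfWork ...
    }
}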
I'm having trouble with TransactionPropagation.NotSupported. I believed that this propagation causes the code to be executed outside any transaction, i.e. that when I mark a specific method with it, the current transaction is suspended and the code runs without a transaction.
Instead, the current version of Spring.NET creates a new transaction. See the following code:
[Test]
public void A()
{
    TransactionTemplate template = new TransactionTemplate(TransactionManager)
    {
        PropagationBehavior = TransactionPropagation.NotSupported
    };

    template.Execute(delegate
    {
        Assert.AreEqual(0,
            SessionFactory.GetCurrentSession().Linq<XXX>()
                .Where(t => t.Id.Equals(YYY)).ToList().Count);
        return null;
    });
}
I hoped this notation would cause the LINQ query to be executed without a transaction and to throw an exception, but the log showed that both a new session and a new transaction were created automatically.
I found this out when I marked a method with the annotation mentioned above, and despite the annotation the LINQ query inside was executed correctly.
The question is: how can I mark a method so that it doesn't use a transaction at all? I don't want to use propagation Never, because I want the current transaction to be suspended.
My project's business code flow does include transaction handling, and I want to mark certain parts as definitely non-transactional.
You mention being able to tell from the log that a new transaction is started. What log, the database or the application? What database are you using? Some databases won't let you run a query outside a transaction at all, so would just start one internally for you...
Update:
Your issues looks similar to this one:
https://jira.springframework.org/browse/SPRNET-1307?page=com.atlassian.jira.plugin.ext.bamboo%3Abamboo-build-results-tabpanel#issue-tabs
I would make sure you are running the version of Spring.NET that has this fix in it (looks like v 1.3.1 or greater?)
Also, you could try manually suppressing the transaction and see if that fixes the behavior:
using(var tx = new TransactionScope(TransactionScopeOption.Suppress))
{
// make DB call...
}