I am currently having problems getting my Update function to work. The function first loads an entity using session.Load() and then calls session.SaveOrUpdate().
My problem is this: if I do not load the entity first, NHibernate does not know about its relationships and therefore tries to insert data that is already there; and when I do load the entity first, the updated entity is overwritten by the data already in the database.
public void Update(T Entity, bool load)
{
    using (ISession session = this.helper.GetSession())
    {
        using (ITransaction transaction = session.BeginTransaction())
        {
            if (load)
            {
                session.Load(Entity, Entity.ID);
            }
            session.SaveOrUpdate(Entity);
            transaction.Commit();
            session.Flush();
        }
    }
}
In a nutshell:
load the object and then bind it with the new values (the changes will be persisted on session.Flush() without any explicit Update() call), or
create a new C# instance with the bound values, including the ID, and call session.Update(myInstance) (both options are sketched below).
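For example, a minimal sketch of both options (Customer, ID and Name are placeholder names, not taken from your code, and sessionFactory stands for whatever gives you an ISession):

// Option 1: load, then bind the new values onto the persistent instance.
using (ISession session = sessionFactory.OpenSession())
using (ITransaction tx = session.BeginTransaction())
{
    var customer = session.Get<Customer>(id);  // now tracked by this session
    customer.Name = updatedName;               // dirty checking picks this up
    tx.Commit();                               // UPDATE is issued on flush/commit
}

// Option 2: build a detached instance (e.g. bound by MVC), set the ID, re-attach with Update().
using (ISession session = sessionFactory.OpenSession())
using (ITransaction tx = session.BeginTransaction())
{
    var detached = new Customer { ID = id, Name = updatedName };
    session.Update(detached);                  // UPDATE is issued based on the ID
    tx.Commit();
}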
A more complete answer can be found in one of the doc chapters:
9.4.2. Updating detached objects
Many applications need to retrieve an object in one transaction, send it to the UI layer for manipulation, then save the changes in a new transaction. (Applications that use this kind of approach in a high-concurrency environment usually use versioned data to ensure transaction isolation.) This approach requires a slightly different programming model to the one described in the last section. NHibernate supports this model by providing the method ISession.Update().
// in the first session
Cat cat = firstSession.Load<Cat>(catId);
Cat potentialMate = new Cat();
firstSession.Save(potentialMate);
// in a higher tier of the application
cat.Mate = potentialMate;
// later, in a new session
secondSession.Update(cat); // update cat
secondSession.Update(potentialMate); // update the mate
The usage and semantics of SaveOrUpdate() seem to be confusing for new users. Firstly, so long as you are not trying to use instances from one session in another new session, you should not need to use Update() or SaveOrUpdate(). Some entire applications will never use either of these methods.
Usually Update() or SaveOrUpdate() are used in the following scenario:
the application loads an object in the first session
the object is passed up to the UI tier
some modifications are made to the object
the object is passed back down to the business logic tier
the application persists these modifications by calling Update() in a second session
So, we can get an instance of some entity in one session... and close that session. The object could even be a totally brand new C# instance, with all its properties bound by some upper layer (e.g. an MVC model binder or a Web API formatter).
Later, we can use that instance and call session.Update(myInstance). NHibernate will take the ID of that entity and issue the proper update statement.
Another way could be to call Merge:
The last case can be avoided by using Merge(Object o). This method copies the state of the given object onto the persistent object with the same identifier. If there is no persistent instance currently associated with the session, it will be loaded. The method returns the persistent instance. If the given instance is unsaved or does not exist in the database, NHibernate will save it and return it as a newly persistent instance. Otherwise, the given instance does not become associated with the session. In most applications with detached objects, you need both methods, SaveOrUpdate() and Merge().
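A short sketch of how that looks in code (MyEntity and detached are placeholder names):

using (ISession session = sessionFactory.OpenSession())
using (ITransaction tx = session.BeginTransaction())
{
    // Merge copies the detached state onto the session-owned instance and
    // returns that persistent instance; keep working with the returned reference.
    MyEntity persistent = session.Merge(detached);
    tx.Commit();
}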
read more in the doc
I attempted to load an object that has lazy properties in one session and tried to update it in another session, but it failed
with an error: No persister for SecUserProxy (the actual class is SecUser).
I'm using NHibernate 3.4. When I googled this I learned it is a bug which has since been fixed.
I also came across this post, in which it is said that if your proxy object implements INHibernateProxy, you can unproxy the object
with NHibernate. As NHibernate no longer supports pluggable proxy factories (like Castle, LinFu etc.) and uses an internal one, I'm
assuming the internal one implements INHibernateProxy.
So I did the following in the new session where I want to update my object:
object unprox_obj = Session
    .GetSessionImplementation()
    .PersistenceContext.Unproxy(secUserobj);
in anticipation of getting the same object but with the real type, i.e. SecUser, so that it can be updated without any error. But it still returns a proxy object.
I'm not able to understand what is going on.
Updated:
I have just realized that secUserobj is not an INHibernateProxy. So how can I make it an INHibernateProxy in order to update my object in another session?
if (secUserobj is INHibernateProxy)
{
    unprox_obj = Session
        .GetSessionImplementation()
        .PersistenceContext.Unproxy(secUserobj);
}
Detached objects (loaded in one session and kept as a reference) can be re-attached. We can use session.Merge() or session.Lock().
9.4.2. Updating detached objects
Many applications need to retrieve an object in one transaction, send it to the UI layer for manipulation, then save the changes in a new transaction....
...
...The last case can be avoided by using Merge(Object o). This method copies the state of the given object onto the persistent object with the same identifier. If there is no persistent instance currently associated with the session, it will be loaded. The method returns the persistent instance. If the given instance is unsaved or does not exist in the database, NHibernate will save it and return it as a newly persistent instance. Otherwise, the given instance does not become associated with the session. In most applications with detached objects, you need both methods, SaveOrUpdate() and Merge().
19.1.4. Initializing collections and proxies
...
You may also attach a previously loaded object to a new ISession with Merge() or Lock() before accessing uninitialized collections (or other proxies). No, NHibernate does not, and certainly should not do this automatically, since it would introduce ad hoc transaction semantics!
...
So, we can pass the detached reference into Merge() and then work with the returned (session-owned) object reference:
MyEntity reAttached = session.Merge<MyEntity>(detached);
Be careful: as stated above, this should be done before any of the detached collections are touched.
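And for completeness, a small sketch of the Lock() route (detached and SomeProperty are illustrative names):

using (ISession session = sessionFactory.OpenSession())
using (ITransaction tx = session.BeginTransaction())
{
    // Re-attach the clean detached instance without hitting the database.
    session.Lock(detached, LockMode.None);  // do this before touching lazy collections
    detached.SomeProperty = newValue;       // changes made from here on are tracked
    tx.Commit();
}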
I develop a server with persistent client connections (not request based). As I keep track of each connected client's state in memory, it would be strange to load entities every time I need to access that client's data.
So I have my detached entities, and when I need to perform any changes I don't apply them directly; instead I pass those changes and the detached entity as a request to a GameDb class. It performs the changes on the entity, then loads the same entity from the database and performs the same changes again on the session-owned entity so NHibernate can track them.
I could use Merge, but it's much slower, because NHibernate would have to load all of the entity's data (including lazy collections that may be unmodified) to check each property for changes. In my case performance is critical.
An example:
// in the GameDb class
public void UpdateTradeOperation(UserOperation operation, int incomeQuantity, decimal price)
{
    if (operation == null) throw new ArgumentNullException("operation");
    if (operation.Id == 0) throw new ArgumentException("operation is not persisted");

    // apply the changes to the detached entity kept in memory
    _operationLogic.UpdateTradeOperation(operation, incomeQuantity, price);
    try
    {
        _factory.Execute(
            s =>
            {
                // load the session-owned copy and apply the same changes again
                var op = s.Get<UserOperation>(operation.Id);
                _operationLogic.UpdateTradeOperation(op, incomeQuantity, price);
                if (op.User.BalanceFrozen != operation.User.BalanceFrozen)
                    throw new Exception("Inconsistent balance");
            }); // commits the transaction if no exceptions are thrown
    }
    catch (Exception e)
    {
        throw new UserStateCorruptedException(operation.User, null, e);
    }
}
This approach brings some extra complexity, as I need to apply each change twice and then check that the resulting states are equal. It would be easier if I could use an NHibernate session to monitor entity changes, but it is not recommended to keep a session open for a long time, and I could have thousands of such long-lived open sessions.
It also forces me to split my entities from the common logic. The problem is that the GameDb class doesn't know from which context it is called, so it can't request any additional data for its operation (e.g. current prices, a client socket inactivity timer, or many other things), nor can it conditionally (by its own decision) send some data to the client. Of course I could pass a bunch of delegates to the GameDb method, but that doesn't seem like a good solution.
Can I use Session.Lock to attach my unchanged detached entities so that I don't need to perform the changes twice? What LockMode should I use?
Is there a better approach here? If I keep one open session per client but commit or roll back transactions quickly, will it still keep a lot of connections open? Will the session keep the entities' state after the transaction is completed?
What kind of concurrency issues can I experience with long-lived per-client sessions:
if I operate on each user's entities only from its own thread fiber (or under a lock)?
if I read another user's profile (read-only) from the "wrong" session (i.e. from that session's thread)?
I think what you need to do is use the second-level cache, and store the Ids of the entities per connected client instead of keeping the entities themselves in memory.
When a client connects you can fetch the entities using the Ids you stored; this will not even hit the database on subsequent requests, as the entities will come from the second-level cache, and you do not need to worry about change tracking.
http://ayende.com/blog/3976/nhibernate-2nd-level-cache
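A rough sketch of the idea, assuming the second-level cache is already enabled in the NHibernate configuration and the entity mapping declares a cache region (the Player class and the dictionary are made-up names):

// Keep only the id per connected client, not the entity itself.
private readonly Dictionary<int, int> _playerIdByClient = new Dictionary<int, int>();

public Player GetPlayer(int clientId, ISessionFactory factory)
{
    using (ISession session = factory.OpenSession())
    {
        // Get() by id can be served from the second-level cache without a database round trip.
        return session.Get<Player>(_playerIdByClient[clientId]);
    }
}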
I tried using Session.Lock(LockMode.None) to re-attach detached entities to a new session, and it works. It adds the object to the session as clean and unchanged. I can then modify it, and it will be stored to the database with the next transaction commit.
This is better than Merge because NHibernate does not need to look at all the properties to find out what has changed.
If I change at least one property, it updates the whole object (I mean all the properties, but not collections or entity links if they are untouched). I set DynamicUpdate = true in the entity mappings, and now it updates only the changed properties.
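If you map with Fluent NHibernate, dynamic update can be switched on per class roughly like this (a sketch under that assumption; the User class and its properties are placeholders, and with hbm.xml the equivalent is dynamic-update="true" on the class element):

using FluentNHibernate.Mapping;

public class UserMap : ClassMap<User>
{
    public UserMap()
    {
        DynamicUpdate();   // only the changed columns appear in the UPDATE statement
        Id(x => x.Id);
        Map(x => x.Name);
        Map(x => x.BalanceFrozen);
    }
}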
If I change any property of the detached entity outside of its current session, the next call to Session.Lock throws an exception (in particular, if I change collection content, the exception says "reassociated object has dirty collection"). I make those changes outside of the session because I don't need to save them (some stuff with references).
Strangely enough, it works perfectly when I call Lock twice:
try
{
    s.Lock(DbEntity, LockMode.None); // throws
}
catch
{
    s.Lock(DbEntity, LockMode.None); // everything is ok the second time
}
Also, for collections: before I came to the solution above, I cast them to IPersistentCollection and used ClearDirty().
What about concurrency? My code ensures that each thread fiber updates only its own user and that nobody except this fiber has write access to the entity.
So the pattern is:
Open a session, get the entity, and store it somewhere in memory.
When I need to read one of its properties, I can do that at any time, very quickly.
When I want to modify it, I open a new session and call Lock() on it. After applying the changes I commit the transaction and close the session (sketched below).
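A sketch of that pattern, where _cachedOperation stands for the detached entity kept in memory and ApplyChange is just an illustrative helper:

public void ApplyChange(ISessionFactory factory, Action<UserOperation> change)
{
    using (ISession session = factory.OpenSession())
    using (ITransaction tx = session.BeginTransaction())
    {
        session.Lock(_cachedOperation, LockMode.None); // re-attach as clean and unchanged
        change(_cachedOperation);                      // modify it only while the session is open
        tx.Commit();                                   // the UPDATE is issued on commit
    }
}

Reads never need a session at all, because the same detached reference stays available in memory.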
From everything I have read until now, it should not be possible to attach the same object to different DbContext instances (and all the examples and questions I could find showed exceptions in such cases). Right now, testing with EF6, it allowed me to attach the same object to different contexts (from different threads); I was even able to change the object from one thread and save it with another thread.
This is not necessarily a bad thing (except for the fact that I must make sure I lock all the time, as no exception is thrown); I would just like to understand what is going on.
Does anybody know if this is really a "new feature" in EF6?
Some code follows. Calling this from several different threads gave no exception, and if I change the object from another thread before saving, it takes the last values:
using (var db = new TestContext())
{
    db.Users.Attach(_cachedUser);
    MessageBox.Show("attached"); // I use this to pause the thread as long as I want
    _cachedUser.UserCode = tbCode.Text;
    _cachedUser.UserDesc = tbDesc.Text;
    MessageBox.Show("ready to save"); // pause again
    db.SaveChanges();
}
Edit
After receiving the answer explaining why this happens, I also found how to check whether an object is a proxy or not: http://msdn.microsoft.com/en-us/library/vstudio/ee835846(v=vs.100).aspx
public static bool IsProxy(object entity)
{
    return entity != null && ObjectContext.GetObjectType(entity.GetType()) != entity.GetType();
}
Works just fine.
This has been possible since Entity Framework introduced the code-first style, because you can only do this with POCOs.
The cachedUser is a plain C# class. It has no information whatsoever about the context it is attached to. Also, a new context instance has no knowledge whatsoever of another context's change tracker. So there is no way to check anywhere whether a POCO is attached to a context.
This changes when cachedUser is not a POCO but a proxy object. (A proxy object is an object that EF creates on the fly; it inherits from the entity class and contains code and state that enable lazy loading and facilitate change tracking.) When you try to attach a proxy object to a second context you'll get an exception:
An entity object cannot be referenced by multiple instances of IEntityChangeTracker.
That's why for many scenarios it's recommended to create proxies instead of POCOs. You can create proxies by using db.Users.Create() instead of new User().
When to create proxies, whether this is possible at all, and when EF materializes proxies is a subject that's beyond the scope of this question. More about this can be found here.
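For illustration, a small sketch of the difference (TestContext comes from the question; the User type and property-free usage are assumptions):

using (var db = new TestContext())
{
    User proxied = db.Users.Create(); // a change-tracking/lazy-loading proxy, if proxy creation is enabled
    User plain   = new User();        // a plain POCO the context knows nothing about
    // IsProxy(proxied) should return true, IsProxy(plain) false.
}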
I have an NHibernate entity called Car. This Car object has a PersistentBag IList collection called Doors, all lazy-loaded.
If I do (IN SESSION 1)
var singleDoor = Car.Doors[0];
the lazy collection is loaded from the database and the related objects are added to the first-level cache, so I will then have N Car objects and N Door objects loaded from the db in session 1's first-level cache.
On the other hand (in another part of the code, IN SESSION 2), I load the same Car object and do the same assignment
var singleDoor = Car.Doors[0];
and then I evict the Car and all of the Door objects from SESSION 2.
I modify the state of these objects and want to attach the disconnected objects to SESSION 1 to save them, so I do
mySession.Update(Car);
But when I try to update the Door objects, the "another object with the same identifier" exception is obviously thrown, because there is already another object with the same id in that session.
It is difficult to find the old object to evict. How can I evict the old objects, or clear the first-level cache (only by type and id), or discard the old objects from the cache and update what I want?
Thanks in advance.
This isn't because of the second-level cache; it's because you're trying to save the entity from session 2 when it has already been loaded in session 1 (in fact it's the first-level cache that's causing this).
The answer to your question is to use session.Evict(car) (in session 1). However, that's not really the best approach; I would rather suggest using session.Merge(car), which will update the persistent object in session 1 without throwing the exception about another object with the same id.
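A sketch of both routes (session1 is the session that already holds the stale copy; car is the instance you modified while it was detached):

// Option 1: evict the stale copy that session 1 is holding, then re-attach yours.
session1.Evict(staleCar);   // requires a reference to the instance already in session 1
session1.Update(car);

// Option 2 (usually simpler): merge, and keep using the returned instance.
Car merged = session1.Merge(car);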
This question has been asked 500 different times in 50 different ways...but here it is again, since I can't seem to find the answer I'm looking for:
I am using EF4 with POCO proxies.
A.
I have a graph of objects I fetched from one instance of an ObjectContext. That ObjectContext is disposed.
B.
I have an object I fetched from another instance of an ObjectContext. That ObjectContext has also been disposed.
I want to set a related property on a bunch of things from A using the entity from B... something like:
foreach (var itemFromA in collectionFromA)
{
    itemFromA.RelatedProperty = itemFromB;
}
When I do that, I get the exception:
System.InvalidOperationException occurred
Message=The relationship between the two objects cannot be defined because they are attached to different ObjectContext objects.
Source=System.Data.Entity
StackTrace:
at System.Data.Objects.DataClasses.RelatedEnd.Add(IEntityWrapper wrappedTarget, Boolean applyConstraints, Boolean addRelationshipAsUnchanged, Boolean relationshipAlreadyExists, Boolean allowModifyingOtherEndOfRelationship, Boolean forceForeignKeyChanges)
at System.Data.Objects.DataClasses.RelatedEnd.Add(IEntityWrapper wrappedEntity, Boolean applyConstraints)
at System.Data.Objects.DataClasses.EntityReference`1.set_ReferenceValue(IEntityWrapper value)
at System.Data.Objects.DataClasses.EntityReference`1.set_Value(TEntity value)
at
I guess I need to detach these entities from their ObjectContexts when those are disposed in order for the above to work... The problem is, detaching all entities from my ObjectContext when it is disposed seems to destroy the graph. If I do something like:
objectContext.ObjectStateManager
    .GetObjectStateEntries(EntityState.Added | EntityState.Deleted | EntityState.Modified | EntityState.Unchanged)
    .Select(i => i.Entity).OfType<IEntityWithChangeTracker>().ToList()
    .ForEach(i => objectContext.Detach(i));
All the relations in the graph seem to get unset.
How can I go about solving this problem?
@Danny Varod is right. You should use one ObjectContext for the whole workflow. Moreover, because your workflow seems to be one logical feature spanning multiple windows, it should probably also use a single presenter. Then you would follow the recommended approach: a single context per presenter. You can call SaveChanges multiple times, so it should not break your logic.
The source of this issue is a well-known problem: a deficiency of the dynamic proxies generated on top of POCO entities, combined with the Fixup methods generated by the POCO T4 template. These proxies still hold a reference to the context after you dispose it. Because of that, they think they are still attached to that context and so cannot be attached to another one. The only way to force them to release the reference to the context is manual detaching. At the same time, once you detach an entity from the context, it is removed from the related attached entities, because you can't have a mix of attached and detached entities in the same graph.
The issue does not actually occur in the code you call:
itemFromA.RelatedProperty = itemFromB;
but in the reverse operation triggered by the Fixup method:
itemFromB.RelatedAs.Add(itemFromA);
I think the ways to solve this are:
Don't do this; use a single context for the whole unit of work - that is the intended usage.
Remove the reverse navigation property so that the Fixup method doesn't trigger that code.
Don't use the POCO T4 template with Fixup methods, or modify the T4 template so it doesn't generate them.
Turn off lazy loading and proxy creation for these operations. That will remove the dynamic proxies from your POCOs, and because of that they will be independent of the context.
To turn off proxy creation and lazy loading use:
var context = new MyContext();
context.ContextOptions.ProxyCreationEnabled = false;
context.ContextOptions.LazyLoadingEnabled = false;
You can actually try to write a custom method to detach the whole object graph, but as you said it has been asked 500 times and I haven't seen a working solution yet, except serializing and deserializing to a new object graph.
I think you have a few different options here; two of them are:
Leave the context alive until you are done with the process; use only one context, not two.
a. Before disposing of context #1, create a deep clone of the graph, using BinaryStreamer or a tool such as ValueInjecter or AutoMapper (see the sketch after this list).
b. Merge changes from context #2 into the cloned graph.
c. Upon saving, merge changes from the cloned graph into the graph created by the new ObjectContext.
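A rough sketch of step 2a, assuming "BinaryStreamer" refers to ordinary binary serialization (BinaryFormatter here), that the entities are marked [Serializable], and that proxy creation is disabled so you clone plain POCOs rather than EF proxies:

using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

public static class GraphCloner
{
    // A serialize/deserialize round trip yields an independent copy of the whole graph.
    public static T DeepClone<T>(T graph)
    {
        var formatter = new BinaryFormatter();
        using (var stream = new MemoryStream())
        {
            formatter.Serialize(stream, graph);
            stream.Position = 0;
            return (T)formatter.Deserialize(stream);
        }
    }
}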
For future reference, this MSDN blog post can help you decide what to do when:
http://blogs.msdn.com/b/dsimmons/archive/2008/02/17/context-lifetimes-dispose-or-reuse.aspx
I don't think you need to detach to solve the problem.
We do something like this:
public IList<Contact> GetContacts()
{
    using (myContext mc = new myContext())
    {
        return mc.Contacts.Where(c => c.City == "New York").ToList();
    }
}

public IList<Sale> GetSales()
{
    using (myContext mc = new myContext())
    {
        return mc.Sales.Where(s => s.City == "New York").ToList();
    }
}

public void SaveContact(Contact contact)
{
    using (myContext mc = new myContext())
    {
        mc.Contacts.Attach(contact);
        // mark the whole entity as modified so SaveChanges issues an UPDATE
        mc.ObjectStateManager.ChangeObjectState(contact, EntityState.Modified);
        mc.SaveChanges();
    }
}
public void Link()
{
    var contacts = GetContacts();
    var sales = GetSales();
    foreach (var c in contacts)
    {
        c.AddSales(sales.Where(s => s.Seller == c.Name));
        SaveContact(c);
    }
}
This allows us to pull the data, pass it to another layer, let that layer do whatever it needs to do, and then pass it back so we can update or delete it. We do all of this with a separate context per method (which in a web application effectively means one per request).
The important thing to remember is that if you're using IEnumerables, they use deferred execution, meaning they don't actually pull the data until you count or iterate over them. So if you want to use the result outside your context, you have to call ToList() so that the query is executed and a list is created. Then you can work with that list.
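To illustrate what goes wrong without ToList() (this hypothetical variant mirrors the GetContacts method above):

public IEnumerable<Contact> GetContactsDeferred()
{
    using (myContext mc = new myContext())
    {
        return mc.Contacts.Where(c => c.City == "New York"); // query not executed yet
    }
} // enumerating the result later throws, because mc has already been disposed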
EDIT Updated to be clearer, thanks to @Nick's input.
OK, I get that your object context is long gone.
But let's look at it this way: Entity Framework implements the unit of work concept, in which it tracks the changes you make in your object graph so it can generate the SQL corresponding to the changes you have made. Without being attached to a context, there is no way it can track changes.
If you have no control over the context, then I don't think there is anything you can do.
Otherwise there are two options:
Keep your object context alive for a longer lifespan, e.g. for the session of the logged-in user.
Regenerate your proxy classes using the self-tracking entities text template, which enables change tracking in a disconnected state.
But even with self-tracking entities, you might still run into some minor issues.