I have a problem handling the delete operation in C# / EF.
This is my code. I can delete a record, that is no problem. I only have problems with records that are referenced by other records.
In that case an exception is thrown and the user sees a message box.
But the record is still marked as deleted, so when I call SaveChanges again I get another exception.
Is it possible to change the state of the record back to not deleted?
public void Remove()
{
    try
    {
        Klant deleteKlant = dbContext.Klanten.First(i => i.KlantId == Klant.KlantId);
        dbContext.Klanten.Remove(deleteKlant);
        dbContext.SaveChanges();
    }
    catch (Exception)
    {
        throw new ArgumentException("Wissen mislukt");
    }
}
First, you should keep your dbContext objects short-lived if at all possible. You will run into huge problems if you keep them around, as they hold a strong reference to every object created through them.
Second, you should check whether it is valid to delete a record before you try to delete it; that way you never get the exception in the first place.
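If a delete does fail anyway, you can also revert the entity's pending state so the next SaveChanges doesn't retry it. A minimal sketch against the question's code, assuming an EF 4.1+ DbContext (the exception type and EntityState enum live in System.Data.Entity in that API):

```csharp
public void Remove()
{
    Klant deleteKlant = dbContext.Klanten.First(i => i.KlantId == Klant.KlantId);
    dbContext.Klanten.Remove(deleteKlant);
    try
    {
        dbContext.SaveChanges();
    }
    catch (DbUpdateException)
    {
        // Revert the pending delete so later SaveChanges calls
        // don't try to delete this record again.
        dbContext.Entry(deleteKlant).State = EntityState.Unchanged;
        throw new ArgumentException("Wissen mislukt");
    }
}
```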
Related
I have the following method updating an entity. The only gripe I had was that when a non-existing ID was provided, I got a harsh exception.
public bool Update(Thing thing)
{
    Context.Things.Update(thing);
    int result = Context.SaveChanges();
    return result == 1;
}
So I added a check to control the exception thrown (plus some nice logging and other facilitation). Eventually, I plan to skip the throwing entirely.
public bool UpdateWithCheck(Thing thing)
{
    Thing target = Context.Things.SingleOrDefault(a => a.Id == thing.Id);
    if (target == null)
        throw new CustomException($"No thing with ID {thing.Id}.");

    Context.Things.Update(thing);
    int result = Context.SaveChanges();
    return result == 1;
}
No, this doesn't work, because the entity is already being tracked. I have several options to handle that:
Change to Context.Things.Where(...).AsNoTracking().
Explicitly set the updated fields on target and save it.
Horse around with entity states and tamper with the change tracker.
Remove the present entity and add the new one.
I can't decide which is the best practice. Googling gave me the default examples that do not include the check for a pre-existing entity in the same operation.
The reason for the exception is that by loading the entity from the Context to check if it exists, you now have a tracked reference. When you go to update the detached reference, EF will complain that an instance is already tracked.
The simplest work-around would be:
public bool UpdateWithCheck(Thing thing)
{
    bool doesExist = Context.Things.Any(a => a.Id == thing.Id);
    if (!doesExist)
        throw new CustomException($"No thing with ID {thing.Id}.");

    Context.Things.Update(thing);
    int result = Context.SaveChanges();
    return result == 1;
}
However, there are two problems with this approach. Firstly, because we don't know the scope of the DbContext instance and cannot guarantee the order of method calls, it is possible that at some point that DbContext instance has already loaded and tracked that instance of the thing. This can manifest as seemingly intermittent errors. The proper way to guard against that would be something like:
public bool UpdateWithCheck(Thing thing)
{
    bool doesExist = Context.Things.Any(a => a.Id == thing.Id);
    if (!doesExist)
        throw new CustomException($"No thing with ID {thing.Id}.");

    Thing existing = Context.Things.Local.SingleOrDefault(a => a.Id == thing.Id);
    if (existing != null)
        Context.Entry(existing).State = EntityState.Detached;

    Context.Things.Update(thing);
    int result = Context.SaveChanges();
    return result == 1;
}
This checks the local tracking cache for any loaded instances and, if found, detaches them. The risk here is that any modifications that haven't been persisted in those tracked references will be discarded, and any references floating around that you would have assumed were attached will now be detached.
The second significant issue is with using Update(). When you have detached entities being passed around, there is a risk that data you don't intend to be updated gets updated. Update will replace all columns, where typically a client might only be expected to update a subset of them. EF can be configured to check row versions or timestamps on entities against the database before updating, when your database is set up to support them (such as snapshot isolation), which can help guard against stale overwrites but still allows unexpected tampering.
As you've already figured out, the better approach is to avoid passing detached entities around and instead use dedicated DTOs. This avoids potential confusion about which objects represent view/consumer state vs. data state. By explicitly copying the values across from the DTO to the entity, or configuring a mapper to copy supported values, you also protect your system from unexpected tampering and potential stale overwrites. One consideration with this approach is that you should guard updates against unconditionally overwriting data with potentially stale data by ensuring your entity and DTO have a RowVersion/Timestamp to compare. Before copying from the DTO to the freshly loaded entity, compare the versions; if they match, nothing has changed in the data row since you fetched and composed your DTO. If the version has changed, someone else has updated the underlying data row since the DTO was read, so your modifications are against stale data. From there, take an appropriate action: discard the changes, overwrite them, merge them, log the fact, etc.
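A rough sketch of that DTO flow (ThingDto, its properties, and the byte[] RowVersion column are illustrative assumptions, not from the original code):

```csharp
public bool UpdateFromDto(ThingDto dto)
{
    // Load the current, tracked entity; no Update() call needed.
    Thing target = Context.Things.SingleOrDefault(a => a.Id == dto.Id);
    if (target == null)
        throw new CustomException($"No thing with ID {dto.Id}.");

    // Concurrency guard: the DTO carries the RowVersion it was read with.
    if (!target.RowVersion.SequenceEqual(dto.RowVersion))
        throw new CustomException("The row was modified since the DTO was read.");

    // Copy only the fields the client is allowed to change;
    // SaveChanges generates the UPDATE from the change tracker.
    target.Name = dto.Name;
    target.Description = dto.Description;

    return Context.SaveChanges() == 1;
}
```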
Just alter the properties of target and call SaveChanges(); remove the Update call. I'd say the typical use case these days is for the input thing to not actually be a Thing but a ThingViewModel, ThingDto or some other variation on the theme of "an object that carries enough data to identify and update a Thing but isn't actually a DB entity". To that extent, if the notion of updating the properties of Thing from ThingViewModel by hand bores you, you can look at a mapper (AutoMapper is probably the best known, but there are many others) to do the copying for you, or even set you up with a new Thing if you decide to turn this method into an upsert.
I have a list of objects that I need to attach to the Context in order to track changes and save them afterwards, but the ForEach iterating over the items executes only the first iteration; after that, the method ends.
I'm certain that those items already exist in the database.
I have tried both calling the .Attach() method and setting the Entry state to Unchanged.
protected override Task SetViewModelPropertiesAsync()
{
    SelectedItems.ForEach(l =>
    {
        //Context.Pap_Pedido_ODP.Attach(l);
        Context.Entry(l).State = System.Data.Entity.EntityState.Unchanged;
        // After the first iteration the method ends
    });
    return base.SetViewModelPropertiesAsync();
}
I expect all the items to be attached to the context, but the first iteration of the loop breaks out of the method and execution continues elsewhere, without even an exception being thrown.
EDIT:
There is more code after the foreach that is being skipped when I do either the Attach or the EntityState assignment.
If I comment out both, the code executes correctly.
The behaviour does sound like an exception is being thrown and swallowed. This is IMO a huge red flag about List&lt;T&gt;.ForEach() and the main reason I never use it. If you alter your code to:
foreach (var item in SelectedItems)
{
    Context.Pap_Pedido_ODP.Attach(item);
    Context.Entry(item).State = System.Data.Entity.EntityState.Unchanged;
}
... you should at least now see the exception(s) that are blocking your code. Attaching/detaching entities between contexts is messy, and there are very, very few scenarios where I personally can justify it. You are dealing with references to an entity. This means that:
item must not already be associated to any other context.
Context must not already have another entity tracked with the same PK as item.
Point #1 will hinder you because any code returning an entity that "might" want to attach that entity to another context will need to detach, or otherwise load that entity AsNoTracking. Passing a tracked entity to your method from somewhere will break your code.
Point #2 will hinder you because even if the entity passed is detached, if your context happens to already know about that entity via another reference, you have to essentially discard the untracked entity and use the reference to the tracked instance. This means before attaching any entity you need to check the DbSet's .Local collection for a matching entity.
Only if the entity isn't tracked, and the context does not have a tracked entity with the same PK can you attach it.
If your code is not breaking on an exception and you are debugging, make sure your debug exception handling is set to break on all exceptions. Alternatively you can pop a try/catch block with a breakpoint in the catch to inspect the exception.
Edit: To check instances
foreach (var item in SelectedItems)
{
    if (Context.Pap_Pedido_ODP.Local.Contains(item))
    {
        // This exact instance is already associated to the Context.
        // We shouldn't need to copy anything across or do anything...
    }
    else
    {
        var existingItem = Context.Pap_Pedido_ODP.Local.SingleOrDefault(x => x.Id == item.Id);
        if (existingItem != null)
        {
            // A different instance matching this one already exists in the context.
            // If item represents changes, we would need to copy them across to existingItem...
        }
        else
        {
            // Item is not associated, safe to attach.
            Context.Pap_Pedido_ODP.Attach(item);
            // ...
        }
    }
}
Now it doesn't end there. If item contains references to other entities, each and every one will be attached automatically. This can cause problems if some of them have already been associated to the context, which can happen when the DbContext is too long-lived or when multiple copies of the same referenced entity are passed back.

For instance, say I have a set of Orders being saved, and Orders contain references to Customers, with 2 orders referencing the same customer. When I attach Order #1, its Customer #1 becomes associated to the context. When I try to attach Order #2, its instance of Customer #1 is a different object than the one referenced by Order #1, so attaching Order #2 will generate an error. When dealing with detached entities, you need to ensure that all objects in the graph that refer to the same record use the same object instance. When you loaded the data from EF, these would be the same reference, but if you feed them through a serializer/deserializer you get two identical copies as separate references. You cannot simply re-attach those object references.
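One way to mitigate the duplicate-reference problem is to rewrite each order's references to one canonical instance per key before attaching anything. A sketch, assuming Order/Customer shapes like the example above (the Id property and detachedOrders list are illustrative):

```csharp
// Replace duplicate detached Customer copies with a single
// canonical instance per primary key before attaching the orders.
var customersByKey = new Dictionary<int, Customer>();
foreach (var order in detachedOrders)
{
    if (customersByKey.TryGetValue(order.Customer.Id, out var canonical))
        order.Customer = canonical;                    // reuse the first instance seen
    else
        customersByKey[order.Customer.Id] = order.Customer;
}

foreach (var order in detachedOrders)
    Context.Orders.Attach(order);                      // now one Customer instance per key
```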
Unfortunately there's no really simple answer I can offer to make it easier, but this is why serializing and deserializing entities can be a terrible idea, and even detaching/attaching instances can be a pain.
I am new to EF 4.1. I am trying to add a list of objects to a SQL table using EF and the DB-first approach.
1) Should I be calling context.SaveChanges() inside the loop after AddObject(Trans), or is it okay to call it outside the loop?
2) Also, would all transactions be rolled back in case there is an error?
I have read that Entity Framework already supports native database transactions: if any dirty ObjectStateEntry object cannot be persisted, then ObjectContext.SaveChanges will roll back the transaction and throw an exception.
3) What else can be done to increase performance? I will be inserting around 1200 records in one go.
try
{
    using (TransactionModel context = new TransactionModel())
    {
        try
        {
            foreach (var item in lstUnAllocatedTransaction)
            {
                Transaction Trans = new Transaction();
                Trans.Amount = item.Amount;
                Trans.Date = DateTime.Now;
                Trans.DateAllocated = null;
                context.Transactions.AddObject(Trans);
            }
            context.SaveChanges();
        }
        catch (Exception ex)
        {
        }
    }
}
catch (Exception ex)
{
}
1) Should i be calling context.SaveChanges() inside for loop after AddObject(Trans) or its okay to call it outside the loop?
It's okay (and preferable) to call it outside the loop if it's acceptable for all the adds to form one transaction. If you need one transaction per item, then obviously you need to call it inside the loop.
2) Also, would all transactions be roll back in case there is an error? I have read the Entity Framework already supports native database transactions, If any dirty ObjectStateEntry object cannot be persisted, then ObjectContext.SaveChanges will roll back the transaction and throw an exception.
Only the current transaction will be rolled back. That's why 1) makes a difference: if you want to insert all-or-nothing, you need to add everything and then call SaveChanges once.
3) What else can be done to increase the performance? I will be inserting around 1200 records in one go.
You could call AddRange instead of Add; otherwise I never had problems with a small number of records. If you think it's too slow, look into bulk inserts. There is nothing built into EF to do that; you will need an extension.
Sometimes I am importing 5000+ rows (of different entity types) into my DB. I use Entity Framework, and I want the context to save the rows that fit my model; invalid ones should be discarded.
I am inside a transaction, so I need to decide at the end of the iteration whether to Complete() the transaction or Dispose() it; it depends on which kind of entity throws the exception.
I think there are 2 possibilities:
Check if an entity object is valid before context.Add(object).
Add the 5000 items and somehow make context.SaveChanges() save the valid ones and discard the invalid ones (right now it throws an exception and discards all added items).
I am asking for a snippet of code, or some indications I can follow to solve this situation. I only want to import the valid items and generate a log of the invalid ones.
Thanks. I will mark as answer if you solve it.
Option 1 is the more sensible approach. Adding 5000 objects and then saving changes is very non-performant. A better approach is
while (null != (entity = GetNextEntity()))
{
    if (entity.IsValid())
    {
        context.Add(entity);
        context.SaveChanges();
    }
}
Update -- Ignore SQL errors
while (null != (entity = GetNextEntity()))
{
    try
    {
        context.Add(entity);
        context.SaveChanges();
    }
    catch (Exception) { /* eat it */ }
}
Do I have to refresh a LINQ object that represents a view/table row after I do a SubmitChanges operation that may alter it?
Let's say I have
Event _event = new Event
{
    someproperty = "this"
};
DataContext.Events.InsertOnSubmit(_event);
DataContext.SubmitChanges();
//some operations
_event.someproperty = "that";
DataContext.SubmitChanges();
Let's add to the equation that during some operations, a different thread, with a different instance of DataContext, might change the same _event and submit it.
I sometimes get an exception saying the row cannot be found or has changed. Is there a way to get around this without me having to reselect the _event?
I suppose you use optimistic concurrency with LINQ to SQL. That means that when updating an object in the DB, the update statement checks that the row was not changed in the meantime.
If another process changed the object in the DB between your initial read (or creation) of the object and your update, Linq will throw a ChangeConflictException. You can catch this Exception and handle the conflict (by reloading the object from the DB or by overwriting the DB values).
More details here: http://msdn.microsoft.com/en-us/library/bb399373%28v=VS.90%29.aspx
Your code could look something like:
try
{
    _event.someproperty = "that";
    DataContext.SubmitChanges();
}
catch (ChangeConflictException e)
{
    foreach (ObjectChangeConflict occ in DataContext.ChangeConflicts)
    {
        // All database values overwrite current values.
        occ.Resolve(RefreshMode.OverwriteCurrentValues);
    }
}
There are several refresh modes and it really depends on how you want your application to behave.
Please find more details in these answers:
LINQ to SQL: Concurrency resolution
If the entities are in the same DataContext you don't, but if you have additional DataContexts accessing the data (e.g. from other processes) then you need to be aware of it and refresh before you try to commit.
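With LINQ to SQL that explicit refresh can be done through DataContext.Refresh, which reloads the row's current database values into the tracked object. A minimal sketch against the question's code:

```csharp
// Re-sync _event with the database before saving; KeepChanges reloads
// the other process's committed values but preserves our pending edits.
DataContext.Refresh(RefreshMode.KeepChanges, _event);
_event.someproperty = "that";
DataContext.SubmitChanges();
```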