Is it possible to update objects with Entity Framework, without grabbing them first?
Example: here I have a function that takes a primary key, pulls the matching objects, and then updates them. I would like to eliminate having to pull the objects first and simply run an UPDATE query, so that no SELECT is generated.
public async Task<int> UpdateChecks(long? acctId, string payorname, string checkaccountnumber, string checkroutingnumber, string checkaccounttype)
{
using (var max = new Max(_max.ConnectionString))
{
var payments = await max.payments
.Where(w => w.maindatabaseid == acctId &&
(w.paymentstatus == "PENDING" || w.paymentstatus == "HOLD"))
.ToListAsync();
payments.AsParallel().ForAll(payment =>
{
payment.payorname = payorname;
payment.checkaccountnumber = checkaccountnumber;
payment.checkroutingnumber = checkroutingnumber;
payment.checkaccounttype = checkaccounttype;
payment.paymentmethod = "CHECK";
payment.paymentstatus = "HOLD";
});
await max.SaveChangesAsync();
return payments.Count;
}
}
You can use the Attach() method to attach an entity you already know exists; calling SaveChanges() will then issue the appropriate update. Here is some sample code from the MSDN article on the subject:
var existingBlog = new Blog { BlogId = 1, Name = "ADO.NET Blog" };
using (var context = new BloggingContext())
{
context.Entry(existingBlog).State = EntityState.Unchanged;
// Do some more work...
context.SaveChanges();
}
Note that this is general EF logic, not related to any specific database implementation.
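Adapting that idea to the original method, a minimal sketch might look like the following. It assumes you already know each payment's primary key (the entity type payment and a key property named paymentid are guesses based on the question's code); attaching a stub cannot replace the WHERE clause on paymentstatus, so the keys have to come from somewhere else. Properties changed after Attach() are picked up by change detection at SaveChanges and become the UPDATE's SET list, with no SELECT issued. Note that if the entity has other required properties, EF validation may still complain about the unset ones.
public async Task<int> UpdateChecksById(IEnumerable<long> paymentIds, string payorname,
    string checkaccountnumber, string checkroutingnumber, string checkaccounttype)
{
    using (var max = new Max(_max.ConnectionString))
    {
        foreach (var id in paymentIds)
        {
            // Stub entity: only the key is set, nothing is loaded from the database.
            var stub = new payment { paymentid = id };   // "paymentid" is an assumed key name
            max.payments.Attach(stub);                   // tracked as Unchanged, no SELECT

            stub.payorname = payorname;                  // changes made after Attach are
            stub.checkaccountnumber = checkaccountnumber;// detected at SaveChanges and
            stub.checkroutingnumber = checkroutingnumber;// turned into an UPDATE
            stub.checkaccounttype = checkaccounttype;
            stub.paymentmethod = "CHECK";
            stub.paymentstatus = "HOLD";
        }

        return await max.SaveChangesAsync();             // one UPDATE per attached stub
    }
}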
Sometimes the entity is not updated within scope, but inserts are committed.
I suspect the problem is related to the isolation level, combined with the number of queries involved. It might also be the order in which the changes actually take effect in the database, since the endpoint calls that modify the entity happen very quickly, even if they arrive in the correct order. I don't know...
The code is like:
using (var scope = new TransactionScope())
{
var db = new Context();
db.Database.ExecuteSqlCommand("SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;");
var student = db.Students.SingleOrDefault(x => x.ID == 1);
var schoolName = "TestSchool";
db.School.Add(new School { Name = schoolName }); // IT ALWAYS WORKS
student.School = schoolName; // SOMETIMES THIS CHANGE DOESN'T WORK
db.SaveChanges();
scope.Complete();
}
Any help?
It only occurs with large volumes of data, which makes it difficult to test.
What you should do is query the school first, then assign the result to student.School
Something like this:
using (var scope = new TransactionScope())
{
var db = new Context();
db.Database.ExecuteSqlCommand("SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;");
var student = db.Students.SingleOrDefault(x => x.ID == 1);
var schoolName = "TestSchool";
db.School.Add(new School { Name = schoolName }); // IT ALWAYS WORKS
// Query the new school here
var newSchool = db.School.SingleOrDefault(x => x.SchoolName == schoolName);
// Assign the result to the student's School property
student.School = newSchool;
db.SaveChanges();
scope.Complete();
}
This way, Entity Framework would be able to "track" the changes and generate the correct SQL to update the database
I am writing many (20+) parent-child datasets to the database, and EF requires me to call SaveChanges between each set; otherwise it complains that it cannot figure out the primary key. Can the data be flushed to SQL Server so that EF gets the primary keys back from the identity columns, with SaveChanges being sent only once at the end of writing all of the changes?
foreach (var itemCount in itemCounts)
{
var addItemTracking = new ItemTracking
{
availabilityStatusID = availabilityStatusId,
itemBatchId = itemCount.ItemBatchId,
locationID = locationId,
serialNumber = serialNumber,
trackingQuantityOnHand = itemCount.CycleQuantity
};
_context.ItemTrackings.Add(addItemTracking);
_context.SaveChanges();
var addInventoryTransaction = new InventoryTransaction
{
activityHistoryID = newInventoryTransaction.activityHistoryID,
itemTrackingID = addItemTracking.ItemTrackingID,
personID = newInventoryTransaction.personID,
usageTransactionTypeId = newInventoryTransaction.usageTransactionTypeId,
transactionDate = newInventoryTransaction.transactionDate,
usageQuantity = usageMultiplier * itemCount.CycleQuantity
};
_context.InventoryTransactions.Add(addInventoryTransaction);
_context.SaveChanges();
}
I would like to do my SaveChanges just once at the end of the big loop.
You don't need to save changes every time if you use object references to the newly created objects rather than IDs:
var addItemTracking = new ItemTracking
{
...
}
_context.ItemTrackings.Add(addItemTracking);
var addInventoryTransaction = new InventoryTransaction
{
itemTracking = addItemTracking,
...
};
_context.InventoryTransactions.Add(addInventoryTransaction);
...
_context.SaveChanges();
Since they're all new items, rather than
itemTrackingID = addItemTracking.ItemTrackingID,
you could go with
addItemTracking.InventoryTransaction = addInventoryTransaction;
(or whatever the associated navigation property is) and pull the _context.SaveChanges() out of the loop entirely, as shown in the sketch below. Entity Framework is very good at inserting object graphs when everything is new. When saving object graphs containing both new and existing items, setting the associated ID is safer.
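A minimal sketch of what the loop might look like with a navigation property and a single SaveChanges at the end (the property names come from the question; the InventoryTransaction.ItemTracking navigation property is an assumption):
foreach (var itemCount in itemCounts)
{
    var addItemTracking = new ItemTracking
    {
        availabilityStatusID = availabilityStatusId,
        itemBatchId = itemCount.ItemBatchId,
        locationID = locationId,
        serialNumber = serialNumber,
        trackingQuantityOnHand = itemCount.CycleQuantity
    };
    _context.ItemTrackings.Add(addItemTracking);

    var addInventoryTransaction = new InventoryTransaction
    {
        activityHistoryID = newInventoryTransaction.activityHistoryID,
        // Set the navigation property instead of the identity value so EF can
        // fix up the foreign key itself when it inserts both rows.
        ItemTracking = addItemTracking,
        personID = newInventoryTransaction.personID,
        usageTransactionTypeId = newInventoryTransaction.usageTransactionTypeId,
        transactionDate = newInventoryTransaction.transactionDate,
        usageQuantity = usageMultiplier * itemCount.CycleQuantity
    };
    _context.InventoryTransactions.Add(addInventoryTransaction);
}

// One call at the end: EF orders the INSERTs so the identity values flow
// from ItemTracking to InventoryTransaction.
_context.SaveChanges();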
How about:
var trackingItems = itemCounts
.Select(i => new ItemTracking
{
availabilityStatusID = availabilityStatusId,
itemBatchId = i.ItemBatchId,
locationID = locationId,
serialNumber = serialNumber,
trackingQuantityOnHand = i.CycleQuantity
}).ToList(); // materialize once, so the same instances keep their generated IDs
_context.ItemTrackings.AddRange(trackingItems);
_context.SaveChanges();
var inventoryTransactions = trackingItems
.Select(t => new InventoryTransaction
{
activityHistoryID = newInventoryTransaction.activityHistoryID,
itemTrackingID = t.ItemTrackingID,
personID = newInventoryTransaction.personID,
usageTransactionTypeId = newInventoryTransaction.usageTransactionTypeId,
transactionDate = newInventoryTransaction.transactionDate,
usageQuantity = usageMultiplier * t.trackingQuantityOnHand
});
_context.InventoryTransactions.AddRange(inventoryTransactions);
_context.SaveChanges();
However, I haven't worked with EF for quite a while and the above code was written in Notepad, so I cannot vouch for it.
I am using EF 6.0.2 and trying to update only the Status field of an entity.
var drama = new Drama { Id = id };
using (var ctx = new DataContext()) {
ctx.Dramas.Attach(drama);
drama.Status = state;
ctx.SaveChanges();
}
This throws a ValidationException: The ClassName field is required.
The entity already exists, and is valid (including having a ClassName). Id is the entity key.
What's going on here to cause the exception to be thrown?
You're saying there already is an entity with the given Id in the database and you'd only like to change its Status value?
That's not what you're doing in your code. You have created a new entity with the given Id and set only its Status value. That's why it's throwing a ValidationException: the ClassName field is empty unless you set it in the Drama constructor.
To modify an existing entity you should first retrieve it from the database, then modify it and save the changes:
using (var ctx = new DataContext()) {
var drama = ctx.Dramas.Single(d => d.Id == id);
drama.Status = state;
ctx.SaveChanges();
}
EDIT:
If you want to simulate a scenario of editing a detached entity, you still need its values to be valid. Try fulfilling all the validation requirements before attaching the entity (i.e. fill in the ClassName), then Attach it, update Status and SaveChanges:
var drama = new Drama { Id = id, ClassName = "Dummy" };
using (var ctx = new DataContext()) {
ctx.Dramas.Attach(drama);
drama.Status = state;
ctx.SaveChanges();
}
When you do edits in EF, you should fetch the entire record that you are editing and then set whatever fields you want to update.
using (var ctx = new DataContext()) {
var drama = ctx.Dramas.Single(o => o.Id == id);
drama.Status = state;
// the queried entity is already tracked, so no Attach call is needed
ctx.SaveChanges();
}
Hope this helps
I have a query that looks like this:
using (MyDC TheDC = new MyDC())
{
foreach (MyObject TheObject in TheListOfMyObjects)
{
DBTable TheTable = new DBTable();
TheTable.Prop1 = TheObject.Prop1;
.....
TheDC.DBTables.InsertOnSubmit(TheTable);
}
TheDC.SubmitChanges();
}
This query basically inserts a list into the database using linq-to-sql. Now I've read online that L2S does NOT support bulk operations.
Does my query insert the elements one at a time, or all of them in one write?
Thanks for the clarification.
I modified the code from the following link to be more efficient and used it in my application. It is quite convenient because you can just put it in a partial class on top of your current autogenerated class. Instead of InsertOnSubmit, add entities to a list, and instead of SubmitChanges, call YourDataContext.BulkInsertAll(list); a usage sketch follows the implementation below.
http://www.codeproject.com/Tips/297582/Using-bulk-insert-with-your-linq-to-sql-datacontex
partial void OnCreated()
{
CommandTimeout = 5 * 60;
}
public void BulkInsertAll<T>(IEnumerable<T> entities)
{
using( var conn = new SqlConnection(Connection.ConnectionString))
{
conn.Open();
Type t = typeof(T);
var tableAttribute = (TableAttribute)t.GetCustomAttributes(
typeof(TableAttribute), false).Single();
var bulkCopy = new SqlBulkCopy(conn)
{
DestinationTableName = tableAttribute.Name
};
var properties = t.GetProperties().Where(EventTypeFilter).ToArray();
var table = new DataTable();
foreach (var property in properties)
{
Type propertyType = property.PropertyType;
if (propertyType.IsGenericType &&
propertyType.GetGenericTypeDefinition() == typeof(Nullable<>))
{
propertyType = Nullable.GetUnderlyingType(propertyType);
}
table.Columns.Add(new DataColumn(property.Name, propertyType));
}
foreach (var entity in entities)
{
table.Rows.Add(
properties.Select(
property => property.GetValue(entity, null) ?? DBNull.Value
).ToArray());
}
bulkCopy.WriteToServer(table);
}
}
private bool EventTypeFilter(System.Reflection.PropertyInfo p)
{
var attribute = Attribute.GetCustomAttribute(p,
typeof(AssociationAttribute)) as AssociationAttribute;
if (attribute == null) return true;
if (attribute.IsForeignKey == false) return true;
return false;
}
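For example, usage could look roughly like this (a sketch only; MyDataContext, the Customer entity and its Name column are placeholder names, and BulkInsertAll is the partial-class method shown above):
// Collect the new rows in a plain list instead of calling InsertOnSubmit.
var customers = new List<Customer>();
for (int i = 0; i < 100000; i++)
{
    customers.Add(new Customer { Name = "Customer " + i });
}

using (var db = new MyDataContext())
{
    // One SqlBulkCopy round trip instead of SubmitChanges.
    db.BulkInsertAll(customers);
}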
The term Bulk Insert usually refers to the SQL Server specific ultra fast bcp based SqlBulkCopy implementation. It is built on top of IRowsetFastLoad.
Linq-2-SQL does not implement insert using this mechanism, under any conditions.
If you need to bulk load data into SQL Server and need it to be fast, I would recommend hand coding using SqlBulkCopy.
Linq-2-SQL will attempt to perform some optimisations to speed up multiple inserts, however it still will fall short of many micro ORMs (even though no micro ORMs I know of implement SqlBulkCopy)
It will generate a single insert statement for every record, but will send them all to the server in a single batch and run in a single transaction.
That is what the SubmitChanges() outside the loop does.
If you moved it inside, then every iteration through the loop would go off to the server for the INSERT and run in its own transaction.
I don't believe there is any way to fire off a SQL BULK INSERT.
LINQ Single Insert from List:
int i = 0;
foreach (IPAPM_SRVC_NTTN_NODE_MAP item in ipapmList)
{
++i;
if (i % 50 == 0)
{
ipdb.Dispose();
ipdb = null;
ipdb = new IPDB();
// .NET CORE
//ipdb.ChangeTracker.AutoDetectChangesEnabled = false;
ipdb.Configuration.AutoDetectChangesEnabled = false;
}
ipdb.IPAPM_SRVC_NTTN_NODE_MAP.Add(item);
ipdb.SaveChanges();
}
I would suggest you take a look at N.EntityFramework.Extensions. It is a basic bulk extension framework for EF 6 that is available on NuGet, and the source code is available on GitHub under the MIT license.
Install-Package N.EntityFramework.Extensions
https://www.nuget.org/packages/N.EntityFramework.Extensions
Once you install it you can simply use the BulkInsert() method directly on the DbContext instance. It supports BulkDelete, BulkInsert, BulkMerge and more.
BulkInsert()
var dbcontext = new MyDbContext();
var orders = new List<Order>();
for(int i=0; i<10000; i++)
{
orders.Add(new Order { OrderDate = DateTime.UtcNow, TotalPrice = 2.99 });
}
dbcontext.BulkInsert(orders);
using (EntitiesContainer db = new EntitiesContainer())
{
Language language = db.Languages.SingleOrDefault(x => x.Culture == _session.Language);
Language language2 = new Language { Id = action.Language.Id };
Operation operation = new Operation { Id = action.Operation.Id };
//Operation operation = db.Operations.SingleOrDefault(x => x.Id == action.Operation.Id);
if (!language.Id.Equals(language2.Id))
{
db.Languages.Attach(language2);
action.Language = language2;
}
else
{
action.Language = language;
}
db.Operations.Attach(operation);
//db.ObjectStateManager.ChangeObjectState(operation, System.Data.EntityState.Unchanged);
action.Operation = operation;
//operation.Internals.Add(action);
action.CurrentDetail.Language = language;
action.CurrentDetail.Id = Guid.NewGuid();
action.Id = Guid.NewGuid();
db.SaveChanges();
}
Hello, I have tried all of the scenarios shown in the comments above to link my existing Operation to the internal action (which inherits from Action), but in every scenario it throws an error as if it wants to insert a new Operation into the DB (dbo.Operations can accept a NULL "Action" value), even though the entity already exists. Can someone please give me the golden rule for inserting an entity with relations (existing or not) in EF? It's driving me crazy!
Cordially,
Julien.
If I understand you correctly (you're trying to update a relationship), you can use the stub technique:
Language lang = new Language { Id = action.language_id }; // create stub with ID
db.Languages.Attach(lang); // attach stub to graph
action.language = lang; // update relationship
db.SaveChanges();
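Applied to the question's code, a sketch of the same idea for both relationships might look like this (property names are taken from the question; whether the Operation row really exists already, and how the action entity itself gets added to the context, are assumptions):
using (EntitiesContainer db = new EntitiesContainer())
{
    // Stubs for rows that already exist in the database: only the key is set.
    var language = new Language { Id = action.Language.Id };
    var operation = new Operation { Id = action.Operation.Id };

    db.Languages.Attach(language);    // tell EF these rows already exist,
    db.Operations.Attach(operation);  // so it will not try to INSERT them

    action.Language = language;       // only the relationships are updated
    action.Operation = operation;

    db.SaveChanges();
}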