EntityDbContext SaveChanges: What are the differences between these two code samples? - c#

In terms of performance, exception handling, and so on, what are the differences between these two pieces of code?
int count = 0;
foreach (var record in SomeDbEntityList)
{
    count++;
    dbContext.SomeDbEntity.Add(record);
    if (count >= 500)
    {
        dbContext.SaveChanges();
        count = 0;
    }
}
// Leftover (fewer than 500) records
if (count > 0)
    dbContext.SaveChanges();
The direct code:
foreach (var record in SomeDbEntityList)
{
    dbContext.SomeDbEntity.Add(record);
    dbContext.SaveChanges();
}

The first code saves the pending changes to the DB after every 500 (or fewer) records are added, so up to 500 inserts are sent to the DB at the same time.
Advantage: you don't make a round-trip to the DB for every record.
Disadvantage: if any record in a batch has an error, none of the records in that batch will be saved.
The second code saves each record to the DB individually.
Please take a look at the Unit of Work design pattern.
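If atomicity matters, the batched version can be wrapped in an explicit transaction so that a failure in any batch rolls everything back. Below is a minimal sketch, assuming EF6 and the same dbContext/SomeDbEntityList names as the question; the transaction handling is an illustrative addition, not the asker's code:
using (var transaction = dbContext.Database.BeginTransaction())
{
    try
    {
        int count = 0;
        foreach (var record in SomeDbEntityList)
        {
            dbContext.SomeDbEntity.Add(record);
            if (++count % 500 == 0)
                dbContext.SaveChanges();   // flush a batch of 500 inserts
        }
        dbContext.SaveChanges();           // flush the leftover batch
        transaction.Commit();              // all batches succeed or none do
    }
    catch
    {
        transaction.Rollback();            // one bad record undoes everything
        throw;
    }
}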

Related

Entity Framework context retaining local data after being re-created

I am trying to do large inserts using Entity Framework, so to reduce context bloat I am recreating the context every x loops. However, after recreating the context, the data from the previous context is still present in the new context's local data.
I have tried calling Dispose() on the context and setting it to null before recreating it.
I found this question asked before where someone seems to have the same issue, but there was no solution found. Entity Framework Context - Recreating Object keeps old object in memory
ApplicationDbContext context = new ApplicationDbContext();
context.Configuration.AutoDetectChangesEnabled = false;
for (int i = 0; i < products.Count; i++)
{
    context.Product.Add(products[i]);
    if (i % 50 == 0)
    {
        context.SaveChanges();
        context.Dispose();
        context = null;
        context = new ApplicationDbContext();
        context.Configuration.AutoDetectChangesEnabled = false;
    }
}
context.SaveChanges();
(Screenshot: the context.Product.Local contents)
For clarification, this is what I mean when I say the local data is not cleared: the local data for Products has 151 entries, despite the context being recreated every 50 entries. This has led to products being added multiple times to the database.
Just to clarify what I am asking: how can I recreate the context so that it does not have any data left over from the previous context? Also, if anyone could explain what causes the data to be retained in the first place, that would be nice to understand.
Edit:
I have tried restructuring the code as suggested by Gert Arnold in the comments.
Here is the current version.
public void BatchAddProducts(List<ProductModel> products)
{
    // Note: divide as double before Ceiling, otherwise integer division truncates first
    int loops = (int)Math.Ceiling(products.Count / 50.0);
    List<ProductModel> sublist = new List<ProductModel>();
    for (int i = 0; i < loops; i++)
    {
        int toPull;
        if ((i * 50 + 50) > products.Count)
        {
            toPull = products.Count - (i * 50);
        }
        else
        {
            toPull = 50;
        }
        sublist = products.GetRange(i * 50, toPull);
        ProductAdd(sublist);
    }
}
private void ProductAdd(List<ProductModel> products)
{
    using (ApplicationDbContext context = new ApplicationDbContext())
    {
        context.Product.Local.Clear();
        context.Product.AddRange(products);
        context.SaveChanges();
    }
}
This still has the same issue. The context retains information from the instance that should no longer exist, and entries that were added in the previous loop are added to the database again; i.e., after 2 iterations 150 entries are added to the database, and after 3 there are 300 new entries in the database.
In response to the answer by Steve Py: I am checking the context.Product.Local contents just before SaveChanges() is called. The duplicate data I mention is duplicates of the products that are in the context. To refer to my earlier example, after 3 loops there are 3 copies of a product from the 1st iteration, 2 from the 2nd, and 1 from the 3rd. I don't believe it is an issue with how the product list is being generated. That was my first thought when I encountered this, and I checked that code. This code is being written as an upgrade to improve large inserts, and the older code that generates the products worked fine with the old update system. Also, the list being passed to the function only has ~8000 products, so it can't be that as far as I can tell.
I suspect that you may be encountering a different issue, based on the statement "this has led to the products being added multiple times to the database", and that your process of verifying the local cache is itself loading the DbSet into memory.
I have tested the disposed DbContext behaviour around .Local, and it behaves as expected: the previously loaded entities are no longer referenced.
[Test]
public void TestLocalReset()
{
    var context = new TestDbContext();
    for (int count = 1; count <= 100; count++)
    {
        var note = new Note { Text = $"Note #{count}." };
        context.Notes.Add(note);
        context.SaveChanges();
        if (count % 10 == 0)
        {
            int test = context.Notes.Local.Count; // <- Always 10
            context.Dispose();
            context = new TestDbContext();
            test = context.Notes.Local.Count; // <- Always 0
        }
    }
    context.Dispose();
}
Your example does not show how or where you are checking the .Local instance, but any code that queries against your DbSet could load the entire result set into memory and make it show up in .Local.
As for the duplicate data: are you getting duplicate Products, or do you have other referenced entities beneath the Product that are being duplicated? How are you building/populating these Products in your list? References that were loaded by the outermost DbContext and are then referenced by the new products will not be tracked when you recreate the DbContext, potentially resulting in duplicate related entities being inserted.
You may need to update your question to show the complete code you use to create and insert the Products, as it likely contains details explaining the behaviour you are seeing.
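If related entities turn out to be the source of the duplicates, re-attaching them to each fresh context is the usual remedy. Below is a hypothetical sketch: the Category navigation property and Categories DbSet are invented for illustration, since the question only shows Product:
using (var context = new ApplicationDbContext())
{
    foreach (var product in batch) // batch: one 50-item sublist, as in the question
    {
        // Attach pre-existing related entities so the new context treats them
        // as Unchanged rows instead of inserting them again.
        if (product.Category != null)
            context.Categories.Attach(product.Category);
        context.Product.Add(product);
    }
    context.SaveChanges();
}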

Try to fix SQL Exception: A new transaction is not allowed because there are other threads running in the session [duplicate]

I am currently getting this error:
System.Data.SqlClient.SqlException: New transaction is not allowed because there are other threads running in the session.
while running this code:
public class ProductManager : IProductManager
{
    #region Declare Models
    private RivWorks.Model.Negotiation.RIV_Entities _dbRiv = RivWorks.Model.Stores.RivEntities(AppSettings.RivWorkEntities_connString);
    private RivWorks.Model.NegotiationAutos.RivFeedsEntities _dbFeed = RivWorks.Model.Stores.FeedEntities(AppSettings.FeedAutosEntities_connString);
    #endregion

    public IProduct GetProductById(Guid productId)
    {
        // Do a quick sync of the feeds...
        SyncFeeds();
        ...
        // get a product...
        ...
        return product;
    }

    private void SyncFeeds()
    {
        bool found = false;
        string feedSource = "AUTO";
        switch (feedSource) // companyFeedDetail.FeedSourceTable.ToUpper())
        {
            case "AUTO":
                var clientList = from a in _dbFeed.Client.Include("Auto") select a;
                foreach (RivWorks.Model.NegotiationAutos.Client client in clientList)
                {
                    var companyFeedDetailList = from a in _dbRiv.AutoNegotiationDetails where a.ClientID == client.ClientID select a;
                    foreach (RivWorks.Model.Negotiation.AutoNegotiationDetails companyFeedDetail in companyFeedDetailList)
                    {
                        if (companyFeedDetail.FeedSourceTable.ToUpper() == "AUTO")
                        {
                            var company = (from a in _dbRiv.Company.Include("Product") where a.CompanyId == companyFeedDetail.CompanyId select a).First();
                            foreach (RivWorks.Model.NegotiationAutos.Auto sourceProduct in client.Auto)
                            {
                                foreach (RivWorks.Model.Negotiation.Product targetProduct in company.Product)
                                {
                                    if (targetProduct.alternateProductID == sourceProduct.AutoID)
                                    {
                                        found = true;
                                        break;
                                    }
                                }
                                if (!found)
                                {
                                    var newProduct = new RivWorks.Model.Negotiation.Product();
                                    newProduct.alternateProductID = sourceProduct.AutoID;
                                    newProduct.isFromFeed = true;
                                    newProduct.isDeleted = false;
                                    newProduct.SKU = sourceProduct.StockNumber;
                                    company.Product.Add(newProduct);
                                }
                            }
                            _dbRiv.SaveChanges(); // ### THIS BREAKS ### //
                        }
                    }
                }
                break;
        }
    }
}
Model #1 - This model sits in a database on our Dev Server.
(Diagram: http://content.screencast.com/users/Keith.Barrows/folders/Jing/media/bdb2b000-6e60-4af0-a7a1-2bb6b05d8bc1/Model1.png)
Model #2 - This model sits in a database on our Prod Server and is updated each day by automatic feeds. (Diagram: http://content.screencast.com/users/Keith.Barrows/folders/Jing/media/4260259f-bce6-43d5-9d2a-017bd9a980d4/Model2.png)
Note - The red circled items in Model #1 are the fields I use to "map" to Model #2. Please ignore the red circles in Model #2: that is from another question I had which is now answered.
Note: I still need to put in an isDeleted check so I can soft delete it from DB1 if it has gone out of our client's inventory.
All I want to do, with this particular code, is connect a company in DB1 with a client in DB2, get their product list from DB2, and INSERT it into DB1 if it is not already there. The first time through should be a full pull of inventory. Each time it is run thereafter, nothing should happen unless new inventory came in on the feed overnight.
So the big question: how do I solve the transaction error I am getting? Do I need to drop and recreate my context each time through the loops (that does not make sense to me)?
After much pulling out of hair, I discovered that the foreach loops were the culprits. What needs to happen is to call EF, but return the results into an IList<T> of the target type, then loop over the IList<T>.
Example:
IList<Client> clientList = (from a in _dbFeed.Client.Include("Auto") select a).ToList();
foreach (RivWorks.Model.NegotiationAutos.Client client in clientList)
{
    var companyFeedDetailList = from a in _dbRiv.AutoNegotiationDetails where a.ClientID == client.ClientID select a;
    // ...
}
As you've already identified, you cannot save from within a foreach that is still drawing from the database via an active reader.
Calling ToList() or ToArray() is fine for small data sets, but when you have thousands of rows, you will be consuming a large amount of memory.
It's better to load the rows in chunks.
public static class EntityFrameworkUtil
{
    public static IEnumerable<T> QueryInChunksOf<T>(this IQueryable<T> queryable, int chunkSize)
    {
        return queryable.QueryChunksOfSize(chunkSize).SelectMany(chunk => chunk);
    }

    public static IEnumerable<T[]> QueryChunksOfSize<T>(this IQueryable<T> queryable, int chunkSize)
    {
        int chunkNumber = 0;
        while (true)
        {
            var query = (chunkNumber == 0)
                ? queryable
                : queryable.Skip(chunkNumber * chunkSize);
            var chunk = query.Take(chunkSize).ToArray();
            if (chunk.Length == 0)
                yield break;
            yield return chunk;
            chunkNumber++;
        }
    }
}
Given the above extension methods, you can write your query like this:
foreach (var client in clientList.OrderBy(c => c.Id).QueryInChunksOf(100))
{
    // do stuff
    context.SaveChanges();
}
The queryable object you call this method on must be ordered. This is because Entity Framework only supports IQueryable<T>.Skip(int) on ordered queries, which makes sense when you consider that multiple queries for different ranges require the ordering to be stable. If the ordering isn't important to you, just order by primary key as that's likely to have a clustered index.
This version will query the database in batches of 100. Note that SaveChanges() is called for each entity.
If you want to improve your throughput dramatically, you should call SaveChanges() less frequently. Use code like this instead:
foreach (var chunk in clientList.OrderBy(c => c.Id).QueryChunksOfSize(100))
{
    foreach (var client in chunk)
    {
        // do stuff
    }
    context.SaveChanges();
}
This results in 100 times fewer database update calls. Of course each of those calls takes longer to complete, but you still come out way ahead in the end. Your mileage may vary, but this was worlds faster for me.
And it gets around the exception you were seeing.
EDIT: I revisited this question after running SQL Profiler and updated a few things to improve performance. For anyone who is interested, here is some sample SQL showing what is sent to the DB.
The first loop doesn't need to skip anything, so it is simpler.
SELECT TOP (100) -- the chunk size
    [Extent1].[Id] AS [Id],
    [Extent1].[Name] AS [Name]
FROM [dbo].[Clients] AS [Extent1]
ORDER BY [Extent1].[Id] ASC
Subsequent calls need to skip previous chunks of results, so they introduce the use of row_number:
SELECT TOP (100) -- the chunk size
    [Extent1].[Id] AS [Id],
    [Extent1].[Name] AS [Name]
FROM (
    SELECT [Extent1].[Id] AS [Id], [Extent1].[Name] AS [Name], row_number()
        OVER (ORDER BY [Extent1].[Id] ASC) AS [row_number]
    FROM [dbo].[Clients] AS [Extent1]
) AS [Extent1]
WHERE [Extent1].[row_number] > 100 -- the number of rows to skip
ORDER BY [Extent1].[Id] ASC
We have now posted an official response to the bug opened on Connect. The workarounds we recommend are as follows:
This error is due to Entity Framework creating an implicit transaction during the SaveChanges() call. The best way to work around the error is to use a different pattern (i.e., not saving while in the midst of reading) or by explicitly declaring a transaction. Here are three possible solutions:
// 1: Save after iteration (recommended approach in most cases)
using (var context = new MyContext())
{
    foreach (var person in context.People)
    {
        // Change to person
    }
    context.SaveChanges();
}

// 2: Declare an explicit transaction
using (var transaction = new TransactionScope())
{
    using (var context = new MyContext())
    {
        foreach (var person in context.People)
        {
            // Change to person
            context.SaveChanges();
        }
    }
    transaction.Complete();
}

// 3: Read rows ahead (dangerous!)
using (var context = new MyContext())
{
    var people = context.People.ToList(); // Note that this forces the database
                                          // to evaluate the query immediately
                                          // and could be very bad for large tables.
    foreach (var person in people)
    {
        // Change to person
        context.SaveChanges();
    }
}
Indeed, you cannot save changes inside a foreach loop in C# with Entity Framework while the loop is still reading from the database.
The context.SaveChanges() method acts like a commit on a regular relational database system (RDBMS).
Just make all your changes (which Entity Framework will cache) and then save them all at once by calling SaveChanges() after the loop (outside of it), like a database commit command.
This works if you can save all changes at once: put context.SaveChanges() after the end of your foreach loop, convert your queryable lists with .ToList(), and it should work fine.
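Put together, a minimal sketch of that advice (the People DbSet and LastSeen property are illustrative names, not from the question):
var people = context.People.ToList(); // materialize: the reader is closed here
foreach (var person in people)
{
    person.LastSeen = DateTime.Now;   // example change; EF caches it
}
context.SaveChanges();                // one commit for everything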
FYI, from a book (with some lines adjusted, because it's still valid):
Invoking the SaveChanges() method begins a transaction, which automatically rolls back all changes persisted to the database if an exception occurs before iteration completes; otherwise the transaction commits. You might be tempted to apply the method after each entity update or deletion rather than after iteration completes, especially when you're updating or deleting massive numbers of entities.
If you try to invoke SaveChanges() before all data has been processed, you incur a "New transaction is not allowed because there are other threads running in the session" exception. The exception occurs because SQL Server doesn't permit starting a new transaction on a connection that has a SqlDataReader open, even with Multiple Active Result Sets (MARS) enabled in the connection string (EF's default connection string enables MARS).
Sometimes it's better to understand why things are happening. ;-)
Always materialize your selection as a List, e.g.:
var tempGroupOfFiles = Entities.Submited_Files.Where(r => r.FileStatusID == 10 && r.EventID == EventId).ToList();
Then loop through the collection while saving changes:
foreach (var item in tempGroupOfFiles)
{
    var itemToUpdate = item;
    if (itemToUpdate != null)
    {
        itemToUpdate.FileStatusID = 8;
        itemToUpdate.LastModifiedDate = DateTime.Now;
    }
    Entities.SaveChanges();
}
I was getting this same issue, but in a different situation. I had a list of items in a list box. The user can click an item and select delete, but I am using a stored proc to delete the item because there is a lot of logic involved. When I call the stored proc, the delete works fine, but any future call to SaveChanges causes the error. My solution was to call the stored proc outside of EF, and this worked fine. For some reason, when I call the stored proc the EF way, it leaves something open.
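A sketch of what "calling the stored proc outside of EF" can look like with plain ADO.NET; the procedure name and parameter are invented for illustration, and System.Data plus System.Data.SqlClient are assumed:
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("dbo.DeleteListItem", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    command.Parameters.AddWithValue("@ItemId", itemId); // hypothetical parameter
    connection.Open();
    command.ExecuteNonQuery(); // disposal closes everything; nothing is left open for EF
}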
We started seeing the error "New transaction is not allowed because there are other threads running in the session" after migrating from EF5 to EF6.
Google brought us here, but we were not calling SaveChanges() inside the loop. The errors were raised when executing a stored procedure via ObjectContext.ExecuteFunction inside a foreach loop that was reading from the DB.
Any call to ObjectContext.ExecuteFunction wraps the function in a transaction. Beginning a transaction while there is already an open reader causes the error.
It is possible to disable wrapping the SP in a transaction by setting the following option.
_context.Configuration.EnsureTransactionsForFunctionsAndCommands = false;
The EnsureTransactionsForFunctionsAndCommands option allows the SP to run without creating its own transaction, and the error is no longer raised.
DbContextConfiguration.EnsureTransactionsForFunctionsAndCommands Property
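One convenient place to apply that setting is the context's constructor, so every instance gets it; a minimal sketch assuming an EF6 DbContext subclass named MyDbContext:
public class MyDbContext : DbContext
{
    public MyDbContext()
    {
        // Stop EF6 from wrapping ExecuteFunction/command calls in their own transaction.
        Configuration.EnsureTransactionsForFunctionsAndCommands = false;
    }
}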
Here are two more options that allow you to invoke SaveChanges() in a foreach loop.
The first option is to use one DbContext to generate the list of objects to iterate through, then create a second DbContext to call SaveChanges() on. Here is an example:
// Get your IQueryable list of objects from your main DbContext (db)
IQueryable<Object> objects = db.Object.Where(whatever where clause you desire);

// Create a new DbContext outside of the foreach loop
using (DBContext dbMod = new DBContext())
{
    // Loop through the IQueryable
    foreach (Object obj in objects)
    {
        // Get the same object you are operating on from the new DbContext (dbMod) using its id
        Object objectMod = dbMod.Object.Find(obj.id);

        // Make whatever changes you need on objectMod
        objectMod.RightNow = DateTime.Now;

        // Invoke SaveChanges() on the dbMod context
        dbMod.SaveChanges();
    }
}
The second option is to get a list of database objects from the DbContext, but select only the ids. Then iterate through the list of ids (presumably ints), get the object corresponding to each id, and invoke SaveChanges() that way. The idea behind this method is that grabbing a large list of integers is a lot more efficient than getting a large list of db objects and calling .ToList() on the entire object list. Here is an example of this method:
// Get the list of objects you want from your DbContext, selecting just the Ids
List<int> Ids = db.Object.Where(enter where clause here).Select(m => m.Id).ToList();
var objects = Ids.Select(id => db.Objects.Find(id));
foreach (var obj in objects)
{
    obj.RightNow = DateTime.Now;
    db.SaveChanges();
}
If you get this error due to the foreach, and you really need to save one entity first inside the loop and use its generated identity further on in the loop (as was my case), the easiest solution is to use another DbContext to insert the entity (which will give you its Id) and then use that Id in the outer context.
For example
using (var context = new DatabaseContext())
{
    ...
    using (var context1 = new DatabaseContext())
    {
        ...
        context1.SaveChanges();
    }
    // get the id of the object inserted via context1 and use it
    context.SaveChanges();
}
I was also facing the same issue. Here is the cause and solution:
http://blogs.msdn.com/b/cbiyikoglu/archive/2006/11/21/mars-transactions-and-sql-error-3997-3988-or-3983.aspx
Make sure that before firing data manipulation commands like inserts and updates, you have closed all previous active SQL readers.
The most common mistake is in functions that read data from the DB and return values, e.g. a function like isRecordExist.
In that case, we return from the function as soon as we find the record and forget to close the reader.
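As an illustration, a hypothetical IsRecordExist helper written with Any() avoids the problem entirely, because Any() translates to an EXISTS query that executes immediately and leaves no reader open (the Records DbSet and Id property are invented names):
private bool IsRecordExist(int id)
{
    // Executes immediately and completely; no reader remains open afterwards.
    return _context.Records.Any(r => r.Id == id);
}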
So in the project where I had this exact same issue, the problem wasn't in the foreach or the .ToList(); it was actually in the AutoFac configuration we used.
This created some weird situations where the above error was thrown, along with a bunch of other equivalent errors.
This was our fix. Changed this:
container.RegisterType<DataContext>().As<DbContext>().InstancePerLifetimeScope();
container.RegisterType<DbFactory>().As<IDbFactory>().SingleInstance();
container.RegisterType<UnitOfWork>().As<IUnitOfWork>().InstancePerRequest();
To:
container.RegisterType<DataContext>().As<DbContext>();
container.RegisterType<DbFactory>().As<IDbFactory>().InstancePerLifetimeScope();
container.RegisterType<UnitOfWork>().As<IUnitOfWork>(); //.InstancePerRequest();
I know it is an old question, but I faced this error today, and I found that this error can also be thrown when a database table trigger gets an error.
For your information, you can check your tables' triggers too when you get this error.
I needed to read a huge result set and update some records in the table.
I tried to use chunks as suggested in Drew Noakes's answer.
Unfortunately, after 50,000 records I got an OutOfMemoryException.
The answer Entity framework large data set, out of memory exception explains:
EF creates a second copy of the data which it uses for change detection (so that it can persist changes to the database). EF holds this second set for the lifetime of the context, and it's this set that's running you out of memory.
The recommendation is to re-create your context for each batch.
So I retrieved the minimum and maximum values of the primary key (the tables have primary keys that are auto-incrementing integers). Then I retrieved chunks of records from the database, opening a new context for each chunk. After processing a chunk, the context is disposed and releases the memory. This ensures that memory usage does not grow.
Below is a snippet from my code:
public void ProcessContextByChunks()
{
    var tableName = "MyTable";
    var startTime = DateTime.Now;
    int i = 0;
    int foundCount = 0;
    var minMaxIds = GetMinMaxIds();
    for (int fromKeyID = minMaxIds.From; fromKeyID <= minMaxIds.To; fromKeyID += _chunkSize)
    {
        try
        {
            using (var context = InitContext())
            {
                var chunk = GetMyTableQuery(context).Where(r => (r.KeyID >= fromKeyID) && (r.KeyID < fromKeyID + _chunkSize));
                try
                {
                    foreach (var row in chunk)
                    {
                        foundCount = UpdateRowIfNeeded(++i, row);
                    }
                    context.SaveChanges();
                }
                catch (Exception exc)
                {
                    LogChunkException(i, exc);
                }
            }
        }
        catch (Exception exc)
        {
            LogChunkException(i, exc);
        }
    }
    LogSummaryLine(tableName, i, foundCount, startTime);
}

private FromToRange<int> GetMinMaxIds()
{
    var minMaxIds = new FromToRange<int>();
    using (var context = InitContext())
    {
        var allRows = GetMyTableQuery(context);
        minMaxIds.From = allRows.Min(n => (int?)n.KeyID ?? 0);
        minMaxIds.To = allRows.Max(n => (int?)n.KeyID ?? 0);
    }
    return minMaxIds;
}

private IQueryable<MyTable> GetMyTableQuery(MyEFContext context)
{
    return context.MyTable;
}

private MyEFContext InitContext()
{
    var context = new MyEFContext();
    context.Database.Connection.ConnectionString = _connectionString;
    //context.Database.Log = SqlLog;
    return context;
}
FromToRange is a simple structure with From and To properties.
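For completeness, a minimal sketch of that structure as implied by its usage above:
public struct FromToRange<T>
{
    public T From { get; set; }
    public T To { get; set; }
}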
Recently I faced the same issue in my project, so I'm posting my experience; it might help someone in the same boat as I was. The issue was that I was looping through the results of an EF select query without the results being retrieved into memory:
var products = (from e in _context.Products
                where e.StatusId == 1
                select new { e.Name, e.Type });

foreach (var product in products)
{
    // doing some insert EF queries
    // some EF select queries
    await _context.SaveChangesAsync(stoppingToken); // This code breaks.
}
I updated my Products select query to bring the results into a List rather than leaving them as an IQueryable (the IQueryable seems to keep the reader open for the duration of the foreach loop, and hence the save was failing):
var products = (from e in _context.Products
                where e.StatusId == 1
                select new { e.Name, e.Type }).ToList(); // <-- note the ToList()
The code below works for me:
private pricecheckEntities _context = new pricecheckEntities();
...
private void resetpcheckedtoFalse()
{
    try
    {
        foreach (var product in _context.products)
        {
            product.pchecked = false;
            _context.products.Attach(product);
            _context.Entry(product).State = EntityState.Modified;
        }
        _context.SaveChanges();
    }
    catch (Exception extofException)
    {
        MessageBox.Show(extofException.ToString());
    }
    productsDataGrid.Items.Refresh();
}
In my case, the problem appeared when I called a stored procedure via EF and a later SaveChanges threw this exception. The problem was that the enumerator returned by the procedure call was not disposed. I fixed the code the following way:
public bool IsUserInRole(string username, string roleName, DataContext context)
{
    var result = context.aspnet_UsersInRoles_IsUserInRoleEF("/", username, roleName);

    // using here solved the issue
    using (var en = result.GetEnumerator())
    {
        if (!en.MoveNext())
            throw new Exception("empty result of aspnet_UsersInRoles_IsUserInRoleEF");
        int? resultData = en.Current;
        return resultData == 1; // 1 = success, see T-SQL for return codes
    }
}
I am much too late to the party, but today I faced the same error, and the way I resolved it was simple. My scenario was similar to the given code: I was making DB transactions inside nested foreach loops.
The problem is that a single DB transaction takes a little longer than one foreach iteration, so once the earlier transaction is not yet complete, the new one throws an exception. The solution is to create a new context object in the foreach loop where you are making the DB transaction.
For the scenario mentioned above, the solution looks like this:
foreach (RivWorks.Model.Negotiation.AutoNegotiationDetails companyFeedDetail in companyFeedDetailList)
{
    // Create a fresh context for each iteration's transaction
    var _dbRiv = RivWorks.Model.Stores.RivEntities(AppSettings.RivWorkEntities_connString);
    if (companyFeedDetail.FeedSourceTable.ToUpper() == "AUTO")
    {
        var company = (from a in _dbRiv.Company.Include("Product") where a.CompanyId == companyFeedDetail.CompanyId select a).First();
        foreach (RivWorks.Model.NegotiationAutos.Auto sourceProduct in client.Auto)
        {
            foreach (RivWorks.Model.Negotiation.Product targetProduct in company.Product)
            {
                if (targetProduct.alternateProductID == sourceProduct.AutoID)
                {
                    found = true;
                    break;
                }
            }
            if (!found)
            {
                var newProduct = new RivWorks.Model.Negotiation.Product();
                newProduct.alternateProductID = sourceProduct.AutoID;
                newProduct.isFromFeed = true;
                newProduct.isDeleted = false;
                newProduct.SKU = sourceProduct.StockNumber;
                company.Product.Add(newProduct);
            }
        }
        _dbRiv.SaveChanges(); // no longer breaks: this context was created inside the loop
    }
}
I am a little bit late, but I had this error too. I solved the problem by checking what the values being updated were.
I found out that my query was wrong and that there were over 250 edits pending. So I corrected my query, and now it works correctly.
So, in my situation: check the query for errors by debugging over the result that the query returns, then correct the query.
Hope this helps resolve future problems.
My situation was similar to the others above. I had an IQueryable which I was doing a foreach on. This in turn called a method with SaveChanges(). Boom, an exception here, as there was already a transaction open from the query above.
// Example:
var myList = _context.Table.Where(x => x.time == null);
foreach (var i in myList)
{
    MyFunction(i); // <-- Has _context.SaveChanges(), which throws the exception
}
Adding ToList() to the end of the query was the solution in my case.
// Fix
var myList = _context.Table.Where(x => x.time == null).ToList();
Most of the answers here are related to loops, but my problem was different. I got the error many times while trying to use multiple dbcontext.SaveChanges() calls in the same scope.
In my case, for EF Core 3.1, using dbcontext.Database.BeginTransaction() and dbcontext.Database.CommitTransaction() fixed the problem. Here is my entire code:
public IActionResult ApplyForCourse()
{
    var master = _userService.GetMasterFromCurrentUser();
    var trainee = new Trainee
    {
        CourseId = courseId,
        JobStatus = model.JobStatus,
        Gender = model.Gender,
        Name = model.Name,
        Surname = model.Surname,
        Telephone = model.Telephone,
        Email = model.Email,
        BirthDate = model.BirthDate,
        Description = model.Description,
        EducationStatus = EducationStatus.AppliedForEducation,
        TraineeType = TraineeType.SiteFirst
    };

    dbcontext.Trainees.Add(trainee);
    dbcontext.SaveChanges();

    dbcontext.Database.BeginTransaction();
    var user = userManager.GetUserAsync(User).Result;
    master.TraineeId = trainee.Id;
    master.DateOfBirth = model.BirthDate;
    master.EducationStatus = trainee.EducationStatus;
    user.Gender = model.Gender;
    user.Email = model.Email;
    dbcontext.Database.CommitTransaction();
    dbcontext.SaveChanges();

    return RedirectToAction("Index", "Home");
}

EF 6 performance while updating multiple records with different values in same table

I have a table whose values are updated on a conditional basis, and when I call
db.SaveChanges()
there is a huge performance drop.
I am also setting these properties:
db.Configuration.AutoDetectChangesEnabled = false;
db.Configuration.ValidateOnSaveEnabled = false;
Still, the results are not as expected.
Edit 1:
using (var db = new MyEntities())
{
    db.Configuration.AutoDetectChangesEnabled = false;
    db.Configuration.ValidateOnSaveEnabled = false;
    foreach (var acc in myacclist)
    {
        // will update my account objects here
    }
    db.SaveChanges();
}
Unfortunately, there is no way you will be able to get good performance with Entity Framework and SaveChanges alone.
SaveChanges makes a database round-trip for every record updated. So if you currently have 10,000 accounts, 10,000 database round-trips are performed.
Setting AutoDetectChangesEnabled and ValidateOnSaveEnabled to false is usually a very bad idea and will not really improve the performance, since it's the number of database round-trips that is the real issue.
Disclaimer: I'm the owner of the project Entity Framework Extensions
This library allows you to dramatically improve performance by providing:
BulkSaveChanges
BulkInsert
BulkUpdate
BulkDelete
BulkMerge
Example:
using (var db = new MyEntities())
{
    foreach (var acc in myacclist)
    {
        // will update my account objects here
    }
    db.BulkSaveChanges();
}
Building an UPDATE query in a StringBuilder and executing it every 1,000 records improved my performance:
using (var db = new MyEntities())
{
    StringBuilder finalquery = new StringBuilder();
    int i = 0;
    foreach (var acc in myacclist)
    {
        i++;
        // will update my account objects here, appending an UPDATE statement
        finalquery.Append(stmnt);
        if (i % 1000 == 0)
        {
            db.Database.ExecuteSqlCommand(finalquery.ToString());
            finalquery.Clear(); // start a fresh batch
        }
    }
    // execute whatever statements are left over
    if (finalquery.Length > 0)
        db.Database.ExecuteSqlCommand(finalquery.ToString());
}

Iterating over Linq-to-Entities IEnumerable causes OutOfMemoryException

The part of the code I'm working on receives an
IEnumerable<T> items
where each item contains a class with properties reflecting an MSSQL database table.
The database table has a total count of 953,664 rows.
The dataset in code is filtered down to a set of 284,360 rows.
The following code throws an OutOfMemoryException when the process reaches about 1.5 GB of memory allocation.
private static void Save<T>(IEnumerable<T> items, IList<IDataWriter> dataWriters, IEnumerable<PropertyColumn> columns) where T : MyTableClass
{
    foreach (var item in items)
    {
        // (body elided in the question)
    }
}
The variable items is of type
IQueryable<MyTableClass>
I can't find anyone with the same setup, and others' solutions that I've found don't apply here.
I've also tried paging, using Skip and Take with a page size of 500, but that just takes a long time and ends up with the same result. It seems like objects aren't being released after each iteration. How is that?
How can I rewrite this code to cope with a larger collection set?
Well, as Servy has already said, you didn't provide your code, so I'll try to make some predictions... (sorry for my English)
If you get an exception in "foreach (var item in items)" when you are using paging, then I guess something is wrong with the paging. I wrote a couple of examples to explain my idea.
In the first example, I suggest you (just as a test) put your filter inside the Save function.
private static void Save<T>(IQueryable<T> items, IList<IDataWriter> dataWriters, IEnumerable<PropertyColumn> columns) where T : MyTableClass
{
    int pageSize = 500; // Only 500 records will be loaded.
    int currentStep = 0;
    while (true)
    {
        // Here we create a new request to the database using our filter.
        var tempList = items.Where(yourFilter).Skip(currentStep * pageSize).Take(pageSize);
        foreach (var item in tempList)
        {
            // If you have an exception here, maybe something is wrong in your dataWriters or columns.
        }
        currentStep++;
        if (tempList.Count() == 0) // No records have been loaded, so we can leave.
            break;
    }
}
The second example shows how to use paging without any changes to the Save function:
int pageSize = 500;
int currentStep = 0;
while (true)
{
    // Here we create a new request to the database using our filter.
    var tempList = items.Where(yourFilter).Skip(currentStep * pageSize).Take(pageSize);
    Save(tempList, dataWriters, columns); // Calling the saving function.
    currentStep++;
    if (tempList.Count() == 0)
        break;
}
Try both of them, and you'll either resolve your problem or find another place where the exception is raised.
By the way, another potential culprit is your dataWriters. I guess that is where you store all the data you have received from the database. Maybe you shouldn't keep all of it? Just calculate the memory that all those objects require.
P.S. Don't use while(true) in your code; it's just for the example. :)

Delete from Database via Linq is slow

I have developed an application in C# and I use LINQ to Entities to build queries in my application. However, I find my application to be very slow at adding and deleting records; deleting 500 records takes about 15 minutes. That is too long. My delete query is below.
public void deletePayrollBatch(int batchNo)
{
    var context = new DataSources.sspEntities();
    var query1 = (from a in context.Payroll_Details
                  where a.Payroll_BatchBatch_no == batchNo
                  select a).ToList();
    foreach (var x in query1)
    {
        context.DeleteObject(x);
        context.SaveChanges();
    }
}
My query is rather simple. One row of data does contain about 20 columns; I don't know if that would have an effect. Anyhow, any expert suggestions would be helpful. Thank you.
EDITED:
I have already had good suggestions regarding the delete, one of which is to call context.SaveChanges() outside of the loop. In my insert I have similar code (shown below); however, I need the record to be saved first because I want to use its ID in another function. In this case, how may I proceed?
foreach (var xyz in x)
{
    try
    {
        pDetails1.EstateID = 143;
        pDetails1.Employee_Personal_InfoEmp_id = xyz.PD_EmpPersonal_Id;
        pDetails1.PAYE = (decimal)xyz.PD_PAYE.GetValueOrDefault();
        pDetails1.PAYE_YTD = (decimal)PayeYTD.GetValueOrDefault();
        pDetails1.Basic_Pay = (decimal)xyz.PD_BasicPay.GetValueOrDefault();
        pDetails1.Basic_Pay_YTD = (decimal)basicpay_YTD.GetValueOrDefault();
        pDetails1.Gross_Pay = (decimal)xyz.PD_GrossPay.GetValueOrDefault();
        pDetails1.Gross_Pay_YTD = (decimal)GrossPay_YTD.GetValueOrDefault();
        context.SaveChanges();

        Functions.RandomFunct rf = new RandomFunct();
        rf.AdditionalEarningsIsContinuous(xyz.PD_EmpPersonal_Id); // I need the ID in order to run this function. In this case do I have to save on each element in the foreach loop?
    }
    catch (Exception e)
    {
        Debug.WriteLine(e.Message);
    }
}
You are calling context.SaveChanges for every record. Instead, you can call it once, after the foreach loop:
foreach (var x in query1)
{
    context.DeleteObject(x);
}
context.SaveChanges();
That sends the deletes to the database in a single request for all records. Currently you are sending a DELETE request per record, and that is what takes the time.
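Regarding the edit about inserts: EF writes database-generated identity values back onto the tracked entities when SaveChanges() completes, so you can still save once and do the per-record follow-up work in a second pass. A sketch reusing the question's names (the Payroll_Details entity type and the two-pass split are assumptions, not the asker's code):
var savedDetails = new List<Payroll_Details>();
foreach (var xyz in x)
{
    var pDetails = new Payroll_Details
    {
        EstateID = 143,
        Employee_Personal_InfoEmp_id = xyz.PD_EmpPersonal_Id
        // ... remaining fields as in the question
    };
    context.Payroll_Details.Add(pDetails);
    savedDetails.Add(pDetails);
}
context.SaveChanges(); // one round-trip; generated keys are now populated on each entity
var rf = new Functions.RandomFunct();
foreach (var pDetails in savedDetails)
{
    rf.AdditionalEarningsIsContinuous(pDetails.Employee_Personal_InfoEmp_id);
}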
