Concurrent execution of C# code results in duplicate DB entries

Well, I have this code:
var companyId = 1;
var categoryId = 1;
var item = _dbContext.Items.FirstOrDefault(i =>
    i.CompanyId == companyId && i.CategoryId == categoryId);
if (item == null)
{
    var newItem = new Item()
    {
        CompanyId = companyId,
        CategoryId = categoryId,
        // some other properties
    };
    _dbContext.Items.Add(newItem);
    _dbContext.SaveChanges();
}
else
{
    // update some properties of the item (not CompanyId or CategoryId)
    item.xyz = "new Value";
    _dbContext.Items.Update(item);
    _dbContext.SaveChanges();
}
This is the only piece of code that accesses the Items table. I want to make sure no duplicate entries are made for the CompanyId-CategoryId combination; that is to say, this combination should be unique. But even though the code checks for existence before insertion, how do I end up with duplicate entries in the database?
Please ignore everything else (the architecture, the use of the DbContext, etc.) since I have just written this simple code to understand the problem. Also, I know I can enforce this at the DB level by making the column combination unique, but is there any way to do this in application code (in C#)? I am using EF, if that's important. What exactly causes this problem? Multithreading?
Please help me understand this issue or give me some directions on what further reading I should do in this regard. Thanks.

If you want to do this in application code, then you are going to have to force single-threading somewhere. One way to do this would be to use locking to produce a section of code that only one thread can enter at a time. You would then re-check the existence of the record and, if it is still null, perform your insert. It would look something like this:
// declared at class level
private static readonly object ItemCreationSyncLock = new object();

public void MyMethodThatCreatesAnItem()
{
    // ... setup code ...
    var item = _dbContext.Items.FirstOrDefault(itm => itm.Name == criteria);
    if (item == null)
    {
        lock (ItemCreationSyncLock)
        {
            item = _dbContext.Items.FirstOrDefault(itm => itm.Name == criteria);
            if (item == null)
            {
                item = new Item { Name = criteria };
                _dbContext.Items.Add(item);
                _dbContext.SaveChanges();
            }
        }
    }
}
So I covered the "how" of avoiding the duplicate entry in application code, but it is important to realize that, even as a backup to this, you would still want to enforce a unique constraint at the database level if you really want to ensure duplicates cannot exist. The guard I just showed can be defeated simply by scaling the application out to multiple instances, whether in a web garden/farm or by scaling to additional instances on Azure.
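To make that concrete, here is a hedged sketch of the database-level guard, assuming EF Core (the Items.Update() call in the question suggests it); the entity and property names are taken from the question's code:
// Declare the composite unique index in OnModelCreating so the database itself rejects duplicates.
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<Item>()
        .HasIndex(i => new { i.CompanyId, i.CategoryId })
        .IsUnique();
}

// At the call site, treat a unique-index violation as "another thread won the race".
try
{
    _dbContext.Items.Add(newItem);
    _dbContext.SaveChanges();
}
catch (DbUpdateException)
{
    // A concurrent insert beat us to it; re-query and take the update path instead.
}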

Using a lock in C# will only work if you have a single web server. If you have a farm, then each instance of your application will have its own ItemCreationSyncLock.
Creating a synchronization lock in the database with SQL Server is as simple as this:
using (var db = new Db())
using (var tran = db.Database.BeginTransaction())
{
    db.Database.ExecuteSqlCommand("exec sp_getapplock 'MyAppLock','exclusive';");
    // only one thread across all application instances will execute this code at one time
    tran.Commit();
}
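For completeness, a hedged sketch of how the existence check from the question might sit inside that app lock (entity and property names are borrowed from the question; untested):
using (var db = new Db())
using (var tran = db.Database.BeginTransaction())
{
    // Serialized across all app instances; the lock is released when the transaction ends.
    db.Database.ExecuteSqlCommand("exec sp_getapplock 'MyAppLock','exclusive';");
    var item = db.Items.FirstOrDefault(i =>
        i.CompanyId == companyId && i.CategoryId == categoryId);
    if (item == null)
    {
        db.Items.Add(new Item { CompanyId = companyId, CategoryId = categoryId });
        db.SaveChanges();
    }
    tran.Commit();
}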

Related

Adding same entity object more than once with EF Core

In the code below, I want to add the myEntity object more than once to the database using EF Core, each time with a different value for the id property while all other properties stay the same. How can I do this? Currently it only adds one row to the database.
I want to do this because I don't want to repeat calling GetCurrentLocalDateTime() for each iteration and also in the else statement.
var myEntity = _mapper.Map<AEntity>(entityDto);
myEntity.updatedAt = _helper.GetCurrentLocalDateTime();
myEntity.CreatedAt = _helper.GetCurrentLocalDateTime();
if (entityDto.ids != null || entityDto.ids.Count > 0)
{
    foreach (var id in entityDto.ids)
    {
        myEntity.id = id;
        await _dbContext.myEntities.AddAsync(myEntity);
    }
}
else
{
    await _dbContext.myEntities.AddAsync(myEntity);
}
await _dbContext.SaveChangesAsync();
You can't add a single instance of a class, change one of its properties, and add it again expecting a new row to be added to your database. All that happens is that you change the property of the single instance you have added.
Instead, you will need to map the DTO multiple times, so that you add multiple instances of your entity class to the DbSet.
You also need to use && instead of || in your if condition. Using OR (||) will result in a NullReferenceException if the entityDto.ids collection is null.
var now = _helper.GetCurrentLocalDateTime();
if (entityDto.ids != null && entityDto.ids.Count > 0)
{
    foreach (var id in entityDto.ids)
    {
        var myEntity = _mapper.Map<AEntity>(entityDto);
        myEntity.updatedAt = now;
        myEntity.CreatedAt = now;
        myEntity.id = id;
        await _dbContext.myEntities.AddAsync(myEntity);
    }
}
else
{
    var myEntity = _mapper.Map<AEntity>(entityDto);
    myEntity.updatedAt = now;
    myEntity.CreatedAt = now;
    await _dbContext.myEntities.AddAsync(myEntity);
}
await _dbContext.SaveChangesAsync();
You can also add multiple entities at once. Look at these:
.AddRangeAsync()
.AddRange()
You can keep your entities in a list and save them all at once with AddRange().
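A hedged sketch of that suggestion, reusing the DTO shape from the question: map a fresh instance per id, collect the instances in a list, then do one AddRange and one save.
var now = _helper.GetCurrentLocalDateTime();
var entities = entityDto.ids.Select(id =>
{
    var e = _mapper.Map<AEntity>(entityDto); // fresh instance per id
    e.id = id;
    e.CreatedAt = now;
    e.updatedAt = now;
    return e;
}).ToList();
_dbContext.myEntities.AddRange(entities);
await _dbContext.SaveChangesAsync();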
Yes, it is possible, though I would use caution and make sure this is only ever used to insert a number of copies:
foreach (var id in entityDto.ids)
{
    myEntity.id = id;
    await _dbContext.myEntities.AddAsync(myEntity);
    await _dbContext.SaveChangesAsync(); // persist this copy before detaching, otherwise only the last Add survives
    _dbContext.Entry(myEntity).State = EntityState.Detached;
}
By detaching the entity, the DbContext will no longer be tracking it, so updating it and adding it again will be treated as a request to add a new entity.
Normally this kind of code results in a bug where developers are trying to reuse a single entity instance to update multiple data rows. Inserting several copies of the same data is a rather odd requirement, so I would make sure it is documented well so future developers don't try repurposing it. :)

Try to fix SQL Exception: A new transaction is not allowed because there are other threads running in the session [duplicate]

I am currently getting this error:
System.Data.SqlClient.SqlException: New transaction is not allowed because there are other threads running in the session.
while running this code:
public class ProductManager : IProductManager
{
    #region Declare Models
    private RivWorks.Model.Negotiation.RIV_Entities _dbRiv =
        RivWorks.Model.Stores.RivEntities(AppSettings.RivWorkEntities_connString);
    private RivWorks.Model.NegotiationAutos.RivFeedsEntities _dbFeed =
        RivWorks.Model.Stores.FeedEntities(AppSettings.FeedAutosEntities_connString);
    #endregion

    public IProduct GetProductById(Guid productId)
    {
        // Do a quick sync of the feeds...
        SyncFeeds();
        ...
        // get a product...
        ...
        return product;
    }

    private void SyncFeeds()
    {
        bool found = false;
        string feedSource = "AUTO";
        switch (feedSource) // companyFeedDetail.FeedSourceTable.ToUpper())
        {
            case "AUTO":
                var clientList = from a in _dbFeed.Client.Include("Auto") select a;
                foreach (RivWorks.Model.NegotiationAutos.Client client in clientList)
                {
                    var companyFeedDetailList = from a in _dbRiv.AutoNegotiationDetails
                                                where a.ClientID == client.ClientID
                                                select a;
                    foreach (RivWorks.Model.Negotiation.AutoNegotiationDetails companyFeedDetail in companyFeedDetailList)
                    {
                        if (companyFeedDetail.FeedSourceTable.ToUpper() == "AUTO")
                        {
                            var company = (from a in _dbRiv.Company.Include("Product")
                                           where a.CompanyId == companyFeedDetail.CompanyId
                                           select a).First();
                            foreach (RivWorks.Model.NegotiationAutos.Auto sourceProduct in client.Auto)
                            {
                                foreach (RivWorks.Model.Negotiation.Product targetProduct in company.Product)
                                {
                                    if (targetProduct.alternateProductID == sourceProduct.AutoID)
                                    {
                                        found = true;
                                        break;
                                    }
                                }
                                if (!found)
                                {
                                    var newProduct = new RivWorks.Model.Negotiation.Product();
                                    newProduct.alternateProductID = sourceProduct.AutoID;
                                    newProduct.isFromFeed = true;
                                    newProduct.isDeleted = false;
                                    newProduct.SKU = sourceProduct.StockNumber;
                                    company.Product.Add(newProduct);
                                }
                            }
                            _dbRiv.SaveChanges(); // ### THIS BREAKS ### //
                        }
                    }
                }
                break;
        }
    }
}
Model #1 - This model sits in a database on our Dev Server. (Diagram: http://content.screencast.com/users/Keith.Barrows/folders/Jing/media/bdb2b000-6e60-4af0-a7a1-2bb6b05d8bc1/Model1.png)
Model #2 - This model sits in a database on our Prod Server and is updated each day by automatic feeds. (Diagram: http://content.screencast.com/users/Keith.Barrows/folders/Jing/media/4260259f-bce6-43d5-9d2a-017bd9a980d4/Model2.png)
Note - The red circled items in Model #1 are the fields I use to "map" to Model #2. Please ignore the red circles in Model #2: that is from another question I had, which is now answered.
Note: I still need to put in an isDeleted check so I can soft delete items from DB1 if they have gone out of our client's inventory.
All I want to do with this particular code is connect a company in DB1 with a client in DB2, get their product list from DB2, and INSERT it into DB1 if it is not already there. The first time through should be a full pull of inventory. Each time it is run thereafter, nothing should happen unless new inventory came in on the feed overnight.
So the big question: how do I solve the transaction error I am getting? Do I need to drop and recreate my context each time through the loops (that does not make sense to me)?
After much pulling out of hair I discovered that the foreach loops were the culprits. What needs to happen is to call EF but return the results into an IList<T> of the target type, then loop over the IList<T>.
Example:
IList<Client> clientList = (from a in _dbFeed.Client.Include("Auto") select a).ToList();
foreach (RivWorks.Model.NegotiationAutos.Client client in clientList)
{
    var companyFeedDetailList = from a in _dbRiv.AutoNegotiationDetails
                                where a.ClientID == client.ClientID
                                select a;
    // ...
}
As you've already identified, you cannot save from within a foreach that is still drawing from the database via an active reader.
Calling ToList() or ToArray() is fine for small data sets, but when you have thousands of rows, you will be consuming a large amount of memory.
It's better to load the rows in chunks.
public static class EntityFrameworkUtil
{
    public static IEnumerable<T> QueryInChunksOf<T>(this IQueryable<T> queryable, int chunkSize)
    {
        return queryable.QueryChunksOfSize(chunkSize).SelectMany(chunk => chunk);
    }

    public static IEnumerable<T[]> QueryChunksOfSize<T>(this IQueryable<T> queryable, int chunkSize)
    {
        int chunkNumber = 0;
        while (true)
        {
            var query = (chunkNumber == 0)
                ? queryable
                : queryable.Skip(chunkNumber * chunkSize);
            var chunk = query.Take(chunkSize).ToArray();
            if (chunk.Length == 0)
                yield break;
            yield return chunk;
            chunkNumber++;
        }
    }
}
Given the above extension methods, you can write your query like this:
foreach (var client in clientList.OrderBy(c => c.Id).QueryInChunksOf(100))
{
    // do stuff
    context.SaveChanges();
}
The queryable you call this method on must be ordered. This is because Entity Framework only supports IQueryable<T>.Skip(int) on ordered queries, which makes sense when you consider that multiple queries over different ranges require the ordering to be stable. If the ordering isn't important to you, just order by primary key, as that's likely to have a clustered index.
This version will query the database in batches of 100. Note that SaveChanges() is called once per entity.
If you want to improve your throughput dramatically, you should call SaveChanges() less frequently. Use code like this instead:
foreach (var chunk in clientList.OrderBy(c => c.Id).QueryChunksOfSize(100))
{
    foreach (var client in chunk)
    {
        // do stuff
    }
    context.SaveChanges();
}
This results in 100 times fewer database update calls. Of course each of those calls takes longer to complete, but you still come out way ahead in the end. Your mileage may vary, but this was worlds faster for me.
And it gets around the exception you were seeing.
EDIT: I revisited this question after running SQL Profiler and updated a few things to improve performance. For anyone who is interested, here is some sample SQL showing what EF generates.
The first loop doesn't need to skip anything, so it is simpler:
SELECT TOP (100) -- the chunk size
    [Extent1].[Id] AS [Id],
    [Extent1].[Name] AS [Name]
FROM [dbo].[Clients] AS [Extent1]
ORDER BY [Extent1].[Id] ASC
Subsequent calls need to skip previous chunks of results, so they introduce the use of row_number:
SELECT TOP (100) -- the chunk size
    [Extent1].[Id] AS [Id],
    [Extent1].[Name] AS [Name]
FROM (
    SELECT [Extent1].[Id] AS [Id], [Extent1].[Name] AS [Name],
           row_number() OVER (ORDER BY [Extent1].[Id] ASC) AS [row_number]
    FROM [dbo].[Clients] AS [Extent1]
) AS [Extent1]
WHERE [Extent1].[row_number] > 100 -- the number of rows to skip
ORDER BY [Extent1].[Id] ASC
We have now posted an official response to the bug opened on Connect. The workarounds we recommend are as follows:
This error is due to Entity Framework creating an implicit transaction during the SaveChanges() call. The best way to work around the error is to use a different pattern (i.e., not saving while in the midst of reading) or by explicitly declaring a transaction. Here are three possible solutions:
// 1: Save after iteration (recommended approach in most cases)
using (var context = new MyContext())
{
    foreach (var person in context.People)
    {
        // Change to person
    }
    context.SaveChanges();
}

// 2: Declare an explicit transaction
using (var transaction = new TransactionScope())
{
    using (var context = new MyContext())
    {
        foreach (var person in context.People)
        {
            // Change to person
            context.SaveChanges();
        }
    }
    transaction.Complete();
}

// 3: Read rows ahead (dangerous!)
using (var context = new MyContext())
{
    var people = context.People.ToList(); // Note that this forces the database
                                          // to evaluate the query immediately
                                          // and could be very bad for large tables.
    foreach (var person in people)
    {
        // Change to person
        context.SaveChanges();
    }
}
Indeed you cannot save changes inside a foreach loop in C# using Entity Framework.
The context.SaveChanges() method acts like a commit on a regular database system (RDBMS).
Just make all your changes (which Entity Framework will cache) and then save them all at once by calling SaveChanges() after the loop (outside of it), like a database commit command.
This works if you can save all changes at once.
Just put context.SaveChanges() after the end of your foreach loop.
Converting your queryable lists with .ToList() should also make it work fine.
FYI, from a book, with some lines adjusted because it's still valid:
Invoking the SaveChanges() method begins a transaction which automatically rolls back all changes persisted to the database if an exception occurs before iteration completes; otherwise the transaction commits. You might be tempted to apply the method after each entity update or deletion rather than after iteration completes, especially when you're updating or deleting massive numbers of entities.
If you try to invoke SaveChanges() before all data has been processed, you incur a "New transaction is not allowed because there are other threads running in the session" exception. The exception occurs because SQL Server doesn't permit starting a new transaction on a connection that has a SqlDataReader open, even with Multiple Active Result Sets (MARS) enabled in the connection string (EF's default connection string enables MARS).
Sometimes it's better to understand why things are happening. ;-)
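For reference, MARS is controlled by a connection-string flag; a hedged example (the server and database names are placeholders):
// MultipleActiveResultSets=True lets one connection keep several readers open,
// but it still does not allow starting a new transaction while a reader is open.
var connectionString =
    "Server=myServer;Database=myDb;Integrated Security=True;" +
    "MultipleActiveResultSets=True";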
Always use your selection as a List. E.g.:
var tempGroupOfFiles = Entities.Submited_Files.Where(r => r.FileStatusID == 10 && r.EventID == EventId).ToList();
Then loop through the collection while saving changes:
foreach (var item in tempGroupOfFiles)
{
    var itemToUpdate = item;
    if (itemToUpdate != null)
    {
        itemToUpdate.FileStatusID = 8;
        itemToUpdate.LastModifiedDate = DateTime.Now;
    }
    Entities.SaveChanges();
}
I was getting this same issue but in a different situation. I had a list of items in a list box. The user can click an item and select delete, but I am using a stored proc to delete the item because there is a lot of logic involved in deleting it. When I call the stored proc the delete works fine, but any future call to SaveChanges will cause the error. My solution was to call the stored proc outside of EF, and this worked fine. For some reason, when I call the stored proc the EF way of doing things, it leaves something open.
We started seeing this error "New transaction is not allowed because there are other threads running in the session" after migrating from EF5 to EF6.
Google brought us here, but we are not calling SaveChanges() inside the loop. The errors were raised when executing a stored procedure using ObjectContext.ExecuteFunction inside a foreach loop reading from the DB.
Any call to ObjectContext.ExecuteFunction wraps the function in a transaction. Beginning a transaction while there is already an open reader causes the error.
It is possible to disable wrapping the SP in a transaction by setting the following option.
_context.Configuration.EnsureTransactionsForFunctionsAndCommands = false;
The EnsureTransactionsForFunctionsAndCommands option allows the SP to run without creating its own transaction and the error is no longer raised.
DbContextConfiguration.EnsureTransactionsForFunctionsAndCommands Property
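A minimal sketch of where that flag can be set, assuming an EF6 DbContext (the class and connection names are placeholders; adjust to your project):
public class MyContext : DbContext
{
    public MyContext() : base("name=MyConnection")
    {
        // Stop EF6 from wrapping ExecuteFunction/ExecuteSqlCommand calls
        // in their own transaction.
        Configuration.EnsureTransactionsForFunctionsAndCommands = false;
    }
}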
Here are two more options that allow you to invoke SaveChanges() in a foreach loop.
The first option is to use one DbContext to generate the list of objects to iterate through, and then create a second DbContext to call SaveChanges() on. Here is an example:
// Get your IQueryable list of objects from your main DBContext (db)
IQueryable<Object> objects = db.Object.Where(whatever where clause you desire);

// Create a new DBContext outside of the foreach loop
using (DBContext dbMod = new DBContext())
{
    // Loop through the IQueryable
    foreach (Object obj in objects)
    {
        // Get the same object you are operating on in the foreach loop
        // from the new DBContext (dbMod) using the object's id
        Object objectMod = dbMod.Object.Find(obj.id);

        // Make whatever changes you need on objectMod
        objectMod.RightNow = DateTime.Now;

        // Invoke SaveChanges() on the dbMod context
        dbMod.SaveChanges();
    }
}
The second option is to get a list of database objects from the DbContext, but to select only the ids. Then iterate through the list of ids (presumably ints), get the object corresponding to each id, and invoke SaveChanges() that way. The idea behind this method is that grabbing a large list of integers is a lot more efficient than getting a large list of DB objects and calling .ToList() on the entire object. Here is an example of this method:
// Get the list of objects you want from your DBContext, selecting just the Id's into a list
List<int> Ids = db.Object.Where(enter where clause here).Select(m => m.Id).ToList();
var objects = Ids.Select(id => db.Objects.Find(id));
foreach (var obj in objects)
{
    obj.RightNow = DateTime.Now;
    db.SaveChanges();
}
If you get this error due to a foreach loop, and you really need to save one entity first inside the loop and use its generated identity further in the loop, as was my case, the easiest solution is to use another DbContext to insert the entity (which will return its Id) and then use this Id in the outer context.
For example:
using (var context = new DatabaseContext())
{
    ...
    using (var context1 = new DatabaseContext())
    {
        ...
        context1.SaveChanges();
    }
    // get the id of the object inserted via context1 and use it
    context.SaveChanges();
}
I was also facing the same issue.
Here is the cause and solution:
http://blogs.msdn.com/b/cbiyikoglu/archive/2006/11/21/mars-transactions-and-sql-error-3997-3988-or-3983.aspx
Before firing data manipulation commands like inserts and updates, make sure you have closed all previous active SQL readers.
The most common error is in functions that read data from the DB and return values, for example functions like isRecordExist.
In that case we return from the function immediately upon finding the record and forget to close the reader.
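A hedged sketch of that failure mode, with a hypothetical IsRecordExist helper (requires using System.Data.SqlClient;): wrap the command and reader in using blocks so the reader is disposed even on the early-return path.
private bool IsRecordExist(SqlConnection connection, int id)
{
    using (var cmd = new SqlCommand("SELECT 1 FROM MyTable WHERE Id = @id", connection))
    {
        cmd.Parameters.AddWithValue("@id", id);
        using (var reader = cmd.ExecuteReader()) // disposed even when we return early
        {
            return reader.Read();
        }
    }
}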
In the project where I had this exact same issue, the problem wasn't in the foreach or the .ToList(); it was actually in the AutoFac configuration we used.
This created some weird situations where the above error was thrown, but also a bunch of other equivalent errors.
This was our fix:
Changed this:
container.RegisterType<DataContext>().As<DbContext>().InstancePerLifetimeScope();
container.RegisterType<DbFactory>().As<IDbFactory>().SingleInstance();
container.RegisterType<UnitOfWork>().As<IUnitOfWork>().InstancePerRequest();
To:
container.RegisterType<DataContext>().As<DbContext>();
container.RegisterType<DbFactory>().As<IDbFactory>().InstancePerLifetimeScope();
container.RegisterType<UnitOfWork>().As<IUnitOfWork>(); //.InstancePerRequest();
I know it is an old question, but I faced this error today, and I found that it can be thrown when a database table trigger gets an error.
For your information, check your table triggers too when you get this error.
I needed to read a huge result set and update some records in the table.
I tried to use chunks as suggested in Drew Noakes's answer.
Unfortunately, after 50000 records I got an OutOfMemoryException.
The answer Entity framework large data set, out of memory exception explains that
EF creates a second copy of the data which it uses for change detection (so that it can persist changes to the database). EF holds this second set for the lifetime of the context, and it's this set that runs you out of memory.
The recommendation is to re-create your context for each batch.
So I retrieved the minimum and maximum values of the primary key (the tables have auto-incremented integer primary keys). Then I retrieved chunks of records from the database, opening a new context for each chunk. After a chunk is processed, its context closes and releases the memory, which ensures that memory usage does not grow.
Below is a snippet from my code:
public void ProcessContextByChunks()
{
    var tableName = "MyTable";
    var startTime = DateTime.Now;
    int i = 0;
    int foundCount = 0;
    var minMaxIds = GetMinMaxIds();
    for (int fromKeyID = minMaxIds.From; fromKeyID <= minMaxIds.To; fromKeyID = fromKeyID + _chunkSize)
    {
        try
        {
            using (var context = InitContext())
            {
                var chunk = GetMyTableQuery(context)
                    .Where(r => (r.KeyID >= fromKeyID) && (r.KeyID < fromKeyID + _chunkSize));
                try
                {
                    foreach (var row in chunk)
                    {
                        foundCount = UpdateRowIfNeeded(++i, row);
                    }
                    context.SaveChanges();
                }
                catch (Exception exc)
                {
                    LogChunkException(i, exc);
                }
            }
        }
        catch (Exception exc)
        {
            LogChunkException(i, exc);
        }
    }
    LogSummaryLine(tableName, i, foundCount, startTime);
}

private FromToRange<int> GetMinMaxIds()
{
    var minMaxIds = new FromToRange<int>();
    using (var context = InitContext())
    {
        var allRows = GetMyTableQuery(context);
        minMaxIds.From = allRows.Min(n => (int?)n.KeyID ?? 0);
        minMaxIds.To = allRows.Max(n => (int?)n.KeyID ?? 0);
    }
    return minMaxIds;
}

private IQueryable<MyTable> GetMyTableQuery(MyEFContext context)
{
    return context.MyTable;
}

private MyEFContext InitContext()
{
    var context = new MyEFContext();
    context.Database.Connection.ConnectionString = _connectionString;
    //context.Database.Log = SqlLog;
    return context;
}
FromToRange is a simple structure with From and To properties.
Recently I faced the same issue in my project, so I am posting my experience; it might help someone in the same boat as I was. The issue was that I was looping through the results of an EF select query (the results are not retrieved into memory):
var products = (from e in _context.Products
                where e.StatusId == 1
                select new { e.Name, e.Type });

foreach (var product in products)
{
    // doing some insert EF queries
    // some EF select queries
    await _context.SaveChangesAsync(stoppingToken); // This code breaks.
}
I updated my products select query to bring the results into a List rather than an IQueryable (the IQueryable seems to keep the reader open throughout the foreach loop, and hence the save was failing):
var products = (from e in _context.Products
                where e.StatusId == 1
                select new { e.Name, e.Type }).ToList(); // note the .ToList()
The code below works for me:
private pricecheckEntities _context = new pricecheckEntities();
...
private void resetpcheckedtoFalse()
{
    try
    {
        foreach (var product in _context.products)
        {
            product.pchecked = false;
            _context.products.Attach(product);
            _context.Entry(product).State = EntityState.Modified;
        }
        _context.SaveChanges();
    }
    catch (Exception extofException)
    {
        MessageBox.Show(extofException.ToString());
    }
    productsDataGrid.Items.Refresh();
}
In my case, the problem appeared when I called a stored procedure via EF and then later SaveChanges threw this exception. The problem was that in calling the procedure, the enumerator was not disposed. I fixed the code the following way:
public bool IsUserInRole(string username, string roleName, DataContext context)
{
    var result = context.aspnet_UsersInRoles_IsUserInRoleEF("/", username, roleName);
    // using here solved the issue
    using (var en = result.GetEnumerator())
    {
        if (!en.MoveNext())
            throw new Exception("empty result of aspnet_UsersInRoles_IsUserInRoleEF");
        int? resultData = en.Current;
        return resultData == 1; // 1 = success, see T-SQL for return codes
    }
}
I am much late to the party, but today I faced the same error, and how I resolved it was simple. My scenario was similar to the code given above: I was making DB transactions inside nested foreach loops.
The problem is that a single DB transaction takes a little longer than a foreach loop iteration, so once the earlier transaction is not complete, the new one throws an exception. The solution is to create a new context object in the foreach loop where you are making the DB transaction.
For the above mentioned scenario, the solution will be like this:
foreach (RivWorks.Model.Negotiation.AutoNegotiationDetails companyFeedDetail in companyFeedDetailList)
{
    // A fresh context per iteration, so no earlier transaction is still pending.
    var _dbRiv = RivWorks.Model.Stores.RivEntities(AppSettings.RivWorkEntities_connString);
    if (companyFeedDetail.FeedSourceTable.ToUpper() == "AUTO")
    {
        var company = (from a in _dbRiv.Company.Include("Product")
                       where a.CompanyId == companyFeedDetail.CompanyId
                       select a).First();
        foreach (RivWorks.Model.NegotiationAutos.Auto sourceProduct in client.Auto)
        {
            foreach (RivWorks.Model.Negotiation.Product targetProduct in company.Product)
            {
                if (targetProduct.alternateProductID == sourceProduct.AutoID)
                {
                    found = true;
                    break;
                }
            }
            if (!found)
            {
                var newProduct = new RivWorks.Model.Negotiation.Product();
                newProduct.alternateProductID = sourceProduct.AutoID;
                newProduct.isFromFeed = true;
                newProduct.isDeleted = false;
                newProduct.SKU = sourceProduct.StockNumber;
                company.Product.Add(newProduct);
            }
        }
        _dbRiv.SaveChanges(); // no longer breaks: this context was created inside the loop
    }
}
I am a little bit late, but I had this error too. I solved the problem by checking what the values being updated were.
I found out that my query was wrong and that there were over 250+ edits pending. So I corrected my query, and now it works correctly.
So, in my situation: check the query for errors by debugging over the result that the query returns, then correct the query.
Hope this helps resolve future problems.
My situation was similar to the others above. I had an IQueryable on which I was doing a foreach. This in turn called a method with SaveChanges(). Boom, an exception here, as there was already a transaction open from the query above.
// Example:
var myList = _context.Table.Where(x => x.time == null);
foreach (var i in myList)
{
    MyFunction(i); // <<-- has _context.SaveChanges(), which throws the exception
}
Adding ToList() to the end of the query was the solution in my case.
// Fix
var myList = _context.Table.Where(x => x.time == null).ToList();
Most of the answers are related to loops, but my problem was different. While I was trying to use multiple dbcontext.SaveChanges() commands in the same scope, I got the error many times.
In my case, for EF Core 3.1, using
dbcontext.Database.BeginTransaction()
and
dbcontext.Database.CommitTransaction();
fixed the problem. Here is my entire code:
public IActionResult ApplyForCourse()
{
    var master = _userService.GetMasterFromCurrentUser();
    var trainee = new Trainee
    {
        CourseId = courseId,
        JobStatus = model.JobStatus,
        Gender = model.Gender,
        Name = model.Name,
        Surname = model.Surname,
        Telephone = model.Telephone,
        Email = model.Email,
        BirthDate = model.BirthDate,
        Description = model.Description,
        EducationStatus = EducationStatus.AppliedForEducation,
        TraineeType = TraineeType.SiteFirst
    };

    dbcontext.Trainees.Add(trainee);
    dbcontext.SaveChanges();

    dbcontext.Database.BeginTransaction();
    var user = userManager.GetUserAsync(User).Result;
    master.TraineeId = trainee.Id;
    master.DateOfBirth = model.BirthDate;
    master.EducationStatus = trainee.EducationStatus;
    user.Gender = model.Gender;
    user.Email = model.Email;
    dbcontext.Database.CommitTransaction();
    dbcontext.SaveChanges();

    return RedirectToAction("Index", "Home");
}

Hitting the database in a loop

I am hitting the database in a loop. Please suggest how I can avoid this:
foreach (var assignedPricing in reviewingPricing)
{
    var assignedUserId = _wfActivitySvc.GetPricingReviewer(assignedPricing.PricingID).UserId;
    if (assignedUserId == UserId)
    {
        reviewingAssignedPricings.Add(assignedPricing);
    }
}
Create a new query in your database service:
// Build a collection with just unique ids
var priceIds = reviewingPricing.Select(x => x.PricingId).Distinct();

// Return a key/value collection with all priceId/UserId pairs
var reviewerMap = _wfActivitySvc.GetAllReviewersByPriceId(priceIds);

// now you can loop without DB queries
foreach (var pricing in reviewingPricing)
{
    var reviewer = reviewerMap.FirstOrDefault(x => x.PricingId == pricing.PricingId);
    if (reviewer == null)
        continue;
    if (reviewer.UserId == UserId)
    {
        reviewingAssignedPricings.Add(pricing);
    }
}
1) You may want to insert all records at once. You can create a stored procedure to do this. If you use SQL Server you can use the BulkInserter class: https://github.com/ronnieoverby/RonnieOverbyGrabBag/blob/master/BulkInserter.cs
For production I had to tweak it a little internally to reduce its initialization time, but for infrequent bulk inserts the GitHub version is just fine.
Usage example:
var bulkInserter = new BulkInserter<YourClass>(SqlConnection, "Table Name");
bulkInserter.Insert(reviewingAssignedPricings);
bulkInserter.Flush();
2) If this reads from the database every time:
_wfActivitySvc.GetPricingReviewer(assignedPricing.PricingID).UserId;
then replace it with a single call outside the loop to get all reviewers from the database, add them to a dictionary (key = priceID, value = reviewer), and then get reviewers from the dictionary within the loop. If you use a simple List and .FirstOrDefault(), it can be noticeably slow for 100+ items in the list. jgauffin's answer describes this idea.
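A hedged sketch of the dictionary idea, reusing the hypothetical GetAllReviewersByPriceId batch method and the priceIds collection from the answer above:
// One database round-trip, then O(1) lookups in memory.
var reviewerByPricingId = _wfActivitySvc
    .GetAllReviewersByPriceId(priceIds)
    .ToDictionary(r => r.PricingId, r => r.UserId);

foreach (var pricing in reviewingPricing)
{
    if (reviewerByPricingId.TryGetValue(pricing.PricingId, out var reviewerUserId)
        && reviewerUserId == UserId)
    {
        reviewingAssignedPricings.Add(pricing);
    }
}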

LINQ to Entities query optimizing

I have 3 tables in my DB which I'm working with:
Theme [Theme_ID]
ThemeWorkplace [Theme_ID, Workplace_ID, ThemeWorkplace_ID]
UserTheme [User_ID, Theme_ID, UserTheme_ID, UserTheme_AccessType]
I need to change UserTheme_AccessType for all UserTheme.Theme_ID rows in the current workplace (ThemeWorkplace.Workplace_ID = 2) for the current user (User_ID = 1). If there is no row in UserTheme for such a user and theme, I need to create it.
I wrote this code, but it takes too long to run:
var themeList = (from t in m_Entities.Theme
                 where (from tw in m_Entities.ThemeWorkplace
                        where tw.Workplace.Workplace_ID == 2
                        select tw.Theme.Theme_ID).Contains(t.Theme_ID)
                 select t)
                .ToList();

foreach (Theme theme in themeList)
{
    var oldUserTheme = GetByUserTheme(user /*user is given*/, theme);
    if (oldUserTheme == null)
    {
        /* create new UserTheme with the params that I need */
        this.Add(newUserTheme, true);
    }
    else
    {
        /* here - changing the found row */
        oldUserTheme.UserTheme_AccessType = 2;
    }
}
I understand that this code accesses the database too many times. I want to find a way to get rid of
var oldUserTheme = GetByUserTheme(user /*user is given*/, theme);
in every foreach iteration. Could somebody please help me?
Adding the code of GetByUserTheme():
private UserTheme GetByUserTheme(User user, Theme theme)
{
    return m_Entities.UserTheme.FirstOrDefault(ut =>
        ut.User.User_ID == user.User_ID && ut.Theme.Theme_ID == theme.Theme_ID);
}
First: all changes to entities that you have made in code will be pushed to the database in one batch command when you call context.SaveChanges. So you will have one request to the database for the select and one request for the updates.
But that batch command will contain many SQL queries, because EF generates the SQL that updates entities one by one (not all in one statement).
If you want to update really many records in the database, you should use a SQL script (call a stored procedure or execute a SQL query) instead of Entity Framework.
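A hedged sketch of that advice, assuming an ObjectContext-style model as in the question (with a DbContext you would use Database.ExecuteSqlCommand instead); the table and column names are taken from the question:
// One set-based UPDATE instead of one SQL statement per entity.
m_Entities.ExecuteStoreCommand(
    @"UPDATE ut SET ut.UserTheme_AccessType = 2
      FROM UserTheme ut
      JOIN ThemeWorkplace tw ON tw.Theme_ID = ut.Theme_ID
      WHERE tw.Workplace_ID = {0} AND ut.User_ID = {1}",
    2, 1);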
I don't know if I'm completely understanding your question and structure, but based on what I see, could this be a reasonable solution?
First you select the workplaces that have an ID equal to 2. From that result you select the theme IDs. All your userthemes that have a theme ID occurring in the former result will then be selected into 'userThemes'. From there, you iterate over the results, and if the user ID is empty you create a new UserTheme; otherwise you update it.
Quick note: the code below is not real working code. It's just code I wrote to exemplify my explanation, kind of pseudo-code if you will.. :)
var userThemes = entities.Userthemes
    .Where(ut => entities.Workplaces
        .Where(w => w.WorkPlaceID == 2)
        .Select(s => s.ThemeID)
        .Contains(ut.ThemeID));

foreach (UserTheme ut in userThemes)
{
    if (ut.UserID.ToString() == "")
    {
        // Create
    }
    else
        ut.UserThemeAccessType = 2;
}

LINQ to SharePoint 2010 getting error "All new entities within an object graph must be added/attached before changes are submitted."

I've been having a problem for some time, and I've exhausted all means of figuring this out for myself.
I have 2 lists in a MS SharePoint 2010 environment that are holding personal physician data for a medical group... nothing special, just mainly text fields and a few lookup choice fields.
I am trying to write a program that will migrate the data over from List A to List B. I am using LINQ to SharePoint to accomplish this. Everything compiles just fine, but when it runs and hits the SubmitChanges() method, I get a runtime error that states:
"All new entities within an object graph must be added/attached before changes are submitted."
This issue must be outside my realm of C# knowledge because I simply cannot find the solution for it. The problem is DEFINITELY stemming from the fact that some of the columns are of type "Lookup", because when I create a new "Physician" entity in my LINQ query, if I comment out the fields that deal with the lookup columns, everything runs perfectly.
With the lookup columns included, if I debug and hit breakpoints before the SubmitChanges() method, I can look at the new "Physician" entities created from the old list, and the fields, including data from the lookup columns, look good; the data is in there the way I want it to be. It just flakes out whenever it tries to actually update the new list with the new entities.
I have tried several methods of working around this error, all to no avail. In particular, I have tried creating a brand new EntityList and calling the Attach() method after each new "Physician" entity is created, but to no avail; it just sends me around in circles, chasing other errors such as "ID cannot be null", "Cannot insert entities that have been deleted", etc.
I am no farther now than when I first got this error and any help that anyone can offer would certainly be appreciated.
Here is my code:
using (ProviderDataContext ctx = new ProviderDataContext("http://dev"))
{
    SPSite sitecollection = new SPSite("http://dev");
    SPWeb web = sitecollection.OpenWeb();
    SPList theOldList = web.Lists.TryGetList("OldList_Physicians");

    // Create new Physician entities.
    foreach (SPListItem l in theOldList.Items)
    {
        PhysiciansItem p = new PhysiciansItem()
        {
            FirstName = (String)l["First Name"],
            Title = (String)l["Last Name"],
            MiddleInitial = (String)l["Middle Init"],
            ProviderNumber = Convert.ToInt32(l["Provider No"]),
            Gender = ConvertGender(l),
            UndergraduateSchool = (String)l["UG_School"],
            MedicalSchool = (String)l["Med_School"],
            Residency = (String)l["Residency"],
            Fellowship = (String)l["Fellowship"],
            Internship = (String)l["Internship"],
            PhysicianType = ConvertToPhysiciantype(l),
            Specialty = ConvertSpecialties(l),
            InsurancesAccepted = ConvertInsurance(l),
        };
        ctx.Physicians.InsertOnSubmit(p);
    }
    ctx.SubmitChanges(); // this is where it flakes out
}
// These are conversion functions that I wrote to convert the data from the old list to the new lookup columns.
private Gender ConvertGender(SPListItem l)
{
    Gender g = new Gender();
    if ((String)l["Sex"] == "M")
    {
        g = Gender.M;
    }
    else g = Gender.F;
    return g;
}

// Process and convert the 'Physician Type', namely the distinction between MD (Medical Doctor) and
// DO (Doctor of Osteopathic Medicine). State regulations require this information to be attached
// to a physician's profile.
private ProviderTypesItem ConvertToPhysiciantype(SPListItem l)
{
    ProviderTypesItem p = new ProviderTypesItem();
    p.Title = (String)l["Provider_Title:Title"];
    p.Intials = (String)l["Provider_Title"];
    return p;
}

// Process and convert current Specialty and SubSpecialty data into the single multi-choice lookup column.
private EntitySet<Item> ConvertSpecialties(SPListItem l)
{
    EntitySet<Item> theEntityList = new EntitySet<Item>();
    Item i = new Item();
    i.Title = (String)l["Provider Specialty"];
    theEntityList.Add(i);
    if ((String)l["Provider SubSpecialty"] != null)
    {
        Item theSubSpecialty = new Item();
        theSubSpecialty.Title = (String)l["Provider SubSpecialty"];
        theEntityList.Add(theSubSpecialty);
    }
    return theEntityList;
}

// Process and add insurance accepted.
// Note this is a conversion from 3 boolean columns in the SP environment to a multi-select enabled
// checkbox list.
private EntitySet<Item> ConvertInsurance(SPListItem l)
{
    EntitySet<Item> theEntityList = new EntitySet<Item>();
    if ((bool)l["TennCare"] == true)
    {
        Item TenncareItem = new Item();
        TenncareItem.Title = "TennCare";
        theEntityList.Add(TenncareItem);
    }
    if ((bool)l["Medicare"] == true)
    {
        Item MedicareItem = new Item();
        MedicareItem.Title = "Medicare";
        theEntityList.Add(MedicareItem);
    }
    if ((bool)l["Commercial"] == true)
    {
        Item CommercialItem = new Item();
        CommercialItem.Title = "Commercial";
        theEntityList.Add(CommercialItem);
    }
    return theEntityList;
}
So this may not be the answer you're looking for, but it's what's worked for me in the past. I've found that updating lookup fields using Linq to Sharepoint to be quite frustrating. It frequently doesn't work, or doesn't work efficiently (forcing me to query an item by ID just to set the lookup value).
You can set up the entity so that it has an int property for the lookup id (for each lookup field) and a string property for the lookup value. If, when you generate the entities using SPMetal, you don't generate the list that is being looked up, then it will do this on its own. What I like to do is (using your entity as an example):
1. Generate the entity for just that one list (Physicians) in some temporary folder.
2. Pull out the properties for the lookup id & value (there will also be private backing fields that need to come along for the ride too) for each of the lookups (or the ones that I'm interested in).
3. Create a partial class file for Physicians in my actual project file, so that regenerating the entire SPMetal file normally (without restricting it to just that list) doesn't overwrite changes.
4. Paste the lookup id & value properties in this partial Physicians class.
Now you will have 3 properties for each lookup field. For example, for PhysicianType there will be:
- PhysicianType, which is the one that is currently there. This is great when querying data, as you can perform joins and such very easily.
- PhysicianTypeId, which can occasionally be useful for queries if you only need the ID, as it makes things a bit simpler, but mostly I use it whenever setting the value. To set a lookup field you only need to set the ID. This is easy, and has a good track record of actually working (correctly) in my experience.
- PhysicianTypeValue, which can be useful when performing queries if you just need the lookup value as a string (meaning it will be the raw value, rather than something which is already parsed if it's a multivalued field, or a user field, etc.). Sometimes I'd rather parse it myself, or maybe just see what the underlying value is during development. Even if you don't use it and use the first property, I often bring it along for the ride since I'm already doing most of the work to bring the PhysicianTypeId field over.
It seems a bit hacky, and contrary to the general design of LINQ to SharePoint. I agree, but it also has the advantage of actually working, and not being all that hard (once you get the rhythm of it down and learn what exactly needs to be copied over to move the properties from one file to another).
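As a hedged illustration only, the pasted partial class might look roughly like this; the attribute arguments are guesses, so copy the exact ones SPMetal generated in your temporary folder rather than trusting these:
// Illustrative only: adapted from what SPMetal emits when the lookup's target
// list is not generated. Verify every name against your own generated output.
public partial class PhysiciansItem
{
    private System.Nullable<int> _physicianTypeId;
    private string _physicianTypeValue;

    [Microsoft.SharePoint.Linq.Column(Name = "PhysicianType", Storage = "_physicianTypeId",
        FieldType = "Lookup", IsLookupId = true)]
    public System.Nullable<int> PhysicianTypeId
    {
        get { return _physicianTypeId; }
        set { _physicianTypeId = value; }
    }

    [Microsoft.SharePoint.Linq.Column(Name = "PhysicianType", Storage = "_physicianTypeValue",
        FieldType = "Lookup", IsLookupValue = true)]
    public string PhysicianTypeValue
    {
        get { return _physicianTypeValue; }
    }
}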
