ASP.NET Core SaveChangesAsync does not save everything - c#

PROBLEM:
If I put SaveChangesAsync outside of the loop, it only saves the last record that was added with _context.Add(attdef);
Why is that?
First I thought it was because of the auto-increment, but it still did not work after I disabled it.
Using SaveChanges instead of SaveChangesAsync does not fix the problem either.
Updating data, however, works fine.
GameController.cs
for (int i = 0; i < editViewModel.TowerAttack.Count; i++)
{
    tower = _context.Tower.First(m => m.TowerId == editViewModel.TowerId[i]);
    tower.Attack -= editViewModel.TowerAttack[i];
    _context.Update(tower);

    attdef.Id = 0; // AutoIncrement
    attdef.Amount = attackSum;
    _context.Add(attdef);
}
await _context.SaveChangesAsync();

I think you have declared the attdef variable somewhere above the loop, so each iteration updates the same reference and adds it to the context again. Because of this, only a single item ends up being added to the context. A better way is to do something like this:
var attdefs = new List<Attdef>();
for (int i = 0; i < editViewModel.TowerAttack.Count; i++)
{
    tower = _context.Tower.First(m => m.TowerId == editViewModel.TowerId[i]);
    tower.Attack -= editViewModel.TowerAttack[i];
    _context.Update(tower);

    attdefs.Add(new Attdef { Id = 0, Amount = attackSum });
}
_context.AddRange(attdefs); // I don't remember the exact syntax, but this should be the faster way
await _context.SaveChangesAsync();

Is attdef declared outside the loop? Are you just updating the same object with each loop? I would expect only the latest version of that object to be added if that's the case.
If you're trying to add several new objects, try declaring attdef within the loop so you're working with a new object each time.
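For example, a minimal sketch of that second suggestion (the class and property names here are guesses based on the attdef variable in the question):
for (int i = 0; i < editViewModel.TowerAttack.Count; i++)
{
    tower = _context.Tower.First(m => m.TowerId == editViewModel.TowerId[i]);
    tower.Attack -= editViewModel.TowerAttack[i];
    _context.Update(tower);

    // A fresh entity instance per iteration, so EF Core tracks each one as a separate insert.
    var attdef = new Attdef();
    attdef.Amount = attackSum; // Id is left at its default so the database can assign it
    _context.Add(attdef);
}
await _context.SaveChangesAsync(); // now every added entity is inserted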

Related

ASP.NET adding data to class in loop

I have this code here:
for (int i = 0; i < reader.FieldCount; i++)
{
    RedBlue item = new RedBlue();
    if (reader.GetName(i).ToString().Contains("BID"))
    {
        item.baselinefinish = reader.GetValue(i).ToString();
    }
    if (reader.GetName(i).ToString().Contains("AID"))
    {
        item.actualenddate = reader.GetValue(i).ToString();
    }
    redBlue.Add(item);
}
What I am trying to do is loop through data and add it to a class. The problem is that my class has two strings, and I want to populate them in pairs: the first string gets the first item from the loop, the second string gets the second item, then the next two items go into a new object, and so on. In other words, instead of creating one object per loop iteration, I want to create one object for every two items. I really hope this makes sense. Does anyone know how I would accomplish this?
Currently what is happening is that it adds one of the strings to an object and then adds the second string to a new object.
You can use AutoMapper and do something like this
(adapted from what I remember of this framework, the docs here, and your example):
// Configure AutoMapper
Mapper.Initialize(cfg =>
    cfg.CreateMap<YourReaderClass, RedBlue>()
        .ForMember(dest => dest.baselinefinish, opt => opt.MapFrom(src => src.BID))
        .ForMember(dest => dest.actualenddate, opt => opt.MapFrom(src => src.AID)));

// Perform the mapping
RedBlue item = Mapper.Map<YourReaderClass, RedBlue>(reader);
You do the configuration once somewhere, and then you can perform as many mappings as you want. Of course, you have to manually indicate which field is mapped to which, with as many ForMember calls as you need.
EDIT
Actually, you could of course still do it without a 3rd-party library, as you were thinking. To solve the specific problem with your method:
Currently what is happening, is it will add one of the strings to the
class and then add the second string to a new class.
(By the way, you mean an instance of your class (an object), not the class itself.)
Of course this is happening, because you are creating a new object each time you iterate your loop!
If you do it like this, it should work:
// instantiate your object once, before the loop:
RedBlue item = new RedBlue();
for (int i = 0; i < reader.FieldCount; i++)
{
    if (reader.GetName(i).ToString().Contains("BID"))
    {
        item.baselinefinish = reader.GetValue(i).ToString();
    }
    if (reader.GetName(i).ToString().Contains("AID"))
    {
        item.actualenddate = reader.GetValue(i).ToString();
    }
}
// now you have one object named 'item' which should be what you want.
Not sure I follow, but right now you are creating a new object at every iteration, and therefore each object will only have one of the two strings set.
If your data is as you say, with BID in every second field and AID in the other, then as a poor man's solution you could simply increment i after reading the first string.
for (int i = 0; i < reader.FieldCount; i++)
{
    RedBlue item = new RedBlue();
    if (reader.GetName(i).ToString().Contains("BID"))
    {
        item.baselinefinish = reader.GetValue(i).ToString();
        i++; // move on to the matching AID field
    }
    if (i < reader.FieldCount && reader.GetName(i).ToString().Contains("AID"))
    {
        item.actualenddate = reader.GetValue(i).ToString();
    }
    redBlue.Add(item);
}
Or maybe I'm missing something?
Not sure if I follow your idea, but you can use even and odd index numbers to represent the first and second item of each pair.
Something like this:
RedBlue item = null; // declared outside the loop so the odd iterations can see the object created on the even ones
for (int i = 0; i < reader.FieldCount; i++)
{
    // if it's an even index number
    if (i % 2 == 0)
    {
        item = new RedBlue();
        redBlue.Add(item);
        if (reader.GetName(i).ToString().Contains("BID"))
            item.baselinefinish = reader.GetValue(i).ToString();
    }
    else
    {
        if (reader.GetName(i).ToString().Contains("AID"))
            item.actualenddate = reader.GetValue(i).ToString();
    }
}
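For comparison, here is a minimal sketch of the same pairing idea using a step of two, under the assumption that the fields strictly alternate BID, AID, BID, AID, and reusing the names from the question:
for (int i = 0; i + 1 < reader.FieldCount; i += 2)
{
    // One RedBlue object per BID/AID pair of fields.
    RedBlue item = new RedBlue();
    item.baselinefinish = reader.GetValue(i).ToString();     // BID field
    item.actualenddate = reader.GetValue(i + 1).ToString();  // AID field
    redBlue.Add(item);
}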

c# collections and re-numbering not working as expected

Hi, I'm trying to set up simple test data.
I simply want to take a smallish collection and make it bigger by adding it to itself multiple times.
After I've added them together, I want to re-number the LineNumber property
so that there are no duplicates and it goes in order: 1, 2, 3, 4...
No matter what I try it doesn't seem to work, and I can't see the mistake.
var sampleTemplateLine = dataContext.TemplateFileLines.ToList();

// tried this, it doesn't work either
//List<TemplateFileLine> lineRange = new List<TemplateFileLine>();
//lineRange.AddRange(sampleTemplateLine);
//lineRange.AddRange(sampleTemplateLine);
//lineRange.AddRange(sampleTemplateLine);
//lineRange.AddRange(sampleTemplateLine);

var allProducts = sampleTemplateLine
    .Concat(sampleTemplateLine)
    .Concat(sampleTemplateLine)
    .Concat(sampleTemplateLine)
    .ToList();
int i = 1;
foreach (var item in allProducts)
{
    item.LineNumber = i;
    i++;
}
This doesn't seem to work either:
//re-number the line numbers
var total = allProducts.Count();
for (int i = 0; i < total; i++)
{
    allProducts[i].LineNumber = i + 1;
}
PROBLEM: the line below returns 4 items when I'm expecting 1:
var itemthing = allProducts.Where(x => x.LineNumber == 17312).ToList();
You are adding the same objects multiple times. You would have to add new objects, or clone the ones you have.
The problem is that the list entries all point to the same objects, so if you change a property, it changes every entry that points to that object at the same time.
You can use a Clone method if one exists; if not, you can create your own Clone method as in this question.
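As a rough sketch of that suggestion (assuming TemplateFileLine has a public parameterless constructor and settable properties), you could copy each line into a fresh object while building the bigger list:
var sampleTemplateLine = dataContext.TemplateFileLines.ToList();

// Build the larger list from copies, so every entry is a distinct object.
// The object initializer is a placeholder: copy whichever properties
// TemplateFileLine actually has.
var allProducts = Enumerable.Repeat(sampleTemplateLine, 4)
    .SelectMany(lines => lines.Select(source => new TemplateFileLine
    {
        LineNumber = source.LineNumber,
        // ... copy the remaining properties here ...
    }))
    .ToList();

// Now re-numbering affects each entry independently.
for (int i = 0; i < allProducts.Count; i++)
{
    allProducts[i].LineNumber = i + 1;
}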

ArgumentOutOfRangeException Using Tasks

I'm getting an ArgumentOutOfRangeException and I'm really not sure why.
Task[] downloadTasks = new Task[music.Count];
for (int i = 0; i < music.Count; i++)
    downloadTasks[i] = Task.Factory.StartNew(() => DownloadAudio(music[i], lstQueue.Items[i]));

Task.Factory.ContinueWhenAll(downloadTasks, (tasks) =>
{
    MessageBox.Show("All the downloads have completed!",
                    "Success",
                    MessageBoxButtons.OK,
                    MessageBoxIcon.Information);
});
The error occurs when the for loop runs with i = 1, and I'm not sure why it does this when I'm positive that music.Count = 1.
I also tried this approach as an alternative to the for loop and got the same exception:
int index = 0;
foreach (MusicFile song in music)
{
    downloadTasks[index] = Task.Factory.StartNew(() => DownloadAudio(song, lstQueue.Items[index]));
    index++;
}
Is there anything in the above code that might cause this?
I'm also not sure if this is relevant, but I can accomplish the same thing using threads without any exception. It was only when I tried implementing tasks that this exception appeared.
This happens because you're passing StartNew a lambda expression, which implicitly captures your i variable. This effect is called a closure.
In order to get the proper behavior, you'll have to make a local copy of your index:
for (int i = 0; i < music.Count; i++)
{
    var currentIndex = i;
    downloadTasks[i] = Task.Factory.StartNew(() =>
        DownloadAudio(music[currentIndex],
                      lstQueue.Items[currentIndex]));
}
In both instances, you are closing over the loop variable i in the first example, or your manually assigned index in the second.
What is happening is that the final value of i / index is used after the loop completion, which is when i++ has incremented beyond the size of the iterated array. (See also here)
Either capture the value of i inside the loop with an additional variable as per #Yuval, or alternatively, look at ways of coupling the two collections together, such that you do not need to iterate music and lstQueue independently, e.g. here we pre-combine the two collections into a new anonymous class:
var musicQueueTuples = music.Zip(lstQueue.Items.Cast<object>(),
                                 (m, q) => new { Music = m, QueueItem = q })
                            .ToList();

// Which now allows us to use LINQ to project the tasks:
var downloadTasks = musicQueueTuples.Select(
    mqt => Task.Factory.StartNew(
        () => DownloadAudio(mqt.Music, mqt.QueueItem))).ToArray();

Task.Factory.ContinueWhenAll(downloadTasks, (tasks) => ...
Closures are your problem: the variable i is referenced by the lambda expression, so it has access to i and always reads its current value directly from memory.
You can create a factory function that creates the task handlers. You can follow this idea to solve the problem:
private Action CreateTaskHandler(int arg1)
{
    return () => DownloadAudio(music[arg1], lstQueue.Items[arg1]);
}

Task[] downloadTasks = new Task[music.Count];
for (int i = 0; i < music.Count; i++)
    downloadTasks[i] = Task.Factory.StartNew(CreateTaskHandler(i));
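If it helps to see the effect in isolation, here is a small self-contained sketch (not taken from the question) showing how tasks started in a for loop can all observe the final value of the loop variable unless it is copied first:
// Minimal console sketch of the closure pitfall. The first loop's output is
// timing-dependent and will often print "captured i = 3" three times, while
// the second loop always prints 0, 1 and 2 (in some order).
using System;
using System.Threading.Tasks;

class ClosureDemo
{
    static void Main()
    {
        var captured = new Task[3];
        for (int i = 0; i < 3; i++)
            captured[i] = Task.Run(() => Console.WriteLine($"captured i = {i}")); // all tasks share one i
        Task.WaitAll(captured);

        var copied = new Task[3];
        for (int i = 0; i < 3; i++)
        {
            int current = i; // copy made per iteration
            copied[i] = Task.Run(() => Console.WriteLine($"copied i = {current}"));
        }
        Task.WaitAll(copied);
    }
}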

How do I create a Dictionary<int, EntityState>, and add values inside a for loop?

So, I hope this is simple. I'm coming up with a way to store disconnected entities (because my case is quite peculiar), and for it to work, I'd like to create a Dictionary with those values inside a for loop.
But I'm getting an "An item with the same key has already been added" error, and I do not know why.
I've tried the following:
Dictionary<int, EntityState> StateProduct = new Dictionary<int, EntityState>();
for (int s = 0; s < userProducts.Count; s++) // userProducts.Count had a value of 3
{
    StateProduct.Add(s, EntityState.Modified);
}
But I get the "An item with the same key has already been added" error, and I really do not know what's going on.
Edit: Here is the complete code
var dbIboID = dbs.OrderDB.Where(x => x.OrderID == Order[0].OrderID).FirstOrDefault();
if (dbIboID.IboID != uid)
{
    return false;
}

//2nd Step:
//2.0 Attach it. Yes I know it sets it as unchanged. But let me do the magic trick!!!
dbIboID.OrderProcess = Order.ToList(); //CHANGED
dbs.OrderDB.Attach(dbIboID);

//2.1 Extract original values from the database.
var originalProducts = dbs.OrderProcessDB.Where(x => x.OrderProcessID == Order[0].OrderProcessID).ToList();
var userProducts = Order.ToList();

//This is a dictionary which will be used to set all other entities with their correct states!
Dictionary<int, System.Data.Entity.EntityState> StateProduct = new Dictionary<int, System.Data.Entity.EntityState>();

//2.3 Find newly added products. addedProducts = userProducts[key] - originalProducts[key]
if (userProducts.Count > originalProducts.Count)
{
    for (int i = originalProducts.Count - 1; i < userProducts.Count; i++)
    {
        StateProduct.Add(i, System.Data.Entity.EntityState.Added);
    }
}
//2.3 Find deleted products = originalProducts - userProducts. Do the reverse of the addedProducts
else
{
    for (int i = userProducts.Count - 1; i < originalProducts.Count; i++)
    {
        StateProduct.Add(i, System.Data.Entity.EntityState.Deleted);
    }
}

//2.4 Find modified products. modifiedProducts = [userProducts - addedProducts] that differ from originalProducts
//This is not 100% foolproof, because sometimes I will flag a modification
//even though the object remained unchanged.
for (int s = 0; s < userProducts.Count; s++)
{
    StateProduct.Add(s, System.Data.Entity.EntityState.Modified);
}

//2.5 Painting Process:
for (int i = 0; i < dbIboID.OrderProcess.Count(); i++)
{
    dbs.DB.Entry(dbIboID.OrderProcess[i]).State = StateProduct[i];
}
The code as you have shown it should not produce that exception: the dictionary was allocated immediately prior to the loop, so it should be empty, and the keys being added are all unique integers.
My guess is that the dictionary already had some values in it. If so, then using Add to set a value will throw an ArgumentException, since the value for that key can only be replaced, not added, because the Dictionary class only allows one value per key. (In the complete code you posted, the loops in step 2.3 already add some of those keys before the step 2.4 loop runs, which would cause exactly this.)
So, if you expect the dictionary not to already have a value for a key, and want an exception to be thrown if it does, do:
StateProduct.Add(s, EntityState.Modified);
If you want to add or replace a value, do:
StateProduct[s] = EntityState.Modified;
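Applied to the loops in your edited code, a minimal sketch (same variable names as your code, not a drop-in answer) that avoids the duplicate-key exception while keeping whatever state step 2.3 already assigned could look like this:
//2.4 Only mark products that were not already flagged as Added or Deleted in step 2.3.
for (int s = 0; s < userProducts.Count; s++)
{
    if (!StateProduct.ContainsKey(s))
    {
        StateProduct[s] = System.Data.Entity.EntityState.Modified;
    }
}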

Out of memory when creating a lot of objects C#

I'm processing 1 million records in my application, which I retrieve from a MySQL database. To do so I'm using LINQ to get the records, and .Skip() and .Take() to process 250 records at a time. For each retrieved record I need to create 0 to 4 Items, which I then add to the database. So the total number of Items that has to be created is around 2 million.
IQueryable<Object> objectCollection = dataContext.Repository<Object>();
int amountToSkip = 0;
IList<Object> objects = objectCollection.Skip(amountToSkip).Take(250).ToList();

while (objects.Count != 0)
{
    using (dataContext = new LinqToSqlContext(new DataContext()))
    {
        foreach (Object objectRecord in objects)
        {
            // Create 0 - 4 Random Items
            for (int i = 0; i < Random.Next(0, 4); i++)
            {
                Item item = new Item();
                item.Id = Guid.NewGuid();
                item.Object = objectRecord.Id;
                item.Created = DateTime.Now;
                item.Changed = DateTime.Now;
                dataContext.InsertOnSubmit(item);
            }
        }
        dataContext.SubmitChanges();
    }
    amountToSkip += 250;
    objects = objectCollection.Skip(amountToSkip).Take(250).ToList();
}
Now the problem arises when creating the Items. When running the application (and not even using dataContext) the memory increases consistently. It's like the items are never getting disposed. Does anyone notice what I'm doing wrong?
Thanks in advance!
OK, I've just discussed this situation with a colleague of mine and we've come to the following solution, which works!
int amountToSkip = 0;
var finished = false;

while (!finished)
{
    using (var dataContext = new LinqToSqlContext(new DataContext()))
    {
        var objects = dataContext.Repository<Object>().Skip(amountToSkip).Take(250).ToList();
        if (objects.Count == 0)
        {
            finished = true;
        }
        else
        {
            foreach (Object objectRecord in objects)
            {
                // Create 0 - 4 Random Items
                for (int i = 0; i < Random.Next(0, 4); i++)
                {
                    Item item = new Item();
                    item.Id = Guid.NewGuid();
                    item.Object = objectRecord.Id;
                    item.Created = DateTime.Now;
                    item.Changed = DateTime.Now;
                    dataContext.InsertOnSubmit(item);
                }
            }
            dataContext.SubmitChanges();
        }
        // Advance amountToSkip by the batch size so we don't go over the same Items again
        amountToSkip += 250;
    }
}
With this implementation we dispose of the Skip()/Take() results and their context every time, and thus don't leak memory!
Ahhh, the good old InsertOnSubmit memory leak. I've encountered it and bashed my head against the wall many times when trying to load data from large CSV files using LINQ to SQL. The problem is that even after calling SubmitChanges, the DataContext continues to track all objects that have been added using InsertOnSubmit. The solution is to call SubmitChanges after a certain number of objects, then create a new DataContext for the next batch. When the old DataContext is garbage collected, so are all the inserted objects it was tracking (and that you no longer require).
"But wait!" you say, "Creating and disposing of many DataContexts will have a huge overhead!". Well, not if you create a single database connection and pass it to each DataContext constructor. That way, a single connection to the database is maintained throughout, and the DataContext is otherwise a lightweight object that represents a small unit of work and should be discarded once that work is complete (in your example, after submitting a certain number of records).
My best guess here would be that the IQueryable is causing the memory leak.
Maybe there is no appropriate MySQL implementation of the Take/Skip methods and it's doing the paging in memory? Stranger things have happened, but your loop looks fine; all references should go out of scope and get garbage collected.
Well, yeah.
So at the end of that loop you'll attempt to have 2 million items in your list, no? Seems to me that the answer is trivial: store fewer items or get more memory.
-- Edit:
It's possible I've read it wrong; I'd probably need to compile and test it, but I can't do that now. I'll leave this here, but I could be wrong, as I haven't reviewed it carefully enough to be definitive. Nevertheless the answer may prove useful, or not. (Judging by the downvote, I guess not :P)
Have you tried declaring the Item outside the loop like this:
IQueryable<Object> objectCollection = dataContext.Repository<Object>();
int amountToSkip = 0;
IList<Object> objects = objectCollection.Skip(amountToSkip).Take(250).ToList();
Item item = null;

while (objects.Count != 0)
{
    using (dataContext = new LinqToSqlContext(new DataContext()))
    {
        foreach (Object objectRecord in objects)
        {
            // Create 0 - 4 Random Items
            for (int i = 0; i < Random.Next(0, 4); i++)
            {
                item = new Item();
                item.Id = Guid.NewGuid();
                item.Object = objectRecord.Id;
                item.Created = DateTime.Now;
                item.Changed = DateTime.Now;
                dataContext.InsertOnSubmit(item);
            }
        }
        dataContext.SubmitChanges();
    }
    amountToSkip += 250;
    objects = objectCollection.Skip(amountToSkip).Take(250).ToList();
}
