My Parallel.ForEach sometimes gets stuck. When it does, I break all in the debugger and execution is always at the start of the Parallel.ForEach. It doesn't happen every time, but when it does, it's after the loop has been running for a while.
Can you help me figure this out? I don't really understand why, other than that it might be because every thread is accessing the checkedPogos list.
public static Task<List<string>> Checks(List<string> pogos)
{
arret = false;
var pogoTask = new Task<List<string>>(() =>
{
List<string> checkedPogos = new List<string>();
Parallel.ForEach(pogos, (string pogo, ParallelLoopState state) =>
{
if (arret)
state.Stop();
pogo = pogo.Trim();
if (pogo != null && Check(pogo))
{
checkedPogos.Add(pogo);
}
});
return checkedPogos.Distinct().ToList();
});
pogoTask.Start();
return pogoTask;
}
EDIT: I've changed the List to a concurrent collection and it still didn't fix the problem.
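As an illustration of the shared-list concern, here is a minimal sketch of one way to build the result without sharing a plain List<string> across threads, using PLINQ instead of Parallel.ForEach. The arret cancellation flag is omitted, Check is assumed to be thread-safe, and this only addresses the unsynchronized Add; it is not a confirmed cause of the hang.
// Sketch: PLINQ accumulates the results internally, so no shared List<string> is mutated by the workers.
public static Task<List<string>> Checks(List<string> pogos)
{
    return Task.Run(() =>
        pogos.AsParallel()
             .Where(p => p != null)              // guard before Trim instead of after
             .Select(p => p.Trim())
             .Where(p => p.Length > 0 && Check(p))
             .Distinct()
             .ToList());
}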
I'm getting this exception with this code and can't understand why
private static void LoopBTCtx()
{
Task.Factory.StartNew(async () =>
{
while (true)
{
try
{
Thread.Sleep((int)TimeSpan.FromSeconds(10).TotalMilliseconds);
List<(string, SocketMessage, int)> _btcTX = btcTX;
foreach (var tx in btcTX)
{
int newConfirmations = GetBTCtxConfirmations(tx.Item1);
if (tx.Item3 != newConfirmations)
{
_btcTX.Remove(tx);
if (newConfirmations < 6)
{
_btcTX.Add((tx.Item1, tx.Item2, newConfirmations));
}
await tx.Item2.Channel.SendMessageAsync($"{tx.Item2.Author.Mention}, ``{tx.Item1}`` now has **{newConfirmations}**/6 confirmation{(newConfirmations != 1 ? "s" : null)}.");
}
}
btcTX = _btcTX;
}
catch (Exception e)
{
Console.WriteLine(e);
}
}
});
}
It's thrown after the first list element is processed in the foreach.
The line the stack trace points to is the one containing foreach (var tx in btcTX).
I tried using two different lists and then updating the main one once the foreach is done, as you can see in the code above, but that didn't fix it.
You still have one list.
The following statement just causes _btcTX to point to the same list instance as btcTX:
List<(string, SocketMessage, int)> _btcTX = btcTX;
So the main list is actually being modified by the Remove() and/or Add() calls while it is being enumerated, which is what throws.
One way to remove/add the items is to use a regular for loop with an index (from last to first); then you can remove/add items with no problem.
Another way is to keep the foreach loop but collect, inside the loop, the items (or indices) to remove and the items to add, and then perform the actual adding/removal after the loop (if you remove by index, remove from the last index to the first), as in the sketch below.
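A rough sketch of that second approach, applied to the loop from the question. It reuses btcTX, GetBTCtxConfirmations and the tuple shape from the question's code, removes by value as the original code does, and stays inside the async lambda so await is still available.
// Collect the changes during enumeration, apply them once the foreach has finished.
var toRemove = new List<(string, SocketMessage, int)>();
var toAdd = new List<(string, SocketMessage, int)>();
foreach (var tx in btcTX)
{
    int newConfirmations = GetBTCtxConfirmations(tx.Item1);
    if (tx.Item3 != newConfirmations)
    {
        toRemove.Add(tx);
        if (newConfirmations < 6)
        {
            toAdd.Add((tx.Item1, tx.Item2, newConfirmations));
        }
        await tx.Item2.Channel.SendMessageAsync($"{tx.Item2.Author.Mention}, ``{tx.Item1}`` now has **{newConfirmations}**/6 confirmation{(newConfirmations != 1 ? "s" : null)}.");
    }
}
// The list is no longer being enumerated, so it is safe to mutate it now.
foreach (var tx in toRemove) btcTX.Remove(tx);
foreach (var tx in toAdd) btcTX.Add(tx);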
I'm encountering (I hope) a deadlock issue with a WCF service I'm trying to write.
I have the following lock in a function that "locates" a particular item in the list:
CIPRecipe FindRecipe_ByUniqueID(string uniqueID)
{
lock (_locks[LOCK_RECIPES])
{
foreach (var r in _recipes.Keys)
{
if (_recipes[r].UniqueID == uniqueID)
{
return _recipes[r];
}
}
}
return null;
}
However, various functions iterate through this list and always apply the same lock, for example:
lock (_locks[LOCK_RECIPES_NO_ADD_OR_REMOVE])
{
foreach (var r in _recipes)
{
r.Value.UpdateSummary();
summaries.Add((RecipeSummary)r.Value.Summary);
}
}
What I suspect is that an item in _recipes in the above example calls a function which ultimately calls the first function, CIPRecipe FindRecipe_ByUniqueID(string uniqueID), and this causes a deadlock when it is reached in the iteration.
I need to stop this list changing while I'm iterating through it. Can someone advise me on the best practice?
Thanks
What you want is a ReaderWriterLockSlim: it lets unlimited concurrent readers through, but only a single writer, and it blocks all readers while the writer is writing.
This assumes _locks has been changed from an object[] to a ReaderWriterLockSlim[].
//On Read
CIPRecipe FindRecipe_ByUniqueID(string uniqueID)
{
var lockObj = _locks[LOCK_RECIPES];
lockObj.EnterReadLock();
try
{
foreach (var r in _recipes.Keys)
{
if (_recipes[r].UniqueID == uniqueID)
{
return _recipes[r];
}
}
}
finally
{
lockObj.ExitReadLock();
}
return null;
}
//On write
var lockObj = _locks[LOCK_RECIPES]; // Note this now uses the same lock object as the read method.
lockObj.EnterWriteLock();
try
{
foreach (var r in _recipes)
{
r.Value.UpdateSummary();
summaries.Add((RecipeSummary)r.Value.Summary);
}
}
finally
{
lockObj.ExitWriteLock();
}
I don't know if this will solve your deadlock issue; if the deadlock is caused by reads being allowed during a write, it may.
Perhaps a ConcurrentDictionary is called for here?
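For illustration only, a sketch of what that could look like for the lookup, assuming _recipes is replaced by a ConcurrentDictionary; the string key type here is a placeholder for whatever the real dictionary is keyed by.
// Sketch: ConcurrentDictionary handles concurrent reads and writes without explicit locks.
private static readonly ConcurrentDictionary<string, CIPRecipe> _recipes =
    new ConcurrentDictionary<string, CIPRecipe>();

CIPRecipe FindRecipe_ByUniqueID(string uniqueID)
{
    // Values returns a snapshot, so enumerating it is safe while other threads add or remove recipes.
    return _recipes.Values.FirstOrDefault(r => r.UniqueID == uniqueID);
}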
I have this code:
// positions is a List<Position>
Parallel.ForEach(positions, (position) =>
{
DeterminePostPieceIsVisited(position, postPieces);
});
private void DeterminePostPieceIsVisited(Position position, IEnumerable<Postpieces> postPieces)
{
foreach (var postPiece in postPieces)
{
if (postPiece.Deliverd)
continue;
var distanceToClosestPosition = postPiece.GPS.Distance(position.GPS);
postPiece.Deliverd = distanceToClosestPosition.HasValue && IsInRadius(distanceToClosestPosition.Value);
}
}
I know that 50 post pieces must end up with the property Deliverd set to true. But when running this code I get varying results: sometimes I get 44, another run gives 47. The results differ from execution to execution.
When I run this code using a plain foreach-loop I get the expected result. So I know my implementation of the method DeterminePostPieceIsVisited is correct.
Could someone explain to me why using the Parallel foreach gives me different results each time I execute this code?
You've already tried, I think, to avoid a race, but there still is one: if two threads are examining the same postPiece at the same time, they may both observe that Deliverd (sic) is false, then both assess whether it has been delivered to position (a distinct value for each thread), and both attempt to set a value for Deliverd. Often, I would guess, one of them will be trying to set it to false. Simple fix:
private void DeterminePostPieceIsVisited(Position position, IEnumerable<Postpieces> postPieces)
{
foreach (var postPiece in postPieces)
{
if (postPiece.Deliverd)
continue;
var distanceToClosestPosition = postPiece.GPS.Distance(position.GPS);
var delivered = distanceToClosestPosition.HasValue && IsInRadius(distanceToClosestPosition.Value);
if(delivered)
postPiece.Deliverd = true;
}
}
Also, by the way:
When I run this code using a plain foreach-loop I get the expected result. So I know my implementation of the method DeterminePostPieceIsVisited is correct.
The correct statement would be "I know my implementation is correct for single-threaded access". What you hadn't established is that the method was safe to call from multiple threads.
I have solved my issue with ConcurrentBag<T>. Here's what I use now:
var concurrentPostPiecesList = new ConcurrentBag<Postpiece>(postPieces);
Parallel.ForEach(positions, (position) =>
{
DeterminePostPieceIsVisited(position, concurrentPostPiecesList);
});
private void DeterminePostPieceIsVisited(Position position, ConcurrentBag<Postpieces> postPieces)
{
foreach (var postPiece in postPieces)
{
if (postPiece.Deliverd)
continue;
var distanceToClosestPosition = postPiece.GPS.Distance(position.GPS);
postPiece.Deliverd = distanceToClosestPosition.HasValue && IsInRadius(distanceToClosestPosition.Value);
}
}
I have the following code:
public class EmailJobQueue
{
private EmailJobQueue()
{
}
private static readonly object JobsLocker = new object();
private static readonly Queue<EmailJob> Jobs = new Queue<EmailJob>();
private static readonly object ErroredIdsLocker = new object();
private static readonly List<long> ErroredIds = new List<long>();
public static EmailJob GetNextJob()
{
lock (JobsLocker)
{
lock (ErroredIdsLocker)
{
// If there are no jobs or they have all errored then get some new ones - if jobs have previously been skipped then this will re-get them
if (!Jobs.Any() || Jobs.All(j => ErroredIds.Contains(j.Id)))
{
var db = new DBDataContext();
foreach (var emailJob in db.Emailing_SelectSend(1))
{
// Don't re-add jobs that already exist
if (Jobs.All(j => j.Id != emailJob.Id) && !ErroredIds.Contains(emailJob.Id))
{
Jobs.Enqueue(new EmailJob(emailJob));
}
}
}
while (Jobs.Any())
{
var curJob = Jobs.Dequeue();
// Check the job has not previously errored - if they all have then eventually we will exit the loop
if (!ErroredIds.Contains(curJob.Id))
return curJob;
}
return null;
}
}
}
public static void ReInsertErrored(long id)
{
lock (ErroredIdsLocker)
{
ErroredIds.Add(id);
}
}
}
I then start 10 threads which do this:
var email = EmailJobQueue.GetNextJob();
if (email != null)
{
// Breakpoint here
}
The thing is, if I put a breakpoint where the comment is and add one item to the queue, the breakpoint gets hit multiple times. Is this an issue with my code or a peculiarity of the VS debugger?
Thanks,
Joe
It appears as if you are getting your jobs from the database:
foreach (var emailJob in db.Emailing_SelectSend(1))
Is that database call marking the records as unavailable for selection in future queries? If not, I believe that's why you're hitting the breakpoint multiple times.
For example, if I replace that call to the database with the following, I see your behavior.
// MockDB is a static configured as `MockDB.Enqueue(new EmailJob{Id = 1})`
private static IEnumerable<EmailJob> GetJobFromDB()
{
return new List<EmailJob>{MockDB.Peek()};
}
However, if I actually Dequeue from the mock db, it only hits the breakpoint once.
private static IEnumerable<EmailJob> GetJobFromDB()
{
var list = new List<EmailJob>();
if (MockDB.Any())
list.Add(MockDB.Dequeue());
return list;
}
This is a side effect of debugging a multi-threaded piece of your application.
You are seeing the breakpoint being hit on each thread. Debugging a multi-threaded part of an application is tricky because you're actually debugging all threads at the same time. At times it will even jump between classes while you're stepping through, because the threads are doing different things, depending on your application.
Now, to address whether or not it's thread-safe: that really depends on how you're using the shared resources on those threads. If you're just reading, it's likely thread-safe. But if you're writing, you'll need at least a lock on the shared objects:
lock (someLockObject)
{
// perform the write operation
}
With the new ConcurrentBag<T> in .NET 4, how do you remove a certain, specific object from it when only TryTake() and TryPeek() are available?
I'm thinking of using TryTake() and then just adding the resulting object back into the list if I don't want to remove it, but I feel like I might be missing something. Is this the correct way?
The short answer: you can't do it in an easy way.
The ConcurrentBag keeps a thread-local queue for each thread, and it only looks at other threads' queues once its own queue becomes empty. If you remove an item and put it back, the next item you remove may be the same item again. There is no guarantee that repeatedly removing items and putting them back will let you iterate over all the items.
Two alternatives for you:
Remove all items and remember them, until you find the one you want to remove, then put the others back afterwards. Note that if two threads try to do this simultaneously you will have problems.
Use a more suitable data structure such as ConcurrentDictionary.
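A minimal sketch of that second alternative, assuming the items can be keyed by some identifier; the key and value below are made up for illustration.
// Sketch: keying the items in a ConcurrentDictionary makes removing a specific one a single call.
var items = new ConcurrentDictionary<string, string>();
items.TryAdd("id-42", "payload for id-42");

string removed;
if (items.TryRemove("id-42", out removed))
{
    // 'removed' now holds the value that was stored under "id-42".
}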
You can't. It's a bag; it isn't ordered. If you keep taking items and putting them back, you can get stuck in an endless loop.
You want a set. You can emulate one with a ConcurrentDictionary, or with a HashSet<T> that you protect yourself with a lock.
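A minimal sketch of the lock-protected HashSet option; the field and method names are illustrative.
private readonly object _setLock = new object();
private readonly HashSet<string> _items = new HashSet<string>();

public bool RemoveItem(string item)
{
    // Every access to the set, reads included, must go through the same lock.
    lock (_setLock)
    {
        return _items.Remove(item);
    }
}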
The ConcurrentBag is great for a collection that many threads add to and enumerate, and that you eventually throw away, as its name suggests :)
As Mark Byers said, you can rebuild a new ConcurrentBag that does not contain the item you wish to remove, but you have to protect this against concurrent access with a lock. The rebuild itself is a one-liner:
myBag = new ConcurrentBag<Entry>(myBag.Except(new[] { removedEntry }));
This works, and it matches the spirit in which the ConcurrentBag was designed.
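With the lock mentioned above, that might look like the following; the lock object name is illustrative, while myBag, Entry and removedEntry come from the snippet above.
private static readonly object _bagLock = new object();

// The lock prevents two threads from rebuilding the bag at the same time
// and losing each other's changes.
lock (_bagLock)
{
    myBag = new ConcurrentBag<Entry>(myBag.Except(new[] { removedEntry }));
}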
As you mention, TryTake() is the only option. This is also the example on MSDN. Reflector shows no other hidden internal methods of interest either.
Mark is correct in that a ConcurrentDictionary will work the way you want. If you still wish to use a ConcurrentBag, the following (not efficient, mind you) will get you there.
var stringToMatch = "test";
var temp = new List<string>();
var x = new ConcurrentBag<string>();
for (int i = 0; i < 10; i++)
{
x.Add(string.Format("adding{0}", i));
}
string y;
while (!x.IsEmpty)
{
x.TryTake(out y);
if(string.Equals(y, stringToMatch, StringComparison.CurrentCultureIgnoreCase))
{
break;
}
temp.Add(y);
}
foreach (var item in temp)
{
x.Add(item);
}
public static void Remove<T>(this ConcurrentBag<T> bag, T item)
{
    // Take each item out once; keep everything that doesn't match and put it back afterwards.
    // Re-adding immediately inside the loop risks taking the same item again forever,
    // because TryTake prefers the current thread's local queue.
    var kept = new List<T>();
    T taken;
    while (bag.TryTake(out taken))
    {
        if (EqualityComparer<T>.Default.Equals(taken, item))
        {
            break; // found the item to remove; stop taking
        }
        kept.Add(taken);
    }
    foreach (var keep in kept)
    {
        bag.Add(keep);
    }
}
This is the extension class I use in my projects. It can remove a single item from a ConcurrentBag and can also remove a list of items from the bag.
public static class ConcurrentBag
{
    static Object locker = new object();
    public static void Clear<T>(this ConcurrentBag<T> bag)
    {
        // Drain the bag in place. Reassigning the 'bag' parameter to a new instance
        // would only change the local copy, not the caller's reference.
        T ignored;
        while (bag.TryTake(out ignored)) { }
    }
    public static void Remove<T>(this ConcurrentBag<T> bag, List<T> itemlist)
    {
        try
        {
            lock (locker)
            {
                // Take everything out of the bag, drop the requested items, then put the rest back.
                List<T> keep = new List<T>();
                T current;
                while (bag.TryTake(out current))
                {
                    keep.Add(current);
                }
                foreach (var currentitem in itemlist)
                {
                    keep.Remove(currentitem);
                }
                foreach (var currentitem in keep)
                {
                    bag.Add(currentitem);
                }
            }
        }
        catch (Exception ex)
        {
            Debug.WriteLine(ex.Message);
        }
    }
    public static void Remove<T>(this ConcurrentBag<T> bag, T removeitem)
    {
        bag.Remove(new List<T> { removeitem });
    }
}
public static ConcurrentBag<String> RemoveItemFromConcurrentBag(ConcurrentBag<String> Array, String Item)
{
var Temp = new ConcurrentBag<String>();
Parallel.ForEach(Array, Line =>
{
if (Line != Item) Temp.Add(Line);
});
return Temp;
}
how about:
bag.Where(x => x == item).Take(1);
It works, I'm not sure how efficiently...