Suppose we have a ConcurrentBag<object> safeBag filled with 100 objects.
One thread runs:
foreach(object o in safeBag)
{
Thread.Sleep(1000);
}
The other thread starts right after the first one:
{
safeBag.Add(something); // or: safeBag.TryTake(out something)
}
Will the second thread have to wait 100 seconds to access the bag?
Another question: if the first thread runs the loop with Parallel.ForEach(), how will the threads behave?
EDIT: MSDN says: "A List<T> can support multiple readers concurrently, as long as the collection is not modified. Enumerating through a collection is intrinsically not a thread-safe procedure. In the rare case where an enumeration contends with one or more write accesses, the only way to ensure thread safety is to lock the collection during the entire enumeration." Does enumerating through the ConcurrentBag cause the second thread to wait on its write access to the ConcurrentBag?
With most Concurrent* collections, most operations are atomic but don't hold any long-term locks. The first thread doesn't block the second thread after GetEnumerator() returns.
ConcurrentBag<T>.GetEnumerator Method
The enumeration represents a moment-in-time snapshot of the contents of the bag. It does not reflect any updates to the collection after GetEnumerator was called. The enumerator is safe to use concurrently with reads from and writes to the bag.
Assuming you're spawning the two threads right in a row (the first using a ThreadStart pointing at the block containing the iteration, the second pointing at the other code block), the second thread will not wait. The foreach block simply sleeps one second before moving to the next object in the snapshot; the second block is unaffected by that.
If it were a parallel foreach, you'd have several threads waiting for a second (concurrently) before moving to the next element. The second block would still not be waiting on the ConcurrentBag to become free.
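A minimal sketch of the scenario, assuming the two threads are started with Task.Run (the class and variable names are illustrative, not from the original post); the writer finishes almost immediately while the enumerator is still sleeping its way through the snapshot:

using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

class SnapshotDemo
{
    static void Main()
    {
        var safeBag = new ConcurrentBag<object>();
        for (int i = 0; i < 100; i++)
        {
            safeBag.Add(new object());
        }

        // Thread 1: walks the moment-in-time snapshot, sleeping between items.
        var enumerator = Task.Run(() =>
        {
            foreach (object o in safeBag)
            {
                Thread.Sleep(1000);
            }
        });

        // Thread 2: starts immediately and is not blocked by the enumeration.
        var writer = Task.Run(() =>
        {
            safeBag.Add(new object());   // returns right away
            object taken;
            safeBag.TryTake(out taken);  // also returns right away
        });

        writer.Wait();      // completes in milliseconds, not 100 seconds
        enumerator.Wait();
    }
}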
Related
I have a generic List<T> that two different threads add to and remove from.
Thread 1 :
int batchToBeProcessed = this.list.Count;
// Do some processing
this.list.RemoveRange(0, batchToBeProcessed);
Thread 2 :
lock (syncObject)
{
this.list.Add(item);
}
Is RemoveRange thread safe in the following scenario?
Say the list has 10 items that are being processed in thread 1, and while RemoveRange is executing, thread 2 adds 1 item. What would happen?
Edit: I don't want to use a ConcurrentBag since I need the ordering.
The answer to your question is No, it is not safe to have multiple threads modifying the list concurrently. You must synchronize access to the list using a lock or similar synchronization primitive. List is safe for multiple concurrent readers. Or you can have a single writer. But you can't modify the list in one thread while some other thread is accessing it at all.
As to what will happen if you try to Add while another thread is executing a RemoveRange, it's likely that the code will throw an exception. But it's possible no exception will be thrown and your list will become corrupted. Or it could work just fine. Which will happen depends on timing. But there's no guarantee that it will work.
To my knowledge, the .NET Framework does not have a concurrent data structure that gives you all the functionality of List.
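A minimal sketch of that lock-based approach, assuming both threads share one lock object (syncLock and the method names are illustrative, not from the original post). The List<T> keeps the ordering the asker needs:

using System.Collections.Generic;

class BatchProcessor
{
    private readonly object syncLock = new object();
    private readonly List<string> list = new List<string>();

    // Thread 1: copy the current batch and remove it under the lock,
    // then do the slow processing outside the lock.
    public void ProcessPending()
    {
        List<string> batch;
        lock (syncLock)
        {
            batch = list.GetRange(0, list.Count);
            list.RemoveRange(0, batch.Count);
        }
        // ... process batch here, without holding the lock ...
    }

    // Thread 2: additions take the same lock, so they can never
    // interleave with RemoveRange.
    public void Add(string item)
    {
        lock (syncLock)
        {
            list.Add(item);
        }
    }
}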
Is RemoveRange, thread safe in the following scenario?
No. The instance methods of the List<T> class are not thread safe.
Say the list has 10 items thats being processed in thread1, and while removerange is getting executed, thread2 adds 1 item, what would happen?
It's undefined: the new item may be added to the end of the list, old items may not be removed, and so on.
I know a static variable or collection is shared across threads: a single memory location is created for the variable, and its state persists across threads.
static int count = 0;
thread 1 --> count++
thread 2 --> display count --> 1
thread 3 --> count--
thread 1 --> display count --> 0
My question: is a locking mechanism required for static collections? Below is the static collection and the locking mechanism.
public static List<ConnectionManager> ServerConnections = new List<ConnectionManager>();
lock (Global.ServerConnections)
{
//do something
}
Sure. If you need just thread safety you can use the C# concurrent collections, but if you want some synchronization (like several actions on a collection inside one thread executing without any impact from other threads) you need locking.
Actually, you also need to take care of your variables: count++ and count-- are not thread safe. Use Interlocked or another mechanism to ensure thread safety.
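For example, a minimal sketch of the counter done with Interlocked (the class and method names are illustrative):

using System.Threading;

static class Counter
{
    private static int count = 0;

    public static void Increment() { Interlocked.Increment(ref count); }
    public static void Decrement() { Interlocked.Decrement(ref count); }

    // Plain int reads are atomic; Volatile.Read additionally ensures
    // the most recently written value is observed.
    public static int Read() { return Volatile.Read(ref count); }
}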
The answer is: Yes. You need a lock because Static != Thread safe. This applies to your count variable too.
Just because a variable is static that doesn't make it thread safe. Multiple threads can still access it at the exact same time which causes concurrency issues.
There is no thread safety to anything by default, it has to be designed to be thread safe.
Also take a look at the ConcurrentBag<T>.
It depends on how you use the collection and how you instantiate it. If you instantiate it from different threads, you should guarantee that only one thread performs the instantiation, because with bad luck several threads may attempt the instantiation at the same time. The Lazy<T> class in .NET exists for this purpose and makes thread-safe, lazy instantiation easy. Furthermore, you need to lock your collection for any operation you want to perform: Insert, Remove, Iterate, etc. are all not thread safe. Read about ConcurrentDictionary and the other thread-safe collections for more information.
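A minimal sketch of thread-safe, lazy instantiation with Lazy<T>, assuming a ConnectionManager type like the one in the question:

using System;
using System.Collections.Generic;

class ConnectionManager { /* placeholder for the question's type */ }

static class Global
{
    // The factory delegate runs at most once, even if several threads
    // hit Value at the same time.
    private static readonly Lazy<List<ConnectionManager>> serverConnections =
        new Lazy<List<ConnectionManager>>(
            () => new List<ConnectionManager>(),
            isThreadSafe: true);

    public static List<ConnectionManager> ServerConnections
    {
        get { return serverConnections.Value; }
    }
}

Note that Lazy<T> only makes the instantiation thread safe; operations on the list itself still need the lock shown in the question.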
I have not used concurrent queue before.
Is it OK to use TryDequeue as below, in a while loop? Could this not get stuck forever?
var cq = new ConcurrentQueue<string>();
cq.Enqueue("test");
string retValue;
while(!cq.TryDequeue(out retValue))
{
// Maybe sleep?
}
//Do rest of code
It's safe in the sense that the loop won't end until it has actually pulled an item out, and that it will eventually end if the queue has an item to take. If the queue is emptied by another thread and no more items are added, then of course the loop will not end.
Beyond all of that, what you have is a busy loop. This should virtually always be avoided. Either you constantly poll the queue for more items, wasting CPU time and effort in the process, or you sleep and therefore don't actually use the item in the queue as soon as it is added (and even then, you still waste some time and effort on context switches just to poll the queue).
What you should be doing instead, if you find yourself in the position of wanting to "wait until there is an item for me to take" is use a BlockingCollection. It is specifically designed to wrap various types of concurrent collections and block until there is an item available to take. It allows you to change your code to queue.Take() and have it be easier to write, semantically stating what you're doing, be clearly correct, noticeably more effective, and completely safe.
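A minimal sketch of that pattern (the surrounding harness is illustrative):

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class BlockingDemo
{
    static void Main()
    {
        // By default BlockingCollection wraps a ConcurrentQueue (FIFO).
        var queue = new BlockingCollection<string>();

        var consumer = Task.Run(() =>
        {
            // Take blocks until an item is available: no polling, no sleeping.
            string item = queue.Take();
            Console.WriteLine("Got: " + item);
        });

        queue.Add("test");
        consumer.Wait();
    }
}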
Yes it is safe as per the documentation, but it is not a recommended design.
It might get "stuck forever" if the queue was empty at the first TryDequeue call and no other thread pushes data into the queue after that point (you could break out of the while after N attempts or after a timeout, though).
ConcurrentQueue<T> offers an IsEmpty property to check whether there are items in the queue. Checking it is much more efficient than looping over a TryDequeue call (particularly if the queue is generally empty).
What you might want to do is :
while (cq.IsEmpty)
{
// Maybe sleep / wait / ...
}
if(cq.TryDequeue(out retValue))
{
...
}
EDIT:
If this last call returns false, another of your threads dequeued the item. If you don't have other threads, this is safe; if you do, you should use while (TryDequeue).
Can the following be considered thread safe because the operations appear atomic?
My main concern is that if the list needs to be resized, it becomes non-thread-safe during the resizing.
List<int> list = new List<int>(10);
public List<int> GetList()
{
var temp = list;
list = new List<int>(10);
return temp;
}
void TimerElapsed(int number)
{
list.Add(number);
}
No. List<T> is explicitly documented not to be thread-safe:
It is safe to perform multiple read operations on a List<T>, but issues can occur if the collection is modified while it's being read. To ensure thread safety, lock the collection during a read or write operation. To enable a collection to be accessed by multiple threads for reading and writing, you must implement your own synchronization. For collections with built-in synchronization, see the classes in the System.Collections.Concurrent namespace. For an inherently thread-safe alternative, see the ImmutableList<T> class.
Neither your code nor the List<T> are thread-safe.
The list isn't thread-safe according to its documentation. Your code is not thread safe because it lacks synchronization.
Consider two threads calling GetList concurrently. Let's say the first thread gets pre-empted right after setting up the temp. Now the second thread sets the temp of its own, replaces the list, and lets the GetList function run to completion. When the first thread gets to continue, it would return the same list that the second thread has just returned.
But that's not all! If a third thread has called TimerElapsed after the second thread has completed but before the first thread has completed, it would place a value in a list that is about to be overwritten without a trace. So not only would multiple threads return the same data, but also some of your data will disappear.
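One way to make the original snippet safe is to take a lock in both methods, so the swap and the Add can never interleave. A minimal sketch (the class and field names are illustrative):

using System.Collections.Generic;

class NumberBuffer
{
    private readonly object sync = new object();
    private List<int> list = new List<int>(10);

    // Swap out the current list and hand it to the caller atomically.
    public List<int> GetList()
    {
        lock (sync)
        {
            var temp = list;
            list = new List<int>(10);
            return temp;
        }
    }

    // Add under the same lock, so a number can never land in a list
    // that is about to be discarded.
    public void TimerElapsed(int number)
    {
        lock (sync)
        {
            list.Add(number);
        }
    }
}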
No. It is not ThreadSafe.
Try using members of the System.Collections.Concurrent namespace
As already mentioned, a List<T> is not thread safe. You can look at alternatives in the Concurrent namespace, possibly using the ConcurrentBag, or there is an article by Dean Chalk, Fast Parallel ConcurrentList<T> Implementation.
It is not thread safe, since a context switch can occur after the first line of the GetList method and transfer control to the TimerElapsed method, producing inconsistent results in different scenarios. Also, as other users already mentioned, the List<T> class is not thread safe, and you should use a System.Collections.Concurrent equivalent.
It is thread safe for reading only, not for writing.
For waiting on multiple threads, can anyone compare the pros and cons of WaitHandle.WaitAll and Thread.Join?
WaitHandle.WaitAll has a 64 handle limit so that is obviously a huge limitation. On the other hand, it is a convenient way to wait for many signals in only a single call. Thread.Join does not require creating any additional WaitHandle instances. And since it could be called individually on each thread the 64 handle limit does not apply.
Personally, I have never used WaitHandle.WaitAll. I prefer a more scalable pattern when I want to wait on multiple signals. You can create a counting mechanism that counts up or down, and once a specific value is reached you signal a single shared event. The CountdownEvent class conveniently packages all of this into a single class.
var finished = new CountdownEvent(1);
for (int i = 0; i < NUM_WORK_ITEMS; i++)
{
finished.AddCount();
SpawnAsynchronousOperation(
() =>
{
try
{
// Place logic to run in parallel here.
}
finally
{
finished.Signal();
}
}
);
}
finished.Signal();
finished.Wait();
Update:
The reason why you want to signal the event from the main thread is subtle. Basically, you want to treat the main thread as if it were just another work item. After all, it, along with the other real work items, is running concurrently as well.
Consider for a moment what might happen if we did not treat the main thread as a work item. It will go through one iteration of the for loop and add a count to our event (via AddCount) indicating that we have one pending work item, right? Let's say the SpawnAsynchronousOperation completes and gets the work item queued on another thread. Now, imagine the main thread getting preempted before swinging around to the next iteration of the loop. The thread executing the work item gets its fair share of the CPU, starts humming along, and actually completes the work item. The Signal call in the work item runs and decrements our pending work item count to zero, which changes the state of the CountdownEvent to signalled. In the meantime the main thread wakes up, goes through all iterations of the loop, and hits the Wait call, but since the event got prematurely signalled it passes on by even though there are still pending work items.
Again, avoiding this subtle race condition is easy when you treat the main thread as a work item. That is why the CountdownEvent is initialized with one count and the Signal method is called before the Wait.
I like @Brian's answer as a comparison of the two mechanisms.
If you are on .NET 4, it is worth exploring the Task Parallel Library to achieve task parallelism via System.Threading.Tasks, which lets you manage tasks across multiple threads at a higher level of abstraction. The signalling you asked about to manage thread interactions is hidden or much simplified, and you can concentrate on properly defining what each Task consists of and how to coordinate them.
This may seem off-topic, but as Microsoft themselves say in the MSDN docs:
in the .NET Framework 4, tasks are the preferred API for writing multi-threaded, asynchronous, and parallel code.
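For instance, a minimal sketch of waiting on several tasks in .NET 4 (the work itself is a placeholder):

using System;
using System.Threading.Tasks;

class TaskWaitDemo
{
    static void Main()
    {
        var tasks = new Task[10];
        for (int i = 0; i < tasks.Length; i++)
        {
            int id = i; // capture a private copy for the closure
            tasks[i] = Task.Factory.StartNew(() =>
            {
                // Place logic to run in parallel here.
                Console.WriteLine("task " + id + " finished");
            });
        }

        // No 64-handle limit, no manual Join bookkeeping.
        Task.WaitAll(tasks);
    }
}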
The WaitAll mechanism involves kernel-mode objects. I don't think the same is true for the Join mechanism. I would prefer Join, given the opportunity.
Technically though, the two are not equivalent: IIRC, Join can only operate on one thread at a time, while WaitAll can wait for the signalling of multiple kernel objects at once.