How to elegantly access a thread-safe collection and use AutoResetEvent - C#

I have two methods, ProcessQueue and AddToQueue, which run on different threads. Sometimes I will attempt to process the queue before an item has been added, at which point I want to wait for an item to be added. I also want to avoid the race where the queue is checked and found empty, the other thread then adds an item and signals, and only afterwards do I start waiting (and so miss the signal). Below is my attempt at doing this, but it creates a deadlock because the AutoResetEvent waits while the lock is still held.
There has to be a more elegant way of doing this. Any suggestions?
private readonly object m_Locker = new object();
private readonly Queue<int> m_Queue = new Queue<int>();
private readonly AutoResetEvent m_AutoResetEvent = new AutoResetEvent(false);

void ProcessQueue()
{
    lock (m_Locker)
    {
        if (m_Queue.Count == 0)
        {
            // nothing is happening, so wait for it to happen
            m_AutoResetEvent.WaitOne();
        }
    }
    Console.WriteLine("Processed {0}", m_Queue.Dequeue());
}

// on another thread
void AddToQueue(int i)
{
    lock (m_Locker)
    {
        m_Queue.Enqueue(i);
        m_AutoResetEvent.Set();
    }
}

You must release the lock on the queue (m_Locker) before you issue the wait. You could do that manually with Monitor.Wait, which releases the lock while it waits, then reacquire and recheck after your wait is satisfied (see the sketch at the end of this answer). This way you only hold the lock while you are checking for a non-zero element count.
If you are on .Net 4 you can use BlockingCollection<T> or ConcurrentQueue<T> instead, from System.Collections.Concurrent. There's really no reason to build this by hand any more.
This code won't work if you have > 1 concurrent consumer - you'd need a Semaphore instead of AutoResetEvent in that case to ensure the correct number of consumers get signaled.
Since you can't use .Net 4, there are guidelines for this scenario here. Note that the comments on that article include some approaches you can use to make this bulletproof.
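For reference, here is a minimal sketch of that Monitor-based approach (the field names simply mirror the question; this is not the linked article's exact code). Monitor.Wait releases m_Locker while it blocks, so the producer can still enqueue, and the count is rechecked in a loop after every wake-up:
private readonly object m_Locker = new object();
private readonly Queue<int> m_Queue = new Queue<int>();

void ProcessQueue()
{
    int item;
    lock (m_Locker)
    {
        while (m_Queue.Count == 0)
        {
            // releases m_Locker, blocks until a Pulse, then reacquires the lock
            Monitor.Wait(m_Locker);
        }
        item = m_Queue.Dequeue();
    }
    Console.WriteLine("Processed {0}", item);
}

// on another thread
void AddToQueue(int i)
{
    lock (m_Locker)
    {
        m_Queue.Enqueue(i);
        Monitor.Pulse(m_Locker); // wakes one waiting consumer, if any
    }
}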
The following example demonstrates thread synchronization between the primary thread and two worker threads using the lock keyword, and the AutoResetEvent and ManualResetEvent classes.

The problem is that you keep the queue locked while you're waiting for the event.
This way the other thread can't add to the queue, because it is already locked. Try this:
int value = 0;
while (true)
{
    lock (m_Locker)
    {
        if (m_Queue.Count > 0)
        {
            value = m_Queue.Dequeue();
            break;
        }
    }
    m_AutoResetEvent.WaitOne();
}
With the example above you also dequeue inside the lock, so you can be sure that no other thread has a chance to dequeue between the moment the wait returns and the moment you check that the queue actually has an item.

Well, this is a textbook deadlock example. The bottom line is that you don't want to enter the wait state on your AutoResetEvent while holding the lock on m_Locker in the ProcessQueue function.
Also, note that the generic Queue<T> implementation in .NET is not thread-safe, so you should also guard access to the Dequeue call in ProcessQueue.

Wouldn't you want to do:
// no lock up here
while (true)
{
    // nothing is happening, so wait for it to happen
    m_AutoResetEvent.WaitOne();
    lock (m_Locker)
    {
        // ProcessTheQueue(); // process the queue after the reset event is Set
    }
}
and then:
lock (m_Locker)
{
    m_Queue.Enqueue(i);
}
m_AutoResetEvent.Set();
?

If you are using .NET 4 the new BlockingCollection<T> provides the most elegant way to handle this.
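As a rough sketch of what that might look like for the code in the question (the field name is assumed), Take() blocks until an item is available, so no explicit lock or event is needed:
// requires System.Collections.Concurrent
private readonly BlockingCollection<int> m_Items = new BlockingCollection<int>();

void ProcessQueue()
{
    int item = m_Items.Take();   // blocks while the collection is empty
    Console.WriteLine("Processed {0}", item);
}

// on another thread
void AddToQueue(int i)
{
    m_Items.Add(i);
}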

Why bother with the AutoResetEvent in the first place?
When you call the Process function, if it doesn't find anything then it should just exit. I don't see the point in waiting, since you'll probably just call it again after a while...

Related

In the scenario of using Wait() and Pulse(), can we replace `while` with `if`?

On the internet, I have seen many examples of Wait() and Pulse(), and they use two `while` loops, like in this example:
class MyQueue
{
    private Queue<string> queue = new Queue<string>();
    private const int CAPACITY = 3;

    public void Put(string element)
    {
        lock (this)
        {
            // first `while`
            while (queue.Count == CAPACITY)
            {
                Monitor.Wait(this);
            }
            queue.Enqueue(element);
            Console.WriteLine($"Put {element} ({queue.Count})");
            Monitor.Pulse(this);
        }
    }

    public string Take()
    {
        lock (this)
        {
            // second `while`
            while (queue.Count == 0)
            {
                Monitor.Wait(this);
            }
            string element = queue.Dequeue();
            Console.WriteLine($"Taked {element} ({queue.Count})");
            Monitor.Pulse(this);
            return element;
        }
    }
}
In the Main():
MyQueue queue = new MyQueue();

new Thread(new ThreadStart(() => {
    queue.Take();
    queue.Take();
})).Start();

new Thread(new ThreadStart(() => {
    queue.Put("a");
    queue.Put("b");
    queue.Put("c");
    queue.Put("d");
    queue.Put("e");
    queue.Put("f");
})).Start();
I think I understand the scenario for using Pulse() and Wait().
In the above example, I think it's OK to replace the two `while` loops with `if`. I tried it and it printed the same result.
Is that right? Thank you.
In your exact example, probably it would be fine to do as you suggest. You have exactly one producer and one consumer, and so they should always operate in concert with each other to ensure a thread is woken only if its wait condition is resolved.
However:
The producer and consumer implementations would not be safe with `if` if you have more than one producer or more than one consumer. This is because the threads could be racing: one thread could be made runnable, but then not actually scheduled until another thread has in some way invalidated the original resolution of the wait condition.
While I'm skeptical that the .NET Monitor class is subject to the problem of spurious wake-ups (i.e. a thread in a wait state being woken due to some event other than an explicit wake by a cooperating thread, such as a call to Monitor.Pulse()), people who know concurrent programming and C# much better than I do have said otherwise (see e.g. Does C# Monitor.Wait() suffer from spurious wakeups?). If you're at all concerned about spurious wake-ups, you'll want a loop instead of a simple if, to ensure that you recheck the wait condition before proceeding, just in case it wasn't actually satisfied when your thread was woken.
See also Eric Lippert's article Monitor madness, part two.
All that said, note that a producer/consumer scenario is much more easily implemented in modern .NET/C# by using BlockingCollection<T>. You can even include a maximum length for the queue when creating the collection to provide the "block if full" behavior seen in your code example.
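For illustration, a hedged sketch of the same producer/consumer pair on top of BlockingCollection<T> (from System.Collections.Concurrent); the capacity of 3 mirrors CAPACITY in your code, and Add/Take do the blocking that Wait/Pulse do by hand:
var queue = new BlockingCollection<string>(boundedCapacity: 3);

// producer thread: Add blocks while the collection is full
queue.Add("a");

// consumer thread: Take blocks while the collection is empty
string element = queue.Take();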

How to wait for a boolean without looping (using any kind of wait / semaphore / event / mutex, etc)

I need to stop a thread until another thread sets a boolean value, and I don't want to share an event between them.
What I currently have is the following code using Sleep (and that's the code I want to change):
while (!_engine.IsReadyToStop())
{
    System.Threading.Thread.Sleep(Properties.Settings.Default.IntervalForCheckingEngine);
}
Any ideas?
EDIT TO CLARIFY THINGS:
There is an object called _engine of a class that I don't own. I cannot modify it, that's why I don't want to share an event between them. I need to wait until a method of that class returns true.
SpinWait.SpinUntil is the right answer, regardless of where you're going to place this code. SpinUntil offers "a nice mix of spinning, yielding, and sleeping in between invocations".
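A minimal sketch of what that looks like for your loop (the timeout value is illustrative):
// spins, yields, then sleeps between checks of the condition
SpinWait.SpinUntil(() => _engine.IsReadyToStop());

// or with a timeout; returns false if the engine was not ready in time
bool ready = SpinWait.SpinUntil(() => _engine.IsReadyToStop(), TimeSpan.FromSeconds(30));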
If you are using C# 4.0, you can use:
Task t = Task.Factory.StartNew(() => SomeCall(..));
t.Wait();
that is, by using the Task.Wait method.
If you have more than one task to run one after another, you can use Task.ContinueWith:
Task t = Task.Factory.StartNew(() => SomeCall(..))
    .ContinueWith(previous => ExecuteAfterThisTaskFinishes(...));
t.Wait();
declare as
AutoResetEvent _ReadyToStop = new AutoResetEvent(false);
and use as
_ReadyToStop.WaitOne();
and
_ReadyToStop.Set();
For more info see the Synchronization Primitives in .Net
A condition variable is the synchronization primitive you can use for waiting on a condition.
It does not exist natively in .NET, but the following link provides 100% managed code for a condition variable class implemented in terms of the SemaphoreSlim, AutoResetEvent and Monitor classes. It allows a thread to wait on a condition, and it can wake up one or more threads when the condition is satisfied. In addition, it supports timeouts and CancellationTokens.
To wait on a condition you write code similar to the following:
object queueLock = new object();
ConditionVariable notEmptyCondition = new ConditionVariable();
ConditionVariable notFullCondition = new ConditionVariable();

T Take()
{
    lock (queueLock)
    {
        while (queue.Count == 0)
        {
            // wait for queue to be not empty
            notEmptyCondition.Wait(queueLock);
        }
        T item = queue.Dequeue();
        if (queue.Count < 100)
        {
            // notify producer queue not full anymore
            notFullCondition.Pulse();
        }
        return item;
    }
}
Then in another thread you can wake up one or more threads waiting on the condition:
lock (queueLock)
{
    // ..add item here
    notEmptyCondition.Pulse(); // or PulseAll
}

Don't understand the need for Monitor.Pulse()

According to MSDN, Monitor.Wait():
Releases the lock on an object and blocks the current thread until it
reacquires the lock.
However, everything I have read about Wait() and Pulse() seems to indicate that simply releasing the lock on another thread is not enough. I need to call Pulse() first to wake up the waiting thread.
My question is why? Threads waiting for the lock on a Monitor.Enter() just get it when it's released. There is no need to "wake them up". It seems to defeat the usefulness of Wait().
e.g.
static object _lock = new Object();

static void Main()
{
    new Thread(Count).Start();
    Thread.Sleep(10);
    lock (_lock)
    {
        Console.WriteLine("Main thread grabbed lock");
        Monitor.Pulse(_lock); // Why is this required when we're about to release the lock anyway?
    }
}

static void Count()
{
    lock (_lock)
    {
        int count = 0;
        while (true)
        {
            Console.WriteLine("Count: " + count++);
            // give other threads a chance every 10th iteration
            if (count % 10 == 0)
                Monitor.Wait(_lock);
        }
    }
}
If I use Exit() and Enter() instead of Wait() I can do:
static object _lock = new Object();

static void Main()
{
    new Thread(Count).Start();
    Thread.Sleep(10);
    lock (_lock) Console.WriteLine("Main thread grabbed lock");
}

static void Count()
{
    lock (_lock)
    {
        int count = 0;
        while (true)
        {
            Console.WriteLine("Count: " + count++);
            // give other threads a chance every 10th iteration
            if (count % 10 == 0)
            {
                Monitor.Exit(_lock);
                Monitor.Enter(_lock);
            }
        }
    }
}
You use Enter / Exit to acquire exclusive access to a lock.
You use Wait / Pulse to allow co-operative notification: I want to wait for something to occur, so I enter the lock and call Wait; the notifying code will enter the lock and call Pulse.
The two schemes are related, but they're not trying to accomplish the same thing.
Consider how you'd implement a producer/consumer queue where the consumer can say "Wake me up when you've got an item for me to consume" without something like this.
I myself had this same doubt, and despite some interesting answers (some of them present here), I still kept searching for a more convincing answer.
I think an interesting and simple thought on this matter would be: I can call Monitor.Wait(lockObj) at a particular moment in which no other thread is waiting to acquire a lock on the lockObj object. I just want to wait for something to happen (some object's state to change, for instance), which is something I know that will happen eventually, on some other thread. As soon as this condition is achieved, I want to be able to reacquire the lock as soon as the other thread releases its lock.
By the definition of the Monitor.Wait method, it releases the lock and tries to acquire it again. If it didn't wait for the Monitor.Pulse method to be called before trying to acquire the lock again, it would simply release the lock and immediately acquire it again (depending on your code, possibly in loop).
That is, I think it's interesting trying to understand the need of the Monitor.Pulse method by looking at its usefulness in the functioning of the Monitor.Wait method.
Think of it like this: "I don't want to release this lock and immediately try to acquire it again, because I DON'T WANT to be the next thread to acquire this lock. And I also don't want to sit in a loop with a call to Thread.Sleep, checking some flag to know when the condition I'm waiting for has been achieved, so that I can try to reacquire the lock. I just want to 'hibernate' and be woken automatically, as soon as someone tells me the condition I'm waiting for has been met."
Read the Remarks section of the linked MSDN page:
When a thread calls Wait, it releases the lock on the object and enters the object's waiting queue. The next thread in the object's ready queue (if there is one) acquires the lock and has exclusive use of the object. All threads that call Wait remain in the waiting queue until they receive a signal from Pulse or PulseAll, sent by the owner of the lock. If Pulse is sent, only the thread at the head of the waiting queue is affected. If PulseAll is sent, all threads that are waiting for the object are affected. When the signal is received, one or more threads leave the waiting queue and enter the ready queue. A thread in the ready queue is permitted to reacquire the lock.
This method returns when the calling thread reacquires the lock on the object. Note that this method blocks indefinitely if the holder of the lock does not call Pulse or PulseAll.
So, basically, when you call Monitor.Wait, your thread is in the waiting queue. For it to re-acquire the lock, it needs to be in the ready queue. Monitor.Pulse moves the first thread in the waiting queue to the ready queue and thus allows for it to re-acquire the lock.
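A small illustrative sketch of that distinction (timings are just for demonstration): releasing the lock on its own does not wake the waiter, because the waiter sits in the waiting queue, not the ready queue, until a Pulse moves it over.
static readonly object _gate = new object();

static void Main()
{
    new Thread(Waiter).Start();
    Thread.Sleep(100);                      // give the waiter time to call Wait

    lock (_gate) { }                        // lock acquired and released: the waiter stays blocked

    lock (_gate) { Monitor.Pulse(_gate); }  // now the waiter enters the ready queue and resumes
}

static void Waiter()
{
    lock (_gate)
    {
        Monitor.Wait(_gate);                // releases _gate and waits for a Pulse
        Console.WriteLine("Woken by Pulse");
    }
}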

Explanation about obtaining locks

I have been coding with C# for a good little while, but this locking sequence does not make any sense to me. My understanding of locking is that once a lock is obtained with lock(object), the code has to exit the lock scope to unlock the object.
This brings me to the question at hand. I cut out the code below, which happens to appear in an animation class in my code. The way the method works is that settings are passed to the method, modified, and then passed to another overloaded method. That other overloaded method passes all the information to another thread to handle and actually animate the object in some way. When the animation completes, the other thread calls the OnComplete method. This actually all works perfectly, but I do not understand why!
The other thread is able to call OnComplete, obtain a lock on the object and signal to the original thread that it should continue. Should the code not freeze at this point since the object is held in a lock on another thread?
So this is not a need for help in fixing my code, it is a need for clarification on why it works. Any help in understanding is appreciated!
public void tween(string type, object to, JsDictionaryObject properties)
{
    // Settings class that has a delegate field OnComplete.
    Tween.Settings settings = new Tween.Settings();
    object wait_object = new object();

    settings.OnComplete = () =>
    {
        // Why are we able to obtain a lock when the wait_object already has a lock below?
        lock (wait_object)
        {
            // Let the waiting thread know it is ok to continue now.
            Monitor.Pulse(wait_object);
        }
    };

    // Send settings to other thread and start the animation.
    tween(type, null, to, settings);

    // Obtain a lock to ensure that the wait object is in synchronous code.
    lock (wait_object)
    {
        // Wait here if the script tells us to. Time out with total duration time + one second to ensure that we actually DO progress.
        Monitor.Wait(wait_object, settings.Duration + 1000);
    }
}
As documented, Monitor.Wait releases the monitor it's called with. So by the time you try to acquire the lock in OnComplete, there won't be another thread holding the lock.
When the monitor is pulsed (or the call times out) it reacquires it before returning.
From the docs:
Releases the lock on an object and blocks the current thread until it reacquires the lock.
I wrote an article about this: Wait and Pulse demystified
There's more going on than meets the eye!
Remember that:
lock (someObj)
{
    int uselessDemoCode = 3;
}
Is equivalent to:
Monitor.Enter(someObj);
try
{
    int uselessDemoCode = 3;
}
finally
{
    Monitor.Exit(someObj);
}
Actually there are variants of this expansion that vary from version to version.
Already, it should be clear that we could mess with this with:
lock (someObj)
{
    Monitor.Exit(someObj);
    // Don't have the lock here!
    Monitor.Enter(someObj);
    // Have the lock again!
}
You might wonder why someone would do this, and well, so would I, it's a silly way to make code less clear and less reliable, but it does come into play when you want to use Pulse and Wait, which the version with explicit Enter and Exit calls makes clearer. Personally, I prefer to use them over lock if I'm going to Pulse or Wait for that reason; I find that lock stops making code cleaner and starts making it opaque.
I tend to avoid this style, but, as Jon already said, Monitor.Wait releases the monitor it's called with, so there is no locking at that point.
But the example is slightly flawed IMHO. The problem is, generally, that if Monitor.Pulse gets called before Monitor.Wait, the waiting thread will never be signaled. Having that in mind, the author decided to "play safe" and used an overload which specified a timeout. So, putting aside the unnecessary acquiring and releasing of the lock, the code just doesn't feel right.
To explain this better, consider the following modification:
public static void tween()
{
    object wait_object = new object();

    Action OnComplete = () =>
    {
        lock (wait_object)
        {
            Monitor.Pulse(wait_object);
        }
    };

    // let's say that a background thread
    // finished really quickly here
    OnComplete();

    lock (wait_object)
    {
        // this will wait for a Pulse indefinitely
        Monitor.Wait(wait_object);
    }
}
If OnComplete gets called before the lock is acquired in the main thread, and there is no timeout, we will get a deadlock. In your case, Monitor.Wait will simply hang for a while and continue after a timeout, but you get the idea.
That is why I usually recommend a simpler approach:
public static void tween()
{
    using (AutoResetEvent evt = new AutoResetEvent(false))
    {
        Action OnComplete = () => evt.Set();

        // let's say that a background thread
        // finished really quickly here
        OnComplete();

        // event is properly set even in this case
        evt.WaitOne();
    }
}
To quote MSDN:
The Monitor class does not maintain state indicating that the Pulse method has been called. Thus, if you call Pulse when no threads are waiting, the next thread that calls Wait blocks as if Pulse had never been called. If two threads are using Pulse and Wait to interact, this could result in a deadlock.
Contrast this with the behavior of the AutoResetEvent class: If you signal an AutoResetEvent by calling its Set method, and there are no threads waiting, the AutoResetEvent remains in a signaled state until a thread calls WaitOne, WaitAny, or WaitAll. The AutoResetEvent releases that thread and returns to the unsignaled state.
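A tiny illustration of that difference (not from the quote; the 1-second timeout just keeps the snippet from hanging):
var evt = new AutoResetEvent(false);
evt.Set();          // the signal is remembered...
evt.WaitOne();      // ...so this returns immediately

var gate = new object();
lock (gate) { Monitor.Pulse(gate); }                  // nobody is waiting: the pulse is lost
bool signaled;
lock (gate) { signaled = Monitor.Wait(gate, 1000); }  // false: the earlier pulse was not remembered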

Starting multiple threads and keeping track of them from my .NET application

I would like to start x number of threads from my .NET application, and I would like to keep track of them, as I will need to terminate them manually or when my application closes later on.
Example ==> Start Thread Alpha, Start Thread Beta .. then at any point in my application I should be able to say Terminate Thread Beta ..
What is the best way to keep track of opened threads in .NET, and what do I need to know (an id?) about a thread to terminate it?
You could save yourself the donkey work and use this Smart Thread Pool. It provides a unit of work system which allows you to query each thread's status at any point, and terminate them.
If that is too much bother, then as mentioned an IDictionary<string, Thread> is probably the simplest solution. Or, even simpler, give each of your threads a name and use an IList<Thread>:
public class MyThreadPool
{
    private IList<Thread> _threads;
    private readonly int MAX_THREADS = 25;

    public MyThreadPool()
    {
        _threads = new List<Thread>();
    }

    public void LaunchThreads()
    {
        for (int i = 0; i < MAX_THREADS; i++)
        {
            Thread thread = new Thread(ThreadEntry);
            thread.IsBackground = true;
            thread.Name = string.Format("MyThread{0}", i);

            _threads.Add(thread);
            thread.Start();
        }
    }

    public void KillThread(int index)
    {
        string id = string.Format("MyThread{0}", index);
        foreach (Thread thread in _threads)
        {
            if (thread.Name == id)
                thread.Abort();
        }
    }

    void ThreadEntry()
    {
    }
}
You can of course get a lot more involved and complicated with it. If killing your threads isn't time sensitive (for example if you don't need to kill a thread in 3 seconds in a UI) then a Thread.Join() is a better practice.
And if you haven't already read it, then Jon Skeet has this good discussion and solution for the "don't use abort" advice that is common on SO.
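If you do go the graceful route, a minimal sketch of the usual pattern looks like this (illustrative names; the flag is volatile so the worker sees the write from the controlling thread):
class StoppableWorker
{
    private volatile bool _stopRequested;
    private readonly Thread _thread;

    public StoppableWorker()
    {
        _thread = new Thread(Run) { IsBackground = true, Name = "StoppableWorker" };
    }

    public void Start() { _thread.Start(); }

    public void Stop()
    {
        _stopRequested = true;   // ask the loop to finish
        _thread.Join();          // and wait until it actually has
    }

    private void Run()
    {
        while (!_stopRequested)
        {
            // do one short unit of work per iteration so Stop() is honored promptly
        }
    }
}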
You can create a Dictionary of threads and assign them ids, like:
Dictionary<string, Thread> threads = new Dictionary<string, Thread>();
for (int i = 0; i < numOfThreads; i++)
{
    string id = "MyThread" + i;                       // any name/id you want to assign
    Thread thread = new Thread(new ThreadStart(MethodToExe));
    thread.Name = id;
    thread.Start();                                   // if you wish to start them straight away and call MethodToExe
    threads.Add(id, thread);
}
If you don't want to save threads against an Id you can use a list and later on just enumerate it to kill threads.
And when you wish to terminate them, you can abort them. Better to have some condition in your MethodToExe that allows the method to return, letting the thread terminate gracefully. Something like:
void MethodToExe()
{
    while (_isRunning)
    {
        // your code here
        if (!_isRunning)
        {
            break;
        }
        // your code here
    }
}
To abort you can enumerate the dictionary and call Thread.Abort(). Be ready to catch ThreadAbortException
I asked a similar questions and received a bunch of good answers: Shutting down a multithreaded application
Note: my question did not require a graceful exit, but people still recommended that I gracefully exit from the loop of each thread.
The main thing to remember is that if you want to avoid having your threads prevent your process from terminating you should set all your threads to background:
Thread thread = new Thread(new ThreadStart(testObject.RunLoop));
thread.IsBackground = true;
thread.Start();
The preferred way to start and manage threads is in a ThreadPool, but just about any container out there can be used to keep a reference to your threads. Your threads should always have a flag that will tell them to terminate and they should continually check it.
Furthermore, for better control you can supply your threads with a CountdownLatch: whenever a thread is exiting its loop it will signal on the CountdownLatch. Your main thread will call the CountdownLatch.Wait() method and block until all the threads have signaled... this ensures that all your threads have shut down before you start cleaning up.
public class CountdownLatch
{
    private int m_remain;
    private EventWaitHandle m_event;

    public CountdownLatch(int count)
    {
        Reset(count);
    }

    public void Reset(int count)
    {
        if (count < 0)
            throw new ArgumentOutOfRangeException();

        m_remain = count;
        m_event = new ManualResetEvent(false);
        if (m_remain == 0)
        {
            m_event.Set();
        }
    }

    public void Signal()
    {
        // The last thread to signal also sets the event.
        if (Interlocked.Decrement(ref m_remain) == 0)
            m_event.Set();
    }

    public void Wait()
    {
        m_event.WaitOne();
    }
}
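An illustrative usage of that CountdownLatch, assuming three worker threads (signal in a finally block so the latch is released even if the work throws):
var latch = new CountdownLatch(3);

for (int i = 0; i < 3; i++)
{
    new Thread(() =>
    {
        try
        {
            // the thread's work goes here
        }
        finally
        {
            latch.Signal();
        }
    }).Start();
}

latch.Wait();   // blocks until all three workers have signaled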
It's also worth mentioning that the Thread.Abort() method does some strange things:

When a thread calls Abort on itself, the effect is similar to throwing an exception; the ThreadAbortException happens immediately, and the result is predictable. However, if one thread calls Abort on another thread, the abort interrupts whatever code is running. There is also a chance that a static constructor could be aborted. In rare cases, this might prevent instances of that class from being created in that application domain. In the .NET Framework versions 1.0 and 1.1, there is a chance the thread could abort while a finally block is running, in which case the finally block is aborted.

The thread that calls Abort might block if the thread that is being aborted is in a protected region of code, such as a catch block, finally block, or constrained execution region. If the thread that calls Abort holds a lock that the aborted thread requires, a deadlock can occur.
After creating your thread, you can set its Name property. Assuming you store it in some collection, you can access it conveniently via LINQ in order to retrieve (and abort) it:
var myThread = (from thread in threads where thread.Name == "myThread" select thread).FirstOrDefault();
if (myThread != null)
    myThread.Abort();
Wow, there are so many answers...
You can simply use an array to hold the threads. This will only work if access to the array is sequential; if another thread accesses the array, you will need to synchronize access.
You can use the thread pool, but the thread pool is very limited and can only hold a fixed amount of threads.
As mentioned above, you can create your own thread pool, which in .NET 4 becomes much easier with the introduction of the thread-safe collections.
You can manage them by holding a list of mutex objects which determine when those threads should finish; each thread queries its mutex before doing anything else on every iteration, and terminates if it is set. You can manage the mutexes from anywhere, and since mutexes are by definition thread-safe, it's fairly easy.
I can think of another 10 ways, but those seem to work. Let me know if they don't fit your needs.
Depends on how sophisticated you need it to be. You could implement your own type of ThreadPool with helper methods etc. However, I think it's as simple as just maintaining a list/array and adding/removing the threads to/from the collection accordingly.
You could also use a Dictionary collection and use your own type of particular key to retrieve them i.e. Guids/strings.
As you start each thread, put its ManagedThreadId into a Dictionary as the key and the thread instance as the value. Use a callback from each thread to return its ManagedThreadId, which you can use to remove the thread from the Dictionary when it terminates. You can also walk the Dictionary to abort threads if needed. Make the threads background threads so that they terminate if your app terminates unexpectedly.
You can use a separate callback to signal threads to continue or halt, which reflects a flag set by your UI, for a graceful exit. You should also trap the ThreadAbortException in your threads so that you can do any cleanup if you have to abort threads instead.
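A rough sketch of that bookkeeping under those assumptions (names are illustrative; the dictionary is guarded by a lock because worker threads remove themselves from it):
private readonly Dictionary<int, Thread> _threadsById = new Dictionary<int, Thread>();
private readonly object _mapLock = new object();

void StartTracked(ThreadStart work)
{
    var thread = new Thread(() =>
    {
        int id = Thread.CurrentThread.ManagedThreadId;
        lock (_mapLock) { _threadsById[id] = Thread.CurrentThread; }
        try
        {
            work();
        }
        finally
        {
            // remove ourselves when the work is done (the "callback" described above)
            lock (_mapLock) { _threadsById.Remove(id); }
        }
    });
    thread.IsBackground = true;   // so it doesn't keep the process alive
    thread.Start();
}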
