I have a multi-threaded program in C#. What is the best way to prevent deadlock in practice?
Is it to use timed locks (acquiring with a timeout)?
Also, what is the best tool available to help detect and prevent the deadlock?
Thank you very much.
Deadlocks typically occur in a few scenarios:
You are using several locks and not locking/unlocking them in the correct order. Hence, you may create a situation where a thread holds lock A and needs lock B, and another thread needs lock A and holds lock B. Neither of them can proceed. This is because each thread is locking in a different order.
When using a reentrant lock and locking it more times than you are unlocking it. See this related question: why does the following code result in deadlock
When using Monitor.Wait/Monitor.Pulse as a signaling mechanism, but the thread that must call Wait does not manage to reach the call by the time the other thread has called Pulse and the signal is lost. You can use the AutoResetEvent for a persistent signal.
You have a worker thread polling a flag to know when to stop. The main thread sets the flag and attempts to join the worker thread, but you forgot to make the flag volatile.
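For that last scenario, here is a minimal sketch (the names are illustrative, not from the question) of the worker-plus-flag pattern; marking the flag volatile is what keeps the Join from hanging:

using System;
using System.Threading;

class StopFlagDemo
{
    // 'volatile' forces a fresh read on every iteration; without it the JIT may
    // cache the field in a register and the worker never sees the update.
    static volatile bool stop;

    static void Main()
    {
        var worker = new Thread(() =>
        {
            while (!stop)           // polls the flag
            {
                Thread.Sleep(10);   // simulated work
            }
        });
        worker.Start();

        Thread.Sleep(100);
        stop = true;                // main thread requests shutdown
        worker.Join();              // returns promptly because the write is visible
        Console.WriteLine("worker stopped");
    }
}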
It's not C# specific. You should always acquire locks in some well-defined order.
There is a lot of information on the internet; for example, you might take a look here:
http://www.javamex.com/tutorials/threads/deadlock.shtml
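As a minimal sketch of that well-defined order (all names here are illustrative): if every thread acquires lockA before lockB on every code path, the circular wait described above cannot occur.

using System;
using System.Threading;

class LockOrderingDemo
{
    static readonly object lockA = new object();
    static readonly object lockB = new object();

    static void Transfer(string who)
    {
        lock (lockA)            // always acquire A first...
        {
            lock (lockB)        // ...then B, on every code path
            {
                Console.WriteLine($"{who} holds A and B");
            }
        }
    }

    static void Main()
    {
        var t1 = new Thread(() => Transfer("thread 1"));
        var t2 = new Thread(() => Transfer("thread 2"));
        t1.Start(); t2.Start();
        t1.Join(); t2.Join();
    }
}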
Related
I have a C#/.Net application.
I use Monitor.Enter and Monitor.Exit to acquire exclusive access on some object to synchronize threads.
I have a specific case where thread 1 should acquire exclusive access on some object. Another thread 2 should behave as follows: if there is no exclusive lock on the object, it should continue without acquiring an exclusive lock; if there is already an exclusive lock acquired by another thread, it should wait until it is released.
I have been looking in the documentation here.
But I couldn't find any function that does this.
Is that possible please?
So:
if the object is currently locked, you wish to wait for it to be released, but not take the lock
if the object is not currently locked, you wish to continue without holding the lock
I believe that is achieved by simply:
lock (theThing) {}
This acquires and releases, which has the same semantics as not taking the lock, but waiting for anyone who does hold it.
Note that there is inherently a race condition in what you ask: since you're not taking the lock, someone else could take it right after you've checked. This is fine as long as that is OK for your scenario.
There are also a myriad of methods on Monitor for other scenarios, usually mixed with try/finally - in particular, TryEnter with a zero timeout can be used for "take the lock immediately if you can, but don't wait if you can't".
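A small sketch of both idioms, assuming an object named theThing as in the snippet above; the TryEnter overload with a zero timeout returns immediately instead of blocking:

using System;
using System.Threading;

object theThing = new object();

// 1) Wait for any current holder to release, without keeping the lock yourself:
lock (theThing) { }   // acquire-and-release; returns as soon as the lock is free

// 2) Take the lock only if it is immediately available:
bool taken = false;
try
{
    Monitor.TryEnter(theThing, 0, ref taken);   // zero timeout: never waits
    if (taken)
    {
        Console.WriteLine("got the lock without waiting");
    }
    else
    {
        Console.WriteLine("someone else holds it; skipping");
    }
}
finally
{
    if (taken) Monitor.Exit(theThing);
}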
However, a ManualResetEvent may also be worth investigating.
spoiler note: the question is the last phrase.
In C#, the classical pattern to use a condition variable is like this:
lock (answersQueue)
{
    answersQueue.Enqueue(c);
    Monitor.Pulse(answersQueue); // condition variable "notify one".
}
and some other thread:
lock (answersQueue)
{
    while (answersQueue.Count == 0)
    {
        // unlocks the answer queue and sleeps here until notified.
        Monitor.Wait(answersQueue);
    }
    ...
}
that's an example taken from my code.
if I place the Pulse outside of the lock scope, it doesn't compile.
however, it is the correct way:
c.f:
http://msdn.microsoft.com/en-us/library/windows/desktop/ms686903(v=vs.85).aspx
and:
http://www.installsetupconfig.com/win32programming/threadprocesssynchronizationapis11_7.html
(search for "inside")
And indeed it is idiotic to signal the sleeping thread while you are still in your critical section, because the sleeping thread CAN'T wake up (not immediately), BECAUSE the signal is issued INSIDE a critical section as well!
Therefore, I hope that the .NET/C# Pulse call is actually just flagging the lock object, so that when the lock scope is exited it actually "pulses" the condition variable at that moment. Because otherwise, it would have a performance issue.
So how come the design of the Monitor object was chosen to be that way ?
Edit:
I found the answer in this paper:
http://research.microsoft.com/pubs/64242/implementingcvs.pdf
section "Optimising Signal and Broadcast" and the previous section about NT kernel and how to make Condition Variable on top of Semaphores, which is the reason for introducing the "darned queues".
NOW that makes me a better engineer.
And indeed it is idiotic to signal the sleeping thread when you still are in your critical section. Because the sleeping thread CAN'T wake up
Pulse doesn't expect to get a thread running; it only expects to move a thread between the two queues (waiting and ready). The "go and run" part belongs to releasing the lock, via Exit (or the end of a lock block). In reality, it isn't an issue, because Monitor.Pulse typically happens right before a Wait or an Exit.
Therefore, I hope that the .NET/C# Pulse call is actually just flagging the lock object, so that when the lock scope is exited it actually "pulses" the condition variable at that moment. Because otherwise, it would have a performance issue.
Again, these are different issues: moving a thread between waiting and ready is one thing; exiting a lock already has all the code needed to actually activate the next ready thread.
You have not understood the basic problem of synchronization. What is a "monitor", what does it mean for a thread to sleep, and what does it mean for it to be woken up?
A monitor is a mid-level synchronization structure. It is not a petty low-level volatile boolean flag with a bus-locking XCHG operation, and it is not a high-level thread-pool handler that requires dozens of other special mechanisms.
On a monitor, MANY threads may sleep. There are logical queues in there that, for example, preserve the order in which threads were put to sleep/woken up, or mechanisms that guarantee proper scheduling and fairness. I will not get into the details; all of it is out there on the web, even on the wiki.
Add to that that the operation is PULSE. A pulse is instantaneous. It does not "stick". A pulse will wake only the threads that are sleeping right now. If another thread checks the monitor after the pulse, it will go to sleep.
Now imagine: you have a queue of 5 sleeping threads. One thread (the 6th) now wants to pulse them, and yet another (the 7th) wants to check the monitor.
The 6th and the 7th are running in parallel, truly simultaneously, since you have a quad-core CPU.
So, tell me, what would happen to the queue's implementation if the 6th started pulsing, waking, and removing woken threads from the queue, while at the same time the 7th started adding itself there?
To solve that, the internal queues would have to be internally synchronized and locked, so that only one thread at a time modifies them.
Um wait. We just stumbled upon a case where we wanted to SYNCHRONIZE something, and to do it properly we need to SYNCHRONIZE on another thing? Not good.
Therefore, the actual LOCK is done EXTERNALLY before you talk to the monitor itself. This is to achieve SINGLE LOCKING, instead of introducing several layers of hierarchical locks.
That way it is simpler, faster, and more resource-friendly.
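To make the consequence of that design concrete, here is a small sketch (the object name is illustrative): calling Monitor.Pulse without owning the lock throws a SynchronizationLockException at runtime, whereas calling it inside the lock is legal.

using System;
using System.Threading;

object gate = new object();

try
{
    Monitor.Pulse(gate);            // caller does not own the monitor
}
catch (SynchronizationLockException ex)
{
    // Pulse (and Wait) require the lock to be held.
    Console.WriteLine("Pulse without the lock: " + ex.GetType().Name);
}

lock (gate)
{
    Monitor.Pulse(gate);            // legal: the lock is held here
}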
I was wondering on the Monitor Class.
As far as I know, the waiting threads are not served FIFO.
The first one that acquires the lock is not always the first one in the waiting queue.
Is this correct?
Is there some way to ensure the FIFO condition?
Regards
If you are referring to a built-in way, then no. Repeatedly calling TryEnter in a loop is by definition not fair and unfortunately neither is the simple Monitor.Enter. Technically a thread could wait forever without getting the lock.
If you want absolute fairness you will need to implement it yourself using a queue to keep track of arrival order.
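As a rough sketch of that approach (this FifoLock type is hypothetical, not a framework class): each waiter enqueues its own event, and Release hands the lock to the waiter at the head of the queue, so threads are served in arrival order.

using System.Collections.Generic;
using System.Threading;

public sealed class FifoLock
{
    private readonly object _gate = new object();
    private readonly Queue<ManualResetEventSlim> _waiters = new Queue<ManualResetEventSlim>();
    private bool _held;

    public void Acquire()
    {
        ManualResetEventSlim myTurn;
        lock (_gate)
        {
            if (!_held)
            {
                _held = true;               // lock was free: take it immediately
                return;
            }
            myTurn = new ManualResetEventSlim(false);
            _waiters.Enqueue(myTurn);       // remember arrival order
        }
        myTurn.Wait();                      // block until Release signals this exact waiter
        myTurn.Dispose();
    }

    public void Release()
    {
        lock (_gate)
        {
            if (_waiters.Count > 0)
            {
                _waiters.Dequeue().Set();   // hand the lock to the oldest waiter;
                                            // _held stays true: ownership passes directly
            }
            else
            {
                _held = false;
            }
        }
    }
}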
Is there some way to ensure the FIFO condition?
In a word: no!
I wrote a short article about this: Is the Ready Queue FIFO?
Look at this question; I think it will be very useful for you - Does lock() guarantee acquired in order requested?
especially this quote:
Because monitors use kernel objects internally, they exhibit the same roughly-FIFO behavior that the OS synchronization mechanisms also exhibit (described in the previous chapter). Monitors are unfair, so if another thread tries to acquire the lock before an awakened waiting thread tries to acquire the lock, the sneaky thread is permitted to acquire a lock.
I've been tasked with removing blocking calls from a C# app. Turns out this is a requirement of the environment it'll be running on. I understand the concept of a blocking call, however, I'm not sure where to begin finding existing blocking calls.
So a few questions:
For any given function, how can I tell whether or not it is blocking? Is there any way besides looking up the documentation?
Is there any way to search for blocking calls in a project or solution? E.g., some plug-in that could tell me?
There's no automatic way I know of to find blocking calls. Most blocking code is used for thread or process synchronization such as lock, Monitor.Enter, Mutex and Semaphore/SemaphoreSlim waits, CountdownEvent and Barrier class use. There's also SpinLock and ReaderWriterLock/ReaderWriterLockSlim locks which block.
There are several Thread calls that are blocking. Thread.Sleep can technically be considered a blocking call, though it lasts a finite amount of time. Thread.Join waits for other threads to finish and is thus blocking.
For and while loops can also effectively block, since they run until they are done; usually, though, they will use one of the calls above (especially lock) if they are waiting on a specific variable that is updated in another thread.
Keep in mind that removing any of these is likely to have a serious negative impact on thread safety.
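For orientation, here is a small sketch showing a few of the calls listed above; every commented line can block the calling thread:

using System;
using System.Threading;

object gate = new object();
var worker = new Thread(() => Thread.Sleep(500));   // the worker just sleeps for a while
worker.Start();

Thread.Sleep(100);          // blocks the current thread for a fixed time

lock (gate)                 // Monitor.Enter under the hood: blocks until the lock is free
{
    // critical section
}

using (var sem = new SemaphoreSlim(1))
{
    sem.Wait();             // blocks until a slot on the semaphore is available
    sem.Release();
}

worker.Join();              // blocks until the worker thread finishes
Console.WriteLine("done");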
I've got several threads. How can I pause/resume them?
From duplicate question:
How can I pause 5 threads and remember their status? Because one of them is eating, another is thinking, etc.
If you're using System.Threading.Thread, then you can call Suspend and Resume. This, however, is not recommended. There's no telling what a thread might be doing when you call Suspend. If you call Suspend while the thread holds a lock, for example, or has a file open for exclusive access, nothing else will be able to access the locked resource.
As the documentation for Thread.Suspend says:
Do not use the Suspend and Resume methods to synchronize the activities of threads. You have no way of knowing what code a thread is executing when you suspend it. If you suspend a thread while it holds locks during a security permission evaluation, other threads in the AppDomain might be blocked. If you suspend a thread while it is executing a class constructor, other threads in the AppDomain that attempt to use that class are blocked. Deadlocks can occur very easily.
Typically, you control threads' activity using synchronization primitives like events. A thread will wait on an event (look into AutoResetEvent and ManualResetEvent). Or, if a thread is servicing a queue, you'll use something like BlockingCollection so that the thread can wait for something to be put into the queue. All of these non-busy wait techniques are much better than arbitrarily suspending and restarting a thread, and don't suffer from the potential disastrous consequences.
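As a minimal sketch of the queue-servicing idea (the names are illustrative): the worker blocks inside GetConsumingEnumerable until an item arrives, instead of being suspended from the outside, and exits cleanly once CompleteAdding is called.

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

var queue = new BlockingCollection<string>();

var worker = Task.Run(() =>
{
    // Blocks (without busy-waiting) until an item arrives,
    // and the loop ends once CompleteAdding() has been called.
    foreach (var item in queue.GetConsumingEnumerable())
    {
        Console.WriteLine($"processing {item}");
    }
});

queue.Add("job 1");
queue.Add("job 2");
queue.CompleteAdding();   // signal that no more items are coming
worker.Wait();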
Have a look at Monitor.Wait and Monitor.Pulse in the first instance - Marc Gravell has a nice example of using them in a queue here.
It is quite likely that you will want to consider using a producer/consumer queue.
You have to use synchronisation techniques:
MSDN Thread Synchronization
In the main thread:
ManualResetEvent re = new ManualResetEvent(true);
In all the threads, at "strategic" points:
re.WaitOne();
In the main thread, to stop the threads:
re.Reset();
and to restart:
re.Set();
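Putting those snippets together, a minimal runnable sketch might look like this (the worker count and timings are illustrative); each worker checks the event at its own "strategic" points, so it is only ever paused at a safe place:

using System;
using System.Threading;

class PauseDemo
{
    static readonly ManualResetEvent gate = new ManualResetEvent(true); // true = initially running

    static void Worker(object id)
    {
        for (int i = 0; i < 5; i++)
        {
            gate.WaitOne();              // "strategic" point: blocks while the gate is reset
            Console.WriteLine($"worker {id} step {i}");
            Thread.Sleep(200);           // simulate work
        }
    }

    static void Main()
    {
        var threads = new Thread[3];
        for (int i = 0; i < threads.Length; i++)
        {
            threads[i] = new Thread(Worker);
            threads[i].Start(i);
        }

        Thread.Sleep(500);
        gate.Reset();                    // pause: workers stop at their next WaitOne()
        Console.WriteLine("paused");
        Thread.Sleep(1000);
        gate.Set();                      // resume all workers
        Console.WriteLine("resumed");

        foreach (var t in threads) t.Join();
    }
}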
You can use Suspend() and Resume(), although these methods are obsolete:
http://msdn.microsoft.com/en-us/library/system.threading.thread.resume.aspx
http://msdn.microsoft.com/en-us/library/system.threading.thread.suspend.aspx
You can also read:
What are alternative ways to suspend and resume a thread?