Need help understanding AutoResetEvent behaviour - C#

I am experimenting with the signaling constructs of .NET and tried to do a simple coordination between two threads. However, something is wrong: the results are all over the place and the program sometimes locks up. I suspect there is some sort of race condition going on, but I would like someone to explain what's happening.
private static readonly AutoResetEvent manual = new AutoResetEvent(false);
private static int num;

public static async Task Main()
{
    await Task.WhenAll(Task.Run(Method1), Task.Run(Method2));
}

private static void Method1()
{
    num = 100;
    manual.Set();
    manual.WaitOne();
    num -= 1000;
    manual.Set();
}

private static void Method2()
{
    manual.WaitOne();
    var into = num;
    num += into / 2;
    manual.Set();
    manual.WaitOne();
    Console.WriteLine($"final value of num {num}");
}

Yes, this is a race condition.
Method 1 begins.
Method 2 can't get past its initial WaitOne before Method 1 signals, because the AutoResetEvent starts in the unsignaled state.
Method 1 assigns num = 100.
Method 1 sets the AutoResetEvent. This notifies a waiting thread that an event has occurred.
The rest of the work splits into multiple scenarios.
Scenario 1:
Method 1 calls WaitOne, consumes the AutoResetEvent's signaled state and moves on to the next line; the AutoResetEvent resets.
Method 1 decreases num by 1000 (num = -900 now)
Method 1 signals AutoResetEvent
Method 2 can proceed because the AutoResetEvent is signaled. The AutoResetEvent resets once Method 2 is released.
Method 2 assigns into = num (into = -900)
Method 2 increases num by (into / 2) which makes the final result of num = -1350 (num = -900 - 450)
Method 2 signals AutoResetEvent
Method 2 calls WaitOne and consumes the signaled state it just set.
Method 2 prints final value
The result is -1350 and the program terminates because both tasks finished in this Scenario.
Scenario 2:
Instead of Method 1, it is Method 2's pending WaitOne that consumes the signal and continues; the AutoResetEvent resets.
Method 1 can't go to the next line because it is blocked by its own WaitOne.
Method 2 assigns num to into (into = 100 now)
Method 2 increases num by into/2 (num = 100 + 50 = 150)
Method 2 sets AutoResetEvent.
Here Scenario 2 splits into two sub-scenarios, 2-1 and 2-2.
Scenario 2-1:
Method 1 gets notified and decreases num by 1000 (num = 150 - 1000 = -850)
Method 1 sets AutoResetEvent.
Method 2 gets notified and prints the result.
The result is -850 and the program terminates, because both tasks finished in this Scenario.
Scenario 2-2:
Method 2 gets notified and prints the result.
Method 1 will be blocked until someone somewhere sets the AutoResetEvent.
The result is 150 and the program does NOT terminate, because the first task is not finished yet.
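One way to make the hand-off deterministic (a sketch of a possible fix, not the only one; the toMethod1/toMethod2 names are mine, and Main with its Task.WhenAll call stays the same) is to use two AutoResetEvents, one per direction, so each Set can only wake the thread it is meant for:

private static readonly AutoResetEvent toMethod1 = new AutoResetEvent(false);
private static readonly AutoResetEvent toMethod2 = new AutoResetEvent(false);
private static int num;

private static void Method1()
{
    num = 100;
    toMethod2.Set();      // step 1 done, wake Method2
    toMethod1.WaitOne();  // wait for Method2 to finish its step
    num -= 1000;
    toMethod2.Set();      // wake Method2 for the final print
}

private static void Method2()
{
    toMethod2.WaitOne();  // wait until num = 100
    var into = num;
    num += into / 2;      // num = 150
    toMethod1.Set();      // hand control back to Method1
    toMethod2.WaitOne();  // wait until num -= 1000 has run
    Console.WriteLine($"final value of num {num}"); // always -850
}

With this split neither thread can consume a signal intended for the other, so the output is always -850 and the program always terminates.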

Related

C# Multi-Threading - What's wrong, how to use AutoResetEvent

I'm still learning threading, and I have problems with the code below. Sorry if this question has appeared before; I just don't really understand why this code is not working.
I simplified the code:
static EventWaitHandle waitH; // AutoResetEvent, wait for signal
static bool whExit;           // signal to exit waiting
static Queue<string> str;     // waiting line (example values)
static Queue<int> num;

static void Main(string[] args)
{
    waitH = new AutoResetEvent(false); // initialize waiter
    str = new Queue<string>();
    num = new Queue<int>();
    Thread thr = new Thread(new ThreadStart(Waiter)); // waiting in another thread
    thr.Start(); // start the waiting thread
    for (short i = 0; i < 10; i++)
    {
        str.Enqueue(string.Format($"{(char)(i + 65)}")); // add something to queue
        num.Enqueue(i); // add a number to test incrementing
        waitH.Set(); // signal to start the "long processing"
    }
}

static void Waiter()
{
    while (!whExit)
    {
        waitH.WaitOne(); // wait for signal
        WriteToConsole(); // start the long processing on another thread
    }
}

static void WriteToConsole()
{
    // threadstart with parameters
    // action: void delegate
    // get 2 values from waiting line
    var f = new ParameterizedThreadStart(obj =>
        new Action<string, int>(ConsoleWriter)
            (str.Dequeue(), num.Dequeue())); // it's thread safe, because FIFO?
    Thread thr = new Thread(f);
    thr.IsBackground = true; // close thread when finished
    thr.Start();
}

// print to console
static void ConsoleWriter(string s, int n)
{
    Console.WriteLine(string.Format($"{s}: {++n}")); // easy example
}
It stops in Main's loop.
I think the problem is: Thread.Start() is called, but the new thread first has to change state and join the scheduler's "ready to run" queue, which takes time, while Main's loop keeps running and does not wait for the signaling.
I solved this with two-way signaling: a second "pause" AutoResetEvent that Main waits on (WaitOne) right after waitH.Set() in the loop, and which is signaled once Console.WriteLine() has finished.
I'm not really proud of this solution, because doing it that way the program loses its threaded, parallel character.
And this is just an example; what I would really like is to run long calculations at the same time, on different threads.
Looking at the output, it's a textbook example that I'm doing it wrong:
Output:
A: 1
B: 2
Sometimes
B: 2
A: 1
Expected output:
A: 1
B: 2
C: 3
D: 4
E: 5
F: 6
G: 7
H: 8
I: 9
J: 10
Is there any elegant way to solve this? Maybe using locks, etc.?
Any discussion would be appreciated.
Thanks!
There are several issues:
The main method never waits for the worker thread to complete, so it will probably run to completion and stop all the threads before they are done. This could be solved by signaling the worker thread to stop and then using thread.Join() to wait for it to complete.
WriteToConsole takes one item from each queue and prints it to the console. But the worker thread might only get to run after the loop in the main method has completed. In that case the AutoResetEvent will be signaled once and one item will be processed, but in the next iteration the AutoResetEvent will be unsignaled and will never become signaled again. This can be solved by draining all items in the queue each time the event has been signaled.
Using two-way signaling in the loop like you mention will in effect serialize the code, removing any benefit of using threads.
If this is a learning exercise I would suggest spending the time learning Tasks, async/await, lock and Parallel.For first. With a good grasp of those you will be much more effective than hand-rolling threads and reset events.
// it's thread safe, because FIFO?
No. Use the concurrent collections if you want thread-safe collections.
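To make that advice concrete, here is a minimal sketch (my restructuring along those suggestions, not the asker's original design) of the same producer/consumer idea using BlockingCollection<T>, which wraps a thread-safe queue and does the blocking and signaling internally:

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        // GetConsumingEnumerable blocks until an item arrives or the
        // collection is marked complete, so no hand-rolled events are needed.
        using var queue = new BlockingCollection<(string Text, int Number)>();

        // Consumer: drains items as they arrive.
        var consumer = Task.Run(() =>
        {
            foreach (var (text, number) in queue.GetConsumingEnumerable())
                Console.WriteLine($"{text}: {number + 1}");
        });

        // Producer: the original loop, minus the AutoResetEvent.
        for (short i = 0; i < 10; i++)
            queue.Add(($"{(char)(i + 65)}", i));

        queue.CompleteAdding(); // tell the consumer no more items are coming
        consumer.Wait();        // wait for it to finish before exiting
    }
}

With a single consumer the items come out in FIFO order, so this prints A: 1 through J: 10.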

AutoResetEvent accumulative Sets()

I've written this code as a proof of concept in order to check a problem I'm experiencing in my current project.
var sleeper = new AutoResetEvent(false);
sleeper.Set();
sleeper.Set();
sleeper.Set();

var ix = 0;
while (true)
{
    sleeper.WaitOne();
    Console.Write(ix++);
}
Surprisingly (at least for me) I'm not getting the result I expected.
I expected 012 to be printed to the console, but only 0 is printed.
What am I misunderstanding ?
Which might be the best way to solve this problem and get the expected result?
The actual output for your code is 0.
AutoResetEvent does not accumulate signals: it is either signaled or not. And you're waiting on it only after it has already been signaled. The sequence of events in your case:
create AutoResetEvent (state = 0)
signal AutoResetEvent (state = 1)
signal AutoResetEvent (state = 1)
signal AutoResetEvent (state = 1)
start waiting on the event (process the current signalled state of 1 and switch it to 0)
go on waiting on the event
As mentioned in the MSDN description of AutoResetEvent:
Also, if Set is called when there are no threads waiting and the
AutoResetEvent is already signaled, the call has no effect.
To observe the desired behavior you need a second thread (AutoResetEvent is meant for inter-thread communication).
Something like this should work for you:
private static AutoResetEvent sleeper;

static void Main()
{
    sleeper = new AutoResetEvent(false);
    Thread t = new Thread(ThreadProc);
    t.Start();

    var ix = 0;
    while (true)
    {
        sleeper.WaitOne();
        Console.Write(ix++);
    }
}

static void ThreadProc()
{
    Thread.Sleep(1000);
    sleeper.Set();
    Thread.Sleep(1000);
    sleeper.Set();
    Thread.Sleep(1000);
    sleeper.Set();
}
Please note that AutoResetEvent does not guarantee that it will deliver three wake-ups for the following code
sleeper.Set();
sleeper.Set();
sleeper.Set();
as mentioned in its description:
There is no guarantee that every call to the Set method will release a thread. If two calls are too close together, so that the second call occurs before a thread has been released, only one thread is released. It is as if the second call did not happen.
Hence the Thread.Sleep(1000) bits in the example above.
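If the requirement really is that three back-to-back signals produce three wake-ups, a counting primitive fits better than AutoResetEvent. A minimal sketch using SemaphoreSlim (my substitution; it is not behavior AutoResetEvent can provide):

using System;
using System.Threading;

class Program
{
    static void Main()
    {
        // A semaphore keeps a count, so releases are remembered
        // even when nobody is waiting yet.
        var sleeper = new SemaphoreSlim(0);
        sleeper.Release();
        sleeper.Release();
        sleeper.Release();

        for (var ix = 0; ix < 3; ix++)
        {
            sleeper.Wait();    // each Wait consumes one count
            Console.Write(ix); // prints 012
        }
    }
}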

C# Thread and lock

I'm testing this simple code:
static Thread _readThread = null;
static private Object thisLock = new Object();
static int a = 1;

private static void ReadComPort()
{
    lock (thisLock)
    {
        for (int i = 0; i < 3; i++)
        {
            Console.WriteLine(Thread.CurrentThread.Name + " " + a++.ToString());
            Thread.Sleep(1000);
        }
    }
}

static void Main(string[] args)
{
    for (int i = 0; i < 3; i++)
    {
        _readThread = new Thread(new ThreadStart(ReadComPort));
        _readThread.IsBackground = true;
        _readThread.Name = i.ToString();
        _readThread.Start();
        //Thread.Sleep(50);
    }
    Console.WriteLine("End");
    Console.ReadKey();
}
But why is the sequence of execution and the order in which the threads run so chaotic: 0, 2, 1?
Console output:
0 1
End
0 2
0 3
2 4
2 5
2 6
1 7
1 8
1 9
Because you can't expect threads to start or run in a specific order. The OS schedules threads the way it wants to. Sometimes it puts a thread on hold and executes another one before coming back to the original one.
Here you see that the threads start at almost the same time. Obviously (from the output) thread 0 wins the race to the lock. Then, by pure chance, thread 2 gets past the lock earlier than thread 1. This could have gone entirely differently, since the threads are created shortly after each other. As said: there is no guarantee.
Lock does not guarantee the order : Does lock() guarantee acquired in order requested?
Also, in your code you should wait for your threads to finish at the end of your for loop, so that "End" is not printed at the beginning; otherwise, if you press a key you will exit while your threads are still running, and you may get unexpected behaviour.
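For instance, the Main loop could collect the threads and Join them before printing "End" (a sketch that assumes the rest of the original code, including ReadComPort, stays unchanged, plus a using System.Collections.Generic; directive for the List<Thread>):

var threads = new List<Thread>();
for (int i = 0; i < 3; i++)
{
    var t = new Thread(ReadComPort)
    {
        IsBackground = true,
        Name = i.ToString()
    };
    threads.Add(t);
    t.Start();
}

// Block until all three threads have left ReadComPort.
foreach (var t in threads)
    t.Join();

Console.WriteLine("End");
Console.ReadKey();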
Read the C# reference carefully.
https://learn.microsoft.com/en-us/dotnet/csharp/language-reference/keywords/lock-statement
There you will not find anything about the order in which threads enter the lock block.

ManualResetEventSlim Wait() doesn't work after Set() is done once

What I want to do
I have some threads (e.g. Thread 1, Thread 2, Thread 3), and a queue of integers (e.g. 1, 2, 3).
Every second, I want to pause the currently running thread, pop an id from the queue, and run the thread that has that id.
For example:
Let's say I have an array of threads and an integer that holds the id of the currently executing thread.
int[] myThread;               // 1, 2, 3
ManualResetEventSlim[] mre;
executing = 1;
Queue myQueue;                // 2, 3

Time 0:01  // Thread 1 is running
Time 0:02  mre[executing].Wait();
           myQueue.Enqueue(executing);
           int nextThread = myQueue.Dequeue(); // say 2
           mre[nextThread].Set();
           executing = nextThread;
Time 0:03  // Same thing as at 0:02...
Time 0:04  // Same as above
Time 0:05  // Same as above
and I want output that looks like this:
Time 0:01 I'm 1
I'm 1
I'm 1
Time 0:02 I'm 2 // thread 2 was selected
I'm 2
I'm 2
Time 0:03 I'm 3 // thread 3 was selected
I'm 3
I'm 3
Time 0:04 I'm 2 // thread 2 was selected
I'm 2
I'm 2
What I'm doing
I have a main file that defines the action:
static void Main()
{
    CreateThread(Action, n);
}

public static void Action(int pid)
{
    for (int i = 0; i < 100000; i++)
    {
        Trace.WriteLine("I'm " + pid);
    }
}
Problem
The problem is that once a thread's event has been set with Set(), Wait() no longer blocks it. Because of that, each thread just keeps executing the whole Action method until it's done, and the output looks like this:
Time 0:01 I'm 1 //Only Thread 1 is unblocked
I'm 1
I'm 1
Time 0:02 I'm 2 //Thread 2 was unblocked.
I'm 1 //Thread 1 is not blocked, so it keeps printing
I'm 2
Time 0:03 I'm 3 //Thread 3 was unblocked.
I'm 1 //Thread 1 is not blocked
I'm 2 //Thread 2 is not blocked either
I've been working on this for a while and am stuck. I really appreciate any help.
I'm not sure if I understand the problem, but I'd change Action() so that it takes an instance of ThreadData. You'll need it to check whether the MRE is set. If so, break out of the iteration and let the framework take care of cleaning up after the thread.
public static void Action(ThreadData threadData)
{
    for (int i = 0; i < 100000; i++)
    {
        if (threadData.Mre.IsSet)
        {
            break;
        }
        Trace.WriteLine("I'm " + threadData.Pid);
    }
}
EDIT
You might not realize it, but you're probably using the same instance of MRE for each thread. Each thread references the mre variable. Who is to say that the thread starts before you set mre to a new instance? Spoiler: it doesn't.
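If the goal is pause/resume rather than cancellation, the missing piece is Reset(): Wait() only blocks while the event is unset, so a controller has to Reset() a thread's own event to pause it and Set() it to resume. A rough sketch with one ManualResetEventSlim per thread (the names and the round-robin loop are my own invention, not the asker's code):

using System;
using System.Threading;

class Program
{
    static void Main()
    {
        const int count = 3;
        // One gate per thread; a thread only makes progress while its gate is set.
        var gates = new ManualResetEventSlim[count];
        var threads = new Thread[count];

        for (int i = 0; i < count; i++)
        {
            int pid = i + 1;
            gates[i] = new ManualResetEventSlim(initialState: i == 0); // only thread 1 starts unblocked
            var gate = gates[i];
            threads[i] = new Thread(() => Work(pid, gate)) { IsBackground = true };
            threads[i].Start();
        }

        // Crude round-robin "scheduler": every second, pause the current thread and resume the next.
        int executing = 0;
        for (int tick = 0; tick < 6; tick++)
        {
            Thread.Sleep(1000);
            gates[executing].Reset();            // pause the running thread at its next Wait()
            executing = (executing + 1) % count;
            gates[executing].Set();              // resume the next one
        }
    }

    static void Work(int pid, ManualResetEventSlim gate)
    {
        for (int i = 0; i < 100000; i++)
        {
            gate.Wait();                         // blocks whenever the controller has Reset() this gate
            Console.WriteLine("I'm " + pid);
        }
    }
}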

How do I achieve mutual exclusion like in the lock statement, but the block would be skipped if it is locked?

Using the lock statement, one can "ensure that one thread does not enter a critical section of code while another thread is in the critical section. If another thread tries to enter a locked code, it will wait, block, until the object is released."
What if the behaviour I want is that if another thread tries to enter the locked code, it just skips the whole block (instead of waiting for the lock to be released)? An idea that comes to mind is using a flag, something like
if (flag) return;
flag = true;
// do stuff here
flag = false;
But I know this is not safe, because two threads could both pass the first line before either one sets the flag to true, and the flag might never be set back to false if an exception is thrown. Can you suggest an improvement or an alternative?
Use this overload of Monitor.TryEnter, which lets you specify a timeout.
Attempts, for the specified amount of time, to acquire an exclusive
lock on the specified object.
Return Value Type: System.Boolean true if the current thread acquires
the lock without blocking; otherwise, false.
In your case, you probably want to use a timeout of close to TimeSpan.Zero.
If you don't want the thread attempting to take the lock to wait for any length of time, you can just use this overload of Monitor.TryEnter, which does not accept a TimeSpan argument. It returns immediately without waiting, which is very close to the spirit of the flag technique you are trying to use.
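A minimal sketch of that skip-if-locked pattern (class and method names are placeholders):

using System;
using System.Threading;

class Worker
{
    private readonly object _gate = new object();

    public void DoStuffIfFree()
    {
        // Try to take the lock without waiting; returns false if another thread holds it.
        if (!Monitor.TryEnter(_gate))
            return; // skip the whole block instead of blocking

        try
        {
            Console.WriteLine("Doing the work..."); // placeholder for the critical section
        }
        finally
        {
            Monitor.Exit(_gate); // always release what was acquired
        }
    }
}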
You need a Semaphore with a limit of 1 and a timeout period of 0 milliseconds.
By using a Semaphore you can say that only a limited number of threads can access a piece of code at a time.
See this sample for how to use it.
You need to use this method for waiting:
bool WaitOne(int millisecondsTimeout)
Specify a timeout period of 0; that way a thread that cannot acquire the slot waits 0 milliseconds, which means it simply skips the code.
Example
using System;
using System.Threading;

class SemaphoreExample
{
    // One slot, so only one thread may run the protected region at a time
    public static Semaphore Pool = new Semaphore(1, 1);

    public static void Main(string[] args)
    {
        // Create and start 20 threads
        for (int i = 0; i < 20; i++)
        {
            Thread t = new Thread(new ThreadStart(DoWork));
            t.Start();
        }
        Console.ReadLine();
    }

    private static void DoWork()
    {
        // Wait 0 milliseconds: returns false immediately if the slot is taken
        if (!SemaphoreExample.Pool.WaitOne(0))
            return; // skip the protected code instead of blocking

        try
        {
            #region Area Protected By Semaphore
            Console.WriteLine("Acquired slot...");
            for (int i = 0; i < 10; i++)
            {
                Console.WriteLine(i + 1);
            }
            Console.WriteLine("Released slot...");
            #endregion
        }
        finally
        {
            // Release the semaphore slot we acquired
            SemaphoreExample.Pool.Release();
        }
    }
}
You can use Monitor.TryEnter:
http://msdn.microsoft.com/en-us/library/dd289679.aspx
