I'm still learning threading, and I have problems with the code below. Sorry if this question has appeared before; I just don't really understand why this code is not working.
I simplified the code:
static EventWaitHandle waitH; // AutoResetEvent, wait for signal
static bool whExit;           // signal to exit waiting
static Queue<string> str;     // waiting line (example values)
static Queue<int> num;        //

static void Main(string[] args)
{
    waitH = new AutoResetEvent(false); // initialize waiter
    str = new Queue<string>();
    num = new Queue<int>();

    Thread thr = new Thread(new ThreadStart(Waiter)); // waiting in another thread
    thr.Start(); // start the waiting thread

    for (short i = 0; i < 10; i++)
    {
        str.Enqueue(string.Format($"{(char)(i + 65)}")); // add something to queue
        num.Enqueue(i);                                  // add a number to test incrementing
        waitH.Set();                                     // signal to start the "long processing"
    }
}

static void Waiter()
{
    while (!whExit)
    {
        waitH.WaitOne();  // wait for signal
        WriteToConsole(); // start the long processing on another thread
    }
}

static void WriteToConsole()
{
    // threadstart with parameters
    // action: void delegate
    // get 2 values from waiting line
    var f = new ParameterizedThreadStart(obj =>
        new Action<string, int>(ConsoleWriter)
            (str.Dequeue(), num.Dequeue())); // it's thread safe, because FIFO?

    Thread thr = new Thread(f);
    thr.IsBackground = true; // close thread when finished
    thr.Start();
}

// print to console
static void ConsoleWriter(string s, int n)
{
    Console.WriteLine(string.Format($"{s}: {++n}")); // easy example
}
It stops in Main's loop.
I think the problem is: Thread.Start() is called first, but it needs to change the state of the Thread and join the "need to be processed" queue, which takes time. Main's loop is already running and does not wait for the signaling.
I solved this problem with two-way signaling: I used another pause-signal AutoResetEvent after waitH.Set() in the loop (WaitOne) and signal it after Console.WriteLine() has finished.
I'm not really proud of this solution, because if I do so, the program loses its "threadish", parallel, asynchronous approach.
And this is just an example; I would like to run long calculations at the same time, on different threads.
Looking at the output, it's a textbook example of me doing it wrong:
Output:
A: 1
B: 2
Sometimes the output is:
B: 2
A: 1
Expected output:
A: 1
B: 2
C: 3
D: 4
E: 5
F: 6
G: 7
H: 8
I: 9
J: 10
Is there an elegant way to solve this? Maybe using locks, etc.?
Any discussion would be appreciated.
Thanks!
There are several issues:
The main method never waits for the worker thread to complete, so it will probably run to completion and stop all the threads before they are done. This could be solved by signaling the worker thread to stop and then using thread.Join() to wait for it to complete.
The WriteToConsole method takes one item from each queue and prints it to the console. But the thread might start after the loop in the main method has completed. So when the thread starts, the AutoResetEvent will be signaled and one item will be processed. But in the next iteration the AutoResetEvent will be unsignaled, and will never become signaled again. This can be solved by iterating over all items in the queue after the event has been signaled.
Using two-way signaling in the loop like you mention will in effect serialize the code, removing any benefit of using threads.
If this is a learning exercise, I would suggest spending the time learning Tasks, async/await, lock and Parallel.For first. If you have a good grasp of these things you will be much more effective than using threads and reset events by hand.
// it's thread safe, because FIFO?
No. Use the concurrent collections if you want threadsafe collections.
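To make these points concrete, here is a minimal sketch (my rewrite, not the asker's exact code) that replaces the two Queues and the AutoResetEvent with a single BlockingCollection, and has Main wait for the worker with Join():

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;

static class Sketch
{
    // One thread-safe, blocking queue instead of two plain Queues plus an AutoResetEvent.
    static readonly BlockingCollection<KeyValuePair<string, int>> work =
        new BlockingCollection<KeyValuePair<string, int>>();

    static void Main()
    {
        Thread worker = new Thread(Waiter);
        worker.Start();

        for (short i = 0; i < 10; i++)
            work.Add(new KeyValuePair<string, int>(((char)(i + 65)).ToString(), i));

        work.CompleteAdding(); // tell the consumer that no more items are coming
        worker.Join();         // Main waits for the worker instead of exiting immediately
    }

    static void Waiter()
    {
        // GetConsumingEnumerable blocks until an item is available and ends
        // once CompleteAdding has been called and the queue is empty.
        foreach (KeyValuePair<string, int> item in work.GetConsumingEnumerable())
            Console.WriteLine("{0}: {1}", item.Key, item.Value + 1);
    }
}

With a single consumer draining the queue in FIFO order, this produces the expected A: 1 through J: 10 output without any manual signaling.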
I've written this code as a proof of concept in order to check a problem I'm experiencing in my current project.
var sleeper = new AutoResetEvent(false);
sleeper.Set();
sleeper.Set();
sleeper.Set();

var ix = 0;
while (true)
{
    sleeper.WaitOne();
    Console.Write(ix++);
}
Surprisingly (at least for me) I'm not getting the result I expected.
I expected 012 to be printed in the console, but only 0 is printed.
What am I misunderstanding?
What might be the best way to solve this problem and get the expected result?
The actual output of your code is 0.
AutoResetEvent does not have a memory; it is either signaled or not, and you're waiting on it after it has been signaled. The sequence of events in your case:
create AutoResetEvent (state = 0)
signal AutoResetEvent (state = 1)
signal AutoResetEvent (state = 1)
signal AutoResetEvent (state = 1)
start waiting on the event (process the current signalled state of 1 and switch it to 0)
go on waiting on the event
As mentioned in the MSDN AutoResetEvent documentation:
Also, if Set is called when there are no threads waiting and the
AutoResetEvent is already signaled, the call has no effect.
To observe the desired behavior you need a second Thread (as AutoResetEvent is created for inter-thread communication).
Something like this should work for you
private static AutoResetEvent sleeper;

static void Main()
{
    sleeper = new AutoResetEvent(false);
    Thread t = new Thread(ThreadProc);
    t.Start();

    var ix = 0;
    while (true)
    {
        sleeper.WaitOne();
        Console.Write(ix++);
    }
}

static void ThreadProc()
{
    Thread.Sleep(1000);
    sleeper.Set();
    Thread.Sleep(1000);
    sleeper.Set();
    Thread.Sleep(1000);
    sleeper.Set();
}
Please note that AutoResetEvent does not guarantee that it will process 3 events for the following code:
sleeper.Set();
sleeper.Set();
sleeper.Set();
as mentioned in its documentation (hence the Thread.Sleep(1000) calls in the example above):
There is no guarantee that every call to the Set method will release a
thread. If two calls are too close together, so that the second call
occurs before a thread has been released, only one thread is released.
It is as if the second call did not happen.
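If you want every Set to be remembered, a counting primitive is a better fit than AutoResetEvent. A minimal sketch using SemaphoreSlim (my suggestion, not something the original code used):

using System;
using System.Threading;

class CountingExample
{
    static void Main()
    {
        // A semaphore counts its releases, so three Release() calls
        // will let three Wait() calls through, even on the same thread.
        var sleeper = new SemaphoreSlim(0);
        sleeper.Release();
        sleeper.Release();
        sleeper.Release();

        var ix = 0;
        for (int i = 0; i < 3; i++)
        {
            sleeper.Wait();
            Console.Write(ix++); // prints 012
        }
    }
}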
I have three threads, and some parts of the code can run in parallel while some parts are locked (only one thread at a time). However, one lock needs to let the threads in only in order. Since this is in a loop, it gets more complex. How do I get this behavior?
If I had a print statement, I would like to receive the following output:
1,2,3,1,2,3,1,2,3... Currently I receive 2,3,1,3,1,3,2,1,2, i.e. random order.
The code which is executed in three threads in parallel:
while (true)
{
    lock (fetchLock)
    {
        if (done)
        {
            break;
        }
        // Do stuff one at a time
    }

    // Do stuff in parallel

    lock (displayLock)
    {
        // Do stuff one at a time, but it needs to be in order.
    }
}
You could use a combination of Barrier and AutoResetEvent to achieve this.
Firstly, you use Barrier.SignalAndWait() to ensure that all the threads reach a common point before proceeding. This common point is the point at which you want the threads to execute some code in order.
Then you use numberOfThreads-1 AutoResetEvents to synchronise the threads. The first thread doesn't need to wait for any other thread, but after it has finished it should signal the event that the next thread is waiting on.
The middle thread (or threads if more than 3 threads total) needs to wait for the previous thread to signal the event that tells it to proceed. After it has finished, the middle thread should signal the event that the next thread is waiting on.
The last thread needs to wait for the previous thread to signal the event that tells it to proceed. Since it is the last thread, it does not need to signal an event to tell the next thread to proceed.
Finally, you resync the threads with another call to Barrier.SignalAndWait().
This is easiest to show via a sample console app. If you run it, you'll see that the work that should be done by the threads in order (prefixed with the letter "B" in the output) is indeed always in order, while the other work (prefixed with the letter "A") is executed in a random order.
using System;
using System.Threading;
using System.Threading.Tasks;

namespace Demo
{
    public static class Program
    {
        public static void Main()
        {
            using (Barrier barrier = new Barrier(3))
            using (AutoResetEvent t2 = new AutoResetEvent(false))
            using (AutoResetEvent t3 = new AutoResetEvent(false))
            {
                Parallel.Invoke
                (
                    () => worker(1, barrier, null, t2),
                    () => worker(2, barrier, t2, t3),
                    () => worker(3, barrier, t3, null)
                );
            }
        }

        private static void worker(int threadId, Barrier barrier, AutoResetEvent thisThreadEvent, AutoResetEvent nextThreadEvent)
        {
            Random rng = new Random(threadId);

            for (int i = 0; i < 1000; ++i)
            {
                doSomething(threadId, rng);    // We don't care what order threads execute this code.
                barrier.SignalAndWait();       // Wait for all threads to reach this point.

                if (thisThreadEvent != null)   // If this thread is supposed to wait for a signal
                    thisThreadEvent.WaitOne(); // before proceeding, then wait for it.

                doWorkThatMustBeDoneInThreadOrder(threadId);

                if (nextThreadEvent != null)   // If this thread is supposed to raise a signal to indicate
                    nextThreadEvent.Set();     // that the next thread should proceed, then raise it.

                barrier.SignalAndWait();       // Wait for all threads to reach this point.
            }
        }

        private static void doWorkThatMustBeDoneInThreadOrder(int threadId)
        {
            Console.WriteLine(" B" + threadId);
            Thread.Sleep(200); // Simulate work.
        }

        private static void doSomething(int threadId, Random rng)
        {
            for (int i = 0; i < 5; ++i)
            {
                Thread.Sleep(rng.Next(50)); // Simulate indeterminate amount of work.
                Console.WriteLine("A" + threadId);
            }
        }
    }
}
I have a simple program below that has two threads performing some work.
Thread1 is the data feeder. Thread2 is the data processor.
So far my approach is working, but I want a better way of getting notified when the work completes.
Here is the code:
class Program
{
    private static BlockingCollection<int> _samples = new BlockingCollection<int>();
    private static CancellationTokenSource _cancellationTokenSource = new CancellationTokenSource();
    private static bool _cancel;

    static void Main(string[] args)
    {
        ThreadStart thread1 = delegate
        {
            ProcessThread1();
        };
        new Thread(thread1).Start();

        ThreadStart thread2 = delegate
        {
            ProcessThread2();
        };
        new Thread(thread2).Start();

        Console.WriteLine("Press any key to cancel..");
        Console.Read();
        _cancel = true;
        _cancellationTokenSource.Cancel();
        Console.Read();
    }

    private static void ProcessThread1()
    {
        for (int i = 0; i < 10; i++)
        {
            if (_cancel)
            {
                break;
            }
            Console.WriteLine("Adding data..");
            _samples.TryAdd(i, 100);
            Thread.Sleep(1000);
        }

        // I dont like this. Instead can I get notified in the UI thread that this thread is complete.
        _cancel = true;
        _cancellationTokenSource.Cancel();
    }

    private static void ProcessThread2()
    {
        while (!_cancellationTokenSource.IsCancellationRequested)
        {
            int data;
            if (_samples.TryTake(out data, 100))
            {
                // Do some work.
                Console.WriteLine("Processing data..");
            }
        }
        Console.WriteLine("Cancelled.");
    }
}
I want the program to exit when cancellation is requested by the user or when the work completes.
I am not sure how I can get notified when ProcessThread1 runs out of work. Currently I am setting _cancel = true when the work is complete, but it doesn't seem right. Any help appreciated.
If you use Task instead of manually creating threads, you can attach a continuation to your task to notify your UI that the work is complete:
Task workOne = Task.Factory.StartNew(() => ProcessThread1());

workOne.ContinueWith(t =>
{
    // Update UI here
}, TaskScheduler.FromCurrentSynchronizationContext());
With .NET 4.5, this becomes even easier, as you can potentially use the new async language support:
var workOne = Task.Run(ProcessThread1);
var workTwo = Task.Run(ProcessThread2);
// asynchronously wait for both tasks to complete...
await Task.WhenAll(workOne, workTwo);
// Update UI here.
Note that these both are designed with a user interface in mind - and will behave unusually in a console application, as there is no current synchronization context in a console application. When you move this to a true user interface, it will behave correctly.
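If you do want to try it from a console application, one option (a sketch, not a drop-in replacement for the code above) is simply to block on the tasks instead of relying on a synchronization context:

// Start both pieces of work as tasks and block the console's main
// thread until both have finished.
var workOne = Task.Run(() => ProcessThread1());
var workTwo = Task.Run(() => ProcessThread2());

Task.WaitAll(workOne, workTwo); // no synchronization context needed
Console.WriteLine("Both workers finished.");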
Start one more thread whose only job is to wait on console input:
private void ConsoleInputProc()
{
    Console.Write("Press Enter to cancel:");
    Console.ReadLine();
    _cancellationTokenSource.Cancel();
}
Your main thread then starts the two processing threads and the input thread.
// create and start the processing threads
Thread t1 = new Thread(thread1);
Thread t2 = new Thread(thread2);
t1.Start();
t2.Start();
// create and start the input thread
Thread inputThread = new Thread(ConsoleInputProc);
inputThread.Start();
Then, you wait on the two processing threads:
t1.Join();
// first thread finished. Request cancellation.
_cancellationTokenSource.Cancel();
t2.Join();
So if the user presses Enter, then the input thread sets the cancellation flags. thread1 and thread2 both see the cancellation request and exit.
If thread1 completes its work, then the main thread sets the cancellation flag and thread2 will cancel.
In either case, the program won't exit until thread 2 exits.
There's no need to kill the input thread explicitly. It will die when the program exits.
By the way, I would remove these lines from the thread 1 proc:
// I dont like this. Instead can I get notified in the UI thread that this thread is complete.
_cancel = true;
_cancellationTokenSource.Cancel();
I would remove the _cancel variable altogether, and have the first thread check IsCancellationRequested just like the second thread does.
It's unfortunate that you have to start a dedicated thread to wait on console input, but it's the only way I know of to accomplish this. The Windows console doesn't appear to have a waitable event.
Note that you could do this same thing with Task, which overall is easier to use. The code that the tasks perform would be the same.
Update
Looking at the bigger picture, I see that you have a typical producer/consumer setup with BlockingCollection. You can make your producer and consumer threads a lot cleaner:
private static void ProcessThread1()
{
    for (int i = 0; i < 10; i++)
    {
        Console.WriteLine("Adding data..");
        _samples.TryAdd(i, Timeout.Infinite, _cancellationTokenSource.Token);
        // not sure why the sleep is here
        Thread.Sleep(1000);
    }

    // Marks the queue as complete for adding.
    // When the queue goes empty, the consumer will know that
    // no more data is forthcoming.
    _samples.CompleteAdding();
}

private static void ProcessThread2()
{
    int data;
    while (_samples.TryTake(out data, Timeout.Infinite, _cancellationTokenSource.Token))
    {
        // Do some work.
        Console.WriteLine("Processing data..");
    }
    Console.WriteLine("Cancelled.");
}
You'll still need that input thread (unless you want to spin a loop on Console.KeyAvailable), but this greatly simplifies your producer and consumer.
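For completeness, the Console.KeyAvailable alternative would look roughly like this (a sketch; workerDone is a hypothetical flag your producer would set when it finishes):

// The main thread polls for a key press instead of using a dedicated input thread.
while (!workerDone)
{
    if (Console.KeyAvailable)              // a key was pressed
    {
        Console.ReadKey(true);             // consume it without echoing
        _cancellationTokenSource.Cancel(); // request cancellation
        break;
    }
    Thread.Sleep(100); // avoid burning a CPU core while polling
}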
I am trying to get two threads running in the background to perform tasks. I have to create the threads sequentially and proceed with the program execution, but the second thread must execute its work only when the first finishes. Also, one more clarification: I am looking to have this solution in a WPF application. There is no UI feedback needed; all I need is a status update from the first task. I agree that if we did it all in one thread it would be fine, but we want the second thread, which does more things separately, to keep going even if the user leaves the screen which created it.
Here is the sample:
class Program
{
    static string outValue;
    static bool _isFinished = false;

    static void Main(string[] args)
    {
        ThreadStart thread1 = delegate()
        {
            outValue = AnotherClass.FirstLongRunningTask();
            // I need to set the _isFinished after the long running finishes..
            // I cant wait here because I need to kick start the next thread and move on.
        };
        new Thread(thread1).Start();

        ThreadStart thread2 = delegate()
        {
            while (!_isFinished)
            {
                Thread.Sleep(1000);
                Console.WriteLine("Inside the while loop...");
            }
            if (!string.IsNullOrEmpty(outValue))
            {
                // This should execute only if the _isFinished is true...
                AnotherClass.SecondTask(outValue);
            }
        };
        new Thread(thread2).Start();

        for (int i = 0; i < 5000; i++)
        {
            Thread.Sleep(500);
            Console.WriteLine("I have to work on this while thread 1 and thread 2 and doing something ...");
        }
        Console.ReadLine();
    }
}

public class AnotherClass
{
    public static string FirstLongRunningTask()
    {
        Thread.Sleep(6000);
        return "From the first long running task...";
    }

    public static void SecondTask(string fromThread1)
    {
        Thread.Sleep(1000);
        Console.WriteLine(fromThread1);
    }
}
Where do I set the _isFinished?
I can't use BackgroundWorker threads. Any help is appreciated.
If a thread can only start when another one finishes, you have a very simple solution: execute the entire code on the first thread.
You can use Task.ContinueWith to queue up more work for the same Task.
You should simply call thread1.Join(), which will block until thread1 terminates.
However, there are a large number of better ways to do this.
You should use the TPL and the Task class instead.
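A rough sketch of that approach, reusing the asker's AnotherClass methods (this replaces both the _isFinished flag and the polling loop):

// Start the long-running work as a task, then chain the second piece of
// work so it only runs after the first has finished, receiving its result.
Task<string> first = Task.Factory.StartNew(() => AnotherClass.FirstLongRunningTask());
Task second = first.ContinueWith(t => AnotherClass.SecondTask(t.Result));

// Main can keep doing other work here while the chain runs in the background.
second.Wait(); // optional: block before exiting so the second task can complete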
I have a scenario where I will have to kick off a ton of threads (possibly up to 100), then wait for them to finish, then perform a task (on yet another thread).
What is an accepted pattern for doing this type of work? Is it simply .Join? Or is there a higher level of abstraction nowadays?
Using .NET 2.0 with VS2008.
In .NET 3.5sp1 or .NET 4, the TPL would make this much easier. However, I'll tailor this to .NET 2 features only.
There are a couple of options. Using Thread.Join is perfectly acceptable, especially if the threads are all ones you are creating manually. This is very easy, reliable, and simple to implement. It would probably be my choice.
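For example, the Join approach could look roughly like this (a sketch; DoWork and RunFollowUpTask stand in for your actual methods):

// Create and start all worker threads, keeping references so we can Join them.
List<Thread> threads = new List<Thread>();
for (int i = 0; i < 100; i++)
{
    Thread t = new Thread(DoWork);
    threads.Add(t);
    t.Start();
}

// Wait for every worker to finish, then run the follow-up task.
foreach (Thread t in threads)
    t.Join();

RunFollowUpTask();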
However, the other option would be to create a counter for the total amount of work, and to use a reset event when the counter reaches zero. For example:
class MyClass
{
    int workToComplete;   // Total number of elements
    ManualResetEvent mre; // For waiting

    void StartThreads()
    {
        this.workToComplete = 100;
        mre = new ManualResetEvent(false);

        int total = workToComplete;
        for (int i = 0; i < total; ++i)
        {
            Thread thread = new Thread(new ThreadStart(this.ThreadFunction));
            thread.Start(); // Kick off the thread
        }

        mre.WaitOne(); // Will block until all work is done
    }

    void ThreadFunction()
    {
        // Do your work

        if (Interlocked.Decrement(ref this.workToComplete) == 0)
            this.mre.Set(); // Allow the main thread to continue here...
    }
}
Did you look at ThreadPool? In this ThreadPool tutorial the author solves the same task you are asking about.
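The basic idea (my own sketch, not taken from the tutorial) is to queue the items on the pool and have the last one to finish signal an event; DoWork and RunFollowUpTask are placeholders for your methods:

int remaining = 100; // number of queued work items still outstanding

using (ManualResetEvent allDone = new ManualResetEvent(false))
{
    for (int i = 0; i < 100; i++)
    {
        ThreadPool.QueueUserWorkItem(state =>
        {
            DoWork(); // per-item work
            if (Interlocked.Decrement(ref remaining) == 0)
                allDone.Set(); // the last item lets the waiter continue
        });
    }

    allDone.WaitOne(); // block until every queued item has run
}

RunFollowUpTask(); // the "then perform a task" step from the question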
What's worked well for me is to store each thread's ManagedThreadId in a dictionary as I launch it, and then have each thread pass its id back through a callback method when it completes. The callback method deletes the id from the dictionary and checks the dictionary's Count property; when it's zero you're done. Be sure to lock around the dictionary both for adding to and deleting from it.
I am not sure that any kind of standard thread locking or synchronization mechanisms will really work with so many threads. However, this might be a scenario where some basic messaging might be an ideal solution to the problem.
Rather than using Thread.Join, which will block (and could be very difficult to manage with so many threads), you might try setting up one more thread that aggregates completion messages from your worker threads. When the aggregator has received all expected messages, it completes. You could then use a single WaitHandle between the aggregator and your main application thread to signal that all of your worker threads are done.
public class WorkerAggregator
{
    public WorkerAggregator(EventWaitHandle completionEvent)
    {
        m_completionEvent = completionEvent;
        m_workers = new Dictionary<int, Thread>();
    }

    private readonly EventWaitHandle m_completionEvent;
    private readonly Dictionary<int, Thread> m_workers;

    public void StartWorker(Action worker)
    {
        Thread thread = null;
        thread = new Thread(d =>
        {
            worker();
            notifyComplete(thread.ManagedThreadId);
        });

        lock (m_workers)
        {
            m_workers.Add(thread.ManagedThreadId, thread);
        }

        thread.Start();
    }

    private void notifyComplete(int threadID)
    {
        bool done = false;

        lock (m_workers)
        {
            m_workers.Remove(threadID);
            done = m_workers.Count == 0;
        }

        if (done) m_completionEvent.Set();
    }
}
Note: I have not tested the code above, so it might not be 100% correct. However, I hope it illustrates the concept well enough to be useful.