I want to launch an arbitrary number of threads, each executing the same method, but with different parameters. Each thread needs to block at a certain point, and wait until all threads have reached the same point. (Like racers getting into their starting blocks)
I'm stumped on how to make all threads signal to the starter that they are each ready to go.
The solution is to use the Barrier class.
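For example, a minimal sketch (assuming .NET 4 or later, where System.Threading.Barrier is available):
using System;
using System.Threading;
class BarrierDemo
{
    static void Main()
    {
        const int racerCount = 4;
        // The post-phase action runs once every participant has signalled.
        var barrier = new Barrier(racerCount,
            b => Console.WriteLine("All racers ready - go! (phase {0})", b.CurrentPhaseNumber));
        for (int i = 0; i < racerCount; i++)
        {
            int id = i; // capture the loop variable for the closure
            new Thread(() =>
            {
                Console.WriteLine("Racer {0} getting into the blocks...", id);
                barrier.SignalAndWait();   // blocks until all racers have signalled
                Console.WriteLine("Racer {0} running!", id);
            }).Start();
        }
    }
}
SignalAndWait blocks each thread until all participants have reached it, which is exactly the starting-blocks behaviour you describe.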
I think that using locking you can synchronize the threads' access to shared resources.
try this:
// lockThis is a shared lock object, e.g. a private readonly field:
// private readonly object lockThis = new object();
lock (lockThis)
{
// Access thread-sensitive resources.
}
I was struggling with multithreading too not so long ago. What you are trying to achieve can be done in a very simple way using just what you already know. Here is an idea:
class MyThread
{
    private Thread thread;
    // volatile so that writes from one thread are visible to the other
    private volatile bool isWaitingAtPointA = false;
    private volatile bool continueWorking = false;

    public MyThread()
    {
        thread = new Thread(DoMyStuff);
    }

    public void Start()
    {
        thread.Start();
    }

    private void DoMyStuff()
    {
        // do stuff
        // when at point A:
        isWaitingAtPointA = true;
        while (!continueWorking)
        {
            Thread.Sleep(10);
        }
        isWaitingAtPointA = false;
        continueWorking = false;
        // do more stuff
    }

    public bool isAtWaitingPointA()
    {
        return isWaitingAtPointA;
    }

    // Called from the main thread to release the thread from point A.
    public void ContinueWorking()
    {
        continueWorking = true;
    }
}
Then have a List of MyThread in your main thread that instantiates all the MyThread objects, starts their threads, and later unlocks them from the main thread by calling ContinueWorking() (which sets continueWorking to true); a sketch of this follows below.
Obviously you can check whether all the threads are at point A by calling isAtWaitingPointA(). This approach is called "control variables", I believe (please correct me if I am wrong), and here the control variables are the bools isWaitingAtPointA and continueWorking.
The method you want them all to run is represented here by DoMyStuff(), which can be defined somewhere else to avoid code redundancy.
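A rough sketch of the coordinating main thread, using the Start(), isAtWaitingPointA() and ContinueWorking() members shown above:
var racers = new List<MyThread>();
for (int i = 0; i < 5; i++)
    racers.Add(new MyThread());

racers.ForEach(r => r.Start());

// Wait until every thread reports that it has reached point A.
while (!racers.TrueForAll(r => r.isAtWaitingPointA()))
    Thread.Sleep(10);

// Release them all at once.
racers.ForEach(r => r.ContinueWorking());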
I hope this inspires you =)
I am working on a legacy application that is built on top of NET 3.5. This is a constraint that I can't change.
I need to execute a second thread to run a long running task without locking the UI. When the thread is complete, somehow I need to execute a Callback.
Right now I tried this pseudo-code:
Thread _thread = new Thread(myLongRunningTask) { IsBackground = true };
_thread.Start();
// wait until it's done
_thread.Join();
// execute finalizer
The second option, which does not lock the UI, is the following:
Thread _thread = new Thread(myLongRunningTask) { IsBackground = true };
_thread.Start();
// wait until it's done
while(_thread.IsAlive)
{
Application.DoEvents();
Thread.Sleep(100);
}
// execute finalizer
Of course the second solution is not good because it overloads the UI.
What is the correct way to execute a callback when a _thread is complete? Also, how do I know if the thread was cancelled or aborted?
Note: I can't use BackgroundWorker and I can't use the async library; I need to work with the native Thread class.
There are two slightly different kinds of requirement here:
Execute a callback once the long-running task has completed
Execute a callback once the thread in which the long-running task was running has completed.
If you're happy with the first of these, the simplest approach is to create a compound task of "the original long-running task, and the callback", basically. You can even do this just using the way that multicast delegates work:
ThreadStart starter = myLongRunningTask;
starter += () => {
// Do what you want in the callback
};
Thread thread = new Thread(starter) { IsBackground = true };
thread.Start();
That's very vanilla, and the callback won't be fired if the thread is aborted or throws an exception. You could wrap it up in a class with either multiple callbacks, or a callback which specifies the status (aborted, threw an exception etc) and handles that by wrapping the original delegate, calling it in a method with a try/catch block and executing the callback appropriately.
Unless you take any special action, the callback will be executed in the background thread, so you'll need to use Control.BeginInvoke (or whatever) to marshal back to the UI thread.
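For example, a rough sketch of that wrapping idea (RunWithCallback and its parameters are illustrative names, not an existing API):
void RunWithCallback(ThreadStart work, Control ui, Action<Exception> callback)
{
    var thread = new Thread(() =>
    {
        Exception error = null;
        try
        {
            work();
        }
        catch (Exception ex)
        {
            error = ex;   // capture the failure rather than letting it go unnoticed
        }
        // Marshal the callback back onto the UI thread; error is null on success.
        ui.BeginInvoke(callback, error);
    }) { IsBackground = true };
    thread.Start();
}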
I absolutely understand your requirements, but you've missed one crucial thing: do you really need to wait for the end of that thread synchronously? Or maybe you just need to execute the "finalizer" after the thread's end is detected?
In the latter case, simply wrap the call to myLongRunningTask into another method:
void surrogateThreadRoutine()
{
    try
    {
        myLongRunningTask();
    }
    finally
    {
        // ... all the 'finalization', e.g. raising an event that you'll handle elsewhere
    }
}
and use it as the thread's routine. That way, you'll know that the finalization will occur at the thread's end, just after the actual job finishes.
However, if you're working with a UI or other schedulers, the "finalization" will now run on your thread, not on the "normal" threads of your UI or comms framework. You will need to ensure that any resources external to your thread's task are properly guarded or synchronized, or else you'll probably clash with other application threads.
For instance, in WinForms, before you touch any UI elements from the finalizer, you will need Control.InvokeRequired (which will be true here) and Control.BeginInvoke/Invoke to bounce the call back to the UI thread.
Likewise, in WPF, before you touch any UI elements from the finalizer, you will need Dispatcher.BeginInvoke.
Or, if the clash could only occur with threads you control, a simple, proper lock() could be enough, etc.
You can use a combination of custom event and the use of BeginInvoke:
public event EventHandler MyLongRunningTaskEvent;
private void StartMyLongRunningTask() {
MyLongRunningTaskEvent += myLongRunningTaskIsDone;
Thread _thread = new Thread(myLongRunningTask) { IsBackground = true };
_thread.Start();
label.Text = "Running...";
}
private void myLongRunningTaskIsDone(object sender, EventArgs arg)
{
label.Text = "Done!";
}
private void myLongRunningTask()
{
try
{
// Do my long task...
}
finally
{
this.BeginInvoke(MyLongRunningTaskEvent, this, EventArgs.Empty); // raise the event on the UI thread
}
}
I checked; it works under .NET 3.5.
You could use the Observer Pattern, take a look here:
http://www.dofactory.com/Patterns/PatternObserver.aspx
The observer pattern allows you to notify other objects that were previously registered as observers.
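A minimal observer-style sketch (the class and member names are illustrative): the worker raises a .NET event when it finishes, and any interested object subscribes to it.
class LongRunningWorker
{
    public event EventHandler WorkCompleted;   // observers subscribe (attach) here

    public void Run()
    {
        new Thread(() =>
        {
            // ... the long-running work ...
            EventHandler handler = WorkCompleted;
            if (handler != null)
                handler(this, EventArgs.Empty);   // notify every registered observer
        }) { IsBackground = true }.Start();
    }
}

// Usage:
// var worker = new LongRunningWorker();
// worker.WorkCompleted += (s, e) => Console.WriteLine("Done");
// worker.Run();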
A very simple thread of execution with completion callback
This does not need to run in a MonoBehaviour; one is simply used here for convenience.
using System;
using System.Collections.Generic;
using System.Threading;
using UnityEngine;
public class ThreadTest : MonoBehaviour
{
private List<int> numbers = null;
private void Start()
{
Debug.Log("1. Call thread task");
StartMyLongRunningTask();
Debug.Log("2. Do something else");
}
private void StartMyLongRunningTask()
{
numbers = new List<int>();
ThreadStart starter = myLongRunningTask;
starter += () =>
{
myLongRunningTaskDone();
};
Thread _thread = new Thread(starter) { IsBackground = true };
_thread.Start();
}
private void myLongRunningTaskDone()
{
Debug.Log("3. Task callback result");
foreach (int num in numbers)
Debug.Log(num);
}
private void myLongRunningTask()
{
for (int i = 0; i < 10; i++)
{
numbers.Add(i);
Thread.Sleep(1000);
}
}
}
Try using a ManualResetEvent to signal that the thread is complete.
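A minimal sketch of that idea (names are illustrative): the worker sets the event when it finishes, and another thread waits on it.
var done = new ManualResetEvent(false);

var worker = new Thread(() =>
{
    // ... the long-running work ...
    done.Set();   // signal that the work has finished
}) { IsBackground = true };
worker.Start();

// Elsewhere (on a non-UI thread, or with a timeout, so the UI isn't frozen):
done.WaitOne();
// run the finalizer / callback here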
You could use condition variables and a mutex, or functions like wait(), signal(), and perhaps a timed wait() so the main thread is not blocked indefinitely.
In C# this will be:
// shared lock object acting as the condition variable
readonly object syncPrimitive = new object();
void Notify()
{
lock (syncPrimitive)
{
Monitor.Pulse(syncPrimitive);
}
}
void RunLoop()
{
for (;;)
{
// do work here...
lock (syncPrimitive)
{
Monitor.Wait(syncPrimitive);
}
}
}
more on that here:
Condition Variables C#/.NET
This is the Monitor class in C#; there is also an overload of Wait that lets you specify a timeout:
public static bool Wait(
object obj,
TimeSpan timeout
)
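For example, a rough sketch of the timed variant, reusing the syncPrimitive object from the loop above:
lock (syncPrimitive)
{
    bool signalled = Monitor.Wait(syncPrimitive, TimeSpan.FromSeconds(5));
    if (!signalled)
    {
        // timed out without a Pulse - do housekeeping, or decide to stop waiting
    }
}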
more on that here:
https://msdn.microsoft.com/en-us/library/system.threading.monitor_methods(v=vs.110).aspx
I have a worker thread that may be active for short bursts of time and idle for rest of the time. I'm thinking to put the thread to sleep and then awake it when needed.
Any additional recommendations for this I should be aware of?
Thanks!
this is in C#/.NET4
You should probably not be using a persistent worker thread; use the thread pool instead. This is exactly what it is intended for.
ThreadPool.QueueUserWorkItem(state => {
// My temporary work here
});
If you insist on having a persistent worker thread, make it run this:
// This is our latch- we can use this to "let the thread out of the gate"
AutoResetEvent threadLatch = new AutoResetEvent(false);
// The thread runs this
public void DoBackgroundWork() {
// Making sure that the thread is a background thread
// ensures that the endless loop below doesn't prevent
// the program from exiting
Thread.CurrentThread.IsBackground = true;
while (true) {
// The worker thread will get here and then block
// until someone Set()s the latch:
threadLatch.WaitOne();
// Do your work here
}
}
// To signal the thread to start:
threadLatch.Set();
Also note that if this background thread is going to interact with the user interface at all, you'll need to Invoke or BeginInvoke accordingly. See http://weblogs.asp.net/justin_rogers/pages/126345.aspx
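For example, a one-line sketch of marshalling a UI update back from the worker thread (myLabel is an assumed WinForms control):
myLabel.BeginInvoke((MethodInvoker)(() => myLabel.Text = "Background work finished"));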
Just use an event to pause the worker thread: reset - paused, set - unpaused (working) state.
Here is a draft version of the code that demonstrates the approach.
class Worker
{
private Thread _thread;
// Un-paused by default.
private ManualResetEvent _notToBePaused = new ManualResetEvent(true);
public Worker()
{
_thread = new Thread(Run)
{
IsBackground = true
};
}
/// <summary>
/// Thread function.
/// </summary>
private void Run()
{
while (true)
{
// Would block if paused!
_notToBePaused.WaitOne();
// Process some stuff here.
}
}
public void Start()
{
_thread.Start();
}
public void Pause()
{
_notToBePaused.Reset();
}
public void UnPause()
{
_notToBePaused.Set();
}
}
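A short usage sketch for the Worker class above:
var worker = new Worker();
worker.Start();

// ... later, when the worker should go idle:
worker.Pause();

// ... and when there is work for it again:
worker.UnPause();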
Signaling with a WaitHandle is the right way to go, but just to add to what others have said already:
I'd usually have two signals working together; otherwise you wouldn't know whether to 'continue' or 'exit' when needed, or you'd have to resort to a less graceful way of doing that (stopping the thread - of course there are other ways, this is just one pattern). So usually it works with an 'exit' signal and a 'new work available' signal working in unison, e.g.:
WaitHandle[] eventArray = new WaitHandle[2] { _exitEvent, _newWorkEvent };
TimeSpan timeout = TimeSpan.FromMilliseconds(500);
int waitid;
// index 0 = exit requested, index 1 = new work available, WaitHandle.WaitTimeout = timed out
while ((waitid = WaitHandle.WaitAny(eventArray, timeout, false)) != 0)
{
// do your work, and optionally handle timeout etc.
}
note:
_exitEvent is a ManualResetEvent with a 'false' initial state - Set it to exit.
_newWorkEvent is either a ManualResetEvent, in which case you pause/continue it from outside (which I think is what you wanted) -
...or it could be a new AutoResetEvent(false), which you 'signal' to do one loop of work (the signal returns to 'false' right away), repeating that for each new batch of work - this is a bit simplified.
(often that goes hand in hand with some 'messages' being passed along, synchronized of course in some way).
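A rough sketch of driving that loop from the controlling thread (field names as above):
_newWorkEvent.Set();   // wake the worker for one batch of work (AutoResetEvent variant)
// ...
_exitEvent.Set();      // ask the worker loop to exit for good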
Hope this adds some more info,
I am trying to learn threading in C#. Today I saw the following code at http://www.albahari.com/threading/:
class ThreadTest
{
bool done;
static void Main()
{
ThreadTest tt = new ThreadTest(); // Create a common instance
new Thread (tt.Go).Start();
tt.Go();
}
// Note that Go is now an instance method
void Go()
{
if (!done) { done = true; Console.WriteLine ("Done"); }
}
}
In Java, unless you define "done" as volatile, the code will not be safe. How does the C# memory model handle this?
Thanks all for the answers. Much appreciated.
Well, there's the clear race condition that they could both see done as false and execute the if body - that's true regardless of memory model. Making done volatile won't fix that, and it wouldn't fix it in Java either.
But yes, it's feasible that the change made in one thread could happen but not be visible until in the other thread. It depends on CPU architecture etc. As an example of what I mean, consider this program:
using System;
using System.Threading;
class Test
{
private bool stop = false;
static void Main()
{
new Test().Start();
}
void Start()
{
new Thread(ThreadJob).Start();
Thread.Sleep(500);
stop = true;
}
void ThreadJob()
{
int x = 0;
while (!stop)
{
x++;
}
Console.WriteLine("Counted to {0}", x);
}
}
While on my current laptop this does terminate, I've used other machines where pretty much the exact same code would run forever - it would never "see" the change to stop in the second thread.
Basically, I try to avoid writing lock-free code unless it's using higher-level abstractions provided by people who really know their stuff - like the Parallel Extensions in .NET 4.
There is a way to make this code lock-free and correct easily though, using Interlocked. For example:
class ThreadTest
{
int done;
static void Main()
{
ThreadTest tt = new ThreadTest(); // Create a common instance
new Thread (tt.Go).Start();
tt.Go();
}
// Note that Go is now an instance method
void Go()
{
if (Interlocked.CompareExchange(ref done, 1, 0) == 0)
{
Console.WriteLine("Done");
}
}
}
Here the change of value and the testing of it are performed as a single unit: CompareExchange will only set the value to 1 if it's currently 0, and will return the old value. So only a single thread will ever see a return value of 0.
Another thing to bear in mind: your question is fairly ambiguous, as you haven't defined what you mean by "thread safe". I've guessed at your intention, but you never made it clear. Read this blog post by Eric Lippert - it's well worth it.
No, it's not thread safe. You could potentially have one thread check the condition (if(!done)), the other thread check that same condition, and then the first thread executes the first line in the code block (done = true).
You can make it thread safe with a lock:
lock(this)
{
if(!done)
{
done = true;
Console.WriteLine("Done");
}
}
Even in Java with volatile, both threads could enter the block with the WriteLine.
If you want mutual exclusion you need to use a real synchronisation object such as a lock.
The only way this is thread safe is to use an atomic compare-and-set in the if test (the snippet below is Java-style, using AtomicBoolean; in C# the equivalent is Interlocked.CompareExchange):
if(atomicBool.compareAndSet(false,true)){
Console.WriteLine("Done");
}
You should do something like this:
class ThreadTest
{
    Object myLock = new Object();
    ...
    void Go()
    {
        lock (myLock)
        {
            if (!done)
            {
                done = true;
                Console.WriteLine("Done");
            }
        }
    }
}
The reason you want to use a dedicated lock object rather than "this" is that "this" is visible to outside code: anything else that holds a reference to your object can lock on it as well, which can cause contention or deadlocks that you cannot control from inside the class.
Another small thing you might consider is this. It is a "good practices" thing, so nothing severe.
class ThreadTest
{
    Object myLock = new Object();
    ...
    void Go()
    {
        bool justSetIt = false;
        lock (myLock)
        {
            if (!done)
            {
                done = true;
                justSetIt = true;
            }
        }
        // This line of code does not belong inside the lock.
        if (justSetIt)
            Console.WriteLine("Done");
    }
}
Never have code inside a lock that does not need to be there, because it increases the time other threads spend waiting. If you have lots of threads you can gain a lot of performance by removing all that unnecessary waiting.
Hope it helps :)
I want to know a proper way to start and stop a threaded job forced and unforced. Is this the proper way to stop a Thread?
public class ProcessDataJob : IJob
{
private ConcurrentQueue<byte[]> _dataQueue = new ConcurrentQueue<byte[]>();
private volatile bool _stop = false;
private volatile bool _forceStop = false;
private Thread _thread;
private int _timeOut = 1000;
public void Start()
{
_stop = false;
_forceStop = false;
_thread = new Thread(ProcessData);
_thread.Start();
}
private void ProcessData()
{
while (!_stop || _dataQueue.Count > 0)
{
if(_forceStop) return;
byte[] data;
if(_dataQueue.TryDequeue(out data))
{
//Process data
//.....//
}
}
}
public void Stop(bool force)
{
_stop = true;
_forceStop = force;
_thread.Join(_timeOut);
}
public void Enqueue(byte[] data)
{
_dataQueue.Enqueue(data);
}
}
There is no proper way to forcibly kill a thread.
There are several ways to do it, but none of them are proper.
Forcibly killing a thread is something you should do only if you need to terminate the program, or unload the appdomain containing the thread, and don't care about any data structures left dangling in a corrupted/bad/locked state, because they will be gone shortly as well.
There's plenty of advice on the internet about how bad/evil Thread.Abort is, so don't do it.
Instead, write proper cooperative threading: the thread(s) should themselves check a flag (an event, a volatile bool field, etc.) and voluntarily exit when nicely asked to do so.
That is the proper way.
This is the way that I've done it in the past, with one difference. I've had threads unexpectedly hang, which means that a loop like yours never comes back with an answer. To address this, every thread-using class like yours registers with a 'manager' class that is responsible for key operations such as a forced stop. The threaded class holds a reference to the manager, and when a forced stop is requested it calls a method on the manager that effectively starts a timer. If the threaded class hasn't set its state flag to STOPPED by the time the timer fires, the manager calls Abort on it.
The key thing for me was not just calling 'stop' but getting confirmation that the stop had actually occurred, and accepting that this takes a non-deterministic amount of time - after a 'reasonable amount of time' you give up and move on.
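A rough sketch of that manager-with-a-timeout idea (all names here are illustrative, not taken from the question's code):
class StoppableWorker
{
    public Thread Thread;                  // the worker's thread
    public volatile bool StopRequested;    // cooperative stop flag checked by the worker
}

class ThreadManager
{
    // Ask the worker to stop, wait a grace period for confirmation, then give up and abort.
    public void ForceStop(StoppableWorker worker, TimeSpan gracePeriod)
    {
        worker.StopRequested = true;            // cooperative request first
        if (!worker.Thread.Join(gracePeriod))   // did it confirm the stop in time?
        {
            worker.Thread.Abort();              // last resort after a reasonable wait
        }
    }
}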
You could use the .NET ThreadPool class; that way you don't have to manage the threads yourself.
I have a scenario where I will have to kick off a ton of threads (possibly up to a 100), then wait for them to finish, then perform a task (on yet another thread).
What is an accepted pattern for doing this type of work? Is it simply .Join? Or is there a higher level of abstraction nowadays?
Using .NET 2.0 with VS2008.
In .NET 3.5sp1 or .NET 4, the TPL would make this much easier. However, I'll tailor this to .NET 2 features only.
There are a couple of options. Using Thread.Join is perfectly acceptable, especially if the threads are all ones you are creating manually. This is very easy, reliable, and simple to implement. It would probably be my choice.
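For example, a sketch of the plain Join approach on .NET 2.0 (ThreadFunction stands in for whatever each thread runs):
List<Thread> threads = new List<Thread>();
for (int i = 0; i < 100; i++)
{
    Thread t = new Thread(new ThreadStart(ThreadFunction));
    threads.Add(t);
    t.Start();
}

foreach (Thread t in threads)
    t.Join();   // blocks until that thread has finished

// every thread is done - kick off the final task here (or on yet another thread)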
However, the other option would be to create a counter for the total amount of work, and to use a reset event when the counter reaches zero. For example:
class MyClass {
int workToComplete; // Total number of elements
ManualResetEvent mre; // For waiting
void StartThreads()
{
this.workToComplete = 100;
mre = new ManualResetEvent(false);
int total = workToComplete;
for(int i=0;i<total;++i)
{
Thread thread = new Thread( new ThreadStart(this.ThreadFunction) );
thread.Start(); // Kick off the thread
}
mre.WaitOne(); // Will block until all work is done
}
void ThreadFunction()
{
// Do your work
if (Interlocked.Decrement(ref this.workToComplete) == 0)
this.mre.Set(); // Allow the main thread to continue here...
}
}
Did you look at ThreadPool? Have a look here - ThreadPool tutorial - where the author solves the same task you're asking about.
What's worked well for me is to store each thread's ManagedThreadId in a dictionary as I launch it, and then have each thread pass its id back through a callback method when it completes. The callback method deletes the id from the dictionary and checks the dictionary's Count property; when it's zero you're done. Be sure to lock around the dictionary both for adding to and deleting from it.
I am not sure that any kind of standard thread locking or synchronization mechanisms will really work with so many threads. However, this might be a scenario where some basic messaging might be an ideal solution to the problem.
Rather than using Thread.Join, which will block (and could be very difficult to manage with so many threads), you might try setting up one more thread that aggregates completion messages from your worker threads. When the aggregator has received all expected messages, it completes. You could then use a single WaitHandle between the aggregator and your main application thread to signal that all of your worker threads are done.
public class WorkerAggregator
{
public WorkerAggregator(EventWaitHandle completionEvent)
{
m_completionEvent = completionEvent;
m_workers = new Dictionary<int, Thread>();
}
private readonly EventWaitHandle m_completionEvent;
private readonly Dictionary<int, Thread> m_workers;
public void StartWorker(Action worker)
{
// declare first so the lambda can safely refer to 'thread'
Thread thread = null;
thread = new Thread(d =>
{
worker();
notifyComplete(thread.ManagedThreadId);
}
);
lock (m_workers)
{
m_workers.Add(thread.ManagedThreadId, thread);
}
thread.Start();
}
private void notifyComplete(int threadID)
{
bool done = false;
lock (m_workers)
{
m_workers.Remove(threadID);
done = m_workers.Count == 0;
}
if (done) m_completionEvent.Set();
}
}
Note, I have not tested the code above, so it might not be 100% correct. However I hope it illustrates the concept enough to be useful.
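For example, a possible way to wire it up (DoWork stands in for the per-thread job; untested, like the class above):
EventWaitHandle allDone = new ManualResetEvent(false);
WorkerAggregator aggregator = new WorkerAggregator(allDone);

for (int i = 0; i < 100; i++)
{
    int id = i;                                  // capture the loop variable
    aggregator.StartWorker(() => DoWork(id));    // DoWork is an assumed method
}

allDone.WaitOne();   // the main thread blocks here until every worker has reported in
// now run the final task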