I'm playing around with a simple console app that creates one thread, and I do some inter-thread communication between the main thread and the worker thread.
I'm posting objects from the main thread to a concurrent queue, and the worker thread dequeues them and does some processing.
What strikes me as odd is what I see when I profile this app, even though I have two cores:
one core is 100% idle while the other core has done all the work, and I see that both threads have been running on that one core.
Why is this?
Is it because I use a wait handle that is set when I post a message and released when the processing is done?
This is my sample code, now using two worker threads.
It still behaves the same: main, worker1, and worker2 all run on the same core.
Any ideas?
[EDIT]
It sort of works now; at least I get twice the performance compared to yesterday.
The trick was to slow down the consumer just enough to avoid signaling via the AutoResetEvent.
public class SingleThreadDispatcher
{
    public long Count;
    private readonly ConcurrentQueue<Action> _queue = new ConcurrentQueue<Action>();
    private volatile bool _hasMoreTasks;
    private volatile bool _running = true;
    private int _status;
    private readonly AutoResetEvent _signal = new AutoResetEvent(false);

    public SingleThreadDispatcher()
    {
        var thread = new Thread(Run)
        {
            IsBackground = true,
            Name = "worker" + Guid.NewGuid(),
        };
        thread.Start();
    }

    private void Run()
    {
        while (_running)
        {
            _signal.WaitOne();
            do
            {
                _hasMoreTasks = false;

                Action task;
                while (_queue.TryDequeue(out task) && _running)
                {
                    Count++;
                    task();
                }

                //wait a short while to give _hasMoreTasks a chance to be set to true
                //this avoids the roundtrip to the AutoResetEvent
                //that is, if there is intense pressure on the pool, we let some new
                //tasks have the chance to arrive and be processed w/o signaling
                if (!_hasMoreTasks)
                    Thread.Sleep(5);

                Interlocked.Exchange(ref _status, 0);
            } while (_hasMoreTasks);
        }
    }

    public void Schedule(Action task)
    {
        _hasMoreTasks = true;
        _queue.Enqueue(task);
        SetSignal();
    }

    private void SetSignal()
    {
        if (Interlocked.Exchange(ref _status, 1) == 0)
        {
            _signal.Set();
        }
    }
}
Is it because I use a wait handle that is set when I post a message and released when the processing is done?
Without seeing your code it is hard to say for sure, but from your description it appears that the two threads you wrote act as coroutines: when the main thread is running, the worker thread has nothing to do, and vice versa. It looks like the .NET scheduler is smart enough not to load the second core when this happens.
You can change this behavior in several ways - for example:
by doing some work on the main thread before waiting on the handle, or
by adding more worker threads that compete for the tasks your main thread posts, so that more than one of them can be working at a time (see the sketch below).
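To illustrate that second option, here is a minimal sketch - not the poster's code; the BlockingCollection, the pool size parameter, and the handleMessage callback are my own assumptions:

using System;
using System.Collections.Concurrent;
using System.Threading;

public class WorkerPool
{
    private readonly BlockingCollection<object> _items = new BlockingCollection<object>();

    public WorkerPool(int workerCount, Action<object> handleMessage)
    {
        for (int i = 0; i < workerCount; i++)
        {
            var thread = new Thread(() =>
            {
                // Each worker blocks here until an item is available,
                // so several workers can run on different cores at once.
                foreach (var item in _items.GetConsumingEnumerable())
                    handleMessage(item);
            })
            {
                IsBackground = true,
                Name = "worker" + i
            };
            thread.Start();
        }
    }

    // Called from the main thread.
    public void Post(object message)
    {
        _items.Add(message);
    }
}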
OK, I've figured out what the problem is.
The producer and the consumer are running at pretty much the same speed in this case.
This results in the consumer finishing all its work quickly and then looping back to wait on the AutoResetEvent.
The next time the producer sends a task, it has to touch the AutoResetEvent and set it.
The solution was to add a very small delay in the consumer, making it slightly slower than the producer.
As a result, when the producer sends a task, it notices that the consumer is already active, and it just has to post to the worker queue without touching the AutoResetEvent.
The original behavior resulted in a sort of ping-pong effect, which can be seen in the screenshot.
Dasblinkelight (probably) has the right answer.
Apart from that, it would also be the correct behaviour when one of your threads is I/O bound (that is, it's not stuck on the CPU) - in that case, you've got nothing to gain from using multiple cores, and .NET is smart enough to just change contexts on one core.
This is often the case for UI threads - they have very little work to do, so there usually isn't much reason for one to occupy a whole core for itself. And yes, if your concurrent queue is not used properly, it could simply mean that the main thread waits for the worker thread - again, in that case, there is no need to switch cores, since the original thread is waiting anyway.
You should use BlockingCollection rather than ConcurrentQueue. By default, BlockingCollection uses a ConcurrentQueue under the hood, but it has a much easier-to-use interface. In particular, it does non-busy waits. In addition, BlockingCollection supports cancellation, so your consumer becomes very simple. Here's an example:
public class SingleThreadDispatcher
{
    public long Count;
    private readonly BlockingCollection<Action> _queue = new BlockingCollection<Action>();
    private readonly CancellationTokenSource _cancellation = new CancellationTokenSource();

    public SingleThreadDispatcher()
    {
        var thread = new Thread(Run)
        {
            IsBackground = true,
            Name = "worker" + Guid.NewGuid(),
        };
        thread.Start();
    }

    private void Run()
    {
        foreach (var task in _queue.GetConsumingEnumerable(_cancellation.Token))
        {
            Count++;
            task();
        }
    }

    public void Schedule(Action task)
    {
        _queue.Add(task);
    }
}
The loop with GetConsumingEnumerable will do a non-busy wait on the queue. There's no need to do it with a separate event. It will wait for an item to be added to the queue, or it will exit if you set the cancellation token.
To stop it normally, you just call _queue.CompleteAdding(). That tells the consumer that no more items will be added to the queue. The consumer will empty the queue and then exit.
If you want to quit early, then just call _cancellation.Cancel(). That will cause GetConsumingEnumerable to exit.
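To make those two shutdown paths concrete, here is a hedged sketch of the same dispatcher with explicit stop methods; the StoppableDispatcher name and its Stop/Cancel methods are my own additions, not part of the answer above:

using System;
using System.Collections.Concurrent;
using System.Threading;

public class StoppableDispatcher
{
    private readonly BlockingCollection<Action> _queue = new BlockingCollection<Action>();
    private readonly CancellationTokenSource _cancellation = new CancellationTokenSource();
    private readonly Thread _thread;

    public StoppableDispatcher()
    {
        _thread = new Thread(Run) { IsBackground = true };
        _thread.Start();
    }

    private void Run()
    {
        try
        {
            foreach (var task in _queue.GetConsumingEnumerable(_cancellation.Token))
                task();
        }
        catch (OperationCanceledException)
        {
            // Cancel() was called: exit immediately, leaving unprocessed items behind.
        }
    }

    public void Schedule(Action task) { _queue.Add(task); }

    // Graceful stop: no more items may be added; the consumer drains the queue and exits.
    public void Stop()
    {
        _queue.CompleteAdding();
        _thread.Join();
    }

    // Early stop: abandons whatever is still queued.
    public void Cancel()
    {
        _cancellation.Cancel();
        _thread.Join();
    }
}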
In general, you shouldn't ever have to use ConcurrentQueue directly. BlockingCollection is easier to use and provides equivalent performance.
I have the code below in C# for a consumer and producer using AutoResetEvent, but it does not work when there are multiple producers and one consumer. The problem is that the consumer cannot consume all the items in the queue. When I debug, I notice the consumer can only remove one item; after that it returns false and cannot remove any more. The problem seems to be in the AutoResetEvent, but I cannot figure out what is wrong.
private AutoResetEvent newItemSignal = new AutoResetEvent(false);
private Queue<Task> iQueue = new Queue<Task>();

public void Enqueue(Task task)
{
    lock (((ICollection)iQueue).SyncRoot)
    {
        iQueue.Enqueue(task);
        newItemSignal.Set();
    }
}

public bool Dequeue(out Task task, int timeout)
{
    if (newItemSignal.WaitOne(timeout, false))
    {
        lock (((ICollection)iQueue).SyncRoot)
        {
            task = iQueue.Dequeue();
        }
        return true;
    }
    task = default(Task);
    return false;
}
The problem with using an AutoResetEvent like this is that you may call Set() twice or more but WaitOne() only once. Calling Set() on an ARE that is already signaled has no effect: the extra signal is lost, and that item gets stuck in the queue. A standard threading race bug. It looks like you could fix it by emptying the entire queue in the consumer. That's not a real fix - the producer can still race ahead of the consumer; you've merely lowered the odds to the once-a-month, undebuggable stage.
An ARE cannot do this; it cannot count. Use a Semaphore or SemaphoreSlim instead - it was made to count in a thread-safe way. Or use a ConcurrentQueue, a class added to solve exactly this kind of programming problem.
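As a rough sketch (my own illustration, not part of the answer), the counting fix with SemaphoreSlim could look like this, keeping the question's field names; the SemaphoreTaskQueue wrapper class is hypothetical:

using System.Collections;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

public class SemaphoreTaskQueue
{
    private readonly SemaphoreSlim newItemSignal = new SemaphoreSlim(0);
    private readonly Queue<Task> iQueue = new Queue<Task>();

    public void Enqueue(Task task)
    {
        lock (((ICollection)iQueue).SyncRoot)
        {
            iQueue.Enqueue(task);
        }
        // Release() adds one count per enqueued item, so no signal is ever lost,
        // no matter how many producers call Enqueue at the same time.
        newItemSignal.Release();
    }

    public bool Dequeue(out Task task, int timeout)
    {
        // Wait() consumes exactly one count, i.e. one successful wait per enqueued item.
        if (newItemSignal.Wait(timeout))
        {
            lock (((ICollection)iQueue).SyncRoot)
            {
                task = iQueue.Dequeue();
            }
            return true;
        }
        task = default(Task);
        return false;
    }
}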
By using AutoResetEvent, you have designed the program in such a way that only one consumer can consume an item at a time.
If you want to stick with a similar design, you can instead use a ManualResetEvent: Reset the event when a consumer thread finds that there are no items left to be consumed, and Set the event when the producer thread knows that there is at least one item to be consumed (a sketch of this variant follows below).
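Here is a minimal sketch of that ManualResetEvent variant - my own illustration of the description above, not code from the original answer, reusing the question's field and method names where possible:

using System.Collections;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

public class ManualResetEventTaskQueue
{
    private readonly ManualResetEvent itemsAvailable = new ManualResetEvent(false);
    private readonly Queue<Task> iQueue = new Queue<Task>();

    public void Enqueue(Task task)
    {
        lock (((ICollection)iQueue).SyncRoot)
        {
            iQueue.Enqueue(task);
            // At least one item is now available, so let consumers through.
            itemsAvailable.Set();
        }
    }

    public bool Dequeue(out Task task, int timeout)
    {
        if (itemsAvailable.WaitOne(timeout))
        {
            lock (((ICollection)iQueue).SyncRoot)
            {
                if (iQueue.Count > 0)
                {
                    task = iQueue.Dequeue();
                    // If this consumer took the last item, block further consumers
                    // until a producer signals again.
                    if (iQueue.Count == 0)
                        itemsAvailable.Reset();
                    return true;
                }
            }
        }
        task = default(Task);
        return false;
    }
}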
You can find an alternate design using the Monitor class here.
You can also make use of BlockingCollection if you are using .NET 4.0 or higher.
I have a Window in WPF, and the user can start very long operations from it. The user must be able to cancel those operations.
All of my operations are in separate threads. So my question is:
Can I terminate, at any time, all threads that were started from that Window (without killing the UI thread, obviously)?
In the places where I need to do long operations, threads were created and started like this:
Thread thread =
    new Thread(
        new ThreadStart(
            delegate
            {...}));
thread.Start();
How do I pass that object to it? Is it possible? If it matters at all: I do not care about graceful closing of the threads; they can simply be killed, and that would still be a solution. Is the Window object aware of the threads to which it is a parent?
Thank you in advance.
Typically you won't want to create and destroy threads yourself. Creating a Thread every time you need one has much more overhead than using thread pools and Tasks (this applies, as noted, when you need to create a significant number of threads over the lifetime of your process).
The preferred approach (especially if you're using .NET 4.0, or even better 4.5) is to use Tasks.
There is plenty of documentation on how to use Tasks and how to cancel them; #xxbbcc posted a link in a comment on your question. A rough sketch of the cancellation pattern is shown below.
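As an illustration only - the OperationRunner class, the RunLongOperation method, and the work loop are my own placeholders, not from the question - cooperative cancellation with Tasks usually looks something like this:

using System;
using System.Threading;
using System.Threading.Tasks;

public class OperationRunner
{
    private readonly CancellationTokenSource _cts = new CancellationTokenSource();

    // Started from, e.g., a button click on the Window.
    public Task RunLongOperation()
    {
        CancellationToken token = _cts.Token;
        return Task.Run(() =>
        {
            for (int i = 0; i < 1000000; i++)
            {
                // Cooperative cancellation: the task checks the token and stops itself.
                token.ThrowIfCancellationRequested();
                // ... one small slice of the long operation ...
            }
        }, token);
    }

    // Called when the user cancels (or when the Window closes).
    public void CancelAll()
    {
        _cts.Cancel();
    }
}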
However, if you still think that dealing with Threads is your best choice, you could keep track of all the threads. Then, whenever you (as a developer) or your user decides to kill them, you can just iterate through the threads and call the Abort() method on each one.
public class MyExampleClass
{
    private List<Thread> MyThreads { get; set; }

    public MyExampleClass()
    {
        MyThreads = new List<Thread>();
        InstanciateThreadsWithSomeSuperImportantOperations();
    }

    private void InstanciateThreadsWithSomeSuperImportantOperations()
    {
        var thread = new Thread(() =>
        {
            // some code here
        });
        MyThreads.Add(thread);
    }

    public void KillAllThreads()
    {
        foreach (var t in MyThreads)
        {
            if (t.IsAlive)
                t.Abort(); // Note this isn't guaranteed to stop the thread.
        }
    }
}
I am trying to do the following:
I have a server that is supposed to get many messages from a queue and process them. Now what I want is to create a new thread for every message, and those threads will handle the response to the queue; I just want my server (core thread) to listen for messages and create threads, without caring what happens to them.
How can I achieve this? I know I can use the Thread class to create a thread, but then the application just keeps listening to the thread until it finishes.
I can also create an async method and run it, but what happens when it finishes? Also, the method is supposed to be static if I want it to be async, but in my current application that is not a solution, since I use many non-static variables in this method.
Any ideas would be appreciated.
Unless you have a very specific reason, I'd recommend using Tasks instead of Threads.
They'll likely run in the background anyway, but they produce less CPU/memory overhead and (in my opinion) are easier to handle when exceptions occur.
Task t = Task.Run(() => ProcessMessage(message));
Maybe take a look at this introduction
What do you mean by
I know I can use the Thread class to create a thread, but then the application just keeps listening to the thread until it finishes.
Just spawn the thread and let it run:
{
    Thread t = new Thread(Foo);
    t.Start();
}

public void Foo()
{ }
This won't make the main thread listen to the child thread; it just spawns it and continues working on the following instructions.
BTW, there are tons of results on how to create and run threads.
Since I don't like it when others do that, here are simple examples of each way (asynchronous / task-based); pick whichever one you like.
Asynchronous Implementation
static void Main()
{
    while (true)
    {
        string data = SomeMethodThatReturnsTheNextDataFromQueue();
        ProcessDataAsync(data);
    }
}

private static async void ProcessDataAsync(string msg)
{
    // The *await* keyword returns to the caller and allows the main thread to continue looping.
    bool result = await ParseDataAndSaveSomewhere(msg);
    return;
}
Task-Based Implementation
static void Main()
{
    while (true)
    {
        string data = SomeMethodThatReturnsTheNextDataFromQueue();
        Task task = new Task(() => { ProcessData(data); });
        task.Start();
    }
}

private static void ProcessData(string data)
{
    // Do work
}
public void EnqueueTask(int[] task)
{
    lock (_locker)
    {
        _taskQ.Enqueue(task);
        Monitor.PulseAll(_locker);
    }
}
So, here I'm adding elements to my queue, and then threads do some work with them. How can I add items to my queue asynchronously?
If you are using .NET 4, have a look at the new thread-safe collections; they are mostly non-blocking, so they will probably remove the need for an async add.
Since you're using Queue<T> (recommended), Queue.Synchronized can't be used.
Besides that, I would use the thread pool. But your EnqueueTask method kind of implies that the threading logic is handled outside of your "TaskQueue" class (your method implies that it is a queue of tasks).
Your implementation also implies that it is not "here" we want to add logic but rather in another place; the code you have there isn't really blocking for long, so I would turn things upside down.
It also implies that the thing taking items off the queue is already on another thread, since you use "PulseAll" to wake that thread up.
E.g.
public void StartQueueHandler()
{
    new Thread(StartWorker).Start();
}

private int[] Dequeue()
{
    lock (_locker)
    {
        while (_taskQ.Count == 0) Monitor.Wait(_locker);
        return _taskQ.Dequeue();
    }
}

private void StartWorker(object obj)
{
    while (_keepProcessing)
    {
        //Handle thread abort or have another "shut down" mechanism.
        int[] work = Dequeue();
        //If work should be done in parallel without results.
        ThreadPool.QueueUserWorkItem(state => DoWork(work));
        //If work should be done sequentially according to the queue.
        DoWork(work);
    }
}
Maybe something like this could work:
void AddToQueue(Queue queue, string mess) {
    var t = new Thread(() => Queue.Synchronized(queue).Enqueue(mess));
    t.Start();
}
The new thread ensures that your current thread does not block.
Queue.Synchronized handles all the locking of the queue.
It could be replaced with your locker code, which might give better performance.
The code from your question seems to indicate that you are attempting to implement a blocking queue. I make that observation from the call to Monitor.PulseAll after the Queue<T>.Enqueue. This is the normal pattern for signalling the dequeuing thread. So if that is the case, then the best option is to use the BlockingCollection class, which is available in .NET 4.0.
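For illustration, here is a hedged sketch of what the same producer/consumer pair might look like on top of BlockingCollection; the class shape, the Shutdown method, and the DoWork placeholder are my own, not the question's:

using System;
using System.Collections.Concurrent;
using System.Threading;

public class TaskQueue
{
    // Backed by a ConcurrentQueue by default; Add/Take handle the locking and signalling.
    private readonly BlockingCollection<int[]> _taskQ = new BlockingCollection<int[]>();

    public void EnqueueTask(int[] task)
    {
        // No lock or Monitor.PulseAll needed.
        _taskQ.Add(task);
    }

    public void StartQueueHandler()
    {
        new Thread(() =>
        {
            // Take blocks (without spinning) until an item is available, and the loop
            // ends once CompleteAdding() has been called and the queue is drained.
            foreach (var work in _taskQ.GetConsumingEnumerable())
                DoWork(work);
        }) { IsBackground = true }.Start();
    }

    public void Shutdown()
    {
        _taskQ.CompleteAdding();
    }

    private void DoWork(int[] work)
    {
        // Process the task here.
    }
}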
I've been working on a web-crawling .NET app in my free time, and one of the features of this app that I wanted to include was a pause button to pause a specific thread.
I'm relatively new to multi-threading, and I haven't been able to figure out a currently supported way to pause a thread indefinitely. I can't remember the exact class/method, but I know there is a way to do this, and that it has been flagged as obsolete by the .NET Framework.
Is there any good general-purpose way to indefinitely pause a worker thread in C#/.NET?
I haven't had a lot of time lately to work on this app, and the last time I touched it was on the .NET 2.0 Framework. I'm open to any new features (if any) that exist in the .NET 3.5 Framework, but I'd like to know of a solution that also works in the 2.0 Framework, since that's what I use at work and it would be good to know just in case.
Never, ever use Thread.Suspend. The major problem with it is that 99% of the time you can't know what that thread is doing when you suspend it. If that thread holds a lock, you make it easier to get into a deadlock situation, etc. Keep in mind that code you are calling may be acquiring/releasing locks behind the scenes. Win32 has a similar API: SuspendThread and ResumeThread. The following docs for SuspendThread give a nice summary of the dangers of the API:
http://msdn.microsoft.com/en-us/library/ms686345(VS.85).aspx
This function is primarily designed for use by debuggers. It is not intended to be used for thread synchronization. Calling SuspendThread on a thread that owns a synchronization object, such as a mutex or critical section, can lead to a deadlock if the calling thread tries to obtain a synchronization object owned by a suspended thread. To avoid this situation, a thread within an application that is not a debugger should signal the other thread to suspend itself. The target thread must be designed to watch for this signal and respond appropriately.
The proper way to suspend a thread indefinitely is to use a ManualResetEvent. The thread is most likely looping, performing some work. The easiest way to suspend the thread is to have the thread "check" the event each iteration, like so:
while (true)
{
    _suspendEvent.WaitOne(Timeout.Infinite);

    // Do some work...
}
You specify an infinite timeout, so when the event is not signaled the thread will block indefinitely, until the event is signaled, at which point the thread will resume where it left off.
You would create the event like so:
ManualResetEvent _suspendEvent = new ManualResetEvent(true);
The true parameter tells the event to start out in the signaled state.
When you want to pause the thread, you do the following:
_suspendEvent.Reset();
And to resume the thread:
_suspendEvent.Set();
You can use a similar mechanism to signal the thread to exit and wait on both events, detecting which event was signaled.
Just for fun I'll provide a complete example:
public class Worker
{
    ManualResetEvent _shutdownEvent = new ManualResetEvent(false);
    ManualResetEvent _pauseEvent = new ManualResetEvent(true);
    Thread _thread;

    public Worker() { }

    public void Start()
    {
        _thread = new Thread(DoWork);
        _thread.Start();
    }

    public void Pause()
    {
        _pauseEvent.Reset();
    }

    public void Resume()
    {
        _pauseEvent.Set();
    }

    public void Stop()
    {
        // Signal the shutdown event
        _shutdownEvent.Set();

        // Make sure to resume any paused threads
        _pauseEvent.Set();

        // Wait for the thread to exit
        _thread.Join();
    }

    public void DoWork()
    {
        while (true)
        {
            _pauseEvent.WaitOne(Timeout.Infinite);

            if (_shutdownEvent.WaitOne(0))
                break;

            // Do the work here..
        }
    }
}
The Threading in C# ebook summarises Thread.Suspend and Thread.Resume thusly:
The deprecated Suspend and Resume methods have two modes – dangerous and useless!
The book recommends using a synchronization construct such as an AutoResetEvent or Monitor.Wait to perform thread suspending and resuming.
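As a rough sketch of the Monitor.Wait option mentioned above (the PausableWorker class and its method names are my own, not from the book):

using System.Threading;

public class PausableWorker
{
    private readonly object _pauseLock = new object();
    private bool _paused;

    public void Pause()
    {
        lock (_pauseLock) { _paused = true; }
    }

    public void Resume()
    {
        lock (_pauseLock)
        {
            _paused = false;
            // Wake the worker if it is currently waiting inside CheckPause().
            Monitor.PulseAll(_pauseLock);
        }
    }

    // Called by the worker thread at safe points in its loop.
    private void CheckPause()
    {
        lock (_pauseLock)
        {
            while (_paused)
                Monitor.Wait(_pauseLock);
        }
    }

    public void DoWork()
    {
        while (true)
        {
            CheckPause();
            // ... do one unit of work ...
        }
    }
}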
If there are no synchronization requirements:
Thread.Sleep(Timeout.Infinite);
I just implemented a LoopingThread class which loops an action passed to the constructor. It is based on Brannon's post. I've put some other things into it, like WaitForPause(), WaitForStop(), and a PauseBetween property that indicates how long to wait before the next loop iteration.
I also decided to change the while loop to a do-while loop. This gives us deterministic behavior for a Start() followed immediately by a Pause(). By deterministic I mean that the action is executed at least once after a Start() command. In Brannon's implementation this might not be the case.
I omitted some things to keep to the root of the matter, such as checking whether the thread was already started, and the IDisposable pattern.
public class LoopingThread
{
    private readonly Action _loopedAction;
    private readonly AutoResetEvent _pauseEvent;
    private readonly AutoResetEvent _resumeEvent;
    private readonly AutoResetEvent _stopEvent;
    private readonly AutoResetEvent _waitEvent;
    private readonly Thread _thread;

    public LoopingThread(Action loopedAction)
    {
        _loopedAction = loopedAction;
        _thread = new Thread(Loop);
        _pauseEvent = new AutoResetEvent(false);
        _resumeEvent = new AutoResetEvent(false);
        _stopEvent = new AutoResetEvent(false);
        _waitEvent = new AutoResetEvent(false);
    }

    public void Start()
    {
        _thread.Start();
    }

    public void Pause(int timeout = 0)
    {
        _pauseEvent.Set();
        _waitEvent.WaitOne(timeout);
    }

    public void Resume()
    {
        _resumeEvent.Set();
    }

    public void Stop(int timeout = 0)
    {
        _stopEvent.Set();
        _resumeEvent.Set();
        _thread.Join(timeout);
    }

    public void WaitForPause()
    {
        Pause(Timeout.Infinite);
    }

    public void WaitForStop()
    {
        Stop(Timeout.Infinite);
    }

    public int PauseBetween { get; set; }

    private void Loop()
    {
        do
        {
            _loopedAction();
            if (_pauseEvent.WaitOne(PauseBetween))
            {
                _waitEvent.Set();
                _resumeEvent.WaitOne(Timeout.Infinite);
            }
        } while (!_stopEvent.WaitOne(0));
    }
}
Besides the suggestions above, I'd like to add one tip: in some cases, using a BackgroundWorker can simplify your code (especially when you use anonymous methods to define DoWork and its other event handlers).
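A minimal sketch of that idea, assuming a crawler-style work loop; the CrawlerWorkerExample class and the work placeholder are my own, purely illustrative:

using System.ComponentModel;

public static class CrawlerWorkerExample
{
    public static BackgroundWorker StartWorker()
    {
        BackgroundWorker worker = new BackgroundWorker();
        worker.WorkerSupportsCancellation = true;

        // An anonymous method keeps the DoWork logic next to its setup.
        worker.DoWork += delegate(object sender, DoWorkEventArgs e)
        {
            BackgroundWorker bw = (BackgroundWorker)sender;
            while (!bw.CancellationPending)
            {
                // ... do one unit of work (e.g. crawl one page) ...
            }
            e.Cancel = bw.CancellationPending;
        };

        // In a WinForms/WPF app this event is raised back on the UI thread,
        // so it is safe to update controls here.
        worker.RunWorkerCompleted += delegate(object sender, RunWorkerCompletedEventArgs e)
        {
            // ... report completion / cancellation to the UI ...
        };

        worker.RunWorkerAsync();
        return worker; // call CancelAsync() on this to request a stop
    }
}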
In line with what the others said - don't do it. What you really want to do is to "pause work" and let your threads roam free. Can you give us some more details about the thread(s) you want to suspend? If you didn't start the thread, you definitely shouldn't even consider suspending it - it's not yours. If it is your thread, then I suggest that instead of suspending it, you just have it sit and wait for more work to do. Brannon has some excellent suggestions for this option in his response. Alternatively, just let it end, and spin up a new one when you need it.
Suspend() and Resume() may be deprecated; however, they are in no way useless.
If, for example, you have a thread doing lengthy work that alters data, and the user wishes to stop it, he clicks a button. Of course, you need to ask for confirmation, but at the same time you do not want that thread to continue altering data if the user decides that he really does want to abort.
Suspending the thread while waiting for the user to click the Yes or No button in the confirmation dialog is the only way to prevent it from altering the data before you signal the designated abort event that will allow it to stop.
Events may be fine for simple threads with one loop, but complicated threads with complex processing are another issue.
Certainly, Suspend() must never be used for synchronizing, since that is not its purpose.
Just my opinion.