AutoResetEvent not blocking properly - c#

I have a thread, which creates a variable number of worker threads and distributes tasks between them. This is solved by passing the threads a TaskQueue object, whose implementation you will see below.
These worker threads simply iterate over the TaskQueue object they were given, executing each task.
private class TaskQueue : IEnumerable<Task>
{
public int Count
{
get
{
lock(this.tasks)
{
return this.tasks.Count;
}
}
}
private readonly Queue<Task> tasks = new Queue<Task>();
private readonly AutoResetEvent taskWaitHandle = new AutoResetEvent(false);
private bool isFinishing = false;
private bool isFinished = false;
public void Enqueue(Task task)
{
Log.Trace("Entering Enqueue, lock...");
lock(this.tasks)
{
Log.Trace("Adding task, current count = {0}...", Count);
this.tasks.Enqueue(task);
if (Count == 1)
{
Log.Trace("Count = 1, so setting the wait handle...");
this.taskWaitHandle.Set();
}
}
Log.Trace("Exiting enqueue...");
}
public Task Dequeue()
{
Log.Trace("Entering Dequeue...");
if (Count == 0)
{
if (this.isFinishing)
{
Log.Trace("Finishing (before waiting) - isCompleted set, returning empty task.");
this.isFinished = true;
return new Task();
}
Log.Trace("Count = 0, lets wait for a task...");
this.taskWaitHandle.WaitOne();
Log.Trace("Wait handle let us through, Count = {0}, IsFinishing = {1}, Returned = {2}", Count, this.isFinishing);
if(this.isFinishing)
{
Log.Trace("Finishing - isCompleted set, returning empty task.");
this.isFinished = true;
return new Task();
}
}
Log.Trace("Entering task lock...");
lock(this.tasks)
{
Log.Trace("Entered task lock, about to dequeue next item, Count = {0}", Count);
return this.tasks.Dequeue();
}
}
public void Finish()
{
Log.Trace("Setting TaskQueue state to isFinishing = true and setting wait handle...");
this.isFinishing = true;
if (Count == 0)
{
this.taskWaitHandle.Set();
}
}
public IEnumerator<Task> GetEnumerator()
{
while(true)
{
Task t = Dequeue();
if(this.isFinished)
{
yield break;
}
yield return t;
}
}
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
}
As you can see, I'm using an AutoResetEvent object to make sure that the worker threads don't exit prematurely, i.e. before getting any tasks.
In a nutshell:
the main thread assigns a task to a thread by Enqueue-ing a task to its TaskQueue
the main thread notifies the thread that there are no more tasks to execute by calling the TaskQueue's Finish() method
the worker thread retrieves the next task assigned to it by calling the TaskQueue's Dequeue() method
The problem is that the Dequeue() method often throws an InvalidOperationException, saying that the Queue is empty. As you can see, I added some logging, and it turns out that the AutoResetEvent doesn't block the Dequeue(), even though there were no calls to its Set() method.
As I understand it, calling AutoResetEvent.Set() allows one waiting thread (one that previously called AutoResetEvent.WaitOne()) to proceed, and then the event automatically resets, blocking the next waiter.
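For reference, here is a minimal standalone sketch (not part of the queue above) of the behaviour I'm describing, where each Set() releases exactly one waiter:
using System;
using System.Threading;

class AutoResetEventDemo
{
    static void Main()
    {
        AutoResetEvent gate = new AutoResetEvent(false);

        for (int i = 0; i < 3; i++)
        {
            int id = i;
            new Thread(() =>
            {
                gate.WaitOne();                      // blocks until one Set() is consumed
                Console.WriteLine("Thread {0} released", id);
            }).Start();
        }

        Thread.Sleep(500);
        gate.Set();   // releases exactly one of the three waiters, then auto-resets
        Thread.Sleep(500);
        gate.Set();   // releases one more; the last thread keeps waiting for a third Set()
        Thread.Sleep(500);
        gate.Set();
    }
}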
So what can be wrong? Did I get something wrong? Do I have an error somewhere?
I've been sitting on this for 3 hours now, but I cannot figure out what's wrong.
Please help me!
Thank you very much!

Your dequeue code is incorrect. You check the Count under lock, then fly by the seat of your pants, and then you expect the tasks queue to still have something. You cannot retain assumptions made under a lock after you release it :). Your Count check and tasks.Dequeue must occur under the same lock:
bool TryDequeue(out Task task)
{
task = null;
lock (this.tasks) {
if (0 < tasks.Count) {
task = tasks.Dequeue();
}
}
if (null == task) {
Log.Trace ("Queue was empty");
}
return null != task;
}
Your Enqueue() code is similarly riddled with problems. Your Enqueue/Dequeue pair doesn't ensure progress (you can have dequeuing threads blocked waiting even though there are items in the queue). The signature of Enqueue() is wrong too. Overall this is very poor code; frankly, I think you're trying to bite off more than you can chew here... Oh, and never log while holding a lock.
I strongly suggest you just use ConcurrentQueue.
If you don't have access to .NET 4.0, here is an implementation to get you started:
public class ConcurrentQueue<T>:IEnumerable<T>
{
volatile bool fFinished = false;
ManualResetEvent eventAdded = new ManualResetEvent(false);
private Queue<T> queue = new Queue<T>();
private object syncRoot = new object();
public void SetFinished()
{
lock (syncRoot)
{
fFinished = true;
eventAdded.Set();
}
}
public void Enqueue(T t)
{
Debug.Assert (false == fFinished);
lock (syncRoot)
{
queue.Enqueue(t);
eventAdded.Set();
}
}
private bool Dequeue(out T t)
{
do
{
lock (syncRoot)
{
if (0 < queue.Count)
{
t = queue.Dequeue();
return true;
}
if (false == fFinished)
{
eventAdded.Reset ();
}
}
if (false == fFinished)
{
eventAdded.WaitOne();
}
else
{
break;
}
} while (true);
t = default(T);
return false;
}
public IEnumerator<T> GetEnumerator()
{
T t;
while (Dequeue(out t))
{
yield return t;
}
}
System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
}
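Here is a quick usage sketch for the class above (illustrative only; the item type and counts are arbitrary): one producer enqueues work and a few consumer threads drain it through the enumerator.
using System;
using System.Linq;
using System.Threading;

static class ConcurrentQueueDemo
{
    static void Run()
    {
        var queue = new ConcurrentQueue<int>();

        var consumers = Enumerable.Range(0, 3)
            .Select(id => new Thread(() =>
            {
                // blocks inside Dequeue until an item arrives or SetFinished() is called
                foreach (int item in queue)
                    Console.WriteLine("consumer {0} got {1}", id, item);
            }))
            .ToList();
        consumers.ForEach(t => t.Start());

        for (int i = 0; i < 10; i++)
            queue.Enqueue(i);

        queue.SetFinished();          // consumers drain whatever is left and then exit
        consumers.ForEach(t => t.Join());
    }
}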

A more detailed answer from me is pending, but I just want to point out something very important.
If you're using .NET 3.5, you can use the ConcurrentQueue<T> class. A backport is included in the Rx extensions library, which is available for .NET 3.5.
Since you want blocking behavior, you would need to wrap a ConcurrentQueue<T> in a BlockingCollection<T> (also available as part of Rx).
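For illustration, a sketch of that combination, assuming the backported types expose the same API as their .NET 4.0 counterparts (Add, CompleteAdding, GetConsumingEnumerable):
using System;
using System.Collections.Concurrent;
using System.Threading;

static class BlockingCollectionSketch
{
    static void Run()
    {
        // BlockingCollection<T> adds the blocking and "finished" semantics on top of ConcurrentQueue<T>.
        var tasks = new BlockingCollection<string>(new ConcurrentQueue<string>());

        var worker = new Thread(() =>
        {
            // GetConsumingEnumerable blocks when empty and ends after CompleteAdding()
            foreach (string task in tasks.GetConsumingEnumerable())
                Console.WriteLine("executing " + task);
        });
        worker.Start();

        tasks.Add("task 1");
        tasks.Add("task 2");
        tasks.CompleteAdding();   // plays the role of Finish() in the original TaskQueue
        worker.Join();
    }
}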

It looks like you are trying to replicate a blocking queue. One already exists in the .NET 4.0 BCL as BlockingCollection<T>. If .NET 4.0 is not an option for you then you can use the code below. It uses the Monitor.Wait and Monitor.Pulse methods instead of an AutoResetEvent.
public class BlockingCollection<T>
{
private Queue<T> m_Queue = new Queue<T>();
public T Take() // Dequeue
{
lock (m_Queue)
{
while (m_Queue.Count <= 0)
{
Monitor.Wait(m_Queue);
}
return m_Queue.Dequeue();
}
}
public void Add(T data) // Enqueue
{
lock (m_Queue)
{
m_Queue.Enqueue(data);
Monitor.Pulse(m_Queue);
}
}
}
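A usage sketch for the class above (illustrative; note this minimal version has no "finished" signal, so Take() simply blocks until an item arrives):
using System;
using System.Threading;

static class BlockingCollectionDemo
{
    static void Run()
    {
        var queue = new BlockingCollection<string>();

        var worker = new Thread(() =>
        {
            while (true)
            {
                string item = queue.Take();   // blocks while the queue is empty
                Console.WriteLine(item);
            }
        }) { IsBackground = true };           // background thread: dies with the process
        worker.Start();

        queue.Add("first");
        queue.Add("second");
    }
}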
Update:
I am fairly certain that it is not possible to implement a producer-consumer queue using AutoResetEvent if you want it to be thread-safe for multiple producers and multiple consumers (I am prepared to be proven wrong if someone can come up with a counter example). Sure, you will see examples on the internet, but they are all wrong. In fact, one such attempt by Microsoft is flawed in that the queue can get live-locked.

Related

Custom cross-platform thread dispatcher for console / asp.net application

I’ve got a .NET application that needs to call a COM object (it always has to be called from the same thread). As I have multiple threads in the application, I need to invoke an action on another thread.
The application does not have a (standard) message loop and I don’t really like the idea to add WPF / WinForms just to have a Dispatcher.
What would be a safe and effective way to implement a custom "message loop" / queue that allows invoking an Action / Func (with return type) on another thread?
It would also be nice to have a cross-platform solution for this problem.
Based on the information from #theodor-zoulias, I came up with this solution.
Disclaimer: this might actually be a very bad design!
public sealed class DispatcherLoop : IDisposable
{
#region Instance
private DispatcherLoop() { }
static Dictionary<int, DispatcherLoop> dispatcherLoops = new();
public static DispatcherLoop Current
{
get
{
int threadId = Thread.CurrentThread.ManagedThreadId;
if (dispatcherLoops.ContainsKey(threadId))
return dispatcherLoops[threadId];
DispatcherLoop dispatcherLoop = new()
{
ThreadId = Thread.CurrentThread.ManagedThreadId
};
dispatcherLoops.Add(threadId, dispatcherLoop);
return dispatcherLoop;
}
}
#endregion
bool isDisposed = false;
public void Dispose()
{
if (isDisposed)
throw new ObjectDisposedException(null);
_queue.CompleteAdding();
_queue.Dispose();
dispatcherLoops.Remove(ThreadId);
isDisposed = true;
}
public int ThreadId { get; private set; } = -1;
public bool IsRunning { get; private set; } = false;
BlockingCollection<Task> _queue = new();
public void Run()
{
if (isDisposed)
throw new ObjectDisposedException(null);
if (ThreadId != Thread.CurrentThread.ManagedThreadId)
throw new InvalidOperationException($"The {nameof(DispatcherLoop)} has been created for a different thread!");
if (IsRunning)
throw new InvalidOperationException("Already running!");
IsRunning = true;
try
{
// ToDo: `RunSynchronously` is not guaranteed to be executed on this thread (see comments below)!
foreach (var task in _queue.GetConsumingEnumerable())
task?.RunSynchronously();
}
catch (ObjectDisposedException) { }
IsRunning = false;
}
public void BeginInvoke(Task task)
{
if (isDisposed)
throw new ObjectDisposedException(null);
if (!IsRunning)
throw new InvalidOperationException("Not running!");
if (ThreadId == Thread.CurrentThread.ManagedThreadId)
task?.RunSynchronously();
else
_queue.Add(task);
}
public void Invoke(Action action)
{
if (isDisposed)
throw new ObjectDisposedException(null);
Task task = new(action);
BeginInvoke(task);
task.GetAwaiter().GetResult();
}
public T Invoke<T>(Func<T> action)
{
if (isDisposed)
throw new ObjectDisposedException(null);
Task<T> task = new(action);
BeginInvoke(task);
return task.GetAwaiter().GetResult();
}
}
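A usage sketch for the class above (illustrative; the readiness check is deliberately crude):
using System;
using System.Threading;

static class DispatcherLoopDemo
{
    static void Run()
    {
        DispatcherLoop loop = null;

        var owner = new Thread(() =>
        {
            loop = DispatcherLoop.Current;   // creates the loop bound to this thread
            loop.Run();                      // pumps queued work until Dispose()
        });
        owner.Start();

        // crude readiness check for the sketch; a real caller would synchronize properly
        while (loop == null || !loop.IsRunning)
            Thread.Sleep(10);

        int result = loop.Invoke(() => 40 + 2);   // runs on the owner thread, returns the result
        Console.WriteLine(result);
        loop.Invoke(() => Console.WriteLine(
            "running on thread " + Thread.CurrentThread.ManagedThreadId));

        loop.Dispose();   // completes the queue; Run() returns and the owner thread exits
        owner.Join();
    }
}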
You should use Microsoft's Reactive Framework (aka Rx) - NuGet System.Reactive and add using System.Reactive.Concurrency; - then you can do this:
var els = new EventLoopScheduler();
Then you can do things like this:
els.Schedule(() => Console.WriteLine("Hello, World!"));
The EventLoopScheduler spins up its own thread and you can ask it to schedule any work on it you like - it'll always be that thread.
When you're finished with the scheduler, just call els.Dispose() to shut it down cleanly.
There are also plenty of overloads for scheduling code in the future. It's a very powerful class.
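For example, continuing with the els instance above (a sketch; these Schedule overloads are the extension methods in System.Reactive.Concurrency.Scheduler):
// Both actions run on the scheduler's single dedicated thread, the second after a delay.
els.Schedule(() => Console.WriteLine("runs now on the scheduler thread"));
els.Schedule(TimeSpan.FromSeconds(1), () => Console.WriteLine("runs one second later, on the same thread"));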

I feel like I'm re-inventing the wheel. Dispatch work to a specific thread. I want to process events from multiple threads sequentially on one thread

I know that similar things exist in WPF and forms applications with the Control.Invoke method, I also know of the existence of BackgroundWorker, ThreadPool etc.
However, I don't want to depend on Forms/WPF, and I want to make sure work is executed sequentially and on one thread.
Edit: Rationale: I want to drive a state machine from one thread. The events come from other threads though. There is no UI.
So far I couldn't really figure out how to do this with existing framework classes but I might have misunderstood the documentation.
Edit: I forgot to mention I'm bound to .NET Framework 3.5
What I wrote so far:
public class Dispatcher
{
string Name;
Thread WorkerThread;
Queue<Action> WorkQueue;
List<Exception> Exceptions;
ManualResetEvent Gate;
volatile bool KeepRunning;
readonly object WorkLocker;
public override string ToString()
{
return String.Format("{0}({1})", this.GetType().Name, Name);
}
public Dispatcher(string name)
{
Name = name;
WorkLocker = new Object();
Gate = new ManualResetEvent(false);
WorkQueue = new Queue<Action>();
Exceptions = new List<Exception>();
}
public void Start()
{
if (WorkerThread == null)
{
WorkerThread = new Thread(doDispatch)
{
IsBackground = true,
Name = this.Name
};
WorkerThread.Start();
}
}
public void Stop()
{
if (WorkerThread != null && WorkerThread.IsAlive)
{
Dispatch(() => { KeepRunning = false; });
WorkerThread.Join();
}
WorkerThread = null;
}
public void Reset()
{
Stop();
lock (WorkLocker)
{
WorkQueue = new Queue<Action>();
Exceptions = new List<Exception>();
}
}
public void Dispatch(Action a)
{
lock (WorkLocker)
{
WorkQueue.Enqueue(a);
}
Gate.Set();
}
public List<Exception> CollectExceptions()
{
List<Exception> result = new List<Exception>();
lock(WorkLocker)
{
foreach(Exception e in Exceptions)
{
result.Add(e);
}
Exceptions.Clear();
}
return result;
}
private void doDispatch()
{
KeepRunning = true;
while (KeepRunning)
{
Gate.WaitOne();
lock (WorkLocker)
{
while (WorkQueue.Count > 0)
{
try
{
WorkQueue.Dequeue()?.Invoke();
}
catch (Exception e)
{
Exceptions.Add(e);
}
}
}
}
}
}
Is there a way to do something like this in a simpler way? Another nice feature would be being able to dispatch calls that have multiple arguments.
Since you are bound to .NET 3.5 you can't use BlockingCollection or the TPL Dataflow library... you'll have to roll your own implementation.
The sample code you provided is a good start, but you should apply the Single Responsibility Principle to make it cleaner and easier to refactor when (if?) you upgrade the .NET Framework.
I would do it like this:
Create a thread-safe wrapper class around Queue that somewhat mimics BlockingCollection; this answer provides a nice example
Structure your code around a consumer/producer flow and inject the wrapper (a minimal sketch follows below)
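A minimal, .NET 3.5-compatible sketch of that shape (illustrative only; the WorkQueue/Consumer names are mine), using Monitor.Wait/Pulse for the blocking behaviour:
using System;
using System.Collections.Generic;
using System.Threading;

public class WorkQueue
{
    private readonly Queue<Action> queue = new Queue<Action>();
    private bool completed;

    public void Add(Action work)
    {
        lock (queue)
        {
            queue.Enqueue(work);
            Monitor.Pulse(queue);          // wake one waiting consumer
        }
    }

    public void CompleteAdding()
    {
        lock (queue)
        {
            completed = true;
            Monitor.PulseAll(queue);       // wake everyone so they can observe completion
        }
    }

    // Returns false once the queue is drained and completed.
    public bool TryTake(out Action work)
    {
        lock (queue)
        {
            while (queue.Count == 0 && !completed)
                Monitor.Wait(queue);       // releases the lock while waiting
            if (queue.Count > 0)
            {
                work = queue.Dequeue();
                return true;
            }
            work = null;
            return false;
        }
    }
}

public class Consumer
{
    private readonly WorkQueue queue;
    public Consumer(WorkQueue queue) { this.queue = queue; }   // the wrapper is injected

    public void RunLoop()              // run this on the single dedicated thread
    {
        Action work;
        while (queue.TryTake(out work))
            work();
    }
}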

WaitHandle.WaitAny that allows threads to enter orderly

I have a fixed number of "browsers", each of which is not thread safe so it must be used on a single thread. On the other hand, I have a long list of threads waiting to use these browsers. What I'm currently doing is have an AutoResetEvent array:
public readonly AutoResetEvent[] WaitHandles;
And initialize them like this:
WaitHandles = Enumerable.Range(0, Browsers.Count).Select(_ => new AutoResetEvent(true)).ToArray();
So I have one AutoResetEvent per browser, which allows me to retrieve a particular browser index for each thread:
public Context WaitForBrowser(int i)
{
System.Diagnostics.Debug.WriteLine($">>> WILL WAIT: {i}");
var index = WaitHandle.WaitAny(WaitHandles);
System.Diagnostics.Debug.WriteLine($">>> ENTERED: {i}");
return new Context(Browsers[index], WaitHandles[index]);
}
The i here is just the index of the thread waiting, since these threads are on a list and have a particular order. I'm just passing this for debugging purposes. Context is a disposable that then calls Set on the wait handle when disposed.
When I look at my Output I see all my ">>> WILL WAIT: {i}" messages are in the right order, since the calls to WaitForBrowser are made sequentially, but my ">>> ENTERED: {i}" messages are in random order (except for the first few), so they're not entering in the same order they arrive at the var index = WaitHandle.WaitAny(WaitHandles); line.
So my question is, is there any way to modify this so that threads enter in the same order the WaitForBrowser method is called (such that ">>> ENTERED: {i}" messages are also ordered)?
Since there doesn't seem to be an out-of-the-box solution, I ended up using a modified version of this solution:
public class SemaphoreQueueItem<T> : IDisposable
{
private bool Disposed;
private readonly EventWaitHandle WaitHandle;
public readonly T Resource;
public SemaphoreQueueItem(EventWaitHandle waitHandle, T resource)
{
WaitHandle = waitHandle;
Resource = resource;
}
public void Dispose()
{
if (!Disposed)
{
Disposed = true;
WaitHandle.Set();
}
}
}
public class SemaphoreQueue<T> : IDisposable
{
private readonly T[] Resources;
private readonly AutoResetEvent[] WaitHandles;
private bool Disposed;
private ConcurrentQueue<TaskCompletionSource<SemaphoreQueueItem<T>>> Queue = new ConcurrentQueue<TaskCompletionSource<SemaphoreQueueItem<T>>>();
public SemaphoreQueue(T[] resources)
{
Resources = resources;
WaitHandles = Enumerable.Range(0, resources.Length).Select(_ => new AutoResetEvent(true)).ToArray();
}
public SemaphoreQueueItem<T> Wait(CancellationToken cancellationToken)
{
return WaitAsync(cancellationToken).Result;
}
public Task<SemaphoreQueueItem<T>> WaitAsync(CancellationToken cancellationToken)
{
var tcs = new TaskCompletionSource<SemaphoreQueueItem<T>>();
Queue.Enqueue(tcs);
Task.Run(() => WaitHandle.WaitAny(WaitHandles.Concat(new[] { cancellationToken.WaitHandle }).ToArray())).ContinueWith(task =>
{
if (Queue.TryDequeue(out var popped))
{
var index = task.Result;
if (cancellationToken.IsCancellationRequested)
popped.SetResult(null);
else
popped.SetResult(new SemaphoreQueueItem<T>(WaitHandles[index], Resources[index]));
}
});
return tcs.Task;
}
public void Dispose()
{
if (!Disposed)
{
foreach (var handle in WaitHandles)
handle.Dispose();
Disposed = true;
}
}
}
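Usage then looks like this (a sketch; the browser strings are placeholders):
using System;
using System.Threading;

static class SemaphoreQueueDemo
{
    static void Run()
    {
        var browsers = new[] { "browser-0", "browser-1", "browser-2" };
        var pool = new SemaphoreQueue<string>(browsers);   // dispose once all callers are done

        for (int i = 0; i < 10; i++)
        {
            int id = i;
            new Thread(() =>
            {
                // Wait() hands back the first free browser in the order callers queued up;
                // Dispose() returns the browser to the pool.
                using (SemaphoreQueueItem<string> item = pool.Wait(CancellationToken.None))
                {
                    Console.WriteLine("caller {0} using {1}", id, item.Resource);
                    Thread.Sleep(100);
                }
            }).Start();
        }
    }
}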
Have you considered using a Semaphore instead of an array of AutoResetEvents?
The problem with the order of waiting threads (for a semaphore) was discussed here:
Guaranteed semaphore order?
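To make the idea concrete, here is a sketch of the suggestion (illustrative; BrowserPool is my name, and SemaphoreSlim plus ConcurrentBag stand in for the wait-handle array). Note that, just like the original approach, this does not guarantee FIFO ordering of waiters, which is what the linked question discusses:
using System;
using System.Collections.Concurrent;
using System.Threading;

public class BrowserPool<T>
{
    private readonly SemaphoreSlim slots;      // counts free browsers
    private readonly ConcurrentBag<T> browsers;

    public BrowserPool(T[] items)
    {
        browsers = new ConcurrentBag<T>(items);
        slots = new SemaphoreSlim(items.Length, items.Length);
    }

    public T Acquire()
    {
        slots.Wait();                          // blocks until some browser is free
        T browser;
        browsers.TryTake(out browser);         // must succeed: the semaphore guarantees one is available
        return browser;
    }

    public void Release(T browser)
    {
        browsers.Add(browser);
        slots.Release();
    }
}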

Threading Barrier Issue

I have a situation where I need one thread to wait until another thread provides it with data. I've created this class:
public class AsyncObjectWait<T1, TriggerType>
{
// Fields inferred from usage below; they were not shown in the original post.
private readonly Queue<TaskCompletionSource<TriggerType>> _waits = new Queue<TaskCompletionSource<TriggerType>>();
private T1 _value; // the value an incoming trigger must convert to before a wait is released
private void _timeout(TimeSpan timeout, TaskCompletionSource<TriggerType> tcs)
{
System.Threading.Thread.Sleep(timeout);
lock (tcs)
{
if (!tcs.TrySetException(new TimeoutException("Wait Timed Out")))
{
int a = 0;
}
else
{
Console.WriteLine("Timed Out");
}
}
}
public Task<TriggerType> WaitOneAsync(TimeSpan timeout)
{
TaskCompletionSource<TriggerType> wait = new TaskCompletionSource<TriggerType>();
lock (_waits)
{
_waits.Enqueue(wait);
Task.Run(() =>
{
_timeout(timeout, wait);
});
}
return wait.Task;
}
public bool TrySetOne(TriggerType trigger, Converter<TriggerType, T1> converter)
{
TaskCompletionSource<TriggerType> wait;
bool set = false;
lock (_waits)
{
while ((!set) && (!(_waits.Count == 0))) //while we havent completed a task, and as long as we have some more to check.
{
; //get the next wait.
lock (_waits.Peek()) //wait for any other threads to stop using the next wait it, then get exclusive access.
{
wait = _waits.Peek();
if (EqualityComparer<T1>.Default.Equals(converter(trigger), _value))
{
set = wait.TrySetResult(trigger); //try and set the result to complete the task. if we cant, try the next task
_waits.Dequeue();
if (!set)
{
continue;
}
else
{
return true;
}
}
else
{
return false;
}
}
}
return false;
}
}
}
I created this class to do that, as well as to provide a Task-based interface that doesn't freeze my UI while it's waiting.
It seems to have an issue, though. When the Recieve() method is called, it can take up to a second for the task generated by WaitFor to receive the completion info. I haven't been able to find a reason for this.
Could this be due to async method overhead? Is there another, less complicated way to do this?
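For reference, one lighter-weight way to time out a TaskCompletionSource than a sleeping worker task (a sketch, assuming the .NET 4.5 CancellationTokenSource(TimeSpan) constructor is available):
using System;
using System.Threading;
using System.Threading.Tasks;

static class TimeoutSketch
{
    // Completes the TCS with a TimeoutException unless something else completes it first.
    public static Task<T> WithTimeout<T>(TaskCompletionSource<T> tcs, TimeSpan timeout)
    {
        var cts = new CancellationTokenSource(timeout);   // fires once after 'timeout'
        cts.Token.Register(() => tcs.TrySetException(new TimeoutException("Wait Timed Out")));
        // A fuller version would also dispose the CancellationTokenSource once tcs.Task completes.
        return tcs.Task;
    }
}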

Synchronizing thread communication?

Just for the heck of it I'm trying to emulate how JRuby generators work using threads in C#.
Also, I'm fully aware that C# has built in support for yield return, I'm just toying around a bit.
I guess it's some sort of poor man's coroutine, keeping multiple call stacks alive using threads (even though none of the call stacks should execute at the same time).
The idea is like this:
The consumer thread requests a value
The worker thread provides a value and yields back to the consumer thread
Repeat until the worker thread is done
So, what would be the correct way of doing the following?
//example
class Program
{
static void Main(string[] args)
{
ThreadedEnumerator<string> enumerator = new ThreadedEnumerator<string>();
enumerator.Init(() =>
{
for (int i = 1; i < 100; i++)
{
enumerator.Yield(i.ToString());
}
});
foreach (var item in enumerator)
{
Console.WriteLine(item);
};
Console.ReadLine();
}
}
//naive threaded enumerator
public class ThreadedEnumerator<T> : IEnumerator<T>, IEnumerable<T>
{
private Thread enumeratorThread;
private T current;
private bool hasMore = true;
private bool isStarted = false;
AutoResetEvent enumeratorEvent = new AutoResetEvent(false);
AutoResetEvent consumerEvent = new AutoResetEvent(false);
public void Yield(T item)
{
//wait for consumer to request a value
consumerEvent.WaitOne();
//assign the value
current = item;
//signal that we have yielded the requested
enumeratorEvent.Set();
}
public void Init(Action userAction)
{
Action WrappedAction = () =>
{
userAction();
consumerEvent.WaitOne();
enumeratorEvent.Set();
hasMore = false;
};
ThreadStart ts = new ThreadStart(WrappedAction);
enumeratorThread = new Thread(ts);
enumeratorThread.IsBackground = true;
isStarted = false;
}
public T Current
{
get { return current; }
}
public void Dispose()
{
enumeratorThread.Abort();
}
object System.Collections.IEnumerator.Current
{
get { return Current; }
}
public bool MoveNext()
{
if (!isStarted)
{
isStarted = true;
enumeratorThread.Start();
}
//signal that we are ready to receive a value
consumerEvent.Set();
//wait for the enumerator to yield
enumeratorEvent.WaitOne();
return hasMore;
}
public void Reset()
{
throw new NotImplementedException();
}
public IEnumerator<T> GetEnumerator()
{
return this;
}
System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
{
return this;
}
}
Ideas?
There are many ways to implement the producer/consumer pattern in C#.
The best way, I guess, is using TPL (Task, BlockingCollection). See an example here.
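Along those lines, a rough sketch of the example from the question rewritten with those types (illustrative; a bounded capacity of 1 approximates the hand-over-hand yielding):
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        // The producer runs on its own task; BlockingCollection replaces the two AutoResetEvents.
        using (var values = new BlockingCollection<string>(boundedCapacity: 1))
        {
            Task.Factory.StartNew(() =>
            {
                for (int i = 1; i < 100; i++)
                    values.Add(i.ToString());   // blocks until the consumer has taken the previous value
                values.CompleteAdding();        // ends the foreach below
            });

            foreach (var item in values.GetConsumingEnumerable())
                Console.WriteLine(item);
        }
        Console.ReadLine();
    }
}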
