Custom cross-platform thread dispatcher for a console / ASP.NET application - C#

I’ve got a .NET application that needs to call a COM object (it always has to be called from the same thread). As I have multiple threads in the application, I need to invoke an action on another thread.
The application does not have a (standard) message loop and I don’t really like the idea of adding WPF / WinForms just to have a Dispatcher.
What would be a safe and effective way to implement a custom "message loop" / queue that allows invoking an Action / Func (with return type) on another thread?
It would also be nice to have a cross-platform solution for this problem.

Based on the information from Theodor Zoulias, I came up with this solution.
Disclaimer: It may well be that this is actually a very bad design!
public sealed class DispatcherLoop : IDisposable
{
    #region Instance
    private DispatcherLoop() { }

    // The dictionary is shared between threads, so access to it is synchronized.
    static readonly object dispatcherLoopsLock = new();
    static readonly Dictionary<int, DispatcherLoop> dispatcherLoops = new();

    public static DispatcherLoop Current
    {
        get
        {
            int threadId = Thread.CurrentThread.ManagedThreadId;
            lock (dispatcherLoopsLock)
            {
                if (dispatcherLoops.TryGetValue(threadId, out var existing))
                    return existing;

                DispatcherLoop dispatcherLoop = new()
                {
                    ThreadId = threadId
                };
                dispatcherLoops.Add(threadId, dispatcherLoop);
                return dispatcherLoop;
            }
        }
    }
    #endregion

    bool isDisposed = false;

    public void Dispose()
    {
        if (isDisposed)
            throw new ObjectDisposedException(null);

        _queue.CompleteAdding();
        _queue.Dispose();
        lock (dispatcherLoopsLock)
            dispatcherLoops.Remove(ThreadId);
        isDisposed = true;
    }

    public int ThreadId { get; private set; } = -1;

    public bool IsRunning { get; private set; } = false;

    BlockingCollection<Task> _queue = new();

    public void Run()
    {
        if (isDisposed)
            throw new ObjectDisposedException(null);
        if (ThreadId != Thread.CurrentThread.ManagedThreadId)
            throw new InvalidOperationException($"The {nameof(DispatcherLoop)} has been created for a different thread!");
        if (IsRunning)
            throw new InvalidOperationException("Already running!");

        IsRunning = true;
        try
        {
            // ToDo: `RunSynchronously` is not guaranteed to be executed on this thread (see comments below)!
            foreach (var task in _queue.GetConsumingEnumerable())
                task?.RunSynchronously();
        }
        catch (ObjectDisposedException) { }
        IsRunning = false;
    }

    public void BeginInvoke(Task task)
    {
        if (isDisposed)
            throw new ObjectDisposedException(null);
        if (!IsRunning)
            throw new InvalidOperationException("Not running!");

        if (ThreadId == Thread.CurrentThread.ManagedThreadId)
            task?.RunSynchronously();
        else
            _queue.Add(task);
    }

    public void Invoke(Action action)
    {
        if (isDisposed)
            throw new ObjectDisposedException(null);

        Task task = new(action);
        BeginInvoke(task);
        task.GetAwaiter().GetResult();
    }

    public T Invoke<T>(Func<T> action)
    {
        if (isDisposed)
            throw new ObjectDisposedException(null);

        Task<T> task = new(action);
        BeginInvoke(task);
        return task.GetAwaiter().GetResult();
    }
}
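For illustration, here is a rough usage sketch (not part of the original post) showing how the loop might be wired to a dedicated thread. The demo class, thread name, and the SpinUntil guard are placeholders I added, since BeginInvoke/Invoke throw until Run() has actually started:
// Illustrative only: requires using System; using System.Threading; using System.Threading.Tasks;
static class DispatcherLoopDemo
{
    static void Main()
    {
        DispatcherLoop loop = null;
        using var loopReady = new ManualResetEventSlim();

        var loopThread = new Thread(() =>
        {
            loop = DispatcherLoop.Current;   // the loop is bound to this thread
            loopReady.Set();                 // publish the instance to the creator
            loop.Run();                      // blocks here, pumping the queue
        })
        { IsBackground = true, Name = "DispatcherLoopThread" };
        loopThread.Start();

        loopReady.Wait();
        SpinWait.SpinUntil(() => loop.IsRunning);   // Invoke throws "Not running!" before Run() has started

        // Any other thread can now marshal work onto the loop thread:
        int id = loop.Invoke(() => Thread.CurrentThread.ManagedThreadId);
        Console.WriteLine($"Ran on thread {id}, loop thread is {loopThread.ManagedThreadId}");
    }
}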

You should use Microsoft's Reactive Framework (aka Rx): add the System.Reactive NuGet package and a using System.Reactive.Concurrency; directive (plus using System.Reactive.Linq; for the query operators). Then you can do this:
var els = new EventLoopScheduler();
Then you can do things like this:
els.Schedule(() => Console.WriteLine("Hello, World!"));
The EventLoopScheduler spins up its own thread and you can ask it to schedule any work on it you like - it'll always be that thread.
When you're finished with the scheduler, just call els.Dispose() to shut it down cleanly.
There are also plenty of overloads for scheduling code in the future. It's a very powerful class.
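To get a value back from the scheduler's thread (the example above is fire-and-forget), one option is a small sketch like this, assuming the System.Reactive package is referenced:
using System;
using System.Reactive.Concurrency;
using System.Reactive.Linq;

class Program
{
    static void Main()
    {
        var els = new EventLoopScheduler();

        // Fire-and-forget work always lands on the scheduler's single dedicated thread.
        els.Schedule(() => Console.WriteLine("Hello from the event loop thread"));

        // To get a result back, Observable.Start queues the delegate on the scheduler
        // and Wait() blocks the caller until it has run.
        int threadId = Observable.Start(() => Environment.CurrentManagedThreadId, els).Wait();
        Console.WriteLine($"Work ran on the scheduler's thread: {threadId}");

        els.Dispose();   // shuts the scheduler's thread down cleanly
    }
}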

Related

Lock / Monitor.Enter and deadlock scenario

I am trying to debug some old code in a product running on a Windows Embedded Compact 7 and .NET CF 3.5. I do suspect a deadlock scenario but can't figure out the reason. The originating code seems to be a class (code below) encapsulating a thread and an AutoResetEvent. It uses lock and Monitor.Enter to acquire lock for thread safety.
Is there something obvious in the following code potentially causing a deadlock that I would be missing?
//constructor
public SequentialThreadWorker(
string threadName,
ThreadPriority threadPriority)
{
_workerThread = new Thread(OnThreadStart)
{
Name = threadName,
Priority = threadPriority,
IsBackground = true
};
_waitHandle = new AutoResetEvent(false);
_actionsQueue = new Queue<Action>();
_workerThread.Start();
}
//external multi-threaded code can enqueue task using the following method
public void Enqueue(Action action)
{
lock (_lock)
{
_actionsQueue.Enqueue(action);
_waitHandle.Set();
}
}
// ThreadStart code that dequeues actions from the queue and executes them
private void OnThreadStart()
{
while (_run)
{
Action actionData = null;
Monitor.Enter(_lock);
try
{
if (_actionsQueue.Count > 0)
{
actionData = _actionsQueue.Dequeue();
if (_actionsQueue.Count > 0)
_waitHandle.Set();
else
_waitHandle.Reset();
}
}
catch (Exception e)
{
//do log
}
finally
{
Monitor.Exit(_lock);
}
if (actionData != null)
{
try
{
actionData();
}
catch (Exception e)
{
//do log
}
}
_waitHandle.WaitOne();
}
}

Collection was modified; enumeration operation may not execute even though the collection was modified exclusively in lock statements

I have the following base code. The ActionMonitor can be used by anyone, in whatever setting, regardless of single-thread or multi-thread.
using System;
using System.Collections.Generic;
public class ActionMonitor
{
public ActionMonitor()
{
}
private object _lockObj = new object();
public void OnActionEnded()
{
lock (_lockObj)
{
IsInAction = false;
foreach (var trigger in _triggers)
trigger();
_triggers.Clear();
}
}
public void OnActionStarted()
{
IsInAction = true;
}
private ISet<Action> _triggers = new HashSet<Action>();
public void ExecuteAfterAction(Action action)
{
lock (_lockObj)
{
if (IsInAction)
_triggers.Add(action);
else
action();
}
}
public bool IsInAction
{
get;private set;
}
}
On exactly one occasion, when I examined a crash on client's machine, an exception was thrown at:
System.Core: System.InvalidOperationException: Collection was modified; enumeration operation may not execute.
   at System.Collections.Generic.HashSet`1.Enumerator.MoveNext()
   at WPFApplication.ActionMonitor.OnActionEnded()
My reaction when seeing this stack trace: this is unbelievable! This must be a .NET bug!
Because although ActionMonitor can be used in a multithreading setting, the crash above shouldn't occur: all modification of _triggers (the collection) happens inside a lock statement. This guarantees that one cannot iterate over the collection and modify it at the same time.
And if _triggers happened to contain an Action that involves the ActionMonitor, then we might get a deadlock, but it should never crash.
I have seen this crash exactly once, so I can't reproduce the problem at all. But based on my understanding of multithreading and the lock statement, this exception could never have occurred.
Am I missing something here? Or is it known that .NET can behave in a very quirky way when System.Action is involved?
You didn't shield your code against the following call:
private static ActionMonitor _actionMonitor;
static void Main(string[] args)
{
_actionMonitor = new ActionMonitor();
_actionMonitor.OnActionStarted();
_actionMonitor.ExecuteAfterAction(Foo1);
_actionMonitor.ExecuteAfterAction(Foo2);
_actionMonitor.OnActionEnded();
Console.ReadLine();
}
private static void Foo1()
{
_actionMonitor.OnActionStarted();
//Notice that if you would call _actionMonitor.OnActionEnded(); here instead of _actionMonitor.OnActionStarted(); - you would get a StackOverflow Exception
_actionMonitor.ExecuteAfterAction(Foo3);
}
private static void Foo2()
{
}
private static void Foo3()
{
}
FYI - that's the scenario Damien_The_Unbeliever is talking about in the comments.
To fix that issue, the only two things that come to mind are:
1. Don't call it like this; it's your class and your code is calling it, so make sure you stick to your own rules.
2. Take a copy of the _triggers collection and enumerate that.
About point 1, you could track whether OnActionEnded is running and throw an exception if OnActionStarted is called while it is running:
private bool _isRunning = false;
public void OnActionEnded()
{
lock (_lockObj)
{
try
{
_isRunning = true;
IsInAction = false;
foreach (var trigger in _triggers)
trigger();
_triggers.Clear();
}
finally
{
_isRunning = false;
}
}
}
public void OnActionStarted()
{
lock (_lockObj)
{
if (_isRunning)
throw new NotSupportedException();
IsInAction = true;
}
}
About point 2, how about this
public class ActionMonitor
{
public ActionMonitor()
{
}
private object _lockObj = new object();
public void OnActionEnded()
{
lock (_lockObj)
{
IsInAction = false;
// Swap in a fresh set before enumerating, so a trigger can register new actions
// without invalidating the enumeration; keep looping until no new triggers were added.
while (true)
{
var tmpTriggers = _triggers;
_triggers = new HashSet<Action>();
if (tmpTriggers.Count == 0)
break;
foreach (var trigger in tmpTriggers)
trigger();
}
}
}
public void OnActionStarted()
{
lock (_lockObj) //fix the error Eric Lippert talked about in the comments
IsInAction = true;
}
private ISet<Action> _triggers = new HashSet<Action>();
public void ExecuteAfterAction(Action action)
{
lock (_lockObj)
{
if (IsInAction)
_triggers.Add(action);
else
action();
}
}
public bool IsInAction
{
get;private set;
}
}
This guarantees that one cannot iterate over the collection and modifying it at the same time.
No. You have a reentrancy problem.
Consider what happens if inside the call to trigger (same thread, so lock is already held), you modify the collection:
foreach (var trigger in _triggers)
trigger(); // _triggers modified in here
In fact, if you look at your full call stack, you will be able to find the frame that is enumerating the collection (by the time the exception happens, the code that modified the collection has already been popped off the stack).
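One way to make that enumeration robust against reentrancy (a sketch along the lines of the swap approach shown in the previous answer, not part of the original answer) is to snapshot and clear the set before invoking the triggers:
// Sketch only: assumes the ActionMonitor fields defined earlier (_lockObj, _triggers, IsInAction)
// and a using System.Collections.Generic; directive.
public void OnActionEnded()
{
    lock (_lockObj)
    {
        IsInAction = false;
        var snapshot = new List<Action>(_triggers);  // enumerate a copy...
        _triggers.Clear();                           // ...so trigger() may safely re-register actions
        foreach (var trigger in snapshot)
            trigger();
    }
}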

I feel like I'm re-inventing the wheel. Dispatch work to a specific thread. I want to process events from multiple threads sequentially on one thread

I know that similar things exist in WPF and forms applications with the Control.Invoke method, I also know of the existence of BackgroundWorker, ThreadPool etc.
However, I don't want to depend on Forms/WPF, and I want to make sure work is executed sequentially and on one thread.
Edit: Rationale: I want to drive a state machine from one thread. The events come from other threads though. There is no UI.
So far I couldn't really figure out how to do this with existing framework classes but I might have misunderstood the documentation.
Edit: I forgot to mention I'm bound to .NET Framework 3.5
What I wrote so far:
public class Dispatcher
{
string Name;
Thread WorkerThread;
Queue<Action> WorkQueue;
List<Exception> Exceptions;
ManualResetEvent Gate;
volatile bool KeepRunning;
readonly object WorkLocker;
public override string ToString()
{
return String.Format("{0}({1})", this.GetType().Name, Name);
}
public Dispatcher(string name)
{
Name = name;
WorkLocker = new Object();
Gate = new ManualResetEvent(false);
WorkQueue = new Queue<Action>();
Exceptions = new List<Exception>();
}
public void Start()
{
if (WorkerThread == null)
{
WorkerThread = new Thread(doDispatch)
{
IsBackground = true,
Name = this.Name
};
WorkerThread.Start();
}
}
public void Stop()
{
if (WorkerThread != null && WorkerThread.IsAlive)
{
Dispatch(() => { KeepRunning = false; });
WorkerThread.Join();
}
WorkerThread = null;
}
public void Reset()
{
Stop();
lock (WorkLocker)
{
WorkQueue = new Queue<Action>();
Exceptions = new List<Exception>();
}
}
public void Dispatch(Action a)
{
lock (WorkLocker)
{
WorkQueue.Enqueue(a);
}
Gate.Set();
}
public List<Exception> CollectExceptions()
{
List<Exception> result = new List<Exception>();
lock(WorkLocker)
{
foreach(Exception e in Exceptions)
{
result.Add(e);
}
Exceptions.Clear();
}
return result;
}
private void doDispatch()
{
KeepRunning = true;
while (KeepRunning)
{
Gate.WaitOne();
lock (WorkLocker)
{
while (WorkQueue.Count > 0)
{
try
{
WorkQueue.Dequeue()?.Invoke();
}
catch (Exception e)
{
Exceptions.Add(e);
}
}
}
}
}
}
Is there a way to do something like this in a simpler way? Another nice feature would be being able to dispatch calls that have multiple arguments.
Since you are bound to .NET 3.5 you can't use BlockingCollection or the TPL Dataflow library, so you'll have to roll your own implementation.
The sample code you provided is a good start, but you should apply the Single Responsibility Principle to make it cleaner and easier to refactor when (if?) you upgrade the .NET Framework.
I would do it like this:
1. Create a thread-safe wrapper class around Queue<T> that somewhat mimics BlockingCollection<T>; this answer provides a nice example (see the sketch below for the general shape).
2. Structure your code around a producer/consumer flow and inject the wrapper.
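As a rough sketch of what that can look like (the IBlockingQueue<T> abstraction and all names here are my own placeholders, not from the original answer), the Dispatcher shrinks considerably once the blocking behaviour is pushed into the injected wrapper:
// Sketch only, .NET 3.5-friendly.
using System;
using System.Threading;

public interface IBlockingQueue<T>
{
    void Enqueue(T item);
    // Blocks until an item is available; returns false once the queue is completed.
    bool TryDequeue(out T item);
}

public class SimpleDispatcher
{
    private readonly IBlockingQueue<Action> _queue;   // injected wrapper
    private readonly Thread _thread;

    public SimpleDispatcher(string name, IBlockingQueue<Action> queue)
    {
        _queue = queue;
        _thread = new Thread(Loop) { IsBackground = true, Name = name };
        _thread.Start();
    }

    // Multi-argument calls are just closures:
    //   dispatcher.Dispatch(() => HandleEvent(sender, args));
    public void Dispatch(Action work)
    {
        _queue.Enqueue(work);
    }

    private void Loop()
    {
        Action work;
        while (_queue.TryDequeue(out work))
        {
            try { work(); }
            catch (Exception) { /* collect or log, as in the original Dispatcher */ }
        }
    }
}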

AutoResetEvent not blocking properly

I have a thread, which creates a variable number of worker threads and distributes tasks between them. This is solved by passing the threads a TaskQueue object, whose implementation you will see below.
These worker threads simply iterate over the TaskQueue object they were given, executing each task.
private class TaskQueue : IEnumerable<Task>
{
public int Count
{
get
{
lock(this.tasks)
{
return this.tasks.Count;
}
}
}
private readonly Queue<Task> tasks = new Queue<Task>();
private readonly AutoResetEvent taskWaitHandle = new AutoResetEvent(false);
private bool isFinishing = false;
private bool isFinished = false;
public void Enqueue(Task task)
{
Log.Trace("Entering Enqueue, lock...");
lock(this.tasks)
{
Log.Trace("Adding task, current count = {0}...", Count);
this.tasks.Enqueue(task);
if (Count == 1)
{
Log.Trace("Count = 1, so setting the wait handle...");
this.taskWaitHandle.Set();
}
}
Log.Trace("Exiting enqueue...");
}
public Task Dequeue()
{
Log.Trace("Entering Dequeue...");
if (Count == 0)
{
if (this.isFinishing)
{
Log.Trace("Finishing (before waiting) - isCompleted set, returning empty task.");
this.isFinished = true;
return new Task();
}
Log.Trace("Count = 0, lets wait for a task...");
this.taskWaitHandle.WaitOne();
Log.Trace("Wait handle let us through, Count = {0}, IsFinishing = {1}, Returned = {2}", Count, this.isFinishing);
if(this.isFinishing)
{
Log.Trace("Finishing - isCompleted set, returning empty task.");
this.isFinished = true;
return new Task();
}
}
Log.Trace("Entering task lock...");
lock(this.tasks)
{
Log.Trace("Entered task lock, about to dequeue next item, Count = {0}", Count);
return this.tasks.Dequeue();
}
}
public void Finish()
{
Log.Trace("Setting TaskQueue state to isFinishing = true and setting wait handle...");
this.isFinishing = true;
if (Count == 0)
{
this.taskWaitHandle.Set();
}
}
public IEnumerator<Task> GetEnumerator()
{
while(true)
{
Task t = Dequeue();
if(this.isFinished)
{
yield break;
}
yield return t;
}
}
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
}
As you can see, I'm using an AutoResetEvent object to make sure that the worker threads don't exit prematurely, i.e. before getting any tasks.
In a nutshell:
the main thread assigns a task to a thread by Enqueue-ing a task to its TaskQueue
the main thread notifies the thread that there are no more tasks to execute by calling the TaskQueue's Finish() method
the worker thread retrieves the next task assigned to it by calling the TaskQueue's Dequeue() method
The problem is that the Dequeue() method often throws an InvalidOperationException, saying that the Queue is empty. As you can see I added some logging, and it turns out that the AutoResetEvent doesn't block the Dequeue(), even though there were no calls to its Set() method.
As I understand it, calling AutoResetEvent.Set() will allow a waiting thread to proceed (one that previously called AutoResetEvent.WaitOne()), and then automatically calls AutoResetEvent.Reset(), blocking the next waiter.
So what can be wrong? Did I get something wrong? Do I have an error somewhere?
I've been sitting on this for 3 hours now, but I cannot figure out what's wrong.
Please help me!
Thank you very much!
Your dequeue code is incorrect. You check the Count under lock, then fly by the seat of your pants, and then you expect the queue to have something. You cannot retain assumptions while you release the lock :). Your Count check and tasks.Dequeue() must occur under the lock:
bool TryDequeue(out Task task)
{
task = null;
lock (this.tasks) {
if (0 < tasks.Count) {
task = tasks.Dequeue();
}
}
if (null == task) {
Log.Trace ("Queue was empty");
}
return null != task;
}
Your Enqueue() code is similarly riddled with problems. Your Enqueue/Dequeue don't ensure progress (you will have dequeue threads blocked waiting even though there are items in the queue). Your signature of Enqueue() is wrong. Overall your post is very, very poor code. Frankly, I think you're trying to bite off more than you can chew here... Oh, and never log under a lock.
I strongly suggest you just use ConcurrentQueue.
If you don't have access to .NET 4.0, here is an implementation to get you started:
public class ConcurrentQueue<T>:IEnumerable<T>
{
volatile bool fFinished = false;
ManualResetEvent eventAdded = new ManualResetEvent(false);
private Queue<T> queue = new Queue<T>();
private object syncRoot = new object();
public void SetFinished()
{
lock (syncRoot)
{
fFinished = true;
eventAdded.Set();
}
}
public void Enqueue(T t)
{
Debug.Assert (false == fFinished);
lock (syncRoot)
{
queue.Enqueue(t);
eventAdded.Set();
}
}
private bool Dequeue(out T t)
{
do
{
lock (syncRoot)
{
if (0 < queue.Count)
{
t = queue.Dequeue();
return true;
}
if (false == fFinished)
{
eventAdded.Reset ();
}
}
if (false == fFinished)
{
eventAdded.WaitOne();
}
else
{
break;
}
} while (true);
t = default(T);
return false;
}
public IEnumerator<T> GetEnumerator()
{
T t;
while (Dequeue(out t))
{
yield return t;
}
}
System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
}
A more detailed answer from me is pending, but I just want to point out something very important.
If you're using .NET 3.5, you can use the ConcurrentQueue<T> class. A backport is included in the Rx extensions library, which is available for .NET 3.5.
Since you want blocking behavior, you would need to wrap a ConcurrentQueue<T> in a BlockingCollection<T> (also available as part of Rx).
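A minimal sketch of that suggestion (the types come from the Rx backport's System.Threading.dll on .NET 3.5, or from the BCL on .NET 4.0 and later):
using System;
using System.Collections.Concurrent;
using System.Threading;

class Program
{
    static void Main()
    {
        // BlockingCollection wraps the lock-free ConcurrentQueue and adds blocking semantics.
        var queue = new BlockingCollection<Action>(new ConcurrentQueue<Action>());

        var worker = new Thread(() =>
        {
            // GetConsumingEnumerable blocks until an item arrives
            // and ends once CompleteAdding() has been called.
            foreach (var work in queue.GetConsumingEnumerable())
                work();
        }) { IsBackground = true };
        worker.Start();

        queue.Add(() => Console.WriteLine("runs on the worker thread"));
        queue.CompleteAdding();
        worker.Join();
    }
}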
It looks like you are trying to replicate a blocking queue. One already exists in the .NET 4.0 BCL as BlockingCollection. If .NET 4.0 is not an option for you then you can use this code. It uses the Monitor.Wait and Monitor.Pulse methods instead of an AutoResetEvent.
public class BlockingCollection<T>
{
private Queue<T> m_Queue = new Queue<T>();
public T Take() // Dequeue
{
lock (m_Queue)
{
while (m_Queue.Count <= 0)
{
Monitor.Wait(m_Queue);
}
return m_Queue.Dequeue();
}
}
public void Add(T data) // Enqueue
{
lock (m_Queue)
{
m_Queue.Enqueue(data);
Monitor.Pulse(m_Queue);
}
}
}
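A short usage sketch (not part of the original answer); note that this minimal class has no shutdown or completion signal, so the consumer below simply runs as a background thread for the life of the process:
// Producer/consumer usage of the minimal BlockingCollection<T> above.
var queue = new BlockingCollection<Action>();

var consumer = new Thread(() =>
{
    while (true)
    {
        Action work = queue.Take();   // blocks until Add() pulses the monitor
        work();                       // items run sequentially, in FIFO order
    }
}) { IsBackground = true };
consumer.Start();

// Any number of producer threads can enqueue work:
queue.Add(() => Console.WriteLine("first"));
queue.Add(() => Console.WriteLine("second"));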
Update:
I am fairly certain that it is not possible to implement a producer-consumer queue using AutoResetEvent if you want it to be thread-safe for multiple producers and multiple consumers (I am prepared to be proven wrong if someone can come up with a counter example). Sure, you will see examples on the internet, but they are all wrong. In fact, one such attempt by Microsoft is flawed in that the queue can get live-locked.

Synchronizing thread communication?

Just for the heck of it I'm trying to emulate how JRuby generators work using threads in C#.
Also, I'm fully aware that C# has built in support for yield return, I'm just toying around a bit.
I guess it's some sort of poor man's coroutines, keeping multiple call stacks alive using threads (even though none of the call stacks should execute at the same time).
The idea is like this:
The consumer thread requests a value
The worker thread provides a value and yields back to the consumer thread
Repeat until the worker thread is done
So, what would be the correct way of doing the following?
//example
class Program
{
static void Main(string[] args)
{
ThreadedEnumerator<string> enumerator = new ThreadedEnumerator<string>();
enumerator.Init(() =>
{
for (int i = 1; i < 100; i++)
{
enumerator.Yield(i.ToString());
}
});
foreach (var item in enumerator)
{
Console.WriteLine(item);
};
Console.ReadLine();
}
}
//naive threaded enumerator
public class ThreadedEnumerator<T> : IEnumerator<T>, IEnumerable<T>
{
private Thread enumeratorThread;
private T current;
private bool hasMore = true;
private bool isStarted = false;
AutoResetEvent enumeratorEvent = new AutoResetEvent(false);
AutoResetEvent consumerEvent = new AutoResetEvent(false);
public void Yield(T item)
{
//wait for consumer to request a value
consumerEvent.WaitOne();
//assign the value
current = item;
//signal that we have yielded the requested
enumeratorEvent.Set();
}
public void Init(Action userAction)
{
Action WrappedAction = () =>
{
userAction();
consumerEvent.WaitOne();
enumeratorEvent.Set();
hasMore = false;
};
ThreadStart ts = new ThreadStart(WrappedAction);
enumeratorThread = new Thread(ts);
enumeratorThread.IsBackground = true;
isStarted = false;
}
public T Current
{
get { return current; }
}
public void Dispose()
{
enumeratorThread.Abort();
}
object System.Collections.IEnumerator.Current
{
get { return Current; }
}
public bool MoveNext()
{
if (!isStarted)
{
isStarted = true;
enumeratorThread.Start();
}
//signal that we are ready to receive a value
consumerEvent.Set();
//wait for the enumerator to yield
enumeratorEvent.WaitOne();
return hasMore;
}
public void Reset()
{
throw new NotImplementedException();
}
public IEnumerator<T> GetEnumerator()
{
return this;
}
System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
{
return this;
}
}
Ideas?
There are many ways to implement the producer/consumer pattern in C#.
The best way, I guess, is using TPL (Task, BlockingCollection). See an example here.
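For example, a rough sketch of the BlockingCollection approach applied to the generator example above (the bounded capacity of 1 loosely mirrors the one-value-at-a-time hand-off of the AutoResetEvent version):
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        // Capacity 1: the producer blocks until the consumer has taken the previous value.
        var values = new BlockingCollection<string>(boundedCapacity: 1);

        var producer = Task.Run(() =>
        {
            for (int i = 1; i < 100; i++)
                values.Add(i.ToString());   // blocks while the buffer is full
            values.CompleteAdding();        // lets the consuming foreach terminate
        });

        // Runs on the consumer thread, in order, one item at a time.
        foreach (var item in values.GetConsumingEnumerable())
            Console.WriteLine(item);

        producer.Wait();
    }
}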
