ActiveX in AsyncCallback - C#

While developing a websocket server I initiate the listener as follows:
private void StartAccept()
{
_listener.BeginAcceptTcpClient(new AsyncCallback(HandleAsyncConnection), null);
}
This keeps the accept call non-blocking so the server can accept multiple connections.
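For context, here is a minimal sketch of what the callback side typically looks like (assuming _listener is a TcpListener; the real HandleAsyncConnection also performs the websocket handshake and more):
private void HandleAsyncConnection(IAsyncResult result)
{
    // Re-arm the listener right away so further clients can connect.
    StartAccept();

    // Complete the pending accept and get the connected client.
    TcpClient client = _listener.EndAcceptTcpClient(result);

    // ... perform the websocket handshake and serve this connection ...
}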
One of the functions in the callback should print an HTML page to the selected default printer:
WebBrowser webBrowser = new WebBrowser();
webBrowser.DocumentText = "<html><body><p>I am HTML text!</p></body></html>";
webBrowser.Print();
This fails because I try to create the WebBrowser object from within the AsyncCallback:
ActiveX control '8856f961-340a-11d0-a96b-00c04fd705a2' cannot be instantiated because the current thread is not in a single-threaded apartment.
How can I get the WebBrowser object created here?
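The error means the WebBrowser, being an ActiveX control, must be created on a thread whose apartment state is STA. A minimal sketch of that idea (the solution below wraps the same thing in an STA task scheduler):
// Sketch: do all WebBrowser work on a dedicated STA thread.
var staThread = new Thread(() =>
{
    using (var browser = new WebBrowser())
    {
        // ... set DocumentText, wait for ReadyState, print ...
    }
});
staThread.SetApartmentState(ApartmentState.STA); // must be set before Start()
staThread.IsBackground = true;
staThread.Start();
staThread.Join();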

Thanks to the following answer:
Print html document from Windows Service in C# without print dialog
I was able to get the printing working:
StaTaskScheduler Sta = new StaTaskScheduler(1);
public void PrintHtml(string htmlText)
{
Task.Factory.StartNew(() => PrintOnStaThread(htmlText), CancellationToken.None, TaskCreationOptions.None, Sta).Wait();
}
void PrintOnStaThread(string htmlText)
{
const short PRINT_WAITFORCOMPLETION = 2;
const int OLECMDID_PRINT = 6;
const int OLECMDEXECOPT_DONTPROMPTUSER = 2;
using (var browser = new WebBrowser())
{
browser.DocumentText = htmlText;
while (browser.ReadyState != WebBrowserReadyState.Complete)
Application.DoEvents();
dynamic ie = browser.ActiveXInstance;
ie.ExecWB(OLECMDID_PRINT, OLECMDEXECOPT_DONTPROMPTUSER, PRINT_WAITFORCOMPLETION);
}
}
The StaTaskScheduler comes from the above-mentioned thread:
//--------------------------------------------------------------------------
//
// Copyright (c) Microsoft Corporation. All rights reserved.
//
// File: StaTaskScheduler.cs
//
//--------------------------------------------------------------------------
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
namespace System.Threading.Tasks.Schedulers
{
/// <summary>Provides a scheduler that uses STA threads.</summary>
public sealed class StaTaskScheduler : TaskScheduler, IDisposable
{
/// <summary>Stores the queued tasks to be executed by our pool of STA threads.</summary>
private BlockingCollection<Task> _tasks;
/// <summary>The STA threads used by the scheduler.</summary>
private readonly List<Thread> _threads;
/// <summary>Initializes a new instance of the StaTaskScheduler class with the specified concurrency level.</summary>
/// <param name="numberOfThreads">The number of threads that should be created and used by this scheduler.</param>
public StaTaskScheduler(int numberOfThreads)
{
// Validate arguments
if (numberOfThreads < 1) throw new ArgumentOutOfRangeException("numberOfThreads");
// Initialize the tasks collection
_tasks = new BlockingCollection<Task>();
// Create the threads to be used by this scheduler
_threads = Enumerable.Range(0, numberOfThreads).Select(i =>
{
var thread = new Thread(() =>
{
// Continually get the next task and try to execute it.
// This will continue until the scheduler is disposed and no more tasks remain.
foreach (var t in _tasks.GetConsumingEnumerable())
{
TryExecuteTask(t);
}
});
thread.IsBackground = true;
thread.SetApartmentState(ApartmentState.STA);
return thread;
}).ToList();
// Start all of the threads
_threads.ForEach(t => t.Start());
}
/// <summary>Queues a Task to be executed by this scheduler.</summary>
/// <param name="task">The task to be executed.</param>
protected override void QueueTask(Task task)
{
// Push it into the blocking collection of tasks
_tasks.Add(task);
}
/// <summary>Provides a list of the scheduled tasks for the debugger to consume.</summary>
/// <returns>An enumerable of all tasks currently scheduled.</returns>
protected override IEnumerable<Task> GetScheduledTasks()
{
// Serialize the contents of the blocking collection of tasks for the debugger
return _tasks.ToArray();
}
/// <summary>Determines whether a Task may be inlined.</summary>
/// <param name="task">The task to be executed.</param>
/// <param name="taskWasPreviouslyQueued">Whether the task was previously queued.</param>
/// <returns>true if the task was successfully inlined; otherwise, false.</returns>
protected override bool TryExecuteTaskInline(Task task, bool taskWasPreviouslyQueued)
{
// Try to inline if the current thread is STA
return
Thread.CurrentThread.GetApartmentState() == ApartmentState.STA &&
TryExecuteTask(task);
}
/// <summary>Gets the maximum concurrency level supported by this scheduler.</summary>
public override int MaximumConcurrencyLevel
{
get { return _threads.Count; }
}
/// <summary>
/// Cleans up the scheduler by indicating that no more tasks will be queued.
/// This method blocks until all threads successfully shutdown.
/// </summary>
public void Dispose()
{
if (_tasks != null)
{
// Indicate that no new tasks will be coming in
_tasks.CompleteAdding();
// Wait for all threads to finish processing tasks
foreach (var thread in _threads) thread.Join();
// Cleanup
_tasks.Dispose();
_tasks = null;
}
}
}
}
Now I can print from my AsyncCallback just by calling:
PrintHtml("<html><body><h1>I AM A html text</h1></body></html>");

Related

Start new process under the current debugger session

I'm writing a simple test application (a WPF application in this case, if it matters), which attempts to launch a second application from within it (in this case, a second instance of the same app, but that shouldn't really matter). If the first program is running inside a debugger (in VS2013, in my case), I want the launched secondary instance to be automatically attached to the first instance's debug session.
Right now, I'm using Process.Start to launch the second process, but if I try calling Debugger.Launch within it, it will show the "choose a debugger" window where the current session is explicitly excluded from the list.
Is there a way that I can, from the first process, explicitly launch a second process in the current debugging session, or (failing that) get a handle to the current debugging session and call code to attach to a process? Or, alternatively, a way to get the second process to request that a specific debugger session attach to it?
(I am familiar with various macros or shortcuts within VS to quickly attach to the second process, and I'm using them already. Just wondering if there's a way to have it happen automatically).
The Visual Studio team has released a Visual Studio extension that allows automatically attaching child processes to the current debugger: Introducing the Child Process Debugging Power Tool.
It is available on the Gallery for Visual Studio 2013 and above.
I have personally come up with the following code to attach a new process manually to the current debugger:
using System;
using System.Runtime.InteropServices;
using System.Runtime.InteropServices.ComTypes;
using System.Threading;
using System.Threading.Tasks;
using System.Threading.Tasks.Schedulers;
using EnvDTE;
using EnvDTE80;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
namespace Test {
public static class Debugging {
private static _DTE Dte;
private static readonly object DteLock = new object();
private static bool Initialized;
public static void AttachCurrentDebuggerToProcess(int processId) {
lock (DteLock) {
using (var sta = new StaTaskScheduler(numberOfThreads: 1)) {
Task.Factory.StartNew(() => {
if (System.Threading.Thread.CurrentThread.GetApartmentState() != ApartmentState.STA) throw new NotSupportedException("Thread should be in STA apartment state.");
// Register the IOleMessageFilter to handle any threading errors.
MessageFilter.Register();
if (!Initialized) {
using (var currentProcess = System.Diagnostics.Process.GetCurrentProcess())
using (var vsInstances = System.Diagnostics.Process.GetProcessesByName("devenv").AsDisposable()) {
foreach (var p in vsInstances.Enumerable) {
_DTE dte;
if (TryGetVSInstance(p.Id, out dte)) {
//Will return null if target process doesn't have the same elevated rights as current process.
Utils.Retry(() => {
var debugger = dte?.Debugger;
if (debugger != null) {
foreach (Process2 process in debugger.DebuggedProcesses) {
if (process.ProcessID == currentProcess.Id) {
Dte = dte;
break;
}
}
}
}, nbRetries: int.MaxValue, msInterval: 1000, retryOnlyOnExceptionTypes: typeof(COMException).InArray());
if (Dte != null) break;
}
}
}
Initialized = true;
}
if (Dte != null) {
foreach (Process2 process in Dte.Debugger.LocalProcesses) {
if (process.ProcessID == processId) {
process.Attach2();
Dte.Debugger.CurrentProcess = process;
}
}
}
//turn off the IOleMessageFilter.
MessageFilter.Revoke();
}, CancellationToken.None, TaskCreationOptions.None, sta).Wait();
}
}
}
private static bool TryGetVSInstance(int processId, out _DTE instance) {
IntPtr numFetched = IntPtr.Zero;
IRunningObjectTable runningObjectTable;
IEnumMoniker monikerEnumerator;
IMoniker[] monikers = new IMoniker[1];
GetRunningObjectTable(0, out runningObjectTable);
runningObjectTable.EnumRunning(out monikerEnumerator);
monikerEnumerator.Reset();
while (monikerEnumerator.Next(1, monikers, numFetched) == 0) {
IBindCtx ctx;
CreateBindCtx(0, out ctx);
string runningObjectName;
monikers[0].GetDisplayName(ctx, null, out runningObjectName);
object runningObjectVal;
runningObjectTable.GetObject(monikers[0], out runningObjectVal);
if (runningObjectVal is _DTE && runningObjectName.StartsWith("!VisualStudio")) {
int currentProcessId = int.Parse(runningObjectName.Split(':')[1]);
if (currentProcessId == processId) {
instance = (_DTE)runningObjectVal;
return true;
}
}
}
instance = null;
return false;
}
[DllImport("ole32.dll")]
private static extern void CreateBindCtx(int reserved, out IBindCtx ppbc);
[DllImport("ole32.dll")]
private static extern int GetRunningObjectTable(int reserved, out IRunningObjectTable prot);
}
}
namespace System.Threading.Tasks.Schedulers {
/// <summary>Provides a scheduler that uses STA threads. From ParallelExtensionsExtras https://code.msdn.microsoft.com/Samples-for-Parallel-b4b76364/sourcecode?fileId=44488&pathId=574018573</summary>
public sealed class StaTaskScheduler : TaskScheduler, IDisposable {
/// <summary>Stores the queued tasks to be executed by our pool of STA threads.</summary>
private BlockingCollection<Task> _tasks;
/// <summary>The STA threads used by the scheduler.</summary>
private readonly List<Thread> _threads;
/// <summary>Initializes a new instance of the StaTaskScheduler class with the specified concurrency level.</summary>
/// <param name="numberOfThreads">The number of threads that should be created and used by this scheduler.</param>
public StaTaskScheduler(int numberOfThreads) {
// Validate arguments
if (numberOfThreads < 1) throw new ArgumentOutOfRangeException(nameof(numberOfThreads));
// Initialize the tasks collection
_tasks = new BlockingCollection<Task>();
// Create the threads to be used by this scheduler
_threads = Enumerable.Range(0, numberOfThreads).Select(i =>
{
var thread = new Thread(() => {
// Continually get the next task and try to execute it.
// This will continue until the scheduler is disposed and no more tasks remain.
foreach (var t in _tasks.GetConsumingEnumerable()) {
TryExecuteTask(t);
}
}) { IsBackground = true };
thread.SetApartmentState(ApartmentState.STA);
return thread;
}).ToList();
// Start all of the threads
_threads.ForEach(t => t.Start());
}
/// <summary>Queues a Task to be executed by this scheduler.</summary>
/// <param name="task">The task to be executed.</param>
protected override void QueueTask(Task task) {
// Push it into the blocking collection of tasks
_tasks.Add(task);
}
/// <summary>Provides a list of the scheduled tasks for the debugger to consume.</summary>
/// <returns>An enumerable of all tasks currently scheduled.</returns>
protected override IEnumerable<Task> GetScheduledTasks() {
// Serialize the contents of the blocking collection of tasks for the debugger
return _tasks.ToArray();
}
/// <summary>Determines whether a Task may be inlined.</summary>
/// <param name="task">The task to be executed.</param>
/// <param name="taskWasPreviouslyQueued">Whether the task was previously queued.</param>
/// <returns>true if the task was successfully inlined; otherwise, false.</returns>
protected override bool TryExecuteTaskInline(Task task, bool taskWasPreviouslyQueued) {
// Try to inline if the current thread is STA
return
Thread.CurrentThread.GetApartmentState() == ApartmentState.STA &&
TryExecuteTask(task);
}
/// <summary>Gets the maximum concurrency level supported by this scheduler.</summary>
public override int MaximumConcurrencyLevel => this._threads.Count;
/// <summary>
/// Cleans up the scheduler by indicating that no more tasks will be queued.
/// This method blocks until all threads successfully shutdown.
/// </summary>
public void Dispose() {
if (_tasks != null) {
// Indicate that no new tasks will be coming in
_tasks.CompleteAdding();
// Wait for all threads to finish processing tasks
foreach (var thread in _threads) thread.Join();
// Cleanup
_tasks.Dispose();
_tasks = null;
}
}
}
}
Usage:
AttachCurrentDebuggerToProcess(1234); //where 1234 is your pid
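The MessageFilter referenced above (Register/Revoke) is not included in the snippet; presumably it is the standard IOleMessageFilter retry filter from the MSDN "Application is busy / Call was rejected by callee" sample. For reference, a sketch of that well-known implementation:
public class MessageFilter : IOleMessageFilter
{
    // Install this filter on the current STA thread.
    public static void Register()
    {
        IOleMessageFilter oldFilter;
        CoRegisterMessageFilter(new MessageFilter(), out oldFilter);
    }

    // Remove the filter again.
    public static void Revoke()
    {
        IOleMessageFilter oldFilter;
        CoRegisterMessageFilter(null, out oldFilter);
    }

    int IOleMessageFilter.HandleInComingCall(int dwCallType, IntPtr hTaskCaller, int dwTickCount, IntPtr lpInterfaceInfo)
    {
        return 0; // SERVERCALL_ISHANDLED
    }

    int IOleMessageFilter.RetryRejectedCall(IntPtr hTaskCallee, int dwTickCount, int dwRejectType)
    {
        if (dwRejectType == 2) // SERVERCALL_RETRYLATER: the COM server (devenv) is busy
            return 99;         // retry the call immediately
        return -1;             // cancel the call
    }

    int IOleMessageFilter.MessagePending(IntPtr hTaskCallee, int dwTickCount, int dwPendingType)
    {
        return 2; // PENDINGMSG_WAITDEFPROCESS
    }

    [DllImport("Ole32.dll")]
    private static extern int CoRegisterMessageFilter(IOleMessageFilter newFilter, out IOleMessageFilter oldFilter);
}

[ComImport, Guid("00000016-0000-0000-C000-000000000000"), InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
interface IOleMessageFilter
{
    [PreserveSig] int HandleInComingCall(int dwCallType, IntPtr hTaskCaller, int dwTickCount, IntPtr lpInterfaceInfo);
    [PreserveSig] int RetryRejectedCall(IntPtr hTaskCallee, int dwTickCount, int dwRejectType);
    [PreserveSig] int MessagePending(IntPtr hTaskCallee, int dwTickCount, int dwPendingType);
}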

What is the best way to release a thread that uses BlockingCollection?

I have a class that uses BlockingCollection like this:
public class Logger : IDisposable
{
private BlockingCollection<LogMessage> _messages = null;
private Thread _worker = null;
private bool _started = false;
public void Start()
{
if (_started) return;
//Some logic to open log file
OpenLogFile();
_messages = new BlockingCollection<LogMessage>(); //int.MaxValue is the default upper-bound
_worker = new Thread(Work) { IsBackground = true };
_worker.Start();
_started = true;
}
public void Stop()
{
if (!_started) return;
// prohibit adding new messages to the queue,
// and cause TryTake to return false when the queue becomes empty.
_messages.CompleteAdding();
// Wait for the consumer's thread to finish.
_worker.Join();
//Dispose managed resources (note that Thread has no Dispose method)
_messages.Dispose();
//Some logic to close log file
CloseLogFile();
_started = false;
}
/// <summary>
/// Implements IDisposable
/// In this case, it is simply an alias for Stop()
/// </summary>
void IDisposable.Dispose()
{
Stop();
}
/// <summary>
/// This is message consumer thread
/// </summary>
private void Work()
{
LogMessage message;
//Try to get data from queue
while(_messages.TryTake(out message, Timeout.Infinite))
WriteLogMessage(message); //... some simple logic to write 'message'
}
}
I create a new instance of the Logger class and call its Start() method. If I then forget to call the Dispose method when the instance is no longer referenced, the worker thread will never end. That's a kind of memory leak. Am I right, and how can I overcome this?
You may try to keep only a weak reference to the BlockingCollection in your worker thread, and avoid referencing any object that itself references the BlockingCollection. I made the worker method static to ensure we don't capture the Logger instance through this.
This way the collection can be finalized/collected once it is no longer referenced. I'm not sure it will work; you have to try. It depends on whether TryTake keeps the collection alive or not. It may not work in debug builds because the GC behaves differently there, so try it in release without a debugger attached.
public class Logger : IDisposable
{
private BlockingCollection<LogMessage> _messages = null;
private Thread _worker = null;
private bool _started = false;
public void Start()
{
if (_started) return;
//Some logic to open log file
OpenLogFile();
_messages = new BlockingCollection<LogMessage>(); //int.MaxValue is the default upper-bound
_worker = new Thread(Work) { IsBackground = true };
_worker.Start(new WeakReference<BlockingCollection<LogMessage>>(_messages));
_started = true;
}
public void Stop()
{
if (!_started) return;
// prohibit adding new messages to the queue,
// and cause TryTake to return false when the queue becomes empty.
_messages.CompleteAdding();
// Wait for the consumer's thread to finish.
_worker.Join();
//Dispose managed resources (note that Thread has no Dispose method)
_messages.Dispose();
//Some logic to close log file
CloseLogFile();
_started = false;
}
/// <summary>
/// Implements IDisposable
/// In this case, it is simply an alias for Stop()
/// </summary>
void IDisposable.Dispose()
{
Stop();
}
/// <summary>
/// This is message consumer thread
/// </summary>
private static void Work(object state)
{
LogMessage message;
//Try to get data from queue
while (true)
{
BlockingCollection<LogMessage> messages;
// Re-resolve the weak reference on every message; stop when the
// collection is gone or nothing more can be taken.
if (((WeakReference<BlockingCollection<LogMessage>>)state).TryGetTarget(out messages)
&& messages.TryTake(out message, Timeout.Infinite))
{
WriteLogMessage(message); //... some simple logic to write 'message'
continue;
}
break;
}
}
}
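One caveat: with Timeout.Infinite the local messages variable holds a strong reference to the collection for as long as TryTake blocks, which may prevent it from ever being collected. A hypothetical variation (not part of the answer above) that re-resolves the weak reference between short waits:
private static void Work(object state)
{
    var weakMessages = (WeakReference<BlockingCollection<LogMessage>>)state;
    while (true)
    {
        BlockingCollection<LogMessage> messages;
        if (!weakMessages.TryGetTarget(out messages))
            return; // the collection was collected: the Logger is gone, end the worker

        LogMessage message;
        if (messages.TryTake(out message, TimeSpan.FromSeconds(1)))
            WriteLogMessage(message);
        else if (messages.IsCompleted)
            return; // CompleteAdding() was called and the queue has drained

        messages = null; // drop the strong reference before the next wait
    }
}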

Ensuring execution order between multiple threads

I am having a rather annoying problem, mostly due to my limited experience with C# multithreading.
Here is the background. In my framework, I have a static class named WaitFormHelper, which has two static methods (well... actually more, but they don't matter here), Start() and Close().
The Start() method initializes and starts a thread which acquires a lock on the locker object and creates a WaitForm (a small loading control with a custom message and a progress bar).
In my current project, I have a method which starts a WaitForm, performs calculations, then closes the WaitForm. Nothing fancy at all.
The method looks like the following, simplified as much as possible:
public void PerformCalculations()
{
try
{
WaitFormHelper.Start("Title", "message", false);
if (this.CalculationsParameters.IsInvalid)
{
return;
}
// Perform all those lengthy calculations here
}
// catch whatever exception I may have to catch, we don't care here
finally
{
WaitFormHelper.Close();
}
}
Here are the Start() and Close() methods with the related methods and fields, simplified as well:
private static Thread instanceCaller;
private static WaitForm instance;
private static AutoResetEvent waitFormStarted = new AutoResetEvent(false);
private static object locker = new object();
/// <summary>
/// Initializes WaitForm to start a single task
/// </summary>
/// <param name="header">WaitForm header</param>
/// <param name="message">Message displayed</param>
/// <param name="showProgressBar">True if we want a progress bar, else false</param>
public static void Start(string header, string message, bool showProgressBar)
{
InitializeCallerThread(showProgressBar, header, message);
instanceCaller.Start();
}
/// <summary>
/// Initializes caller thread for executing a single command
/// </summary>
/// <param name="showProgressBar"></param>
/// <param name="header"></param>
/// <param name="message"></param>
private static void InitializeCallerThread(bool showProgressBar, string header, string message)
{
waitFormStarted.Reset();
instanceCaller = new Thread(() =>
{
lock (locker)
{
instance = new WaitForm()
{
Header = header,
Message = message,
IsProgressBarVisible = showProgressBar
};
waitFormStarted.Set();
}
instance.ShowDialog();
});
instanceCaller.Name = "WaitForm thread";
instanceCaller.SetApartmentState(ApartmentState.STA);
instanceCaller.IsBackground = true;
}
/// <summary>
/// Closes current form
/// </summary>
public static void Close()
{
lock (locker)
{
if (instance != null && !instance.IsClosed)
{
waitFormStarted.WaitOne();
instance.FinalizeWork();
instance.Dispatcher.Invoke(
new Action(() =>
{
instance.Close();
}));
}
}
}
Now let's get to the problem
This usually works fine, except in this case:
If this.CalculationsParameters.IsInvalid is true (i.e., as you probably already understood, I can't perform calculations for some reason, such as the user entering garbage in my forms), the execution goes directly to closing my WaitForm.
However, in this case the main thread reaches the Close() method and acquires the lock on the locker object before the thread fired by the Start() method does.
What happens is this: Close() acquires the lock and tries to close the form, but instance is still null because the thread created in InitializeCallerThread is still waiting for the lock to actually create it. Close() releases the lock, the InitializeCallerThread thread acquires it and... shows a WaitForm which will never close.
Now I know I could simply fix this by testing whether the calculation parameters are invalid before actually starting the WaitForm, but the problem is that this WaitForm is supposed to be used by all applications of our framework (40+ different apps used and maintained in 4 countries), so ideally I'd prefer the WaitForm to work in all cases.
Do you have any idea how I could synchronize this to make sure the starter thread is definitely executed first?
As you can see, I already use an AutoResetEvent for this, but if I wait on it before the if (instance != null) test, I will end up stuck in this case.
Hope this is clear enough! Thank you!
You need some mechanism to act as a "queue controller", coordinating steps in the order you want them to occur.
I recently needed to implement something like this to force a specific execution order across several threads in a unit test.
Here is my suggestion:
QueueSynchronizer:
/// <summary>
/// Synchronizes steps between threads.
/// </summary>
public class QueueSynchronizer
{
/// <summary>
/// Constructor.
/// </summary>
/// <param name="minWait">Minimum waiting time until the next try.</param>
/// <param name="maxWait">Maximum waiting time until the next try.</param>
public QueueSynchronizer(Int32 minWait, Int32 maxWait)
{
this.minWait = minWait;
this.maxWait = maxWait;
}
private Mutex mx = new Mutex();
/// <summary>
/// Minimum waiting time until the next try.
/// </summary>
private Int32 minWait = 5;
/// <summary>
/// Maximum waiting time until the next try.
/// </summary>
private Int32 maxWait = 500;
int currentStep = 1;
/// <summary>
/// Key: order in the queue; Value: Time to wait.
/// </summary>
private Dictionary<int, int> waitingTimeForNextMap = new Dictionary<int, int>();
/// <summary>
/// Synchronizes by the order in the queue. It starts from 1. If is not
/// its turn, the thread waits for a moment, after that, it tries again,
/// and so on until its turn.
/// </summary>
/// <param name="orderInTheQueue">Order in the queue. It starts from 1.</param>
/// <returns>The <see cref="Mutex"/>The mutex that must be released at the end of turn.
/// </returns>
public Mutex Sincronize(int orderInTheQueue)
{
do
{
//While it is not this caller's turn, the thread stays in this loop, sleeping with exponential backoff (minWait, 2*minWait, ... up to maxWait ms)
if (orderInTheQueue != this.currentStep)
{
//The next in queue will be waiting here (other threads).
mx.WaitOne();
mx.ReleaseMutex();
//Prevents 100% processing while the current step does not happen
if (!waitingTimeForNextMap.ContainsKey(orderInTheQueue))
{
waitingTimeForNextMap[orderInTheQueue] = this.minWait;
}
Thread.Sleep(waitingTimeForNextMap[orderInTheQueue]);
waitingTimeForNextMap[orderInTheQueue] = Math.Min(waitingTimeForNextMap[orderInTheQueue] * 2, this.maxWait);
}
} while (orderInTheQueue != this.currentStep);
mx.WaitOne();
currentStep++;
return mx;
}
}
Start() and Close() with QueueSynchronizer:
//synchronizer
private static QueueSynchronizer queueSynchronizer;
private static Thread instanceCaller;
private static WaitForm instance;
private static AutoResetEvent waitFormStarted = new AutoResetEvent(false);
private static object locker = new object();
/// <summary>
/// Initializes WaitForm to start a single task
/// </summary>
/// <param name="header">WaitForm header</param>
/// <param name="message">Message displayed</param>
/// <param name="showProgressBar">True if we want a progress bar, else false</param>
public static void Start(string header, string message, bool showProgressBar)
{
queueSynchronizer = new QueueSynchronizer(minWait: 5, maxWait: 500);
InitializeCallerThread(showProgressBar, header, message);
instanceCaller.Start();
}
/// <summary>
/// Initializes caller thread for executing a single command
/// </summary>
/// <param name="showProgressBar"></param>
/// <param name="header"></param>
/// <param name="message"></param>
private static void InitializeCallerThread(bool showProgressBar, string header, string message)
{
waitFormStarted.Reset();
instanceCaller = new Thread(() =>
{
lock (locker)
{
//Queuing to run on first.
Mutex mx = queueSynchronizer.Sincronize(1);
try
{
instance = new WaitForm()
{
Header = header,
Message = message,
IsProgressBarVisible = showProgressBar
};
}
finally
{
//I think this is where the first step ends!?
mx.ReleaseMutex();
}
waitFormStarted.Set();
}
instance.ShowDialog();
});
instanceCaller.Name = "WaitForm thread";
instanceCaller.SetApartmentState(ApartmentState.STA);
instanceCaller.IsBackground = true;
}
/// <summary>
/// Closes current form
/// </summary>
public static void Close()
{
//Queuing to run on second.
Mutex mx = queueSynchronizer.Sincronize(2);
try
{
lock (locker)
{
if (instance != null && !instance.IsClosed)
{
waitFormStarted.WaitOne();
instance.FinalizeWork();
instance.Dispatcher.Invoke(
new Action(() =>
{
instance.Close();
}));
}
}
}
finally
{
mx.ReleaseMutex();
}
}
You need to Join on the thread that you created. By joining on a thread you'll block at that point until the thread has finished executing.
public static void Close()
{
lock (locker)
{
instanceCaller.Join();
if (instance != null && !instance.IsClosed)
{
waitFormStarted.WaitOne();
instance.FinalizeWork();
instance.Dispatcher.Invoke(
new Action(() =>
{
instance.Close();
}));
}
}
}

Using AutoResetEvent to signal worker thread

I have a service that runs constantly, processing data; it receives requests to process new data through messaging. While it is busy processing, new requests get merged together so that they are then all processed at once. An AutoResetEvent is used to notify the processor that a new request is available.
My question is: in EventLoop, should it be possible for currentRequest to be null after the WaitOne?
Is it bad practice to have the _eventAvailable.Set() outside of the lock (_eventLocker)? I moved it out so that the woken thread wouldn't run straight from the WaitOne into immediately contending for the lock (_eventLocker).
Any suggestions on how to better write the following code?
public sealed class RealtimeRunner : MarshalByRefObject
{
/// <summary>
/// The pending request; new requests get merged into it if it is not null
/// </summary>
private Request _pendingRequest;
/// <summary>
/// Used to signal the runner thread when an event is available to process
/// </summary>
private readonly AutoResetEvent _eventAvailable = new AutoResetEvent(false);
private readonly object _eventLocker = new object();
/// <summary>
/// Called on a background thread via messaging
/// </summary>
public void QueueEvent(RealtimeProcessorMessage newRequest)
{
bool mergedRequest;
lock (_eventLocker)
{
if (_pendingRequest == null)
{
mergedRequest = false;
_pendingRequest = new Request(newRequest, _engine);
}
else
{
mergedRequest = true;
_pendingRequest.Merge(newRequest, _engine);
}
}
_eventAvailable.Set();
}
/// <summary>
/// This is running on its own thread
/// </summary>
private void EventLoop()
{
while (true)
{
// Block until something exists in _pendingRequest
_eventAvailable.WaitOne();
Request currentRequest;
lock (_eventLocker)
{
currentRequest = _pendingRequest;
_pendingRequest = null;
}
// CAN THIS EVER BE NULL?
if (currentRequest == null)
continue;
//do stuff with the currentRequest here
}
}
}
Yes, the if (currentRequest == null) could evaluate to true. Consider two threads racing to call _eventAvailable.Set(). One completes the call and the other gets preempted. Meanwhile the EventLoop thread wakes up and completes an entire iteration of the loop. You now have a situation where _pendingRequest is null and the WaitHandle is still waiting to be signaled again.
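Spelled out as one possible interleaving (a sketch, not from the original answer):
Producer A                               Producer B                               EventLoop
lock { _pendingRequest = A }                                                      blocked in WaitOne()
                                         lock { merge B into _pendingRequest }
_eventAvailable.Set()
                                         (preempted just before Set)
                                                                                  WaitOne() returns
                                                                                  lock { take A+B; _pendingRequest = null }
                                                                                  processes, loops, blocks in WaitOne()
                                         _eventAvailable.Set()
                                                                                  WaitOne() returns
                                                                                  lock { currentRequest = null }   <-- the null case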
I want to present an entirely different solution to the problem. It looks like your code could be simplified by using the producer-consumer pattern. This pattern is most easily implemented using a blocking queue. The BlockingCollection class implements such a queue.
public sealed class RealtimeRunner : MarshalByRefObject
{
private BlockingCollection<Request> m_Queue = new BlockingCollection<Request>();
public void QueueEvent(RealtimeProcessorMessage newRequest)
{
m_Queue.Add(new Request(newRequest, _engine));
}
private void EventLoop()
{
while (true)
{
// This blocks until an item appears in the queue.
Request request = m_Queue.Take();
// Process the request here.
}
}
}
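For reference, the same consumer loop can also be written with GetConsumingEnumerable, which additionally gives a clean way to stop the loop by calling CompleteAdding (a sketch, not part of the answer above):
private void EventLoop()
{
    // Blocks while the queue is empty; the loop ends once CompleteAdding()
    // has been called on m_Queue and all remaining items have been taken.
    foreach (Request request in m_Queue.GetConsumingEnumerable())
    {
        // Process the request here.
    }
}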

Is this BlockingQueue susceptible to deadlock?

I've been using this code as a queue that blocks on Dequeue() until an element is enqueued. I've used this code for a few years now in several projects, all with no issues... until now. I'm seeing a deadlock in some code I'm writing now, and in investigating the problem, my 'eye of suspicion' has settled on this BlockingQueue<T>. I can't prove it, so I figured I'd ask some people smarter than me to review it for potential issues. Can you guys see anything that might cause a deadlock in this code?
public class BlockingQueue<T>
{
private readonly Queue<T> _queue;
private readonly ManualResetEvent _event;
/// <summary>
/// Constructor
/// </summary>
public BlockingQueue()
{
_queue = new Queue<T>();
_event = new ManualResetEvent(false);
}
/// <summary>
/// Read-only property to get the size of the queue
/// </summary>
public int Size
{
get
{
int count;
lock (_queue)
{
count = _queue.Count;
}
return count;
}
}
/// <summary>
/// Enqueues element on the queue
/// </summary>
/// <param name="element">Element to enqueue</param>
public void Enqueue(T element)
{
lock (_queue)
{
_queue.Enqueue(element);
_event.Set();
}
}
/// <summary>
/// Dequeues an element from the queue
/// </summary>
/// <returns>Dequeued element</returns>
public T Dequeue()
{
T element;
while (true)
{
if (Size == 0)
{
_event.Reset();
_event.WaitOne();
}
lock (_queue)
{
if (_queue.Count == 0) continue;
element = _queue.Dequeue();
break;
}
}
return element;
}
/// <summary>
/// Clears the queue
/// </summary>
public void Clear()
{
lock (_queue)
{
_queue.Clear();
}
}
}
I think this could be your problem:
Thread 1 (Dequeue)                        Thread 2 (Enqueue)
if (Size == 0)
  lock (_queue)         // Thread 1 gets the lock
                                          lock (_queue)     // Thread 2 has to wait
  return _queue.Count   // Thread 1 sees: Size == 0
                                          _queue.Enqueue()  // Thread 2 gets the lock
                                          _event.Set()
_event.Reset()          // uh oh
_event.WaitOne()        // now Dequeue is going to block
                        // until Enqueue gets called again
                        // (even though the queue isn't empty)
This code is broken in several ways. Here's one scenario. There's a race-condition between if (Size == 0) and _event.Reset(). An Enqueue might fire between the two, and its signal will be lost.
An unbounded-length BlockingQueue is much more easily implemented with a semaphore.
I don't know about your requirements or what else your class does, but if you can use .NET 4, you might want to consider using ConcurrentQueue<T> and BlockingCollection<T> which, used together, should give you a blocking queue.
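For illustration, a minimal sketch of the semaphore idea (my own sketch, not code from the answers): the semaphore's count mirrors the number of queued items, so Dequeue simply waits on the semaphore and there is no Set/Reset race.
public class SemaphoreBlockingQueue<T>
{
    private readonly Queue<T> _queue = new Queue<T>();
    private readonly SemaphoreSlim _itemsAvailable = new SemaphoreSlim(0);

    public void Enqueue(T element)
    {
        lock (_queue)
        {
            _queue.Enqueue(element);
        }
        _itemsAvailable.Release(); // exactly one release per enqueued item
    }

    public T Dequeue()
    {
        _itemsAvailable.Wait(); // blocks until a matching Enqueue has happened
        lock (_queue)
        {
            return _queue.Dequeue();
        }
    }
}
Note there is deliberately no Clear() here; removing items outside Dequeue would break the item-count/semaphore invariant.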
