Multi-threading testing of semaphore-based business logic in C# / MSTest

In my C# / .NET Core project, I have a method like this:
private readonly SemaphoreSlim MySemaphore = new SemaphoreSlim(1, 1);

public async Task<RetrievalStatusReply> InitializeUpdate()
{
    // Quit immediately if the lock is already taken by another thread
    if (!MySemaphore.Wait(0))
    {
        // Could not enter the lock
        return null; // or an appropriate "update already running" reply
    }
    try
    {
        // business logic
    }
    finally
    {
        MySemaphore.Release();
    }
}
and I would like to test its behaviour using some kind of multi-threading setup or test utility library in an MSTest, similar to, for example, Java's ConcurrentUnit.
How could I write such a test for C# / .NET Core?

Testing multithreaded code can be quite messy. My preferred approach is to use ManualResetEvents to control the execution order. This usually requires some way to instrument the code under test, by injecting the test code or by some other means.
So something like this should give a consistent test result:
SemaphoreSlim MySemaphore = new SemaphoreSlim(1, 1);
public ManualResetEventSlim BusinessLogicReached = new ManualResetEventSlim(false);
public ManualResetEventSlim BusinessLogicWaiting = new ManualResetEventSlim(false);

public bool InitializeUpdate()
{
    // Quit immediately if the lock is already taken by another thread
    if (!MySemaphore.Wait(0))
    {
        // Could not enter the lock
        return false;
    }
    try
    {
        BusinessLogicReached.Set();
        BusinessLogicWaiting.Wait();
        return true;
    }
    finally
    {
        MySemaphore.Release();
    }
}
[Test]
public void RunTest()
{
    var updateTask = Task.Run(InitializeUpdate);
    BusinessLogicReached.Wait();
    Assert.IsFalse(InitializeUpdate()); // Semaphore taken
    BusinessLogicWaiting.Set();
    updateTask.Wait();
    Assert.IsTrue(updateTask.Result);
    Assert.IsTrue(InitializeUpdate()); // Semaphore released
}
That said, these kinds of tests are cumbersome to write and maintain, and they do not prove general thread-safety, so it might be more cost-effective to do thorough code reviews of multi-threaded code.

Related

System.Threading.ThreadStateException (maybe a race condition)

First, I would like to describe the general structure of the classes/methods involved in my problem.
I have a class which should start a thread cyclically. This thread runs a function which writes log entries into a log file. I implemented this with a timer (System.Threading.Timer). Furthermore, there is a ThreadHandler class which keeps all threads in a list; these threads are controlled by thread name through the standard functions of System.Threading.Thread. Now to the code that is affected by my problem:
In the constructor of my Log class (LogWriter) I call the method InitializeLoggerThread():
private void InitializeLoggerThread()
{
    LoggerLoggingThread = new System.Threading.Thread(new System.Threading.ThreadStart(WriteLog));
    LoggerLoggingThread.Name = LoggerLogginThreadName;
    ObjectObserver.ThreadHandler.AddThread(LoggerLoggingThread); // Observer class from which all objects can be accessed
}
The timer itself is started as follows:
public void StartLogging()
{
    this.LoggerTimer = new System.Threading.Timer(LoggerCallback, null, 1000, LoggerInterval);
}
Furthermore, the Log class contains the implementation of the timer callback:
private const int LoggerInterval = 5000;
private System.Threading.Thread LoggerLoggingThread;

private static void LoggerCallback(object state)
{
    if (BufferCount > 0)
    {
        ObjectObserver.ThreadHandler.StartThread(LoggerLogginThreadName);
    }
}
The ThreadHandler starts the thread with the following function:
public void StartThread(string threadName)
{
    lock (Locker)
    {
        if (GetThread(threadName).ThreadState == ThreadState.Stopped || GetThread(threadName).ThreadState == ThreadState.Unstarted)
        {
            GetThread(threadName).Start();
        }
    }
}
I have already checked the parameters etc. Everything is correct in this case. Basically, when debugging, it seems that the threads all try to start the logger thread at the same time.
The thread is looked up by its name with the following function:
public Thread GetThread(string threadName)
{
    foreach (Thread thread in Threads)
    {
        if (thread.Name == threadName)
        {
            return thread;
        }
    }
    return null;
}
Now to my question: what is wrong with my construct that I get a
System.Threading.ThreadStateException at StartThread(...)
after the first execution, when the execution is attempted multiple times?
If desired, I can provide a copy&paste code of all important functions for debugging.
A thread passes through a fixed set of states (see the thread state diagram in the .NET documentation).
Once a thread has completed, you cannot start it again. Your timer callback nevertheless tries to start a Stopped thread:
if (GetThread(threadName).ThreadState == ThreadState.Stopped ...
GetThread(threadName).Start();
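One way out (a minimal sketch, assuming the BufferCount, WriteLog and LoggerLogginThreadName members shown in the question; this is not the original poster's code, and the callback is made non-static here so it can reference the instance method WriteLog) is to create a fresh Thread for each timer tick instead of trying to restart the finished one:
private void LoggerCallback(object state)
{
    if (BufferCount > 0)
    {
        // A Thread object can only be started once, so build a new one per tick
        var loggerThread = new System.Threading.Thread(WriteLog)
        {
            Name = LoggerLogginThreadName,
            IsBackground = true
        };
        loggerThread.Start();
    }
}
Alternatively, keep a single long-lived logger thread that waits on an event, and let the timer callback merely signal that event.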
There are excellent logging frameworks in .NET, such as NLog and log4net. Please don't try to reinvent the wheel: most likely these frameworks can already do what you need, and they do it much more efficiently.

How to unit test a thread safe queue

I need a simple data structure with these requirements:
it should behave like a queue,
all the enqueue operations should be atomic.
I have very limited experience with multithreading, but this is what I came up with:
public class Tickets
{
    private ConcurrentQueue<uint> _tickets;

    public Tickets(uint from, uint to)
    {
        Initialize(from, to);
    }

    private readonly object _lock = new object();

    public void Initialize(uint from, uint to)
    {
        lock (_lock)
        {
            _tickets = new ConcurrentQueue<uint>();
            for (uint i = from; i <= to; i++)
            {
                _tickets.Enqueue(i);
            }
        }
    }

    public uint Dequeue()
    {
        uint number;
        if (_tickets.TryDequeue(out number))
        {
            return number;
        }
        throw new ArgumentException("Ticket queue empty!");
    }
}
First question: is this code OK?
Second question: how can I unit test this class (for instance, with two threads which periodically perform dequeue operations on a queue with the elements (1, 2, 3, 4, 5, 6), where the first thread should get only the odd numbers and the second thread only the even ones)? I tried this, but the asserts aren't executing:
[Test]
public void Test()
{
    var tickets = new Tickets(1, 4);
    var t1 = new Thread(() =>
    {
        Assert.AreEqual(1, tickets.Dequeue());
        Thread.Sleep(100);
        Assert.AreEqual(3, tickets.Dequeue());
    });
    var t2 = new Thread(() =>
    {
        Assert.AreEqual(2, tickets.Dequeue());
        Thread.Sleep(100);
        Assert.AreEqual(4, tickets.Dequeue());
    });
    t1.Start();
    t2.Start();
}
I would use CHESS: http://research.microsoft.com/en-us/projects/chess
CHESS is a tool for finding and reproducing Heisenbugs in concurrent programs. CHESS repeatedly runs a concurrent test ensuring that every run takes a different interleaving. If an interleaving results in an error, CHESS can reproduce the interleaving for improved debugging. CHESS is available for both managed and native programs.
The problem with multithreading and unit tests is one of timing. When you introduce multiple threads into unit tests, you run the risk of non-reproducible test results: tests that pass sometimes but not other times.
But just to explain why your asserts may not be executing: the unit test completes before the threads do. It needs to wait for the threads to complete rather than just kicking them off and moving on. It's also possible that the unit test framework itself is not thread-safe, or not capable of handling asserts called from other threads.
Sorry it's not a solution, but I don't know of any automated testing solution for multithreaded code either.
See also: How should I unit test threaded code?
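For completeness, a minimal sketch of the waiting point described above (based on the test from the question, with the assert bodies elided; not a full solution): join both threads before the test method returns, so the asserts actually run within the test.
[Test]
public void Test()
{
    var tickets = new Tickets(1, 4);
    var t1 = new Thread(() => { /* odd-number asserts as above */ });
    var t2 = new Thread(() => { /* even-number asserts as above */ });

    t1.Start();
    t2.Start();

    // Without these joins the test method returns before the worker threads finish,
    // so none of the asserts get a chance to execute (or to fail the test).
    t1.Join();
    t2.Join();
}
Note that an assertion failure thrown on a worker thread may still not be reported by the framework; a more robust variant catches exceptions inside each thread and rethrows them on the test thread after the joins.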

.NET threading like Node.js/V8?

I've been away from .NET desktop programming for some time, while drinking the Node.js koolaid. There are some parts of Node.js I find easy to work with. In particular, I like the simplicity of the threading model, and that I can have a few of the benefits of a multithreaded application while only writing code to keep track of a single thread.
Now, I have a need to write a multi-threaded application in .NET, and it occurred to me that there is no reason I cannot use a similar threading model that is used to build Node.js applications. In particular, I want to:
Call long-running functions with callback parameters. (That function would execute on a thread from a pool. Maybe a simple wrapper function to call functions on new threads would be sufficient?)
Have those callback function calls run on the "main" thread for processing
Maintain automatic synchronization for all objects accessed by this "main" thread, so locking isn't an issue
Does such a framework for this threading model already exist within, or for .NET applications? If not, are there parts of .NET that already support or handle some of the functionality that I am seeking?
As others have mentioned, async / await is an excellent choice for .NET. In particular:
Task / Task<T> / TaskCompletionSource<T> are analogous to JavaScript's Deferred / Promise / Future.
It's pretty easy to create JavaScript-style continuations using .NET-style continuations, but for the most part you won't need them.
There is no JavaScript equivalent to async / await. async allows you to write your methods as though they were synchronous, and under the hood it breaks them up into continuations wherever there's an await. So you don't have to use continuation passing style.
For operations on a background thread, your best choice is Task.Run. However, the standard pattern for .NET is to have the background operation compute and return a single value, instead of having continuous bidirectional messaging with the main thread.
If you do need a "stream" of asynchronous data, you should use TPL Dataflow or Rx. This is where things diverge from JS quite a bit.
I recommend you start with my async / await intro post.
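To make the mapping concrete, here is a small sketch (my own illustration, not taken from the linked post; LongCalculation is a hypothetical CPU-bound helper) of the Node-style "offload the work, continue on the main context" pattern with async / await:
public async Task RunUpdateAsync()
{
    // Starts on the calling (e.g. UI) thread; the lambda runs on a thread-pool thread.
    int result = await Task.Run(() => LongCalculation(5, 6));

    // After the await, execution resumes on the captured synchronization context
    // (the "main" thread in a UI app), so no locking is needed to touch main-thread state.
    Console.WriteLine(result);
}

private int LongCalculation(int a, int b)
{
    Thread.Sleep(500); // simulate expensive work
    return a + b;
}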
I would recommend the TPL. Here’s an example of how it works
void Work()
{
    Task<string> ts = Get(); // Get() is some method returning a Task<string>
    ts.ContinueWith(t =>
    {
        string result = t.Result;
        Console.WriteLine(result);
    });
}
There is a whole range of possibilities for cancellation, error handling, using different schedulers, etc. With .NET 4.5 you also have the option of using await:
async void Work()
{
    Task<string> ts = Get();
    string result = await ts;
    Console.WriteLine(result);
}
Here the compiler looks at methods marked async and adds a whole pile of thread-safe, robust task-synchronization code, while leaving the code readable.
I recommend a look at the TPL (Task Parallel Library), which became available in .NET 4.0. It can do points 1 and 2, but not 3.
See http://msdn.microsoft.com/en-us/library/dd460717.aspx
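For point 2 specifically, a common TPL approach (a sketch, assuming the main thread has a synchronization context, as in a WinForms or WPF app; LongRunningWork and HandleResult are hypothetical) is to schedule the continuation with TaskScheduler.FromCurrentSynchronizationContext():
// Capture the main thread's scheduler while still running on the main thread.
TaskScheduler mainScheduler = TaskScheduler.FromCurrentSynchronizationContext();

Task.Factory.StartNew(() => LongRunningWork())    // runs on a thread-pool thread
    .ContinueWith(t => HandleResult(t.Result),    // continuation runs back on the main thread
                  mainScheduler);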
It can be achieved, among other options, by taking advantage of Windows' native event loop.
The following code is a POC for this, and it addresses all three points you have mentioned.
But note that it is just a POC. It is not type-safe, and it uses Delegate.DynamicInvoke, which can be slow, but it proves the concept nevertheless.
public static class EventLoop
{
    private class EventTask
    {
        public EventTask(Delegate taskHandler) : this(taskHandler, null) {}

        public EventTask(Delegate taskHandler, Delegate callback)
        {
            TaskHandler = taskHandler;
            Callback = callback;
        }

        private Delegate Callback { get; set; }
        private Delegate TaskHandler { get; set; }

        public void Invoke(object param)
        {
            object[] paramArr = null;
            if (param.GetType().Equals(typeof(object[])))
            {
                paramArr = (object[]) param; // So that DynamicInvoke does not complain
            }
            object res = null;
            if (TaskHandler != null)
            {
                if (paramArr != null)
                {
                    res = TaskHandler.DynamicInvoke(paramArr);
                }
                else
                {
                    res = TaskHandler.DynamicInvoke(param);
                }
            }
            if (Callback != null)
            {
                EnqueueSyncTask(Callback, res);
            }
        }
    }

    private static WindowsFormsSynchronizationContext _syncContext;

    public static void Run(Action<string[]> mainProc, string[] args)
    {
        // You need to reference System.Windows.Forms
        _syncContext = new WindowsFormsSynchronizationContext();
        EnqueueSyncTask(mainProc, args);
        Application.Run();
    }

    public static void EnqueueSyncTask(Delegate taskHandler, object param)
    {
        // All these tasks will run one-by-one, in order, on the main thread,
        // either on a call to Application.DoEvents or when the main thread becomes idle
        _syncContext.Post(new EventTask(taskHandler).Invoke, param);
    }

    public static void EnqueueAsyncTask(Delegate taskHandler, object param, Delegate callback)
    {
        // Enqueue on the .NET thread pool
        ThreadPool.QueueUserWorkItem(new EventTask(taskHandler, callback).Invoke, param);
    }
}
Client Code:
[STAThread]
static void Main(string[] args)
{
    Thread.CurrentThread.Name = "MAIN THREAD";
    Console.WriteLine("Method Main: " + Thread.CurrentThread.Name);
    EventLoop.Run(MainProc, args);
}

static void MainProc(string[] args)
{
    Console.WriteLine("Method MainProc: " + Thread.CurrentThread.Name);
    Console.WriteLine("Queuing Long Running Task...");
    EventLoop.EnqueueAsyncTask(new Func<int, int, int>(LongCalculation), new object[] { 5, 6 }, new Action<int>(PrintResult));
    Console.WriteLine("Queued Long Running Task");
    Thread.Sleep(400); // Do more work
    EventLoop.EnqueueAsyncTask(new Func<int, int, int>(LongCalculation), new object[] { 15, 16 }, new Action<int>(PrintResult));
    Thread.Sleep(150); // Do some more work; within this time the 2nd task cannot complete, but the 1st task does

    // Long-running tasks run in the background, but callbacks are executed only when the main thread becomes idle.
    // To execute the callbacks before that, call Application.DoEvents.
    Application.DoEvents(); // PrintResult for the 1st task, as the 2nd is not yet complete
    Console.WriteLine("Method MainProc: Working over-time!!!!");
    Thread.Sleep(500); // After this sleep, the 2nd task's PrintResult will also be called, as the main thread becomes idle
}

static int LongCalculation(int a, int b)
{
    Console.WriteLine("Method LongCalculation, Is Thread Pool Thread: " + Thread.CurrentThread.IsThreadPoolThread);
    Console.WriteLine("Running Long Calculation");
    Thread.Sleep(500); // long calc
    Console.WriteLine("completed Long Calculation");
    return a + b;
}

static void PrintResult(int a)
{
    Console.WriteLine("Method PrintResult: " + Thread.CurrentThread.Name);
    Console.WriteLine("Result: " + a);
    // Continue processing, potentially queuing more long-running tasks
}

Thread-safe buffer for .NET

(Note: Though I would like ideas for the future for .Net 4.0, I'm limited to .Net 3.5 for this project.)
I have a thread, which is reading data asynchronously from an external device (simulated in the code example by the ever-so-creative strSomeData :-) and storing it in a StringBuilder 'buffer' (strBuilderBuffer :-)
In the 'main code' I want to 'nibble' at this 'buffer'. However, I am unsure how to do this in a thread-safe manner from an 'operational' perspective. I understand it is safe from a 'data' perspective because, according to MSDN, "Any public static members of this (StringBuilder) type are thread safe. Any instance members are not guaranteed to be thread safe." However, my code below illustrates that it is possibly not thread-safe from an 'operational' perspective.
The key is that I'm worried about these two lines of code:
string strCurrentBuffer = ThreadWorker_TestThreadSafety_v1a.strBuilderBuffer.ToString();
// Thread 'randomly' slept due to 'inconvenient' comp resource scheduling...
ThreadWorker_TestThreadSafety_v1a.strBuilderBuffer.Length = 0;
If the OS puts my thread to sleep between the 'reading' of the buffer and the 'clearing' of the buffer, I can lose data (which is bad :-(
Is there any way to guarantee the atomicity of those two lines and force the computer not to interrupt them?
With respect to Vlad's suggestion below regarding the use of lock, I tried it but it didn't work (at all really):
public void BufferAnalyze()
{
    String strCurrentBuffer;
    lock (ThreadWorker_TestThreadSafety_v1a.strBuilderBuffer)
    {
        strCurrentBuffer = ThreadWorker_TestThreadSafety_v1a.strBuilderBuffer.ToString();
        Console.WriteLine("[BufferAnalyze()] ||<< Thread 'Randomly' Slept due to comp resource scheduling");
        Thread.Sleep(1000); // Simulate poor timing of thread resourcing...
        ThreadWorker_TestThreadSafety_v1a.strBuilderBuffer.Length = 0;
    }
    Console.WriteLine("[BufferAnalyze()]\r\nstrCurrentBuffer[{0}] == {1}", strCurrentBuffer.Length.ToString(), strCurrentBuffer);
}
Is there a better way of implementing a thread safe buffer?
Here's the full code:
namespace ExploringThreads
{
    /// <summary>
    /// Description of ThreadWorker_TestThreadSafety_v1a
    /// </summary>
    class ThreadWorker_TestThreadSafety_v1a
    {
        private Thread thread;
        public static StringBuilder strBuilderBuffer = new StringBuilder("", 7500);
        public static StringBuilder strBuilderLog = new StringBuilder("", 7500);

        public bool IsAlive
        {
            get { return thread.IsAlive; }
        }

        public ThreadWorker_TestThreadSafety_v1a(string strThreadName)
        {
            // It is possible to have a thread begin execution as soon as it is created.
            // In the case of MyThread this is done by instantiating a Thread object inside MyThread's constructor.
            thread = new Thread(new ThreadStart(this.threadRunMethod));
            thread.Name = strThreadName;
            thread.Start();
        }

        public ThreadWorker_TestThreadSafety_v1a() : this("")
        {
            // NOTE: constructor overloading ^|^
        }

        // Entry point of thread.
        public void threadRunMethod()
        {
            Console.WriteLine("[ThreadWorker_TestThreadSafety_v1a threadRunMethod()]");
            Console.WriteLine(thread.Name + " starting.");
            int intSomeCounter = 0;
            string strSomeData = "";
            do
            {
                Console.WriteLine("[ThreadWorker_TestThreadSafety_v1a threadRunMethod()] running.");
                intSomeCounter++;
                strSomeData = "abcdef" + intSomeCounter.ToString() + "|||";
                strBuilderBuffer.Append(strSomeData);
                strBuilderLog.Append(strSomeData);
                Thread.Sleep(200);
            } while (intSomeCounter < 15);
            Console.WriteLine(thread.Name + " terminating.");
        }
    }

    /// <summary>
    /// Description of BasicThreads_TestThreadSafety_v1a.
    /// </summary>
    public class BasicThreads_TestThreadSafety_v1a
    {
        public BasicThreads_TestThreadSafety_v1a()
        {
        }

        public void BufferAnalyze()
        {
            string strCurrentBuffer = ThreadWorker_TestThreadSafety_v1a.strBuilderBuffer.ToString();
            Console.WriteLine("[BufferAnalyze()] ||<< Thread 'Randomly' Slept due to comp resource scheduling");
            Thread.Sleep(1000); // Simulate poor timing of thread resourcing...
            ThreadWorker_TestThreadSafety_v1a.strBuilderBuffer.Length = 0;
            Console.WriteLine("[BufferAnalyze()]\r\nstrCurrentBuffer[{0}] == {1}", strCurrentBuffer.Length.ToString(), strCurrentBuffer);
        }

        public void TestBasicThreads_TestThreadSafety_v1a()
        {
            Console.Write("Starting TestBasicThreads_TestThreadSafety_v1a >>> Press any key to continue . . . ");
            Console.Read();
            // First, construct a MyThread object.
            ThreadWorker_TestThreadSafety_v1a threadWorker_TestThreadSafety_v1a = new ThreadWorker_TestThreadSafety_v1a("threadWorker_TestThreadSafety_v1a Child");
            do
            {
                Console.WriteLine("[TestBasicThreads_TestThreadSafety_v1a()]");
                Thread.Sleep(750);
                BufferAnalyze();
                //} while (ThreadWorker_TestThreadSafety_v1a.thread.IsAlive);
            } while (threadWorker_TestThreadSafety_v1a.IsAlive);
            BufferAnalyze();
            Thread.Sleep(1250);
            Console.WriteLine("[TestBasicThreads_TestThreadSafety_v1a()]");
            Console.WriteLine("ThreadWorker_TestThreadSafety_v1a.strBuilderLog[{0}] == {1}", ThreadWorker_TestThreadSafety_v1a.strBuilderLog.Length.ToString(), ThreadWorker_TestThreadSafety_v1a.strBuilderLog);
            Console.Write("Completed TestBasicThreads_TestThreadSafety_v1a >>> Press any key to continue . . . ");
            Console.Read();
        }
    }
}
Download the Reactive Extensions backport for 3.5 here. There is also a NuGet package for it. After you have downloaded it, just reference System.Threading.dll in your project.
Now you can use all of the new concurrent collections standard in .NET 4.0 within .NET 3.5 as well. The best one for your situation is BlockingCollection. It is basically a buffer that allows threads to enqueue and dequeue items like a normal queue, except that the dequeue operation blocks until an item is available.
There is no need to use the StringBuilder class at all now. Here is how I would refactor your code. I tried to keep my example short so that it is easier to understand.
public class Example
{
    private BlockingCollection<string> buffer = new BlockingCollection<string>();

    public Example()
    {
        new Thread(ReadFromExternalDevice).Start();
        new Thread(BufferAnalyze).Start();
    }

    private void ReadFromExternalDevice()
    {
        while (true)
        {
            string data = GetFromExternalDevice();
            buffer.Add(data);
            Thread.Sleep(200);
        }
    }

    private void BufferAnalyze()
    {
        while (true)
        {
            string data = buffer.Take(); // This blocks if nothing is in the queue.
            Console.WriteLine(data);
        }
    }
}
For future reference, the BufferBlock<T> class from the TPL Dataflow library does basically the same thing as BlockingCollection. It will be available in .NET 4.5.
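For comparison, a minimal BufferBlock<T> sketch (assuming the System.Threading.Tasks.Dataflow package is referenced; OnDataFromDevice and BufferAnalyzeAsync are hypothetical helpers):
// requires: using System.Threading.Tasks.Dataflow;
private static readonly BufferBlock<string> buffer = new BufferBlock<string>();

// Producer side: post items into the block as they arrive from the device.
static void OnDataFromDevice(string data)
{
    buffer.Post(data);
}

// Consumer side: ReceiveAsync completes once an item becomes available.
static async Task BufferAnalyzeAsync()
{
    while (true)
    {
        string data = await buffer.ReceiveAsync();
        Console.WriteLine(data);
    }
}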
Using StringBuilder this way is not thread safe, but you can switch to ConcurrentQueue<char>.
In case you need other data structure, there are other thread-safe collections in .NET 4, see http://msdn.microsoft.com/en-us/library/dd997305.aspx.
Edit: in .NET 3.5 there are fewer synchronization primitives. You can build a simple solution by adding a lock around a Queue<char>, though it will be less efficient than .NET 4's ConcurrentQueue. Or use the same StringBuilder, again with locking around the read/write operations:
public static StringBuilder strBuilderBuffer = new StringBuilder("", 7500);
private object BufferLock = new object();
...
lock (BufferLock)
    strBuilderBuffer.Append(strSomeData);
...
string strCurrentBuffer;
lock (BufferLock)
{
    strCurrentBuffer = ThreadWorker_TestThreadSafety_v1a.strBuilderBuffer.ToString();
    ThreadWorker_TestThreadSafety_v1a.strBuilderBuffer.Length = 0; // StringBuilder.Clear() requires .NET 4
}
Console.WriteLine("[BufferAnalyze()] ||<< Thread 'Randomly' Slept ...");
Thread.Sleep(1000); // Simulate poor timing of thread resourcing...
Edit:
You cannot guarantee that the OS won't suspend your working thread while it is holding the lock. However, the lock guarantees that other threads cannot interfere and change the buffer as long as one thread is processing it.
That's why you should hold the lock for as short a time as possible:
take the lock, add data, release the lock, -or-
take the lock, copy the data, empty the buffer, release the lock, then start processing the copied data.
If you are doing a lot of reads out of the buffer, perhaps this will help:
http://msdn.microsoft.com/en-us/library/system.threading.readerwriterlock.aspx
Multiple readers are possible, but only one writer.
It is available in .NET 1.X and up...
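A minimal sketch of that idea, using the newer ReaderWriterLockSlim (available from .NET 3.5) rather than the original ReaderWriterLock; PeekBuffer and AppendToBuffer are hypothetical helpers around the buffer field from the question:
private static readonly ReaderWriterLockSlim BufferRwLock = new ReaderWriterLockSlim();

// Many threads may read the buffer concurrently.
public static string PeekBuffer()
{
    BufferRwLock.EnterReadLock();
    try
    {
        return ThreadWorker_TestThreadSafety_v1a.strBuilderBuffer.ToString();
    }
    finally
    {
        BufferRwLock.ExitReadLock();
    }
}

// Only one thread at a time may append to (or clear) the buffer.
public static void AppendToBuffer(string data)
{
    BufferRwLock.EnterWriteLock();
    try
    {
        ThreadWorker_TestThreadSafety_v1a.strBuilderBuffer.Append(data);
    }
    finally
    {
        BufferRwLock.ExitWriteLock();
    }
}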

Producer Consumer queue does not dispose

I have built a producer/consumer queue wrapping a .NET 4.0 ConcurrentQueue, with ManualResetEventSlim signaling between the producing side (Enqueue) and the consuming side (while(true), thread-based).
The queue looks like this:
public class ProducerConsumerQueue<T> : IDisposable, IProducerConsumerQueue<T>
{
    private bool _IsActive = true;

    public int Count
    {
        get
        {
            return this._workerQueue.Count;
        }
    }

    public bool IsActive
    {
        get { return _IsActive; }
        set { _IsActive = value; }
    }

    public event Dequeued<T> OnDequeued = delegate { };
    public event LoggedHandler OnLogged = delegate { };

    private ConcurrentQueue<T> _workerQueue = new ConcurrentQueue<T>();
    private object _locker = new object();
    Thread[] _workers;

    #region IDisposable Members

    int _workerCount = 0;
    ManualResetEventSlim _mres = new ManualResetEventSlim();

    public void Dispose()
    {
        _IsActive = false;
        _mres.Set();
        LogWriter.Write("55555555555");
        // Wait for the consumer threads to finish.
        for (int i = 0; i < _workerCount; i++)
        {
            _workers[i].Join();
        }
        LogWriter.Write("6666666666");
        // Release any OS resources.
    }

    public ProducerConsumerQueue(int workerCount)
    {
        try
        {
            _workerCount = workerCount;
            _workers = new Thread[workerCount];
            // Create and start a separate thread for each worker
            for (int i = 0; i < workerCount; i++)
                (_workers[i] = new Thread(Work)).Start();
        }
        catch (Exception ex)
        {
            OnLogged(ex.Message + ex.StackTrace);
        }
    }

    #endregion

    #region IProducerConsumerQueue<T> Members

    public void EnqueueTask(T task)
    {
        if (_IsActive)
        {
            _workerQueue.Enqueue(task);
            //Monitor.Pulse(_locker);
            _mres.Set();
        }
    }

    public void Work()
    {
        while (_IsActive)
        {
            try
            {
                T item = Dequeue();
                if (item != null)
                    OnDequeued(item);
            }
            catch (Exception ex)
            {
                OnLogged(ex.Message + ex.StackTrace);
            }
        }
    }

    #endregion

    private T Dequeue()
    {
        try
        {
            T dequeueItem;
            //if (_workerQueue.Count > 0)
            //{
            _workerQueue.TryDequeue(out dequeueItem);
            if (dequeueItem != null)
                return dequeueItem;
            //}
            if (_IsActive)
            {
                _mres.Wait();
                _mres.Reset();
            }
            //_workerQueue.TryDequeue(out dequeueItem);
            return dequeueItem;
        }
        catch (Exception ex)
        {
            OnLogged(ex.Message + ex.StackTrace);
            T dequeueItem;
            //if (_workerQueue.Count > 0)
            //{
            _workerQueue.TryDequeue(out dequeueItem);
            return dequeueItem;
        }
    }

    public void Clear()
    {
        _workerQueue = new ConcurrentQueue<T>();
    }
}
When calling Dispose, it sometimes blocks on the Join (with one consuming thread) and the Dispose method gets stuck. I guess it gets stuck on the Wait of the reset event, but to handle that I call Set in Dispose.
Any suggestions?
Update: I understand your point about needing a queue internally. My suggestion to use a BlockingCollection<T> is based on the fact that your code contains a lot of logic to provide the blocking behavior. Writing such logic yourself is very prone to bugs (I know this from experience); so when there's an existing class within the framework that does at least some of the work for you, it's generally preferable to go with that.
A complete example of how you can implement this class using a BlockingCollection<T> is a little bit too large to include in this answer, so I've posted a working example on pastebin.com; feel free to take a look and see what you think.
I also wrote an example program demonstrating the above example here.
Is my code correct? I wouldn't say yes with too much confidence; after all, I haven't written unit tests, run any diagnostics on it, etc. It's just a basic draft to give you an idea how using BlockingCollection<T> instead of ConcurrentQueue<T> cleans up a lot of your logic (in my opinion) and makes it easier to focus on the main purpose of your class (consuming items from a queue and notifying subscribers) rather than a somewhat difficult aspect of its implementation (the blocking behavior of the internal queue).
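For readers who cannot reach the pastebin link, here is a condensed sketch of the same idea (my own illustration, not the original pastebin code, and using Action<T> in place of the original Dequeued<T> delegate for brevity): BlockingCollection<T> supplies the blocking and shutdown behavior, which removes most of the hand-rolled signaling.
public class ProducerConsumerQueueSketch<T> : IDisposable
{
    private readonly BlockingCollection<T> _queue = new BlockingCollection<T>(); // FIFO by default
    private readonly Thread[] _workers;

    public event Action<T> OnDequeued = delegate { };

    public ProducerConsumerQueueSketch(int workerCount)
    {
        _workers = new Thread[workerCount];
        for (int i = 0; i < workerCount; i++)
        {
            _workers[i] = new Thread(Work) { IsBackground = true };
            _workers[i].Start();
        }
    }

    public void EnqueueTask(T task)
    {
        _queue.Add(task);
    }

    private void Work()
    {
        // GetConsumingEnumerable blocks when the queue is empty and
        // completes cleanly once CompleteAdding has been called.
        foreach (T item in _queue.GetConsumingEnumerable())
        {
            OnDequeued(item);
        }
    }

    public void Dispose()
    {
        _queue.CompleteAdding();   // lets the consuming loops drain and exit
        foreach (Thread worker in _workers)
        {
            worker.Join();
        }
        _queue.Dispose();
    }
}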
Question posed in a comment:
Any reason you're not using BlockingCollection<T>?
Your answer:
[...] i needed a queue.
From the MSDN documentation on the default constructor for the BlockingCollection<T> class:
The default underlying collection is a ConcurrentQueue<T>.
If the only reason you opted to implement your own class instead of using BlockingCollection<T> is that you need a FIFO queue, well then... you might want to rethink your decision. A BlockingCollection<T> instantiated using the default parameterless constructor is a FIFO queue.
That said, while I don't think I can offer a comprehensive analysis of the code you've posted, I can at least offer a couple of pointers:
I'd be very hesitant to use events in the way that you are here for a class that deals with such tricky multithreaded behavior. Calling code can attach any event handlers it wants, and these can in turn throw exceptions (which you don't catch), block for long periods of time, or possibly even deadlock for reasons completely outside your control--which is very bad in the case of a blocking queue.
There's a race condition in your Dequeue and Dispose methods.
Look at these lines of your Dequeue method:
if (_IsActive) // point A
{
_mres.Wait(); // point C
_mres.Reset(); // point D
}
And now take a look at these two lines from Dispose:
_IsActive = false;
_mres.Set(); // point B
Let's say you have three threads, T1, T2, and T3. T1 and T2 are both at point A, where each checks _IsActive and finds true. Then Dispose is called, and T3 sets _IsActive to false (but T1 and T2 have already passed point A) and then reaches point B, where it calls _mres.Set(). Then T1 gets to point C, moves on to point D, and calls _mres.Reset(). Now T2 reaches point C and will be stuck forever since _mres.Set will not be called again (any thread executing Enqueue will find _IsActive == false and return immediately, and the thread executing Dispose has already passed point B).
I'd be happy to try and offer some help on solving this race condition, but I'm skeptical that BlockingCollection<T> isn't in fact exactly the class you need for this. If you can provide some more information to convince me that this isn't the case, maybe I'll take another look.
Since _IsActive isn't marked as volatile and there's no lock around all access to it, each core can have its own cached copy of the value, and that cache may never be refreshed. So setting _IsActive to false in Dispose may not actually be seen by all running threads.
http://igoro.com/archive/volatile-keyword-in-c-memory-model-explained/
private volatile bool _IsActive=true;
