Producer-Consumer with a variation - How to synchronize with thread signal/wait? - c#

While working on a large project I realized I was making a lot of calls to be scheduled in the future. Since these were fairly light-weight, I thought it might be better to use a separate scheduler.
ThreadPool.QueueUserWorkItem(() =>
{
    Thread.Sleep(5000);
    Foo(); // Call is to be executed after some time
});
So I created a separate scheduler class that runs on its own thread and executes these events. I have 2 functions that access a shared queue from separate threads. I'd use a lock, but since one of the threads needs to sleep-wait, I wasn't sure how to release the lock.
class Scheduler
{
    SortedDictionary<DateTime, Action> _queue;
    EventWaitHandle _sync;

    // Runs on its own thread
    void Run()
    {
        while (true)
        {
            // Calculate time till first event
            // If queue empty, use a pre-defined value
            TimeSpan timeDiff = _queue.First().Key - DateTime.Now;

            // Execute action if due within the next 100ms
            if (timeDiff < TimeSpan.FromMilliseconds(100))
                ...
            // Otherwise wait on the event handle for that long
            else
                _sync.WaitOne(timeDiff);
        }
    }

    // Can be called by any thread
    void ScheduleEvent(Action action, DateTime time)
    {
        _queue.Add(time, action);
        // Signal thread to wake up and check again
        _sync.Set();
    }
}
The trouble is, I'm not sure how to synchronize access to the queue between the two functions. I can't use a monitor or mutex, because Run() will sleep-wait and thus cause a deadlock. What is the right synchronization mechanism to use here? (If there is a mechanism to atomically start the sleep-wait and immediately release the lock, that might solve my problem.)
How can I verify there is no race condition?
Is this a variation of the producer-consumer problem, or is there a more relevant synchronization problem description?
While this is somewhat geared towards C#, I'd be happy to hear a general solution to this. Thanks!
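As for the parenthetical question: Monitor.Wait is exactly such a mechanism. It atomically releases the lock it is given, blocks (optionally with a timeout), and re-acquires the lock before returning. A minimal sketch of just that behaviour (untested; the names are illustrative, not from the original code):
using System;
using System.Threading;

class MonitorWaitSketch
{
    private readonly object _gate = new object();

    // Consumer side: Wait atomically releases _gate while blocked and
    // re-acquires it before returning (true if pulsed, false on timeout).
    public bool WaitForSignal(TimeSpan timeout)
    {
        lock (_gate)
        {
            return Monitor.Wait(_gate, timeout);
        }
    }

    // Producer side: the same lock must be held when pulsing.
    public void Signal()
    {
        lock (_gate)
        {
            Monitor.Pulse(_gate);
        }
    }
}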

OK, take 2 with Monitor/Pulse.
void Run()
{
    while (true)
    {
        Action doit = null;
        TimeSpan timeDiff;
        lock (_queueLock)
        {
            while (_queue.IsEmpty())
                Monitor.Wait(_queueLock);

            timeDiff = _queue.First().Key - DateTime.Now;
            if (timeDiff < TimeSpan.FromMilliseconds(100))
                doit = _queue.Dequeue();
        }
        if (doit != null)
            doit(); // execute doit
        else
            _sync.WaitOne(timeDiff);
    }
}

void ScheduleEvent(Action action, DateTime time)
{
    lock (_queueLock)
    {
        _queue.Add(time, action);
        // Signal thread to wake up and check again
        _sync.Set();
        if (_queue.Count == 1)
            Monitor.Pulse(_queueLock);
    }
}
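For what it's worth, take 2 still mixes two primitives (Monitor and the EventWaitHandle), and the outer _sync.WaitOne happens after the lock is released. A variant that sticks to Monitor alone might look like the sketch below (untested; it assumes _queue is the SortedDictionary<DateTime, Action> from the question and that System.Linq is in scope for First()):
void Run()
{
    while (true)
    {
        Action doit = null;
        lock (_queueLock)
        {
            while (_queue.Count == 0)
                Monitor.Wait(_queueLock);              // lock is released while waiting

            var first = _queue.First();
            TimeSpan timeDiff = first.Key - DateTime.Now;
            if (timeDiff < TimeSpan.FromMilliseconds(100))
            {
                doit = first.Value;
                _queue.Remove(first.Key);
            }
            else
            {
                // Timed wait; the lock is released here too, so ScheduleEvent can
                // add an earlier item and Pulse to make Run() re-check immediately.
                Monitor.Wait(_queueLock, timeDiff);
            }
        }

        if (doit != null)
            doit();
    }
}

void ScheduleEvent(Action action, DateTime time)
{
    lock (_queueLock)
    {
        _queue.Add(time, action);
        Monitor.Pulse(_queueLock);                     // wake Run() to re-check the head
    }
}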

The problem is easily solved: make sure the WaitOne is outside the lock.
//untested
while (true)
{
    Action doit = null;
    TimeSpan timeDiff;
    // Calculate time till first event
    // If queue empty, use a pre-defined value
    lock (_queueLock)
    {
        timeDiff = _queue.First().Key - DateTime.Now;
        if (timeDiff < TimeSpan.FromMilliseconds(100))
            doit = _queue.Dequeue();
    }
    if (doit != null)
        doit(); // execute it
    else
        _sync.WaitOne(timeDiff);
}
_queueLock is a private helper object.
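The producer side isn't shown, but presumably it takes the same lock and then sets the event, along these lines (untested sketch, same field names as the question):
void ScheduleEvent(Action action, DateTime time)
{
    lock (_queueLock)
    {
        _queue.Add(time, action);
    }
    // Signal outside the lock so the Run() thread can wake up and
    // re-check the head of the queue without contending for _queueLock.
    _sync.Set();
}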

Since your goal is to schedule a task after a particular period of time, why not just use the System.Threading.Timer? It doesn't require dedicating a thread for the scheduling and takes advantage of the OS to wake up a worker thread. I've used this (removed some comments and other timer service functionality):
public sealed class TimerService : ITimerService
{
    public void WhenElapsed(TimeSpan duration, Action callback)
    {
        if (callback == null) throw new ArgumentNullException("callback");

        // Set up state to allow cleanup after the timer completes
        var timerState = new TimerState(callback);
        var timer = new Timer(OnTimerElapsed, timerState, Timeout.Infinite, Timeout.Infinite);
        timerState.Timer = timer;

        // Start the timer
        timer.Change((int)duration.TotalMilliseconds, Timeout.Infinite);
    }

    private void OnTimerElapsed(Object state)
    {
        var timerState = (TimerState)state;
        timerState.Timer.Dispose();
        timerState.Callback();
    }

    private class TimerState
    {
        public Timer Timer { get; set; }
        public Action Callback { get; private set; }

        public TimerState(Action callback)
        {
            Callback = callback;
        }
    }
}
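Usage is then a one-liner; for example (Foo stands in for whatever deferred call you want to make):
var timers = new TimerService();

// Run Foo roughly five seconds from now, without tying up a thread in Sleep.
timers.WhenElapsed(TimeSpan.FromSeconds(5), () => Foo());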

Monitors were created for exactly this kind of situation: simple problems that can cost the application a lot. My solution is very simple, and it also makes a shutdown easy to implement:
void Run()
{
    while (true)
    {
        lock (this)
        {
            int timeToSleep = getTimeToSleep(); // check your list and return a value in milliseconds
            if (timeToSleep <= 100)
            {
                // action...
            }
            else
            {
                DateTime currTime = DateTime.Now;
                int currCount = yourList.Count;
                try
                {
                    do
                    {
                        Monitor.Wait(this, timeToSleep);
                        if (DateTime.Now >= currTime.AddMilliseconds(timeToSleep))
                            break; // time passed
                        else if (yourList.Count != currCount)
                            break; // new element added, go check it
                        currTime = DateTime.Now;
                    } while (true);
                }
                catch (ThreadInterruptedException)
                {
                    // do cleanup code or check for shutdown notification
                }
            }
        }
    }
}

void ScheduleEvent(Action action, DateTime time)
{
    lock (this)
    {
        // yourList.Add ...
        Monitor.Pulse(this);
    }
}

Related

How to make a controlled infinite loop async in c#?

I found this
Run async method regularly with specified interval
which does half of what I want, but I also want to be able to stop the loop whenever I want and then resume it. However, while it's stopped I don't want the infinite loop to keep spinning with the body skipped via a flag.
Basically I don't want this
while (true) {
    if (!paused) {
        // run work
    }
    // task delay
}
because then the while loop still runs.
How can I set it so that while it's paused, nothing executes?
How can I set it so that while it's paused, nothing executes?
That's hard to answer: if you define "pause" as "the object state remains valid while the loop doesn't use any resources", then you'll have to stop and restart the loop.
Everything else, including Thread.Sleep, Task.Delay and timers, will merely put your thread into an idle/suspended state.
If that's not sufficient for your needs, you'll need to actually stop the "infinite" loop.
It will free up thread related resources as well.
More info about sleep:
Thread.Sleep
More about sleep
You could use System.Threading.Timer and dispose of it while it is not in use, then re-create it when you are ready to "resume". These timers are lightweight, so creating and destroying them on demand is not a problem.
private System.Threading.Timer _timer;
public void StartResumeTimer()
{
if(_timer == null)
_timer = new System.Threading.Timer(async (e) => await DoWorkAsync(e), null, 0, 5000);
}
public void StopPauseTimer()
{
_timer?.Dispose();
_timer = null;
}
public async Task DoWorkAsync(object state)
{
await Task.Delay(500); // do some work here, Task.Delay is just something to make the code compile
}
If you are really averse to timers and want it to look like a while loop, then you can use TaskCompletionSource<T>:
private TaskCompletionSource<bool> _paused = null;
public async Task DoWork()
{
while (true)
{
if (_paused != null)
{
await _paused.Task;
_paused = null;
}
//run work
await Task.Delay(100);
}
}
public void Pause()
{
_paused = _paused ?? new TaskCompletionSource<bool>();
}
public void UnPause()
{
_paused?.SetResult(true);
}
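A rough usage sketch of the TaskCompletionSource version (the Worker class name is just a stand-in for whatever type holds DoWork/Pause/UnPause; note that _paused is touched from more than one thread, so treat this as a starting point rather than finished code):
async Task Demo()
{
    var worker = new Worker();
    _ = worker.DoWork();     // start the loop; keep it running in the background

    await Task.Delay(1000);
    worker.Pause();          // the current iteration finishes, then the loop awaits _paused.Task

    await Task.Delay(2000);
    worker.UnPause();        // completes the TaskCompletionSource, so the loop resumes
}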

C#: How do you use Mutex objects to sync multiple Threading.Timers when the threads come from the ThreadPool?

Perhaps it's because I'm trying to write code when I'm tired, but I'm having trouble figuring out how to ensure two different timers don't try to execute at the same time.
This is in a Windows Store (UWP) app. I have two System.Threading.Timer timers. I am using a Mutex to synchronize between them. Most of the time that seems to work fine, but sometimes the timers seem to end up on the same thread (based on looking at Environment.CurrentManagedThreadId). Timer1 claims the mutex, and then Timer2 fires and successfully claims the mutex as well because it's on the same thread. Disaster follows.
Any suggestions on the right way to deal with this requirement?
UPDATE
Adding some code per request. I think I've boiled this down to its essence.
private void StartCheckRefreshTimer() {
if (_timer1 != null) return;
_timer1 = new Timer(Timer1Expired, null, 0,
MagicStrings.FrequencyToCheckForRefresh*1000);
}
private async void Timer1Expired(object state) {
if (!_myMutex.WaitOne(5000))
DebugMutex("Timer1Expired. Timed out waiting for mutex.");
else {
try {
DebugMutex($"Timer1Expired. Claimed the mutex. Thread id: {Environment.CurrentManagedThreadId}");
// await Do something that needs to be in a critical section.
}
finally {
DebugMutex($"Timer1Expired. Releasing the mutex. Thread id: {Environment.CurrentManagedThreadId}");
_myMutex.ReleaseMutex();
}
}
}
private Timer _timer2;
private async void Timer2Expired(object state) {
if (!_myMutex.WaitOne(5000)) {
DebugMutex("Timer2Expired. Timed out waiting for mutex.");
}
else {
try {
DisposeTimer2();
DebugMutex($"Timer2Expired. Claimed the mutex. Thread Id: {Environment.CurrentManagedThreadId}");
// await Do something else that needs to be in a critical section.
}
finally {
DebugMutex($"Timer2Expired. Releasing mutex. Thread Id: {Environment.CurrentManagedThreadId}");
_myMutex.ReleaseMutex();
}
}
}
private void SomethingHappensThatMakesTimer2Start() {
lock (this) {
var interval = MagicStrings.DelayAfterPushRequiredBeforePushing*1000;
if (_timer2 != null) {
DebugMutex($"Resetting timer to go off at {DateTime.Now.AddMilliseconds(interval).ToString("T")}");
_timer2.Change(interval, Timeout.Infinite);
}
else {
DebugMutex($"Creating timer to go off at {DateTime.Now.AddMilliseconds(interval).ToString("T")}");
_timer2 = new Timer(Timer2Expired, null, interval, Timeout.Infinite);
}
}
}
private void DisposeTimer2() {
if (_timer2 != null) {
_timer2.Dispose();
_timer2 = null;
}
}
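No accepted answer was captured here, but the usual suggestion for this kind of problem is to drop the Mutex entirely: a Mutex is reentrant on the same thread and must be released on the thread that acquired it, which interacts badly with async handlers. A SemaphoreSlim(1, 1) has neither restriction. Untested sketch against the same code:
private static readonly SemaphoreSlim _gate = new SemaphoreSlim(1, 1);

private async void Timer1Expired(object state) {
    // WaitAsync(5000) returns false on timeout instead of blocking a pool thread.
    if (!await _gate.WaitAsync(5000)) {
        DebugMutex("Timer1Expired. Timed out waiting for the semaphore.");
        return;
    }
    try {
        // await Do something that needs to be in a critical section.
    }
    finally {
        // Safe even if the continuation resumed on a different thread.
        _gate.Release();
    }
}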

C#: I need the first thread to reach a certain point to prevent other threads from continuing

I have multiple threads working on the same thread-safe function. After X iterations, the first thread to reach firstThread() will execute it and prevent the other threads from continuing until that thread is finished with firstThread(). Only the first thread to reach firstThread() will execute it; the others will not. Kind of like a race: the first one to the finish line is the winner. After firstThread() completes, all threads continue until the limit is reached again. Does anyone have ideas on the best way to accomplish this? It would be greatly appreciated.
private void ThreadBrain()
{
Thread[] tList = new Thread[ThreadCount];
sw.Start();
for (int i = 0; i < tList.Length; i++)
{
tList[i] = new Thread(ThProc);
tList[i].Start();
}
foreach (Thread t in tList)
if (t != null) t.Join();
}
private void ThProc()
{
doWork();
}
private void firstThread()
{
//do some work
loopCount=0;
}
private void doWork()
{
//do some work
loopCount++;
//first thread to reach this point calls firstThread() and prevents other threads from continuing until the current thread completes firstThread()
if (loopCount >= loopLimit) firstThread();
}
This will do it. Only the first thread to enter will change OnlyFirst from 0 to 1 and receive 0 from the Interlocked.CompareExchange. The other threads will fail and receive 1 from Interlocked.CompareExchange and then return.
private int OnlyFirst = 0;

private void doWork()
{
    if (Interlocked.CompareExchange(ref OnlyFirst, 1, 0) != 0)
    {
        return;
    }
    // ... only the first thread gets here; do the work ...
}
// Flag that will only be "true" for the first thread to enter the method.
private bool isFirstThread = true;

// The "Synchronized" option ensures that only one thread can execute the method
// at a time, with the others getting temporarily blocked.
// (Requires using System.Runtime.CompilerServices.)
[MethodImpl(MethodImplOptions.Synchronized)]
private void firstThread()
{
    if (isFirstThread)
    {
        // do some work
        loopCount = 0;
        isFirstThread = false;
    }
}
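Neither snippet above shows the "block the others until the winner finishes" part end to end, so here is one hedged way to combine the ideas (untested sketch; it reuses loopCount, loopLimit and firstThread() from the question, and assumes firstThread() resets loopCount to 0):
private readonly object _limitLock = new object();

private void doWork()
{
    // do some work
    int count = Interlocked.Increment(ref loopCount);

    if (count >= loopLimit)
    {
        // Late arrivals block here until the winning thread finishes firstThread().
        lock (_limitLock)
        {
            // Only the first thread in still sees the limit exceeded;
            // the others see the reset count and skip.
            if (loopCount >= loopLimit)
                firstThread();
        }
    }
}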

Monitor, lock or volatile?

I have a Windows service (.NET 4) that periodically processes a queue, for example every 15 minutes. I use a System.Threading.Timer, set when the service starts, to fire a callback every X milliseconds. Typically each run takes seconds and never collides, but suppose I could not assume that: then I want the next run to exit at once if processing is already in progress.
This is easily solved with a lock, a volatile bool or a monitor, but which is actually appropriate in this scenario, or simply the preferred option in general?
I've found other posts that answers almost this scenario (like Volatile vs. Interlocked vs. lock) but need some advice on extending this to a Timer example with immediate exit.
You don't need any locks for this; just reschedule the next timer execution from within the timer delegate. That ensures 100% no overlaps.
At the end of the timer's event handler, call timer.Change(nextRunInMilliseconds, Timeout.Infinite); that way the timer will fire only once, after nextRunInMilliseconds.
Example:
//Object that holds timer state, and possible additional data
private class TimerState
{
public Timer Timer { get; set; }
public bool Stop { get; set; }
}
public void Run()
{
var timerState = new TimerState();
//Create the timer but don't start it
timerState.Timer = new Timer(OnTimer, timerState, Timeout.Infinite, Timeout.Infinite);
//Start the timer
timerState.Timer.Change(1000, Timeout.Infinite);
}
public void OnTimer(object state)
{
var timerState = (TimerState) state;
try
{
//Do work
}
finally
{
//Reschedule timer
if (!timerState.Stop)
timerState.Timer.Change(1000, Timeout.Infinite);
}
}
Well, any of them will do the job. Monitor is usually pretty simple to use via lock, but you can't use lock in this case because you need to specify a zero timeout; as such, the simplest approach is probably a CompareExchange:
private int isRunning;
...
if(Interlocked.CompareExchange(ref isRunning, 1, 0) == 0) {
try {
// your work
} finally {
Interlocked.Exchange(ref isRunning, 0);
}
}
to do the same with Monitor is:
private readonly object syncLock = new object();
...
bool lockTaken = false;
try {
Monitor.TryEnter(syncLock, 0, ref lockTaken);
if (lockTaken) {
// your work
}
} finally {
if(lockTaken) Monitor.Exit(syncLock);
}
I think that if you find you need to synchronize the timer delegate, you are doing it wrong, and Timer is probably not the class you want to use. IMHO it's better to:
1) either keep the Timer but increase the interval to the point where it's safe to assume there will be no threading issues,
2) or remove the Timer and use a simple Thread instead. You know, something like:
var t = new Thread(() =>
{
    while (!_stopEvent.WaitOne(100))
    {
        // ..........
    }
});
t.Start();

what's wrong with my producer-consumer queue design?

I'm starting with the C# code example here. I'm trying to adapt it for a couple reasons: 1) in my scenario, all tasks will be put in the queue up-front before consumers will start, and 2) I wanted to abstract the worker into a separate class instead of having raw Thread members within the WorkerQueue class.
My queue doesn't seem to dispose of itself, though; it just hangs, and when I break in Visual Studio it's stuck on the _th.Join() line for WorkerThread #1. Also, is there a better way to organize this? Something about exposing the WaitOne() and Join() methods seems wrong, but I couldn't think of an appropriate way to let the WorkerThread interact with the queue.
Also, an aside: if I call q.Start(#) at the top of the using block, only some of the threads ever kick in (e.g. threads 1, 2, and 8 process every task). Why is this? Is it a race condition of some sort, or am I doing something wrong?
using System;
using System.Collections.Generic;
using System.Text;
using System.Messaging;
using System.Threading;
using System.Linq;
namespace QueueTest
{
class Program
{
static void Main(string[] args)
{
using (WorkQueue q = new WorkQueue())
{
q.Finished += new Action(delegate { Console.WriteLine("All jobs finished"); });
Random r = new Random();
foreach (int i in Enumerable.Range(1, 10))
q.Enqueue(r.Next(100, 500));
Console.WriteLine("All jobs queued");
q.Start(8);
}
}
}
class WorkQueue : IDisposable
{
private Queue<int> _jobs = new Queue<int>();
private int _job_count;
private EventWaitHandle _wh = new AutoResetEvent(false);
private object _lock = new object();
private List<WorkerThread> _th;
public event Action Finished;
public WorkQueue()
{
}
public void Start(int num_threads)
{
_job_count = _jobs.Count;
_th = new List<WorkerThread>(num_threads);
foreach (int i in Enumerable.Range(1, num_threads))
{
_th.Add(new WorkerThread(i, this));
_th[_th.Count - 1].JobFinished += new Action<int>(WorkQueue_JobFinished);
}
}
void WorkQueue_JobFinished(int obj)
{
lock (_lock)
{
_job_count--;
if (_job_count == 0 && Finished != null)
Finished();
}
}
public void Enqueue(int job)
{
lock (_lock)
_jobs.Enqueue(job);
_wh.Set();
}
public void Dispose()
{
Enqueue(Int32.MinValue);
_th.ForEach(th => th.Join());
_wh.Close();
}
public int GetNextJob()
{
lock (_lock)
{
if (_jobs.Count > 0)
return _jobs.Dequeue();
else
return Int32.MinValue;
}
}
public void WaitOne()
{
_wh.WaitOne();
}
}
class WorkerThread
{
private Thread _th;
private WorkQueue _q;
private int _i;
public event Action<int> JobFinished;
public WorkerThread(int i, WorkQueue q)
{
_i = i;
_q = q;
_th = new Thread(DoWork);
_th.Start();
}
public void Join()
{
_th.Join();
}
private void DoWork()
{
while (true)
{
int job = _q.GetNextJob();
if (job != Int32.MinValue)
{
Console.WriteLine("Thread {0} Got job {1}", _i, job);
Thread.Sleep(job * 10); // in reality would to actual work here
if (JobFinished != null)
JobFinished(job);
}
else
{
Console.WriteLine("Thread {0} no job available", _i);
_q.WaitOne();
}
}
}
}
}
The worker threads are all blocking on the _q.WaitOne() call in DoWork(). Calling the thread's Join() method will deadlock; the threads never exit. You'll need to add a mechanism to signal the worker threads to exit. A ManualResetEvent, tested with WaitAny in the worker, will get the job done.
One debugging tip: get familiar with the Debug + Windows + Threads window. It lets you switch between threads and look at their call stacks. You'd have quickly found this problem yourself.
You do a WaitOne() at the end of DoWork(), but you never Set() the event after the threads start running.
Note that an AutoResetEvent goes back to the non-signaled state after a successful WaitOne().
The loop in your DoWork method never finishes. This keeps the thread busy forever, so thread.Join() will block forever waiting for it to complete.
You have a WaitOne, but I don't think it's necessary unless there is a reason you want your thread pool to stick around after the work is complete:
private void DoWork()
{
bool done = false;
while (!done)
{
int job = _q.GetNextJob();
if (job != Int32.MinValue)
{
Console.WriteLine("Thread {0} Got job {1}", _i, job);
Thread.Sleep(job * 10); // in reality would to actual work here
if (JobFinished != null)
JobFinished(job);
}
else
{
Console.WriteLine("Thread {0} no job available", _i);
done = true;
}
}
}
If you want the threads to stick around so you don't have to allocate new threads each time WorkQueue.Start is called, you'd have to do something more elaborate with the AutoResetEvent.
Your main problem is the deterministic deadlock described in the other answers.
The correct way to handle it, though, is not to fix the deadlock, but to eliminate the Event altogether.
The whole idea of the producer-consumer model is that clients enqueue and dequeue elements concurrently, and that's why synchronization mechanisms are required. If you're enqueuing all of the elements beforehand and only dequeuing concurrently, you only need a lock around the dequeue, since the "Event" exists to let consumers wait for new elements to be enqueued, which will not happen in your case (based on your description).
Also, the "single responsibility" design principle suggests that the threading code should be separated from the "Blocking Queue" code. Make the "Blocking Queue" a class of its own, then use it in your thread-management class.
