Consider this example:
When the user clicks a button, ClassA fires its OnUserInteraction event rapidly, 10 times. ClassB is attached to this event, and in its event handler it calls ClassC's Render method. In the Render method an AxisAngleRotation3D is executed, and each animation lasts 1 second.
In this scenario all 10 AxisAngleRotation3D animations run almost at the same time, but I want them to execute one after another. As I understand threads, I would probably have to implement a thread queue in ClassB, where the Completed event of the AxisAngleRotation3D signals that the next event is allowed to fire...?
Is this correct and how can I achieve this?
Use a task queue. Simply put, have a ConcurrentQueue<Func<bool>> field (or similar) and add tasks to it as necessary. Then have your task-execution thread pop Func<bool> delegates off the queue and invoke them. If a delegate returns true, it's done. If it returns false, add it back onto the queue, as it couldn't complete at that time.
Here's an example:
using System;
using System.Collections.Concurrent;
using System.Threading;
namespace Example
{
public class TaskScheduler : IDisposable
{
public const int IDLE_DELAY = 100;
private readonly ConcurrentQueue<Func<bool>> PendingTasks = new ConcurrentQueue<Func<bool>>();
private Thread ExecuterThread;
private volatile bool _IsDisposed;
public bool IsDisposed
{
get { return _IsDisposed; }
}
public void EnqueueTask(Func<bool> task)
{
PendingTasks.Enqueue(task);
}
public void Start()
{
CheckDisposed();
if (ExecuterThread != null)
{
throw new InvalidOperationException("The task scheduler is already running.");
}
ExecuterThread = new Thread(Run);
ExecuterThread.IsBackground = true;
ExecuterThread.Start();
}
private void CheckDisposed()
{
if (_IsDisposed)
{
throw new ObjectDisposedException("TaskScheduler");
}
}
private void Run()
{
while (!_IsDisposed)
{
if (PendingTasks.IsEmpty)
{
Thread.Sleep(IDLE_DELAY);
continue;
}
Func<bool> task;
while (!PendingTasks.TryDequeue(out task))
{
Thread.Sleep(0);
}
if (!task.Invoke())
{
PendingTasks.Enqueue(task);
}
}
}
public void Dispose()
{
CheckDisposed();
_IsDisposed = true;
}
}
}
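For example, here is a minimal usage sketch of the scheduler above; the one-second check is just a stand-in for an animation's Completed state:
// A delegate that cannot complete yet returns false and is re-queued by the
// scheduler, so it is retried until it finally returns true.
using (var scheduler = new TaskScheduler())
{
    scheduler.Start();

    var finishAt = DateTime.UtcNow.AddSeconds(1);
    scheduler.EnqueueTask(() =>
    {
        // Stand-in for "has the 1-second animation finished yet?"
        if (DateTime.UtcNow < finishAt)
        {
            return false; // not done yet: the scheduler re-queues this delegate
        }
        Console.WriteLine("Task completed after roughly one second.");
        return true;
    });

    Console.ReadLine(); // keep the process alive while the background thread works
}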
ClassB could add the events to a queue and then render them one at a time (possibly using a timer to read from the queue), as sketched below.
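A rough sketch of that timer idea in WPF terms, assuming the setup from the question: ClassA raises OnUserInteraction on the UI thread (it comes from a button click), ClassC exposes a Render method (its signature here is an assumption), and one tick per second matches the 1-second animation length:
using System;
using System.Collections.Generic;
using System.Windows.Threading;

public class ClassB
{
    // Both the click event and the DispatcherTimer run on the UI thread,
    // so a plain Queue is sufficient here.
    private readonly Queue<EventArgs> pending = new Queue<EventArgs>();
    private readonly DispatcherTimer timer =
        new DispatcherTimer { Interval = TimeSpan.FromSeconds(1) };
    private readonly ClassC renderer = new ClassC();

    public ClassB(ClassA source)
    {
        source.OnUserInteraction += (s, e) => pending.Enqueue(e);
        timer.Tick += (s, e) =>
        {
            if (pending.Count > 0)
            {
                renderer.Render(pending.Dequeue()); // starts one 1-second animation
            }
        };
        timer.Start();
    }
}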
I would like to use a condition variable to know when the message queue is not empty; I would like to use it in HandleMessageQueue, which runs as a thread.
private static Queue<Message> messages = new Queue<Message>();
/// <summary>
/// function return the first message
/// </summary>
/// <returns>first message element</returns>
public static Message GetFirst()
{
return messages.Dequeue();
}
in another class:
/// <summary>
/// Function run while the clients connected and handle the queue message
/// </summary>
public static void HandleMessageQueue()
{
// ...
}
What you're probably looking for is a simple producer-consumer pattern. In this case I'd recommend using .NET's BlockingCollection, which allows you to easily handle the following cases:
have one thread push stuff in a queue
have another thread block until stuff is available
make the whole thing easy to shutdown without having to forcibly terminate the thread
Here's a short code sample, read the comments for more information about what every bit does:
using System;
using System.Collections.Concurrent;
using System.Threading;

public class Queue : IDisposable
{
private readonly Thread _messageThread; // thread for processing messages
private readonly BlockingCollection<Message> _messages; // queue for messages
private readonly CancellationTokenSource _cancellation; // used to abort the processing when we're done
// initializes everything and starts a processing thread
public Queue()
{
_messages = new BlockingCollection<Message>();
_cancellation = new CancellationTokenSource();
_messageThread = new Thread(ProcessMessages);
_messageThread.Start();
}
// processing thread function
private void ProcessMessages()
{
try
{
while (!_cancellation.IsCancellationRequested)
{
// Take() blocks until either:
// 1) a message is available, in which case it returns it, or
// 2) the cancellation token is cancelled, in which case it throws an OperationCanceledException
var message = _messages.Take(_cancellation.Token);
// process the message here
}
}
catch (OperationCanceledException)
{
// Take() was cancelled, let the thread exit
}
}
// pushes a message
public void QueueMessage(Message message)
{
_messages.Add(message);
}
// stops processing and cleans up resources
public void Dispose()
{
_cancellation.Cancel(); // let Take() abort by throwing
_messageThread.Join(); // wait for thread to exit
_cancellation.Dispose(); // release the cancellation source
_messages.Dispose(); // release the queue
}
}
Another option would be to combine a ConcurrentQueue<T> with a ManualResetEvent (events are roughly the .NET equivalent of condition variables), but that would be doing by hand what BlockingCollection<T> already does.
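For completeness, a rough sketch of that hand-rolled combination (Message is the type from the question; shutdown handling is omitted):
using System.Collections.Concurrent;
using System.Threading;

public static class MessageQueue
{
    // ConcurrentQueue holds the data, a ManualResetEvent acts as the "not empty"
    // signal. BlockingCollection<T> packages exactly this up for you.
    private static readonly ConcurrentQueue<Message> messages = new ConcurrentQueue<Message>();
    private static readonly ManualResetEvent available = new ManualResetEvent(false);

    public static void Post(Message message)
    {
        messages.Enqueue(message);
        available.Set(); // wake the consumer
    }

    public static void HandleMessageQueue()
    {
        while (true) // shutdown handling omitted for brevity
        {
            available.WaitOne(); // block until a producer signals
            available.Reset();   // re-arm before draining so later signals aren't lost
            Message message;
            while (messages.TryDequeue(out message))
            {
                // process the message here
            }
        }
    }
}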
Something like this?
using System;
using System.Collections.Generic;

public class EventArgs<T> : EventArgs
{
private T eventData;
public EventArgs(T eventData)
{
this.eventData = eventData;
}
public T EventData
{
get { return eventData; }
}
}
public class ObservableQueue<T>
{
public event EventHandler<EventArgs<T>> EnQueued;
public event EventHandler<EventArgs<T>> DeQueued;
public int Count { get { return queue.Count; } }
private readonly Queue<T> queue = new Queue<T>();
protected virtual void OnEnqueued(T item)
{
if (EnQueued != null)
EnQueued(this, new EventArgs<T>(item));
}
protected virtual void OnDequeued(T item)
{
if (DeQueued != null)
DeQueued(this, new EventArgs<T>(item));
}
public virtual void Enqueue(T item)
{
queue.Enqueue(item);
OnEnqueued(item);
}
public virtual T Dequeue()
{
var item = queue.Dequeue();
OnDequeued(item);
return item;
}
}
and use it
static void Main(string[] args)
{
ObservableQueue<string> observableQueue = new ObservableQueue<string>();
observableQueue.EnQueued += ObservableQueue_EnQueued;
observableQueue.DeQueued += ObservableQueue_DeQueued;
observableQueue.Enqueue("abc");
observableQueue.Dequeue();
Console.Read();
}
static void ObservableQueue_EnQueued(object sender, EventArgs<string> e)
{
Console.WriteLine("Enqueued: " + e.EventData);
}
static void ObservableQueue_DeQueued(object sender, EventArgs<string> e)
{
Console.WriteLine("Dequeued: " + e.EventData);
}
Let's say I have a class which has a Timer object that doesn't do any critical work - just some GUI work. Let's say there are 2 scenarios where the timer elapses every 5 minutes:
In the Timer_Elapsed delegate there is a lot of work to be done, and it takes 2 minutes to complete.
In the Timer_Elapsed delegate there is little work to be done, and it takes a couple of milliseconds to complete.
What is the proper way to dispose of the object & timer? Does the amount of time the Timer_Elapsed event delegate runs influence your decision on how to Dispose properly?
If you need to stop your timer during disposal, and work that relies on shared resources (which are being disposed at the same time) could still be in progress in your timer delegate, then you need to coordinate the shutdown process. The snippet below shows one way of doing this:
using System;
using System.Threading;

public class PeriodicTimerTask : IDisposable
{
private readonly System.Timers.Timer _timer;
private CancellationTokenSource _tokenSource;
private readonly ManualResetEventSlim _callbackComplete;
private readonly Action<CancellationToken> _userTask;
public PeriodicTimerTask(TimeSpan interval, Action<CancellationToken> userTask)
{
_tokenSource = new CancellationTokenSource();
_userTask = userTask;
_callbackComplete = new ManualResetEventSlim(true);
_timer = new System.Timers.Timer(interval.TotalMilliseconds);
}
public void Start()
{
if (_tokenSource != null)
{
_timer.Elapsed += (sender, e) => Tick();
_timer.AutoReset = true;
_timer.Start();
}
}
public void Stop()
{
var tokenSource = Interlocked.Exchange(ref _tokenSource, null);
if (tokenSource != null)
{
_timer.Stop();
tokenSource.Cancel();
_callbackComplete.Wait();
_timer.Dispose();
_callbackComplete.Dispose();
tokenSource.Dispose();
}
}
public void Dispose()
{
Stop();
GC.SuppressFinalize(this);
}
private void Tick()
{
var tokenSource = _tokenSource;
if (tokenSource != null && !tokenSource.IsCancellationRequested)
{
try
{
_callbackComplete.Wait(tokenSource.Token); // prevent multiple ticks.
_callbackComplete.Reset();
try
{
tokenSource = _tokenSource;
if (tokenSource != null && !tokenSource.IsCancellationRequested)
_userTask(tokenSource.Token);
}
finally
{
_callbackComplete.Set();
}
}
catch (OperationCanceledException) { }
}
}
}
Usage example:
public static void Main(params string[] args)
{
var periodic = new PeriodicTimerTask(TimeSpan.FromSeconds(1), cancel => {
int n = 0;
Console.Write("Tick ...");
while (!cancel.IsCancellationRequested && n < 100000)
{
n++;
}
Console.WriteLine(" completed.");
});
periodic.Start();
Console.WriteLine("Press <ENTER> to stop");
Console.ReadLine();
Console.WriteLine("Stopping");
periodic.Dispose();
Console.WriteLine("Stopped");
}
With output like the following:
Press <ENTER> to stop
Tick ... completed.
Tick ... completed.
Tick ... completed.
Tick ... completed.
Tick ... completed.
Stopping
Stopped
There are multiple approaches to this, and like Alex said in the comments, it depends on whether or not the objects the delegate uses are also being disposed.
Let's say we have a "worst-case" scenario, in which the delegate does need to use objects which would be disposed.
A good way to handle this would be something similar to the Process class's WaitForExit() method. Such a method would simply wait until it sees that the delegate has finished working (for example, via a "working" flag that is set before the delegate runs and cleared afterwards) and then return. The code using that class could then look something like this:
// Time to shut down
myDisposable.WaitForFinish();
myDisposable.Dispose();
Thus we are essentially ensuring the delegate is done before disposing of it, stopping any sort of ObjectDisposedException.
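A minimal sketch of that idea, with made-up names (MyDisposable, WaitForFinish); a ManualResetEventSlim plays the role of the "working" flag so the wait doesn't have to spin:
using System;
using System.Threading;

public class MyDisposable : IDisposable
{
    private readonly System.Timers.Timer timer = new System.Timers.Timer(5 * 60 * 1000);
    private readonly ManualResetEventSlim idle = new ManualResetEventSlim(true);

    public MyDisposable()
    {
        timer.Elapsed += (s, e) =>
        {
            idle.Reset(); // "working": cleared while the delegate runs
            try
            {
                // ... the periodic GUI work goes here ...
            }
            finally
            {
                idle.Set(); // done working
            }
        };
        timer.Start();
    }

    public void WaitForFinish()
    {
        timer.Stop(); // no new ticks
        idle.Wait();  // wait for any tick that is already in progress
        // Note: System.Timers.Timer may still deliver one already-queued tick
        // after Stop(); the PeriodicTimerTask shown above closes that gap more rigorously.
    }

    public void Dispose()
    {
        WaitForFinish();
        timer.Dispose();
        idle.Dispose();
    }
}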
I'm using a Timer class that wraps CreateTimerQueueTimer and DeleteTimerQueueTimer.
Here is the class:
using System;
using System.Threading;
using MyCompany.Internal;
using TimerCallback = MyCompany.Internal.TimerCallback;
public class Timer : IDisposable
{
public Timer()
{
this.callback = this.ticked;
this.autoReset = true;
Computer.ChangeTimerResolutionTo(1);
this.priority = ThreadPriority.Normal;
}
public virtual event EventHandler Elapsed;
public virtual bool AutoReset
{
get
{
return this.autoReset;
}
set
{
this.autoReset = value;
}
}
public virtual ThreadPriority Priority
{
get
{
return this.priority;
}
set
{
this.priority = value;
}
}
public virtual void Start(int interval)
{
if (interval < 1)
{
throw new ArgumentOutOfRangeException("interval", "Interval must be at least 1 millisecond.");
}
if (Interlocked.CompareExchange(ref this.started, 1, 0) == 1)
{
return;
}
NativeMethods.CreateTimerQueueTimer(
out this.handle,
IntPtr.Zero,
this.callback,
IntPtr.Zero,
(uint)interval,
(uint)interval,
CallbackOptions.ExecuteInTimerThread);
}
public virtual void Stop()
{
if (Interlocked.CompareExchange(ref this.started, 0, 1) == 0)
{
return;
}
NativeMethods.DeleteTimerQueueTimer(IntPtr.Zero, this.handle, IntPtr.Zero);
}
public virtual void Dispose()
{
this.Stop();
}
private void ticked(IntPtr parameterPointer, bool unused)
{
if (!this.AutoReset)
{
this.Stop();
}
Thread.CurrentThread.Priority = this.Priority;
var elapsed = this.Elapsed;
if (elapsed != null)
{
elapsed(this, EventArgs.Empty);
}
}
private int started;
private IntPtr handle;
private volatile bool autoReset;
private ThreadPriority priority;
private readonly TimerCallback callback;
}
The problem is, after a while I'm getting an SEHException when calling Start and Stop simultaneously from multiple threads. The Interlocked.CompareExchange calls should prevent DeleteTimerQueueTimer from being called again after Stop() has been called, right? Even if Stop() is called simultaneously from different threads?
The SEHException is thrown at DeleteTimerQueueTimer(); I assume this is because it's trying to delete a timer that has already been deleted, which makes the handle invalid. Doesn't the CompareExchange prevent DeleteTimerQueueTimer from being called more than once, even by multiple threads simultaneously?
Interlocked.CompareExchange prevents the 'started' variable from being modified by two threads at the same time, but the timer handle is what you really want to protect, and the code fails to do that in some cases.
For example, thread A calls Start and executes Interlocked.CompareExchange, so this.started becomes 1. At that moment another thread calls Stop; it sees that 'started' is 1, so it calls DeleteTimerQueueTimer to delete the timer, while the timer might not have been created yet and the handle is still invalid.
So you should protect the handle of the timer itself, as sketched below.
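For example, one way to do that is to put the handle and both native calls behind a single lock (a sketch based on the code above; NativeMethods and CallbackOptions are the questioner's own wrappers):
private readonly object handleLock = new object();
private bool running; // replaces the Interlocked flag; only touched inside the lock

public virtual void Start(int interval)
{
    if (interval < 1)
    {
        throw new ArgumentOutOfRangeException("interval", "Interval must be at least 1 millisecond.");
    }
    lock (this.handleLock)
    {
        if (this.running)
        {
            return;
        }
        NativeMethods.CreateTimerQueueTimer(
            out this.handle,
            IntPtr.Zero,
            this.callback,
            IntPtr.Zero,
            (uint)interval,
            (uint)interval,
            CallbackOptions.ExecuteInTimerThread);
        this.running = true; // only set once the handle is actually valid
    }
}

public virtual void Stop()
{
    lock (this.handleLock)
    {
        if (!this.running)
        {
            return;
        }
        NativeMethods.DeleteTimerQueueTimer(IntPtr.Zero, this.handle, IntPtr.Zero);
        this.handle = IntPtr.Zero;
        this.running = false;
    }
}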
This is (roughly) what I have:
class A
{
public bool IsInUpdate = false;
public void Update()
{
IsInUpdate = true;
//(...do stuff...)
IsInUpdate = false;
}
}
class B
{
A a_inst;
System.Threading.Thread physicsThread = null;
void Draw()
{
physicsThread = new System.Threading.Thread(a_inst.Update);
physicsThread.Start();
}
void Update()
{
while(physicsThread.IsAlive)
{
// Right here there can be cases where physicsThread.IsAlive is true but IsInUpdate is false, how does that happen?
}
(...do stuff...)
}
}
The question is in the comments of the code. Basically, the physics thread instance says it's alive even though the function it was running has clearly finished (as can be seen by the bool being set back to false).
Any ideas why this happens? All I want to do is make sure the update function in class B does not execute until the threaded update function of class A has executed...
Since IsInUpdate is simply a public field (and non-volatile at that), there are no guarantees about what you see; the normal sensible rules about what you see only apply on a single thread, and you have not guarded any of this data. There is also an edge-case around the start condition, but personally I would be using either lock (if you need to wait for it to complete), or maybe Interlocked if you just need to know if it is active.
For example:
class A
{
private readonly object syncLock = new object();
public object SyncLock { get { return syncLock; } }
public void Update()
{
lock(SyncLock)
{
//(...do stuff...)
}
}
}
and
void Update()
{
lock(a_inst.SyncLock)
{
(...do stuff...)
}
}
With the above, you are guaranteed that only one thread will hold the lock at any time, so if you get to "do stuff" you know the other Update() isn't also running. If you need to wait, there are also the Monitor.Wait() / Monitor.Pulse() methods used with locks, or you can use gates such as ManualResetEvent/AutoResetEvent.
Things like lock also ensure correct memory barriers between the threads, so you see the correct data.
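If all you need is for B's Update() to wait until A's Update() has finished, a reset-event gate is also enough. A small sketch, assuming Draw() always runs before B.Update() so the gate is re-armed first:
using System.Threading;

class A
{
    private readonly ManualResetEventSlim updateDone = new ManualResetEventSlim(false);
    public ManualResetEventSlim UpdateDone { get { return updateDone; } }

    public void Update()
    {
        try
        {
            //(...do stuff...)
        }
        finally
        {
            updateDone.Set(); // signal that Update() has completed
        }
    }
}

class B
{
    A a_inst = new A();
    Thread physicsThread;

    void Draw()
    {
        a_inst.UpdateDone.Reset(); // re-arm the gate before starting the work
        physicsThread = new Thread(a_inst.Update);
        physicsThread.Start();
    }

    void Update()
    {
        a_inst.UpdateDone.Wait(); // blocks until A.Update() has run to completion
        //(...do stuff...)
    }
}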
This situation can happen when the Update function has not been called yet. Just because you have called Start on the thread doesn't mean it's immediately going to execute its main function. I'm not 100% sure whether there is also a slight window of opportunity where the thread is still alive but the main function has finished executing.
Basically you want to have a look at ManualResetEvent or AutoResetEvent to signal that your thread has finished working. Alternatively an event you can raise after Update() has finished and B can subscribe to might be good enough. Like this:
class A
{
public event EventHandler UpdateFinished;
public void Update()
{
// ... do work
var handler = UpdateFinished;
if (handler != null)
{
handler(this, EventArgs.Empty);
}
}
}
class B
{
public void Draw()
{
a_inst.UpdateFinished += HandleUpdateFinished;
// ... start your thread
}
private void HandleUpdateFinished(object sender, EventArgs e)
{
// ... do whatever
}
}
I have a small background thread which runs for the application's lifetime; however, when the application is shut down, the thread should exit gracefully.
The problem is that the thread runs some code at an interval of 15 minutes, which means it sleeps a LOT.
Now, in order to get it out of its sleep, I toss an interrupt at it. My question is, however, whether there's a better approach to this, since interrupts generate a ThreadInterruptedException.
Here's the gist of my code (somewhat pseudo):
public class BackgroundUpdater : IDisposable
{
private Thread myThread;
private const int intervalTime = 900000; // 15 minutes
public void Dispose()
{
myThread.Interrupt();
}
public void Start()
{
myThread = new Thread(ThreadedWork);
myThread.IsBackground = true; // To ensure against app waiting for thread to exit
myThread.Priority = ThreadPriority.BelowNormal;
myThread.Start();
}
private void ThreadedWork()
{
try
{
while (true)
{
Thread.Sleep(intervalTime); // 15 minutes
DoWork();
}
}
catch (ThreadInterruptedException)
{
}
}
}
There's absolutely a better way - either use Monitor.Wait/Pulse instead of Sleep/Interrupt, or use an Auto/ManualResetEvent. (You'd probably want a ManualResetEvent in this case.)
Personally I'm a Wait/Pulse fan, probably due to it being like Java's wait()/notify() mechanism. However, there are definitely times where reset events are more useful.
Your code would look something like this:
private readonly object padlock = new object();
private volatile bool stopping = false;
public void Stop() // Could make this Dispose if you want
{
stopping = true;
lock (padlock)
{
Monitor.Pulse(padlock);
}
}
private void ThreadedWork()
{
while (!stopping)
{
DoWork();
lock (padlock)
{
Monitor.Wait(padlock, TimeSpan.FromMinutes(15));
}
}
}
For more details, see my threading tutorial, in particular the pages on deadlocks, waiting and pulsing, and wait handles. Joe Albahari also has a tutorial which covers the same topics and compares them.
I haven't looked in detail yet, but I wouldn't be surprised if Parallel Extensions also had some functionality to make this easier.
You could use an event to check whether the process should end, like this:
var eventX = new AutoResetEvent(false);
while (true)
{
if(eventX.WaitOne(900000, false))
{
break;
}
DoWork();
}
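For WaitOne to ever return true, the event has to be reachable from whoever shuts the worker down, so it should be a field rather than a local. A small sketch with made-up names:
using System.Threading;

public class BackgroundUpdater
{
    private readonly AutoResetEvent stopEvent = new AutoResetEvent(false);
    private Thread myThread;

    public void Start()
    {
        myThread = new Thread(ThreadedWork) { IsBackground = true };
        myThread.Start();
    }

    public void Stop()
    {
        stopEvent.Set(); // wakes WaitOne immediately and ends the loop
        myThread.Join();
    }

    private void ThreadedWork()
    {
        while (true)
        {
            if (stopEvent.WaitOne(900000, false)) // 15 minutes, or until Set() is called
            {
                break;
            }
            DoWork();
        }
    }

    private void DoWork()
    {
        // the periodic work goes here
    }
}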
There is the CancellationTokenSource class in .NET 4 and later, which simplifies this task a bit.
private readonly CancellationTokenSource cancellationTokenSource =
new CancellationTokenSource();
private void Run()
{
while (!cancellationTokenSource.IsCancellationRequested)
{
DoWork();
cancellationTokenSource.Token.WaitHandle.WaitOne(
TimeSpan.FromMinutes(15));
}
}
public void Stop()
{
cancellationTokenSource.Cancel();
}
Don't forget that CancellationTokenSource is disposable, so make sure you dispose it properly.
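One way to dispose of it safely, without pulling the wait handle out from under the loop, is to cancel first, wait for the worker to exit, and only then dispose. A sketch, assuming a workerThread field holds the thread that runs Run():
public void Dispose()
{
    cancellationTokenSource.Cancel();  // wakes the WaitOne in Run() so the loop exits
    workerThread.Join();               // wait until Run() has returned
    cancellationTokenSource.Dispose(); // only now is it safe to release the source
}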
One method might be to add a cancel event or delegate that the thread subscribes to. When the cancel event is invoked, the thread can stop itself.
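A tiny sketch of that approach (the Owner and Worker names and the Cancelled event are made up for illustration):
using System;
using System.Threading;

public class Owner
{
    public event EventHandler Cancelled; // raised when the application is shutting down

    public void Cancel()
    {
        var handler = Cancelled;
        if (handler != null)
        {
            handler(this, EventArgs.Empty);
        }
    }
}

public class Worker
{
    private volatile bool stop;

    public Worker(Owner owner)
    {
        owner.Cancelled += (s, e) => stop = true; // the worker subscribes to the cancel event
    }

    public void ThreadedWork()
    {
        while (!stop)
        {
            DoWork();
            // sleep in short slices so a cancel is noticed within about a second
            for (int i = 0; i < 900 && !stop; i++)
            {
                Thread.Sleep(1000);
            }
        }
    }

    private void DoWork()
    {
        // the periodic work goes here
    }
}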
I absolutely like Jon Skeet's answer. However, this might be a bit easier to understand and should also work:
public class BackgroundTask : IDisposable
{
private readonly CancellationTokenSource cancellationTokenSource;
private bool stop;
public BackgroundTask()
{
this.cancellationTokenSource = new CancellationTokenSource();
this.stop = false;
}
public void Stop()
{
this.stop = true;
this.cancellationTokenSource.Cancel();
}
public void Dispose()
{
this.cancellationTokenSource.Dispose();
}
private void ThreadedWork(object state)
{
using (var syncHandle = new ManualResetEventSlim())
{
try
{
while (!this.stop)
{
syncHandle.Wait(TimeSpan.FromMinutes(15), this.cancellationTokenSource.Token);
if (!this.cancellationTokenSource.IsCancellationRequested)
{
// DoWork();
}
}
}
catch (OperationCanceledException)
{
// Wait() throws when the token is cancelled; swallow it so the thread can exit.
}
}
}
}
Or, including waiting for the background task to actually have stopped (in this case, Dispose must be invoked from a thread other than the one the background work is running on; and of course this is not perfect code, as it requires the worker thread to actually have been started):
using System;
using System.Threading;
public class BackgroundTask : IDisposable
{
private readonly ManualResetEventSlim threadedWorkEndSyncHandle;
private readonly CancellationTokenSource cancellationTokenSource;
private bool stop;
public BackgroundTask()
{
this.threadedWorkEndSyncHandle = new ManualResetEventSlim();
this.cancellationTokenSource = new CancellationTokenSource();
this.stop = false;
}
public void Dispose()
{
this.stop = true;
this.cancellationTokenSource.Cancel();
this.threadedWorkEndSyncHandle.Wait();
this.cancellationTokenSource.Dispose();
this.threadedWorkEndSyncHandle.Dispose();
}
private void ThreadedWork(object state)
{
try
{
using (var syncHandle = new ManualResetEventSlim())
{
while (!this.stop)
{
syncHandle.Wait(TimeSpan.FromMinutes(15), this.cancellationTokenSource.Token);
if (!this.cancellationTokenSource.IsCancellationRequested)
{
// DoWork();
}
}
}
}
catch (OperationCanceledException)
{
// Wait() throws when the token is cancelled; swallow it so the thread can exit.
}
finally
{
this.threadedWorkEndSyncHandle.Set();
}
}
}
If you see any flaws or disadvantages compared to Jon Skeet's solution, I'd like to hear them, as I always enjoy learning ;-)
I guess this is slower and uses more memory, and should therefore not be used at large scale or over short timeframes. Anything else?