I am trying to handle exceptions from a System.Threading.Tasks.Task.
I haven't used these before, and I seem to be misunderstanding how ContinueWith works; as a result, my ContinueWith continuation is firing at the wrong time.
Given the following (workers is just a list of my long-running processes):
......
workers.Add(new Workers.Tests.TestWorker1());
workers.Add(new Workers.Tests.TestWorker2());
// Start all the workers.
workers.ForEach(worker =>
{
// worker.Start schedules a timer and calls DoWork in the worker
System.Threading.Tasks.Task task = new System.Threading.Tasks.Task(worker.Start);
task.ContinueWith(ExceptionHandler, TaskContinuationOptions.OnlyOnFaulted);
task.Start();
})
.....
My handler method is
private void ExceptionHandler(System.Threading.Tasks.Task arg1, object arg2)
{
DebugLogger.Write("uh oh.. it died");
}
My TestWorkers are:
class TestWorker1 : Worker
{
int count = 1;
public override void DoWork(object timerState)
{
DebugLogger.Write(string.Format("{0} ran {1} times", workerName, count));
count++;
ScheduleTimer();
}
}
And
class TestWorker2 : Worker
{
int count = 1;
public override void DoWork(object timerState)
{
DebugLogger.Write(string.Format("{0} ran {1} times", workerName, count));
count++;
if (count == 3)
throw new Exception("I'm going to die....");
ScheduleTimer();
}
}
ScheduleTimer() simply sets an interval after which DoWork is run.
What happens...
When I debug, all tasks are created and started. As soon as DoWork has called ScheduleTimer() for the first time, my ExceptionHandler is hit, as shown in this screenshot - this happens for both workers.
When the exception is thrown in TestWorker2, the debugger will not move on from there - I press Continue, hoping to hit my ExceptionHandler, but the debugger just keeps re-throwing the exception.
What I am hoping to achieve
I would like my ExceptionHandler to fire only when an exception is thrown within one of the running tasks. Instead, the only time I get into my ExceptionHandler is when the task first runs, and my actual exception just keeps looping.
What am I missing?
Per comment, here is the code for the main Worker
public abstract class Worker : IDisposable
{
internal string workerName;
internal Timer scheduler;
internal DateTime scheduledTime;
public Worker()
{
string t = this.GetType().ToString();
workerName = t.Substring(t.LastIndexOf(".") + 1).AddSpacesBeforeUppercase(true).Trim();
}
/// <summary>
/// Set to true when the worker is performing its task, false when it's complete
/// </summary>
public bool IsCurrentlyProcessing { get; set; }
public void Start()
{
DebugLogger.Write(workerName + " Started");
ScheduleTimer();
}
/// <summary>
/// default functionality for setting up the timer.
/// Typically, the timer will fire in 60 second intervals
/// Override this method in child classes for different functionality
/// </summary>
public virtual void ScheduleTimer()
{
scheduler = new Timer(new TimerCallback(DoWork));
int interval = 60;
int.TryParse(ConfigurationManager.AppSettings[string.Format("{0}{1}", workerName.Replace(" ", ""), "Interval")], out interval);
scheduledTime = DateTime.Now.AddSeconds(interval);
if (DateTime.Now > scheduledTime)
scheduledTime = scheduledTime.AddSeconds(interval);
int dueTime = Convert.ToInt32(scheduledTime.Subtract(DateTime.Now).TotalMilliseconds);
scheduler.Change(dueTime, Timeout.Infinite);
}
public abstract void DoWork(object timerState);
public void Stop()
{
// kill stuff
if (scheduler != null)
scheduler.Dispose();
DebugLogger.Write(workerName + " stop");
this.Dispose();
}
private bool disposed = false;
protected virtual void Dispose(bool disposing)
{
if (!this.disposed)
if (disposing)
{
// any specific cleanup
}
this.disposed = true;
}
public void Dispose()
{
Dispose(true);
GC.SuppressFinalize(this);
}
}
From your screenshot it appears that arg2 is your TaskContinuationOptions.OnlyOnFaulted value - that is the biggest clue to what is going wrong. Because you passed in an Action<Task, Object>, the compiler picked the Task.ContinueWith(Action<Task, Object>, Object) overload of ContinueWith, which causes your continuation options to be passed in as the state parameter rather than as options.
Either change ExceptionHandler to
private void ExceptionHandler(System.Threading.Tasks.Task arg1)
{
DebugLogger.Write("uh oh.. it died");
}
so you will use the Task.ContinueWith(Action<Task>, TaskContinuationOptions) overload, or you can change your call to
task.ContinueWith(ExceptionHandler, null, TaskContinuationOptions.OnlyOnFaulted);
so that you will start using the Task.ContinueWith(Action<Task, Object>, Object, TaskContinuationOptions) overload.
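Either way, it helps to actually observe the fault inside the handler. Here is a minimal sketch (assuming the single-argument handler from the first option) that logs the real exception via Task.Exception:
// Minimal sketch - assumes the Action<Task> overload is used.
// Task.Exception is an AggregateException when the task has faulted.
private void ExceptionHandler(System.Threading.Tasks.Task task)
{
    if (task.Exception != null)
    {
        foreach (var inner in task.Exception.InnerExceptions)
            DebugLogger.Write("uh oh.. it died: " + inner.Message);
    }
}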
Might be caused by your logging component not supporting multiple concurrent writes.
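If that turns out to be the cause, one option is to serialize the writes yourself. A minimal sketch, assuming DebugLogger is a static class you own and can wrap:
// Hypothetical wrapper - serializes concurrent writes with a lock.
public static class SafeDebugLogger
{
    private static readonly object _sync = new object();

    public static void Write(string message)
    {
        lock (_sync)
        {
            DebugLogger.Write(message); // the original, non-thread-safe logger
        }
    }
}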
If it's possible for you, I'd suggest refactoring the code to the async/await pattern; it will be much more readable.
Let's say you create a list of all the tasks you want to run:
List<Task> tasks = new List<Task>();
workers.ForEach(worker => tasks.Add(Task.Run(() => worker.Start())));
and then use await on the list surrounded by a try catch block:
try
{
await Task.WhenAll(tasks);
}
catch (Exception ex)
{
DebugLogger.Write("uh oh.. it died");
}
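Note that await can only be used inside an async method, so the snippet above would need to live in something like the following sketch (the method name is just an example; workers is the List<Worker> from the question):
// Sketch only - assumes workers is the existing List<Worker>.
private async Task RunAllWorkersAsync(List<Worker> workers)
{
    List<Task> tasks = new List<Task>();
    workers.ForEach(worker => tasks.Add(Task.Run(() => worker.Start())));

    try
    {
        // WhenAll completes when every task has finished;
        // if any task faults, its exception is rethrown here.
        await Task.WhenAll(tasks);
    }
    catch (Exception)
    {
        DebugLogger.Write("uh oh.. it died");
    }
}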
Also, make sure you are not doing any Thread.Sleep(xxx) calls (or any other blocking Thread.XXX calls, for that matter) inside ScheduleTimer(), because blocking threads and tasks don't play nicely together.
Hope it helps!
I would like to use a condition variable in order to know when the message queue is not empty; I would like to use it in "HandleMessageQueue", which runs as a thread.
private static Queue<Message> messages = new Queue<Message>();
/// <summary>
/// function return the first message
/// </summary>
/// <returns>first message element</returns>
public static Message GetFirst()
{
return messages.Dequeue();
}
in another class:
/// <summary>
/// Function run while the clients connected and handle the queue message
/// </summary>
public static void HandleMessageQueue()
{
// ...
}
What you're probably looking for is a simple producer-consumer pattern. In this case I'd recommend using .NET's BlockingCollection, which allows you to easily handle the following cases:
have one thread push stuff in a queue
have another thread block until stuff is available
make the whole thing easy to shut down without having to forcibly terminate the thread
Here's a short code sample, read the comments for more information about what every bit does:
public class Queue : IDisposable
{
private readonly Thread _messageThread; // thread for processing messages
private readonly BlockingCollection<Message> _messages; // queue for messages
private readonly CancellationTokenSource _cancellation; // used to abort the processing when we're done
// initializes everything and starts a processing thread
public Queue()
{
_messages = new BlockingCollection<Message>();
_cancellation = new CancellationTokenSource();
_messageThread = new Thread(ProcessMessages);
_messageThread.Start();
}
// processing thread function
private void ProcessMessages()
{
try
{
while (!_cancellation.IsCancellationRequested)
{
// Take() blocks until either:
// 1) a message is available, in which case it returns it, or
// 2) the cancellation token is cancelled, in which case it throws an OperationCanceledException
var message = _messages.Take(_cancellation.Token);
// process the message here
}
}
catch (OperationCanceledException)
{
// Take() was cancelled, let the thread exit
}
}
// pushes a message
public void QueueMessage(Message message)
{
_messages.Add(message);
}
// stops processing and clean up resources
public void Dispose()
{
_cancellation.Cancel(); // let Take() abort by throwing
_messageThread.Join(); // wait for thread to exit
_cancellation.Dispose(); // release the cancellation source
_messages.Dispose(); // release the queue
}
}
Another option would be to combine a ConcurrentQueue<T> with a ManualResetEvent (events are roughly the .NET equivalent of condition variables), but that would be doing by hand what BlockingCollection<T> already does.
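For reference, here is a minimal usage sketch of the Queue class above (Message stands in for whatever your real message type is):
// Sketch only - the producer can live on any thread.
using (var queue = new Queue())
{
    queue.QueueMessage(new Message());  // pushed by the producer
    queue.QueueMessage(new Message());

    // The processing thread inside Queue consumes the messages as they arrive.
    Thread.Sleep(1000); // give it a moment in this toy example
} // Dispose() cancels Take(), joins the thread and releases resources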
something like this?
public class EventArgs<T> : EventArgs
{
private T eventData;
public EventArgs(T eventData)
{
this.eventData = eventData;
}
public T EventData
{
get { return eventData; }
}
}
public class ObservableQueue<T>
{
public event EventHandler<EventArgs<T>> EnQueued;
public event EventHandler<EventArgs<T>> DeQueued;
public int Count { get { return queue.Count; } }
private readonly Queue<T> queue = new Queue<T>();
protected virtual void OnEnqueued(T item)
{
if (EnQueued != null)
EnQueued(this, new EventArgs<T>(item));
}
protected virtual void OnDequeued(T item)
{
if (DeQueued != null)
DeQueued(this, new EventArgs<T>(item));
}
public virtual void Enqueue(T item)
{
queue.Enqueue(item);
OnEnqueued(item);
}
public virtual T Dequeue()
{
var item = queue.Dequeue();
OnDequeued(item);
return item;
}
}
and use it
static void Main(string[] args)
{
ObservableQueue<string> observableQueue = new ObservableQueue<string>();
observableQueue.EnQueued += ObservableQueue_EnQueued;
observableQueue.DeQueued += ObservableQueue_DeQueued;
observableQueue.Enqueue("abc");
observableQueue.Dequeue();
Console.Read();
}
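The ObservableQueue_EnQueued and ObservableQueue_DeQueued handlers referenced above are not shown; they could be as simple as the following sketch (the bodies are just examples):
// Example handlers - what you do with the events is up to you,
// e.g. signal HandleMessageQueue that an item is available.
private static void ObservableQueue_EnQueued(object sender, EventArgs<string> e)
{
    Console.WriteLine("Enqueued: " + e.EventData);
}

private static void ObservableQueue_DeQueued(object sender, EventArgs<string> e)
{
    Console.WriteLine("Dequeued: " + e.EventData);
}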
Problem
In a project case I need to create multiple threads that pick tasks from a queue and run them. Some of these tasks cannot run while a group of other tasks is still running. Consider something like file copy and defrag (which runs when the system is idle) in Windows.
Solution
To implement this, I created a class based on
System.Threading.CountdownEvent.
Whenever a thread picks a blocking task from the queue, it will Increment the CounterEvent, and after it has finished its job it will Decrement the CounterEvent.
If a thread picks a low-priority task, it will Wait until the CounterEvent is zero and then start running.
A low-priority task can be allowed to start immediately by calling Reset on the CounterEvent.
The main thread or a parallel thread can monitor the status of the lock by querying CurrentCount.
Here is the code:
using System;
using System.Diagnostics.Contracts;
using System.Threading;
public class CounterEvent : IDisposable {
private volatile int m_currentCount;
private volatile bool m_disposed;
private ManualResetEventSlim m_event;
// Gets the number of remaining signals required to set the event.
public int CurrentCount {
get {
return m_currentCount;
}
}
// Allocate a thin event, Create a latch in signaled state.
public CounterEvent() {
m_currentCount = 0;
m_event = new ManualResetEventSlim();
m_event.Set(); // start in the signaled state (no blocking tasks running)
}
// Decrements the counter; if the counter reaches zero, signals other threads to continue.
public void Decrement() {
ThrowIfDisposed();
Contract.Assert(m_event != null);
int newCount = 0;
if (m_currentCount >= 0) {
#pragma warning disable 0420
newCount = Interlocked.Decrement(ref m_currentCount);
#pragma warning restore 0420
}
if (newCount == 0) {
m_event.Set();
}
}
// increments the current count by one.
public void Increment() {
ThrowIfDisposed();
#pragma warning disable 0420
Interlocked.Increment(ref m_currentCount);
#pragma warning restore 0420
}
// Resets the CurrentCount to zero (the signaled state).
public void Reset() {
ThrowIfDisposed();
m_currentCount = 0;
m_event.Set();
}
// Blocks the current thread until the System.Threading.CounterEvent is set.
public void Wait() {
ThrowIfDisposed();
m_event.Wait();
}
/// <summary>
/// Throws an exception if the latch has been disposed.
/// </summary>
private void ThrowIfDisposed() {
if (m_disposed) {
throw new ObjectDisposedException("CounterEvent");
}
}
// According to MSDN this is not thread safe
public void Dispose() {
Dispose(true);
GC.SuppressFinalize(this);
}
// According to MSDN Dispose() is not thread-safe.
protected virtual void Dispose(bool disposing) {
if (disposing) {
m_event.Dispose();
m_disposed = true;
}
}
}
Question
Will this code work as expected?
Are there any flaws that I didn't see in it?
Is there any better option for doing this?
Note
The application is written with System.Threading.Thread and the cost of converting it is very high for me; however, a great replacement solution is always worth working on for the future.
This should be one atomic operation, and it is not thread-safe if you do it like this:
if (m_currentCount >= 0)
{
newCount = Interlocked.Decrement(ref m_currentCount);
}
It may happen that m_currentCount is changed between the if and the Interlocked.Decrement.
You should rewrite your logic to use Interlocked.CompareExchange. I would also use Interlocked.Exchange everywhere you assign to m_currentCount; then you don't need volatile and the pragma. You should also be aware that under very heavy load a Set on the reset event can get lost.
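A minimal sketch of what a CompareExchange-based Decrement might look like (this is my reading of the suggestion, assuming m_currentCount is no longer declared volatile):
// Sketch: decrement only while the current value is greater than zero,
// retrying if another thread changed the value in between.
public void Decrement()
{
    ThrowIfDisposed();
    while (true)
    {
        int current = m_currentCount;
        if (current <= 0)
            return; // nothing to decrement

        // Succeeds only if m_currentCount still equals 'current'.
        if (Interlocked.CompareExchange(ref m_currentCount, current - 1, current) == current)
        {
            if (current - 1 == 0)
                m_event.Set();
            return;
        }
        // Another thread won the race - loop and try again.
    }
}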
Let's say I have a class which has a Timer object that doesn't do any critical work - just some GUI work. Let's say there are 2 scenarios where the timer elapses every 5 minutes:
in the Timer_Elapsed delegate there is a lot of work that is done and it takes 2 minutes to complete.
in the Timer_Elapsed delegate there is little work to be done and it takes a couple of milliseconds to complete
What is the proper way to dispose of the object & timer? Does the amount of time the Timer_Elapsed event delegate runs influence your decision on how to Dispose properly?
If you need to stop your timer during disposal, and work that relies on shared resources (which are being disposed at the same time) could still be in progress in your timer delegate, you need to coordinate the "shutdown" process. The snippet below shows an example of doing this:
public class PeriodicTimerTask : IDisposable
{
private readonly System.Timers.Timer _timer;
private CancellationTokenSource _tokenSource;
private readonly ManualResetEventSlim _callbackComplete;
private readonly Action<CancellationToken> _userTask;
public PeriodicTimerTask(TimeSpan interval, Action<CancellationToken> userTask)
{
_tokenSource = new CancellationTokenSource();
_userTask = userTask;
_callbackComplete = new ManualResetEventSlim(true);
_timer = new System.Timers.Timer(interval.TotalMilliseconds);
}
public void Start()
{
if (_tokenSource != null)
{
_timer.Elapsed += (sender, e) => Tick();
_timer.AutoReset = true;
_timer.Start();
}
}
public void Stop()
{
var tokenSource = Interlocked.Exchange(ref _tokenSource, null);
if (tokenSource != null)
{
_timer.Stop();
tokenSource.Cancel();
_callbackComplete.Wait();
_timer.Dispose();
_callbackComplete.Dispose();
tokenSource.Dispose();
}
}
public void Dispose()
{
Stop();
GC.SuppressFinalize(this);
}
private void Tick()
{
var tokenSource = _tokenSource;
if (tokenSource != null && !tokenSource.IsCancellationRequested)
{
try
{
_callbackComplete.Wait(tokenSource.Token); // prevent multiple ticks.
_callbackComplete.Reset();
try
{
tokenSource = _tokenSource;
if (tokenSource != null && !tokenSource.IsCancellationRequested)
_userTask(tokenSource.Token);
}
finally
{
_callbackComplete.Set();
}
}
catch (OperationCanceledException) { }
}
}
}
Usage example:
public static void Main(params string[] args)
{
var periodic = new PeriodicTimerTask(TimeSpan.FromSeconds(1), cancel => {
int n = 0;
Console.Write("Tick ...");
while (!cancel.IsCancellationRequested && n < 100000)
{
n++;
}
Console.WriteLine(" completed.");
});
periodic.Start();
Console.WriteLine("Press <ENTER> to stop");
Console.ReadLine();
Console.WriteLine("Stopping");
periodic.Dispose();
Console.WriteLine("Stopped");
}
With output like below:
Press <ENTER> to stop
Tick ... completed.
Tick ... completed.
Tick ... completed.
Tick ... completed.
Tick ... completed.
Stopping
Stopped
There are multiple approaches to this, and as Alex said in the comments, it depends on whether or not the objects the delegate uses are also being disposed.
Let's say we have a "worst-case" scenario, in which the delegate does need to use objects which would be disposed.
A good way to handle this would be something similar to a method the Process class has: WaitForExit(). Such a method would simply wait until it sees that the delegate has finished its work (for example, by checking a working flag that is set before the delegate runs and cleared afterwards) and then return. The code using that class could then look like this:
// Time to shut down
myDisposable.WaitForFinish();
myDisposable.Dispose();
Thus we are essentially ensuring the delegate is done before disposing, preventing any sort of ObjectDisposedException.
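A minimal sketch of what such a WaitForFinish could look like (the class and member names here are hypothetical, and the ManualResetEventSlim approach in the previous answer is more robust):
// Sketch only - illustrates the "wait for the delegate to finish" idea.
using System;
using System.Threading;

public class MyDisposable : IDisposable
{
    private readonly System.Timers.Timer _timer = new System.Timers.Timer(5 * 60 * 1000);
    private volatile bool _working;

    public MyDisposable()
    {
        _timer.Elapsed += (s, e) =>
        {
            _working = true;
            try { /* the non-critical GUI work goes here */ }
            finally { _working = false; }
        };
        _timer.Start();
    }

    public void WaitForFinish()
    {
        _timer.Stop();        // no new ticks
        while (_working)      // wait out a tick already in progress
            Thread.Sleep(10);
    }

    public void Dispose()
    {
        _timer.Dispose();
    }
}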
I have a service running some different tasks in a loop until the service is stopped.
However, one of these tasks calls a web service, and this call can take several minutes to complete. I want to be able to stop the service instantly, 'cancelling' the web service call without calling Thread.Abort, because that causes some strange behavior even when the only thing the thread is doing is calling this web service method.
How can I cancel or break out of a synchronous method call (if that's even possible)?
Or should I try a different approach?
I have tried using an AutoResetEvent and then calling Thread.Abort, which works fine in the code sample below, but when I implement this solution in the actual service I get some unexpected behavior, probably because of what's going on in the external libraries I'm using.
AutoResetEvent and Thread.Abort:
class Program
{
static void Main(string[] args)
{
MainProgram p = new MainProgram();
p.Start();
var key = Console.ReadKey();
if (key.Key == ConsoleKey.Q)
p.Stop();
}
}
class MainProgram
{
private Thread workerThread;
private Thread webServiceCallerThread;
private volatile bool doWork;
public void Start()
{
workerThread = new Thread(() => DoWork());
doWork = true;
workerThread.Start();
}
public void Stop()
{
doWork = false;
webServiceCallerThread.Abort();
}
private void DoWork()
{
try
{
while (doWork)
{
AutoResetEvent are = new AutoResetEvent(false);
WebServiceCaller caller = new WebServiceCaller(are);
webServiceCallerThread = new Thread(() => caller.TimeConsumingMethod());
webServiceCallerThread.Start();
// Wait for the WebServiceCaller.TimeConsumingMethod to finish
WaitHandle.WaitAll(new[] { are });
// If doWork has been signalled to stop
if (!doWork)
break;
// All good - continue
Console.WriteLine(caller.Result);
}
}
catch (Exception e)
{
Console.Write(e);
}
}
}
class WebServiceCaller
{
private AutoResetEvent ev;
private int result;
public int Result
{
get { return result; }
}
public WebServiceCaller(AutoResetEvent ev)
{
this.ev = ev;
}
public void TimeConsumingMethod()
{
try
{
// Simulates a method running for 1 minute
Thread.Sleep(60000);
result = 1;
ev.Set();
}
catch (ThreadAbortException e)
{
ev.Set();
result = -1;
Console.WriteLine(e);
}
}
}
Can someone suggest a solution to this issue?
Try this
public void Start()
{
workerThread = new Thread(() => DoWork());
doWork = true;
workerThread.IsBackground = true;
workerThread.Start();
}
A thread is either a background thread or a foreground thread.
Background threads are identical to foreground threads, except that background threads do not prevent a process from terminating. Once all foreground threads belonging to a process have terminated, the common language runtime ends the process. Any remaining background threads are stopped and do not complete.
For more details see http://msdn.microsoft.com/en-us/library/system.threading.thread.isbackground.aspx
The solution is really this simple: Don't make calls that block for several minutes unless you want to block for several minutes. If there is no way to do a particular thing without blocking, potentially for several minutes, complain loudly to whoever wrote the code that imposes that painful requirement (or fix it yourself, if possible).
Once you've made the call, it's too late. You're committed. If the function you are calling doesn't provide a safe way to abort it, then there's no safe way.
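For example, if you can switch the web call to an API that accepts a timeout or a CancellationToken, the caller stays in control. Here is a sketch with HttpClient, assuming the service is reachable over HTTP and that you can use an async method (your actual proxy may differ):
// Sketch only - cancel via the token from Stop(), or let the timeout fire.
using (var cts = new CancellationTokenSource())
using (var client = new HttpClient())
{
    cts.CancelAfter(TimeSpan.FromSeconds(30));

    try
    {
        HttpResponseMessage response =
            await client.GetAsync("http://example.com/service", cts.Token);
        // process the response here
    }
    catch (OperationCanceledException)
    {
        // the call was cancelled or timed out - shut down cleanly
    }
}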
As all you want to do is make one asynchronous web service call at a time, and on each response make another call, you can dispense with the worker thread and simply make an asynchronous call, register a callback, and make another async call from the callback:
class Program
{
private static WebServiceCaller.TCMDelegate _wscDelegate;
private static readonly WebServiceCaller _wsCaller = new WebServiceCaller();
static void Main(string[] args)
{
_wscDelegate = _wsCaller.TimeConsumingMethod;
MakeWSCallAsync();
Console.WriteLine("Enter Q to quit");
while (Console.ReadLine().ToUpper().Trim()!="Q"){}
}
public static void MakeWSCallAsync()
{
_wscDelegate.BeginInvoke(OnWSCallComplete, null);
}
public static void OnWSCallComplete(IAsyncResult ar)
{
Console.WriteLine("Result {0}", _wscDelegate.EndInvoke(ar));
MakeWSCallAsync();
}
}
class WebServiceCaller
{
public delegate int TCMDelegate();
public int TimeConsumingMethod()
{
try
{
// Simulates a long-running call (shortened to 1 second for this demo)
Thread.Sleep(1000);
return 1;
}
catch (ThreadAbortException e)
{
return -1;
}
}
}
No blocking (well, the console thread is blocking on ReadLine()) and no Windows kernel-mode sync objects (AutoResetEvent), which are expensive.
Consider this example:
When the user clicks a button, ClassA fires the OnUserInteraction event rapidly 10 times. ClassB is attached to this event, and in its event handler it calls ClassC's Render method. In the Render method an AxisAngleRotation3D animation is executed, and every single animation lasts 1 second.
In this scenario all 10 AxisAngleRotation3D animations are executed almost at the same time, but I would like them to execute one after another. As I understand threads, I would probably have to implement a thread queue in ClassB, where the Completed event of the AxisAngleRotation3D signals that the next event is allowed to fire...?
Is this correct and how can I achieve this?
Have a task queue. Simply put, have a ConcurrentQueue<Func<bool>> field or similar, and add tasks to it as necessary. Then have your task execution thread pop Func<bool> delegates off the queue and invoke them. If they return true, they're done. If they return false, add them back onto the queue, as they couldn't complete at that time.
Here's an example:
using System;
using System.Collections.Concurrent;
using System.Threading;
namespace Example
{
public class TaskScheduler : IDisposable
{
public const int IDLE_DELAY = 100;
private readonly ConcurrentQueue<Func<bool>> PendingTasks = new ConcurrentQueue<Func<bool>>();
private Thread ExecuterThread;
private volatile bool _IsDisposed;
public bool IsDisposed
{
get { return _IsDisposed; }
}
public void EnqueueTask(Func<bool> task)
{
PendingTasks.Enqueue(task);
}
public void Start()
{
CheckDisposed();
if (ExecuterThread != null)
{
throw new InvalidOperationException("The task scheduler is alreader running.");
}
ExecuterThread = new Thread(Run);
ExecuterThread.IsBackground = true;
ExecuterThread.Start();
}
private void CheckDisposed()
{
if (_IsDisposed)
{
throw new ObjectDisposedException("TaskScheduler");
}
}
private void Run()
{
while (!_IsDisposed)
{
if (PendingTasks.IsEmpty)
{
Thread.Sleep(IDLE_DELAY);
continue;
}
Func<bool> task;
while (!PendingTasks.TryDequeue(out task))
{
Thread.Sleep(0);
}
if (!task.Invoke())
{
PendingTasks.Enqueue(task);
}
}
}
public void Dispose()
{
CheckDisposed();
_IsDisposed = true;
}
}
}
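A usage sketch for the animation case (the helper name and the flag are hypothetical; the delegate's return value is the "am I done?" contract described above):
// Sketch only - enqueue one Func<bool> per animation step.
var scheduler = new Example.TaskScheduler();
scheduler.Start();

scheduler.EnqueueTask(() =>
{
    // Return false while the previous animation is still running,
    // so this task is put back on the queue and retried later.
    if (previousAnimationStillRunning)
        return false;

    StartNextAxisAngleRotation3DAnimation(); // hypothetical helper
    return true;
});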
ClassB could add the events to a queue and then render them one at a time (possibly using a timer to read from the queue).
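A rough sketch of that idea inside ClassB (DispatcherTimer is assumed here because AxisAngleRotation3D implies WPF; the Render signature is hypothetical):
// Sketch only - drains one queued event per timer tick.
// Requires System.Collections.Generic and System.Windows.Threading.
private readonly Queue<EventArgs> _pending = new Queue<EventArgs>();
private readonly DispatcherTimer _drainTimer =
    new DispatcherTimer { Interval = TimeSpan.FromSeconds(1) }; // matches the 1-second animations

public ClassB(ClassC renderer)
{
    _drainTimer.Tick += (s, e) =>
    {
        if (_pending.Count > 0)
            renderer.Render(_pending.Dequeue()); // hypothetical Render(EventArgs) signature
    };
    _drainTimer.Start();
}

private void OnUserInteraction(object sender, EventArgs e)
{
    _pending.Enqueue(e); // queue the work instead of rendering immediately
}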