C# TPL application stops running

I have a working application with a loop that runs a variable number of iterations and makes one function call per iteration. I then changed the program to launch that function call as a separate task. When I run it from a unit test, the application stops running before completing any work.
I have set the loop to a single iteration and debugged that one thread. It stops running near the top of the function, not always on the same line, but always in the same area, where I try to make a copy of an object that holds a DataTable and DataRows whose selection can be changed in each thread. The code is below; when debugging, it consistently stops in this area, but the exact line it reaches varies.
// main thread called by unit test
...
for (...
{
    Task compute = Task.Factory.StartNew(() => results.Add(Compute(originalObject)));
}
...

private ReturnObject Compute(MyObject originalObject)
{
    ...
    // near top of function after some assignment statements
    // of some string and boolean variables
    MyObject myObject = originalObject.Copy();
    // never makes it to the next line
    ...
}

// MyObject class
private MyObject(DataTable dtTable)
{
    _dataService = new DataService();
    _dataTable = dtTable.Copy();
    _dataRows = _dataTable.Select();
}

public MyObject Copy()
{
    MyObject copy = new MyObject(_dataTable);
    return copy;
}

// DataService class
public DataService()
{
    _oleDbConnection = null;
}

You do not appear to wait for the tasks you create to complete. You must either call the Wait method or access the Result property of a generic task to block the calling thread until the work is complete. Try the following:
var tasks = new List<Task>();
for ...
{
    Task compute = Task.Factory.StartNew(() => results.Add(Compute(originalObject)));
    tasks.Add(compute);
}
Task.WaitAll(tasks.ToArray());
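Note also that if results is an ordinary List<T>, calling results.Add from several tasks at once is not thread-safe. A minimal sketch of the same fix using a thread-safe collection (the inputs variable and ReturnObject type are placeholders taken from the question's context, not the original code):
// requires System.Collections.Concurrent
var results = new ConcurrentBag<ReturnObject>(); // thread-safe replacement for a plain List
var tasks = new List<Task>();
foreach (var originalObject in inputs) // "inputs" stands in for whatever drives the original loop
{
    var item = originalObject; // defensive copy of the loop variable for the closure
    tasks.Add(Task.Run(() => results.Add(Compute(item))));
}
Task.WaitAll(tasks.ToArray()); // block until every Compute call has finished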

Related

Forcing certain code to always run on the same thread

We have an old 3rd party system (let's call it Junksoft® 95) that we interface with via PowerShell (it exposes a COM object) and I'm in the process of wrapping it in a REST API (ASP.NET Framework 4.8 and WebAPI 2). I use the System.Management.Automation NuGet package to create a PowerShell instance in which I instantiate Junksoft's COM API as a dynamic object that I then use:
//I'm omitting some exception handling and maintenance code for brevity
powerShell = System.Management.Automation.PowerShell.Create();
powerShell.AddScript("Add-Type -Path C:\Path\To\Junksoft\Scripting.dll");
powerShell.AddScript("New-Object Com.Junksoft.Scripting.ScriptingObject");
dynamic junksoftAPI = powerShell.Invoke()[0];
//Now we issue commands to junksoftAPI like this:
junksoftAPI.Login(user,pass);
int age = junksoftAPI.GetAgeByCustomerId(custId);
List<string> names = junksoftAPI.GetNames();
This works fine when I run all of this on the same thread (e.g. in a console application). However, for some reason this usually doesn't work when I put junksoftAPI into a System.Web.Caching.Cache and use it from different controllers in my web app. I say usually because this actually works when ASP.NET happens to give the incoming call to the thread that junksoftAPI was created on. If it doesn't, Junksoft 95 gives me an error.
Is there any way for me to make sure that all interactions with junksoftAPI happen on the same thread?
Note that I don't want to turn the whole web application into a single-threaded application! The logic in the controllers and elsewhere should happen like normal on different threads. It should only be the Junksoft interactions that happen on the Junksoft-specific thread, something like this:
[HttpGet]
public async Task<IHttpActionResult> GetAge(...)
{
    //finding customer ID in database...
    ...
    int custAge = await Task.Run(() => {
        //this should happen on the Junksoft-specific thread and not the next available thread
        var cache = new System.Web.Caching.Cache();
        var junksoftAPI = cache.Get(...); //This has previously been added to cache on the Junksoft-specific thread
        return junksoftAPI.GetAgeByCustomerId(custId);
    });
    //prepare a response using custAge...
}
You can create your own singleton worker thread to achieve this. Here is the code, which you can plug into your web application.
public class JunkSoftRunner
{
    private static JunkSoftRunner _instance;

    //singleton pattern to restrict all the actions to be executed on a single thread only.
    public static JunkSoftRunner Instance => _instance ?? (_instance = new JunkSoftRunner());

    private readonly SemaphoreSlim _semaphore;
    private readonly AutoResetEvent _newTaskRunSignal;

    private TaskCompletionSource<object> _taskCompletionSource;
    private Func<object> _func;

    private JunkSoftRunner()
    {
        _semaphore = new SemaphoreSlim(1, 1);
        _newTaskRunSignal = new AutoResetEvent(false);
        var contextThread = new Thread(ThreadLooper)
        {
            Priority = ThreadPriority.Highest
        };
        contextThread.Start();
    }

    private void ThreadLooper()
    {
        while (true)
        {
            //wait till the next task signal is received.
            _newTaskRunSignal.WaitOne();

            //next task execution signal is received.
            try
            {
                //try execute the task and get the result
                var result = _func.Invoke();

                //task executed successfully, set the result
                _taskCompletionSource.SetResult(result);
            }
            catch (Exception ex)
            {
                //task execution threw an exception, set the exception and continue with the looper
                _taskCompletionSource.SetException(ex);
            }
        }
    }

    public async Task<TResult> Run<TResult>(Func<TResult> func, CancellationToken cancellationToken = default(CancellationToken))
    {
        //allows only one thread to run at a time.
        await _semaphore.WaitAsync(cancellationToken);

        //thread has acquired the semaphore and entered
        try
        {
            //create new task completion source to wait for func to get executed on the context thread
            _taskCompletionSource = new TaskCompletionSource<object>();

            //set the function to be executed by the context thread
            _func = () => func();

            //signal the waiting context thread that it is time to execute the task
            _newTaskRunSignal.Set();

            //wait and return the result till the task execution is finished on the context/looper thread.
            return (TResult)await _taskCompletionSource.Task;
        }
        finally
        {
            //release the semaphore to allow other threads to acquire it.
            _semaphore.Release();
        }
    }
}
Console Main Method for testing:
public class Program
{
    //testing the junk soft runner
    public static void Main()
    {
        //get the singleton instance
        var softRunner = JunkSoftRunner.Instance;

        //simulate web request on different threads
        for (var i = 0; i < 10; i++)
        {
            var taskIndex = i;

            //launch a web request on a new thread.
            Task.Run(async () =>
            {
                Console.WriteLine($"Task{taskIndex} (ThreadID:'{Thread.CurrentThread.ManagedThreadId})' Launched");
                return await softRunner.Run(() =>
                {
                    Console.WriteLine($"->Task{taskIndex} Completed On '{Thread.CurrentThread.ManagedThreadId}' thread.");
                    return taskIndex;
                });
            });
        }
    }
}
Output:
Notice that, although the function was launched from different threads, that portion of the code was always executed on the same context thread, with ID '5'.
But beware that, although all the web requests are executed on independent threads, they will eventually wait for their work to be executed on the singleton worker thread. This can become a bottleneck in your web application, but that is inherent to this design.
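In a controller, the usage would then look roughly like this (a sketch, not part of the original answer; GetJunksoftApiFromCache is a hypothetical helper that returns the cached dynamic COM wrapper):
[HttpGet]
public async Task<IHttpActionResult> GetAge(int custId)
{
    // Every Junksoft call is funnelled through the singleton worker thread.
    int age = await JunkSoftRunner.Instance.Run(() =>
    {
        var junksoftAPI = GetJunksoftApiFromCache(); // hypothetical helper
        return (int)junksoftAPI.GetAgeByCustomerId(custId);
    });
    return Ok(age);
}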
Here is how you could issue commands to the Junksoft API from a dedicated STA thread, using a BlockingCollection class:
public class JunksoftSTA : IDisposable
{
    private readonly BlockingCollection<Action<Lazy<dynamic>>> _pump;
    private readonly Thread _thread;

    public JunksoftSTA()
    {
        _pump = new BlockingCollection<Action<Lazy<dynamic>>>();
        _thread = new Thread(() =>
        {
            var lazyApi = new Lazy<dynamic>(() =>
            {
                var powerShell = System.Management.Automation.PowerShell.Create();
                powerShell.AddScript(@"Add-Type -Path C:\Path\To\Junksoft.dll");
                powerShell.AddScript("New-Object Com.Junksoft.ScriptingObject");
                dynamic junksoftAPI = powerShell.Invoke()[0];
                return junksoftAPI;
            });
            foreach (var action in _pump.GetConsumingEnumerable())
            {
                action(lazyApi);
            }
        });
        _thread.SetApartmentState(ApartmentState.STA);
        _thread.IsBackground = true;
        _thread.Start();
    }

    public Task<T> CallAsync<T>(Func<dynamic, T> function)
    {
        var tcs = new TaskCompletionSource<T>(
            TaskCreationOptions.RunContinuationsAsynchronously);
        _pump.Add(lazyApi =>
        {
            try
            {
                var result = function(lazyApi.Value);
                tcs.SetResult(result);
            }
            catch (Exception ex)
            {
                tcs.SetException(ex);
            }
        });
        return tcs.Task;
    }

    public Task CallAsync(Action<dynamic> action)
    {
        return CallAsync<object>(api => { action(api); return null; });
    }

    public void Dispose() => _pump.CompleteAdding();
    public void Join() => _thread.Join();
}
The purpose of using the Lazy class is to surface any exception thrown during the construction of the dynamic object by propagating it to the callers.
...exceptions are cached. That is, if the factory method throws an exception the first time a thread tries to access the Value property of the Lazy<T> object, the same exception is thrown on every subsequent attempt.
Usage example:
// A static field stored somewhere
public static readonly JunksoftSTA JunksoftStatic = new JunksoftSTA();
await JunksoftStatic.CallAsync(api => { api.Login("x", "y"); });
int age = await JunksoftStatic.CallAsync(api => api.GetAgeByCustomerId(custId));
In case you find that a single STA thread is not enough to serve all the requests in a timely manner, you could add more STA threads, all of them running the same code (private readonly Thread[] _threads; etc). The BlockingCollection class is thread-safe and can be consumed concurrently by any number of threads.
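A rough sketch of that variation (only the fields and constructor change; CreateJunksoftApi is a hypothetical helper wrapping the PowerShell construction shown above, and Join/Dispose would need to account for all threads):
private readonly Thread[] _threads;

public JunksoftSTA(int threadsCount)
{
    _pump = new BlockingCollection<Action<Lazy<dynamic>>>();
    _threads = new Thread[threadsCount];
    for (int i = 0; i < threadsCount; i++)
    {
        var thread = new Thread(() =>
        {
            // Each consumer thread gets its own lazily created COM instance.
            var lazyApi = new Lazy<dynamic>(() => CreateJunksoftApi()); // hypothetical helper
            foreach (var action in _pump.GetConsumingEnumerable())
            {
                action(lazyApi);
            }
        });
        thread.SetApartmentState(ApartmentState.STA);
        thread.IsBackground = true;
        thread.Start();
        _threads[i] = thread;
    }
}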
If you had not said that this was a 3rd-party tool, I would have assumed it was a GUI class. For practical reasons, it is a very bad idea to have multiple threads write to those, and .NET enforces a strict "only the creating thread shall write" rule from 2.0 onward.
Web servers in general, and ASP.NET in particular, use a pretty big thread pool. We are talking tens to hundreds of threads per core. That means it is really hard to nail any request down to a specific thread; you might as well not try.
Again, looking at the GUI classes might be your best bet. You could basically make a single thread with the sole purpose of imitating a GUI's event queue. The main/UI thread of your average Windows Forms application is responsible for creating every GUI class instance. It is kept alive by polling/processing the event queue, and it ends only when it receives a cancel command via that event queue. Dispatching just puts orders into that queue, so cross-threading issues are avoided.
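If you went that route, a hand-rolled "event queue" thread could be as small as this sketch (essentially the same idea as the BlockingCollection approach above; requires System.Collections.Concurrent and System.Threading):
public sealed class SingleThreadDispatcher
{
    private readonly BlockingCollection<Action> _queue = new BlockingCollection<Action>();

    public SingleThreadDispatcher()
    {
        var thread = new Thread(() =>
        {
            // Process queued "events" one by one, always on this single thread.
            foreach (var work in _queue.GetConsumingEnumerable())
            {
                work();
            }
        });
        thread.IsBackground = true;
        thread.Start();
    }

    // Put an order into the queue; the caller does not wait for it to run.
    public void Post(Action work) => _queue.Add(work);

    // Stop accepting new work; the loop ends once the queue drains.
    public void Shutdown() => _queue.CompleteAdding();
}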

Observable (and cancelable) loop

I'm creating an emulator. The core of the emulation runs in an infinite loop like this:
while (true)
{
    UpdateMachineState();
}
I would like to introduce Reactive Extensions to execute this loop on another thread and to make it cancelable, but I'm completely lost.
Since my emulator is a GUI application (Universal Windows), I don't want to block the UI thread.
It should look like:
...
while (true)
{
    if (machine.IsHalted)
    {
        observer.OnCompleted();
    }
    observer.OnNext(machine.GetState());
    cancellationToken.ThrowIfCancellationRequested();
}
...
The created sequence would eventually complete when the emulator enters the "halted" state. Otherwise, it will keep pushing States (an object that represents its internal state) forever.
I've tried with Observable.Create, but the overload that provides a CancellationToken requires a Task<Action>.
Here's how you do it in Rx:
void Main()
{
    var scheduler = new EventLoopScheduler();

    var loop = scheduler.Schedule(a =>
    {
        UpdateMachineState();
        a();
    });

    Thread.Sleep(1);

    loop.Dispose();
}

public void UpdateMachineState()
{
    Console.Write(".");
}
The overload of .Schedule that I used takes an Action<Action> as the parameter. You simply call the inner action if you want the action to be rescheduled - so the above code effectively creates the infinite loop.
You then call .Dispose() on the return from the .Schedule call to cancel the loop.
Another alternative is to use the .Generate operator:
var scheduler = new EventLoopScheduler();

var query =
    Observable
        .Generate(0, x => true, x => x, x => machine.GetState(), scheduler);

var subscription = query.Subscribe(x => Console.Write("."));

Thread.Sleep(1);

subscription.Dispose();
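To also get the completion behaviour from the question (the sequence ends when the emulator halts), the condition parameter of Generate can check IsHalted. A sketch, assuming the machine object and UpdateMachineState method from the question:
var scheduler = new EventLoopScheduler();

var states =
    Observable
        .Generate(
            0,                                         // dummy loop state
            _ => !machine.IsHalted,                    // keep going until the emulator halts
            x => { UpdateMachineState(); return x; },  // advance the emulation between values
            _ => machine.GetState(),                   // push the current machine state
            scheduler);                                // run the loop off the UI thread

var subscription = states.Subscribe(
    state => { /* consume the state, e.g. update the UI */ },
    () => Console.WriteLine("Emulator halted."));      // OnCompleted fires once IsHalted is true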

Wait for dynamically created Tasks until they are completed

My program is creating some Tasks to do some work.
Now I want to wait for all tasks to complete before I exit my program.
I know the Task.WaitAll(...) method, but in my case the Tasks are created dynamically --> they don't all exist yet when Task.WaitAll() is called.
My question now is:
how can I wait for all running Tasks until they are completed?
This is what I'm doing now.
It works, but I want to be sure this is also a good way to do it.
public class Test
{
    public ConcurrentBag<Task> RunningTasks { get; set; }

    public Test()
    {
        RunningTasks = new ConcurrentBag<Task>();
    }

    public void RunIt()
    {
        //...create some Tasks asynchronously so they may be created in a few seconds and add them to the list like
        //RunningTasks.Add(...);

        //Just wait until all tasks finished.
        var waitTask = Task.Run(() =>
        {
            while (true)
            {
                if (RunningTasks.All(t => t.IsCompleted))
                    break;
                Thread.Sleep(1000);
            }
        });
        waitTask.Wait();
    }
}
Task.WhenAll() will take an array of Task, and ConcurrentBag<T> provides a ToArray() method, so you can simply do:
await Task.WhenAll(RunningTasks.ToArray())
or if you don't want to use the await keyword:
Task.WhenAll(RunningTasks.ToArray()).Wait()
UPDATE
So if your RunningTasks collection keeps changing after the initial call, you can handle that pretty easily:
while (RunningTasks.Any(t => !t.IsCompleted))
{
    await Task.WhenAll(RunningTasks.ToArray());
}
The main benefit over your method is that this yields the thread to other work while it waits, whereas your code ties up a thread in a sleep state until all the tasks are completed.
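Put together, the waiting part could then look roughly like this (a sketch, assuming tasks keep being added to the RunningTasks bag while it waits):
public async Task WaitForAllAsync()
{
    // Re-check the bag after every await, in case more tasks were added
    // while the previous WhenAll was still in flight.
    while (RunningTasks.Any(t => !t.IsCompleted))
    {
        await Task.WhenAll(RunningTasks.ToArray());
    }
}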

Is there a need to define an int variable specially for access from different threads?

I have a function that receives a list of files and does work in separate threads:
private static CancellationTokenSource _tokenSource;
private static int _filesInProcess;
private static int _filesFinished;
private IEnumerable<Tuple<int, string>> _indexedSource;

public static int FilesInProcess
{
    get { return _filesInProcess; }
    set { _filesInProcess = value; }
}

public static int FilesFinished
{
    get { return _filesFinished; }
    set { _filesFinished = value; }
}

public void DoWork(int parallelThreads)
{
    _filesInProcess = 0;
    _filesFinished = 0;
    _tokenSource = new CancellationTokenSource();
    var token = _tokenSource.Token;

    Task.Factory.StartNew(() =>
    {
        try
        {
            Parallel.ForEach(_indexedSource,
                new ParallelOptions
                {
                    MaxDegreeOfParallelism = parallelThreads //limit number of parallel threads
                },
                file =>
                {
                    if (token.IsCancellationRequested)
                        return;
                    //do work...
                });
        }
        catch (Exception)
        { }
    }, _tokenSource.Token).ContinueWith(
        t =>
        {
            //finish...
        }
        , TaskScheduler.FromCurrentSynchronizationContext() //to ContinueWith (update UI) from UI thread
    );
}
As you can see, I have two variables that indicate how many files have already started and how many have finished (_filesInProcess and _filesFinished).
My questions:
Do I need to do anything special so these variables can be safely accessed from different threads, or is this OK?
After the function has finished and all my files have finished playing and I want to start a new run, is there an option to do this from the Task class, or will a simple while loop do the work for me?
1. Do I need to do anything special so these variables can be safely accessed from different threads, or is this OK?
Yes, you do. A couple of things: you should add the volatile keyword to the declarations of the counters, like
private static volatile int _filesInProcess;
This ensures that all reads actually read the current value rather than a cached one. If you want to modify and read the counters, you should consider using the Interlocked class, for example Interlocked.Increment.
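For example, the counters could be updated from the parallel loop like this (a sketch, not the original code; ProcessFile is a hypothetical per-file worker, and with Interlocked plus Volatile.Read the fields do not also need to be marked volatile):
private static int _filesInProcess;
private static int _filesFinished;

private static void ProcessFile(Tuple<int, string> file) // hypothetical per-file worker
{
    Interlocked.Increment(ref _filesInProcess);  // atomically record that a file has started
    //do work...
    Interlocked.Increment(ref _filesFinished);   // atomically record that the file has finished
}

private static int ReadFilesFinished() => Volatile.Read(ref _filesFinished); // safe read, e.g. for UI progress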
2. After the function has finished and all my files have finished playing and I want to start a new run, is there an option to do this from the Task class, or will a simple while loop do the work for me?
Not sure about this one, wild guess (it's not clear what you need). You can use task continuations, as you did in the last block of code. As an alternative, Task.Factory.StartNew returns a task, which you could save to a member variable and start as you please (say, on a button click). You may need to update the code slightly, as Task.Factory.StartNew starts the task immediately, while you may only want to create a task and run it later on an event.
Based on your comment, you can do something like this (coded in Notepad):
private Task _task; // have a local task variable

// move your work here, effectively what you have in Task.Factory.StartNew(...)
public void SetupWork()
{
    _task = new Task(/* your work here */);
    // see, I don't start this task here
    // ...
}

// Call this when you need to start/restart work
public void RunWork()
{
    _task.Start();
}

Running multiple Tasks reuses the same object instance

Here's an interesting one. I have a service creating a bunch of Tasks. At the moment only two tasks are configured in the list. However, if I put a breakpoint within the Task action and inspect the value of schedule.Name, it is hit twice with the same schedule name. However, two separate schedules are configured and in the schedule list. Can anyone explain why the Task reuses the last schedule in the loop? Is this a scope issue?
// make sure that we can log any exceptions thrown by the tasks
TaskScheduler.UnobservedTaskException += new EventHandler<UnobservedTaskExceptionEventArgs>(TaskScheduler_UnobservedTaskException);

// kick off all enabled tasks
foreach (IJobSchedule schedule in _schedules)
{
    if (schedule.Enabled)
    {
        Task.Factory.StartNew(() =>
        {
            // breakpoint at line below. Inspecting "schedule.Name" always returns the name
            // of the last schedule in the list. List contains 2 separate schedule items.
            IJob job = _kernel.Get<JobFactory>().CreateJob(schedule.Name);
            JobRunner jobRunner = new JobRunner(job, schedule);
            jobRunner.Run();
        },
        CancellationToken.None,
        TaskCreationOptions.LongRunning,
        TaskScheduler.Default
        );
    }
} // next schedule
If you use a temporary variable inside the foreach loop, it should solve your issue.
foreach (IJobSchedule schedule in _schedules)
{
    var tmpSchedule = schedule;
    if (tmpSchedule.Enabled)
    {
        Task.Factory.StartNew(() =>
        {
            // the lambda now captures the per-iteration copy, so each task
            // sees the schedule it was created for.
            IJob job = _kernel.Get<JobFactory>().CreateJob(tmpSchedule.Name);
            JobRunner jobRunner = new JobRunner(job, tmpSchedule);
            jobRunner.Run();
        },
        CancellationToken.None,
        TaskCreationOptions.LongRunning,
        TaskScheduler.Default
        );
    }
} // next schedule
For further reference about closures and loop variables (note that the foreach loop variable became per-iteration in C# 5, which avoids this particular issue in newer compilers), see
Closing over the loop variable considered harmful
