I've been attempting to use SemaphoreSlim to limit the number of tasks I have running concurrently at any one time, but it seems to have no effect, most likely because of my implementation, which is why I'm here. My SemaphoreSlim code looks like this:
First it's called by
await Task.Run(() => mc.StartAsync());
Calling this method
public async Task StartAsync()
{
using (SemaphoreSlim concurrencySemaphore = new SemaphoreSlim(5))
{
foreach (var task in ThreadHandler.ThreadList)
{
await concurrencySemaphore.WaitAsync();
try
{
await Task.Run(() => task.Operation.Start());
}
finally
{
concurrencySemaphore.Release();
}
}
}
}
This in turn starts a task from a list. The task lives in a custom model and was created and stored earlier, like this:
public Task Operation { get; set; }
var t = new Task(async () =>
{
await Task.Run(() => Method(input, sps));
});
Bearing in mind that my code doesn't work as I'd expect, is this the correct way to start something like this? I suspect launching three nested tasks from a single point isn't a good idea. The main reason it's like this is that I'm executing an Action<> and couldn't figure out a way to await it on its own. Do these tasks count towards the SemaphoreSlim limit?
After various tests I can confirm that my SemaphoreSlim code just keeps launching tasks. I added a large delay to one of the tasks in the list to see if I could stop it from executing, which worked, but new tasks were still launched... what am I missing?
My goal is to have a limit on the number of tasks concurrently running, if that wasn't clear. Thank you for any assistance!
EDIT: I think I've realised I'm only awaiting the starting of the task, not the completion.
I think I've realised I'm only awaiting the starting of the task, not the completion.
Indeed, that is the core of the problem.
You shouldn't use the Task constructor, ever, at all, for anything. Just pretend it doesn't exist. It will always lead you down an awkward path.
If you have an action you want to perform at a later time, you should use a delegate: Action or Func<T> for synchronous work, and Func<Task> or Func<Task<T>> for asynchronous work. E.g., if Method is synchronous, then you would have:
public Action Operation { get; set; }
...
Operation = () => Method(input, sps);
Then you can invoke it using Task.Run as such:
public async Task ProcessAsync()
{
using (SemaphoreSlim concurrencySemaphore = new SemaphoreSlim(5))
{
var tasks = ThreadHandler.ThreadList.Select(async task =>
{
await concurrencySemaphore.WaitAsync();
try
{
await Task.Run(() => task.Operation());
}
finally
{
concurrencySemaphore.Release();
}
}).ToList();
await Task.WhenAll(tasks);
}
}
The above code will work fine if Operation is Action (synchronous) or Func<Task> (asynchronous).
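For the asynchronous case, a minimal sketch might look like this (MethodAsync is a hypothetical awaitable counterpart of Method, not something from the question):
public Func<Task> Operation { get; set; }
...
// MethodAsync stands in for whatever truly asynchronous work you have
Operation = () => MethodAsync(input, sps);
Since the delegate already returns a Task, you could also await task.Operation() directly; Task.Run(() => task.Operation()) works as well, because that overload unwraps the inner task.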
However, if it is Action (i.e., synchronous), then what you're really doing is parallel processing, not asynchronous concurrency, and there are built-in types that can help with that:
public void Process()
{
// Only valid if Operation is Action, not Func<Task>!
Parallel.ForEach(
ThreadHandler.ThreadList,
new ParallelOptions { MaxDegreeOfParallelism = 5 },
task => task.Operation());
}
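As an aside (not part of the original answer): if Operation is Func<Task> and you're on .NET 6 or later, Parallel.ForEachAsync gives you similar built-in throttling for asynchronous work. A rough sketch:
public Task ProcessAsync()
{
// asynchronous counterpart of Parallel.ForEach; limits to 5 concurrent operations
return Parallel.ForEachAsync(
ThreadHandler.ThreadList,
new ParallelOptions { MaxDegreeOfParallelism = 5 },
async (task, cancellationToken) => await task.Operation());
}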
Related
For example, the following code manually instantiates a Task and passes it to Task.WhenAll in a List<T>:
public async Task Do3()
{
var task1 = new Task(async () => { await Task.Delay(2000); Console.WriteLine("########## task1"); });
var taskList = new List<Task>() { task1};
taskList[0].Start();
var taskDone = Task.WhenAll(taskList);
await taskDone;
}
Without starting the Task it doesn't work; it hangs forever when called from a console app. But the code below works just fine without being started:
public async Task Do3()
{
//var task1 = new Task(async () => { await Task.Delay(2000); Console.WriteLine("########## task1"); });
var taskList = new List<Task>() { SubDo1() };
//taskList[0].Start();
var taskDone = Task.WhenAll(taskList);
await taskDone;
}
public async Task SubDo1()
{
await Task.Delay(2000);
Console.WriteLine("########## task1");
}
Task is used in two completely different ways here. When you call an async method, you are starting it yourself; at this point, one of two things can happen:
it can run to completion (eventually) without ever reaching a truly asynchronous state, and return a completed (or faulted) task to the caller
it can reach an incomplete awaitable (in this case await Task.Delay), at which point it creates a state machine that represents the current position, schedules a completion operation on that incomplete awaitable (to do whatever comes next), and then returns an incomplete task to the caller
It is not "not started"; to return anything to the caller: we have started it. However, unlike Task.Start(), we start that work on our current thread - not an external worker thread - with other threads only getting involved based on how that incomplete awaitable schedules the completion callbacks that the compiler gives it.
This is very different to the new Task(...) scenario, where nothing is initially started. That's why they behave differently. Note also the Remarks section of the Task constructor here - it is a very niche API, and honestly: not hugely recommended.
Additionally: when you don't immediately await an async method, you're essentially going into concurrent territory (assuming the awaitable won't always complete synchronously). In some cases, this matters, and may cause threading problems re race-conditions. It shouldn't matter much in this case, though.
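To make this concrete, here's a minimal sketch (mine, not from the original post) of calling an async method without awaiting it immediately:
Task pending = SubDo1(); // SubDo1 starts running here, up to its first incomplete await
DoSomethingElse(); // hypothetical other work; runs concurrently with the rest of SubDo1
await pending; // only observes completion; the method was already started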
I have a few methods that report some data to a database. We want to invoke all calls to the data service asynchronously. These calls to the data service are scattered all over the code, so we want to make sure the DS calls are executed one after another, in order, at any given time. Initially I was using async/await on each of these methods, and each call executed asynchronously, but we found out that if they run out of sequence there is room for errors.
So I thought we should queue all these asynchronous tasks and send them on a separate thread, but I want to know what options we have. I came across SemaphoreSlim. Will this be appropriate in my use case?
Or what other options would suit my use case? Please guide me.
This is what I have in my code currently:
public static SemaphoreSlim mutex = new SemaphoreSlim(1);
//first DS call
public async Task SendModuleDataToDSAsync(Module parameters)
{
var tasks1 = new List<Task>();
var tasks2 = new List<Task>();
//await mutex.WaitAsync(); **//is this correct way to use SemaphoreSlim ?**
foreach (var setting in Module.param)
{
Task job1 = SaveModule(setting);
tasks1.Add(job1);
Task job2= SaveModule(GetAdvancedData(setting));
tasks2.Add(job2);
}
await Task.WhenAll(tasks1);
await Task.WhenAll(tasks2);
//mutex.Release(); // **is this correct?**
}
private async Task SaveModule(Module setting)
{
await Task.Run(() =>
{
// Invokes Calls to DS
...
});
}
//somewhere down the main thread, invoking second call to DS
//Second DS Call
private async Task SendInstrumentSettingsToDS(<param1>, <param2>)
{
//await mutex.WaitAsync();// **is this correct?**
await Task.Run(() =>
{
//TrackInstrumentInfoToDS
//mutex.Release();// **is this correct?**
});
if(param2)
{
await Task.Run(() =>
{
//TrackParam2InstrumentInfoToDS
});
}
}
Initially I was using async/await on each of these methods, and each call executed asynchronously, but we found out that if they run out of sequence there is room for errors.
So I thought we should queue all these asynchronous tasks and send them on a separate thread, but I want to know what options we have. I came across SemaphoreSlim.
SemaphoreSlim does restrict asynchronous code to running one at a time, and is a valid form of mutual exclusion. However, since "out of sequence" calls can cause errors, then SemaphoreSlim is not an appropriate solution since it does not guarantee FIFO.
In a more general sense, no synchronization primitive guarantees FIFO because that can cause problems due to side effects like lock convoys. On the other hand, it is natural for data structures to be strictly FIFO.
So, you'll need to use your own FIFO queue, rather than having an implicit execution queue. Channels is a nice, performant, async-compatible queue, but since you're on an older version of C#/.NET, BlockingCollection<T> would work:
public sealed class ExecutionQueue
{
private readonly BlockingCollection<Func<Task>> _queue = new BlockingCollection<Func<Task>>();
public ExecutionQueue() => Completion = Task.Run(() => ProcessQueueAsync());
public Task Completion { get; }
public void Complete() => _queue.CompleteAdding();
private async Task ProcessQueueAsync()
{
foreach (var value in _queue.GetConsumingEnumerable())
await value();
}
}
The only tricky part with this setup is how to queue work. From the perspective of the code queueing the work, they want to know when the lambda is executed, not when the lambda is queued. From the perspective of the queue method (which I'm calling Run), the method needs to complete its returned task only after the lambda is executed. So, you can write the queue method something like this:
public Task Run(Func<Task> lambda)
{
var tcs = new TaskCompletionSource<object>();
_queue.Add(async () =>
{
// Execute the lambda and propagate the results to the Task returned from Run
try
{
await lambda();
tcs.TrySetResult(null);
}
catch (OperationCanceledException ex)
{
tcs.TrySetCanceled(ex.CancellationToken);
}
catch (Exception ex)
{
tcs.TrySetException(ex);
}
});
return tcs.Task;
}
This queueing method isn't as perfect as it could be. If a task completes with more than one exception (this is normal for parallel code), only the first one is retained (this is normal for async code). There's also an edge case around OperationCanceledException handling. But this code is good enough for most cases.
Now you can use it like this:
public static ExecutionQueue _queue = new ExecutionQueue();
public async Task SendModuleDataToDSAsync(Module parameters)
{
var tasks1 = new List<Task>();
var tasks2 = new List<Task>();
foreach (var setting in Module.param)
{
Task job1 = _queue.Run(() => SaveModule(setting));
tasks1.Add(job1);
Task job2 = _queue.Run(() => SaveModule(GetAdvancedData(setting)));
tasks2.Add(job2);
}
await Task.WhenAll(tasks1);
await Task.WhenAll(tasks2);
}
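As an addition (not part of the original answer): on a runtime where System.Threading.Channels is available, roughly the same execution queue could be sketched like this, with the Run method above layered on top unchanged:
public sealed class ChannelExecutionQueue
{
private readonly Channel<Func<Task>> _queue = Channel.CreateUnbounded<Func<Task>>();
public ChannelExecutionQueue() => Completion = Task.Run(() => ProcessQueueAsync());
public Task Completion { get; }
public void Complete() => _queue.Writer.Complete();
// TryWrite always succeeds on an unbounded channel, so the bool result is discarded
public void Add(Func<Task> lambda) => _queue.Writer.TryWrite(lambda);
private async Task ProcessQueueAsync()
{
// ReadAllAsync yields queued items in FIFO order until Complete() is called
await foreach (var value in _queue.Reader.ReadAllAsync())
await value();
}
}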
Here's a compact solution with the fewest moving parts that still guarantees FIFO ordering (unlike some of the suggested SemaphoreSlim solutions). There are two overloads of Enqueue so you can enqueue tasks with and without return values.
using System;
using System.Threading;
using System.Threading.Tasks;
public class TaskQueue
{
private Task _previousTask = Task.CompletedTask;
public Task Enqueue(Func<Task> asyncAction)
{
return Enqueue(async () => {
await asyncAction().ConfigureAwait(false);
return true;
});
}
public async Task<T> Enqueue<T>(Func<Task<T>> asyncFunction)
{
var tcs = new TaskCompletionSource(TaskCreationOptions.RunContinuationsAsynchronously);
// get predecessor and wait until it's done. Also atomically swap in our own completion task.
await Interlocked.Exchange(ref _previousTask, tcs.Task).ConfigureAwait(false);
try
{
return await asyncFunction().ConfigureAwait(false);
}
finally
{
tcs.SetResult();
}
}
}
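A hypothetical usage example (mine, not from the original answer), reusing SaveModule and setting from the question; note that the non-generic TaskCompletionSource and its parameterless SetResult() used above require .NET 5 or later:
var queue = new TaskQueue();
// the two operations start strictly one after the other, in the order they were enqueued
Task first = queue.Enqueue(() => SaveModule(setting));
Task<int> second = queue.Enqueue(async () => { await Task.Delay(100); return 42; });
await Task.WhenAll(first, second);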
Please keep in mind that your first solution, which queues all the tasks into lists, doesn't ensure that the tasks are executed one after another. They all run in parallel, because none of them is awaited before the next task is started.
So yes, you have to use a SemaphoreSlim for async locking and awaiting. A simple implementation might be:
private readonly SemaphoreSlim _syncRoot = new SemaphoreSlim(1);
public async Task SendModuleDataToDSAsync(Module parameters)
{
await this._syncRoot.WaitAsync();
try
{
foreach (var setting in Module.param)
{
await SaveModule(setting);
await SaveModule(GetAdvancedData(setting));
}
}
finally
{
this._syncRoot.Release();
}
}
If you can use Nito.AsyncEx the code can be simplified to:
public async Task SendModuleDataToDSAsync(Module parameters)
{
using var lockHandle = await this._syncRoot.LockAsync();
foreach (var setting in Module.param)
{
await SaveModule(setting);
await SaveModule(GetAdvancedData(setting));
}
}
One option is to queue operations that will create tasks instead of queuing already running tasks as the code in the question does.
Pseudocode, without locking:
Queue<Func<Task>> tasksQueue = new Queue<Func<Task>>();
async Task RunAllTasks()
{
while (tasksQueue.Count > 0)
{
var taskCreator = tasksQueue.Dequeue(); // get the next creator
var task = taskCreator(); // starting one task at a time here
await task; // wait till task completes
}
}
// note that declaring createSaveModuleTask does not
// start SaveModule task - it will only happen after this func is invoked
// inside RunAllTasks
Func<Task> createSaveModuleTask = () => SaveModule(setting);
tasksQueue.Enqueue(createSaveModuleTask);
tasksQueue.Enqueue(() => SaveModule(GetAdvancedData(setting)));
// no DB operations started at this point
// this will start tasks from the queue one by one.
await RunAllTasks();
Using a ConcurrentQueue would likely be the right thing in actual code. You would also need to know the total number of expected operations, so you can stop once all of them have been started and awaited one after another.
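A rough sketch of the same loop with a ConcurrentQueue (my addition, not the answerer's code): TryDequeue replaces the Count/Dequeue pair so concurrent producers can keep enqueuing safely.
var tasksQueue = new ConcurrentQueue<Func<Task>>();
tasksQueue.Enqueue(() => SaveModule(setting));
tasksQueue.Enqueue(() => SaveModule(GetAdvancedData(setting)));
async Task RunAllTasks()
{
while (tasksQueue.TryDequeue(out var taskCreator))
{
await taskCreator(); // still one operation at a time, in FIFO order
}
}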
Building on your comment under Alexei's answer, your approach with the SemaphoreSlim is correct.
Assuming that the methods SendInstrumentSettingsToDS and SendModuleDataToDSAsync are members of the same class, you simply need an instance variable for a SemaphoreSlim, and then at the start of each method that needs synchronization call await _semaphore.WaitAsync() and call _semaphore.Release() in the finally block.
private readonly SemaphoreSlim _semaphore = new SemaphoreSlim(1);
public async Task SendModuleDataToDSAsync(Module parameters)
{
await _semaphore.WaitAsync();
try
{
...
}
finally
{
_semaphore.Release();
}
}
private async Task SendInstrumentSettingsToDS(<param1>, <param2>)
{
await _semaphore.WaitAsync();
try
{
...
}
finally
{
_semaphore.Release();
}
}
It is important that the call to _semaphore.Release() is in the finally block, so that the semaphore is released even if an exception is thrown somewhere in the try block.
I'm struggling to understand what's happening in this simple program.
In the example below I have a task factory that uses the LimitedConcurrencyLevelTaskScheduler from ParallelExtensionsExtras with maxDegreeOfParallelism set to 2.
I then start 2 tasks that each call an async method (e.g. an async HTTP request), then get the awaiter and the result of the completed task.
The problem seem to be that Task.Delay(2000) never completes. If I set maxDegreeOfParallelism to 3 (or greater) it completes. But with maxDegreeOfParallelism = 2 (or less) my guess is that there is no thread available to complete the task. Why is that?
It seems to be related to async/await since if I remove it and simply do Task.Delay(2000).GetAwaiter().GetResult() in DoWork it works perfectly. Does async/await somehow use the parent task's task scheduler, or how is it connected?
using System;
using System.Linq;
using System.Threading.Tasks;
using System.Threading.Tasks.Schedulers;
namespace LimitedConcurrency
{
class Program
{
static void Main(string[] args)
{
var test = new TaskSchedulerTest();
test.Run();
}
}
class TaskSchedulerTest
{
public void Run()
{
var scheduler = new LimitedConcurrencyLevelTaskScheduler(2);
var taskFactory = new TaskFactory(scheduler);
var tasks = Enumerable.Range(1, 2).Select(id => taskFactory.StartNew(() => DoWork(id)));
Task.WaitAll(tasks.ToArray());
}
private void DoWork(int id)
{
Console.WriteLine($"Starting Work {id}");
HttpClientGetAsync().GetAwaiter().GetResult();
Console.WriteLine($"Finished Work {id}");
}
async Task HttpClientGetAsync()
{
await Task.Delay(2000);
}
}
}
Thanks in advance for any help
await by default captures the current context and uses that to resume the async method. This context is SynchronizationContext.Current, unless it is null, in which case it is TaskScheduler.Current.
In this case, await is capturing the LimitedConcurrencyLevelTaskScheduler used to execute DoWork. So, after starting the Task.Delay both times, both of those threads are blocked (due to the GetAwaiter().GetResult()). When the Task.Delay completes, the await schedules the remainder of the HttpClientGetAsync method to its context. However, the context will not run it since it already has 2 threads.
So you end up with threads blocked in the context until their async methods complete, but the async methods cannot complete until there is a free thread in the context; thus a deadlock. Very similar to the standard "don't block on async code" style of deadlock, just with n threads instead of one.
Clarifications:
The problem seem to be that Task.Delay(2000) never completes.
Task.Delay is completing, but the await cannot continue executing the async method.
If I set maxDegreeOfParallelism to 3 (or greater) it completes. But with maxDegreeOfParallelism = 2 (or less) my guess is that there is no thread available to complete the task. Why is that?
There are plenty of threads available. But the LimitedConcurrencyTaskScheduler only allows 2 threads at a time to run in its context.
It seems to be related to async/await since if I remove it and simply do Task.Delay(2000).GetAwaiter().GetResult() in DoWork it works perfectly.
Yes; it's the await that is capturing the context. Task.Delay does not capture a context internally, so it can complete without needing to enter the LimitedConcurrencyTaskScheduler.
Solution:
Task schedulers in general do not work very well with asynchronous code. This is because task schedulers were designed for Parallel Tasks rather than asynchronous tasks. So they only apply when code is running (or blocked). In this case, LimitedConcurrencyLevelTaskScheduler only "counts" code that's running; if you have a method that's doing an await, it won't "count" against that concurrency limit.
So, your code has ended up in a situation where it has the sync-over-async antipattern, probably because someone was trying to avoid the problem of await not working as expected with limited concurrency task schedulers. This sync-over-async antipattern has then caused the deadlock problem.
Now, you could add in more hacks by using ConfigureAwait(false) everywhere and continue blocking on asynchronous code, or you could fix it better.
A more proper fix would be to do asynchronous throttling. Toss out the LimitedConcurrencyLevelTaskScheduler completely; concurrency-limiting task schedulers only work with synchronous code, and your code is asynchronous. You can do asynchronous throttling using SemaphoreSlim, as such:
class TaskSchedulerTest
{
private readonly SemaphoreSlim _mutex = new SemaphoreSlim(2);
public async Task RunAsync()
{
var tasks = Enumerable.Range(1, 2).Select(id => DoWorkAsync(id));
await Task.WhenAll(tasks);
}
private async Task DoWorkAsync(int id)
{
await _mutex.WaitAsync();
try
{
Console.WriteLine($"Starting Work {id}");
await HttpClientGetAsync();
Console.WriteLine($"Finished Work {id}");
}
finally
{
_mutex.Release();
}
}
async Task HttpClientGetAsync()
{
await Task.Delay(2000);
}
}
I think you are encountering a sync deadlock: you are waiting for a thread to complete that is itself waiting for your thread to complete, which is never going to happen. If you make your DoWork method async so you can await the HttpClientGetAsync() call, you'll avoid the deadlock.
using MassTransit.Util;
using System;
using System.Linq;
using System.Threading.Tasks;
//using System.Threading.Tasks.Schedulers;
namespace LimitedConcurrency
{
class Program
{
static void Main(string[] args)
{
var test = new TaskSchedulerTest();
test.Run();
}
}
class TaskSchedulerTest
{
public void Run()
{
var scheduler = new LimitedConcurrencyLevelTaskScheduler(2);
var taskFactory = new TaskFactory(scheduler);
// Unwrap so that WaitAll waits for the inner async DoWork task, not just the outer wrapper from StartNew
var tasks = Enumerable.Range(1, 2).Select(id => taskFactory.StartNew(() => DoWork(id)).Unwrap());
Task.WaitAll(tasks.ToArray());
}
private async Task DoWork(int id)
{
Console.WriteLine($"Starting Work {id}");
await HttpClientGetAsync();
Console.WriteLine($"Finished Work {id}");
}
async Task HttpClientGetAsync()
{
await Task.Delay(2000);
}
}
}
https://medium.com/rubrikkgroup/understanding-async-avoiding-deadlocks-e41f8f2c6f5d
TL;DR: never call .Result, which is essentially what .GetAwaiter().GetResult() was doing here.
I'm stuck on something I thought was so simple that it's driving me mad.
I need to declare a Task at some point and run it later, so I thought of:
Task T1 { get; set; }
public async Task CreateAndAwaitAsync()
{
T1 = new Task(() => {
// time consuming work like:
System.Threading.Thread.Sleep(1000);
});
await T1;
}
Of course the body of the lambda AND the method are just for the sake of this example (as I said, I need to run it later), but no matter what, with await T1 execution just never makes it into the lambda! What am I missing?? I feel stupid, because I have been using the async/await paradigm for quite a few years already and didn't even imagine this wouldn't work!
I thought this could be answered in a comment, but it's better to provide more info.
await means "wait for this task to complete, then do the rest of the work". The Task from your example is not started; await will not start it for you, so the whole method will just get stuck at the await until you start the task.
Even with your current code, if you later call T1.Start(), it will run your lambda, and after it has completed, the task returned by CreateAndAwaitAsync will also complete.
Otherwise, either start the task right when creating it (Task.Run) or just return the Task directly without any async/await:
public Task CreateAndAwaitAsync()
{
T1 = new Task(() => {
// time consuming work like:
System.Threading.Thread.Sleep(1000);
});
// if you need to start it somewhere, do T1.Start();
return T1;
}
Like #Evk said, you need to call T1.Start()
public async Task CreateAndAwaitAsync()
{
T1 = new Task(() =>
{
// time consuming work like:
System.Threading.Thread.Sleep(1000);
});
T1.Start();
await T1;
}
You are not starting the task.
Example:
public async Task DoSomething()
{
await Task.Run(() => SomeMethod());
}
Unless you are doing CPU-bound work you almost never need to call Task.Run but can just use async/await like:
public async Task DelayMe()
{
await Task.Delay(1000);
await MoreAsyncThings();
}
Task T1 { get; set; }
public async Task CreateAndAwaitAsync()
{
T1 = DelayMe();
await T1;
}
Update:
To store code for later execution I would propose simply using a lambda like:
Func<Task> SomeWorkFactory()
{
return async () =>
{
await AsyncStuff();
// More Stuff
};
}
var deferredWork = SomeWorkFactory();
// later, somewhere else
await deferredWork();
So basically, calling SomeWorkFactory creates the work package, but only invoking the Func executes it and returns an awaitable Task.
Maybe relevant:
If you provide methods which do CPU-bound work (like calculating something as opposed to waiting for a callback) you should not offer an async signature at all but leave the execution mode to the caller. Mainly because asynchronism and concurrency are different concepts. Mads Torgersen gave some insights on this here: https://channel9.msdn.com/Events/TechEd/NorthAmerica/2014/DEV-B362 and calls it "library methods should not lie" (From minute 42)
You're only creating a Task; it isn't running, even with the await there. Try calling Start() on it, or assign T1 = Task.Run(...) with your lambda, and then it should work when you await it.
I am not an advanced developer. I'm just trying to get a hold of the Task library by googling around. I've never used the SemaphoreSlim class, so I would like to know what it does. Below is code where SemaphoreSlim is used with async and await, but which I do not understand. Could someone help me understand the two sets of code below?
1st set of code
await WorkerMainAsync();
async Task WorkerMainAsync()
{
SemaphoreSlim ss = new SemaphoreSlim(10);
while (true)
{
await ss.WaitAsync();
// you should probably store this task somewhere and then await it
var task = DoPollingThenWorkAsync();
}
}
async Task DoPollingThenWorkAsync(SemaphoreSlim semaphore)
{
var msg = Poll();
if (msg != null)
{
await Task.Delay(3000); // process the I/O-bound job
}
// this assumes you don't have to worry about exceptions
// otherwise consider try-finally
semaphore.Release();
}
Firstly, the WorkerMainAsync will be called and a SemaphoreSlim is used. Why is 10 passed to the constructor of SemaphoreSlim?
When does the control come out of the while loop again?
What does ss.WaitAsync(); do?
The DoPollingThenWorkAsync() function is expecting a SemaphoreSlim but is not passed anything when it is called. Is this a typo?
Why is await Task.Delay(3000); used?
They could simply use Task.Delay(3000) but why do they use await here instead?
2nd set of code for same purpose
async Task WorkerMainAsync()
{
SemaphoreSlim ss = new SemaphoreSlim(10);
List<Task> trackedTasks = new List<Task>();
while (DoMore())
{
await ss.WaitAsync();
trackedTasks.Add(Task.Run(() =>
{
DoPollingThenWorkAsync();
ss.Release();
}));
}
await Task.WhenAll(trackedTasks);
}
void DoPollingThenWorkAsync()
{
var msg = Poll();
if (msg != null)
{
Thread.Sleep(2000); // process the long running CPU-bound job
}
}
Here a task (together with the ss.Release() call) is added to a list. I really do not understand how tasks can run after just being added to a list?
trackedTasks.Add(Task.Run(async () =>
{
await DoPollingThenWorkAsync();
ss.Release();
}));
I am looking forward to a good explanation and help understanding the two sets of code. Thanks!
Why is 10 passed to the constructor of SemaphoreSlim?
They are using SemaphoreSlim to limit to 10 tasks at a time. The semaphore is "taken" before each task is started, and each task "releases" it when it finishes. For more about semaphores, see MSDN.
They could simply use Task.Delay(3000) but why do they use await here instead?
Task.Delay creates a task that completes after the specified time interval and returns it. Like most Task-returning methods, Task.Delay returns immediately; it is the returned Task that has the delay. So if the code did not await it, there would be no delay.
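To illustrate (my example, not code from the question):
Task delay = Task.Delay(3000); // returns immediately; the 3-second countdown runs in the background
DoOtherWork(); // hypothetical work; executes right away, not after 3 seconds
await delay; // only here does the method pause until the delay has elapsed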
I really do not understand how tasks can run after just being added to a list?
In the Task-based Asynchronous Pattern, Task objects are returned "hot". This means they're already running by the time they're returned. The await Task.WhenAll at the end is waiting for them all to complete.