Waiting asynchronously for some tasks to complete (Task.WhenSome) - C#

I am writing a service that combines data from various internet sources and generates a response on the fly. Speed is more important than completeness, so I would like to generate my response as soon as some (not all) of the internet sources have responded. Typically my service creates 10 concurrent web requests and should stop waiting and start processing after 5 of them have completed. Neither the .NET Framework nor any third-party library I am aware of offers this functionality, so I'll probably have to write it myself. The method I am trying to implement has the following signature:
public static Task<TResult[]> WhenSome<TResult>(int atLeast, params Task<TResult>[] tasks)
{
    // TODO
}
Contrary to how Task.WhenAny works, exceptions should be swallowed, provided that the required number of results have been acquired. If however, after completion of all tasks, there are not enough gathered results, then an AggregateException should be thrown propagating all exceptions.
Usage example:
var tasks = new Task<int>[]
{
    Task.Delay(100).ContinueWith<int>(_ => throw new ApplicationException("Oops!")),
    Task.Delay(200).ContinueWith(_ => 10),
    Task.Delay(Timeout.Infinite).ContinueWith(_ => 0,
        new CancellationTokenSource(300).Token),
    Task.Delay(400).ContinueWith(_ => 20),
    Task.Delay(500).ContinueWith(_ => 30),
};
var results = await WhenSome(2, tasks);
Console.WriteLine($"Results: {String.Join(", ", results)}");
Expected output:
Results: 10, 20
In this example the last task returning the value 30 should be ignored (not even awaited), because we have already acquired the number of results we want (2 results). The faulted and cancelled tasks should also be ignored, for the same reason.

This is some clunky code which I think achieves your requirements; it may be a starting point.
It may also be a bad way of handling tasks, not thread-safe, or just a terrible idea. But I expect if so someone will point that out.
async Task<TResult[]> WhenSome<TResult>(int atLeast, List<Task<TResult>> tasks)
{
    var completedTasks = new List<Task<TResult>>();
    int completed = 0;
    var exceptions = new List<Exception>();
    while (completed < atLeast && tasks.Any())
    {
        // Wait for the next task to finish, then remove it from the pending set.
        var completedTask = await Task.WhenAny(tasks);
        tasks.Remove(completedTask);
        if (completedTask.IsCanceled)
        {
            continue;
        }
        if (completedTask.IsFaulted)
        {
            exceptions.Add(completedTask.Exception);
            continue;
        }
        completed++;
        completedTasks.Add(completedTask);
    }
    if (completed >= atLeast)
    {
        return completedTasks.Select(t => t.Result).ToArray();
    }
    throw new AggregateException(exceptions).Flatten();
}

I am adding one more solution to this problem, not because stuartd's answer is not sufficient, but just for the sake of variety. This implementation uses the Unwrap technique in order to return a Task that contains all the exceptions, in exactly the same way that the built-in Task.WhenAll method propagates all the exceptions.
public static Task<TResult[]> WhenSome<TResult>(int atLeast, params Task<TResult>[] tasks)
{
    if (tasks == null) throw new ArgumentNullException(nameof(tasks));
    if (atLeast < 1 || atLeast > tasks.Length)
        throw new ArgumentOutOfRangeException(nameof(atLeast));
    var cts = new CancellationTokenSource();
    int successfulCount = 0;
    var continuationAction = new Action<Task<TResult>>(task =>
    {
        if (task.IsCompletedSuccessfully)
            if (Interlocked.Increment(ref successfulCount) == atLeast) cts.Cancel();
    });
    var continuations = tasks.Select(task => task.ContinueWith(continuationAction,
        cts.Token, TaskContinuationOptions.ExecuteSynchronously, TaskScheduler.Default));
    return Task.WhenAll(continuations).ContinueWith(_ =>
    {
        cts.Dispose();
        if (successfulCount >= atLeast) // Success
            return Task.WhenAll(tasks.Where(task => task.IsCompletedSuccessfully));
        else
            return Task.WhenAll(tasks); // Failure
    }, TaskScheduler.Default).Unwrap();
}
The continuations do not propagate the results or the exceptions of the tasks. These are cancelable continuations, and they are canceled en masse when the specified number of successful tasks has been reached.
Note: This implementation might propagate more than atLeast results. If you want exactly that number of results, you can chain a .Take(atLeast) after the .Where LINQ operator, as in the sketch below.
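For instance, the success branch could become (a hypothetical tweak to the code above, not part of the original):
if (successfulCount >= atLeast) // Success
    return Task.WhenAll(tasks
        .Where(task => task.IsCompletedSuccessfully)
        .Take(atLeast)); // cap the results at exactly `atLeast` items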

Related

Task.WhenAll on List<Task> behaving differently than Task.WhenAll on IEnumerable<Task>

I'm seeing some odd behavioral differences between calling Task.WhenAll(IEnumerable<Task<T>>) and calling Task.WhenAll(List<Task<T>>) while trying to catch exceptions.
My code is as follows:
public async Task Run()
{
    var en = GetResources(new[] { "a", "b", "c", "d" });
    await foreach (var item in en)
    {
        var res = item.Select(x => x.Id).ToArray();
        System.Console.WriteLine(string.Join("-> ", res));
    }
}

private async IAsyncEnumerable<IEnumerable<ResponseObj>> GetResources(
    IEnumerable<string> identifiers)
{
    // MoreLinq extension method -- batches IEnumerable<T>
    // into IEnumerable<IEnumerable<T>>
    IEnumerable<IEnumerable<string>> groupedIds = identifiers.Batch(2);
    foreach (var batch in groupedIds)
    {
        // GetHttpResourceAsync is simply a wrapper around HttpClient which
        // makes an Http request to an API endpoint with the given parameter
        var tasks = batch.Select(id => ac.GetHttpResourceAsync(id)).ToList();
        // if I remove this ToList(), the behavior changes
        var stats = tasks.Select(t => t.Status);
        // at this point the status being WaitingForActivation is reasonable
        // since I have not awaited yet
        IEnumerable<ResponseObj> res = null;
        var taskGroup = Task.WhenAll(tasks);
        try
        {
            res = await taskGroup;
            var awaitedStats = tasks.Select(t => t.Status);
            // this is the part that changes:
            // if I have .ToList(), the statuses are RanToCompletion or Faulted
            // if I don't have .ToList(), the statuses are always WaitingForActivation
        }
        catch (Exception ex)
        {
            var exceptions = taskGroup.Exception.InnerException;
            DoSomethingWithExceptions(exceptions);
            res = tasks.Where(g => !g.IsFaulted).Select(t => t.Result);
            // throws an exception because all tasks are WaitingForActivation
        }
        yield return res;
    }
}
Ultimately, I have an IEnumerable of identifiers, I'm batching that into batches of 2 (hard coded in this example), and then running Task.WhenAll to run each batch of 2 at the same time.
What I want is if 1 of the 2 GetResource tasks fails, to still return the successful result of the other, and handle the exception (say, write it to a log).
If I run Task.WhenAll on a list of tasks, this works exactly how I want. However, if I remove the .ToList(), when I attempt to find my faulted tasks in the catch block after the await taskGroup, I run into problems because the statuses of my tasks are still WaitingForActivation although I believe they have been awaited.
When there is no exception thrown, the List and IEnumerable act the same way. This only starts causing issues when I try to catch exceptions.
What is the reasoning behind this behavior? The Task.WhenAll must have completed since I get into the catch block, however why are the statuses still WaitingForActivation? Have I failed to grasp something fundamental here?
Unless you make the list concrete (by using ToList()), each time you enumerate over the sequence you are calling GetHttpResourceAsync again and creating new tasks. This is due to deferred execution.
I would definitely keep the ToList() call when working with a list of tasks.
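Here is a minimal sketch of the trap (illustrative code, not the asker's): the Select projection runs again on every enumeration, so each enumeration creates a fresh set of tasks.
int created = 0;
IEnumerable<Task<int>> tasks = Enumerable.Range(0, 3)
    .Select(i =>
    {
        created++;                // runs on EVERY enumeration
        return Task.FromResult(i);
    });

await Task.WhenAll(tasks);        // first enumeration: creates 3 tasks
await Task.WhenAll(tasks);        // second enumeration: creates 3 MORE tasks
Console.WriteLine(created);       // prints 6
// With a .ToList() after the Select, both awaits would see the SAME 3 tasks.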

Multiple Async Calls with Pause Between Calls

I have an IEnumerable<Task>, where each Task will call the same endpoint. However, the endpoint can only handle so many calls per second. How can I put, say, a half second delay between each call?
I have tried adding Task.Delay(), but of course awaiting them simply means that the app waits a half second before sending all the calls at once.
Here is a code snippet:
var resultTasks = orders
    .Select(async order =>
    {
        var result = new VendorTaskResult();
        try
        {
            result.Response = await order.CallVendorAsync();
        }
        catch (Exception ex)
        {
            result.Exception = ex;
        }
        return result;
    });
var results = await Task.WhenAll(resultTasks);
I feel like I should do something like
Task.WhenAll(resultTasks.EmitOverTime(500));
... but how exactly do I do that?
What you describe in your question is, in other words, rate limiting. You'd like to apply a rate limiting policy to your client, because the API you use enforces such a policy on the server to protect itself from abuse.
While you could implement rate limiting yourself, I'd recommend going with some well-established solution. Rate Limiter from David Desmaisons was the one that I picked at random and I instantly liked it. It has solid documentation, superior coverage, and is easy to use. It is also available as a NuGet package.
Check out the simple snippet below that demonstrates running semi-overlapping tasks in sequence while deferring each task's start until half a second after the immediately preceding task started. Each task lasts at least 750 ms.
using ComposableAsync;
using RateLimiter;
using System;
using System.Threading.Tasks;

namespace RateLimiterTest
{
    class Program
    {
        static void Main(string[] args)
        {
            Log("Starting tasks ...");
            var constraint = TimeLimiter.GetFromMaxCountByInterval(1, TimeSpan.FromSeconds(0.5));
            var tasks = new[]
            {
                DoWorkAsync("Task1", constraint),
                DoWorkAsync("Task2", constraint),
                DoWorkAsync("Task3", constraint),
                DoWorkAsync("Task4", constraint)
            };
            Task.WaitAll(tasks);
            Log("All tasks finished.");
            Console.ReadLine();
        }

        static void Log(string message)
        {
            Console.WriteLine(DateTime.Now.ToString("HH:mm:ss.fff ") + message);
        }

        static async Task DoWorkAsync(string name, IDispatcher constraint)
        {
            await constraint;
            Log(name + " started");
            await Task.Delay(750);
            Log(name + " finished");
        }
    }
}
Sample output:
10:03:27.121 Starting tasks ...
10:03:27.154 Task1 started
10:03:27.658 Task2 started
10:03:27.911 Task1 finished
10:03:28.160 Task3 started
10:03:28.410 Task2 finished
10:03:28.680 Task4 started
10:03:28.913 Task3 finished
10:03:29.443 Task4 finished
10:03:29.443 All tasks finished.
If you change the constraint to allow maximum two tasks per second (var constraint = TimeLimiter.GetFromMaxCountByInterval(2, TimeSpan.FromSeconds(1));), which is not the same as one per half a second, then the output could be like:
10:06:03.237 Starting tasks ...
10:06:03.264 Task1 started
10:06:03.268 Task2 started
10:06:04.026 Task2 finished
10:06:04.031 Task1 finished
10:06:04.275 Task3 started
10:06:04.276 Task4 started
10:06:05.032 Task4 finished
10:06:05.032 Task3 finished
10:06:05.033 All tasks finished.
Note that the current version of Rate Limiter targets .NET Framework 4.7.2+ or .NET Standard 2.0+.
This is just a thought, but another approach could be to create a queue, with another thread (or task) that polls the queue for calls that need to go out to your endpoint.
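A rough sketch of that idea, using a System.Threading.Channels channel as the queue and a single background consumer in place of a dedicated thread (Order and CallVendorAsync are illustrative names borrowed from the question's context):
var queue = Channel.CreateUnbounded<Order>();

// Producer: enqueue every call that needs to go out.
foreach (var order in orders)
    queue.Writer.TryWrite(order);
queue.Writer.Complete();

// Consumer: drain the queue, spacing the calls half a second apart.
await Task.Run(async () =>
{
    await foreach (var order in queue.Reader.ReadAllAsync())
    {
        await CallVendorAsync(order); // hypothetical per-order call
        await Task.Delay(500);        // enforce the spacing between calls
    }
});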
Have you considered just turning that into a foreach-loop with a Task.Delay call? You seem to want to explicitly call them sequentially, it won't hurt if that is obvious from your code.
var results = new List<YourResultType>();
foreach (var order in orders)
{
    var result = new VendorTaskResult();
    try
    {
        result.Response = await order.CallVendorAsync();
        results.Add(result.Response);
    }
    catch (Exception ex)
    {
        result.Exception = ex;
    }
    await Task.Delay(500); // the explicit pause between calls
}
Instead of selecting from orders you could loop over them, put each result into a list inside the loop, and then call Task.WhenAll.
Would look something like:
var resultTasks = new List<Task<VendorTaskResult>>(orders.Count);
foreach (var order in orders)
{
    resultTasks.Add(Task.Run(async () =>
    {
        var result = new VendorTaskResult();
        try
        {
            result.Response = await order.CallVendorAsync();
        }
        catch (Exception ex)
        {
            result.Exception = ex;
        }
        return result;
    }));
    // await Task.Delay instead of Thread.Sleep, so the loop does not block a thread.
    await Task.Delay(x);
}
var results = await Task.WhenAll(resultTasks);
If you want to control the number of requests executed simultaneously, you have to use a semaphore, as in the sketch below.
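A minimal sketch of that approach with SemaphoreSlim (maxConcurrency and CallVendorAsync are illustrative, not from the question):
var semaphore = new SemaphoreSlim(maxConcurrency);
var throttled = orders.Select(async order =>
{
    await semaphore.WaitAsync(); // parks here (asynchronously) once the limit is reached
    try
    {
        return await CallVendorAsync(order);
    }
    finally
    {
        semaphore.Release(); // let the next pending call proceed
    }
});
var results = await Task.WhenAll(throttled);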
I have something very similar, and it works fine for me. Please note that I call ToArray() after the LINQ query, which triggers the tasks:
using (HttpClient client = new HttpClient()) {
    IEnumerable<Task<string>> _downloads = _group
        .Select(async job => {
            await Task.Delay(300);
            return await client.GetStringAsync(<url with variable job>);
        });
    Task<string>[] _downloadTasks = _downloads.ToArray();
    _pages = await Task.WhenAll(_downloadTasks);
}
Now please note that this will create n tasks, all in parallel, and the Task.Delay achieves next to nothing. If you want to call the pages sequentially (as it sounds from putting a delay between the calls), then this code may be better:
using (HttpClient client = new HttpClient()) {
    foreach (string job in _group) {
        await Task.Delay(300);
        _pages.Add(await client.GetStringAsync(<url with variable job>));
    }
}
The download of the pages is still asynchronous (the thread is free while a page is downloading), but the calls are issued one at a time, ensuring each download finishes before the next one starts.
The code can easily be changed to download the pages asynchronously in chunks - for example 20 pages at a time with a pause between chunks, as in this sample:
IEnumerable<string[]> toParse = myData
    .Select((v, i) => new { v.code, group = i / 20 })
    .GroupBy(x => x.group)
    .Select(g => g.Select(x => x.code).ToArray());

using (HttpClient client = new HttpClient()) {
    foreach (string[] _group in toParse) {
        string[] _pages = null;
        IEnumerable<Task<string>> _downloads = _group
            .Select(job => {
                return client.GetStringAsync(<url with job>);
            });
        Task<string>[] _downloadTasks = _downloads.ToArray();
        _pages = await Task.WhenAll(_downloadTasks);
        await Task.Delay(5000);
    }
}
All this does is group your pages in chunks of 20, iterate through the chunks, download all pages of the chunk asynchronously, wait 5 seconds, move on to the next chunk.
I hope that is what you were waiting for :)
The proposed method EmitOverTime is doable, but only by blocking the current thread:
public static IEnumerable<Task<TResult>> EmitOverTime<TResult>(
    this IEnumerable<Task<TResult>> tasks, int delay)
{
    foreach (var item in tasks)
    {
        Thread.Sleep(delay); // Delay by blocking
        yield return item;
    }
}
Usage:
var results = await Task.WhenAll(resultTasks.EmitOverTime(500));
Probably better is to create a variant of Task.WhenAll that accepts a delay argument, and delays asynchronously:
public static async Task<TResult[]> WhenAllWithDelay<TResult>(
    IEnumerable<Task<TResult>> tasks, int delay)
{
    var tasksList = new List<Task<TResult>>();
    foreach (var task in tasks)
    {
        await Task.Delay(delay).ConfigureAwait(false);
        tasksList.Add(task);
    }
    return await Task.WhenAll(tasksList).ConfigureAwait(false);
}
Usage:
var results = await WhenAllWithDelay(resultTasks, 500);
This design implies that the enumerable of tasks should be enumerated only once. It is easy to forget this during development, and start enumerating it again, spawning a new set of tasks. For this reason I propose to make it an OnlyOnce enumerable, as it is shown in this question.
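A rough sketch of such a guard (illustrative only; the linked question shows a complete implementation):
public sealed class OnlyOnceEnumerable<T> : IEnumerable<T>
{
    private IEnumerable<T> _source;

    public OnlyOnceEnumerable(IEnumerable<T> source) => _source = source;

    public IEnumerator<T> GetEnumerator()
    {
        // Atomically take the source; a second enumeration finds null and throws.
        var source = Interlocked.Exchange(ref _source, null)
            ?? throw new InvalidOperationException("This sequence can be enumerated only once.");
        return source.GetEnumerator();
    }

    System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
        => GetEnumerator();
}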
Update: I should mention why the above methods work, and under what premise. The premise is that the supplied IEnumerable<Task<TResult>> is deferred, in other words non-materialized. At the method's start there are no tasks created yet. The tasks are created one after the other during the enumeration of the enumerable, and the trick is that the enumeration is slow and controlled. The delay inside the loop ensures that the tasks are not created all at once. They are created hot (in other words already started), so at the time the last task has been created some of the first tasks may have already been completed. The materialized list of half-running/half-completed tasks is then passed to Task.WhenAll, that waits for all to complete asynchronously.
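To make that premise concrete, a deferred enumerable that satisfies it might look like this (illustrative; client and urls are assumed to exist in the caller's scope):
// No ToList()/ToArray() here: each GetStringAsync call is made only when
// WhenAllWithDelay's loop reaches that element of the sequence.
IEnumerable<Task<string>> deferredTasks = urls
    .Select(url => client.GetStringAsync(url));

var pages = await WhenAllWithDelay(deferredTasks, 500);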

Is there a proper pattern for multiple ContinueWith methods

In the docs for TPL I found this line:
Invoke multiple continuations from the same antecedent
But this isn't explained any further. I naively assumed you could chain ContinueWiths in a pattern-matching-like manner until you hit the right TaskContinuationOptions.
TaskThatReturnsString()
    .ContinueWith((s) => Console.Out.WriteLine(s.Result), TaskContinuationOptions.OnlyOnRanToCompletion)
    .ContinueWith((f) => Console.Out.WriteLine(f.Exception.Message), TaskContinuationOptions.OnlyOnFaulted)
    .ContinueWith((f) => Console.Out.WriteLine("Cancelled"), TaskContinuationOptions.OnlyOnCanceled)
    .Wait();
But this doesn't work as I hoped, for at least two reasons.
The continuations are chained, so the 2nd ContinueWith gets the result of the 1st - and that result is a new Task, namely the ContinueWith task itself, not the original antecedent. I realize that the string could be passed onwards, but wouldn't that be a new task with the other info lost?
Since the first condition is not met, that continuation is just cancelled, meaning the conditions further down the chain will never be met and the exceptions are lost.
So what do they mean in the docs when they say multiple continuations from the same antecedent?
Is there a proper pattern for this, or do we just have to wrap the calls in try/catch blocks?
EDIT
So I guess this was what I was hoping I could do, note this is a simplified example.
public void ProcessAllTheThings()
{
    var theThings = util.GetAllTheThings();
    var tasks = new List<Task>();
    foreach (var thing in theThings)
    {
        var task = util.Process(thing)
            .ContinueWith((t) => Console.Out.WriteLine($"Finished processing {thing.ThingId} with result {t.Result}"), TaskContinuationOptions.OnlyOnRanToCompletion)
            .ContinueWith((t) => Console.Out.WriteLine($"Error on processing {thing.ThingId} with error {t.Exception.Message}"), TaskContinuationOptions.OnlyOnFaulted);
        tasks.Add(task);
    }
    Task.WaitAll(tasks.ToArray());
}
Since this wasn't possible, I was thinking I would have to wrap each task call in a try/catch inside the loop, so an error wouldn't stop the whole process, without waiting on the task there. I wasn't sure of the correct way.
Sometimes a solution is just staring you in the face; this would work, wouldn't it?
public void ProcessAllTheThings()
{
    var theThings = util.GetAllTheThings();
    var tasks = new List<Task>();
    foreach (var thing in theThings)
    {
        var task = util.Process(thing)
            .ContinueWith((t) =>
            {
                if (t.Status == TaskStatus.RanToCompletion)
                {
                    Console.Out.WriteLine($"Finished processing {thing.ThingId} with result {t.Result}");
                }
                else
                {
                    Console.Out.WriteLine($"Error on processing {thing.ThingId} - {t.Exception.Message}");
                }
            });
        tasks.Add(task);
    }
    Task.WaitAll(tasks.ToArray());
}
What you did was create a sequential chain of multiple tasks.
What you need to do is attach all your continuation tasks to the first one:
var firstTask = TaskThatReturnsString();
var t1 = firstTask.ContinueWith (…);
var t2 = firstTask.ContinueWith (…);
var t3 = firstTask.ContinueWith (…);
Then you need to wait for all the continuation tasks:
Task.WaitAll (t1, t2, t3);
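Put together with the example from the question, a fuller sketch might look like this (the continuations whose conditions are not met end up cancelled, so the wait has to tolerate that):
var firstTask = TaskThatReturnsString();
var t1 = firstTask.ContinueWith(t => Console.Out.WriteLine(t.Result),
    TaskContinuationOptions.OnlyOnRanToCompletion);
var t2 = firstTask.ContinueWith(t => Console.Out.WriteLine(t.Exception.Message),
    TaskContinuationOptions.OnlyOnFaulted);
var t3 = firstTask.ContinueWith(t => Console.Out.WriteLine("Cancelled"),
    TaskContinuationOptions.OnlyOnCanceled);
try
{
    Task.WaitAll(t1, t2, t3);
}
catch (AggregateException)
{
    // The two continuations whose conditions were not met surface here
    // as TaskCanceledExceptions; the one that matched has already run.
}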

Gracefully handling exceptions when dealing with WhenAll

I'm playing with a piece of code I wrote a while back. That piece of code deals with making a few requests in an async manner.
var client = new HttpClient();
var searchPromises = searchTerms
.Select(GetSearchUrl)
.Select(client.GetStringAsync);
var searchPages = await Task.WhenAll(searchPromises);
What happens is I create a new HttpClient. Using some search terms I compose search engine urls. Then I use those urls as inputs to get tasks representing the async requests for a page with the results. And last, I await those responses using Task.WhenAll to group them together.
The problem is if just one of those requests gets a 404, a 500 or anything like that my code throws an AggregateException.
Is there a way of specifying what should happen in the case of an error in one of those threads, so that I get a result from everything else?
I've looked at ContinueWith, but it doesn't seem to fit the bill, that is, it doesn't know how to deal with all the errors, just the aggregate one.
IMO, the easiest solution is to change how you think about the problem. Right now, you're thinking "perform a download on each url" and then "wait for them all to complete and handle errors on a per-item basis". Just change your operation ("download") to include anything you want to do per-item. In other words, what you want to do is "perform a download on each url and handle errors" and then "wait for them all to complete":
var client = new HttpClient();
var searchPromises = searchTerms
    .Select(GetSearchUrl)
    .Select(url => DownloadAsync(client, url));
var searchPages = await Task.WhenAll(searchPromises);
var successfulSearchPages = searchPages.Where(x => x != null);
...

private static async Task<string> DownloadAsync(HttpClient client, string url)
{
    try
    {
        return await client.GetStringAsync(url);
    }
    catch (HttpRequestException ex)
    {
        // TODO: Perform appropriate error handling
        return null;
    }
}
Task.WhenAll will return a task that is completed when all the tasks passed as argument are completed.
If any of the tasks passed as argument ends in a Faulted state (an exception was thrown), the returned task will also end in a Faulted state and its Exception property will contain the aggregation of all exceptions thrown by the tasks passed as argument.
Because the code generated by the compiler rethrows only the first exception in that list, awaiting the returned task surfaces the exception of the first faulted task in the argument order (not necessarily the first exception that was thrown chronologically).
But the tasks passed as argument still exist and can still be queried for result.
This code snippet shows this working:
var tasks = new Task[] {
    ((Func<Task>)(async () =>
    {
        await Task.Delay(10);
        await Task.Delay(10);
        await Task.Delay(10);
        throw new Exception("First");
    }))(),
    ((Func<Task>)(async () =>
    {
        await Task.Delay(10);
        throw new Exception("Second");
    }))(),
    ((Func<Task>)(async () =>
    {
        await Task.Delay(10);
    }))()
};

var allTasks = Task.WhenAll(tasks);
try
{
    await allTasks;
}
catch (Exception ex)
{
    Console.WriteLine("Overall failed: {0}", ex.Message);
}

for (var i = 0; i < tasks.Length; i++)
{
    try
    {
        await tasks[i];
        Console.WriteLine("Task {0} succeeded!", i);
    }
    catch (Exception ex)
    {
        Console.WriteLine("Task {0} failed!", i);
    }
}

/*
Overall failed: First
Task 0 failed!
Task 1 failed!
Task 2 succeeded!
*/
You can create your own version of Task.WhenAll that returns just the results disregarding any exception using Task.WhenAny:
public static async Task<IEnumerable<TResult>> WhenAllSwallowExceptions<TResult>(IEnumerable<Task<TResult>> tasks)
{
    var tasklist = tasks.ToList();
    var results = new List<TResult>();
    while (tasklist.Any())
    {
        var completedTask = await Task.WhenAny(tasklist);
        try
        {
            results.Add(await completedTask);
        }
        catch (Exception e)
        {
            // handle
        }
        tasklist.Remove(completedTask);
    }
    return results;
}
Usage:
var searchPages = await WhenAllSwallowExceptions(searchPromises);
This waits for tasks one at a time (with Task.WhenAny) and aggregates all the results (if there are any).
I've found a way to do this, after many iterations. Tasks are starting to look like things that you need a library to abstract.
Anyway, here's the code:
var client = new HttpClient();
var exceptions = new ConcurrentBag<Exception>();
var searchPromises = searchTerms
    .Select(GetSearchUrl)
    .Select(client.GetStringAsync)
    .Select(t => t.Catch(e => exceptions.Add(e)));
var searchPages = (await Task.WhenAll(searchPromises))
    .Where(r => r != null);
And the implementation for Catch:
public static Task<TResult> Catch<TResult>(this Task<TResult> self, Action<Exception> exceptionHandlerTask)
{
    return self.ContinueWith(s =>
    {
        if (!s.IsFaulted)
        {
            return s.Result;
        }
        exceptionHandlerTask(s.Exception);
        return default(TResult);
    },
    CancellationToken.None,
    TaskContinuationOptions.ExecuteSynchronously |
    TaskContinuationOptions.DenyChildAttach,
    TaskScheduler.Default);
}
What happens now is that it gives you a way to append a failure-state function to the Task<T> promise. This allows me to still have chainability. It is a shame that C# doesn't have robust support for functional pattern matching to make this easier.
Edit: added minimal code for error logging.
Edit: separated the code for logging errors to be more generic/reusable.
Edit: separated the code for saving the errors from the Catch function.

Automatic repeat of one task until another is finished (TAP)

I have two operations - a long-running OperationA and a much quicker OperationB. I was running them in parallel using TAP and returning results as they both finish:
var taskA = Task.Factory.StartNew(()=>OperationA());
var taskB = Task.Factory.StartNew(()=>OperationB());
var tasks = new Task[] { taskA, taskB };
Task.WaitAll(tasks);
// processing taskA.Result, taskB.Result
No magic here. Now what I want to do is repeat OperationB indefinitely whenever it finishes, as long as OperationA is still running. So the whole procedure finishes when OperationA is finished and the last pass of OperationB is finished. I'm looking for an effective pattern for doing that, ideally one that doesn't involve polling OperationA's Status in a while loop. Looking toward improving the WaitAllOneByOne pattern proposed in this Pluralsight course, or something similar.
Try this:
// Get cancellation support.
CancellationTokenSource source = new CancellationTokenSource();
CancellationToken token = source.Token;

// Start off A and set a continuation to cancel B when A has finished.
var taskA = Task.Factory.StartNew(() => OperationA());
Task contA = taskA.ContinueWith(ant => source.Cancel());

// Set off B and in the method perform your loop. Cancellation will be thrown when
// A has completed.
var taskB = Task.Factory.StartNew(() => OperationB(token), token);
Task contB = taskB.ContinueWith(ant =>
{
    switch (ant.Status)
    {
        // Handle any exceptions to prevent UnobservedTaskException.
        case TaskStatus.RanToCompletion:
            // Do stuff.
            break;
        case TaskStatus.Canceled:
            // You know TaskA is finished.
            break;
        case TaskStatus.Faulted:
            // Something bad.
            break;
    }
});
Then in the OperationB method you can perform your loop and include a cancellation check; cancellation is signalled upon TaskA's completion:
private void OperationB(CancellationToken token)
{
    foreach (var v in collection) // iterate your work items
    {
        ...
        token.ThrowIfCancellationRequested(); // This must be handled; it surfaces as an AggregateException.
    }
}
Note: instead of complicating things with a cancellation, you can just set a bool from within the continuation of TaskA and check for it in TaskB's loop - this will avoid any faffing about with cancellations. A sketch of that variant follows.
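For instance, a minimal sketch of that bool variant (volatile keeps the cross-thread write visible; names are illustrative):
private volatile bool _taskAFinished;

// In the calling code:
var taskA = Task.Factory.StartNew(() => OperationA())
    .ContinueWith(ant => _taskAFinished = true);

var taskB = Task.Factory.StartNew(() =>
{
    // Repeat B until A's continuation flips the flag,
    // always letting the current pass finish.
    do
    {
        OperationB();
    }
    while (!_taskAFinished);
});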
I hope this helps
Took your approach as a basis and adapted it a bit:
var source = new CancellationTokenSource();
var token = source.Token;
var taskA = Task.Factory.StartNew(
    () => OperationA()
);
var taskAFinished = taskA.ContinueWith(antecedent =>
{
    source.Cancel();
    return antecedent.Result;
});
var taskB = Task.Factory.StartNew(
    () => OperationB(token), token
);
var taskBFinished = taskB.ContinueWith(antecedent =>
{
    switch (antecedent.Status)
    {
        case TaskStatus.RanToCompletion:
        case TaskStatus.Canceled:
            try
            {
                return antecedent.Result;
            }
            catch (AggregateException ae)
            {
                // Operation was canceled before start if OperationA is short
                return null;
            }
        case TaskStatus.Faulted:
            return null;
    }
    return null;
});
Made two continuations that return results for the respective operations, so I could wait for both of them to be finished (tried to do that only with the second one and it didn't work).
var tasks = new Task[] { taskAFinished, taskBFinished };
Task.WaitAll(tasks);
The first one just passes the antecedent task's Result onward; the second takes the aggregate results available at that point in OperationB (both the RanToCompletion and Canceled statuses are considered a correct end of the process). OperationB now looks like this:
public static List<Result> OperationB(CancellationToken token)
{
    var resultsList = new List<Result>();
    while (true)
    {
        foreach (var op in operations)
        {
            resultsList.Add(op.GetResult());
        }
        if (token.IsCancellationRequested)
        {
            return resultsList;
        }
    }
}
This changes the logic a bit - all loops inside OperationB are now considered a single task, but this is easier than keeping them atomic and writing some sort of coordination primitive that gathers results from each run. In case I don't really care which loop produced which results, this seems to be a decent solution. It may be improved into a more flexible implementation later if needed (what I was actually looking for is to chain multiple operations recursively - OperationB itself may have smaller repeating OperationC's inside with the same behavior, OperationC multiple OperationD's that run while C is active, etc.).
Edit: Added exception handling in taskBFinished for the case when OperationA is quick and cancellation is issued before OperationB has even started.
