How to wait for all Timer Tasks to be done? - c#

I've got multiple System.Threading.Timer instances that start in parallel. At the end I have a Task.WaitAll to wait until all tasks are done, but it doesn't wait for all of them. How can I make it wait for all?
private List<Task> todayTasks = new List<Task>();

foreach (var item in todayReport)
{
    todayTasks.Add(SetupTimer(item.Exec_Time, item.Report_Id));
}
Task.WaitAll(todayTasks.ToArray());
--SetupTimer--
private Task SetupTimer(DateTime alertTime, int id)
{
    DateTime current = DateTime.Now;
    TimeSpan timeToGo = alertTime.TimeOfDay - current.TimeOfDay;
    if (timeToGo < TimeSpan.Zero)
    {
        //TODO: ERROR time already passed
    }
    ExecCustomReportService executeCustom = new ExecCustomReportService();
    return Task.Run(
        () => new Timer(
            x => executeCustom.AdhockReport(id), null, timeToGo, Timeout.InfiniteTimeSpan
        )
    );
}

You're really better off using a tool that is suited to the job. I'd suggest Microsoft's Reactive Framework (Rx). Then you can do this:
var query =
    from item in todayReport.ToObservable()
    from report in Observable.Start(() => executeCustom.AdhockReport(item.Report_Id))
    select report;

IDisposable subscription =
    query
        .Subscribe(
            report =>
            {
                /* Do something with each report */
            },
            () =>
            {
                /* Do something when finished */
            });
You just need to NuGet "System.Reactive".
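If you still need an actual Task to wait on or await, Rx can hand you one. A small sketch, reusing the query above (ToTask is an extension method in System.Reactive.Threading.Tasks):

using System.Reactive.Threading.Tasks;

// Collects every report into a list and completes when the sequence completes.
Task allReports = query.ToList().ToTask();
allReports.Wait(); // or: await allReports;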

As @YacoubMassad stated in the comments, your task merely creates the timer and then returns, so Task.WaitAll has nothing to wait for.
What you could do is get rid of the timer and use Task.Delay:
return Task.Delay(timeToGo).ContinueWith(t => executeCustom.AdhockReport(id));
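Putting it together, a minimal sketch of SetupTimer rewritten around Task.Delay (same types as in the question; how to handle a time that has already passed is left as a decision point):

private Task SetupTimer(DateTime alertTime, int id)
{
    TimeSpan timeToGo = alertTime.TimeOfDay - DateTime.Now.TimeOfDay;
    if (timeToGo < TimeSpan.Zero)
        return Task.CompletedTask; // time already passed; log or reschedule as needed

    var executeCustom = new ExecCustomReportService();
    // The returned task completes only after AdhockReport has run,
    // so Task.WaitAll now really waits for the work.
    return Task.Delay(timeToGo).ContinueWith(t => executeCustom.AdhockReport(id));
}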

Related

How to delay 'hot' tasks so they can be processed in a set order

Say I have a set of tasks:
var task1 = DoThisAsync(...);
var task2 = DoThatAsync(...);
var task3 = DoOtherAsync(...);
var taskN...
I am looking for a way to process a set of tasks in order (determined by their place in the containing collection, say), but to have each task run/start only when it's its turn and not before, and to have all of that wrapped up in its own task.
Problem constraints / details are:
These tasks need to be performed in a certain order, i.e. task1, task2, ...
The previous task must complete asynchronously before the next can start
The number of tasks is variable
A different set of tasks may be nominated each time code is run
The order and number of tasks is known in advance.
The main problem is that as soon as I call the relevant method (like DoThis()...) to return each task, that task is already 'hot' or running, violating (2) above.
I have tried working with .ContinueWith(..), but if I call each of the tasks like above to set the continuations or add them to a list or collection, they've already started.
Not sure if Lazy<T> might help, but I can't see how at present?
Hope this makes sense as I'm fairly new to async / await / tasks.
Many thanks in advance.
Calling a method runs code. If you want an object that will call this method later, then use a delegate.
In this case, you could use Func<Task>, which is an asynchronous delegate. A list of these should suffice:
// Build the list of operations in order.
var operations = new List<Func<Task>>();
operations.Add(() => DoThisAsync(...));
operations.Add(() => DoThatAsync(...));
operations.Add(() => DoOtherAsync(...));

// Execute them all one at a time.
foreach (var operation in operations)
    await operation();
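The question also asked to have all of that wrapped up in its own task; a small sketch of that (the helper name RunInOrderAsync is mine, not from the answer):

// Runs each operation only after the previous one has completed,
// and represents the whole sequence as a single Task.
static async Task RunInOrderAsync(IEnumerable<Func<Task>> operations)
{
    foreach (var operation in operations)
        await operation();
}

// Usage: Task whole = RunInOrderAsync(operations);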
You can simply create tasks with the Task constructor and then trigger execution with the .Start() method.
Here is an example:
var taskList = InitQueue();
foreach (var t in taskList.OrderBy(i => i.Order))
{
    //Here I can schedule an existing task
    t.TaskToRun.Start();
    t.TaskToRun.Wait();
    Console.WriteLine($"Task {t.Order} has finished its job");
}

public class TaskQueue : List<TaskItem>
{
}

public class TaskItem
{
    public int Order { get; set; }
    public Task TaskToRun { get; set; }
}
private static TaskQueue InitQueue()
{
    var queue = new TaskQueue();
    queue.Add(new TaskItem
    {
        Order = 1,
        TaskToRun = new Task(() =>
        {
            Thread.Sleep(500); // note: an unawaited Task.Delay(500) would not pause here
            Console.WriteLine("Hello from task 1");
        })
    });
    queue.Add(new TaskItem
    {
        Order = 4,
        TaskToRun = new Task(() => Console.WriteLine("Hello from task 4"))
    });
    queue.Add(new TaskItem
    {
        Order = 3,
        TaskToRun = new Task(() =>
        {
            Thread.Sleep(5000);
            Console.WriteLine("Hello from task 3");
        })
    });
    queue.Add(new TaskItem
    {
        Order = 2,
        TaskToRun = new Task(() => Console.WriteLine("Hello from task 2"))
    });
    return queue;
}

How to limit the Maximum number of parallel tasks in c#

I have a collection of 1000 input messages to process. I'm looping over the input collection and starting a new task for each message.
//Assume this messages collection contains 1000 items
var messages = new List<string>();

foreach (var msg in messages)
{
    Task.Factory.StartNew(() =>
    {
        Process(msg);
    });
}
Can we guess the maximum number of messages that get processed simultaneously (assuming a normal quad-core processor), or can we limit the maximum number of messages processed at a time?
How can I ensure the messages get processed in the same sequence/order as the collection?
You could use Parallel.ForEach and rely on MaxDegreeOfParallelism instead.
Parallel.ForEach(messages, new ParallelOptions { MaxDegreeOfParallelism = 10 },
    msg =>
    {
        // logic
        Process(msg);
    });
SemaphoreSlim is a very good solution in this case and I highly recommend the OP to try this, but @Manoj's answer has a flaw, as mentioned in the comments. The semaphore should be waited on before spawning the task, like this.
Updated answer: As @Vasyl pointed out, the semaphore may be disposed before completion of the tasks and will raise an exception when the Release() method is called, so before exiting the using block you must wait for the completion of all created tasks.
int maxConcurrency = 10;
var messages = new List<string>();
using (SemaphoreSlim concurrencySemaphore = new SemaphoreSlim(maxConcurrency))
{
    List<Task> tasks = new List<Task>();
    foreach (var msg in messages)
    {
        concurrencySemaphore.Wait();
        var t = Task.Factory.StartNew(() =>
        {
            try
            {
                Process(msg);
            }
            finally
            {
                concurrencySemaphore.Release();
            }
        });
        tasks.Add(t);
    }
    Task.WaitAll(tasks.ToArray());
}
Answer to comments
For those who want to see how the semaphore can be disposed without Task.WaitAll: run the code below in a console app and this exception will be raised.
System.ObjectDisposedException: 'The semaphore has been disposed.'
static void Main(string[] args)
{
    int maxConcurrency = 5;
    List<string> messages = Enumerable.Range(1, 15).Select(e => e.ToString()).ToList();
    using (SemaphoreSlim concurrencySemaphore = new SemaphoreSlim(maxConcurrency))
    {
        List<Task> tasks = new List<Task>();
        foreach (var msg in messages)
        {
            concurrencySemaphore.Wait();
            var t = Task.Factory.StartNew(() =>
            {
                try
                {
                    Process(msg);
                }
                finally
                {
                    concurrencySemaphore.Release();
                }
            });
            tasks.Add(t);
        }
        // Task.WaitAll(tasks.ToArray());
    }
    Console.WriteLine("Exited using block");
    Console.ReadKey();
}

private static void Process(string msg)
{
    Thread.Sleep(2000);
    Console.WriteLine(msg);
}
I think it would be better to use Parallel.ForEach with ParallelOptions:

Parallel.ForEach(messages,
    new ParallelOptions { MaxDegreeOfParallelism = 4 },
    x => Process(x)
);

where MaxDegreeOfParallelism = 4 caps the number of messages processed concurrently.
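Since the answer mentions Parallel LINQ: the actual PLINQ equivalent would use WithDegreeOfParallelism (a sketch):

// PLINQ version; WithDegreeOfParallelism caps the number of
// messages processed concurrently at 4.
messages.AsParallel()
        .WithDegreeOfParallelism(4)
        .ForAll(msg => Process(msg));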
Channels were introduced with .NET Core 3.0 (System.Threading.Channels, also available as a NuGet package for earlier targets).
The main benefit of this producer/consumer concurrency pattern is that you can also limit the input data processing to reduce resource impact.
This is especially helpful when processing millions of data records.
Instead of reading the whole dataset into memory at once, you can now consecutively query only chunks of the data and wait for the workers to process them before querying more.
Code sample with a bounded queue capacity of 10 messages and 3 consumer tasks:
/// <exception cref="System.AggregateException">Thrown on Consumer Task exceptions.</exception>
public static async Task ProcessMessages(List<string> messages)
{
    const int producerCapacity = 10, consumerTaskLimit = 3;
    var channel = Channel.CreateBounded<string>(producerCapacity);

    _ = Task.Run(async () =>
    {
        foreach (var msg in messages)
        {
            await channel.Writer.WriteAsync(msg);
            // blocking when channel is full
            // waiting for the consumer tasks to pop messages from the queue
        }

        channel.Writer.Complete();
        // signaling the end of queue so that
        // WaitToReadAsync will return false to stop the consumer tasks
    });

    var tokenSource = new CancellationTokenSource();
    CancellationToken ct = tokenSource.Token;

    var consumerTasks = Enumerable
        .Range(1, consumerTaskLimit)
        .Select(_ => Task.Run(async () =>
        {
            try
            {
                while (await channel.Reader.WaitToReadAsync(ct))
                {
                    ct.ThrowIfCancellationRequested();
                    while (channel.Reader.TryRead(out var message))
                    {
                        await Task.Delay(500);
                        Console.WriteLine(message);
                    }
                }
            }
            catch (OperationCanceledException) { }
            catch
            {
                tokenSource.Cancel();
                throw;
            }
        }))
        .ToArray();

    Task waitForConsumers = Task.WhenAll(consumerTasks);
    try { await waitForConsumers; }
    catch
    {
        foreach (var e in waitForConsumers.Exception.Flatten().InnerExceptions)
            Console.WriteLine(e.ToString());

        throw waitForConsumers.Exception.Flatten();
    }
}
As pointed out by Theodor Zoulias:
On multiple consumer exceptions, the remaining tasks will continue to run and have to take the load of the killed tasks. To avoid this, I implemented a CancellationToken to stop all the remaining tasks and handle the exceptions combined in the AggregateException of waitForConsumers.Exception.
Side note:
The Task Parallel Library (TPL) might be good at automatically limiting the tasks based on your local resources. But when you are processing data remotely via RPC, it's necessary to manually limit your RPC calls to avoid filling the network/processing stack!
If your Process method is async you can't use Task.Factory.StartNew, as it doesn't play well with an async delegate. There are also some other nuances when using it (see this for example).
The proper way to do it in this case is to use Task.Run. Here's @ClearLogic's answer modified for an async Process method.
static void Main(string[] args)
{
    int maxConcurrency = 5;
    List<string> messages = Enumerable.Range(1, 15).Select(e => e.ToString()).ToList();
    using (SemaphoreSlim concurrencySemaphore = new SemaphoreSlim(maxConcurrency))
    {
        List<Task> tasks = new List<Task>();
        foreach (var msg in messages)
        {
            concurrencySemaphore.Wait();
            var t = Task.Run(async () =>
            {
                try
                {
                    await Process(msg);
                }
                finally
                {
                    concurrencySemaphore.Release();
                }
            });
            tasks.Add(t);
        }
        Task.WaitAll(tasks.ToArray());
    }
    Console.WriteLine("Exited using block");
    Console.ReadKey();
}

private static async Task Process(string msg)
{
    await Task.Delay(2000);
    Console.WriteLine(msg);
}
You can create your own TaskScheduler and override QueueTask there.
protected virtual void QueueTask(Task task)
Then you can do anything you like.
One example here:
Limited concurrency level task scheduler (with task priority) handling wrapped tasks
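A usage sketch, assuming a scheduler class like the LimitedConcurrencyLevelTaskScheduler from the MSDN TaskScheduler sample (the class itself is not shown here):

// Tasks queued through this factory are handed to the custom scheduler,
// which runs at most 4 of them at a time.
var scheduler = new LimitedConcurrencyLevelTaskScheduler(4); // hypothetical: from the MSDN sample
var factory = new TaskFactory(scheduler);
foreach (var msg in messages)
    factory.StartNew(() => Process(msg));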
You can simply set the max degree of concurrency like this:

int maxConcurrency = 10;
var messages = new List<string>(); // assume this contains the 1000 items
using (SemaphoreSlim concurrencySemaphore = new SemaphoreSlim(maxConcurrency))
{
    foreach (var msg in messages)
    {
        Task.Factory.StartNew(() =>
        {
            concurrencySemaphore.Wait();
            try
            {
                Process(msg);
            }
            finally
            {
                concurrencySemaphore.Release();
            }
        });
    }
}
If you need in-order queuing (processing might finish in any order), there is no need for a semaphore. Old-fashioned if statements work fine:

const int maxConcurrency = 5;
List<Task> tasks = new List<Task>();
foreach (var arg in args)
{
    var t = Task.Run(() => { Process(arg); });
    tasks.Add(t);

    if (tasks.Count >= maxConcurrency)
    {
        Task.WaitAny(tasks.ToArray());
        // drop finished tasks, otherwise WaitAny returns
        // immediately on every following iteration
        tasks.RemoveAll(x => x.IsCompleted);
    }
}
Task.WaitAll(tasks.ToArray());
I ran into a similar problem where I wanted to produce 5000 results while calling apis, etc. So, I ran some speed tests.
Parallel.ForEach(products.Select(x => x.KeyValue).Distinct().Take(100), id =>
{
    // note: this ParallelOptions instance is created and discarded,
    // so it does not actually limit the parallelism here
    new ParallelOptions { MaxDegreeOfParallelism = 100 };
    GetProductMetaData(productsMetaData, client, id).GetAwaiter().GetResult();
});
produced 100 results in 30 seconds.
Parallel.ForEach(products.Select(x => x.KeyValue).Distinct().Take(100), id =>
{
    new ParallelOptions { MaxDegreeOfParallelism = 100 };
    GetProductMetaData(productsMetaData, client, id);
});
Moving the GetAwaiter().GetResult() to the individual async api calls inside GetProductMetaData resulted in 14.09 seconds to produce 100 results.
foreach (var id in ids.Take(100))
{
    GetProductMetaData(productsMetaData, client, id);
}
Complete non-async programming with the GetAwaiter().GetResult() in api calls resulted in 13.417 seconds.
var tasks = new List<Task>();
while (y < ids.Count())
{
    foreach (var id in ids.Skip(y).Take(100))
    {
        tasks.Add(GetProductMetaData(productsMetaData, client, id));
    }
    y += 100;
    Task.WhenAll(tasks).GetAwaiter().GetResult();
    Console.WriteLine($"Finished {y}, {sw.Elapsed}");
}
Forming a task list and working through 100 at a time resulted in a speed of 7.36 seconds.
using (SemaphoreSlim cons = new SemaphoreSlim(10))
{
    var tasks = new List<Task>();
    foreach (var id in ids.Take(100))
    {
        cons.Wait();
        var t = Task.Factory.StartNew(() =>
        {
            try
            {
                GetProductMetaData(productsMetaData, client, id);
            }
            finally
            {
                cons.Release();
            }
        });
        tasks.Add(t);
    }
    Task.WaitAll(tasks.ToArray());
}
Using SemaphoreSlim resulted in 13.369 seconds, but also took a moment to boot to start using it.
var throttler = new SemaphoreSlim(initialCount: take);
foreach (var id in ids)
{
    throttler.WaitAsync().GetAwaiter().GetResult();
    tasks.Add(Task.Run(async () =>
    {
        try
        {
            skip += 1;
            await GetProductMetaData(productsMetaData, client, id);
            if (skip % 100 == 0)
            {
                Console.WriteLine($"started {skip}/{count}, {sw.Elapsed}");
            }
        }
        finally
        {
            throttler.Release();
        }
    }));
}
Using SemaphoreSlim with a throttler for my async tasks took 6.12 seconds.
The answer for me in this specific project was to use a throttler with SemaphoreSlim. Although the while/foreach task list did sometimes beat the throttler, 4 out of 6 times the throttler won for 1000 records.
I realize I'm not using the OPs code, but I think this is important and adds to this discussion because how is sometimes not the only question that should be asked, and the answer is sometimes "It depends on what you are trying to do."
Now to answer the specific questions:
How to limit the maximum number of parallel tasks in c#: I showed how to limit the number of tasks that are processed at a time.
Can we guess how many maximum messages simultaneously get processed at the time (assuming normal Quad core processor), or can we limit the maximum number of messages to be processed at the time? I cannot guess how many will be processed at a time unless I set an upper limit but I can set an upper limit. Obviously different computers function at different speeds due to CPU, RAM etc. and how many threads and cores the program itself has access to as well as other programs running in tandem on the same computer.
How to ensure this message get processed in the same sequence/order of the Collection? If you want to process everything in a specific order, it is synchronous programming. The point of being able to run things asynchronously is ensuring that they can do everything without an order. As you can see from my code, the time difference is minimal in 100 records unless you use async code. In the event that you need an order to what you are doing, use asynchronous programming up until that point, then await and do things synchronously from there. For example, task1a.start, task2a.start, then later task1a.await, task2a.await... then later task1b.start task1b.await and task2b.start task 2b.await.
public static void RunTasks(List<NamedTask> importTaskList)
{
    List<NamedTask> runningTasks = new List<NamedTask>();
    try
    {
        foreach (NamedTask currentTask in importTaskList)
        {
            currentTask.Start();
            runningTasks.Add(currentTask);

            if (runningTasks.Where(x => x.Status == TaskStatus.Running).Count() >= MaxCountImportThread)
            {
                Task.WaitAny(runningTasks.ToArray());
            }
        }
        Task.WaitAll(runningTasks.ToArray());
    }
    catch (Exception ex)
    {
        Log.Fatal("ERROR!", ex);
    }
}
You can use a BlockingCollection. If the collection's bounded limit has been reached, producing will block until a consumer finishes an item. I find this pattern easier to understand and implement than SemaphoreSlim.
int TasksLimit = 10;
BlockingCollection<Task> tasks = new BlockingCollection<Task>(new ConcurrentBag<Task>(), TasksLimit);

void ProduceAndConsume()
{
    var producer = Task.Factory.StartNew(RunProducer);
    var consumer = Task.Factory.StartNew(RunConsumer);

    try
    {
        Task.WaitAll(new[] { producer, consumer });
    }
    catch (AggregateException ae) { }
}

void RunConsumer()
{
    foreach (var task in tasks.GetConsumingEnumerable())
    {
        task.Start();
    }
}

void RunProducer()
{
    for (int i = 0; i < 1000; i++)
    {
        tasks.Add(new Task(() => Thread.Sleep(1000), TaskCreationOptions.AttachedToParent));
    }
    tasks.CompleteAdding(); // signal the end of production so GetConsumingEnumerable completes
}
Note that RunProducer and RunConsumer are spawned as two independent tasks.

Can I use Win32.TaskScheduler to schedule my procedures?

I want to use http://taskscheduler.codeplex.com/
I can schedule external actions with it, like in the example. Can I schedule my internal actions with it, or is it only suitable for external tasks and I should use something else?
e.g.:
public static void cstask()
{
    using (TaskService ts = new TaskService())
    {
        const string taskName = "Test2";
        System.Action test = new System.Action(() =>
        {
            ;
        });
        Task t = ts.AddTask(taskName,
            new TimeTrigger() { StartBoundary = DateTime.Now + TimeSpan.FromHours(1), Enabled = false },
            //new ExecAction("notepad.exe", "c:\\test.log", "C:\\")
            (Microsoft.Win32.TaskScheduler.Action)test
        );
        ts.RootFolder.DeleteTask(taskName);
    }
}
This fails with:
Cannot convert type 'System.Action' to 'Microsoft.Win32.TaskScheduler.Action'
Seems like a job for Reactive Extensions
using System.Reactive;
using System.Reactive.Linq;
and
IDisposable subscription =
    Observable
        .Interval(TimeSpan.FromHours(1))
        .Take(1)
        .Subscribe(i =>
        {
            Console.WriteLine("I'm doing something delayed by an hour");
        });
Dispose the subscription to kill it before it runs. If you want your task to return something for processing, then it's a little different:
double SomeFunction()
{
    ...
}
and
public async Task<double> DoItInOneHour()
{
    return await Observable
        .Interval(TimeSpan.FromHours(1))
        .Take(1)
        .Select(i => SomeFunction());
}
will asynchronously wait for the timeout, evaluate SomeFunction and then return the result to the caller.
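For example, awaiting it from other async code:

// Completes roughly an hour later with SomeFunction's result.
double value = await DoItInOneHour();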
You can't invoke a System.Action directly with the Windows Task Scheduler.
But you can create a scheduled task that executes your application; the information needed to invoke an internal action can be passed through command-line arguments.
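A minimal sketch of that approach with the same library, assuming your executable recognizes a hypothetical --report argument in Main:

using (TaskService ts = new TaskService())
{
    TaskDefinition td = ts.NewTask();
    td.Triggers.Add(new TimeTrigger { StartBoundary = DateTime.Now + TimeSpan.FromHours(1) });
    // Re-launch this application with an argument that selects the internal action.
    td.Actions.Add(new ExecAction(Assembly.GetEntryAssembly().Location, "--report 42", null));
    ts.RootFolder.RegisterTaskDefinition("RunMyReport", td);
}
// In Main(string[] args): if args contains "--report", run the internal action.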

Anonymous Parallel Task Timers?

Maybe my brain is a bit fried so I'm missing some nice way to do the following... I want to be able to launch a timer through a Task that runs on a certain interval and on each interval checks some condition to decide whether it should cancel itself. What's the most elegant solution?
Optimally I'd like something like:
Task.Factory.StartNew(() =>
{
    Timer.Do(TimeSpan.FromMilliseconds(200), () => ShouldCancel(), () =>
    {
        //DoStuff
    });
});
Using a while/Thread.Sleep loop doesn't seem optimal. I guess I could define and use an ordinary timer, but it seems a bit clunky...
How about something like the following. I'm sure the API could be cleaned up a bit.
Points to note:
The DoWork method must support cooperative cancellation, this is the only cancellation approach supported by the Task Parallel Library.
The timer must start inside the Task, otherwise the Task may be created and scheduled but not executed, and the timer would be timing the task's wait time rather than its execution time.
If you want to provide other external mechanisms for cancellation (other tokens) then you need to pass in another context and link them. See: CancellationTokenSource.CreateLinkedTokenSource
This is only approximate as System.Threading.Timer only has millisecond accuracy. It should be good enough for limiting a Task to run for a few seconds.
public static class TimeLimitedTaskFactory
{
    public static Task StartNew<T>(Action<CancellationToken> action, int maxTime)
    {
        Task tsk = Task.Factory.StartNew(() =>
        {
            var cts = new CancellationTokenSource();
            System.Threading.Timer timer = new System.Threading.Timer(o =>
            {
                cts.Cancel();
                Console.WriteLine("Cancelled!");
            }, null, maxTime, int.MaxValue);

            action(cts.Token);
        });
        return tsk;
    }
}
class Program
{
    static void Main(string[] args)
    {
        int maxTime = 2000;
        int maxWork = 10;
        Task tsk = TimeLimitedTaskFactory
            .StartNew<int>((ctx) => DoWork(ctx, maxWork), maxTime);

        Console.WriteLine("Waiting on Task...");
        tsk.Wait();
        Console.WriteLine("Finished...");
        Console.ReadKey();
    }

    static void DoWork(CancellationToken ctx, int workSize)
    {
        int i = 0;
        while (!ctx.IsCancellationRequested && i < workSize)
        {
            Thread.Sleep(500);
            Console.WriteLine(" Working on {0}", ++i);
        }
    }
}
You can also use the Rx library.
var timerTask = Observable.Timer(TimeSpan.Zero, TimeSpan.FromSeconds(3));
timerTask.Subscribe(x =>
{
    //Do stuff here
});
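To get the self-cancelling behaviour the question asks for, the subscription can end itself with TakeWhile (a sketch; ShouldCancel and DoStuff are the question's placeholders):

// Ticks every 200 ms until ShouldCancel() returns true, at which point
// the sequence completes and the timer stops on its own.
IDisposable subscription = Observable
    .Interval(TimeSpan.FromMilliseconds(200))
    .TakeWhile(_ => !ShouldCancel())
    .Subscribe(_ => DoStuff());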
I think this is what you want:
var cancelToken = new CancellationTokenSource();
var tt = Task.Factory.StartNew(obj =>
{
    var tk = (CancellationTokenSource)obj;
    while (!tk.IsCancellationRequested)
    {
        if (condition) //your condition
        {
            //Do work
        }
        Thread.Sleep(1000);
    }
}, cancelToken);

Is there any way to start task using ContinueWith task?

My code:
var r = from x in new Task<int>(() => 1)
        from y in new Task<int>(() => x + 1)
        select y;

r.ContinueWith(x => Console.WriteLine(x.Result)).Start();
or
new Task<int>(() => 1)
    .ContinueWith(x => x.Result + 1)
    .ContinueWith(x => Console.WriteLine(x.Result))
    .Start();
Exception:
Start may not be called on a continuation task.
So I need to start the first task. Is there any way to call Start on the last task to run all the tasks?
Any reason not to use Task.Factory.StartNew (see the MSDN docs) for the first task? Yes, it's inconsistent, but it's fundamentally a different kind of task, in terms of being started explicitly rather than just as a continuation.
I'm not really sure what's wrong with just writing this:

var t1 = new Task<int>(() => 1);
var r = from x in t1
        from y in new Task<int>(() => x + 1)
        select y;
r.ContinueWith(x => Console.WriteLine(x.Result));
t1.Start();
or this:

var t = new Task<int>(() => 1);
t.ContinueWith(x => x.Result + 1)
 .ContinueWith(x => Console.WriteLine(x.Result));
t.Start();
That directly expresses what you actually want to do. (It's the initial task that you want to kick off. So what's wrong with invoking Start on that initial task?) Why are you looking for a syntax that obscures that?
EDIT: fixed first example...
EDIT 2 to add:
So I realise now that LinqToTasks expects task selectors to return running tasks. So the second from clause in your first example returns a task that nothing will ever run. So what you actually need is this:
var t1 = new Task<int>(() => 1);
var r = from x in t1
        from y in Task<int>.Factory.StartNew(() => x + 1)
        select y;
r.ContinueWith(x => Console.WriteLine(x.Result));
t1.Start();
Nothing else is going to call Start on the tasks produced in these from clauses. Since the relevant selectors don't actually get executed until the previous task completes, you're still in control of when to kick off the root task.
That appears to work, but it's pretty ugly. But that appears to be how LinqToTasks is designed... I think I'd stick with the regular function call syntax.
The problem is that selecting tasks with LINQ only builds a deferred query: nothing runs until you enumerate it!
So here's what you need to do:
var query =
    from i in Enumerable.Range(1, 4)
    let task = Task.Factory.StartNew(() => Tuple.Create(i, IsPrime(i))) // put a breakpoint here
    select task.ContinueWith(t =>
    {
        Console.WriteLine("{0} {1} prime.", t.Result.Item1, t.Result.Item2 ? "is" : "is not");
    });

// breakpoint never hit yet
query.ToArray(); // breakpoint hit here 4 times
// all tasks are now running and continuations will start
TaskEx.Await(query.ToArray()); // breakpoint hit 4 more times!!
I had the same problem today. I wanted to create a wrapper task that handles an error from an inner task. This is what I came up with:
var task = new Task(delegate
{
    throw e;
});
var continuation = task.ContinueWith(t =>
{
    if (t.IsCanceled)
    {
        Debug.WriteLine("IsCanceled: " + job.GetType());
    }
    else if (t.IsFaulted)
    {
        Debug.WriteLine("IsFaulted: " + job.GetType());
    }
    else if (t.IsCompleted)
    {
        Debug.WriteLine("IsCompleted: " + job.GetType());
    }
}, TaskContinuationOptions.ExecuteSynchronously); //or consider removing ExecuteSynchronously if your continuation task is going to take long

var wrapper = new Task(() =>
{
    task.Start();
    continuation.Wait();
});
return wrapper;
The key features here are that:
-the continuation part runs after the original task, as you want
-the wrapper is startable. Continuation tasks created with ContinueWith() are not startable.
A less key feature of this example is that the exception is being logged and discarded (solves my problem, not yours). You might want to do something different when exceptions occur in the continuation, such as rethrowing it as an exception of the current task so that it bubbles out.
As far as I'm aware, there's no sensible way to compose non-started tasks provided by the framework. The simplest solution I can think of is extension methods. Here are some examples which you could build on if you need this functionality.
Warning: Just as with passing around and composing tons of lambdas, if you find yourself needing these, it often means you are missing a type in your design that would simplify your code. Ask yourself what you gained by creating the subtasks.
/// <summary>
/// Compose tasks without starting them.
/// Waiting on the returned task waits for both components to complete.
/// An exception in the first task will stop the second task running.
/// </summary>
public static class TaskExtensions
{
    public static Task FollowedBy(this Task first, Task second)
    {
        return FollowedBy(first,
            () =>
            {
                second.Start();
                second.Wait();
            });
    }

    public static Task FollowedBy(this Task first, Action second)
    {
        return new Task(
            () =>
            {
                if (first.Status == TaskStatus.Created) first.Start();
                first.Wait();
                second();
            });
    }

    public static Task FollowedBy<T>(this Task first, Task<T> second)
    {
        return new Task<T>(
            () =>
            {
                if (first.Status == TaskStatus.Created) first.Start();
                first.Wait();
                second.Start();
                return second.Result;
            });
    }

    public static Task FollowedBy<T>(this Task<T> first, Action<T> second)
    {
        return new Task(
            () =>
            {
                if (first.Status == TaskStatus.Created) first.Start();
                var firstResult = first.Result;
                second(firstResult);
            });
    }

    public static Task<TSecond> FollowedBy<TFirst, TSecond>(this Task<TFirst> first, Func<TFirst, TSecond> second)
    {
        return new Task<TSecond>(
            () =>
            {
                if (first.Status == TaskStatus.Created) first.Start();
                return second(first.Result);
            });
    }
}
The answer is simple: ContinueWith automatically starts the continuation task once the antecedent completes, and the first task needs to be running.
var r = Task.Run(() => 1)
    .ContinueWith(x => x.Result + 1)
    .ContinueWith(x => Console.WriteLine(x.Result));
ContinueWith returns a task that starts by checking whether the previous task is done. This code works much like the code below:
var firstTask = new Task<int>(() => 1);
firstTask.Start();
var firstawaiter = firstTask.GetAwaiter();

var secondTask = new Task<int>(x => (int)x + 1, firstawaiter.GetResult());
firstawaiter.OnCompleted(() =>
{
    secondTask.Start();
});

var secondawaiter = secondTask.GetAwaiter();
var thirdTask = new Task(x => Console.WriteLine(x), secondawaiter.GetResult());
secondawaiter.OnCompleted(() =>
{
    thirdTask.Start();
});
So, if the first task is not completed, the next task will not be started.
And you do not need to start the ContinueWith block yourself.
