C#: build a list of tasks before executing them

I'm trying to build a list of tasks before executing them. Here's some example code:
public string Returnastring(string b)
{
    return b;
}

public string Returnanotherstring(string a)
{
    return a;
}

private void btnT_Click(object sender, EventArgs e)
{
    bool cont = true;
    var Returnastringtask = Task.Factory.StartNew(() => Returnastring("hi"));
    var Returnanotherstringtask = Task.Factory.StartNew(() => Returnanotherstring("bye"));

    if (cont)
    {
        Task.WaitAll(new Task[] { Returnastringtask });
    }
    else
    {
        Task.WaitAll(new Task[] { Returnanotherstringtask });
    }
}
I know this code doesn't behave how I expect: both tasks run. I basically want to create the tasks up front and then execute one or the other based on the bool. I don't want to create the tasks inside the true or false branches because I want to avoid duplicating code; for example, if cont is true I might want to run tasks 1, 2, 3, 4, but if cont is false I might want to run tasks 2, 3, 7, 8.

Well, another approach (which I find very direct):
var list = new List<Task>();
for (var i = 0; i < 10; ++i)
{
    var i2 = i;
    var t = new Task(() =>
    {
        Thread.Sleep(100);
        Console.WriteLine(i2);
    });
    list.Add(t);
    t.Start();
}
Task.WaitAll(list.ToArray());

Instead of using Task.Factory.StartNew to create the tasks (the clue is in the name: it starts them immediately), create them with new Task(...) and your lambdas, then call taskName.Start() inside whichever condition should begin them.
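A minimal sketch of that idea applied to the question's code (the method names and the cont flag come from the question; the task groupings are illustrative):

// Create the tasks without starting them.
var task1 = new Task<string>(() => Returnastring("hi"));
var task2 = new Task<string>(() => Returnanotherstring("bye"));

// Pick the subset to run based on the flag (e.g. tasks 1,2,3,4 vs 2,3,7,8 in the real code).
var toRun = cont
    ? new[] { task1 }
    : new[] { task2 };

foreach (var t in toRun)
    t.Start();

Task.WaitAll(toRun);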

You can create an array of Action based on a flag, and then use Parallel.Invoke() to run all the actions in the array in parallel and wait for them to finish.
You can use lambdas for the actions, which lets you assign their return values to local variables if you want.
Here's a complete compilable example. Try it with getFlag() returning true and again with it returning false:
using System;
using System.Threading;
using System.Threading.Tasks;

namespace ConsoleApp1
{
    sealed class Program
    {
        void run()
        {
            bool flag = getFlag();
            var results = new string[5];
            Action[] actions;

            if (flag)
            {
                actions = new Action[]
                {
                    () => results[0] = function("f1"),
                    () => results[1] = function("f2"),
                    () => results[2] = function("f3")
                };
            }
            else
            {
                actions = new Action[]
                {
                    () => results[3] = function("f4"),
                    () => results[4] = function("f5")
                };
            }

            Parallel.Invoke(actions); // No tasks are run until you call this.

            for (int i = 0; i < results.Length; ++i)
                Console.WriteLine("Result {0} = {1}", i, results[i]);
        }

        private bool getFlag()
        {
            return true; // Also try with this returning false.
        }

        string function(string param)
        {
            Thread.Sleep(100); // Simulate work.
            return param;
        }

        static void Main(string[] args)
        {
            new Program().run();
        }
    }
}

Task.Factory.StartNew actually begins your tasks. You want to set up the tasks and then run them based on some logic.
You can build your tasks wherever you like, but you should start them only after the logic. This example builds them after the logic.
Maybe you could run it like this:
if (A)
{
    doA();
}
else
{
    doB();
}
Then start your tasks inside the function you call like:
public void doA()
{
    var tasks = new Task[NumberOfTasks];
    for (int i = 0; i < NumberOfTasks; i++)
    {
        tasks[i] = Task.Factory.StartNew(() =>
        {
            //enter tasks here
            // i.e. task 1, 2, 3, 4
        }, token); // 'token' is a CancellationToken defined elsewhere
    }
    Task.WaitAll(tasks);
}

I based what I did on what Samuel did, except that I have a recursive event handler that needs to finish what it's doing because its child events depend on it having completed (for nesting controls in a dynamic UI in an ASP.NET app). This is for when you want to do the same thing, except you're handling an event and you are NOT multithreading, because you need to process multiple tasks synchronously without goofing around with your call stack.
private static Queue<Task> _dqEvents = new Queue<Task>();
private static bool _handlingDqEvent = false;

protected void HandleDynamicQuestion(int SourceQuestionId, int QuestionId)
{
    // Create a task so that we can handle our events in sequential order, since additional events
    // may fire before this task is completed and depend upon the completion of prior events.
    Task task = new Task(() => DoDynamicQuestion(SourceQuestionId, QuestionId));
    lock (_dqEvents) _dqEvents.Enqueue(task);

    if (!_handlingDqEvent)
    {
        try
        {
            // Lock out any other calls in the stack from hitting this chunk of code.
            lock (_dqEvents) _handlingDqEvent = true;

            // Now run all events in the queue, including any added deeper in the call stack
            // that were added to this queue before we finished this iteration of the loop.
            while (_dqEvents.Any())
            {
                Task qt;
                lock (_dqEvents) qt = _dqEvents.Dequeue();
                qt.RunSynchronously();
            }
        }
        finally
        {
            lock (_dqEvents) _handlingDqEvent = false;
        }
    }
    else
    {
        // We exit the method if we're already handling an event, as the addition of new tasks to the
        // static queue will be handled synchronously. Basically, this lets us escape the call stack
        // without processing the event until we're ready, since the handling of the grandchild event
        // is dependent upon its parent completing.
        return;
    }
}

private void DoDynamicQuestion(int SourceQuestionId, int QuestionId)
{
    // Does some stuff that has no dependency on synchronicity.
    // Does some stuff that may eventually raise the event above.
    // Does some other stuff that has to complete before events it triggers can process correctly.
}

Related

C# Windows Async Pinging Network - different results each run

I've written a class that asynchronously pings a subnet. It works, however, the number of hosts returned will sometimes change between runs. Some questions:
Am I doing something wrong in the code below?
What can I do to make it work better?
The ScanIPAddressesAsync() method is called like this:
NetworkDiscovery nd = new NetworkDiscovery("192.168.50.");
nd.RaiseIPScanCompleteEvent += HandleScanComplete;
nd.ScanIPAddressesAsync();
namespace BPSTestTool
{
    public class IPScanCompleteEvent : EventArgs
    {
        public List<String> IPList { get; set; }

        public IPScanCompleteEvent(List<String> _list)
        {
            IPList = _list;
        }
    }

    public class NetworkDiscovery
    {
        private static object m_lockObj = new object();
        private List<String> m_ipsFound = new List<string>();
        private String m_ipBase = null;

        public List<String> IPList
        {
            get { return m_ipsFound; }
        }

        public EventHandler<IPScanCompleteEvent> RaiseIPScanCompleteEvent;

        public NetworkDiscovery(string ipBase)
        {
            this.m_ipBase = ipBase;
        }

        public async void ScanIPAddressesAsync()
        {
            var tasks = new List<Task>();
            m_ipsFound.Clear();
            await Task.Run(() => AsyncScan());
            return;
        }

        private async void AsyncScan()
        {
            List<Task> tasks = new List<Task>();
            for (int i = 2; i < 255; i++)
            {
                String ip = m_ipBase + i.ToString();
                if (m_ipsFound.Contains(ip) == false)
                {
                    for (int x = 0; x < 2; x++)
                    {
                        Ping p = new Ping();
                        var task = HandlePingReplyAsync(p, ip);
                        tasks.Add(task);
                    }
                }
            }

            await Task.WhenAll(tasks).ContinueWith(t =>
            {
                OnRaiseIPScanCompleteEvent(new IPScanCompleteEvent(m_ipsFound));
            });
        }

        protected virtual void OnRaiseIPScanCompleteEvent(IPScanCompleteEvent args)
        {
            RaiseIPScanCompleteEvent?.Invoke(this, args);
        }

        private async Task HandlePingReplyAsync(Ping ping, String ip)
        {
            PingReply reply = await ping.SendPingAsync(ip, 1500);
            if (reply != null && reply.Status == System.Net.NetworkInformation.IPStatus.Success)
            {
                lock (m_lockObj)
                {
                    if (m_ipsFound.Contains(ip) == false)
                    {
                        m_ipsFound.Add(ip);
                    }
                }
            }
        }
    }
}
One problem I see is async void. The only reason async void is even allowed is for event handlers. If it's not an event handler, it's a red flag.
Asynchronous methods always start running synchronously until the first await that acts on an incomplete Task. In your code, that is at await Task.WhenAll(tasks). At that point, AsyncScan returns - before all the tasks have completed. Usually, it would return a Task that will let you know when it's done, but since the method signature is void, it cannot.
So now look at this:
await Task.Run(() => AsyncScan());
When AsyncScan() returns, then the Task returned from Task.Run completes and your code moves on, before all of the pings have finished.
So when you report your results, the number of results will be random, depending on how many happened to finish before you displayed the results.
If you want make sure that all of the pings are done before continuing, then change AsyncScan() to return a Task:
private async Task AsyncScan()
And change the Task.Run to await it:
await Task.Run(async () => await AsyncScan());
However, you could also just get rid of the Task.Run and just have this:
await AsyncScan();
Task.Run runs the code in a separate thread. The only reason to do that is in a UI app where you want to move CPU-heavy computations off of the UI thread. When you're just doing network requests like this, that's not necessary.
On top of that, you're also using async void here:
public async void ScanIPAddressesAsync()
Which means that wherever you call ScanIPAddressesAsync() is unable to wait until everything is done. Change that to async Task and await it too.
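For completeness, a hedged sketch of what the corrected signatures could look like once both methods return Task (the bodies are abbreviated):

public async Task ScanIPAddressesAsync()
{
    m_ipsFound.Clear();
    await AsyncScan();   // now waits until every ping has been handled
}

private async Task AsyncScan()
{
    var tasks = new List<Task>();
    // ... build the ping tasks exactly as before ...
    await Task.WhenAll(tasks);
    OnRaiseIPScanCompleteEvent(new IPScanCompleteEvent(m_ipsFound));
}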
This code needs a lot of refactoring and bugs like this in concurrency are hard to pinpoint. My bet is on await Task.Run(() => AsyncScan()); which is wrong because AsyncScan() is async and Task.Run(...) will return before it is complete.
My second guess is m_ipsFound, which is shared state. Many threads might be reading and writing it simultaneously, and List<T> is not a thread-safe data type for that.
Also, as a side point, a bare return on the last line of a method does not add to readability, and async void is a practice to avoid. Always use async Task even if you return nothing. You can read more in this very good answer.
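As one hedged way to address the shared-state point, the field could be switched to a thread-safe, set-like collection so the Contains/Add pair is atomic; this is only a sketch, keeping the names from the question:

// Requires System.Collections.Concurrent. Replaces List<string> guarded only on writes.
private readonly ConcurrentDictionary<string, bool> m_ipsFound = new ConcurrentDictionary<string, bool>();

private async Task HandlePingReplyAsync(Ping ping, string ip)
{
    PingReply reply = await ping.SendPingAsync(ip, 1500);
    if (reply != null && reply.Status == System.Net.NetworkInformation.IPStatus.Success)
    {
        m_ipsFound.TryAdd(ip, true);   // thread-safe, no explicit lock needed
    }
}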

How can I implement a concurrent execution queue class?

I need a class that will execute actions on the thread pool, but these actions should be queued. For example:
method 1
method 2
method 3
When someone calls method 1 from their thread, they can also call method 2 or method 3, and all three methods can run concurrently; but when another call comes in for method 1, 2, or 3, the thread pool should block that call until the earlier execution has completed.
Something like the picture below:
Should I use channels?
To "Should I use channels?", the answer is yes, but there are other features available too.
Dataflow
.NET already offers this feature through the TPL Dataflow classes. You can use an ActionBlock class to pass messages (i.e. data) to a worker method that executes in the background with guaranteed order and a configurable degree of parallelism. Channels are a newer feature that does essentially the same job.
What you describe is actually the simplest way of using an ActionBlock - just post data messages to it and have it process them one by one:
void Method1(MyDataObject1 data) {...}

var block = new ActionBlock<MyDataObject1>(Method1);

//Start sending data to the block
foreach (var msg in someListOfItems)
{
    block.Post(msg);
}
By default, an ActionBlock has an infinite input queue. It will use only one task to process messages asynchronously, in the order they are posted.
When you're done with it, you can tell it to Complete() and await asynchronously for all remaining items to finish processing:
block.Complete();
await block.Completion;
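If one worker task isn't enough, the degree of parallelism mentioned above is configurable through ExecutionDataflowBlockOptions; a hedged sketch (the numbers are arbitrary):

var block = new ActionBlock<MyDataObject1>(Method1, new ExecutionDataflowBlockOptions
{
    MaxDegreeOfParallelism = 4,   // process up to 4 messages at a time
    BoundedCapacity = 100         // optional: cap the input queue instead of leaving it unbounded
});

With a bounded capacity, prefer await block.SendAsync(msg) over Post so producers wait for space instead of having messages rejected.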
To handle different methods, you can simply use multiple blocks, e.g.:
var block1=new ActionBlock<MyDataObject1>(Method1);
var block2=new ActionBlock<MyDataObject1>(Method2);
Channels
Channels are a lower-level feature than blocks. This means you have to write more code but you get far better control on how the "processing blocks" work. In fact, you can probably rewrite the TPL Dataflow library using channels.
You could create a processing block similar to an ActionBlock with the following (a bit naive) method:
ChannelWriter<TIn> Work<TIn>(Action<TIn> action)
{
    var channel = Channel.CreateUnbounded<TIn>();
    var workerTask = Task.Run(async () =>
    {
        await foreach (var msg in channel.Reader.ReadAllAsync())
        {
            action(msg);
        }
    });
    var writer = channel.Writer;
    return writer;
}
This method creates a channel and runs a task in the background to read data asynchronously and process it. I'm cheating "a bit" here by using await foreach and ChannelReader.ReadAllAsync(), which are available in C# 8 and .NET Core 3.0.
This method can be used like a block:
ChannelWriter<DataObject1> writer1 = Work<DataObject1>(Method1);
foreach (var msg in someListOfItems)
{
    writer1.WriteAsync(msg);
}
writer1.Complete();
There's a lot more to Channels though. SignalR for example uses them to allow streaming of notifications to the clients.
Here is my suggestion. For each synchronous method, an asynchronous method should be added. For example the method FireTheGun is synchronous:
private static void FireTheGun(int bulletsCount)
{
    var ratata = Enumerable.Repeat("Ta", bulletsCount).Prepend("Ra");
    Console.WriteLine(String.Join("-", ratata));
}
The asynchronous counterpart FireTheGunAsync is very simple, because the complexity of queuing the synchronous action is delegated to a helper method QueueAsync.
public static Task FireTheGunAsync(int bulletsCount)
{
    return QueueAsync(FireTheGun, bulletsCount);
}
Here is the implementation of QueueAsync. Each action has its dedicated SemaphoreSlim, to prevent multiple concurrent executions:
private static ConcurrentDictionary<MethodInfo, SemaphoreSlim> semaphores =
    new ConcurrentDictionary<MethodInfo, SemaphoreSlim>();

public static Task QueueAsync<T1>(Action<T1> action, T1 param1)
{
    return Task.Run(async () =>
    {
        var semaphore = semaphores
            .GetOrAdd(action.Method, key => new SemaphoreSlim(1));
        await semaphore.WaitAsync();
        try
        {
            action(param1);
        }
        finally
        {
            semaphore.Release();
        }
    });
}
Usage example:
FireTheGunAsync(5);
FireTheGunAsync(8);
Output:
Ra-Ta-Ta-Ta-Ta-Ta
Ra-Ta-Ta-Ta-Ta-Ta-Ta-Ta-Ta
Implementing versions of QueueAsync with different number of parameters should be trivial.
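For example, a hedged sketch of a two-parameter overload following the same pattern:

public static Task QueueAsync<T1, T2>(Action<T1, T2> action, T1 param1, T2 param2)
{
    return Task.Run(async () =>
    {
        // Same per-method semaphore as the one-parameter version.
        var semaphore = semaphores.GetOrAdd(action.Method, key => new SemaphoreSlim(1));
        await semaphore.WaitAsync();
        try
        {
            action(param1, param2);
        }
        finally
        {
            semaphore.Release();
        }
    });
}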
Update: My previous implementation of QueueAsync has the probably undesirable behavior of executing the actions in random order. This happens because the second task may be the first one to acquire the semaphore. Below is an implementation that guarantees the correct order of execution. The performance could be bad in case of high contention, because each task enters a loop until it takes the semaphore in the right order.
private class QueueInfo
{
    public SemaphoreSlim Semaphore = new SemaphoreSlim(1);
    public int TicketToRide = 0;
    public int Current = 0;
}

private static ConcurrentDictionary<MethodInfo, QueueInfo> queues =
    new ConcurrentDictionary<MethodInfo, QueueInfo>();

public static Task QueueAsync<T1>(Action<T1> action, T1 param1)
{
    var queue = queues.GetOrAdd(action.Method, key => new QueueInfo());
    var ticket = Interlocked.Increment(ref queue.TicketToRide);
    return Task.Run(async () =>
    {
        while (true) // Loop until our ticket becomes current
        {
            await queue.Semaphore.WaitAsync();
            try
            {
                if (Interlocked.CompareExchange(ref queue.Current,
                    ticket, ticket - 1) == ticket - 1)
                {
                    action(param1);
                    break;
                }
            }
            finally
            {
                queue.Semaphore.Release();
            }
        }
    });
}
What about this solution?
public class ConcurrentQueue
{
    private Dictionary<byte, PoolFiber> Actionsfiber;

    public ConcurrentQueue()
    {
        Actionsfiber = new Dictionary<byte, PoolFiber>()
        {
            { 1, new PoolFiber() },
            { 2, new PoolFiber() },
            { 3, new PoolFiber() },
        };
        foreach (var fiber in Actionsfiber.Values)
        {
            fiber.Start();
        }
    }

    public void ExecuteAction(Action Action, byte Code)
    {
        if (Actionsfiber.ContainsKey(Code))
            Actionsfiber[Code].Enqueue(() => { Action.Invoke(); });
        else
            Console.WriteLine($"invalid byte code");
    }
}

public static void SomeAction1()
{
    Console.WriteLine($"{DateTime.Now} Action 1 is working");
    for (long i = 0; i < 2400000000; i++)
    {
    }
    Console.WriteLine($"{DateTime.Now} Action 1 stopped");
}

public static void SomeAction2()
{
    Console.WriteLine($"{DateTime.Now} Action 2 is working");
    for (long i = 0; i < 5000000000; i++)
    {
    }
    Console.WriteLine($"{DateTime.Now} Action 2 stopped");
}

public static void SomeAction3()
{
    Console.WriteLine($"{DateTime.Now} Action 3 is working");
    for (long i = 0; i < 5000000000; i++)
    {
    }
    Console.WriteLine($"{DateTime.Now} Action 3 stopped");
}

public static void Main(string[] args)
{
    ConcurrentQueue concurrentQueue = new ConcurrentQueue();
    concurrentQueue.ExecuteAction(SomeAction1, 1);
    concurrentQueue.ExecuteAction(SomeAction2, 2);
    concurrentQueue.ExecuteAction(SomeAction3, 3);
    concurrentQueue.ExecuteAction(SomeAction1, 1);
    concurrentQueue.ExecuteAction(SomeAction2, 2);
    concurrentQueue.ExecuteAction(SomeAction3, 3);
    Console.WriteLine($"press any key to exit the program");
    Console.ReadKey();
}
The output:
8/5/2019 7:56:57 AM Action 1 is working
8/5/2019 7:56:57 AM Action 3 is working
8/5/2019 7:56:57 AM Action 2 is working
8/5/2019 7:57:08 AM Action 1 stopped
8/5/2019 7:57:08 AM Action 1 is working
8/5/2019 7:57:15 AM Action 2 stopped
8/5/2019 7:57:15 AM Action 2 is working
8/5/2019 7:57:16 AM Action 3 stopped
8/5/2019 7:57:16 AM Action 3 is working
8/5/2019 7:57:18 AM Action 1 stopped
8/5/2019 7:57:33 AM Action 2 stopped
8/5/2019 7:57:33 AM Action 3 stopped
PoolFiber is a class in the ExitGames.Concurrency.Fibers namespace.
More info:
How To Avoid Race Conditions And Other Multithreading Issues?

Parallel processing using TPL in windows service

I have a Windows service that consumes a messaging system to fetch messages. I have also created a callback mechanism, using the Timer class, which checks for a message after some fixed interval and then fetches and processes it. Previously, the service processed messages one by one, but I want the processing to run in parallel once a message arrives: if the first message arrives it should be processed on one task, and even if that processing has not finished, when the configured callback interval elapses (the callback works now) the next message should be picked up and processed on a different task.
Below is my code:
Task.Factory.StartNew(() =>
{
    Subsriber<Message> subsriber = new Subsriber<Message>()
    {
        Interval = 1000
    };
    subsriber.Callback(Process, m => m != null);
});

public static void Process(Message message)
{
    if (message != null)
    {
        // Processing logic
    }
    else
    {
    }
}
But using the Task Factory I am not able to control the number of tasks running in parallel; in my case I want to configure the number of tasks so that messages run on whichever of those tasks are available.
Update:
Updated my above code to add multiple tasks
Below is the code:
private static void Main()
{
    try
    {
        int taskCount = 5;

        Task.Factory.StartNewAsync(() =>
        {
            Subsriber<Message> consumer = new Subsriber<Message>()
            {
                Interval = 1000
            };
            consumer.CallBack(Process, msg => msg != null);
        }, taskCount);

        Console.ReadLine();
    }
    catch (Exception e)
    {
        Console.WriteLine(e.Message);
    }
}

public static void StartNewAsync(this TaskFactory target, Action action, int taskCount)
{
    var tasks = new Task[taskCount];
    for (int i = 0; i < taskCount; i++)
    {
        tasks[i] = target.StartNew(action);
    }
}

public static void Process(Message message)
{
    if (message != null)
    {
    }
    else
    {
    }
}
I think what you're looking for would result in quite a large sample, so I'm just trying to demonstrate how you would do this with ActionBlock<T>. There are still a lot of unknowns, so I've left the sample as a skeleton you can build on. In the sample, the ActionBlock will handle and process all your messages in parallel as they're received from your messaging system.
public class Processor
{
    private readonly IMessagingSystem _messagingSystem;
    private readonly ActionBlock<Message> _handler;
    private bool _pollForMessages;

    public Processor(IMessagingSystem messagingSystem)
    {
        _messagingSystem = messagingSystem;
        _handler = new ActionBlock<Message>(msg => Process(msg), new ExecutionDataflowBlockOptions()
        {
            MaxDegreeOfParallelism = 5 // or any configured value
        });
    }

    public async Task Start()
    {
        _pollForMessages = true;
        while (_pollForMessages)
        {
            var msg = await _messagingSystem.ReceiveMessageAsync();
            await _handler.SendAsync(msg);
        }
    }

    public void Stop()
    {
        _pollForMessages = false;
    }

    private void Process(Message message)
    {
        //handle message
    }
}
More examples and ideas:
Ok, sorry I'm short on time, but here's the general idea/skeleton of what I was thinking of as an alternative.
If I'm honest, though, I think the ActionBlock<T> is the better option, as there's just so much done for you, the only limit being that you can't dynamically scale the amount of work it will do at once (although the limit can be set quite high). If you go this way instead, you can have more control, or have a dynamic number of tasks running, but you'll have to do a lot of things manually; e.g., if you want to limit the number of tasks running at a time, you'd have to implement a queueing system (something ActionBlock handles for you) and then maintain it. I guess it depends on how many messages you're receiving and how fast your process handles them.
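If you do go the manual route and want to cap how many handlers run at once without building a full queueing system, one hedged option (not part of the skeleton below) is to gate the work with a SemaphoreSlim:

// Allow at most 5 message handlers to run concurrently (the number is arbitrary).
private static readonly SemaphoreSlim _throttle = new SemaphoreSlim(5);

private static async Task HandleMessageAsync(Message message)
{
    await _throttle.WaitAsync();
    try
    {
        Process(message);   // the existing Process method from the question
    }
    finally
    {
        _throttle.Release();
    }
}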
You'll have to check it out and think about how it applies to your use case, as I think some of the details are a little sketchily implemented on my side, particularly around the ConcurrentBag idea.
So the idea behind what I've thrown together here is that you can start any number of tasks, add to the tasks running, or cancel tasks individually by using the collection.
The main thing, I think, is making the method that the Callback runs fire off a thread that does the work, instead of subscribing within a separate thread.
I used Task.Factory.StartNew as you did, but stored the returned Task object in a holder object (TaskInfo, called ProcessorInfo in the code below) which also has its CancellationTokenSource and its Id (assigned externally) as properties, and then added that to a collection of those holders, which is a property on the class this is all part of:
Updated: to avoid this being too confusing, I've just updated the code that was here previously.
You'll have to update bits of it and fill in the blanks in places, like with whatever you have for my HeartbeatController and the few events that get called, because they're beyond the scope of the question, but the idea would be the same.
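The holder class itself isn't shown in the skeleton; a minimal sketch of what it might contain, using the names the code below refers to:

public class ProcessorInfo
{
    public int ProcessorId { get; set; }
    public Task Task { get; set; }
    public CancellationTokenSource CancellationTokenSource { get; set; }
}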
public class TaskContainer
{
    private ConcurrentBag<ProcessorInfo> Tasks;

    public TaskContainer()
    {
        Tasks = new ConcurrentBag<ProcessorInfo>();
    }

    //entry point
    //UPDATED
    public void StartAndMonitor(int processorCount)
    {
        for (int i = 0; i <= processorCount; i++)
        {
            Processor processor = new Processor(i);
            CreateProcessorTask(processor);
        }
        this.IsRunning = true;
        MonitorTasks();
    }

    private void CreateProcessorTask(Processor processor)
    {
        CancellationTokenSource cancellationTokenSource = new CancellationTokenSource();
        Task taskInstance = Task.Factory.StartNew(
            () => processor.Start(cancellationTokenSource.Token)
        );
        //bind status update event
        processor.ProcessorStatusUpdated += ReportProcessorProcess;
        Tasks.Add(new ProcessorInfo()
        {
            ProcessorId = processor.ProcessorId,
            Task = taskInstance,
            CancellationTokenSource = cancellationTokenSource
        });
    }

    //This method gets called once, but the HeartbeatController gets an action as a param that it then
    //executes on a timer. I haven't included that but you get the idea.
    //This method also checks for tasks that have stopped and restarts them if the manifest call says they should be running.
    //It will also start any new tasks included in the manifest and stop any that aren't included in the manifest.
    internal void MonitorTasks()
    {
        HeartbeatController.Beat(() =>
        {
            HeartBeatHappened?.Invoke(this, null);
            List<int> tasksToStart = new List<int>();

            //This is an API call or whatever drives your config that says which tasks must be running.
            var newManifest = this.GetManifest(Properties.Settings.Default.ResourceId);

            //Task removed check - if a Processor is removed from the task pool, cancel it if running and remove it from the Tasks list.
            List<int> instanceIds = new List<int>();
            newManifest.Processors.ForEach(x => instanceIds.Add(x.ProcessorId));
            var removed = Tasks.Select(x => x.ProcessorId).ToList().Except(instanceIds).ToList();
            if (removed.Count() > 0)
            {
                foreach (var extaskId in removed)
                {
                    var task = Tasks.FirstOrDefault(x => x.ProcessorId == extaskId);
                    task.CancellationTokenSource?.Cancel();
                }
            }

            foreach (var newtask in newManifest.Processors)
            {
                var oldtask = Tasks.FirstOrDefault(x => x.ProcessorId == newtask.ProcessorId);

                //Existing task check
                if (oldtask != null && oldtask.Task != null)
                {
                    if (!oldtask.Task.IsCanceled && (oldtask.Task.IsCompleted || oldtask.Task.IsFaulted))
                    {
                        var ex = oldtask.Task.Exception;
                        tasksToStart.Add(oldtask.ProcessorId);
                        continue;
                    }
                }
                else //New task check
                    tasksToStart.Add(newtask.ProcessorId);
            }

            foreach (var item in tasksToStart)
            {
                var taskToRemove = Tasks.FirstOrDefault(x => x.ProcessorId == item);
                if (taskToRemove != null)
                    Tasks.Remove(taskToRemove); // NOTE: ConcurrentBag<T> has no Remove; use a collection that supports removal or rebuild the bag here.
                var task = newManifest.Processors.FirstOrDefault(x => x.ProcessorId == item);
                if (task != null)
                {
                    CreateProcessorTask(task);
                }
            }
        });
    }
}
//UPDATED
public class Processor
{
    private int ProcessorId;
    private Subsriber<Message> subsriber;

    public Processor(int processorId) => ProcessorId = processorId;

    public void Start(CancellationToken token)
    {
        Subsriber<Message> subsriber = new Subsriber<Message>()
        {
            Interval = 1000
        };
        subsriber.Callback(Process, m => m != null);
    }

    private void Process()
    {
        //do work
    }
}
Hope this gives you an idea of how else you can approach your problem and that I didn't miss the point :).
Update
To use events to update progress or to report which tasks are processing, I'd extract them into their own class with subscribe methods on it; when creating a new instance of that class, assign the event to a handler in the parent class, which can then update your UI or do whatever you want with that info.
So the content of Process() would look more like this:
Processor processor = new Processor();
Task task = Task.Factory.StartNew(() => processor.ProcessMessage(cancellationTokenSource.Token));
processor.StatusUpdated += ReportProcess;

Can I change an int value inside a task while it's running? (C#)

I'm currently learning how to use Tasks in C#. I want to be able to run 2 tasks at the same time, then, when the first task ends, tell the code to stop the second one. I have tried many things but none have worked:
I tried looking for something like task.Stop and have not found it. I am using task.Wait on the first task, so when the first one ends I have to do something to stop the second one.
Since the second one is infinite (it's an eternal loop), I tried making the loop's condition something I could change from the main code, but it's as if the task were a method and the variables inside it were unique to it.
TL;DR: I want to know if I can change a parameter inside a task in order to stop it from outside its code. Do the tasks themselves take any parameters, and can I change them in the main code after they start running?
If none of the previous things are possible, is there any way to stop an infinite task?
CODE:
Task a = new Task(() =>
{
    int sd = 3;
    while (sd < 20)
    {
        Console.Write("peanuts");
        sd++; //this i can change cuz its like local to the task
    }
});
a.Start();

// infinite task
Task b = new Task(() =>
{
    int s = 3; // parameter i want to change to stop it
    while (s < 10)
    {
        Console.Write(s + 1);
    }
});
b.Start();

a.Wait();
// Now here I want to stop task b
Console.WriteLine("peanuts");
Console.ReadKey();
Try this:
public static void Run()
{
    CancellationTokenSource cts = new CancellationTokenSource();
    Task1(cts);
    Task2(cts.Token);
}

private static void Task2(CancellationToken token)
{
    Task.Factory.StartNew(() =>
    {
        int s = 3; // parameter i want to change to stop it
        while (!token.IsCancellationRequested)
        {
            Console.Write(s + 1);
        }
    }, token);
}

private static void Task1(CancellationTokenSource cts)
{
    Task.Factory.StartNew(() =>
    {
        int sd = 3;
        while (sd < 20)
        {
            Console.Write("peanuts");
            sd++;
        }
    }).ContinueWith(t => cts.Cancel());
}
The CancellationTokenSource will be cancelled when Task1 is finished. Task2 checks the cancellation token on each iteration and exits its infinite loop when cancellation is requested.
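If you also want the Task created in Task2 to end up in the Canceled state (rather than RanToCompletion) once cancellation happens, a hedged variation of the same method is to throw on cancellation instead of just leaving the loop; this is only a sketch of that option:

private static void Task2(CancellationToken token)
{
    Task.Factory.StartNew(() =>
    {
        int s = 3;
        while (s < 10)
        {
            // Throws OperationCanceledException once Task1's continuation calls cts.Cancel();
            // because the same token was passed to StartNew, the task ends in TaskStatus.Canceled.
            token.ThrowIfCancellationRequested();
            Console.Write(s + 1);
        }
    }, token);
}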

Anonymous Parallel Task Timers?

Maybe my brain is a bit fried, so I'm missing some nice way to do the following... I want to be able to launch a timer through a Task that runs on a certain interval and on each interval checks some condition as to whether it should cancel itself. What's the most elegant solution?
Optimally I'd like something like:
Task.Factory.StartNew(() =>
{
    Timer.Do(TimeSpan.FromMilliseconds(200), () => ShouldCancel(), () =>
    {
        //DoStuff
    });
});
Using a while/Thread.Sleep loop doesn't seem optimal. I guess I could define and use an ordinary timer, but it seems a bit clunky...
How about something like the following. I'm sure the API could be cleaned up a bit.
Points to note:
The DoWork method must support cooperative cancellation, this is the only cancellation approach supported by the Task Parallel Library.
The timer must start inside the Task, otherwise the Task may be created and scheduled but not executed and the timer will be timing task wait time not execution time.
If you want to provide other external mechanisms for cancellation (other tokens) then you need to pass in another context and link them. See: CancellationTokenSource.CreateLinkedTokenSource
This is only approximate as System.Threading.Timer only has millisecond accuracy. It should be good enough for limiting a Task to run for a few seconds.
public static class TimeLimitedTaskFactory
{
    public static Task StartNew<T>
        (Action<CancellationToken> action, int maxTime)
    {
        Task tsk = Task.Factory.StartNew(() =>
        {
            var cts = new CancellationTokenSource();
            System.Threading.Timer timer = new System.Threading.Timer(o =>
            {
                cts.Cancel();
                Console.WriteLine("Cancelled!");
            }, null, maxTime, int.MaxValue);
            action(cts.Token);
        });
        return tsk;
    }
}

class Program
{
    static void Main(string[] args)
    {
        int maxTime = 2000;
        int maxWork = 10;
        Task tsk = TimeLimitedTaskFactory
            .StartNew<int>((ctx) => DoWork(ctx, maxWork), maxTime);
        Console.WriteLine("Waiting on Task...");
        tsk.Wait();
        Console.WriteLine("Finished...");
        Console.ReadKey();
    }

    static void DoWork(CancellationToken ctx, int workSize)
    {
        int i = 0;
        while (!ctx.IsCancellationRequested && i < workSize)
        {
            Thread.Sleep(500);
            Console.WriteLine(" Working on {0}", ++i);
        }
    }
}
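The third point above mentions linking external cancellation tokens; a hedged sketch of what that could look like (the externalToken parameter and this overload are assumptions, not part of the code above):

public static Task StartNew(Action<CancellationToken> action, int maxTime, CancellationToken externalToken)
{
    return Task.Factory.StartNew(() =>
    {
        // Combine the internal time-limit source with the caller's token so either can cancel the work.
        using (var cts = CancellationTokenSource.CreateLinkedTokenSource(externalToken))
        using (var timer = new System.Threading.Timer(_ => cts.Cancel(), null, maxTime, Timeout.Infinite))
        {
            action(cts.Token);
        }
    }, externalToken);
}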
You can also use the Rx library.
var timerTask = Observable.Timer(TimeSpan.Zero, TimeSpan.FromSeconds(3));
timerTask.Subscribe(x =>
{
    //Do stuff here
});
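If the Rx timer should also stop itself when the question's condition becomes true, one hedged option (ShouldCancel() is the placeholder from the question) is to end the sequence with TakeWhile, or keep the subscription and dispose it:

var subscription = Observable
    .Timer(TimeSpan.Zero, TimeSpan.FromMilliseconds(200))
    .TakeWhile(_ => !ShouldCancel())  // stop ticking once the condition says to cancel
    .Subscribe(_ =>
    {
        //Do stuff on each tick
    });

// ...or stop it explicitly from elsewhere:
subscription.Dispose();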
I think this is what you want:
var cancelToken = new CancellationTokenSource();
var tt = Task.Factory.StartNew(obj =>
{
    var tk = (CancellationTokenSource)obj;
    while (!tk.IsCancellationRequested)
    {
        if (condition) //your condition
        {
            //Do work
        }
        Thread.Sleep(1000);
    }
}, cancelToken);
