Threads that return data in .NET - C#

I'm writing a program where I typically start five threads. The threads finish in a non-deterministic order. Each thread calls a method that returns a List.
I'm doing this:
var masterList = new List<string>();
foreach (var threadParam in threadParams)
{
    var expression = threadParam;
    ThreadStart sub = () => MyMethod(expression);
    var thread = new Thread(sub)
    {
        Name = expression
    };
    listThreads.Add(thread);
    thread.Start();
}
var abort = true;
while (abort) //Wait until all threads finish
{
    var count = 0;
    foreach (var list in listThreads)
    {
        if (!list.IsAlive)
        {
            count++;
        }
    }
    if (count == listThreads.Count)
    {
        abort = false;
    }
}
So here is the problem:
Each thread, when it terminates, produces a list that I would like to append to the masterList declared earlier.
How would one go about this?
Also, I KNOW there must be a better way than the loop below to wait for all the threads to finish:
var abort = true;
while (abort) //Wait until all threads finish
{
    var count = 0;
    foreach (var list in listThreads)
    {
        if (!list.IsAlive)
        {
            count++;
        }
    }
    if (count == listThreads.Count)
    {
        abort = false;
    }
}

Use a WaitHandle
Here's an example:
using System;
using System.Threading;

class ThreadSleeper
{
    int seconds;
    AutoResetEvent napDone = new AutoResetEvent(false);

    private ThreadSleeper(int seconds)
    {
        this.seconds = seconds;
    }

    public void Nap()
    {
        Console.WriteLine("Napping {0} seconds", seconds);
        Thread.Sleep(seconds * 1000);
        Console.WriteLine("{0} second nap finished", seconds);
        napDone.Set();
    }

    public static WaitHandle DoSleep(int seconds)
    {
        ThreadSleeper ts = new ThreadSleeper(seconds);
        Thread thread = new Thread(new ThreadStart(ts.Nap));
        thread.Start();
        return ts.napDone;
    }
}

public class OperationsThreadsWaitingwithWaitHandle
{
    public static void Main()
    {
        WaitHandle[] waits = new WaitHandle[2];
        waits[0] = ThreadSleeper.DoSleep(8);
        waits[1] = ThreadSleeper.DoSleep(4);
        Console.WriteLine("Waiting for threads to finish");
        WaitHandle.WaitAll(waits);
        Console.WriteLine("Threads finished");
    }
}
Links to check out:
Threads:Waiting with WaitHandle
Jon Skeet's post
Difference between Barrier in C# 4.0 and WaitHandle in C# 3.0?
Novice C# threading: WaitHandles
WaitHandle Exceptions and Work Arounds
Multithreading with C#
WaitHandle, AutoResetEvent and ManualResetEvent Classes in VB.Net

The best way would be to make each thread its own object. Don't have any intermingling with other objects: all you do is construct it (passing in the variables), add yourself as a listener, and start it.
When it's done, it stores the values in a member variable and notifies your listener.
Your listener can retrieve the values at leisure.
The obvious shortcut of returning the values directly to the listener works, but you may find this version more flexible later (and it's really not much more code).
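A minimal sketch of that worker-object idea (the class, the listener delegate, and MyMethod are placeholders invented for illustration, not code from the question):

using System;
using System.Collections.Generic;
using System.Threading;

public class ListWorker
{
    private readonly string _expression;
    private readonly Action<ListWorker, List<string>> _onDone; // the "listener" callback

    public List<string> Result { get; private set; }

    public ListWorker(string expression, Action<ListWorker, List<string>> onDone)
    {
        _expression = expression;
        _onDone = onDone;
    }

    public Thread Start()
    {
        var thread = new Thread(() =>
        {
            Result = MyMethod(_expression);   // hypothetical worker method from the question
            _onDone(this, Result);            // notify the listener when finished
        }) { Name = _expression };
        thread.Start();
        return thread;
    }

    private List<string> MyMethod(string expression)
    {
        // Placeholder for the real work described in the question.
        return new List<string> { expression };
    }
}

The listener (for example, whatever owns the masterList) can lock and append when it is notified, or simply read Result after joining the thread.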

You can of course use regular delegates and APM (the Asynchronous Programming Model) as well.
Note that the pattern you describe is normally called a Future: a background job that promises to return something later on (a return on investment, if you will).
Here's a short example, in the form of a console program.
Output:
sequential: 3224 ms
parallel: 2074 ms
Of course, since I'm not constructing explicit threads, I leave it to the threadpool system to figure out how many threads to run in parallel.
There are also provisions for being informed when the background work has completed, via a callback method, as well as WaitHandle support for explicitly checking and waiting with a timeout, etc.
And the source:
using System;
using System.Collections.Generic;
using System.Threading;
using System.Diagnostics;

namespace SO1215227
{
    public class Program
    {
        public static void Main()
        {
            Stopwatch sw = new Stopwatch();
            sw.Start();
            var list1 = Sequence(1, 100);
            var list2 = Sequence(101, 200);
            var list3 = Sequence(201, 300);
            sw.Stop();
            Console.Out.WriteLine("sequential: " + sw.ElapsedMilliseconds + " ms");
            sw.Reset();

            Func<Int32, Int32, List<Int32>> listProducer = Sequence;

            sw.Start();
            var list1Background = listProducer.BeginInvoke(1, 100, null, null);
            var list2Background = listProducer.BeginInvoke(101, 200, null, null);
            var list3Background = listProducer.BeginInvoke(201, 300, null, null);
            list1 = listProducer.EndInvoke(list1Background);
            list2 = listProducer.EndInvoke(list2Background);
            list3 = listProducer.EndInvoke(list3Background);
            sw.Stop();
            Console.Out.WriteLine("parallel: " + sw.ElapsedMilliseconds + " ms");

            Console.Out.Write("Press enter to exit...");
            Console.In.ReadLine();
        }

        private static List<Int32> Sequence(Int32 from, Int32 to)
        {
            List<Int32> result = new List<Int32>();
            for (Int32 index = from; index <= to; index++)
            {
                result.Add(index);
                Thread.Sleep(10); // simulate I/O wait
            }
            return result;
        }
    }
}
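As a side note, delegate BeginInvoke/EndInvoke only exists on the .NET Framework (it is not supported on .NET Core and later), so on newer runtimes the same Future-style pattern is usually written with Task.Run and Task.WhenAll. A rough equivalent of the parallel part, assuming it were added to the Program class above:

// Rough Task-based equivalent of the BeginInvoke/EndInvoke section (not part of the original answer).
public static async Task MainAsync()
{
    var t1 = Task.Run(() => Sequence(1, 100));
    var t2 = Task.Run(() => Sequence(101, 200));
    var t3 = Task.Run(() => Sequence(201, 300));
    List<Int32>[] lists = await Task.WhenAll(t1, t2, t3); // all three lists, in order
}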

Check out the WaitHandle class and WaitHandle.WaitAll (I've used the ManualResetEvent class as the WaitHandle in some of my code, but only because that's what an example I followed used; I don't know if there is something better for your situation). Fire off all five threads, give them a reference to your master list, lock the list when adding to it, then have each thread signal completion. Use WaitHandle.WaitAll to block until all five have signaled completion.
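In code, that approach might look roughly like the sketch below (MyMethod and threadParams are the names from the question; everything else is illustrative):

var masterList = new List<string>();
var masterLock = new object();
var doneEvents = new List<ManualResetEvent>();

foreach (var threadParam in threadParams)
{
    var expression = threadParam;
    var done = new ManualResetEvent(false);
    doneEvents.Add(done);

    new Thread(() =>
    {
        List<string> partial = MyMethod(expression); // the worker method from the question
        lock (masterLock)
        {
            masterList.AddRange(partial); // append to the shared list under the lock
        }
        done.Set(); // signal completion
    }) { Name = expression }.Start();
}

// Blocks until every thread has signalled its event.
WaitHandle.WaitAll(doneEvents.ToArray());

Keep in mind the WaitAll caveats mentioned elsewhere on this page: at most 64 handles per call, and it cannot be used with multiple handles on an STA thread.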

Related

Why does my workers' work distribution count not total the number of produced items in this System.Threading.Channels sample?

Following this post, I have been playing with System.Threading.Channels to get confident enough to use it in my production code, replacing the Thread/Monitor.Pulse/Wait based approach I am currently using (described in the referenced post).
Basically, I created a sample with a bounded channel where I run a couple of producer tasks at the beginning and, without waiting, start my consumer tasks, which start pulling elements from the channel.
After waiting for the producer tasks to complete, I then mark the channel as complete, so the consumer tasks can stop listening for new channel elements.
My channel is a Channel<Action<string>>, and in each action I increment the count for the given worker in the WorkDistribution concurrent dictionary; at the end of the sample I print it so I can check that I consumed as many items as I expected, and also how the channel distributed the actions between the consumers.
For some reason this "Work Distribution footer" is not printing the same number of items as the total number of items produced by the producer tasks.
What am I missing?
Some of the variables present were added for the sole purpose of helping troubleshoot.
Here's the full code:
public class ChannelSolution
{
object LockObject = new object();
Channel<Action<string>> channel;
int ItemsToProduce;
int WorkersCount;
int TotalItemsProduced;
ConcurrentDictionary<string, int> WorkDistribution;
CancellationToken Ct;
public ChannelSolution(int workersCount, int itemsToProduce, int maxAllowedItems,
CancellationToken ct)
{
WorkersCount = workersCount;
ItemsToProduce = itemsToProduce;
channel = Channel.CreateBounded<Action<string>>(maxAllowedItems);
Console.WriteLine($"Created channel with max {maxAllowedItems} items");
WorkDistribution = new ConcurrentDictionary<string, int>();
Ct = ct;
}
async Task ProduceItems(int cycle)
{
for (var i = 0; i < ItemsToProduce; i++)
{
var index = i + 1 + (ItemsToProduce * cycle);
bool queueHasRoom;
var stopwatch = new Stopwatch();
stopwatch.Start();
do
{
if (Ct.IsCancellationRequested)
{
Console.WriteLine("exiting read loop - cancellation requested !");
break;
}
queueHasRoom = await channel.Writer.WaitToWriteAsync();
if (!queueHasRoom)
{
if (Ct.IsCancellationRequested)
{
Console.WriteLine("exiting read loop - cancellation"
+ " requested !");
break;
}
if (stopwatch.Elapsed.Seconds % 3 == 0)
Console.WriteLine("Channel reached maximum capacity..."
+ " producer waiting for items to be freed...");
}
}
while (!queueHasRoom);
channel.Writer.TryWrite((workerName) => action($"A{index}", workerName));
Console.WriteLine($"Channel has room, item {index} added"
+ $" - channel items count: [{channel.Reader.Count}]");
Interlocked.Increment(ref TotalItemsProduced);
}
}
List<Task> GetConsumers()
{
var tasks = new List<Task>();
for (var i = 0; i < WorkersCount; i++)
{
var workerName = $"W{(i + 1).ToString("00")}";
tasks.Add(Task.Run(async () =>
{
while (await channel.Reader.WaitToReadAsync())
{
if (Ct.IsCancellationRequested)
{
Console.WriteLine("exiting write loop - cancellation"
+ "requested !");
break;
}
if (channel.Reader.TryRead(out var action))
{
Console.WriteLine($"dequed action in worker [{workerName}]");
action(workerName);
}
}
}));
}
return tasks;
}
void action(string actionNumber, string workerName)
{
Console.WriteLine($"processing {actionNumber} in worker {workerName}...");
var secondsToWait = new Random().Next(2, 5);
Thread.Sleep(TimeSpan.FromSeconds(secondsToWait));
Console.WriteLine($"action {actionNumber} completed by worker {workerName}"
+ $" after {secondsToWait} secs! channel items left:"
+ $" [{channel.Reader.Count}]");
if (WorkDistribution.ContainsKey(workerName))
{
lock (LockObject)
{
WorkDistribution[workerName]++;
}
}
else
{
var succeeded = WorkDistribution.TryAdd(workerName, 1);
if (!succeeded)
{
Console.WriteLine($"!!! failed incremeting dic value !!!");
}
}
}
public void Summarize(Stopwatch stopwatch)
{
Console.WriteLine("--------------------------- Thread Work Distribution "
+ "------------------------");
foreach (var kv in this.WorkDistribution)
Console.WriteLine($"thread: {kv.Key} items consumed: {kv.Value}");
Console.WriteLine($"Total actions consumed: "
+ $"{WorkDistribution.Sum(w => w.Value)} - Elapsed time: "
+ $"{stopwatch.Elapsed.Seconds} secs");
}
public void Run(int producerCycles)
{
var stopwatch = new Stopwatch();
stopwatch.Start();
var producerTasks = new List<Task>();
Console.WriteLine($"Started running at {DateTime.Now}...");
for (var i = 0; i < producerCycles; i++)
{
producerTasks.Add(ProduceItems(i));
}
var consumerTasks = GetConsumers();
Task.WaitAll(producerTasks.ToArray());
Console.WriteLine($"-------------- Completed waiting for PRODUCERS -"
+ " total items produced: [{TotalItemsProduced}] ------------------");
channel.Writer.Complete(); //just so I can complete this demo
Task.WaitAll(consumerTasks.ToArray());
Console.WriteLine("----------------- Completed waiting for CONSUMERS "
+ "------------------");
//Task.WaitAll(GetConsumers().Union(producerTasks/*.Union(
// new List<Task> { taskKey })*/).ToArray());
//Console.WriteLine("Completed waiting for tasks");
Summarize(stopwatch);
}
}
And here is the calling code in Program.cs
var workersCount = 5;
var itemsToProduce = 10;
var maxItemsInQueue = 5;
var cts = new CancellationTokenSource();
var producerConsumerTests = new ChannelSolution(workersCount, itemsToProduce,
    maxItemsInQueue, cts.Token);
producerConsumerTests.Run(2);
From a quick look there is a race condition in the ProduceItems method, around the queueHasRoom variable. You don't need this variable. The channel.Writer.TryWrite method will tell you whether there is room in the channel's buffer or not. Alternatively you could simply await the WriteAsync method, instead of using the WaitToWriteAsync/TryWrite combo. AFAIK this combo is intended as a performance optimization of the former method. If you absolutely need to know whether there is available space before attempting to post a value, then the Channel<T> is probably not a suitable container for your use case. You'll need to find something that can be locked during the whole operation of "check-for-available-space -> create-the-value -> post-the-value", so that this operation can be made atomic.
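For illustration, a sketch of the simplified producer loop using await WriteAsync (reusing the question's fields and action method; this is one possible shape, not the only one):

async Task ProduceItems(int cycle)
{
    for (var i = 0; i < ItemsToProduce; i++)
    {
        var index = i + 1 + (ItemsToProduce * cycle);
        try
        {
            // WriteAsync asynchronously waits until the bounded channel has room,
            // then writes the item; no WaitToWriteAsync/TryWrite race.
            await channel.Writer.WriteAsync(
                workerName => action($"A{index}", workerName), Ct);
            Interlocked.Increment(ref TotalItemsProduced);
        }
        catch (OperationCanceledException)
        {
            Console.WriteLine("exiting produce loop - cancellation requested !");
            break;
        }
    }
}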
As a side note, using a lock to protect the updating of the ConcurrentDictionary is redundant. The ConcurrentDictionary offers the AddOrUpdate method, which can atomically replace a value it contains with another value. You might have to lock if the dictionary contained mutable objects and you needed to mutate those objects in a thread-safe way. But in your case the values are of type Int32, which is an immutable struct. You don't change it; you just replace it with a new Int32 that is created based on the existing value:
WorkDistribution.AddOrUpdate(workerName, 1, (_, existing) => existing + 1);

TaskFactory, Starting a new Task when one ends

I have found many examples of using TaskFactory, but I could not find anything about starting several tasks, watching for when one ends, and then starting another one.
I always want to have 10 tasks working.
I want something like this
int nTotalTasks = 10;
int nCurrentTask = 0;
Task<bool>[] tasks = new Task<bool>[nTotalTasks];
for (int i = 0; i < 1000; i++)
{
    string param1 = "test";
    string param2 = "test";
    if (nCurrentTask < 10) // if there are less than 10 tasks then start another one
        tasks[nCurrentTask++] = Task.Factory.StartNew<bool>(() =>
        {
            MyClass cls = new MyClass();
            bool bRet = cls.Method1(param1, param2, i); // takes up to 2 minutes to finish
            return bRet;
        });
    // How can I stop the for loop until a new task is finished and start a new one?
}
Check out the Task.WaitAny method:
Waits for any of the provided Task objects to complete execution.
Example from the documentation:
var t1 = Task.Factory.StartNew(() => DoOperation1());
var t2 = Task.Factory.StartNew(() => DoOperation2());
Task.WaitAny(t1, t2);
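Applied to the question, a rough sketch could keep up to 10 in-flight tasks and use Task.WaitAny to free a slot before starting the next one (MyClass/Method1 come from the question; the rest is illustrative):

var running = new List<Task<bool>>();
for (int i = 0; i < 1000; i++)
{
    int captured = i; // copy the loop variable so each task sees its own value
    string param1 = "test";
    string param2 = "test";

    if (running.Count >= 10)
    {
        // Block until at least one of the 10 in-flight tasks finishes, then recycle its slot.
        int finished = Task.WaitAny(running.ToArray());
        running.RemoveAt(finished);
    }

    running.Add(Task.Factory.StartNew(() =>
    {
        MyClass cls = new MyClass();                  // from the question
        return cls.Method1(param1, param2, captured); // takes up to 2 minutes to finish
    }));
}
Task.WaitAll(running.ToArray()); // wait for the remaining tasks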
I would use a combination of Microsoft's Reactive Framework (NuGet "Rx-Main") and TPL for this. It becomes very simple.
Here's the code:
int nTotalTasks = 10;
string param1 = "test";
string param2 = "test";

IDisposable subscription =
    Observable
        .Range(0, 1000)
        .Select(i => Observable.FromAsync(() => Task.Factory.StartNew<bool>(() =>
        {
            MyClass cls = new MyClass();
            bool bRet = cls.Method1(param1, param2, i); // takes up to 2 minutes to finish
            return bRet;
        })))
        .Merge(nTotalTasks)
        .ToArray()
        .Subscribe((bool[] results) =>
        {
            /* Do something with the results. */
        });
The key part here is the .Merge(nTotalTasks) which limits the number of concurrent tasks.
If you need to stop the processing partway through, just call subscription.Dispose() and everything gets cleaned up for you.
If you want to process each result as it is produced, you can change the code from the .Merge(...) onwards like this:
.Merge(nTotalTasks)
.Subscribe((bool result) =>
{
/* Do something with each result. */
});
This isn't complete, but it should be all you need: wait for the first task to complete and then start the next one.
Task.WaitAny(task to wait on);
Task.Factory.StartNew()
Have you seen the BlockingCollection class? It allows you to have multiple threads running in parallel, and you can wait for results from one task before executing another. See more information here.
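For illustration, a sketch of that idea with a bounded BlockingCollection feeding exactly 10 long-running consumers (MyClass/Method1 are from the question; the capacity and other details are assumptions):

// Requires System.Collections.Concurrent, System.Linq and System.Threading.Tasks.
var work = new BlockingCollection<int>(boundedCapacity: 100);

var consumers = Enumerable.Range(0, 10).Select(_ => Task.Factory.StartNew(() =>
{
    foreach (int i in work.GetConsumingEnumerable())
    {
        MyClass cls = new MyClass();     // from the question
        cls.Method1("test", "test", i);  // takes up to 2 minutes to finish
    }
}, TaskCreationOptions.LongRunning)).ToArray();

for (int i = 0; i < 1000; i++)
    work.Add(i);             // blocks when the bounded capacity is reached

work.CompleteAdding();       // no more items will be produced
Task.WaitAll(consumers);     // wait for the 10 workers to drain the queue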
The answer depends on whether the tasks to be scheduled are CPU or I/O bound.
For CPU-intensive work I would use the Parallel.For() API, setting the number of threads/tasks through the MaxDegreeOfParallelism property of ParallelOptions.
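For example, a minimal sketch of the CPU-bound case (the degree of parallelism and the body are placeholders):

// At most 10 iterations run concurrently.
var options = new ParallelOptions { MaxDegreeOfParallelism = 10 };
Parallel.For(0, 1000, options, i =>
{
    MyClass cls = new MyClass();     // from the question
    cls.Method1("test", "test", i);
});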
For I/O bound work the number of concurrently executing tasks can be significantly larger than the number of available CPUs, so the strategy is to rely on async methods as much as possible, which reduces the total number of threads waiting for completion.
How can I stop the for loop until a new task is finished and start a
new one?
The loop can be throttled by using await:
static void Main(string[] args)
{
    var task = DoWorkAsync();
    task.Wait();
    // handle results
    // task.Result;
    Console.WriteLine("Done.");
}

async static Task<bool> DoWorkAsync()
{
    const int NUMBER_OF_SLOTS = 10;
    string param1 = "test";
    string param2 = "test";
    var results = new bool[NUMBER_OF_SLOTS];
    AsyncWorkScheduler ws = new AsyncWorkScheduler(NUMBER_OF_SLOTS);
    for (int i = 0; i < 1000; ++i)
    {
        await ws.ScheduleAsync((slotNumber) => DoWorkAsync(i, slotNumber, param1, param2, results));
    }
    ws.Complete();
    await ws.Completion;
    return results.All(r => r); // requires System.Linq
}

async static Task DoWorkAsync(int index, int slotNumber, string param1, string param2, bool[] results)
{
    results[slotNumber] = results[slotNumber] && await Task.Factory.StartNew<bool>(() =>
    {
        MyClass cls = new MyClass();
        bool bRet = cls.Method1(param1, param2, index); // takes up to 2 minutes to finish
        return bRet;
    });
}
A helper class, AsyncWorkScheduler, uses TPL Dataflow components as well as Task.WhenAll():
class AsyncWorkScheduler
{
public AsyncWorkScheduler(int numberOfSlots)
{
m_slots = new Task[numberOfSlots];
m_availableSlots = new BufferBlock<int>();
m_errors = new List<Exception>();
m_tcs = new TaskCompletionSource<bool>();
m_completionPending = 0;
// Initial state: all slots are available
for(int i = 0; i < m_slots.Length; ++i)
{
m_slots[i] = Task.FromResult(false);
m_availableSlots.Post(i);
}
}
public async Task ScheduleAsync(Func<int, Task> action)
{
if (Volatile.Read(ref m_completionPending) != 0)
{
throw new InvalidOperationException("Unable to schedule new items.");
}
// Acquire a slot
int slotNumber = await m_availableSlots.ReceiveAsync().ConfigureAwait(false);
// Schedule a new task for a given slot
var task = action(slotNumber);
// Store a continuation on the task to handle completion events
m_slots[slotNumber] = task.ContinueWith(t => HandleCompletedTask(t, slotNumber), TaskContinuationOptions.ExecuteSynchronously);
}
public async void Complete()
{
if (Interlocked.CompareExchange(ref m_completionPending, 1, 0) != 0)
{
return;
}
// Signal the queue's completion
m_availableSlots.Complete();
await Task.WhenAll(m_slots).ConfigureAwait(false);
// Set completion
if (m_errors.Count != 0)
{
m_tcs.TrySetException(m_errors);
}
else
{
m_tcs.TrySetResult(true);
}
}
public Task Completion
{
get
{
return m_tcs.Task;
}
}
void SetFailed(Exception error)
{
lock(m_errors)
{
m_errors.Add(error);
}
}
void HandleCompletedTask(Task task, int slotNumber)
{
if (task.IsFaulted || task.IsCanceled)
{
SetFailed(task.Exception);
return;
}
if (Volatile.Read(ref m_completionPending) == 1)
{
return;
}
// Release a slot
m_availableSlots.Post(slotNumber);
}
int m_completionPending;
List<Exception> m_errors;
BufferBlock<int> m_availableSlots;
TaskCompletionSource<bool> m_tcs;
Task[] m_slots;
}

WaitAll for multiple handles on an STA thread is not supported [duplicate]

This question already has answers here:
WaitAll for multiple handles on a STA thread is not supported
(5 answers)
Closed 9 years ago.
Hi all, I have this exception when I run my app.
I work on .NET 3.5, so I cannot use Task.
waitall for multiple handles on sta thread is not supported
This is the code:
private void ThreadPopFunction(ContactList SelectedContactList, List<User> AllSelectedUsers)
{
    int NodeCount = 0;
    AllSelectedUsers.EachParallel(user =>
    {
        NodeCount++;
        if (user != null)
        {
            if (user.OCSEnable)
            {
                string messageExciption = string.Empty;
                if (!string.IsNullOrEmpty(user.SipURI))
                {
                    //Lync.Lync.Lync lync = new Lync.Lync.Lync(AdObjects.Pools);
                    List<Pool> myPools = AdObjects.Pools;
                    if (new Lync.Lync.Lync(myPools).Populate(user, SelectedContactList, out messageExciption))
                    {
                    }
                }
            }
        }
    });
}
And this is the extension method I use to work with multithreading:
public static void EachParallel<T>(this IEnumerable<T> list, Action<T> action)
{
    // enumerate the list so it can't change during execution
    // TODO: why is this happening?
    list = list.ToArray();
    var count = list.Count();

    if (count == 0)
    {
        return;
    }
    else if (count == 1)
    {
        // if there's only one element, just execute it
        action(list.First());
    }
    else
    {
        // Launch each method in its own thread
        const int MaxHandles = 64;
        for (var offset = 0; offset <= count / MaxHandles; offset++)
        {
            // break up the list into 64-item chunks because of a limitation in WaitHandle
            var chunk = list.Skip(offset * MaxHandles).Take(MaxHandles);

            // Initialize the reset events to keep track of completed threads
            var resetEvents = new ManualResetEvent[chunk.Count()];

            // spawn a thread for each item in the chunk
            int i = 0;
            foreach (var item in chunk)
            {
                resetEvents[i] = new ManualResetEvent(false);
                ThreadPool.QueueUserWorkItem(new WaitCallback((object data) =>
                {
                    int methodIndex = (int)((object[])data)[0];

                    // Execute the method and pass in the enumerated item
                    action((T)((object[])data)[1]);

                    // Tell the calling thread that we're done
                    resetEvents[methodIndex].Set();
                }), new object[] { i, item });
                i++;
            }

            // Wait for all threads to execute
            WaitHandle.WaitAll(resetEvents);
        }
    }
}
If you can help me, I'll appreciate your support.
OK, as you're using .NET 3.5, you can't use the TPL introduced with .NET 4.0.
STA thread or not, in your case there is a much simpler and more efficient approach than WaitAll: you could simply have a counter and a single WaitHandle. Here's some code (I can't test it right now, but it should be fine):
// No MaxHandle limitation ;)
for (var offset = 0; offset <= count; offset++)
{
// Initialize the reset event
var resetEvent = new ManualResetEvent();
// Queue action in thread pool for each item in the list
int counter = count;
foreach (var item in list)
{
ThreadPool.QueueUserWorkItem(new WaitCallback((object data) =>
{
int methodIndex =
(int) ((object[]) data)[0];
// Execute the method and pass in the enumerated item
action((T) ((object[]) data)[1]);
// Decrements counter atomically
Interlocked.Decrement(ref counter);
// If we're at 0, then last action was executed
if (Interlocked.Read(ref counter) == 0)
{
resetEvent.Set();
}
}), new object[] {i, item});
}
// Wait for the single WaitHandle
// which is only set when the last action executed
resetEvent.WaitOne();
}
Also FYI, ThreadPool.QueueUserWorkItem doesn't spawn a thread each time it's called (I'm saying that because of the comment "spawn a thread for each item in the chunk"). It uses a pool of threads, so it mostly reuses existing threads.
For those like me who need to use the examples:
ken2k's solution is great and it works, but with a few corrections (he said he didn't test it). Here is ken2k's working example (it worked for me):
// No MaxHandle limitation ;)
for (var offset = 0; offset <= count; offset++)
{
// Initialize the reset event
var resetEvent = new ManualResetEvent(false);
// Queue action in thread pool for each item in the list
long counter = count;
// use a thread for each item in the chunk
int i = 0;
foreach (var item in list)
{
ThreadPool.QueueUserWorkItem(new WaitCallback((object data) =>
{
int methodIndex =
(int) ((object[]) data)[0];
// Execute the method and pass in the enumerated item
action((T) ((object[]) data)[1]);
// Decrements counter atomically
Interlocked.Decrement(ref counter);
// If we're at 0, then last action was executed
if (Interlocked.Read(ref counter) == 0)
{
resetEvent.Set();
}
}), new object[] {i, item});
}
// Wait for the single WaitHandle
// which is only set when the last action executed
resetEvent.WaitOne();
}
Actually, there is a way to use (at least a good part of) the TPL in .NET 3.5: there is a backport that was done for the Rx project.
You can find it here: http://www.nuget.org/packages/TaskParallelLibrary
Maybe this will help.
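Assuming the backported package exposes the same basic Task API as .NET 4.0 (Task.Factory.StartNew, Task.WaitAll), which is its purpose, the chunked EachParallel above could then be reduced to something along these lines (a sketch, not tested against the backport):

// Requires System.Linq and the backported System.Threading.Tasks on .NET 3.5.
public static void EachParallel<T>(this IEnumerable<T> list, Action<T> action)
{
    var tasks = list
        .Select(item => Task.Factory.StartNew(() => action(item)))
        .ToArray();
    Task.WaitAll(tasks); // no 64-handle chunking needed
}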

Monitor multiple Threading.Timers after Disposal

I have a process which creates a dynamic list of timers (System.Threading.Timer) and continues to run until a signal is received to terminate. Once a termination signal is received, I want any executing timer callbacks to complete (see below):
private IList<Timer> _timers = new List<Timer>();
...
...
private void WaitOnExecutingThreads()
{
    var waiters = new List<ManualResetEvent>(_timers.Count);
    foreach (var timer in _timers)
    {
        var onWait = new ManualResetEvent(false);
        waiters.Add(onWait);
        timer.Dispose(onWait);
    }
    WaitHandle.WaitAll(waiters.ToArray());
    waiters.ForEach(x => x.Dispose());
}
This code works right now, but I would like to monitor the ongoing thread callbacks once the timers are disposed. My intent is to write to a log at a given interval "Timer A is still running".
I started playing with:
ThreadPool.RegisterWaitForSingleObject(....)
and added the following:
(Note: I created a class ThreadContext which contains the timer and associated data.)
private void WaitOnExecutingThreads()
{
    var waiters = new List<ManualResetEvent>();
    WaitOrTimerCallback IsRunning = (x, timeout) => { if (timeout) { Log(x + "is still running"); } };
    foreach (var threadContext in _threadContexts)
    {
        var onWait = new ManualResetEvent(false);
        threadContext.Timer.Dispose(onWait);
        ThreadPool.RegisterWaitForSingleObject(onWait, IsRunning, threadContext.ThreadInfo.Id, new TimeSpan(0, 0, 30), false);
        waiters.Add(onWait);
    }
    WaitHandle.WaitAll(waiters.ToArray());
    waiters.ForEach(x => x.Dispose());
}
I feel like this should be a straightforward task in C#/.NET 4.0. In my simple unit test, my IsRunning callback fires quite a bit after the wait. I do not perform any further execution after this call, but I am writing quite a bit of code that I am not too comfortable with, and I feel like this will fail.
Is there a simpler solution, or am I misunderstanding something?
UPDATE
Based on Peter R.'s suggestion I came up with the following. Granted, it's more lines of code, but I don't have to register a wait for each object. If threads are still executing after disposal, I sleep for 10 seconds and check again (for this example).
private void WaitOnExecutingThreads()
{
    foreach (var threadContext in _threadContexts)
    {
        threadContext.DisposeWaiter = new ManualResetEvent(false);
        threadContext.Timer.Dispose(threadContext.DisposeWaiter);
    }

    while (_threadContexts.Count > 0)
    {
        for (var i = 0; i < _threadContexts.Count; i++)
        {
            var threadContext = _threadContexts[i];
            var isComplete = threadContext.DisposeWaiter.WaitOne(0);
            if (isComplete)
            {
                Console.WriteLine(string.Format("{0}: {1} has completed", DateTime.Now, threadContext.Name));
                _threadContexts.RemoveAt(i);
            }
            else
            {
                Console.WriteLine(string.Format("{0}: {1} is still running", DateTime.Now, threadContext.Name));
            }
        }
        if (_threadContexts.Count > 0)
        {
            Thread.Sleep(new TimeSpan(0, 0, 10));
        }
    }
}
....
public class ThreadContext
{
    public string Name { get; set; }
    public Timer Timer { get; set; }
    public WaitHandle DisposeWaiter { get; set; }
}
If your handlers haven't completed, your ManualResetEvents will not be signalled. So, you could simply test if the event is in a signaled state or not. i.e.
var isComplete = waiters[0].WaitOne(0);

Create multiple threads and wait for all of them to complete

How can I create multiple threads and wait for all of them to complete?
It depends on which version of the .NET Framework you are using. .NET 4.0 made thread management a whole lot easier using Tasks:
class Program
{
    static void Main(string[] args)
    {
        Task task1 = Task.Factory.StartNew(() => doStuff());
        Task task2 = Task.Factory.StartNew(() => doStuff());
        Task task3 = Task.Factory.StartNew(() => doStuff());

        Task.WaitAll(task1, task2, task3);
        Console.WriteLine("All threads complete");
    }

    static void doStuff()
    {
        //do stuff here
    }
}
In previous versions of .NET you could use the BackgroundWorker object, use ThreadPool.QueueUserWorkItem(), or create your threads manually and use Thread.Join() to wait for them to complete:
static void Main(string[] args)
{
    Thread t1 = new Thread(doStuff);
    t1.Start();
    Thread t2 = new Thread(doStuff);
    t2.Start();
    Thread t3 = new Thread(doStuff);
    t3.Start();

    t1.Join();
    t2.Join();
    t3.Join();
    Console.WriteLine("All threads complete");
}
I think you need WaitHandle.WaitAll. Here is an example:
public static void Main(string[] args)
{
    int numOfThreads = 10;
    WaitHandle[] waitHandles = new WaitHandle[numOfThreads];

    for (int i = 0; i < numOfThreads; i++)
    {
        var j = i;
        // Or you can use AutoResetEvent/ManualResetEvent
        var handle = new EventWaitHandle(false, EventResetMode.ManualReset);
        var thread = new Thread(() =>
        {
            Thread.Sleep(j * 1000);
            Console.WriteLine("Thread{0} exits", j);
            handle.Set();
        });
        waitHandles[j] = handle;
        thread.Start();
    }

    WaitHandle.WaitAll(waitHandles);
    Console.WriteLine("Main thread exits");
    Console.Read();
}
The FCL has a few more convenient functions.
(1) Task.WaitAll, as well as its overloads, when you want to run some tasks in parallel (and with no return values).
var tasks = new[]
{
    Task.Factory.StartNew(() => DoSomething1()),
    Task.Factory.StartNew(() => DoSomething2()),
    Task.Factory.StartNew(() => DoSomething3())
};
Task.WaitAll(tasks);
(2) Task.WhenAll when you want to run some tasks with return values. It performs the operations and puts the results in an array. It's thread-safe, and you don't need to use a thread-safe container or implement the add operation yourself.
var tasks = new[]
{
    Task.Factory.StartNew(() => GetSomething1()),
    Task.Factory.StartNew(() => GetSomething2()),
    Task.Factory.StartNew(() => GetSomething3())
};
var things = await Task.WhenAll(tasks); // await (inside an async method) to get the array of results
I've made a very simple extension method to wait for all threads of a collection:
using System.Collections.Generic;
using System.Threading;

namespace Extensions {
    public static class ThreadExtension {
        public static void WaitAll (this IEnumerable<Thread> threads) {
            if (threads != null) {
                foreach (Thread thread in threads) {
                    thread.Join();
                }
            }
        }
    }
}
Then you simply call:
List<Thread> threads = new List<Thread>();
// Add your threads to this collection
threads.WaitAll();
In .NET 4.0, you can use the Task Parallel Library.
In earlier versions, you can create a list of Thread objects in a loop, calling Start on each one, and then make another loop and call Join on each one.
If you don't want to use the Task class (for instance, in .NET 3.5), you can just start all your threads, and then add them to the list and join them in a foreach loop.
Example:
List<Thread> threads = new List<Thread>();
// Start threads
for (int i = 0; i < 10; i++) {
    int tmp = i; // Copy value for closure
    Thread t = new Thread(() => Console.WriteLine(tmp));
    t.Start();
    threads.Add(t);
}
// Join threads (wait threads)
foreach (Thread thread in threads) {
    thread.Join();
}
I don't know if there is a better way, but the following describes how I did it with a counter and background worker thread.
private object _lock = new object();
private int _runningThreads = 0;
private int Counter {
    get {
        lock (_lock)
            return _runningThreads;
    }
    set {
        lock (_lock)
            _runningThreads = value;
    }
}
Now whenever you create a worker thread, increment the counter:
var t = new BackgroundWorker();
// Add RunWorkerCompleted handler
// Start thread
Counter++;
In the work-completed handler, decrement the counter:
private void RunWorkerCompleted(object sender, RunWorkerCompletedEventArgs e)
{
    Counter--;
}
Now you can check for the counter anytime to see if any thread is running:
if (Counter > 0)
{
    // Some thread is yet to finish.
}
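Pulling those pieces together, a minimal sketch might look like the following (DoStuff is a placeholder; Counter and RunWorkerCompleted are the members shown above):

// Requires System.ComponentModel and System.Threading.
for (int i = 0; i < 5; i++)
{
    var worker = new BackgroundWorker();
    worker.DoWork += (s, e) => DoStuff();            // DoStuff is a hypothetical work method
    worker.RunWorkerCompleted += RunWorkerCompleted; // decrements Counter when the work finishes
    Counter++;
    worker.RunWorkerAsync();
}

// Elsewhere, poll (or otherwise wait) until every worker has reported completion.
while (Counter > 0)
{
    Thread.Sleep(100);
}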
Most proposed answers don't take into account a time-out interval, which is very important to prevent a possible deadlock. Next is my sample code. (Note that I'm primarily a Win32 developer, and that's how I'd do it there.)
//'arrRunningThreads' = List<Thread>
//Wait for all threads
const int knmsMaxWait = 3 * 1000; //3 sec timeout
int nmsBeginTicks = Environment.TickCount;

foreach (Thread thrd in arrRunningThreads)
{
    //See time left
    int nmsElapsed = Environment.TickCount - nmsBeginTicks;
    int nmsRemain = knmsMaxWait - nmsElapsed;
    if (nmsRemain < 0)
        nmsRemain = 0;

    //Then wait for thread to exit
    if (!thrd.Join(nmsRemain))
    {
        //It didn't exit in time, terminate it
        thrd.Abort();

        //Issue a debugger warning
        Debug.Assert(false, "Terminated thread");
    }
}
In my case, I could not instantiate my objects on the thread pool with Task.Run() or Task.Factory.StartNew(). They would not synchronize my long-running delegates correctly.
I needed the delegates to run asynchronously, pausing my main thread for their collective completion. Thread.Join() would not work, since I wanted to wait for collective completion in the middle of the parent thread, not at the end.
With Task.Run() or Task.Factory.StartNew(), either all the child threads blocked each other or the parent thread would not be blocked, and I couldn't figure out how to go with async delegates because of the re-serialization of the await syntax.
Here is my solution using Threads instead of Tasks:
using (EventWaitHandle wh = new EventWaitHandle(false, EventResetMode.ManualReset))
{
    int outdex = mediaServerMinConnections - 1;
    for (int i = 0; i < mediaServerMinConnections; i++)
    {
        new Thread(() =>
        {
            sshPool.Enqueue(new SshHandler());
            if (Interlocked.Decrement(ref outdex) < 1)
                wh.Set();
        }).Start();
    }
    wh.WaitOne();
}
