I have a situation where I need one thread to wait until another thread provides it with data. I've created this class:
public class AsyncObjectWait<T1, TriggerType>
{
private readonly Queue<TaskCompletionSource<TriggerType>> _waits = new Queue<TaskCompletionSource<TriggerType>>(); //pending waiters (declaration implied by the code below)
private T1 _value; //the value a trigger must match (declaration implied by the code below)
private void _timeout(TimeSpan timeout, TaskCompletionSource<TriggerType> tcs)
{
System.Threading.Thread.Sleep(timeout);
lock (tcs)
{
//if the result was already set, TrySetException fails and there is nothing to do
if (tcs.TrySetException(new TimeoutException("Wait Timed Out")))
{
Console.WriteLine("Timed Out");
}
}
}
public Task<TriggerType> WaitOneAsync(TimeSpan timeout)
{
TaskCompletionSource<TriggerType> wait = new TaskCompletionSource<TriggerType>();
lock (_waits)
{
_waits.Enqueue(wait);
Task.Run(() =>
{
_timeout(timeout, wait);
});
}
return wait.Task;
}
public bool TrySetOne(TriggerType trigger, Converter<TriggerType, T1> converter)
{
TaskCompletionSource<TriggerType> wait;
bool set = false;
lock (_waits)
{
while (!set && _waits.Count != 0) //while we haven't completed a task, and as long as we have more to check
{
//get the next wait; wait for any other threads to stop using it, then get exclusive access
lock (_waits.Peek())
{
wait = _waits.Peek();
if (EqualityComparer<T1>.Default.Equals(converter(trigger), _value))
{
set = wait.TrySetResult(trigger); //try and set the result to complete the task. if we cant, try the next task
_waits.Dequeue();
if (!set)
{
continue;
}
else
{
return true;
}
}
else
{
return false;
}
}
}
return false;
}
}
}
I did this to provide a Task-based interface that doesn't freeze my UI while it's waiting.
It seems to have an issue, though. When the Receive() method is called, it can take up to a second for the task generated by WaitFor to receive the completion info. I haven't been able to find a reason for this.
Could this be due to async method overhead? Is there another, less complicated way to do this?
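For comparison, the same timeout can be attached without parking a thread in Thread.Sleep. This is only a minimal sketch, assuming .NET 4.5 (which the Task.Run call above already implies), using Task.WhenAny and Task.Delay:
// Sketch: race the real task against a delay instead of sleeping a dedicated thread.
public static async Task<T> WithTimeout<T>(Task<T> task, TimeSpan timeout)
{
    // Task.WhenAny completes as soon as either the task or the delay finishes.
    var first = await Task.WhenAny(task, Task.Delay(timeout));
    if (first != task)
        throw new TimeoutException("Wait Timed Out");
    return await task; // propagates the result (or the task's original exception)
}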
I am trying to debug some old code in a product running on Windows Embedded Compact 7 and .NET CF 3.5. I suspect a deadlock scenario but can't figure out the reason. The originating code seems to be a class (code below) encapsulating a thread and an AutoResetEvent. It uses lock and Monitor.Enter to acquire locks for thread safety.
Is there something obvious in the following code potentially causing a deadlock that I would be missing?
//constructor
public SequentialThreadWorker(
string threadName,
ThreadPriority threadPriority)
{
_workerThread = new Thread(OnThreadStart)
{
Name = threadName,
Priority = threadPriority,
IsBackground = true
};
_waitHandle = new AutoResetEvent(false);
_actionsQueue = new Queue<Action>();
_workerThread.Start();
}
//external multi-threaded code can enqueue task using the following method
public void Enqueue(Action action)
{
lock (_lock)
{
_actionsQueue.Enqueue(action);
_waitHandle.Set();
}
}
//ThreadStart code dequeuing actions from the queue to execute
private void OnThreadStart()
{
while (_run)
{
Action actionData = null;
Monitor.Enter(_lock);
try
{
if (_actionsQueue.Count > 0)
{
actionData = _actionsQueue.Dequeue();
if (_actionsQueue.Count > 0)
_waitHandle.Set();
else
_waitHandle.Reset();
}
}
catch (Exception e)
{
//do log
}
finally
{
Monitor.Exit(_lock);
}
if (actionData != null)
{
try
{
actionData();
}
catch (Exception e)
{
//do log
}
}
_waitHandle.WaitOne();
}
}
}
I have a static class and it has a static function IsDataCorrect() which does a http request.
The function can be called from multiple threads at the same time, and I want to let the first thread do the request while the others are rejected (meaning they should get false as the return value; they should not just be blocked!) until half a second after the first thread finished the request.
After that, the next winning thread should be able to do the next request, others should be rejected, and so on.
This is my approach, could someone please confirm if that is reasonable:
static class MyClass
{
private static bool IsBusy = false;
private static object lockObject = new object();
public static bool IsDataCorrect(string testString)
{
lock (lockObject)
{
if (IsBusy) return false;
IsBusy = true;
}
var uri = $"https://something.com";
bool htmlCheck = GetDocFromUri(uri, 2);
var t = new Thread(WaitBeforeFree);
t.Start();
//Fast Evaluations
//...
return htmlCheck;
}
private static void WaitBeforeFree()
{
Thread.Sleep(500);
IsBusy = false;
}
}
Your threads accessing the function would still be serialized when checking the IsBusy flag, since only one thread at a time can check it due to the synchronization on lockObject. Instead, you can simply attempt to acquire the lock without blocking, and consequently you don't need a flag, since the lock itself serves that purpose. Second, I would replace launching a new thread every time just to sleep and reset the flag with a check against a DateTime field.
static class MyClass
{
private static DateTime NextEntry = DateTime.Now;
private static ReaderWriterLockSlim timeLock = new ReaderWriterLockSlim();
private static object lockObject = new object();
public static bool IsDataCorrect(string testString)
{
bool tryEnterSuccess = false;
try
{
try
{
timeLock.EnterReadLock();
if (DateTime.Now < NextEntry) return false;
}
finally
{
timeLock.ExitReadLock();
}
Monitor.TryEnter(lockObject, ref tryEnterSuccess);
if (!tryEnterSuccess) return false;
var uri = $"https://something.com";
bool htmlCheck = GetDocFromUri(uri, 2);
//Fast Evaluations
//...
try
{
timeLock.EnterWriteLock();
NextEntry = DateTime.Now.AddMilliseconds(500);
} finally {
timeLock.ExitWriteLock();
}
return htmlCheck;
} finally {
if (tryEnterSuccess) Monitor.Exit(lockObject);
}
}
}
This is more efficient because no new threads are launched, and the DateTime access is safe yet concurrent, so threads only block when they absolutely have to. Otherwise, everything keeps moving along with minimal resource usage.
I see you guys solved the problem correctly, but I think there is still room to make it correct, efficient and simple at the same time :).
How about this way?
EDIT: Edited to make the calming (cool-down) condition easier to see and part of the example.
public static class ConcurrentCoordinationExtension
{
private static int _executing = 0;
public static bool TryExecuteSequentially(this Action actionToExecute)
{
// Compare _executing with 0; if it is 0, set it to 1 and
// return the original value as the result. A result of 0 means
// we entered successfully; non-zero means somebody else is executing.
if (Interlocked.CompareExchange(ref _executing, 1, 0) != 0) return false;
try
{
actionToExecute.Invoke();
return true;
}
finally
{
Interlocked.Exchange(ref _executing, 0); // release the flag on exit
}
}
public static bool TryExecuteSequentially(this Func<bool> actionToExecute)
{
// Compare _executing with 0; if it is 0, set it to 1 and
// return the original value as the result. A result of 0 means
// we entered successfully; non-zero means somebody else is executing.
if (Interlocked.CompareExchange(ref _executing, 1, 0) != 0) return false;
try
{
return actionToExecute.Invoke();
}
finally
{
Interlocked.Exchange(ref _executing, 0); // release the flag on exit
}
}
}
class Program
{
static void Main(string[] args)
{
DateTime last = DateTime.MinValue;
Func<bool> operation = () =>
{
//the calming (cool-down) condition
if (DateTime.UtcNow - last < TimeSpan.FromMilliseconds(500)) return false;
last = DateTime.UtcNow;
//some stuff you want to process sequentially
return true;
};
operation.TryExecuteSequentially();
}
}
I have a Busy property that is set to true before an async call is made, then set to false when finished. Now I have 2 async calls; how do I handle this logic? Do I need to lock variables, or are there other parallelism problems I need to look out for?
private bool _busy;
public bool Busy
{
get { return _busy; }
set
{
bool changed = value != _busy;
_busy = value;
if (changed) RaisePropertyChanged("Busy");
}
}
private void loadUser(int userId)
{
Busy = true;
api.GetUser(userId, CancellationToken.None).ContinueWith(t =>
Deployment.Current.Dispatcher.BeginInvoke(() =>
{
Busy = false;
}));
}
private void loadOtherData(int dataId)
{
Busy = true;
api.GetData(dataId, CancellationToken.None).ContinueWith(t =>
Deployment.Current.Dispatcher.BeginInvoke(() =>
{
Busy = false;
}));
}
I know this logic is flawed because the Busy property is set to false by whichever method finishes execution first. An idea I have is to use 2 more fields, isUserLoading and isOtherDataLoading, and ensure both are false before setting Busy to false.
I'd like to know if there is a better way of accomplishing this.
If you have two booleans, _isUserLoading and _isOtherDataLoading, that you update in your load methods, then you can just change Busy to this:
public bool Busy
{
get
{
return _isUserLoading || _isOtherDataLoading;
}
}
Another version including the call to RaisePropertyChanged could work like this:
public bool Busy
{
get
{
return _isUserLoading || _isOtherDataLoading;
}
}
public bool IsUserLoading
{
get
{
return _isUserLoading;
}
set
{
bool busy = Busy;
_isUserLoading = value;
if (busy != Busy) RaisePropertyChanged("Busy");
}
}
And of course a similar property for IsOtherDataLoading.
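For completeness, IsOtherDataLoading would mirror the property above exactly (a sketch following the same pattern):
public bool IsOtherDataLoading
{
    get { return _isOtherDataLoading; }
    set
    {
        bool busy = Busy;              // capture Busy before the field changes
        _isOtherDataLoading = value;
        if (busy != Busy) RaisePropertyChanged("Busy");
    }
}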
Ideally you want to expose an observable property/event from the API stating when it's busy. Assuming you can't make any modifications, I'd suggest you go with a more generic approach. Something like:
class CountedContext
{
private readonly object sync = new object(); //can't lock on an int field, so use a dedicated lock object
int workersCount = 0;
Action<bool> notifier;
public CountedContext(Action<bool> notifier) { this.notifier = notifier; }
public Task<TResult> Execute<TResult>(Func<Task<TResult>> func)
{
lock (sync)
{
workersCount++;
if (workersCount == 1)
notifier(true); //first worker in: report busy
}
var result = func();
result.ContinueWith(_ =>
{
lock (sync)
{
workersCount--;
if (workersCount == 0)
{
notifier(false); //last worker out: report idle
}
}
});
return result;
}
}
In your main class you can use:
// constructor:
countedContext = new CountedContext(state =>
{
...BeginInvoke(() =>
{
Busy = state;
});
});
...
// loadData
countedContext.Execute(() => api.GetData());
I have a thread, which creates a variable number of worker threads and distributes tasks between them. This is solved by passing the threads a TaskQueue object, whose implementation you will see below.
These worker threads simply iterate over the TaskQueue object they were given, executing each task.
private class TaskQueue : IEnumerable<Task>
{
public int Count
{
get
{
lock(this.tasks)
{
return this.tasks.Count;
}
}
}
private readonly Queue<Task> tasks = new Queue<Task>();
private readonly AutoResetEvent taskWaitHandle = new AutoResetEvent(false);
private bool isFinishing = false;
private bool isFinished = false;
public void Enqueue(Task task)
{
Log.Trace("Entering Enqueue, lock...");
lock(this.tasks)
{
Log.Trace("Adding task, current count = {0}...", Count);
this.tasks.Enqueue(task);
if (Count == 1)
{
Log.Trace("Count = 1, so setting the wait handle...");
this.taskWaitHandle.Set();
}
}
Log.Trace("Exiting enqueue...");
}
public Task Dequeue()
{
Log.Trace("Entering Dequeue...");
if (Count == 0)
{
if (this.isFinishing)
{
Log.Trace("Finishing (before waiting) - isCompleted set, returning empty task.");
this.isFinished = true;
return new Task();
}
Log.Trace("Count = 0, lets wait for a task...");
this.taskWaitHandle.WaitOne();
Log.Trace("Wait handle let us through, Count = {0}, IsFinishing = {1}, Returned = {2}", Count, this.isFinishing);
if(this.isFinishing)
{
Log.Trace("Finishing - isCompleted set, returning empty task.");
this.isFinished = true;
return new Task();
}
}
Log.Trace("Entering task lock...");
lock(this.tasks)
{
Log.Trace("Entered task lock, about to dequeue next item, Count = {0}", Count);
return this.tasks.Dequeue();
}
}
public void Finish()
{
Log.Trace("Setting TaskQueue state to isFinishing = true and setting wait handle...");
this.isFinishing = true;
if (Count == 0)
{
this.taskWaitHandle.Set();
}
}
public IEnumerator<Task> GetEnumerator()
{
while(true)
{
Task t = Dequeue();
if(this.isFinished)
{
yield break;
}
yield return t;
}
}
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
}
As you can see, I'm using an AutoResetEvent object to make sure that the worker threads don't exit prematurely, i.e. before getting any tasks.
In a nutshell:
the main thread assigns a task to a thread by Enqueue-ing a task to its TaskQueue
the main thread notifies the thread that there are no more tasks to execute by calling the TaskQueue's Finish() method
the worker thread retrieves the next task assigned to it by calling the TaskQueue's Dequeue() method
The problem is that the Dequeue() method often throws an InvalidOperationException, saying that the Queue is empty. As you can see, I added some logging, and it turns out that the AutoResetEvent doesn't block the Dequeue(), even though there were no calls to its Set() method.
As I understand it, calling AutoResetEvent.Set() will allow a waiting thread to proceed (one that previously called AutoResetEvent.WaitOne()), and then automatically calls AutoResetEvent.Reset(), blocking the next waiter.
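That behavior can be sketched in isolation (thread roles named for illustration):
var gate = new AutoResetEvent(false); // starts unsignaled
// consumer thread:
gate.WaitOne();  // blocks until some thread calls Set()
// producer thread:
gate.Set();      // releases exactly one waiter; the event then resets itself automatically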
So what can be wrong? Did I get something wrong? Do I have an error somewhere?
I've been sitting on this for 3 hours now, but I cannot figure out what's wrong.
Please help me!
Thank you very much!
Your dequeue code is incorrect. You check the Count under lock, then fly by the seat of your pants, and then you expect tasks to still have something. You cannot retain assumptions made under a lock after you release it :). Your Count check and tasks.Dequeue must occur under the same lock:
bool TryDequeue(out Task task)
{
task = null;
lock (this.tasks) {
if (0 < tasks.Count) {
task = tasks.Dequeue();
}
}
if (null == task) {
Log.Trace ("Queue was empty");
}
return null != task;
}
Your Enqueue() code is similarly riddled with problems. Your Enqueue/Dequeue don't ensure progress (you will have dequeue threads blocked waiting even though there are items in the queue), and the signature of Enqueue() is wrong. Overall the posted code is very poor. Frankly, I think you're biting off more than you can chew here... Oh, and never log under a lock.
I strongly suggest you just use ConcurrentQueue.
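On .NET 4.0 it covers this requirement directly; for example (someTask is illustrative):
var queue = new System.Collections.Concurrent.ConcurrentQueue<Task>();
queue.Enqueue(someTask);          // thread-safe enqueue, no explicit locking
Task next;
if (queue.TryDequeue(out next))   // returns false instead of throwing when empty
{
    // process next
}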
If you don't have access to .NET 4.0, here is an implementation to get you started:
public class ConcurrentQueue<T>:IEnumerable<T>
{
volatile bool fFinished = false;
ManualResetEvent eventAdded = new ManualResetEvent(false);
private Queue<T> queue = new Queue<T>();
private object syncRoot = new object();
public void SetFinished()
{
lock (syncRoot)
{
fFinished = true;
eventAdded.Set();
}
}
public void Enqueue(T t)
{
Debug.Assert (false == fFinished);
lock (syncRoot)
{
queue.Enqueue(t);
eventAdded.Set();
}
}
private bool Dequeue(out T t)
{
do
{
lock (syncRoot)
{
if (0 < queue.Count)
{
t = queue.Dequeue();
return true;
}
if (false == fFinished)
{
eventAdded.Reset ();
}
}
if (false == fFinished)
{
eventAdded.WaitOne();
}
else
{
break;
}
} while (true);
t = default(T);
return false;
}
public IEnumerator<T> GetEnumerator()
{
T t;
while (Dequeue(out t))
{
yield return t;
}
}
System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
}
A more detailed answer from me is pending, but I just want to point out something very important.
If you're using .NET 3.5, you can use the ConcurrentQueue<T> class. A backport is included in the Rx extensions library, which is available for .NET 3.5.
Since you want blocking behavior, you would need to wrap a ConcurrentQueue<T> in a BlockingCollection<T> (also available as part of Rx).
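For illustration, the wrapping looks like this; a sketch only, with work as a placeholder (on .NET 3.5 both types come from the Rx backport):
// BlockingCollection<T> uses a ConcurrentQueue<T> backing store by default,
// but it can also be passed explicitly:
var queue = new BlockingCollection<Task>(new ConcurrentQueue<Task>());
queue.Add(work);             // producer side
Task next = queue.Take();    // consumer side: blocks until an item is available
queue.CompleteAdding();      // signals consumers that no more items will arrive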
It looks like you are trying to replicate a blocking queue. One already exists in the .NET 4.0 BCL as BlockingCollection. If .NET 4.0 is not an option for you, then you can use this code. It uses the Monitor.Wait and Monitor.Pulse methods instead of an AutoResetEvent.
public class BlockingCollection<T>
{
private Queue<T> m_Queue = new Queue<T>();
public T Take() // Dequeue
{
lock (m_Queue)
{
while (m_Queue.Count <= 0)
{
Monitor.Wait(m_Queue);
}
return m_Queue.Dequeue();
}
}
public void Add(T data) // Enqueue
{
lock (m_Queue)
{
m_Queue.Enqueue(data);
Monitor.Pulse(m_Queue);
}
}
}
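Usage of the class above is then straightforward (illustrative):
var queue = new BlockingCollection<Action>();
// producer thread:
queue.Add(() => Console.WriteLine("work item"));
// consumer thread:
Action action = queue.Take(); // blocks until an item is available
action();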
Update:
I am fairly certain that it is not possible to implement a producer-consumer queue using AutoResetEvent if you want it to be thread-safe for multiple producers and multiple consumers (I am prepared to be proven wrong if someone can come up with a counter example). Sure, you will see examples on the internet, but they are all wrong. In fact, one such attempt by Microsoft is flawed in that the queue can get live-locked.
I'm sorry for a redundant question. However, I've found many solutions to my problem but none of them are very well explained. I'm hoping that it will be made clear, here.
My C# application's main thread spawns 1..n background workers using the ThreadPool. I wish for the original thread to block until all of the workers have completed. I have researched the ManualResetEvent in particular, but I'm not clear on its use.
In pseudo:
foreach( var o in collection )
{
queue new worker(o);
}
while( workers not completed ) { continue; }
If necessary, I will know the number of workers that are about to be queued before hand.
Try this. The function takes in a list of Action delegates. It will add a ThreadPool worker entry for each item in the list. It will wait for every action to complete before returning.
public static void SpawnAndWait(IEnumerable<Action> actions)
{
var list = actions.ToList();
var handles = new ManualResetEvent[list.Count]; //use list.Count to avoid enumerating actions twice
for (var i = 0; i < list.Count; i++)
{
handles[i] = new ManualResetEvent(false);
var currentAction = list[i];
var currentHandle = handles[i];
Action wrappedAction = () => { try { currentAction(); } finally { currentHandle.Set(); } };
ThreadPool.QueueUserWorkItem(x => wrappedAction());
}
WaitHandle.WaitAll(handles);
}
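Applied to the question's pseudocode, usage would look like this (DoSomeWork and collection assumed from the question):
SpawnAndWait(collection.Select(o => (Action)(() => DoSomeWork(o))));
Note that WaitHandle.WaitAll is limited to 64 handles and cannot be called on an STA thread, so this approach suits modest batch sizes.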
Here's a different approach - encapsulation; so your code could be as simple as:
Forker p = new Forker();
foreach (var obj in collection)
{
var tmp = obj;
p.Fork(delegate { DoSomeWork(tmp); });
}
p.Join();
Where the Forker class is given below (I got bored on the train ;-p)... again, this avoids OS objects, but wraps things up quite neatly (IMO):
using System;
using System.Threading;
/// <summary>Event arguments representing the completion of a parallel action.</summary>
public class ParallelEventArgs : EventArgs
{
private readonly object state;
private readonly Exception exception;
internal ParallelEventArgs(object state, Exception exception)
{
this.state = state;
this.exception = exception;
}
/// <summary>The opaque state object that identifies the action (null otherwise).</summary>
public object State { get { return state; } }
/// <summary>The exception thrown by the parallel action, or null if it completed without exception.</summary>
public Exception Exception { get { return exception; } }
}
/// <summary>Provides a caller-friendly wrapper around parallel actions.</summary>
public sealed class Forker
{
int running;
private readonly object joinLock = new object(), eventLock = new object();
/// <summary>Raised when all operations have completed.</summary>
public event EventHandler AllComplete
{
add { lock (eventLock) { allComplete += value; } }
remove { lock (eventLock) { allComplete -= value; } }
}
private EventHandler allComplete;
/// <summary>Raised when each operation completes.</summary>
public event EventHandler<ParallelEventArgs> ItemComplete
{
add { lock (eventLock) { itemComplete += value; } }
remove { lock (eventLock) { itemComplete -= value; } }
}
private EventHandler<ParallelEventArgs> itemComplete;
private void OnItemComplete(object state, Exception exception)
{
EventHandler<ParallelEventArgs> itemHandler = itemComplete; // don't need to lock
if (itemHandler != null) itemHandler(this, new ParallelEventArgs(state, exception));
if (Interlocked.Decrement(ref running) == 0)
{
EventHandler allHandler = allComplete; // don't need to lock
if (allHandler != null) allHandler(this, EventArgs.Empty);
lock (joinLock)
{
Monitor.PulseAll(joinLock);
}
}
}
/// <summary>Adds a callback to invoke when each operation completes.</summary>
/// <returns>Current instance (for fluent API).</returns>
public Forker OnItemComplete(EventHandler<ParallelEventArgs> handler)
{
if (handler == null) throw new ArgumentNullException("handler");
ItemComplete += handler;
return this;
}
/// <summary>Adds a callback to invoke when all operations are complete.</summary>
/// <returns>Current instance (for fluent API).</returns>
public Forker OnAllComplete(EventHandler handler)
{
if (handler == null) throw new ArgumentNullException("handler");
AllComplete += handler;
return this;
}
/// <summary>Waits for all operations to complete.</summary>
public void Join()
{
Join(-1);
}
/// <summary>Waits (with timeout) for all operations to complete.</summary>
/// <returns>Whether all operations had completed before the timeout.</returns>
public bool Join(int millisecondsTimeout)
{
lock (joinLock)
{
if (CountRunning() == 0) return true;
Thread.SpinWait(1); // try our luck...
return (CountRunning() == 0) ||
Monitor.Wait(joinLock, millisecondsTimeout);
}
}
/// <summary>Indicates the number of incomplete operations.</summary>
/// <returns>The number of incomplete operations.</returns>
public int CountRunning()
{
return Interlocked.CompareExchange(ref running, 0, 0);
}
/// <summary>Enqueues an operation.</summary>
/// <param name="action">The operation to perform.</param>
/// <returns>The current instance (for fluent API).</returns>
public Forker Fork(ThreadStart action) { return Fork(action, null); }
/// <summary>Enqueues an operation.</summary>
/// <param name="action">The operation to perform.</param>
/// <param name="state">An opaque object, allowing the caller to identify operations.</param>
/// <returns>The current instance (for fluent API).</returns>
public Forker Fork(ThreadStart action, object state)
{
if (action == null) throw new ArgumentNullException("action");
Interlocked.Increment(ref running);
ThreadPool.QueueUserWorkItem(delegate
{
Exception exception = null;
try { action(); }
catch (Exception ex) { exception = ex;}
OnItemComplete(state, exception);
});
return this;
}
}
First, how long do the workers execute? Pool threads should generally be used for short-lived tasks; if they are going to run for a while, consider manual threads.
Re the problem; do you actually need to block the main thread? Can you use a callback instead? If so, something like:
int running = 1; // start at 1 to prevent multiple callbacks if
// tasks finish faster than they are started
Action endOfThread = delegate {
if(Interlocked.Decrement(ref running) == 0) {
// ****run callback method****
}
};
foreach(var o in collection)
{
var tmp = o; // avoid "capture" issue
Interlocked.Increment(ref running);
ThreadPool.QueueUserWorkItem(delegate {
DoSomeWork(tmp); // [A] should handle exceptions internally
endOfThread();
});
}
endOfThread(); // opposite of "start at 1"
This is a fairly lightweight (no OS primitives) way of tracking the workers.
If you need to block, you can do the same using a Monitor (again, avoiding an OS object):
object syncLock = new object();
int running = 1;
Action endOfThread = delegate {
if (Interlocked.Decrement(ref running) == 0) {
lock (syncLock) {
Monitor.Pulse(syncLock);
}
}
};
lock (syncLock) {
foreach (var o in collection) {
var tmp = o; // avoid "capture" issue
ThreadPool.QueueUserWorkItem(delegate
{
DoSomeWork(tmp); // [A] should handle exceptions internally
endOfThread();
});
}
endOfThread();
Monitor.Wait(syncLock);
}
Console.WriteLine("all done");
I have been using the new Parallel task library, currently in CTP:
Parallel.ForEach(collection, o =>
{
DoSomeWork(o);
});
Here is a solution using the CountdownEvent class.
var complete = new CountdownEvent(1);
foreach (var o in collection)
{
var capture = o;
ThreadPool.QueueUserWorkItem((state) =>
{
try
{
DoSomething(capture);
}
finally
{
complete.Signal();
}
}, null);
}
complete.Signal();
complete.Wait();
Of course, if you have access to the CountdownEvent class then you have the whole TPL to work with. The Parallel class takes care of the waiting for you.
Parallel.ForEach(collection, o =>
{
DoSomething(o);
});
I think you were on the right track with the ManualResetEvent. This link has a code sample that closely matches what you're trying to do. The key is to use WaitHandle.WaitAll and pass an array of wait events. Each thread needs to set one of these wait events.
// Simultaneously calculate the terms.
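// (autoEvents and manualEvent are WaitHandle fields declared in the linked sample.)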
ThreadPool.QueueUserWorkItem(
new WaitCallback(CalculateBase));
ThreadPool.QueueUserWorkItem(
new WaitCallback(CalculateFirstTerm));
ThreadPool.QueueUserWorkItem(
new WaitCallback(CalculateSecondTerm));
ThreadPool.QueueUserWorkItem(
new WaitCallback(CalculateThirdTerm));
// Wait for all of the terms to be calculated.
WaitHandle.WaitAll(autoEvents);
// Reset the wait handle for the next calculation.
manualEvent.Reset();
Edit:
Make sure that in your worker thread code path you set the event (i.e. autoEvents[1].Set();). Once they are all signaled, the WaitAll will return.
void CalculateSecondTerm(object stateInfo)
{
double preCalc = randomGenerator.NextDouble();
manualEvent.WaitOne();
secondTerm = preCalc * baseNumber *
randomGenerator.NextDouble();
autoEvents[1].Set();
}
I've found a good solution here:
http://msdn.microsoft.com/en-us/magazine/cc163914.aspx
It may come in handy for others with the same issue.
Using .NET 4.0 Barrier class:
Barrier sync = new Barrier(1);
foreach(var o in collection)
{
WaitCallback worker = (state) =>
{
// do work
sync.SignalAndWait();
};
sync.AddParticipant();
ThreadPool.QueueUserWorkItem(worker, o);
}
sync.SignalAndWait();
Try using CountdownEvent
// code before the threads start
CountdownEvent countdown = new CountdownEvent(collection.Length);
foreach (var o in collection)
{
ThreadPool.QueueUserWorkItem(delegate
{
// do something with the worker
Console.WriteLine("Thread Done!");
countdown.Signal();
});
}
countdown.Wait();
Console.WriteLine("Job Done!");
// resume the code here
The countdown would wait until all threads have finished execution.
There is no built-in method to wait for completion of all threads in the thread pool.
But by counting how many worker threads are active, we can achieve it:
{
bool working = true;
ThreadPool.GetMaxThreads(out int maxWorkerThreads, out int maxCompletionPortThreads);
while (working)
{
ThreadPool.GetAvailableThreads(out int workerThreads, out int completionPortThreads);
//Console.WriteLine($"{workerThreads} , {maxWorkerThreads}");
if (workerThreads == maxWorkerThreads)
{ working = false; }
}
//when all threads are completed then 'working' will be false
}
void xyz(object o)
{
Console.WriteLine("");
}