Sequentially call asynchronous methods in a loop - C#

I'm working with some code that has an approach I've never dealt with before, and I'm hoping someone might be able to provide a spark of knowledge.
Effectively the class is set up like this:
void Loop()
{
for (int i = 0; i < 100; i++)
{
//need to block calls to this until UpdateComplete is hit
Update(i);
}
}
void Update(int i)
{
//asynchronous - returns immediately
}
void UpdateComplete()//called from another thread
{
//unblock further calls to Update();
}
The big caveat is that it's not possible to call Update() again before UpdateComplete() has 'returned' or been called back.
Ignoring any UI side effects, is there a neat solution to this problem?
I've currently got two strategies - one hacky and one I feel is over-complicated:
1 - Hacky: Set a global class boolean IsBlocked that Update() sets to true and UpdateComplete() sets to false, and inside my for loop (in Loop()) just put a while (IsBlocked) {}.
2 - Over-complicated: Get rid of the loop altogether. Call Update(0); from inside Loop() instead, and have UpdateComplete() call Update(1);, Update(2); and so on.
I guess I'm hoping for something like a Thread.Pause that can then be remotely 'resumed' by another thread (the separate one that calls UpdateComplete()).
Any ideas appreciated!

Async/await can also be used here
TaskCompletionSource<object> tcs = null;
async void Loop()
{
for (int i = 0; i < 100; i++)
{
tcs = new TaskCompletionSource<object>(); // fresh completion source for this iteration, created before Update so UpdateComplete always finds it
Update(i);
await tcs.Task; // resumes when UpdateComplete() sets the result
}
}
void Update(int i)
{
//asynchronous - returns immediately
}
void UpdateComplete()//called from another thread
{
tcs.TrySetResult(null);
//unblock further calls to Update();
}

Post the updates into the thread pool using QueueUserWorkItem.
Have a counter of the number of posted items.
Have the main thread wait on an AutoResetEvent.
When each update finishes, decrement the counter under a lock.
When the counter hits 0, set the AutoResetEvent to wake the main thread.
I didn't post any code because I assumed you can google how to do each bit; if not, comment and I can add code.
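A rough sketch of those steps (untested; the names pending, countLock and allDone are placeholders, and it assumes Update's work can finish on the pool thread so its last step plays the role of UpdateComplete()):
int pending;
readonly object countLock = new object();
readonly AutoResetEvent allDone = new AutoResetEvent(false);

void Loop()
{
    pending = 100;                                   // counter of posted items
    for (int i = 0; i < 100; i++)
    {
        int index = i;                               // capture a copy for the closure
        ThreadPool.QueueUserWorkItem(_ => Update(index));
    }
    allDone.WaitOne();                               // main thread waits here
}

void Update(int i)
{
    // ... the work for item i, done here on the pool thread ...
    lock (countLock)
    {
        if (--pending == 0)
            allDone.Set();                           // last item wakes the main thread
    }
}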

Related

Right way to continuously update the UI

Assume that a client app gets data from a server nearly in real time. What is the more efficient way to continuously update the UI based on the retrieved data? Think of multiple XAML controls, like texts that show numbers, which get updated as long as the application is running. They never stop unless the user decides it (say, by pressing a stop button or exiting the app).
Below is a simple example utilizing the async and await keywords. Is that a good way for my scenario? Or would, for example, a BackgroundWorker be a better approach?
private async void Button_Click_Begin_RT_Update(object sender, RoutedEventArgs e)
{
while(true)
textField1.Text = await DoWork();
}
Task<string> DoWork()
{
return Task.Run(() =>
{
return GetRandomNumberAsString();
});
}
*For the sake of simplicity I use code-behind and not MVVM in my example.
Your code is more or less OK if your GetRandomNumberAsString() takes at least 15ms to complete.
If it takes less than that, and you want to minimize update latency (i.e. you don't want to just wait), you might want to (1) replace your per-operation Task.Run with an endless loop that runs entirely on a background thread, and (2) implement a throttling mechanism in that loop so you only update your GUI (using e.g. Dispatcher.BeginInvoke()) at around 30-60 Hz.
P.S. The exact mechanism how you update your GUI (databinding + INotifyPropertyChanged, or directly like in your code) is not relevant for performance.
Update: here's an example (untested):
static readonly TimeSpan updateFrequency = TimeSpan.FromMilliseconds( 20 );
void ThreadProc()
{
Stopwatch sw = Stopwatch.StartNew();
while( true )
{
string val = GetRandomNumberAsString();
if( sw.Elapsed < updateFrequency )
continue; // Too early to update
sw.Restart();
Application.Current.Dispatcher.BeginInvoke( new Action( () => { textField1.Text = val; } ) );
}
}
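One way to kick that loop off, assuming it should run for the lifetime of the window and not keep the process alive on exit:
// Start the polling loop once, e.g. from the window's Loaded event handler.
var worker = new Thread(ThreadProc) { IsBackground = true };
worker.Start();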

cancelling a backgroundworker with while loop

I know the common ways of cancelling a BackgroundWorker using EventWaitHandles...
but I want to know: is it right to use a while loop to trap and pause the work of a BackgroundWorker? I coded it like this:
Bool stop = false;
private void backgroundWorker1_DoWork(object sender, DoWorkEventArgs e)
{
progressBar1.Minimum = 0;
progressBar1.Maximum = 100000;
progressBar1.Value = 0;
for (int i = 0; i < 100000; i++)
{
progressBar1.Value++;
if (i == 50000)
stop = true;
while (stop)
{ }
}
}
private void button1_Click(object sender, EventArgs e)
{
stop = !stop;
}
Did you try it? What happened? Was it what you wanted to happen? Did you notice your computer's fans speeding up, to handle all the heat from your CPU in a tight, "do-nothing" loop?
Fact is, you should not "pause" a background task in the first place; if you don't want it to keep running, interrupt it. If you want to be able to resume later, provide a mechanism to allow that. Even having your thread blocked efficiently waiting on a WaitHandle object would be the wrong thing to do, because it wastes a thread pool thread.
The code you've posted here is about the worst way to implement "pausing". Instead of waiting on some synchronization object such as a WaitHandle, you have the current thread just loop without interruption, constantly checking the value of a flag. Even ignoring the question of whether you're using volatile (the code example doesn't show that, but then it also wouldn't compile, so…), it's terrible to force a CPU core to do so much work and yet get nowhere.
Don't pause your BackgroundWorker.DoWork handler in the first place. Really. Just don't do that. But if you insist, then at least use some kind of waitable object instead of a "spin-wait" loop as in the example you've posted here.
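If you do insist on the waitable-object route, a rough sketch (untested) could look like this, using a ManualResetEventSlim in place of the flag:
// Starts "set", so the loop runs freely until someone pauses it.
private readonly ManualResetEventSlim canRun = new ManualResetEventSlim(true);

private void backgroundWorker1_DoWork(object sender, DoWorkEventArgs e)
{
    for (int i = 0; i < 100000; i++)
    {
        canRun.Wait();      // blocks cheaply while paused; no spinning
        // ... the real work for iteration i, plus ReportProgress(i) if needed ...
    }
}

private void button1_Click(object sender, EventArgs e)
{
    if (canRun.IsSet)
        canRun.Reset();     // pause: the worker blocks at Wait()
    else
        canRun.Set();       // resume: releases the waiting worker
}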
Here's an example of how your code might work if you wanted to avoid altogether tying up a thread while "paused". First, don't use BackgroundWorker, because it doesn't have a graceful way to do this. Second, do use await…that does specifically what you want: it allows the current method to return, but without losing track of its progress. The method will resume executing when the thing it waited on indicates completion.
In the example below, I've tried to guess at what the code that calls RunWorkerAsync() looks like. Or rather, I've just assumed you've got a button2 and that clicking it is what starts your worker task. If this is not enough to get you pointed in the right direction, please improve your question by including a good, minimal, complete code example showing what you're actually doing.
// These fields will work together to provide a way for the thread to interrupt
// itself temporarily without actually using a thread at all.
private TaskCompletionSource<object> _pause;
private readonly object _pauseLock = new object();
private void button2_Click(object sender, EventArgs e)
{
// Initialize ProgressBar. Note: in your version of the code, this was
// done in the DoWork event handler, but that handler isn't executed in
// the UI thread, and so accessing a UI object like progressBar1 is not
// a good idea. If you got away with it, you were lucky.
progressBar1.Minimum = 0;
progressBar1.Maximum = 100000;
progressBar1.Value = 0;
// This object will perform the duty of the BackgroundWorker's
// ProgressChanged event and ReportProgress() method.
Progress<int> progress = new Progress<int>(i => progressBar1.Value++);
// We do want the code to run in the background. Use Task.Run() to accomplish that
Task.Run(async () =>
{
for (int i = 0; i < 100000; i++)
{
progress.Report(i);
Task task = null;
// Locking ensures that the two threads which may be interacting
// with the _pause object do not interfere with each other.
lock (_pauseLock)
{
if (i == 50000)
{
// We want to pause. But it's possible we lost the race with
// the user, who also just pressed the pause button. So
// only allocate a new TCS if there isn't already one
if (_pause == null)
{
_pause = new TaskCompletionSource<object>();
}
}
// If by the time we get here, there's a TCS to wait on, then
// set our local variable for the Task to wait on. In this way
// we resolve any other race that might occur between the time
// we checked the _pause object and then later tried to wait on it
if (_pause != null)
{
task = _pause.Task;
}
}
if (task != null)
{
// This is the most important part: using "await" tells the method to
// return, but in a way that will allow execution to resume later.
// That is, when the TCS's Task transitions to the completed state,
// this method will resume executing, using any available thread
// in the thread pool.
await task;
// Once we resume execution here, reset the TCS, to allow the pause
// to go back to pausing again.
lock (_pauseLock)
{
_pause = null; // TaskCompletionSource has no Dispose(); just drop the reference
}
}
}
});
}
private void button1_Click(object sender, EventArgs e)
{
lock (_pauseLock)
{
// A bit more complicated than toggling a flag, granted. But it achieves
// the desirable goal.
if (_pause == null)
{
// Creates the object to wait on. The worker thread will look for
// this and wait if it exists.
_pause = new TaskCompletionSource<object>();
}
else if (!_pause.Task.IsCompleted)
{
// Giving the TCS a result causes its corresponding Task to transition
// to the completed state, releasing any code that might be waiting
// on it.
_pause.SetResult(null);
}
}
}
Note that the above is just as contrived as your original example. If all you really had was a single loop variable iterating from 0 to 100,000 and stopping halfway through, nothing nearly so complicated as the above would be required. You'd just store the loop variable in a data structure somewhere, exit the running task thread, and then when you want to resume, pass in the current loop variable value so the method can resume at the right index (see the sketch below).
But I'm assuming your real-world example is not so simple. And the above strategy will work for any stateful processing, with the compiler doing all the heavy lifting of storing away intermediate state for you.
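A rough sketch of that simpler alternative (hypothetical names; a volatile flag requests the stop, and the saved index lets the next run pick up where the last one left off):
private volatile bool _stopRequested;
private volatile int _resumeIndex;       // loop variable saved between runs

private void StartOrResume()
{
    _stopRequested = false;
    int start = _resumeIndex;
    Task.Run(() =>
    {
        for (int i = start; i < 100000; i++)
        {
            if (_stopRequested)
            {
                _resumeIndex = i;        // remember where we stopped
                return;                  // task ends; no thread stays blocked
            }
            // ... the real work for item i ...
        }
        _resumeIndex = 0;                // finished; a later run starts over
    });
}

private void Pause()
{
    _stopRequested = true;               // the loop notices this at its next check
}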

Using "thread" in c#

I read somewhere that using the Thread.Abort() method is one of the worst ways to kill a thread because it does not free the memory assigned to that thread. (I don't know if that's true; correct me if it's wrong and Abort() is actually the method I should use.) Therefore the best way to kill a thread would be to create a variable that defines whether the thread can run, i.e.:
bool threadResult;
t = new System.Threading.Thread(() => doSomeStuff());
t.Start();
abortThread();
//***************************************************
bool threadCanRun = true;
void doSomeStuff()
{
while(threadCanRun)
// do work
}
void abortThread()
{
threadCanRun = false;
}
But... what if the thread cannot be stopped like that? Ie:
void doSomeStuff()
{
WebClient wc = new WebClient();
string url = "www.mywebsite.com";
string content = wc.DownloadString(url);
}
Let's say that I want to spend less than 100ms on this thread^. If it won't end before the time passes (I'm using the following construction: if (t.Join(100)) ), I should abort it somehow to keep my program running. So, what's the proper way to end the working thread?
Sure, in this particular case I can use try-catch to handle most of the exceptions, but this is just an example. Also, if my connection is really slow and the webpage is really big, it could take more than 100ms and no exception will be thrown.
PS. I'm almost sure that it does not matter, but I'm working on a WPF app with some Forms references. The target framework is .NET 4.0.
If you want your thread to be able to end within 100ms, you should design the thread to check its running condition (threadCanRun) at least once per 100ms. Your question is too general, so I cannot give you a more precise answer.
Also, it is good programming practice to join with your thread immediately after threadCanRun = false; although somebody could disagree with that.
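A minimal sketch of that suggestion, reusing the names from your code (note the flag is marked volatile so the worker thread actually sees the change):
volatile bool threadCanRun = true;
System.Threading.Thread t;

void doSomeStuff()
{
    while (threadCanRun)
    {
        // Do one small chunk of work here, ideally well under 100ms,
        // so the flag is checked at least once per 100ms.
    }
}

void abortThread()
{
    threadCanRun = false;   // ask the worker to stop...
    t.Join();               // ...then wait for it to actually finish
}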
In my opinion using the BackgroundWorker class is safer, but if you want to use a Thread then you have to implement a cancellation-flag pattern.
If you are using .NET Framework 4.0 then you can use the CancellationTokenSource class like this:
public partial class Form1 : Form
{
CancellationTokenSource cancelSource = new CancellationTokenSource();
int threadCounter = 0;
int mainCounter = 0;
public void doSomeStuff(CancellationToken cancelToken)
{
cancelToken.ThrowIfCancellationRequested();
for (int i = 0; i < 100; i++)
{
threadCounter = i;
// If you want to cancel the thread just call the Cancel method of the cancelSource.
if (i == 88)
{
cancelSource.Cancel();
}
if (cancelSource.IsCancellationRequested)
{
// Do some Thread clean up here
}
}
}
private void button1_Click(object sender, EventArgs e)
{
new Thread(() => doSomeStuff(cancelSource.Token)).Start();
// Do something else while the thread has not been cancelled
while (!cancelSource.IsCancellationRequested)
{
mainCounter++;
}
textBox1.Text = "The thread was cancelled when 'mainCounter' was at: " + mainCounter.ToString();
}
}
Also you can use the CancellationToken which has the IsCancellationRequested property and ThrowIfCancellationRequested method.
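For example, the worker above could check the token it was handed instead of reaching back into the CancellationTokenSource (a small, untested variation):
public void doSomeStuff(CancellationToken cancelToken)
{
    for (int i = 0; i < 100; i++)
    {
        // Checking the token keeps the worker decoupled from whoever owns
        // the CancellationTokenSource. Alternatively,
        // cancelToken.ThrowIfCancellationRequested() throws an
        // OperationCanceledException instead of returning quietly.
        if (cancelToken.IsCancellationRequested)
        {
            // Do some thread clean up here, then stop.
            return;
        }
        threadCounter = i;
        // ... do work ...
    }
}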

Cancel ThreadPool.QueueUserWorkItem Task

I need to cancel a background task started using ThreadPool.QueueUserWorkItem(...). I know a BackgroundWorker has constructs especially for this sort of thing, but I believe it's overkill in this case, since no user interface is involved. By cancellation, I simply mean force the completion of the callback method.
What are the pitfalls of adding something like the following to my class?
// Cancellation Property.
private bool _canceled;
public bool CancelTask
{
get { return _canceled; }
set { _canceled = value; }
}
public void DoSomeTask()
{
int iterations = 50;
ThreadPool.QueueUserWorkItem(new WaitCallback(ThreadPoolCallback), iterations);
}
private void ThreadPoolCallback(object state)
{
if (_canceled)
return; // don't even start.
int iterations = (int)state;
for (int i = 0; !_canceled && i < iterations; i++)
{
//
// do work ...
//
// This allows you to cancel in the middle of an iteration...
if (_canceled)
break;
}
}
Is there a better way?
I'd use a method CancelTask() rather than a property. The point is that callers should be able to cancel a task, but no one should be able to un-cancel a task.
Then you need to be sure that the read and the write of _canceled have the appropriate memory barriers; otherwise one thread might never observe the change made by the other thread. For this I'd use Thread.VolatileWrite (inside CancelTask) and Thread.VolatileRead (inside your loop).
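Putting both suggestions together might look roughly like this (untested; an int flag is used because Thread.VolatileRead/VolatileWrite have no bool overload):
private int _canceled;   // 0 = running, 1 = canceled

public void CancelTask()
{
    // Write with a memory barrier so the worker thread observes the change.
    Thread.VolatileWrite(ref _canceled, 1);
}

private void ThreadPoolCallback(object state)
{
    int iterations = (int)state;
    for (int i = 0; i < iterations; i++)
    {
        // Read with a memory barrier; stop as soon as cancellation is seen.
        if (Thread.VolatileRead(ref _canceled) != 0)
            break;
        //
        // do work ...
        //
    }
}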

How to ensure a thread runs only after a specific number of other threads have finished?

I have a class in C# like this:
public class MyClass
{
public void Start() { ... }
public void Method_01() { ... }
public void Method_02() { ... }
public void Method_03() { ... }
}
When I call the "Start()" method, an external class start to work and will create many parallel threads that those parallel threads call the "Method_01()" and "Method_02()" form above class. after end of working of the external class, the "Method_03()" will be run in another parallel thread.
Threads of "Method_01()" or "Method_02()" are created before creation of thread of Method_03(), but there is no guaranty to end before start of thread of "Method_03()". I mean the "Method_01()" or the "Method_02()" will lost their CPU turn and the "Method_03" will get the CPU turn and will end completely.
In the "Start()" method I know the total number of threads that are supposed to create and run "Method_01" and "Method_02()". The question is that I'm searching for a way using semaphore or mutex to ensure that the first statement of "Method_03()" will be run exactly after end of all threads which are running "Method_01()" or "Method_02()".
Three options that come to mind are:
Keep an array of Thread instances and call Join on all of them from Method_03.
Use a single CountdownEvent instance and call Wait from Method_03.
Allocate one ManualResetEvent for each Method_01 or Method_02 call and call WaitHandle.WaitAll on all of them from Method_03 (this is not very scalable).
I prefer to use a CountdownEvent because it is a lot more versatile and is still super scalable.
public class MyClass
{
private CountdownEvent m_Finished = new CountdownEvent(1); // Initial count of 1 stands in for the Start() method itself; AddCount on a zero-count event would throw.
public void Start()
{
for (int i = 0; i < NUMBER_OF_THREADS; i++)
{
m_Finished.AddCount(); // Increment to indicate another active thread.
new Thread(Method_01).Start();
}
for (int i = 0; i < NUMBER_OF_THREADS; i++)
{
m_Finished.AddCount(); // Increment to indicate another active thread.
new Thread(Method_02).Start();
}
new Thread(Method_03).Start();
m_Finished.Signal(); // Signal to indicate that this thread is done.
}
private void Method_01()
{
try
{
// Add your logic here.
}
finally
{
m_Finished.Signal(); // Signal to indicate that this thread is done.
}
}
private void Method_02()
{
try
{
// Add your logic here.
}
finally
{
m_Finished.Signal(); // Signal to indicate that this thread is done.
}
}
private void Method_03()
{
m_Finished.Wait(); // Wait for all signals.
// Add your logic here.
}
}
This appears to be a perfect job for Tasks. Below I assume that Method01 and Method02 are allowed to run concurrently, with no specific order of invocation or finishing (no guarantees, just typed out from memory without testing):
int cTaskNumber01 = 3, cTaskNumber02 = 5;
Task tMaster = new Task(() => {
for (int tI = 0; tI < cTaskNumber01; ++tI)
new Task(Method01, TaskCreationOptions.AttachedToParent).Start();
for (int tI = 0; tI < cTaskNumber02; ++tI)
new Task(Method02, TaskCreationOptions.AttachedToParent).Start();
});
// after master and its children are finished, Method03 is invoked
tMaster.ContinueWith(t => Method03());
// let it go...
tMaster.Start();
What it sounds like you need to do is to create a ManualResetEvent (initialized to unset) or some other WaitHandle for each of Method_01 and Method_02, and then have Method_03's thread use WaitHandle.WaitAll on the set of handles.
Alternatively, if you can reference the Thread variables used to run Method_01 and Method_02, you could have Method_03's thread use Thread.Join to wait on both. This assumes, however, that those threads actually terminate when they complete execution of Method_01 and Method_02; if they do not, you need to resort to the first solution I mention.
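A minimal sketch of the Thread.Join variant, assuming for illustration that the threads are created in Start() (rather than by the external class) so they can be kept in a list:
private readonly List<Thread> _workers = new List<Thread>();

public void Start()
{
    for (int i = 0; i < NUMBER_OF_THREADS; i++)
    {
        Thread t1 = new Thread(Method_01);
        Thread t2 = new Thread(Method_02);
        _workers.Add(t1);
        _workers.Add(t2);
        t1.Start();
        t2.Start();
    }
    new Thread(Method_03).Start();
}

private void Method_03()
{
    // Wait for every worker thread to terminate before doing the real work.
    foreach (Thread t in _workers)
        t.Join();
    // ... the actual Method_03 logic ...
}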
Why not use a static variable, static volatile int threadRuns, which is initialized with the number of threads that will run Method_01 and Method_02?
Then you modify each of those two methods to decrement threadRuns just before exit:
...
lock(typeof(MyClass)) {
--threadRuns;
}
...
Then in the beginning of Method_03 you wait until threadRuns is 0 and then proceed:
while(threadRuns != 0)
Thread.Sleep(10);
Did I understand the question correctly?
There is actually an alternative in the Barrier class, which is new in .NET 4.0. This simplifies how you do the signalling across multiple threads.
You could do something like the following code, but this is mostly useful when synchronizing different processing threads.
public class Synchro
{
private Barrier _barrier;
public void Start(int numThreads)
{
_barrier = new Barrier((numThreads * 2)+1);
for (int i = 0; i < numThreads; i++)
{
new Thread(Method1).Start();
new Thread(Method2).Start();
}
new Thread(Method3).Start();
}
public void Method1()
{
//Do some work
_barrier.SignalAndWait();
}
public void Method2()
{
//Do some other work.
_barrier.SignalAndWait();
}
public void Method3()
{
_barrier.SignalAndWait();
//Do some other cleanup work.
}
}
Since your problem statement is quite abstract, I would also like to suggest that actual problems which are solved using CountdownEvent are now often better solved using the new Parallel or PLINQ capabilities. If you were actually processing a collection, or something similar, in your code, you might have something like the following.
public class Synchro
{
public void Start(List<someClass> collection)
{
new Thread(() => Method3(collection)).Start();
}
public void Method1(someClass item)
{
//Do some work.
}
public void Method2(someClass item)
{
//Do some other work.
}
public void Method3(List<someClass> collection)
{
//Do your work on each item in parallel threads.
Parallel.ForEach(collection, x => { Method1(x); Method2(x); });
//Do some work on the total collection like sorting or whatever.
}
}
