Dispatcher Invoke(...) vs BeginInvoke(...) confusion - c#

I'm confused about why I can't make this test counter application work with two (or more) simultaneously running counter textboxes when I use BeginInvoke on the Dispatcher in the Count() method.
You can solve the issue by replacing the BeginInvoke with Invoke, but that doesn't resolve my confusion.
Here's the sample code I'm talking about:
public class CounterTextBox : TextBox
{
    private int _number;

    public void Start()
    {
        (new Action(Count)).BeginInvoke(null, null);
    }

    private void Count()
    {
        while (true)
        {
            if (_number++ > 10000) _number = 0;
            this.Dispatcher.BeginInvoke(new Action(UpdateText), System.Windows.Threading.DispatcherPriority.Background, null);
        }
    }

    private void UpdateText()
    {
        this.Text = "" + _number;
    }
}

When you use Dispatcher.BeginInvoke it schedules the given action for execution on the UI thread at a later point in time, and then returns control to allow the current thread to continue executing. Invoke blocks the caller until the scheduled action finishes.
When you use BeginInvoke your loop is going to run super fast, since BeginInvoke returns right away. This means that you're adding lots and lots of actions to the message queue. You're adding them much faster than they can actually be processed, so there's a long time between when you schedule a message and when it actually gets a chance to run.
The action that you're running uses the field _number, but _number is being modified by the other thread very quickly while the action sits in the queue. This means that it won't display the value of _number at the time you scheduled the action, but rather whatever value it has reached by the time the action runs, after the tight loop has raced ahead.
If you use Dispatcher.Invoke instead, it prevents the loop from "getting ahead of itself" and having multiple scheduled events, which ensures that the value it writes is always the "current" value. Additionally, by forcing each iteration of the loop to wait for the message to be run, it makes the loop a lot less "tight", so it can't run as quickly in general.
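For comparison, here is a minimal sketch (not from the original post) of what the Invoke-based variant of Count() would look like; the only change is the dispatch call:

private void Count()
{
    while (true)
    {
        if (_number++ > 10000) _number = 0;
        // Invoke blocks this background thread until UpdateText has actually run
        // on the UI thread, so the loop can never schedule work faster than the
        // UI can display it.
        this.Dispatcher.Invoke(new Action(UpdateText), System.Windows.Threading.DispatcherPriority.Background);
    }
}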
If you want to use BeginInvoke, the first thing you really need to do is slow down your loop. If you want it to update the text every second, or every 10 ms, or whatever, then you can use Thread.Sleep to wait the appropriate amount of time.
Next, you need to take a copy of _number before passing it to the Dispatcher so that it displays the value at the time you scheduled it, not at the time it is executed:
while (true)
{
    if (_number++ > 10000)
        _number = 0;

    int copy = _number;
    this.Dispatcher.BeginInvoke(new Action(() => UpdateText(copy)),
        System.Windows.Threading.DispatcherPriority.Background, null);
    Thread.Sleep(200);
}

private void UpdateText(int number)
{
    this.Text = number.ToString();
}

Related

cancelling a backgroundworker with while loop

I know the common ways of cancelling a BackgroundWorker using EventWaitHandles...
but I want to know: is it right to use a while loop to trap and pause the work of a BackgroundWorker? I coded it like this:
Bool stop = false;

private void backgroundWorker1_DoWork(object sender, DoWorkEventArgs e)
{
    progressBar1.Minimum = 0;
    progressBar1.Maximum = 100000;
    progressBar1.Value = 0;

    for (int i = 0; i < 100000; i++)
    {
        progressBar1.Value++;
        if (i == 50000)
            stop = true;
        while (stop)
        { }
    }
}

private void button1_Click(object sender, EventArgs e)
{
    stop = !stop;
}
Did you try it? What happened? Was it what you wanted to happen? Did you notice your computer's fans speeding up, to handle all the heat from your CPU in a tight, "do-nothing" loop?
Fact is, you should not "pause" a background task in the first place; if you don't want it to keep running, interrupt it. If you want to be able to resume later, provide a mechanism to allow that. Even having your thread blocked efficiently waiting on a WaitHandle object would be the wrong thing to do, because it wastes a thread pool thread.
The code you've posted here is about the worst way to implement "pausing". Instead of waiting on a synchronization object such as a WaitHandle, you have the current thread spin in a loop, constantly checking the value of a flag. Even ignoring the question of whether you're using volatile (the code example doesn't show that, but then as posted it also wouldn't compile, so…), it's terrible to force a CPU core to do so much work and yet get nowhere.
Don't pause your BackgroundWorker.DoWork handler in the first place. Really. Just don't do that. But if you insist, then at least use some kind of waitable object instead of a "spin-wait" loop as in the example you've posted here.
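If you do insist on blocking the worker, a minimal sketch of that "waitable object" fallback might look like the following; the field and handler names here are illustrative, not taken from the question:

// Requires System.Threading. The gate starts "open" (signalled), so the loop runs.
private readonly ManualResetEventSlim _gate = new ManualResetEventSlim(true);

private void backgroundWorker1_DoWork(object sender, DoWorkEventArgs e)
{
    for (int i = 0; i < 100000; i++)
    {
        _gate.Wait();              // blocks efficiently while the gate is closed
        // ... one unit of work; report progress via ReportProgress() ...
    }
}

private void btnPauseResume_Click(object sender, EventArgs e)
{
    if (_gate.IsSet) _gate.Reset();   // close the gate: worker pauses at its next Wait()
    else _gate.Set();                 // open the gate: worker resumes
}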
Here's an example of how your code might work if you wanted to avoid tying up a thread at all while "paused". First, don't use BackgroundWorker, because it doesn't have a graceful way to do this. Second, do use await: it does exactly what you want. It allows the current method to return without losing track of its progress; the method will resume executing when the thing it awaited indicates completion.
In the example below, I've tried to guess at what the code that calls RunWorkerAsync() looks like. Or rather, I just assumed you've got a button2 whose Click handler starts your worker task. If this is not enough to get you pointed in the right direction, please improve your question by including a good, minimal, complete code example showing what you're actually doing.
// These fields will work together to provide a way for the thread to interrupt
// itself temporarily without actually using a thread at all.
private TaskCompletionSource<object> _pause;
private readonly object _pauseLock = new object();

private void button2_Click(object sender, EventArgs e)
{
    // Initialize ProgressBar. Note: in your version of the code, this was
    // done in the DoWork event handler, but that handler isn't executed in
    // the UI thread, and so accessing a UI object like progressBar1 is not
    // a good idea. If you got away with it, you were lucky.
    progressBar1.Minimum = 0;
    progressBar1.Maximum = 100000;
    progressBar1.Value = 0;

    // This object will perform the duty of the BackgroundWorker's
    // ProgressChanged event and ReportProgress() method.
    Progress<int> progress = new Progress<int>(i => progressBar1.Value++);

    // We do want the code to run in the background. Use Task.Run() to accomplish that.
    Task.Run(async () =>
    {
        for (int i = 0; i < 100000; i++)
        {
            progress.Report(i);

            Task task = null;

            // Locking ensures that the two threads which may be interacting
            // with the _pause object do not interfere with each other.
            lock (_pauseLock)
            {
                if (i == 50000)
                {
                    // We want to pause. But it's possible we lost the race with
                    // the user, who also just pressed the pause button. So
                    // only allocate a new TCS if there isn't already one.
                    if (_pause == null)
                    {
                        _pause = new TaskCompletionSource<object>();
                    }
                }

                // If by the time we get here, there's a TCS to wait on, then
                // set our local variable for the Task to wait on. In this way
                // we resolve any other race that might occur between the time
                // we checked the _pause object and then later tried to wait on it.
                if (_pause != null)
                {
                    task = _pause.Task;
                }
            }

            if (task != null)
            {
                // This is the most important part: using "await" tells the method to
                // return, but in a way that will allow execution to resume later.
                // That is, when the TCS's Task transitions to the completed state,
                // this method will resume executing, using any available thread
                // in the thread pool.
                await task;

                // Once we resume execution here, clear the TCS so that the pause
                // button can pause the loop again.
                lock (_pauseLock)
                {
                    _pause = null;
                }
            }
        }
    });
}

private void button1_Click(object sender, EventArgs e)
{
    lock (_pauseLock)
    {
        // A bit more complicated than toggling a flag, granted. But it achieves
        // the desirable goal.
        if (_pause == null)
        {
            // Creates the object to wait on. The worker thread will look for
            // this and wait if it exists.
            _pause = new TaskCompletionSource<object>();
        }
        else if (!_pause.Task.IsCompleted)
        {
            // Giving the TCS a result causes its corresponding Task to transition
            // to the completed state, releasing any code that might be waiting
            // on it.
            _pause.SetResult(null);
        }
    }
}
Note that the above is just as contrived as your original example. If all you really had was a simple single loop variable iterating from 0 to 100,000 and stopping halfway through, nothing nearly so complicated as the above would be required. You'd just store the loop variable in a data structure somewhere, exit the running task thread, and then when you want to resume, pass in the current loop variable value so the method can resume at the right index.
But I'm assuming your real-world example is not so simple. And the above strategy will work for any stateful processing, with the compiler doing all the heavy-lifting of storing away intermediate state for you.
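For the simple contrived case, that "just store the loop variable" approach might look something like this sketch (field and method names are made up for illustration):

private int _resumeIndex;                      // progress saved between runs
private CancellationTokenSource _cts;

private async Task RunChunkAsync()
{
    _cts = new CancellationTokenSource();
    CancellationToken token = _cts.Token;

    await Task.Run(() =>
    {
        for (int i = _resumeIndex; i < 100000; i++)
        {
            if (token.IsCancellationRequested)
            {
                _resumeIndex = i;              // remember where we stopped...
                return;                        // ...and let the task end (no thread tied up)
            }
            // ... one unit of work ...
        }
        _resumeIndex = 0;                      // finished: start from the beginning next time
    });
}

// "Pause" is then _cts.Cancel(); "resume" is simply calling RunChunkAsync() again.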

Is there such a synchronization tool as "single-item-sized async task buffer"?

Many times in UI development I handle events in such a way that when an event first comes in, I immediately start processing it, but if a processing operation is already in progress, I wait for it to complete before I process another event. If more than one event occurs before the operation completes, I only process the most recent one.
The way I typically do that: my process method has a loop, and in my event handler I check a field that indicates whether I am currently processing something. If I am, I put the current event arguments in another field that is basically a one-item-sized buffer, and when the current processing pass completes, I check whether there is another event to process and loop until I am done.
Now this seems a bit too repetitive and possibly not the most elegant way to do it, though it seems to otherwise work fine for me. I have two questions then:
Does what I need to do have a name?
Is there some reusable synchronization type out there that could do that for me?
I'm thinking of adding something to the set of async coordination primitives by Stephen Toub that I included in my toolkit.
So first, we'll handle the case that you described in which the method is always used from the UI thread, or some other synchronization context. The Run method can itself be async to handle all of the marshaling through the synchronization context for us.
If we're already running, we just store the action as next. If we're not, then we mark ourselves as running, await the action, and then continue awaiting the next action until there is no next action. We make sure that whenever we're done, we indicate that we're no longer running:
public class EventThrottler
{
    private Func<Task> next = null;
    private bool isRunning = false;

    public async void Run(Func<Task> action)
    {
        if (isRunning)
            next = action;
        else
        {
            isRunning = true;
            try
            {
                await action();
                while (next != null)
                {
                    var nextCopy = next;
                    next = null;
                    await nextCopy();
                }
            }
            finally
            {
                isRunning = false;
            }
        }
    }

    private static Lazy<EventThrottler> defaultInstance =
        new Lazy<EventThrottler>(() => new EventThrottler());

    public static EventThrottler Default
    {
        get { return defaultInstance.Value; }
    }
}
Because the class is, at least generally, going to be used exclusively from the UI thread, there will usually only need to be one instance, so I added a convenience property for a default instance; but since it may still make sense for there to be more than one in a program, I didn't make it a singleton.
Run accepts a Func<Task> with the idea that it would generally be an async lambda. It might look like:
public class Foo
{
    public void SomeEventHandler(object sender, EventArgs args)
    {
        EventThrottler.Default.Run(async () =>
        {
            await Task.Delay(1000);
            //do other stuff
        });
    }
}
Okay, so, just to be verbose, here is a version that handles the case where the event handlers are called from different threads. I know you said that you assume they're all called from the UI thread, but I generalized it a bit. This means locking over all access to instance fields of the type in a lock block, but not actually executing the function inside of a lock block. That last part is important not just for performance, and to ensure we're not blocking callers that merely need to set the next field, but also to avoid issues with the action itself calling Run, so that we don't need to deal with re-entrancy issues or potential deadlocks. This pattern, of doing work inside a lock block and then responding based on conditions determined inside the lock, means setting local variables to indicate what should be done after the lock ends.
public class EventThrottlerMultiThreaded
{
    private object key = new object();
    private Func<Task> next = null;
    private bool isRunning = false;

    public void Run(Func<Task> action)
    {
        bool shouldStartRunning = false;
        lock (key)
        {
            if (isRunning)
                next = action;
            else
            {
                isRunning = true;
                shouldStartRunning = true;
            }
        }

        Action<Task> continuation = null;
        continuation = task =>
        {
            Func<Task> nextCopy = null;
            lock (key)
            {
                if (next != null)
                {
                    nextCopy = next;
                    next = null;
                }
                else
                {
                    isRunning = false;
                }
            }
            if (nextCopy != null)
                nextCopy().ContinueWith(continuation);
        };

        if (shouldStartRunning)
            action().ContinueWith(continuation);
    }
}
Does what I need to do have a name?
What you're describing sounds a bit like a trampoline combined with a collapsing queue. A trampoline is basically a loop that iteratively invokes thunk-returning functions. An example is the CurrentThreadScheduler in the Reactive Extensions. When an item is scheduled on a CurrentThreadScheduler, the work item is added to the scheduler's thread-local queue, after which one of the following things will happen:
If the trampoline is already running (i.e., the current thread is already processing the thread-local queue), then the Schedule() call returns immediately.
If the trampoline is not running (i.e., no work items are queued/running on the current thread), then the current thread begins processing the items in the thread-local queue until it is empty, at which point the call to Schedule() returns.
A collapsing queue accumulates items to be processed, with the added twist that if an equivalent item is already in the queue, then that item is simply replaced with the newer item (resulting in only the most recent of the equivalent items remaining in the queue, as opposed to both). The idea is to avoid processing stale/obsolete events. Consider a consumer of market data (e.g., stock ticks). If you receive several updates for a frequently traded security, then each update renders the earlier updates obsolete. There is likely no point in processing earlier ticks for the same security if a more recent tick has already arrived. Thus, a collapsing queue is appropriate.
In your scenario, you essentially have a trampoline processing a collapsing queue for which all incoming events are considered equivalent. This results in an effective maximum queue size of 1, as every item added to a non-empty queue will result in the existing item being evicted.
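To make the "collapsing" idea concrete, here is a rough sketch of a keyed collapsing queue (illustrative only, not a production type); with a single fixed key it degenerates into exactly the one-item buffer described in the question:

using System.Collections.Generic;

public class CollapsingQueue<TKey, TItem>
{
    private readonly object _sync = new object();
    private readonly Dictionary<TKey, TItem> _latest = new Dictionary<TKey, TItem>();
    private readonly Queue<TKey> _order = new Queue<TKey>();

    public void Enqueue(TKey key, TItem item)
    {
        lock (_sync)
        {
            if (!_latest.ContainsKey(key))
                _order.Enqueue(key);   // first time we see this key: remember arrival order
            _latest[key] = item;       // an equivalent (same-key) item just replaces the stale one
        }
    }

    public bool TryDequeue(out TItem item)
    {
        lock (_sync)
        {
            if (_order.Count == 0) { item = default(TItem); return false; }
            TKey key = _order.Dequeue();
            item = _latest[key];
            _latest.Remove(key);
            return true;
        }
    }
}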
Is there some reusable synchronization type out there that could do that for me?
I do not know of an existing solution that would serve your needs, but you could certainly create a generalized trampoline or event loop capable of supporting pluggable scheduling strategies. The default strategy could use a standard queue, while other strategies might use a priority queue or a collapsing queue.
What you're describing sounds very similar to how TPL Dataflow's BroadcastBlock behaves: it always remembers only the last item that you sent to it. If you combine it with an ActionBlock that executes your action and has capacity only for the item currently being processed, you get what you want (the method needs a better name):
// returns send delegate
private static Action<T> CreateProcessor<T>(Action<T> executedAction)
{
    var broadcastBlock = new BroadcastBlock<T>(null);
    var actionBlock = new ActionBlock<T>(
        executedAction, new ExecutionDataflowBlockOptions { BoundedCapacity = 1 });
    broadcastBlock.LinkTo(actionBlock);
    return item => broadcastBlock.Post(item);
}
Usage could be something like this:
var processor = CreateProcessor<int>(
    i =>
    {
        Console.WriteLine(i);
        Thread.Sleep(i);
    });

processor(100);
processor(1);
processor(2);
Output:
100
2

Exit a loop if another thread enters the method

I have a multi-threading issue.
I have a method that is called to refresh several items.
In this method, I iterate over a list of items and refresh one of each item's properties.
The list has a lot of elements and we have to do some math to compute that property.
The current code of this operation looks like this:
public void AddItemsWithLayoutRefresh(IEnumerable<MyItem> items)
{
    _control.Invoke(() =>
    {
        AddItems(items);
        for (int i = 0; i < _guiItems.Count; i++)
        {
            //The goal is to have a condition here to "break" the loop and let the next call to RefreshLayout proceed
            _guiItems[i].Propriety = ComputePropriety(_guiItems[i]);
        }
    });
}
The problem is that I may have 4 calls, which currently just block on the Invoke.
I have to finish the AddItems method, but everything in the "for" loop can be aborted without any issue if I know that it will be executed again just after.
But how do I do this in a thread-safe way?
If I add a private bool _isNewRefreshHere;, set it to true before entering the Invoke and then check it inside the Invoke, I have no guarantee that two calls haven't already reached the Invoke BEFORE I check it in the for loop.
So how can I break out of my loop when a new call is made to my method?
Solution
Based on Andrej Mohar's answer, I did the following:
private long m_refreshQueryCount;

public void AddItemsWithLayoutRefresh(IEnumerable<MyItem> items)
{
    Interlocked.Increment(ref m_refreshQueryCount);
    _control.Invoke(() =>
    {
        Interlocked.Decrement(ref m_refreshQueryCount);
        AddItems(items);
        for (int i = 0; i < _guiItems.Count; i++)
        {
            if (Interlocked.Read(ref m_refreshQueryCount) > 0)
            {
                break;
            }
            _guiItems[i].Propriety = ComputePropriety(_guiItems[i]);
        }
    });
}
Which seems to work very nicely
If I were you, I'd try to make a thread-safe waiting counter. You can use Interlocked methods like Increment and Decrement. What these basically do is they increment the value as an atomic operation, which is considered to be thread-safe. So you increase the variable before the Invoke call. This will allow you to know how many threads are in the waiting queue. You decrement the variable after the for loop finishes and before the ending of the Invoke block. You can then check inside the for statement for the number of waiting threads and break the for if the number is greater than 1. This way you should know exactly how many threads are in the execution chain.
I would do it in the following way:
private readonly object _refresherLock = new object();
private bool _isNewRefreshHere = false;
private AutoResetEvent _refresher = new AutoResetEvent(true);

public void AddItemsWithLayoutRefresh(IEnumerable<MyItem> items)
{
    lock (_refresherLock)
    {
        if (_isNewRefreshHere)
        {
            return;
        }
        _isNewRefreshHere = true;
    }

    _refresher.WaitOne();
    _isNewRefreshHere = false;

    _control.Invoke(() =>
    {
        AddItems(items);
        for (int i = 0; i < _guiItems.Count && !_isNewRefreshHere; i++)
        {
            _guiItems[i].Propriety = ComputePropriety(_guiItems[i]);
        }
        _refresher.Set();
    });
}
That is:
You can always cancel the current update with a new one.
You cannot queue up more than one update at a time.
You are guaranteed to have no cross-threading conflicts.
You should test that code since I did not. :)

c# event handling: best practice to avoid thread contention and threadpool draining

When events trigger, they use threads from the threadpool. So if you have a bunch of events that trigger faster than they return, you drain your threadpool. So whenever you have an event handler method that doesn't have any other control to limit the rate of threads entering, doesn't have any guarantee of returning quickly, and isn't painstakingly implemented as 100% thread-safe code, it's probably best to implement some thread control. The obvious simple thing to do would be to lock() inside the event handling method, but if you do that, all the threads after the first one will block in a queue, waiting to enter the lock region, hogging all your threads from the threadpool. It is probably better to detect that another thread is inside this method and quickly abort instead.
The question is: I have a way of detecting another thread already running, and quickly aborting the subsequent threads. But it doesn't seem very C#-ish due to the use of "const" and manually handling a locking flag at a low level. Is there a better way?
This is basically a direct replication of the lock() functionality, but using a non-blocking Interlocked.Exchange, instead of using the blocking Monitor.Enter()
public class FooGoo
{
    private const int LOCKED = 0;                // could use any arbitrary value; I choose 0
    private const int UNLOCKED = LOCKED + 1;     // any arbitrary value, != LOCKED
    private static int _myLock = UNLOCKED;

    void myEventHandler()
    {
        int previousValue = Interlocked.Exchange(ref _myLock, LOCKED);
        if (previousValue == UNLOCKED)
        {
            try
            {
                // some handling code, which may or may not return quickly
                // maybe not threadsafe
            }
            finally
            {
                _myLock = UNLOCKED;
            }
        }
        else
        {
            // another thread is executing right now. So I will abort.
            //
            // optional and environment-specific, maybe you want to
            // queue some event information or set a flag or something,
            // so you remember later that this thread aborted
        }
    }
}
So far, this is the best answer I have found. Does there exist any shorthand equivalent of a non-blocking lock() to shorten this up?
static object _myLock = new object();

void myMethod()
{
    if (Monitor.TryEnter(_myLock))
    {
        try
        {
            // Do stuff
        }
        finally
        {
            Monitor.Exit(_myLock);
        }
    }
    else
    {
        // then I failed to get the lock. Optionally do stuff.
    }
}
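Not from the original thread, but another reasonably short non-blocking variant is SemaphoreSlim with a zero timeout; unlike a Monitor, the semaphore is not thread-affine, so it can also be released from a different thread than the one that acquired it (useful if the protected code becomes async):

static readonly SemaphoreSlim _gate = new SemaphoreSlim(1, 1);

void myEventHandler()
{
    // Wait(0) never blocks: it returns false immediately if the gate is already taken.
    if (_gate.Wait(0))
    {
        try
        {
            // Do stuff
        }
        finally
        {
            _gate.Release();
        }
    }
    else
    {
        // another call is already in progress; abort
    }
}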

Pause and Resume a Thread

I have this code to pause and resume a thread:
public partial class frmMain : Form
{
    (...)
    ManualResetEvent wait_handle = new ManualResetEvent(true);
    (...)
}

private void frmMain_Shown(object sender, EventArgs e)
{
    ThreadPool.QueueUserWorkItem(new WaitCallback(TheLoop));
}

private void TheLoop(object stateinfo)
{
    bool hasInfo = true;
    while (doLoop)
    {
        wait_handle.WaitOne();
        bool hasLines = GetInfo();
        if (hasLines)
        {
            //Consuming time Operation 1
            System.Threading.Thread.Sleep(7000);
            if (CurrentLine < line.Count - 1)
                CurrentLine++;
            else
            {
                bool hasInfo2 = GetInfo2();
                if (hasInfo2)
                {
                    //Consuming time Operation 2
                    System.Threading.Thread.Sleep(7000);
                }
                CurrentLine = 0;
            }
        }
        else
            System.Threading.Thread.Sleep(40000); //Wait to query again
    }
}

private void btnPauseResume_Click(object sender, EventArgs e)
{
    if (btnPauseResume.Text == "Pause")
    {
        btnPauseResume.Text = "Resume";
        wait_handle.Reset();
    }
    else
    {
        btnPauseResume.Text = "Pause";
        wait_handle.Set();
    }
}
The code above cycles through information. It works fine for pausing and resuming the first time-consuming operation, but it doesn't work for the second one: if I press the button to pause the thread during the second time-consuming operation, that operation continues, and the thread only pauses when the first one comes around again.
What am I missing here?
Thx
Have you considered using a BackgroundWorker instead, since you are using WinForms? It would probably be easier than trying to 'pause' a thread. You can check the CancellationPending property to see if a user has elected to cancel the operation. The link has a good sample to look at.
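As a rough sketch of that pattern (the handler and button names are illustrative, and backgroundWorker1.WorkerSupportsCancellation must be set to true):

private void backgroundWorker1_DoWork(object sender, DoWorkEventArgs e)
{
    var worker = (BackgroundWorker)sender;
    for (int i = 0; i < 100000; i++)
    {
        if (worker.CancellationPending)
        {
            e.Cancel = true;      // lets RunWorkerCompleted know the work was cancelled
            return;
        }
        // ... one unit of work; report progress with worker.ReportProgress(i) ...
    }
}

private void btnCancel_Click(object sender, EventArgs e)
{
    backgroundWorker1.CancelAsync();   // sets CancellationPending on the worker
}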
I have never seen someone pause a thread. Create a delegate and event inside the class or method that you are executing on a separate thread. Raise that event whenever you wish to pause your thread.
There is no reason that I can see why a second call to WaitOne would not work if placed before the second time-consuming operation. Since you are using a ManualResetEvent, the wait handle's state will persist until either Set or Reset is called. That means if you resume the thread by calling Set, then both calls to WaitOne will pass through. Likewise, if you pause the thread by calling Reset, then both calls to WaitOne will block. Of course, it will not be possible to predict where the worker thread will pause if there is more than one call to WaitOne.
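In other words, a sketch of the relevant part of TheLoop with a second wait added before the second operation would be:

else
{
    bool hasInfo2 = GetInfo2();
    if (hasInfo2)
    {
        wait_handle.WaitOne();   // honour a pending pause here as well
        //Consuming time Operation 2
        System.Threading.Thread.Sleep(7000);
    }
    CurrentLine = 0;
}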
Got it, guys! The thing is where you put the WaitOne(). For instance, if I have a while loop (like in my example) and I only put the wait before it, then no matter how many times I hit the pause button it won't stop the thread, which is logical since the loop pass has already begun; but if I put it at the end, then it works.
Appreciated your help.
