Suppose you are continuously invoking a method asynchronously on the UI thread/dispatcher with
while (true) {
    // The argument array has to match the delegate's parameters (index and item are placeholders for insert_'s two arguments)
    uiDispatcher.BeginInvoke(new Action<int, T>(insert_), DispatcherPriority.Normal, new object[] { index, item });
}
On every run of the program you observe that the GUI of the application begins to freeze after about 90 seconds due to the flood of invocations (time varies but lies roughly between 1 and 2 minutes).
How could one determine (measure?) exactly when this overloading occurs, in order to stop it early enough?
Appendix I:
In my actual program I don't have an infinite loop. I have an algorithm that iterates several hundred times before terminating. In every iteration I am adding a string to a List control in my WPF application. I used the while (true) { ... } construct because it best matches what happens. In fact the algorithm terminates correctly and all (hundreds of) strings are added correctly to my List, but after some time I am losing the ability to use my GUI until the algorithm terminates - then the GUI is responsive again.
Appendix II:
The purpose of my program is to observe a particular algorithm while it's running. The strings I am adding are log entries: one log string per iteration. The reason why I am invoking these add operations is that the algorithm is running in a different thread than the UI thread. To cope with the fact that I can't do UI manipulation from any thread other than the UI thread, I built some kind of ThreadSafeObservableCollection. (I am pretty sure that this code is not worth posting because it would detract from the actual problem, which I think is that the UI can't handle such frequent, repeated method invocations.)
It's pretty straightforward: you are doing it wrong by the time you overload the user's eyeballs. Which happens pretty quickly as far as modern CPU cores are concerned: beyond 20 updates per second, the displayed information just starts to look like a blur. Something cinema takes advantage of; movies play back at 24 frames per second.
Updating any faster than that is just a waste of resources. You still have an enormous amount of breathing room left before the UI thread starts to buckle. It depends on the amount of work you ask it to do, but a 50x safety margin is typical. A simple timer based on Environment.TickCount will get the job done: fire an update when the difference is >= 45 ms.
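A minimal sketch of that throttle, assuming the worker thread produces the log strings (OnAlgorithmIteration, AddLogEntry and lastUpdateTick are illustrative names, not from the question). Note that it simply skips posts that arrive too soon; if every entry must reach the UI you would batch them instead, as the next answer suggests.
private int lastUpdateTick = Environment.TickCount;

// Called on the worker thread for every iteration of the algorithm.
void OnAlgorithmIteration(string logEntry)
{
    int now = Environment.TickCount;
    if (now - lastUpdateTick >= 45)          // at most ~22 updates per second
    {
        lastUpdateTick = now;
        uiDispatcher.BeginInvoke(new Action<string>(AddLogEntry),
                                 DispatcherPriority.Normal, logEntry);
    }
    // Posts arriving inside the 45 ms window are simply skipped.
}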
Posting that often to the UI is a red flag. Here is an alternative: Put new strings into a ConcurrentQueue and have a timer pull them out every 100ms.
Very simple and easy to implement, and the result is perfect.
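A minimal sketch of that approach, assuming WPF and the System.Collections.Concurrent / System.Windows.Threading namespaces (logQueue, LogList and the timer wiring are illustrative names):
private readonly ConcurrentQueue<string> logQueue = new ConcurrentQueue<string>();
private DispatcherTimer flushTimer;

// Worker thread: only enqueue, never touch the UI.
void OnAlgorithmIteration(string logEntry)
{
    logQueue.Enqueue(logEntry);
}

// UI thread: create the timer once, e.g. in the window constructor.
void StartFlushTimer()
{
    flushTimer = new DispatcherTimer { Interval = TimeSpan.FromMilliseconds(100) };
    flushTimer.Tick += (s, e) =>
    {
        string entry;
        while (logQueue.TryDequeue(out entry))
            LogList.Items.Add(entry);        // drain everything queued since the last tick
    };
    flushTimer.Start();
}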
I've not used WPF, just Windows Forms, but I would suggest that if there is a view-only control which will need to be updated asynchronously, the proper way to do it is to write the control so that its properties can be accessed freely from any thread, and have an update BeginInvoke the refresh routine only if there isn't already an update pending. The latter determination can be made with an Int32 "flag" and Interlocked.Exchange: the property setter calls Interlocked.Exchange on the flag after changing the underlying field; if the flag had been clear, it does a BeginInvoke on the refresh routine; the refresh routine then clears the flag and performs the refresh. In some cases, the pattern may be further enhanced by having the control's refresh routine check how much time has elapsed since the last time it ran and, if the answer is less than 20 ms or so, use a timer to trigger a refresh 20 ms after the previous one.
Even though .NET can handle having many BeginInvoke actions posted on the UI thread, it's often pointless to have more than one update for a single control pending at a time. Limit the pending actions to one (or at most a small number) per control, and there will be no danger of the queue overflowing.
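A sketch of that pattern as it might look inside a Windows Forms control (the field and control names are made up for illustration):
private volatile string latestText;      // last value written by any thread
private int refreshPending;              // 0 = no refresh queued, 1 = refresh queued

public string TextThreadSafe
{
    get { return latestText; }
    set
    {
        latestText = value;
        // Queue a refresh only if one isn't already pending.
        if (Interlocked.Exchange(ref refreshPending, 1) == 0)
            BeginInvoke(new Action(RefreshFromLatest));
    }
}

private void RefreshFromLatest()
{
    // Clear the flag first, so a write arriving during the refresh queues a new one.
    Interlocked.Exchange(ref refreshPending, 0);
    statusLabel.Text = latestText;       // whatever the control actually renders
}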
ok, sorry for the bad link before in the comments, but I kept reading and maybe this will be of help:
The DispatcherOperation object returned by BeginInvoke can be used in several ways to interact with the specified delegate, such as:
Changing the DispatcherPriority of the delegate as it is pending execution in the event queue.
Removing the delegate from the event queue.
Waiting for the delegate to return.
Obtaining the value that the delegate returns after it is executed.
If multiple BeginInvoke calls are made at the same DispatcherPriority, they will be executed in the order the calls were made.
If BeginInvoke is called on a Dispatcher which has shut down, the status property of the returned DispatcherOperation is set to Aborted.
Maybe you can do something with the number of delegates that you are waiting on...
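For example (a hedged sketch; pendingOps, the threshold and AddLogEntry are illustrative, and it assumes only the one worker thread posts):
private readonly List<DispatcherOperation> pendingOps = new List<DispatcherOperation>();

// Called from the single worker thread.
void PostLogEntry(string entry)
{
    // Drop handles for operations that have already completed or were aborted.
    pendingOps.RemoveAll(op => op.Status != DispatcherOperationStatus.Pending);

    if (pendingOps.Count > 100)          // arbitrary threshold, tune empirically
    {
        // The UI is falling behind: skip, batch, or wait instead of posting more.
        return;
    }

    pendingOps.Add(uiDispatcher.BeginInvoke(new Action<string>(AddLogEntry),
                                            DispatcherPriority.Normal, entry));
}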
To put supercat's solution in a more WPF-like way, aim for an MVVM pattern; then you can have a separate view model class which you can share between threads, perhaps taking locks at appropriate points or using the concurrent collection classes. You implement an interface (INotifyCollectionChanged for a collection, or INotifyPropertyChanged for simple properties) and fire an event to say the collection has changed. This event must be fired from the UI thread, but only needs to be raised once per batch of changes rather than for every single string you add.
After going through the answers provided by others and your comments on them, your actual intent seems to be ensuring that the UI remains responsive. For this I think you have already received good proposals.
But still, to answer your question (how to detect and flag overloading of the UI thread) verbatim, I can suggest the following:
First determine what the definition of 'overloading' should be (e.g. I can assume it to be 'the UI thread stops rendering the controls and stops processing user input' for a long enough duration).
Define this duration (e.g. if the UI thread continues to process render and input messages within at most 40 ms, I will say it is not overloaded).
Now initiate a DispatcherTimer with DispatcherPriority set according to your definition of overloading (for my example it can be DispatcherPriority.Input or lower) and an Interval sufficiently less than your 'duration' for overloading.
Maintain a shared variable of type DateTime and on each tick of the timer change its value to DateTime.Now.
In the delegate you pass to BeginInvoke, you can compute the difference between the current time and the last time Tick was fired. If it exceeds your 'measure' of overloading, then the UI thread is 'overloaded' according to your definition. You can then set a shared flag which can be checked from inside your loop to take appropriate action, as sketched below.
Though I admit it is not foolproof, by empirically adjusting your 'measure' you should be able to detect overloading before it impacts you.
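A rough sketch of those steps (all names and the 40 ms / 20 ms numbers are illustrative):
private DateTime lastHeartbeat = DateTime.Now;   // written by the timer on the UI thread
private volatile bool uiOverloaded;              // read by the algorithm's worker thread

void StartHeartbeat()
{
    var heartbeat = new DispatcherTimer(DispatcherPriority.Input)
    {
        Interval = TimeSpan.FromMilliseconds(20) // comfortably below the 40 ms budget
    };
    heartbeat.Tick += (s, e) => lastHeartbeat = DateTime.Now;
    heartbeat.Start();
}

// The delegate passed to BeginInvoke, running on the UI thread:
void AddLogEntry(string entry)
{
    uiOverloaded = (DateTime.Now - lastHeartbeat) > TimeSpan.FromMilliseconds(40);
    LogList.Items.Add(entry);
}

// Inside the algorithm loop on the worker thread:
//     if (uiOverloaded) { /* back off: batch, sleep, or stop posting for a while */ }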
Use a Stopwatch to measure the minimum, maximum, average, first and last update durations. (You can output these to your UI.)
Your update frequency must be less than 1/(the average update duration).
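For instance (a sketch; the field names are illustrative):
private readonly Stopwatch updateWatch = new Stopwatch();
private long minMs = long.MaxValue, maxMs, totalMs, count;

// The delegate dispatched to the UI thread.
void AddLogEntry(string entry)
{
    updateWatch.Restart();
    LogList.Items.Add(entry);
    updateWatch.Stop();

    long ms = updateWatch.ElapsedMilliseconds;
    minMs = Math.Min(minMs, ms);
    maxMs = Math.Max(maxMs, ms);
    totalMs += ms;
    count++;
    // average = totalMs / count; keep the posting rate below 1000 / average per second
}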
Change your algorithm's implementation so that its iterations are invoked by a multimedia timer, e.g. this .NET wrapper or this .NET wrapper. When the timer is activated, use Interlocked to prevent running a new iteration before the current iteration is complete. If you need to run iterations on the main thread, use a dispatcher. You can run more than one iteration per timer event; use a parameter for this, and together with the time measurements determine how many iterations to run per timer event and how often you want the timer events.
I do not recommend using less than 5 ms for the timer, as the timer events would swamp the CPU.
As I wrote earlier in my comment, use DispatcherPriority.Input when dispatching to the main thread; that way the UI's CPU time isn't eaten up by the dispatches. This is the same priority that UI messages have, so they are not ignored either.
Related
When making UI updates with GTK# (GDK2), such as changing label text, setting button visibility etc, should I as a general rule be using Gdk.Threads.AddTimeout or Gdk.Threads.AddIdle?
The GDK2 documentation states that AddIdle
Adds a function to be called whenever there are no higher priority events pending.
So it would seem that AddTimeout would provide a more responsive update. However the documentation states:
Note that timeout functions may be delayed, due to the processing of other event sources. Thus they should not be relied on for precise timing.
Which leads me to believe Idle and Timeout are both about as responsive as each other (which they seem to be, when running code).
A comment in some code I am working on suggests that Timeout results in a faster update but hits the UI harder, but I cannot find any sources to back this up.
So as a general rule, which of these methods should I be using to perform GTK updates in the main thread?
Threads.AddTimeout(0, 0, () =>
{
// do something;
return false;
});
vs
Threads.AddIdle(0, () =>
{
// do something;
return false;
});
The timeout callback is appropriate if you need something to happen after a specific period of time. An example of this would be blinking a cursor in a text field (not that you should implement that yourself).
The idle callback gets called once the main loop finishes executing everything else that is ready. The difference, if you are just calling it once, is priority. If you have some other event being handled by the main loop, doing it in the idle handler guarantees that the other thing happens first.
The difference becomes more readily apparent when you return true, and get a repeated callback. If you do this in an idle callback you end up using as much CPU as the OS will let you, but your UI remains responsive assuming each callback is fast. If you do this with a timeout you get much more predictable behavior.
The only reason I would think this hits the UI harder is that it defaults to a higher priority, and could potentially delay draw events. Priority DEFAULT_IDLE < HIGH_IDLE (draw) < DEFAULT.
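To make the difference concrete, here is a sketch of a periodic UI update using the same calls as in the question (the 100 ms interval, statusLabel, GetLatestStatus and keepUpdating are made up for illustration):
// Runs roughly every 100 ms on the main loop; returning true re-arms the timeout,
// returning false removes it.
Threads.AddTimeout(0, 100, () =>
{
    statusLabel.Text = GetLatestStatus();
    return keepUpdating;
});

// The idle version runs whenever nothing of higher priority is pending, so if it
// keeps returning true it will spin as fast as the main loop allows:
// Threads.AddIdle(0, () => { statusLabel.Text = GetLatestStatus(); return keepUpdating; });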
I have a task with a huge amount of input data (video). I need to process its frames in the background without freezing the UI and I don't need to process every frame.
So I want to create a background thread and skip frames while the background is busy. Then I get more frames from the video input and repeat.
I have this simple code now. It worked. But can it cause trouble, and maybe there is a better approach?
public class VideoProcessor
{
    bool busy = false;

    void VideoStreamingEvent(Frame data)
    {
        if (!busy)
        {
            busy = true;
            InvokeInBackground(() =>
            {
                DataProcessing(data);
                busy = false;
            });
        }
    }
}
But can it cause trouble, and maybe there is a better approach?
If the VideoStreamingEvent method never executes concurrently on multiple threads, then this will work fine if you simply add volatile to the busy field declaration. It may, in practice, appear to work well enough without it, but that behavior is not guaranteed.
If it is possible for VideoStreamingEvent to be invoked on multiple threads, then you will need some synchronization around where you read and write the busy field.
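If concurrent calls are possible, here is a sketch using Interlocked.CompareExchange to make the claim-the-busy-slot step atomic (an int replaces the bool; the rest mirrors the question's code):
public class VideoProcessor
{
    int busy;   // 0 = idle, 1 = a frame is being processed

    void VideoStreamingEvent(Frame data)
    {
        // Only the caller that atomically flips busy from 0 to 1 starts a background job.
        if (Interlocked.CompareExchange(ref busy, 1, 0) == 0)
        {
            InvokeInBackground(() =>
            {
                try { DataProcessing(data); }
                finally { Interlocked.Exchange(ref busy, 0); }
            });
        }
        // Otherwise the frame is simply skipped.
    }
}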
I'm writing a program that runs through my method possibly 50 times a second or more (this is necessary).
The method needs to follow this model:
Create boolean value.
Wait until the value changes.
Continue on in the method.
Simple, I know, but I don't want to use a while loop because it takes up 3% or so more CPU than it should, and I imagine, should I need it to wait any longer for the value to change, that it could take up all of my CPU cycles, which I don't want. Also, creating a new thread every time I execute the method at 50 times per second is a horrible idea.
So what could I do? If I need to provide any other kind of information feel free to ask.
Could a ManualResetEvent be of any use? Not sure how it would work with your system, but it might be something to look into.
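A sketch of what that could look like (DoWork and OnValueChanged are illustrative names):
private readonly ManualResetEvent valueChanged = new ManualResetEvent(false);

void DoWork()
{
    valueChanged.Reset();      // takes the place of "create a boolean value"
    valueChanged.WaitOne();    // blocks without burning CPU until Set() is called
    // ... continue on in the method ...
}

// Wherever the value used to be flipped:
void OnValueChanged()
{
    valueChanged.Set();        // releases the waiting DoWork call
}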
Depending on the nature of the method, you could just make the rest of the method into an event handler, and the place that changes the value then fires a ValueChanged-type event.
For waiting on multiple threads, can anyone compare the pros and cons of using WaitHandle.WaitAll and Thread.Join?
WaitHandle.WaitAll has a 64 handle limit so that is obviously a huge limitation. On the other hand, it is a convenient way to wait for many signals in only a single call. Thread.Join does not require creating any additional WaitHandle instances. And since it could be called individually on each thread the 64 handle limit does not apply.
Personally, I have never used WaitHandle.WaitAll. I prefer a more scalable pattern when I want to wait on multiple signals. You can create a counting mechanism that counts up or down, and once a specific value is reached you signal a single shared event. The CountdownEvent class conveniently packages all of this into a single class.
// Start with a count of 1 so the main thread itself counts as a work item.
var finished = new CountdownEvent(1);

for (int i = 0; i < NUM_WORK_ITEMS; i++)
{
    finished.AddCount();
    SpawnAsynchronousOperation(
        () =>
        {
            try
            {
                // Place logic to run in parallel here.
            }
            finally
            {
                finished.Signal();
            }
        });
}

finished.Signal();   // remove the main thread's own count
finished.Wait();
Update:
The reason why you want to signal the event from the main thread is subtle. Basically, you want to treat the main thread as if it were just another work item. After all, it is running concurrently along with the real work items.
Consider for a moment what might happen if we did not treat the main thread as a work item. It goes through one iteration of the for loop and adds a count to our event (via AddCount), indicating that we have one pending work item, right? Let's say SpawnAsynchronousOperation completes and gets the work item queued on another thread. Now, imagine the main thread getting preempted before swinging around to the next iteration of the loop. The thread executing the work item gets its fair share of the CPU, starts humming along, and actually completes the work item. The Signal call in the work item runs and decrements our pending work item count to zero, which changes the state of the CountdownEvent to signalled. In the meantime the main thread wakes up, goes through all the remaining iterations of the loop, and hits the Wait call, but since the event got prematurely signalled it passes right on by even though there are still pending work items.
Again, avoiding this subtle race condition is easy when you treat the main thread as a work item. That is why the CountdownEvent is initialized with a count of one and the Signal method is called before the Wait.
I like @Brian's answer as a comparison of the two mechanisms.
If you are on .NET 4, it would be worthwhile exploring the Task Parallel Library to achieve task parallelism via System.Threading.Tasks, which allows you to manage tasks across multiple threads at a higher level of abstraction. The signalling you asked about in this question to manage thread interactions is hidden or much simplified, and you can concentrate on properly defining what each Task consists of and how to coordinate them.
This may seem off-topic, but as Microsoft themselves say in the MSDN docs:
in the .NET Framework 4, tasks are the preferred API for writing multi-threaded, asynchronous, and parallel code.
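As a sketch, the CountdownEvent example above collapses to something like this with tasks (the names are carried over from that earlier snippet):
var tasks = new Task[NUM_WORK_ITEMS];
for (int i = 0; i < NUM_WORK_ITEMS; i++)
{
    tasks[i] = Task.Factory.StartNew(() =>
    {
        // Place logic to run in parallel here.
    });
}
Task.WaitAll(tasks);   // no 64-handle limit and no manual signalling to get wrong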
The WaitAll mechanism involves kernel-mode objects. I don't think the same is true for the Join mechanism. I would prefer Join, given the opportunity.
Technically though, the two are not equivalent. IIRC Join can only operate on one thread at a time. WaitAll can wait for the signalling of multiple kernel objects in one call.
I have a WinForm drawing a chart from available data.
I programmed it so that every 1 second the WinForms Timer's Tick event calls a function that:
will dequeue all data available
will add new points on the chart
Right now the data to be plotted is really huge and it takes a lot of time to process and to update my form. Also, the WinForms Timer's Tick event relies on WM_TIMER, so it executes on the same thread as the Form.
These 2 things are making my form very unresponsive.
What can I do to solve this issue?
I thought the following:
move away from the WinForms Timer and start using a System.Threading.Timer
use the InvokeRequired pattern so I will rely on the .NET ThreadPool.
Since I have lots of data, is this a good idea?
I fear that at some point the ThreadPool queue will also become too long or too big.
Can you give me your suggestion about my issue?
Thank you very much!
AFG
It is a good idea to move the fetching of the data to a thread. You can use a BackgroundWorker that gets the data in an endless loop and
use the ProgressChanged event (raised via ReportProgress) to update the chart. This takes care of the InvokeRequired business.
Use a Sleep(remainingTime) inside the loop to get the desired frequency.
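A sketch of that loop (DequeueAllData, AddPointsToChart and the fixed 1-second sleep are illustrative stand-ins for the existing code):
var worker = new BackgroundWorker
{
    WorkerReportsProgress = true,
    WorkerSupportsCancellation = true
};

worker.DoWork += (s, e) =>
{
    while (!worker.CancellationPending)
    {
        var points = DequeueAllData();      // runs on the worker thread
        worker.ReportProgress(0, points);   // hands the batch to the UI thread
        Thread.Sleep(1000);                 // the desired update frequency
    }
};

worker.ProgressChanged += (s, e) =>
{
    AddPointsToChart(e.UserState);          // runs on the UI thread, no InvokeRequired needed
};

worker.RunWorkerAsync();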
It is quite unlikely you'll be ahead by using a background timer. Your chart control almost certainly requires it to be updated from the same thread it was created on. Any kind of control that has a visible appearance does. Which requires you to use Control.BeginInvoke in the Elapsed event handler so that the update code runs on the UI thread. Dequeueing data isn't likely to be expensive; you will actually have made it slower by invoking. And still not have taken the pressure off the UI thread.
You'll also have a potentially serious throttling problem, the timer will keep on ticking and pump data, even if the UI thread can't keep up. That will eventually crash your program with OOM.
Consider instead to make the code that updates the chart smarter. A chart can only display details of the data if such details are at least a pixel wide. Realistically, it can only display 2000 pixels with useful information. That's not much, updating 2000 data points shouldn't cause any trouble.
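If the queued batches are bigger than that, a simple thinning pass before updating the chart keeps the work bounded (a sketch; DataPoint and maxVisiblePoints are illustrative):
// Keep roughly one point per horizontal pixel; the rest adds no visible detail.
List<DataPoint> Downsample(List<DataPoint> points, int maxVisiblePoints)
{
    if (points.Count <= maxVisiblePoints)
        return points;

    int step = points.Count / maxVisiblePoints;
    var thinned = new List<DataPoint>(maxVisiblePoints + 1);
    for (int i = 0; i < points.Count; i += step)
        thinned.Add(points[i]);
    return thinned;
}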
I would go with a System.Timers.Timer over a BackgroundWorker in an endless loop.
The BackgroundWorker is executed from a ThreadPool and is not meant to run for the lifetime of your application.
Motivation for System.Timers.Timer:
Each Elapsed event is executed on a ThreadPool thread, so it won't hang your UI thread.
Using a combination of locks and enabling/disabling the timer we can get the same frequency as if we did a Thread.Sleep(xxx) in an endless loop.
Cleaner and more obvious as to what you are trying to achieve
Here's my suggestion:
Disabling the timer at the beginning of the method, then re-enabling it again at the end, will cater for the case where the amount of work done in the elapsed event takes longer than the timer interval. This also ensures the timer between updates is consistent. I've added a lock for extra precaution.
I used an anonymous method to update the UI thread, but you can obviously do that however you want, as long as you remember to Invoke; it's also a good idea to check the InvokeRequired property.
private readonly object chartUpdatingLock = new object();

private void UpdateChartTimerElapsed(object sender, ElapsedEventArgs e)
{
    // Try to get the lock; this caters for the case where two or more events fire
    // in quick succession.
    if (Monitor.TryEnter(chartUpdatingLock))
    {
        this.updateChartTimer.Enabled = false;
        try
        {
            // Dequeuing and whatever other work here...

            // Invoke the UI thread to update the control.
            this.myChartControl.Invoke(new MethodInvoker(delegate
            {
                // Do your UI work here.
            }));
        }
        finally
        {
            this.updateChartTimer.Enabled = true;
            Monitor.Exit(chartUpdatingLock);
        }
    }
}