I have a task with a huge amount of input data (video). I need to process its frames in the background without freezing the UI and I don't need to process every frame.
So I want to create a background thread and skip frames while the background work is busy. Then I take the next frame from the video input and repeat.
I have this simple code now. It works. But can it cause trouble, and maybe there is a better approach?
public class VideoProcessor
{
    bool busy = false;

    void VideoStreamingEvent(Frame data)
    {
        if (!busy)
        {
            busy = true;
            InvokeInBackground(() =>
            {
                DataProcessing(data);
                busy = false;
            });
        }
    }
}
If the VideoStreamingEvent method never executes concurrently on multiple threads, then this will work fine if you simply add volatile to the busy field declaration. It may, in practice, appear to work well enough without it, but that behavior is not guaranteed.
If it is possible for VideoStreamingEvent to be invoked on multiple threads, then you will need some synchronization around where you read and write the busy field.
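One lock-free way to do that synchronization is an interlocked flag. Below is a minimal sketch of that idea, assuming the same Frame type and the DataProcessing/InvokeInBackground helpers from the question; the plain bool is replaced by an int so Interlocked.CompareExchange can claim it atomically.

```csharp
using System.Threading;

public class VideoProcessor
{
    private int busy; // 0 = idle, 1 = a frame is being processed

    void VideoStreamingEvent(Frame data)
    {
        // Atomically flip 0 -> 1; only one caller wins, the others simply drop their frame.
        if (Interlocked.CompareExchange(ref busy, 1, 0) == 0)
        {
            InvokeInBackground(() =>
            {
                try
                {
                    DataProcessing(data);
                }
                finally
                {
                    // Release the flag even if DataProcessing throws.
                    Interlocked.Exchange(ref busy, 0);
                }
            });
        }
    }
}
```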
Let's say I have a method that I run in a separate thread via Task.Factory.StartNew().
This method reports progress (via IProgress&lt;T&gt;) so frequently that it freezes my GUI.
I know that simply reducing the number of reports would be a solution, like reporting only 1 out of 10, but in my case I really want to get all reports and display them in my GUI.
My first idea was to queue all reports and treat them one by one, pausing a little bit between each of them.
Firstly: Is it a good option?
Secondly: How to implement that? Using a timer or using some kind of Task.Delay()?
UPDATE:
I'll try to explain better. The progress sent to the GUI consists of geocoordinates that I display on a map. Displaying each report one after another provides a kind of animation on the map. That's why I don't want to skip any of them.
In fact, I don't mind if the method that I execute in another thread finishes way before the animation. All I want is to be sure that every point is displayed for at least a certain amount of time (let's say 200 ms).
Sounds like the whole point of having the process run in a separate thread is wasted if this is the result. As such, my first recommendation would be to reduce the number of updates if possible.
If that is out of the question, perhaps you could revise the data you are sending as part of each update. How large and how complex is the object or data structure used for reporting? Can performance be improved by reducing its complexity?
Finally, you might try another approach: what if you create a third thread that just handles the reporting and delivers it to your GUI in larger chunks? If you let your worker thread report its status to this reporter thread, and then let the reporter thread report back to your main GUI thread only occasionally (e.g. every 1 in 10, as you suggest yourself above, but then reporting 10 chunks of data at once), then you won't call on your GUI that often, yet you'll still be able to keep all the status data from the processing and make it available in the GUI.
I don't know how viable this will be for your particular situation, but it might be worth an experiment or two?
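For what it's worth, here is a minimal sketch of that chunking idea, simplified to batch on the worker side rather than on a dedicated reporter thread. GeoPoint, the batch size of 10 and the IProgress&lt;T&gt; wiring are all assumptions for illustration, not code from the question.

```csharp
using System;
using System.Collections.Generic;

class GeoPoint { public double Lat; public double Lon; }   // placeholder report type

class BatchingReporter
{
    private readonly IProgress<IReadOnlyList<GeoPoint>> uiProgress; // a Progress<T> created on the UI thread
    private readonly List<GeoPoint> buffer = new List<GeoPoint>();
    private const int BatchSize = 10;

    public BatchingReporter(IProgress<IReadOnlyList<GeoPoint>> uiProgress)
    {
        this.uiProgress = uiProgress;
    }

    // Called by the worker for every single point; no lock is needed because only
    // the worker thread touches the buffer.
    public void Report(GeoPoint point)
    {
        buffer.Add(point);
        if (buffer.Count >= BatchSize)
            Flush();
    }

    // Send the accumulated points to the UI in one call instead of ten.
    public void Flush()
    {
        if (buffer.Count == 0) return;
        uiProgress.Report(buffer.ToArray());
        buffer.Clear();
    }
}
```

The GUI side can then animate each batch at its own pace, which also fits the later requirement of keeping every point on screen for at least ~200 ms.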
I have many concerns regarding your solution, but I can't say for sure which one can be a problem without code samples.
First of all, Stephen Cleary in his StartNew is Dangerous article points out the real problems with using this method with its default parameters:
Easy enough for the simple case, but let’s consider a more realistic example:
private void Form1_Load(object sender, EventArgs e)
{
    Compute(3);
}

private void Compute(int counter)
{
    // If we're done computing, just return.
    if (counter == 0)
        return;

    var ui = TaskScheduler.FromCurrentSynchronizationContext();
    Task.Factory.StartNew(() => A(counter))
        .ContinueWith(t =>
        {
            Text = t.Result.ToString(); // Update UI with results.

            // Continue working.
            Compute(counter - 1);
        }, ui);
}

private int A(int value)
{
    return value; // CPU-intensive work.
}
...
Now, the question returns: what thread does A run on? Go ahead and walk through it; you should have enough knowledge at this point to figure out the answer.
Ready? The method A runs on a thread pool thread the first time, and then it runs on the UI thread the last two times.
I strongly recommend you read the whole article to better understand the StartNew method's usage, but I want to point out the final advice from it:
Unfortunately, the only overloads for StartNew that take a
TaskScheduler also require you to specify the CancellationToken and
TaskCreationOptions. This means that in order to use
Task.Factory.StartNew to reliably, predictably queue work to the
thread pool, you have to use an overload like this:
Task.Factory.StartNew(A, CancellationToken.None,
TaskCreationOptions.DenyChildAttach, TaskScheduler.Default);
And really, that’s kind of ridiculous. Just use Task.Run(() => A());.
So maybe your code can be improved simply by switching the method you use to create new tasks. But there are some other suggestions regarding your question:
Use a BlockingCollection&lt;T&gt; for storing the reports, and write a simple consumer that drains this queue to the UI, so you'll always have a limited number of reports in flight, but in the end all of them will be handled (see the sketch after this list).
Use the ConcurrentExclusiveSchedulerPair class for your logic: for generating the reports use the ConcurrentScheduler property, and for displaying them use the ExclusiveScheduler property.
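A minimal sketch of the BlockingCollection idea, assuming string reports and a Progress&lt;string&gt; created on the UI thread (both stand in for your own types):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

var reports = new BlockingCollection<string>(boundedCapacity: 1000);

// Created on the UI thread, so Report() marshals back to it automatically.
IProgress<string> progress = new Progress<string>(text => { /* update the GUI here */ });

// Producer: the long-running method just adds reports and never touches the UI.
Task.Run(() =>
{
    for (int i = 0; i < 500; i++)
        reports.Add($"report {i}");
    reports.CompleteAdding();
});

// Consumer: a single long-running task drains the queue and forwards each report to the UI.
Task.Factory.StartNew(() =>
{
    foreach (var report in reports.GetConsumingEnumerable())
        progress.Report(report);
}, CancellationToken.None, TaskCreationOptions.LongRunning, TaskScheduler.Default);
```

The bounded capacity is what keeps the number of outstanding reports limited: if the UI falls far behind, the producer simply blocks on Add until the consumer catches up.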
I am trying to make a WinForms multi-threading app, which endlessly generates exceptions in two different threads.
One thread uses the GenerateDllNotFoundExc() method and the other uses another method, which is basically the same but simply generates a different exception.
Each then writes the exception message to a queue and then from the queue to a text box.
However, the GUI always freezes after about 1 second; it writes messages to the text box for a bit and then freezes. I tried debugging it, and the code itself works, but the GUI freezes.
Could someone please give me a hint as to what I'm doing wrong?
private delegate void GetQueueElem();
private event GetQueueElem getqueuelem;

private void GenerateDllNotFoundExc()
{
    Action<String> addelem = new Action<String>(AddToQueue);
    string exdll = string.Empty;
    while (shouldgeneratemore)
    {
        try
        {
            throw new DllNotFoundException();
        }
        catch (Exception ex)
        {
            exdll = ex.Message;
        }
        this.Invoke(addelem, exdll);
    }
}

private void AddToQueue(string exmess)
{
    lock (lockobject)
        queue.Enqueue(exmess);
    getqueuelem.Invoke();
}

private void AddToTextBox()
{
    while (queue.Count > 0)
    {
        string s = queue.Dequeue() + "\t" + Thread.CurrentThread.Name
            + "\t" + Thread.CurrentThread.ManagedThreadId + "\t";
        lock (lockobject)
            textBox1.Text += s;
    }
}
This question is educational, it shows evidence of having all three major threading bugs. Putting them roughly in order in how common they are:
A threading race bug. Tripped when one thread reads a variable that is modified by another. Locking is required to avoid that from causing problems. This code uses the lock keyword but does not use it properly. The Queue class is not thread-safe, in this code both the unsafe Count property and the Dequeue() method are used without a lock. Not the actual problem here however, none of the code that uses the Queue actually runs on more than one thread. In other words, the lock isn't actually needed.
Deadlock. Occurs when code acquires locks in an unpredictable order. Particularly nasty for code that runs on the UI thread of a program, it often acquires locks that are not visible, built into the .NET Framework, the operating system or various 3rd party hooks. Screen readers for example. The Invoke() method is particularly prone to deadlock and should strongly be avoided, BeginInvoke() is always preferred. You don't actually need Invoke(), you don't care about the return value. Not the actual bug in this program however, even though it looks a lot like deadlock, you can use the debugger and see that the UI thread is executing code and not stopped on a lock.
A fire-hose bug. Fire-hosing occurs when the thread that produces results does so faster than the thread that processes them can consume. This kind of bug produces various kinds of misery, it can look a lot like a deadlock. Ultimately such a program will always fall over when it runs out of memory, consumed by a queue that contains too many results that have not been processed yet. Takes a while btw, .NET programs have a lot of memory available.
It is number 3 in this program. The UI thread needs to perform multiple duties and treats invoke requests with a high priority. Dispatching the invoked method, AddToQueue() in this case. It reads the invoke request from an internal queue and it tries to get the queue emptied first before doing other lower priority tasks. This goes wrong when the queue can't be emptied because a worker thread adds entries to the queue at a rate higher than the UI thread can empty it. In other words, the UI thread can never keep up, it only dispatches invoke requests and does not get around to doing anything else.
Pretty visible in Task Manager for example, you'll see your program burning 100% core. So you know it isn't actually deadlock. And very noticeable in your UI, you can bang on the Stop button but it does not have any effect. And painting no longer occurs, treated as a low priority task that's only executed when nothing more important needs to happen. It looks completely frozen, even though the UI thread is running like gangbusters.
A fire-hose bug is pretty easy to trip, it only takes a bit more than a thousand invoke requests per second. Depends on how much work the UI thread needs to do. Usually a lot, updating UI is typically pretty expensive. Nothing very subtle about setting the Text property of a TextBox, a lot of work happens under the covers. That innocent looking += operator burns a lot of cycles. Beyond the static overhead of SendMessage() to talk to the native TextBox, a lot of cycles are burned on constantly having to re-allocate the internal text buffer. Compare String vs StringBuilder. Or in other words, even if you don't trip the fire-hose bug at first, you are guaranteed you will sooner or later because the TextBox contains too much text that needs to be moved from one buffer to another. Sooner in your case.
Ultimately a fire-hose bug like this is a balancing bug. You are updating UI at a rate that is far, far higher than a human can ever observe. That is not a useful user interface. There is no practical advice for this program, it is too synthetic, intentionally slowing down the worker thread would be a workaround.
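For completeness, that workaround could look something like the sketch below: the worker loop from the question, paced so it can no longer flood the UI thread. The 50 ms pause (roughly 20 updates per second) is an arbitrary illustrative value, not something prescribed by the answer.

```csharp
private void GenerateDllNotFoundExcThrottled()
{
    while (shouldgeneratemore)
    {
        string message;
        try
        {
            throw new DllNotFoundException();
        }
        catch (Exception ex)
        {
            message = ex.Message;
        }

        // BeginInvoke instead of Invoke: the worker never blocks on the UI thread.
        this.BeginInvoke(new Action<string>(AddToQueue), message);

        // Pacing the producer is what actually prevents the fire-hose effect.
        Thread.Sleep(50);
    }
}
```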
I have a simple app that reads a database and then, after some manipulation, writes the results to another one.
The first lines of code update the UI with a message for the user and an on-screen log; then everything is wrapped inside a try/catch construct with usings and other nested try/catch blocks.
message.AppendText("** Message for the user that appear only after the try block's execution **\n");
message.ScrollToEnd();

try
{
    using (SqlConnection...)
    {
        // business code
    }
}
catch
{
    // bbbb...
}
In the end it works, but the UI is only updated when everything finishes.
I can understand why what's inside the try must wait until the end, but why don't the first lines affect the UI until the end of the following block?
And how can I create a more responsive UI?
I first tried creating a thread for each connection (one has a timeout of 5 seconds), and one for the business code.
OK, it was overkill, but I was experimenting.
I had so many problems sharing the connections between threads and interacting with the main window's UI that I abandoned the idea and rewrote everything as described above.
People here have suggested creating a responsive UI. This is one way to do that. At the top of your code file, add:
using System.Threading;
Move all the stuff that takes a long time to a new method:
public void LoadStuff()
{
// Do some stuff that takes a while here
}
Replace the original stuff with this code:
Thread callThread = new Thread(new ThreadStart(LoadStuff));
callThread.Start();
Now, anytime you need to update your UI from LoadStuff you have to encapsulate it (surround it) with this code. The reason for this is that only the thread that created the UI can modify it. So we have to tell our new thread to refer back to the old thread to execute the code. Therefore, inside LoadStuff, after you compute a bunch of data, use this to update your UI:
this.Dispatcher.Invoke(new Action(() =>
{
// Code to update UI here
}));
As others have suggested, there are other ways to improve UI responsiveness, and I was not the first to suggest using a different thread to compute. But I just wanted to show you one way to do it.
In addition to moving long-running processes off of the UI thread, there are some UI tricks that you can do to help make user interaction feel a little better. For example, if an action takes more than about 0.1 seconds, try fading in a message (e.g. "Loading...") to let the user know that there is something happening. Once you get the data back, fade this message back out.
You may also want to try animating the UI update to avoid the "stuttering" sensation.
Suppose you are permanently invoking a method asynchronously onto the UI thread/dispatcher with
while (true) {
uiDispatcher.BeginInvoke(new Action<int, T>(insert_), DispatcherPriority.Normal, new object[] { });
}
On every run of the program you observe that the GUI of the application begins to freeze after about 90 seconds due to the flood of invocations (time varies but lies roughly between 1 and 2 minutes).
How could one exactly determine (measure?) the point at which this overloading occurs, in order to stop it early enough?
Appendix I:
In my actual program I don't have an infinite loop. I have an algorithm that iterates several hundred times before terminating. In every iteration I am adding a string to a List control in my WPF application. I used the while (true) { ... } construct because it best matches what happens. In fact the algorithm terminates correctly and all (hundreds of) strings are added correctly to my List, but after some time I lose the ability to use my GUI until the algorithm terminates - then the GUI is responsive again.
Appendix II:
The purpose of my program is to observe a particular algorithm while it's running. The strings I am adding are log entries: one log string per iteration. The reason I am invoking these add operations is that the algorithm is running on a different thread than the UI thread. To deal with the fact that I can't do UI manipulation from any thread other than the UI thread, I built some kind of ThreadSafeObservableCollection (but I am pretty sure that this code is not worth posting because it would detract from the actual problem, which I think is that the UI can't handle the repeated and rapid invocation of methods).
It's pretty straightforward: you are doing it wrong by the time you overload the user's eyeballs. Which happens pretty quickly as far as modern cpu cores are concerned, beyond 20 updates per second the displayed information just starts to look like a blur. Something the cinema takes advantage of, movies play back at 24 frames per second.
Updating any faster than that is just a waste of resources. You still have an enormous amount of breathing room left before the UI thread starts to buckle. It depends on the amount of work you ask it to do, but typical is a x50 safety margin. A simple timer based on Environment.TickCount will get the job done, fire an update when the difference is >= 45 msec.
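A sketch of that TickCount-based throttle, using the 45 ms threshold from above; lastUpdate would be a field, and uiDispatcher/AddLogLine are placeholder names for whatever dispatcher reference and UI update method you actually have:

```csharp
private int lastUpdate = Environment.TickCount;

private void ReportIteration(string logLine)
{
    // Only bother the dispatcher if at least ~45 ms have passed since the last update.
    if (Environment.TickCount - lastUpdate >= 45)
    {
        lastUpdate = Environment.TickCount;
        uiDispatcher.BeginInvoke(new Action(() => AddLogLine(logLine)));
    }
    // Otherwise skip (or buffer) this line; the eye could not follow it anyway.
}
```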
Posting that often to the UI is a red flag. Here is an alternative: Put new strings into a ConcurrentQueue and have a timer pull them out every 100ms.
Very simple and easy to implement, and the result is perfect.
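Sketched out (the control name logList and the 100 ms interval are illustrative, not from the question), that alternative could look roughly like this:

```csharp
using System;
using System.Collections.Concurrent;
using System.Windows.Controls;
using System.Windows.Threading;

public class LogPump
{
    private readonly ConcurrentQueue<string> pending = new ConcurrentQueue<string>();
    private readonly ListBox logList;   // the WPF list control from the question (placeholder)

    public LogPump(ListBox logList) { this.logList = logList; }

    // Worker thread: cheap, lock-free, never touches the UI.
    public void OnIterationCompleted(string logLine) => pending.Enqueue(logLine);

    // Call once from the UI thread: one tick every 100 ms drains whatever has accumulated.
    public void Start()
    {
        var timer = new DispatcherTimer { Interval = TimeSpan.FromMilliseconds(100) };
        timer.Tick += (s, e) =>
        {
            while (pending.TryDequeue(out var line))
                logList.Items.Add(line);
        };
        timer.Start();
    }
}
```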
I've not used WPF, just Windows Forms, but I would suggest that if there is a view-only control which needs to be updated asynchronously, the proper way to do it is to write the control so that its properties can be accessed freely from any thread, and so that updating a property will BeginInvoke the refresh routine only if there isn't already an update pending. The latter determination can be made with an Int32 "flag" and Interlocked.Exchange: the property setter calls Interlocked.Exchange on the flag after changing the underlying field; if the flag had been clear, it does a BeginInvoke on the refresh routine; the refresh routine then clears the flag and performs the refresh. In some cases, the pattern may be further enhanced by having the control's refresh routine check how much time has elapsed since the last time it ran and, if the answer is less than 20 ms or so, use a timer to trigger a refresh 20 ms after the previous one.
Even though .NET can handle having many BeginInvoke actions posted to the UI thread, it's often pointless to have more than one update for a single control pending at a time. Limit the pending actions to one (or at most a small number) per control, and there will be no danger of the queue overflowing.
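One possible Windows Forms reading of that pattern, as a sketch; the control and property names are made up for illustration, and it assumes the control's handle has already been created before SetStatus is called from another thread.

```csharp
using System.Threading;
using System.Windows.Forms;

public class StatusLabel : Label
{
    private string latestText = "";
    private int refreshPending;   // 0 = nothing queued, 1 = a refresh is already pending

    // Safe to call from any thread, any number of times per second.
    public void SetStatus(string text)
    {
        Interlocked.Exchange(ref latestText, text);

        // Queue a refresh only if one is not already pending; later writes are
        // picked up by that same refresh, so the UI message queue never piles up.
        if (Interlocked.Exchange(ref refreshPending, 1) == 0)
            BeginInvoke(new MethodInvoker(RefreshStatus));
    }

    // Runs on the UI thread.
    private void RefreshStatus()
    {
        // Clear the flag first, then read: a concurrent write either already landed
        // in latestText or will queue its own refresh.
        Interlocked.Exchange(ref refreshPending, 0);
        Text = latestText;
    }
}
```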
ok, sorry for the bad link before in the comments, but I kept reading and maybe this will be of help:
The DispatcherOperation object returned by BeginInvoke can be used in several ways to interact with the specified delegate, such as:
Changing the DispatcherPriority of the delegate as it is pending execution in the event queue.
Removing the delegate from the event queue.
Waiting for the delegate to return.
Obtaining the value that the delegate returns after it is executed.
If multiple BeginInvoke calls are made at the same DispatcherPriority, they will be executed in the order the calls were made.
If BeginInvoke is called on a Dispatcher which has shut down, the status property of the returned DispatcherOperation is set to Aborted.
Maybe you can do something with the number of delegates that you are waiting on...
To put supercat's solution in a more WPF-like way, try for an MVVM pattern; then you can have a separate view model class which you can share between threads, perhaps taking locks at appropriate points or using the concurrent collection classes. You implement an interface (I think it's INotifyPropertyChanged) and fire an event to say the collection has changed. This event must be fired from the UI thread, but it only needs to be raised occasionally rather than for every single update.
After going through the answers provided by others and your comments on them, your actual intent seems to be ensuring that the UI remains responsive. For that I think you have already received good proposals.
But still, to answer your question (how to detect and flag overloading of the UI thread) verbatim, I can suggest the following:
First, determine what the definition of 'overloading' should be (e.g. I can take it to mean 'the UI thread stops rendering the controls and stops processing user input' for a long enough duration).
Define this duration (e.g. if the UI thread continues to process render and input messages within at most 40 ms, I will say it is not overloaded).
Now create a DispatcherTimer with its DispatcherPriority set according to your definition of overloading (for my example it can be DispatcherPriority.Input or lower) and an Interval sufficiently shorter than your 'duration' for overloading.
Maintain a shared variable of type DateTime and on each tick of the timer change its value to DateTime.Now.
In the delegate you pass to BeginInvoke, you can compute the difference between the current time and the last time Tick fired. If it exceeds your 'measure' of overloading, then the UI thread is 'overloaded' according to your definition. You can then set a shared flag which can be checked from inside your loop to take appropriate action.
I admit it is not foolproof, but by empirically adjusting your 'measure' you should be able to detect overloading before it impacts you. A rough sketch follows.
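Here the 40 ms interval and 200 ms threshold are just the example numbers from the steps above, and the member names are made up for illustration; the heartbeat stops advancing exactly when the dispatcher can no longer service Input-priority work.

```csharp
using System;
using System.Threading;
using System.Windows.Threading;

public class HeartbeatMonitor
{
    private long lastHeartbeatTicks = DateTime.UtcNow.Ticks;

    // Call once on the UI thread. The timer only ticks while the dispatcher still
    // gets around to Input-priority work, so it acts as a liveness heartbeat.
    public void Start()
    {
        var heartbeat = new DispatcherTimer(DispatcherPriority.Input)
        {
            Interval = TimeSpan.FromMilliseconds(40)
        };
        heartbeat.Tick += (s, e) =>
            Interlocked.Exchange(ref lastHeartbeatTicks, DateTime.UtcNow.Ticks);
        heartbeat.Start();
    }

    // Call from the worker thread before each BeginInvoke.
    public bool UiLooksOverloaded()
    {
        var age = TimeSpan.FromTicks(DateTime.UtcNow.Ticks - Interlocked.Read(ref lastHeartbeatTicks));
        return age > TimeSpan.FromMilliseconds(200);   // if true: skip, buffer, or slow down the updates
    }
}
```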
Use a Stopwatch to measure the minimum, maximum, average, first and last update durations. (You can output this to your UI.)
Your update frequency must be less than 1/(the average update duration).
Change your algorithm's implementation so that its iterations are invoked by a multimedia timer, e.g. this .NET wrapper or this .NET wrapper. When the timer fires, use Interlocked to prevent starting a new iteration before the current iteration is complete. If you need to run iterations on the main thread, use a dispatcher. You can run more than one iteration per timer event; use a parameter for this and, together with the time measurements, determine how many iterations to run per timer event and how often you want the timer events.
I do not recommend using less than 5 ms for the timer, as the timer events will suffocate the CPU.
As I wrote earlier in my comment, use DispatcherPriority.Input when dispatching to the main thread; that way the UI's CPU time isn't suffocated by the dispatches. This is the same priority the UI messages have, so they are not ignored.
I have a WinForm drawing a chart from available data.
I programmed it so that every second the WinForms Timer.Tick event calls a function that:
will dequeue all data available
will add new points on the chart
Right now the data to be plotted is really huge and it takes a lot of time to process and to update my form. Also, the WinForms Timer.Tick relies on WM_TIMER, so it executes on the same thread as the Form.
These two things are making my form very unresponsive.
What can I do to solve this issue?
I thought the following:
move away from the WinForms Timer and start using a System.Threading.Timer
use the InvokeRequired pattern so I will rely on the .NET ThreadPool.
Since I have lots of data, is this a good idea?
I fear that at some point the ThreadPool queue will also grow too long or too big.
Can you give me your suggestion about my issue?
Thank you very much!
AFG
It is a good idea to move the fetching of the data to a thread. You can use a BackgroundWorker that gets the data in an endless loop and
use the ProgressChanged event (raised via ReportProgress) to update the chart. This takes care of the InvokeRequired business (see the sketch below).
Use a Sleep(remainingTime) inside the loop to get the desired frequency.
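A rough sketch of that approach; DequeueAllAvailablePoints and AddPointsToChart are placeholders for your existing dequeue logic and chart update, and the 1-second sleep mirrors the old timer interval.

```csharp
using System.ComponentModel;
using System.Threading;

var worker = new BackgroundWorker
{
    WorkerReportsProgress = true,
    WorkerSupportsCancellation = true
};

worker.DoWork += (s, e) =>
{
    // Background thread: fetch/dequeue data in a loop at roughly the old 1-second pace.
    while (!worker.CancellationPending)
    {
        var points = DequeueAllAvailablePoints();   // placeholder for your dequeue logic
        worker.ReportProgress(0, points);           // marshalled back to the UI thread
        Thread.Sleep(1000);
    }
};

worker.ProgressChanged += (s, e) =>
{
    // Raised on the UI thread, so it is safe to touch the chart control here.
    AddPointsToChart(e.UserState);                  // placeholder for your chart update
};

worker.RunWorkerAsync();
```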
It is quite unlikely you'll be ahead by using a background timer. Your chart control almost certainly requires it to be updated from the same thread it was created on. Any kind of control that has a visible appearance does. Which requires you to use Control.BeginInvoke in the Elapsed event handler so that the update code runs on the UI thread. Dequeueing data isn't likely to be expensive, so you will actually have made it slower by invoking. And still not have taken the pressure off the UI thread.
You'll also have a potentially serious throttling problem, the timer will keep on ticking and pump data, even if the UI thread can't keep up. That will eventually crash your program with OOM.
Consider instead to make the code that updates the chart smarter. A chart can only display details of the data if such details are at least a pixel wide. Realistically, it can only display 2000 pixels with useful information. That's not much, updating 2000 data points shouldn't cause any trouble.
I would go with a System.Timers.Timer over a BackgroundWorker in an endless loop.
The BackgroundWorker is executed from a ThreadPool and is not meant to run for the lifetime of your application.
Motivation for System.Timers.Timer:
Each Elapsed event is executed on a ThreadPool thread, so it won't hang your UI thread.
Using a combination of locks and enabling/disabling the timer we can get the same frequency as if we did a Thread.Sleep(xxx) in an endless loop.
Cleaner and more obvious as to what you are trying to achieve
Here's my suggestion:
Disabling the timer at the beginning of the method, then re-enabling it again at the end, caters for the case where the amount of work done in the Elapsed event takes longer than the timer interval. This also ensures the time between updates is consistent. I've added a lock as an extra precaution.
I used an anonymous method to update the UI thread, but you can obviously do that however you want, as long as you remember to Invoke; it's also a good idea to check the InvokeRequired property.
private readonly object chartUpdatingLock = new object();

private void UpdateChartTimerElapsed(object sender, ElapsedEventArgs e)
{
    // Try to get the lock; this caters for the case where two or more events fire
    // in quick succession.
    if (Monitor.TryEnter(chartUpdatingLock))
    {
        this.updateChartTimer.Enabled = false;
        try
        {
            // Dequeuing and whatever other work here...

            // Invoke the UI thread to update the control.
            this.myChartControl.Invoke(new MethodInvoker(delegate
            {
                // Do your UI work here.
            }));
        }
        finally
        {
            this.updateChartTimer.Enabled = true;
            Monitor.Exit(chartUpdatingLock);
        }
    }
}