Autocomplete textbox freezes while executing query. Must be a better way! - c#

Hi everyone! I searched the best I could and did not find exactly the help I was looking for.
Problem
AutoCompleteTextbox FREEZES and "eats" characters while query is performed
Asking for
Mimic Google Instant functionality
Background
First things first: C#, WPF, .NET 4.0
Ok, now that's out of the way, I'm trying to find the best way to implement a dynamic AutoComplete Textbox, which queries a database for results after each letter typed.
The following code gets executed when the AutoCompleteTextBox's TextChanged event is fired:
public void Execute(object sender, object parameter)
{
    // removed some unnecessary code for the sake of being concise
    var autoCompleteBox = sender as AutoCompleteTextBox;
    var e = parameter as SearchTextEventArgs;
    var result = SearchUnderlyings(e.SearchText);
    autoCompleteBox.ItemsSource = result;
}
Now, let's say that SearchUnderlyings(e.SearchText) takes an average of 600-1100ms - during that time, the textbox is frozen and it "eats" any keys pressed. This is an annoying problem I've been having. For some reason, the LINQ in SearchUnderlyings(e.SearchText) is running on the GUI thread. I tried delegating this to a background thread, but got the same result.
Ideally, I would like the textbox to work the way Google Instant does - but I don't want to be "killing" threads before the server/query can return a result.
Anyone have experience or can offer some guidance which will allow me to query as I type without freezing the GUI or killing the server?
Thank you guys!

This line:
var result = SearchUnderlyings(e.SearchText);
Runs synchronously, locking the UI thread. The way to cure this would be to switch to an asynchronous pattern, where you start the query, and then do something when it finishes.
This article demonstrates it pretty nicely, and shows some solutions - http://www.codeproject.com/KB/cs/AsyncMethodInvocation.aspx

What is probably killing you is setting the binding source over and over again (which is why running the query on a background thread doesn't make a difference).
You might consider the algorithm as a whole. Depending on your data, you could wait until the user enters the first three characters and then do one large query against the database. Bind the item source once. Each character typed afterwards just performs a filter against your data that is already cached on the client. That way you are not hitting the database over and over (which is going to be terribly expensive).
Or consider just bringing back three or so results from the DB to keep your service serialization time down.
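For the cache-and-filter approach, a minimal sketch could look like this (the Underlying type, its Name property, and the 3-character threshold are illustrative, not from the question):

private List<Underlying> _cache;
private string _cachedPrefix;

private void OnSearchTextChanged(string searchText)
{
    if (searchText.Length < 3)
        return;

    string prefix = searchText.Substring(0, 3);

    // Hit the database only when there is no cache yet or the 3-character prefix changed.
    if (_cache == null || !string.Equals(prefix, _cachedPrefix, StringComparison.OrdinalIgnoreCase))
    {
        _cache = SearchUnderlyings(prefix).ToList();
        _cachedPrefix = prefix;
    }

    // Every further keystroke filters the cached list in memory - no database round trip.
    autoCompleteBox.ItemsSource = _cache
        .Where(u => u.Name.StartsWith(searchText, StringComparison.OrdinalIgnoreCase))
        .ToList();
}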

So, we kind of hacked something quick. By making the calls to SearchUnderlyings(e.SearchText) asynchronous, my GUI thread is no longer blocked and the textbox is no longer "eating" key presses. By adding the lastQueryTag == _lastQuery check, we are trying to ensure some thread-safety, allowing only the most recent query to set the ItemsSource.
Perhaps not the most ideal or elegant solution. I am still open to further critiques and suggestions. Thank you!
private long _lastQuery = DateTime.Now.Ticks;

public void Execute(object sender, object parameter)
{
    var autoCompleteBox = sender as AutoCompleteTextBox;
    var e = parameter as SearchTextEventArgs;
    // removed unnecessary code for clarity
    long lastQueryTag = _lastQuery = DateTime.Now.Ticks;
    Task.Factory.StartNew(() =>
    {
        var result = SearchUnderlyings(e.SearchText);
        System.Windows.Application.Current.Dispatcher.BeginInvoke(new Action(() =>
        {
            if (lastQueryTag == _lastQuery)
                autoCompleteBox.ItemsSource = result;
        }));
    });
}
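If you want to go one step further, one possible refinement (sketched here, not tested against the real code) is to cancel the previous query instead of only discarding its result, using a CancellationTokenSource; note that SearchUnderlyings itself only stops early if it checks the token:

private CancellationTokenSource _cts;

public void Execute(object sender, object parameter)
{
    var autoCompleteBox = sender as AutoCompleteTextBox;
    var e = parameter as SearchTextEventArgs;

    // Cancel any query that is still in flight and start a new one.
    if (_cts != null)
        _cts.Cancel();
    _cts = new CancellationTokenSource();
    var token = _cts.Token;

    Task.Factory.StartNew(() =>
    {
        var result = SearchUnderlyings(e.SearchText);
        if (token.IsCancellationRequested)
            return; // a newer query superseded this one - drop the result

        System.Windows.Application.Current.Dispatcher.BeginInvoke(new Action(() =>
        {
            if (!token.IsCancellationRequested)
                autoCompleteBox.ItemsSource = result;
        }));
    }, token);
}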

Related

How do you synchronously wait for the results of a event in C#?

There seems to be a pattern in Windows applications in C# whereby you assign a 'handler' to an event, and this event is fired as a side effect of a separate method call. To give an example:
ocrEngine = new OcrEngine();
ocrEngine.OcrResults += new OcrResultsEventHandler(ocrEngine_MethodThatProcessesTheResultsOfOcr);
So, for example, you have an OCR (Optical Character Recognition) engine and you want to pass it an image and get back some text. However, in the API I am using, the method you pass the image to returns an int, i.e.
int result = ocrEngine.ReadImage(image);
This means I only discover whether the OCR process was successful via an int, i.e. 0 = success.
However the actual results are returned in the 'ocrEngine_MethodThatProcessesTheResultsOfOcr' method.
If I am running this from a console app, I am trying to understand the pattern I should be using to return the data, as effectively there is no obvious synchronous way of returning the actual result.
In fact there are clearly at least two threads running per method call to ocrEngine.ReadImage(image).
I have a work around involving Thread.Sleep and checking for a boolean, but this seems plain wrong.
Any guidance would be greatly appreciated.
It seems as though there is no perfect answer here, other than using an API that returns the results in a synchronous manner instead.
The best approach that has been suggested was as follows:
static AutoResetEvent autoResetEvent = new AutoResetEvent(false);

public static string Process(Bitmap image)
{
    // ...
    int result = ocrEngine.ReadImage(image);
    autoResetEvent.WaitOne(2500);
    // ...
}
then in the callback event call:
autoResetEvent.Set();
once the work has been done.
This approach produced more consistent results; however, it still occasionally caused timeout/deadlock issues, which may have been down to the API we were using.
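On .NET 4.0 the same idea can also be expressed with a TaskCompletionSource instead of a manual wait handle (a sketch only - the OcrResults event arguments and their Text property are assumptions, not taken from the real API):

static string Process(Bitmap image)
{
    var ocrEngine = new OcrEngine();
    var tcs = new TaskCompletionSource<string>();

    // Complete the task when the engine raises its results event.
    ocrEngine.OcrResults += (sender, args) => tcs.TrySetResult(args.Text);

    int result = ocrEngine.ReadImage(image);
    if (result != 0)
        throw new InvalidOperationException("OCR failed with code " + result);

    // Block until the event fires, or give up after 2.5 seconds.
    if (!tcs.Task.Wait(2500))
        throw new TimeoutException("OCR engine did not report results in time.");

    return tcs.Task.Result;
}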

Setup an event loop for a UWP app

How do I setup an event loop (main loop) in a UWP app?
My goal is to have an app that has a main page with a continuously updating calculation, that then continuously updates an image on the page. While that is constantly happening, the user can click a button to change the calculation behavior, or to go to a second page and edit related data.
The user can then click back to the main page and see the image and calculation restart and continuously update. (The calculation is complex, so it should go as fast as possible and use up as much app time as possible).
If there is a way to accomplish this without an event loop I would like to know that also, but so far I have not found a way.
With an event loop, I can simply update the calculation and UI every time through the loop. But I have not found any way to do so. This question asked nearly the same thing, but was never directly answered and had a different use case anyway.
The closest I have found to a solution is to grab the CoreDispatcher from the CoreWindow and use the RunIdleAsync() method to create a loop:
public MainPage()
{
    this.InitializeComponent();

    Windows.UI.Core.CoreWindow appwindow = Windows.UI.Core.CoreWindow.GetForCurrentThread();
    Windows.UI.Core.CoreDispatcher appdispatcher = appwindow.Dispatcher;

    // create a continuously running idle task (the app loop)
    appdispatcher.RunIdleAsync((dummyt) =>
    {
        // do the event loop here
        // ...

        if (appdispatcher.ShouldYield()) // necessary to prevent blocking the UI
        {
            appdispatcher.ProcessEvents(Windows.UI.Core.CoreProcessEventsOption.ProcessAllIfPresent);
        }
    });
}
The main problem with this is that you can't switch between pages (you get a system exception from dispatching events within an already dispatched event).
Second, this is very messy and requires maintaining extra state in the event loop. Besides, why should I have to go through these contortions just to have some calculations happening while the app is waiting for user input?
Is there a way to do this (besides switching to a C++ DirectX app)?
I don't know about setting up your own event loop, but there is no reason to do so.
What you are talking about sounds like a great case for Tasks. You would start a calculation Task whenever your user did something, having it report its progress via standard C# events if you need mid-operation updates. Those updates would modify properties in your view model which the binding system would then pick up.
You could also make your calculation code cancellable so changes can abort a previous calculation.
All of this involves pretty standard UWP concepts; no need for a special event loop. That you are even considering that makes me think you need to study MVVM and multi-threading/tasks; you are still thinking in a very "Win-Forms" kind of way.
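To make that concrete, here is a rough sketch of the shape such code could take (the view model properties, RunCalculation and its inner loop are all illustrative):

private CancellationTokenSource _cts;

private async void StartCalculation_Click(object sender, RoutedEventArgs e)
{
    // Cancel any calculation that is already running.
    if (_cts != null)
        _cts.Cancel();
    _cts = new CancellationTokenSource();

    // Progress<T> posts its callbacks back to the UI thread it was created on,
    // so view model properties can be updated directly in the handler.
    var progress = new Progress<double>(fraction => ViewModel.CompletedFraction = fraction);

    try
    {
        var result = await Task.Run(() => RunCalculation(_cts.Token, progress));
        ViewModel.ResultImage = result;
    }
    catch (OperationCanceledException)
    {
        // a newer calculation superseded this one
    }
}

private CalculationResult RunCalculation(CancellationToken token, IProgress<double> progress)
{
    const int iterations = 1000000;
    for (int i = 0; i < iterations; i++)
    {
        token.ThrowIfCancellationRequested();
        // ... one slice of the expensive work ...
        if (i % 1000 == 0)
            progress.Report((double)i / iterations);
    }
    return new CalculationResult();
}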
If we're talking about some event loop, or stream, .Net has a great library named Rx, or Reactive Extensions, which may be helpful for you. You can set up a simple flow, something like this:
var source = Observable
    // fire an event every second, on the UI thread
    .Interval(TimeSpan.FromSeconds(1), DispatcherScheduler.Current)
    // add a timestamp to each event
    .Timestamp()
    // gather the system information to calculate
    .Select(GetSystemInfo);
Note that the events right now are on the UI thread, as you need to access the controls. Now you have two options: use Rx for the background processing too, or use TPL Dataflow's TransformBlock for processing your system information into a new image (it can be an Observer and an Observable at the same time). After that you need to get back to the UI thread.
First option:
var source = Observable
    // fire an event every second, on the UI thread
    .Interval(TimeSpan.FromSeconds(1), DispatcherScheduler.Current)
    // add a timestamp to each event
    .Timestamp()
    // gather the system information to calculate
    .Select(GetSystemInfo)
    // expensive calculations are done in the background
    .ObserveOn(DefaultScheduler.Instance)
    .Select(x => Expensive(x))
    // marshal the results back to the UI thread
    .ObserveOn(DispatcherScheduler.Current)
    .Select(x => UpdateUI(x));
You probably should split this chain into several observers and observables, still the idea is the same, more information here: Rx Design Guidelines.
Second option:
var action = new TransformBlock<SystemInfo, ImageDelta>(CalculateDelta,
    new ExecutionDataflowBlockOptions
    {
        // we can process as many items at a time as there are processor cores
        MaxDegreeOfParallelism = Environment.ProcessorCount,
    });

IDisposable subscription = source.Subscribe(action.AsObserver());

var uiObserver = action.AsObservable()
    .ObserveOn(DispatcherScheduler.Current)
    .Select(x => UpdateUI(x));
I want to note that UWP and the MVVM pattern do give you the possibility of binding the UI to an ObservableCollection, which will help you notify the user in the most natural way.

Multi Threading with LINQ to SQL

I am writing a WinForms application. I am pulling data from my database, performing some actions on that data set and then plan to save it back to the database. I am using LINQ to SQL to perform the query to the database because I am only concerned with 1 table in our database so I didn't want to implement an entire ORM for this.
I have it pulling the dataset from the DB. However, the dataset is rather large. So currently what I am trying to do is separate the dataset into 4 relatively equal sized lists (List<object>).
Then I have a separate background worker to run through each of those lists, perform the action and report its progress while doing so. I have it planned to consolidate those sections into one big list once all 4 background workers have finished processing their section.
But I keep getting an error while the background workers are processing their unique list. Do the objects maintain their tie to the DataContext for the LINQ to SQL even though they have been converted to List objects? Any ideas how to fix this? I have minimal experience with multi-threading so if I am going at this completely wrong, please tell me.
Thanks guys. If you need any code snippets or any other information just ask.
Edit: Oops. I completely forgot to give the error message. In the DataContext designer.cs it gives the error "An item with the same key has already been added." in the SendPropertyChanging function.
private List<MyObject> quarter1; // one of the four chunks; the other three are set up the same way

private void Setup()
{
    quarter1 = _listFromDB.Take(5000).ToList();
    bgw1.RunWorkerAsync();
}

private void bgw1_DoWork(object sender, DoWorkEventArgs e)
{
    e.Result = functionToExecute(bgw1, quarter1);
}

private List<MyObject> functionToExecute(BackgroundWorker caller, List<MyObject> myList)
{
    int progress = 0;
    foreach (MyObject obj in myList)
    {
        string newString = createString();
        obj.strText = newString;
        // report progress here
        caller.ReportProgress(progress++);
    }
    return myList;
}
This same function is called by all four workers and is given a different list for myList depending on which worker called the function.
Because a real answer has yet to be posted, I'll give it a shot.
Given that you haven't shown any LINQ-to-SQL code (no usage of DataContext) - I'll take an educated guess that the DataContext is shared between the threads, for example:
using (MyDataContext context = new MyDataContext())
{
    // just some example query - no ToList(), so execution is deferred
    // and listFromDB is an IQueryable<>
    var listFromDB = context.SomeTable.Where(st => st.Something == true);

    System.Threading.Tasks.Task.Factory.StartNew(() =>
    {
        var list1 = listFromDB.Take(5000).ToList(); // runs the SQL query
        // call some function on list1
    });

    System.Threading.Tasks.Task.Factory.StartNew(() =>
    {
        var list2 = listFromDB.Take(5000).ToList(); // runs the SQL query
        // call some function on list2
    });
}
Now the error you got - An item with the same key has already been added. - was because the DataContext object is not thread safe! A lot of stuff happens in the background - DataContext has to load objects from SQL, track their states, etc. This background work is what throws the error (because each thread is running the query, the DataContext gets accessed).
At least, this is my own personal experience, having come across the same error while sharing a DataContext between multiple threads. You only have two options in this scenario:
1) Before starting the threads, call .ToList() on the query, making listFromDB not an IQueryable<>, but an actual List<>. This means that the query has already ran and the threads operate on an actual List, not on the DataContext.
2) Move the DataContext definition into each thread. Because the DataContext is no longer shared, no more errors.
The third option would be to re-write the scenario into something else, like you did (for example, make everything sequential on a single background thread)...
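A sketch of option 2, with each task owning its own DataContext (the table, column and ProcessChunk names are made up for illustration):

int chunkIndex = 0; // 0..3, one value per task
System.Threading.Tasks.Task.Factory.StartNew(() =>
{
    // Each thread gets its own DataContext, so nothing is shared between threads.
    using (var context = new MyDataContext())
    {
        var chunk = context.SomeTable
                           .Where(st => st.Something == true)
                           .OrderBy(st => st.Id)   // a stable order is required before Skip/Take
                           .Skip(chunkIndex * 5000)
                           .Take(5000)
                           .ToList();              // the query runs here, on this thread

        ProcessChunk(chunk);                        // per-chunk work

        context.SubmitChanges();                    // save from the same context that loaded the rows
    }
});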
First of all, I don't really see why you'd need multiple worker threads at all. (Are these lists in separate databases / tables / servers? Do you really want to show 4 progress bars if you have 4 lists, or are you somehow merging these progress reports into one weird progress bar? :D)
Also, you're trying to speed up processing updates to your database, but you never send LINQ to SQL any SAVES, so you're not really batching transactions; you'll just save everything at the end in one big transaction. Is that really what you're aiming for? The progress bar will just stop at 100% and then spend a lot of time on the SQL side.
Just create one background thread and process everything sequentially, but batch a save transaction every so many rows (I'd suggest something like every 1000 rows, but you should experiment with this). It'll be fast, even with millions of rows.
If you really need this multithreaded solution:
The "another blabla with the same key has been added" error suggests that you are adding the same item to multiple "mylists", or adding the same item to the same list twice, otherwise how would there be any errors at all?
Using Parallel LINQ (PLINQ), you can take advantage of multiple CPU cores for processing your data. But if your application is going to run on a single-core CPU, then splitting the data into pieces won't give you any performance benefit; instead it will incur some context-switching overhead.
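For example, once the rows have been materialized with ToList(), the CPU-bound part could be parallelized roughly like this (a sketch reusing the createString/strText names from the question); the attached entities are only touched afterwards, on a single thread, so the DataContext's change tracking is not hit concurrently:

// Compute the expensive part in parallel...
var newTexts = _listFromDB
    .AsParallel()
    .WithDegreeOfParallelism(Environment.ProcessorCount)
    .Select(obj => new { Entity = obj, Text = createString() })
    .ToList();

// ...but apply it to the LINQ to SQL entities serially.
foreach (var item in newTexts)
    item.Entity.strText = item.Text;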
Hope it Helps

BackgroundWorker with SqlConnection

I know I'm having a massive derp moment here and this is probably quite easy to actually do - I have had a search around and read a few articles but i'm still struggling a little, so any feedback or pointers to useful resources would be greatly appreciated!
Anyway I have a class called PopulateDatagridViews which I have various functions in, one of which is called ExecuteSqlStatement, this function is simple enough, it initializes an SQL connection and returns a DataTable populated with the results of the SQL query. Within the same class I also have various functions that use string builders to build up SQL statements. (Not ideal, I know.)
I create a PopulateDatagridViews object in my GUI thread and use it to set various datagrid views with with the returned DataTables. For example:
dataGridViewVar.DataSource = populateDgv.GetCustomers();
Naturally, a problem I'm having is that the more data there is to read from the database, the longer the UI is unresponsive. I would like to shift the process of retrieving data via the PopulateDatagridViews to a separate thread or BackgroundWorker so as to prevent the main GUI thread from locking up while this is processed.
I realise I can create a BackgroundWorker to do this and place in the DoWork handler a call to the appropriate function within my PopulateDatagridViews.
I figure I could create a BackgroundWorker for each individual function inside my PopulateDatagridViews class, but surely there is a more efficient way to do this? I'd very much appreciate a point in the right direction on this as it's driving me around the bend!
Additional Info: I use version 4.0 of the .Net framework.
I strongly suggest that you use TPL (Task Parallel Library) http://msdn.microsoft.com/en-us/library/dd537609.aspx
In your case you will create a first task to pull some data and then start a second task, after the first has completed, to update the UI.
I'll try to find code that I wrote for a similar problem.
Edit: Adding code
Task<return_type> t1 = new Task<return_type>(() =>
{
    // do something to produce a result
    return some_result; // return it
});
t1.Start();

// ContinueWith guarantees that t2 starts AFTER t1 has executed!
Task t2 = t1.ContinueWith(previousTask =>
{
    // update your GUI here
    // if you need the result from the previous task: previousTask.Result (your DataSet or whatever)
}, TaskScheduler.FromCurrentSynchronizationContext());
// VERY important - you must update the GUI from the same thread that created it!
// (you will get a cross-thread exception if you don't add TaskScheduler.FromCurrentSynchronizationContext())
Hope it helps.
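To avoid writing one worker (or task pair) per function, the same pattern can be wrapped in a small helper - a sketch only, with made-up names:

private void RunQueryAsync(Func<DataTable> query, Action<DataTable> bindResult)
{
    Task<DataTable> worker = Task.Factory.StartNew(query);

    // bindResult runs on the UI thread thanks to FromCurrentSynchronizationContext.
    worker.ContinueWith(t => bindResult(t.Result),
        TaskScheduler.FromCurrentSynchronizationContext());
}

// usage - each grid gets its own one-liner, but the threading code is written once
RunQueryAsync(() => populateDgv.GetCustomers(),
              table => dataGridViewVar.DataSource = table);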
Well in that case I recommend reading this msdn article to get some ideas. Afterwards you should look for some tutorials, because the msdn is not the best source to learn things. ;o)

How to do the processing and keep GUI refreshed using databinding?

History of the problem
This is a continuation of my previous question
How to start a thread to keep GUI refreshed?
but since Jon shed new light on the problem, I would have to completely rewrite the original question, which would make that topic unreadable. So, a new, very specific question.
The problem
Two pieces:
CPU hungry heavy-weight processing as a library (back-end)
WPF GUI with databinding which serves as monitor for the processing (front-end)
Current situation -- the library sends so many notifications about data changes that, despite running on its own thread, it completely jams the WPF data binding mechanism; as a result, not only does monitoring the data not work (it is not refreshed), but the entire GUI freezes while the data is being processed.
The aim -- a well-designed, polished way to keep the GUI up to date. I am not saying it should display the data immediately (it can even skip some changes), but it cannot freeze while doing the computation.
Example
This is simplified example, but it shows the problem.
XAML part:
<StackPanel Orientation="Vertical">
    <Button Click="Button_Click">Start</Button>
    <TextBlock Text="{Binding Path=Counter}"/>
</StackPanel>
C# part (please NOTE this is one piece of code, but there are two sections to it):
public partial class MainWindow : Window, INotifyPropertyChanged
{
    // GUI part
    public MainWindow()
    {
        InitializeComponent();
        DataContext = this;
    }

    private void Button_Click(object sender, RoutedEventArgs e)
    {
        var thread = new Thread(doProcessing);
        thread.IsBackground = true;
        thread.Start();
    }

    // this is the non-GUI part -- do not mess with the GUI here
    public event PropertyChangedEventHandler PropertyChanged;

    public void OnPropertyChanged(string property_name)
    {
        if (PropertyChanged != null)
            PropertyChanged(this, new PropertyChangedEventArgs(property_name));
    }

    long counter;
    public long Counter
    {
        get { return counter; }
        set
        {
            if (counter != value)
            {
                counter = value;
                OnPropertyChanged("Counter");
            }
        }
    }

    void doProcessing()
    {
        var tmp = 10000.0;
        for (Counter = 0; Counter < 10000000; ++Counter)
        {
            if (Counter % 2 == 0)
                tmp = Math.Sqrt(tmp);
            else
                tmp = Math.Pow(tmp, 2.0);
        }
    }
}
Known workarounds
(Please do not repost them as answers)
I sorted the list according how much I like the workaround, i.e. how much work it requires, limitations of it, etc.
this is mine; it is ugly, but its simplicity kills -- before sending a notification, pause the thread -- Thread.Sleep(1) -- to let the potential receiver "breathe". It works and it is minimalistic, but it is ugly, and it ALWAYS slows down the computation even if no GUI is there
based on Jon's idea -- give up on data binding COMPLETELY (one widget with databinding is enough to cause jamming), and instead check the data from time to time and update the GUI manually -- well, I didn't learn WPF just to give up on it now ;-)
Thomas's idea -- insert a proxy between the library and the frontend which would receive all notifications from the library and pass only some of them on to WPF, for example once every second -- the downside is that you have to duplicate all objects that send notifications
based on Jon's idea -- pass the GUI dispatcher to the library and use it for sending notifications -- why is it ugly? Because there might be no GUI at all
My current "solution" is adding a Sleep in the main loop. The slowdown is negligible, but it is enough for WPF to be refreshed (so it is even better than sleeping before each notification).
I am all ears for real solutions, not some tricks.
Remarks
Remark on giving up on databinding -- for me its design is broken: in WPF you have a single channel of communication, and you cannot bind directly to the source of the change. The databinding filters the source based on a name (a string!). This requires some computation even if you use some clever structure to keep all the strings.
Edit: Remark on abstractions -- call me an old timer, but I started learning computing convinced that computers should help humans. Repetitive tasks are the domain of computers, not humans. No matter what you call it -- MVVM, abstractions, interfaces, single inheritance -- if you write the same code over and over and have no way to automate what you do, you are using a broken tool. So, for example, lambdas are great (less work for me) but single inheritance is not (more work for me); data binding (as an idea) is great (less work), but the need for a proxy layer for EVERY library I bind to is a broken idea because it requires a lot of work.
In my WPF applications I don't send the property change directly from the model to the GUI. It always goes via a proxy (ViewModel).
The property change events are put in a queue which is read from the GUI thread on a timer.
I don't understand how that can be so much more work. You just need another listener for your model's PropertyChanged event.
Create a ViewModel class with a "Model" property which is your current datacontext. Change the databindings to "Model.Property" and add some code to hook up the events.
It looks something like this:
public MyModel Model { get; private set; }

private DispatcherTimer _timer;

public MyViewModel()
{
    Model = new MyModel();
    Model.PropertyChanged += (s, e) => SomethingChangedInModel(e.PropertyName);

    _timer = new DispatcherTimer { Interval = TimeSpan.FromMilliseconds(100) }; // interval is arbitrary; tune to taste
    _timer.Tick += TimerCallback;
}

private HashSet<string> _propertyChanges = new HashSet<string>();

public void SomethingChangedInModel(string propertyName)
{
    lock (_propertyChanges)
    {
        if (_propertyChanges.Count == 0)
            _timer.Start();
        _propertyChanges.Add(propertyName ?? "");
    }
}

// this is connected to the DispatcherTimer
private void TimerCallback(object sender, EventArgs e)
{
    List<string> changes = null;
    lock (_propertyChanges)
    {
        _timer.Stop(); // doing this in the callback is safe and disables the timer
        if (!_propertyChanges.Contains(""))
            changes = new List<string>(_propertyChanges);
        _propertyChanges.Clear();
    }
    if (changes == null)
        OnPropertyChanged(null); // null/empty means "everything changed"
    else
        foreach (string property in changes)
            OnPropertyChanged(property);
}
This isn't really a WPF issue per se. When you have a long-running operation that updates a set of data rapidly, keeping the UI updated - any UI, whether it's WPF or WinForms or just VT100 emulation - is going to present the same problem. UI updates are comparatively slow and complex, and integrating them with a fast-changing complex process without hurting that process requires a clean separation between the two.
That clean separation is even more important in WPF because the UI and the long-running operation need to run on separate threads so that the UI doesn't freeze while the operation is running.
How do you achieve that clean separation? By implementing them independently, providing a mechanism for periodically updating the UI from within the long-running process, and then testing everything to figure out how frequently that mechanism should be invoked.
In WPF, you'll have three components: 1) a view, which is the physical model of your UI, 2) a view model, which is the logical model of the data that is displayed in the UI, and that pushes changes in the data out to the UI through change notification, and 3) your long-running process.
The long-running process can be almost completely unaware of the UI, so long as it does two things. It needs to expose public properties and/or methods so that the view model can examine its state, and it needs to raise an event whenever the UI should be updated.
The view model listens to that event. When the event is raised, it copies state information from the process to its data model, and its built-in change notification pushes those out to the UI.
Multithreading complicates this, but only a bit. The process needs to run on a different thread than the UI, and when its progress-reporting event is handled, its data will be copied across threads.
Once you've built these three pieces, the multithreading is very straightforward to accomplish using the BackgroundWorker class. You create the object that's going to run the process, wire its progress-reporting event up to the BackgroundWorker's ReportProgress method (handling the marshalled updates in the ProgressChanged event), and copy data from the object's properties to the view model in that event handler. Then fire off the object's long-running method in the BackgroundWorker's DoWork event handler and you're good to go.
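A bare-bones sketch of that wiring (the processor class, its ProgressAvailable event and the snapshot/view model types are all illustrative):

var processor = new LongRunningProcessor();
var worker = new BackgroundWorker { WorkerReportsProgress = true };

// The engine raises its event on the worker thread; translate it into ReportProgress.
processor.ProgressAvailable += (s, e) =>
    worker.ReportProgress(0, processor.SnapshotState());

// ProgressChanged is raised on the UI thread, so the snapshot can be
// copied into the view model here without any extra marshalling.
worker.ProgressChanged += (s, e) =>
    viewModel.Update((ProcessorState)e.UserState);

worker.DoWork += (s, e) => processor.Run();
worker.RunWorkerAsync();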
A user interface that changes faster than the human eye can observe (~25 updates/sec) is not a usable user interface. A typical user will observe the spectacle for at most a minute before giving up completely. You are well past this if you made the UI thread freeze.
You have to design for a human, not a machine.
Since there are too many notifications for the UI to handle, why not just throttle the notifications a bit? This seems to work fine:
if (value % 500 == 0)
OnPropertyChanged("Counter");
You could also limit the frequency of the notifications, using a timer:
public SO4522583()
{
    InitializeComponent();
    _timer = new DispatcherTimer();
    _timer.Interval = TimeSpan.FromMilliseconds(50);
    _timer.Tick += new EventHandler(_timer_Tick);
    _timer.Start();
    DataContext = this;
}

private bool _notified = false;
private DispatcherTimer _timer;

void _timer_Tick(object sender, EventArgs e)
{
    _notified = false;
}

...

long counter;
public long Counter
{
    get { return counter; }
    set
    {
        if (counter != value)
        {
            counter = value;
            if (!_notified)
            {
                _notified = true;
                OnPropertyChanged("Counter");
            }
        }
    }
}
EDIT: if you cannot afford to skip notifications because they're used by other parts of your code, here's a solution that doesn't require big changes in your code:
create a new property UICounter, which throttles the notifications as shown above
in the Counter setter, update UICounter
in your UI, bind to UICounter rather than Counter (a sketch follows below)
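A minimal sketch of that UICounter wrapper, reusing the _notified flag and DispatcherTimer from the snippet above:

long counter;
public long Counter
{
    get { return counter; }
    set
    {
        if (counter != value)
        {
            counter = value;
            UICounter = value; // only the UI-facing property throttles notifications
        }
    }
}

long uiCounter;
public long UICounter
{
    get { return uiCounter; }
    set
    {
        uiCounter = value;
        if (!_notified) // _notified is reset by the DispatcherTimer tick
        {
            _notified = true;
            OnPropertyChanged("UICounter");
        }
    }
}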
A layer between the UI and the library is necessary. This will ensure that you are able to do interaction testing and also lets you swap out the library for another implementation in the future without much change. This isn't duplication, but a way of providing an interface for the UI layer to communicate with. This layer will accept objects from the library, convert them to specific data transfer objects, and pass them on to another layer which will have the responsibility of throttling the updates and converting them to your specific VM objects.
My opinion is that VMs should be as dumb as possible and their only responsibility should be to provide data to views.
Your question sounds similar to slow-down-refresh-rate-of-bound-datagrid.
At least the answers are similar
Have a shadow copy of your data bound to the GUI element instead of binding the original data.
Add an event handler that updates the shadow copy from the original data with a certain delay.
You need to disconnect the source of the notifications from the target for the notifications. The way you have it set up now, every time the value changes, you go through an entire refresh cycle (which I believe is blocking your processing function from continuing as well). This is not what you want.
Provide an Output stream to your processing function which it would use to write its notifications.
On the monitoring side, attach an input stream to that output stream and use it as the data source for your UI component. This way there isn't any notification event handling going on at all - the processing runs flat out as fast as it can, outputting monitor data to the output stream you provide. Your monitor UI simply renders whatever it receives on the input stream.
You will need a thread to continuously read from the input stream. If no data is available, then it should block. If it reads some data, it should dump it into the UI.
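One concrete way to realise that stream on .NET is a BlockingCollection shared between the producer and a reader task (a sketch only; the sample type and batch size are arbitrary):

// Shared channel between the processing thread and the monitoring thread.
var updates = new BlockingCollection<long>(boundedCapacity: 1000);

// Producer: the processing loop only writes samples and never touches the UI.
Task.Factory.StartNew(() =>
{
    for (long i = 0; i < 10000000; ++i)
    {
        // ... heavy computation ...
        if (i % 1000 == 0)
            updates.Add(i); // blocks only if the monitor falls 1000 samples behind
    }
    updates.CompleteAdding();
});

// Consumer: blocks while the channel is empty, pushes each sample to the UI.
Task.Factory.StartNew(() =>
{
    foreach (long sample in updates.GetConsumingEnumerable())
    {
        var value = sample;
        System.Windows.Application.Current.Dispatcher.BeginInvoke(
            new Action(() => Counter = value));
    }
});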
Regards,
Rodney
