Best practice for queueing events in WPF - C#

I have a WPF (MVVM) project where I have multiple view-models, each with a button that launches different analyses on the same data source, which in this case is a file. The file cannot be shared, so if the buttons are pressed near the same time the second call will fail.
I need a way to queue the button clicks so that each analysis runs sequentially, but I can't seem to get it to work. I tried using a static Semaphore, SemaphoreSlim and Mutex, but they appear to stop everything (the Wait() call blocks the currently running analysis as well). I tried a lock() block with a static object, but it didn't seem to block either event (I still get the file-share error). I also tried a thread pool (with a maximum concurrent thread count of 1), but it gives threading errors when updating the UI (this may be solvable with Invoke() calls).
My question is what might be considered best practice in this situation with WPF?
EDIT: I created a mockup which exhibits the problem I'm having. It is at http://1drv.ms/1s4oQ1T.

What you need here is an asynchronous queue, so that you can enqueue these tasks without actually having anything blocking your threads. SemaphoreSlim actually has a WaitAsync method that makes creating such a queue rather simple:
public class TaskQueue
{
    private readonly SemaphoreSlim semaphore;

    public TaskQueue()
    {
        semaphore = new SemaphoreSlim(1);
    }

    public async Task<T> Enqueue<T>(Func<Task<T>> taskGenerator)
    {
        await semaphore.WaitAsync();
        try
        {
            return await taskGenerator();
        }
        finally
        {
            semaphore.Release();
        }
    }

    public async Task Enqueue(Func<Task> taskGenerator)
    {
        await semaphore.WaitAsync();
        try
        {
            await taskGenerator();
        }
        finally
        {
            semaphore.Release();
        }
    }
}
This allows you to enqueue operations that will all be executed sequentially rather than in parallel, without blocking any threads at any time. The operations can also be any type of asynchronous operation, whether that is CPU-bound work on another thread, IO-bound work, etc.
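As a usage sketch (not from the original answer - the view-model and method names here are invented for illustration), each button handler would funnel its work through one shared queue:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Same TaskQueue idea as above, repeated in compact form so this
// sketch compiles on its own.
public class TaskQueue
{
    private readonly SemaphoreSlim semaphore = new SemaphoreSlim(1);

    public async Task Enqueue(Func<Task> taskGenerator)
    {
        await semaphore.WaitAsync();
        try { await taskGenerator(); }
        finally { semaphore.Release(); }
    }
}

// Hypothetical wiring: SharedQueue, OnAnalyzeClicked and
// RunAnalysisAsync are illustrative names, not from the question.
public class AnalysisViewModel
{
    // One queue shared by all view-models.
    private static readonly TaskQueue SharedQueue = new TaskQueue();

    // Called from the button's ICommand handler.
    public Task OnAnalyzeClicked()
    {
        // Clicks are queued; analyses run strictly one at a time,
        // so the shared file is never opened twice concurrently.
        return SharedQueue.Enqueue(RunAnalysisAsync);
    }

    private Task RunAnalysisAsync()
    {
        // Placeholder for the real file-based analysis.
        return Task.Delay(10);
    }
}
```

Because Enqueue awaits rather than blocks, the UI thread stays responsive while each queued analysis waits its turn.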

I would do two things to solve this problem:
First, encapsulate the analysis operations in a command pattern. If you aren't familiar with it, the simplest implementation is an interface with a single function Execute. When you want to perform an analysis operation, just create one of these. You could also use the built-in ICommand interface to help, but be aware that this interface has more to it than the generic command pattern.
Of course, creation is only half the battle, so after doing so I would add it to a BlockingCollection. This collection is .NET's solution to the Producer-Consumer problem. Have a background thread that consumes this collection (executing the command objects contained within) using a foreach on the collection's GetConsumingEnumerable method and your buttons will "feed" it.
foreach (var item in bc.GetConsumingEnumerable())
{
    item.Execute();
}
MSDN for Blocking Collection: http://msdn.microsoft.com/en-us/library/dd267312(v=vs.110).aspx
Now, all the semaphores, waits, etc. are done for you, and you can just add an operation to the queue (if it needs to be a queue, consider using ConcurrentQueue as the backing collection for BlockingCollection) and return on the UI thread. The background thread will pick the task up and run it.
You will need to Invoke any UI updates from the background thread of course, no getting around that issue :).
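Putting those two pieces together, a minimal sketch might look like the following; IAnalysisCommand, AnalysisRunner and the wiring are illustrative, not code from the answer:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Simplest form of the command pattern: one Execute method.
public interface IAnalysisCommand
{
    void Execute();
}

public class AnalysisRunner
{
    private readonly BlockingCollection<IAnalysisCommand> _commands =
        new BlockingCollection<IAnalysisCommand>();

    public AnalysisRunner()
    {
        // Single consumer: commands execute one at a time, in FIFO order.
        Task.Factory.StartNew(() =>
        {
            foreach (var command in _commands.GetConsumingEnumerable())
            {
                command.Execute();
            }
        }, TaskCreationOptions.LongRunning);
    }

    // Called from the UI thread by each button; returns immediately.
    public void Enqueue(IAnalysisCommand command) => _commands.Add(command);

    // Call on shutdown so the consumer loop ends.
    public void Shutdown() => _commands.CompleteAdding();
}
```

The buttons "feed" the runner via Enqueue and return at once; the background consumer drains the collection sequentially, so the file is never contended.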

I'd recommend a queue inside a scheduling object shared by the view-models, with a consumer task that waits for items to be added to the queue. When a button is pressed, the view-model adds a work item to the queue. The consumer task takes one item from the queue at a time and performs the analysis it describes; it then checks the queue for another item, waiting for more work items to be added if none are pending.

Related

C# await on a List<T> Count

I am upgrading some legacy WinForms code and I am trying to figure out what the "right way" as of .NET 4.6.1 to refactor the following.
The current code is doing a tight while(true) loop while checking a bool property. This property puts a lock() on a generic List<T> and then returns true if it has no items (list.Count == 0).
The loop has the dreaded Application.DoEvents() in it to make sure the message pump continues processing, otherwise it would lock up the application.
Clearly, this needs to go.
My confusion is how to start a basic refactoring where it still can check the queue length, while executing on a thread and not blowing out the CPU for no reason. A delay between checks here is fine, even a "long" one like 100ms+.
I was going to go with an approach that makes the method async and lets a Task run to do the check:
await Task.Run(() => KeepCheckingTheQueue());
Of course, this keeps me in the situation of the method needing to ... loop to check the state of the queue.
Between the waiting, awaiting, and various other methods that can be used to move this stuff to the thread pool ... any suggestion on how best to handle this?
What I need is how best to "poll" a boolean member (or property) while freeing the UI, without the DoEvents().
The answer you're asking for:
private async Task WaitUntilAsync(Func<bool> func)
{
    while (!func())
        await Task.Delay(100);
}

await WaitUntilAsync(() => list.Count == 0);
However, polling like this is a really poor approach. If you can describe the actual problem your code is solving, then you can get better solutions.
For example, if the list represents some queue of work, and your code is wanting to asynchronously wait until it's done, then this can be better coded using an explicit signal (e.g., TaskCompletionSource<T>) or a true producer/consumer queue (e.g., TPL Dataflow).
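As a sketch of the explicit-signal idea (the class and member names are invented for illustration, and re-arming the signal after it fires is omitted for brevity), the worker completes a TaskCompletionSource when the list drains, and the UI awaits it instead of polling:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Illustrative: a work list that signals emptiness rather than being polled.
public class SignalingWorkList
{
    private readonly List<object> _items = new List<object>();
    private readonly TaskCompletionSource<bool> _emptySignal =
        new TaskCompletionSource<bool>(TaskCreationOptions.RunContinuationsAsynchronously);
    private readonly object _gate = new object();

    public void Add(object work)
    {
        lock (_gate) { _items.Add(work); }
    }

    // The worker calls this after finishing each item.
    public void ItemCompleted(object work)
    {
        lock (_gate)
        {
            _items.Remove(work);
            if (_items.Count == 0)
                _emptySignal.TrySetResult(true); // wake any awaiter
        }
    }

    // UI code awaits this instead of spinning on list.Count == 0.
    public Task WhenEmpty()
    {
        lock (_gate)
        {
            return _items.Count == 0 ? Task.CompletedTask : _emptySignal.Task;
        }
    }
}
```

The locking lives inside the class, so callers never touch the list or a lock directly, and no thread burns CPU waiting.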
It's generally a bad idea for client code to worry about locking a collection (or to sprinkle lock() blocks everywhere) before querying it. It's best to encapsulate that complexity away.
Instead, I recommend using one of the .NET concurrent collections, such as ConcurrentBag. There is then no need to create a Task, which is somewhat expensive.
If your collection does not change much you might want to consider one of the immutable thread-safe collections such as ImmutableList<>.
EDIT: Upon reading your comments, I suggest you use a WinForms Timer, the Application.Idle event, or a BackgroundWorker. The problem with the async approach is that you still need to call it periodically. Using a timer or an application-idle callback offers the benefit of staying on the GUI thread.
Depending on the use case, you could start a background thread or a background worker. Or maybe even a timer.
Those are executed on a different thread, and therefore do not block the execution of your other form-related code. Invoke back onto the original thread if you have to perform actions on the UI.
I would also recommend avoiding unnecessary locking as much as possible, for example by doing a check before actually taking the lock:
if (list.Count == 0)
{
    lock (lockObject)
    {
        if (list.Count == 0)
        {
            // execute your code here
        }
    }
}
That way you are only locking if you really need to and you avoid unnecessary blocking of your application.
I think what you're after here is the ability to await Task.Yield().
class TheThing
{
    private readonly List<int> _myList = new List<int>();

    public async Task WaitForItToNotBeEmpty()
    {
        bool hadItems;
        do
        {
            await Task.Yield();
            lock (_myList) // Other answers have touched upon this locking concern
                hadItems = _myList.Count != 0;
        } while (!hadItems);
    }

    // ...
}

How to make thread safe event handler

In my application I have a queue which fires notifications whenever there are any changes to it. Sometimes, when there are simultaneous operations on the queue, the event handler fires multiple times. That's okay, but what I don't want is,...
Below is the code for the event handler:
private async void NotificationQueue_Changed(object sender, EventArgs e)
{
    if (!IsQueueInProcess)
        await ProcessQueue();
}
In the ProcessQueue method I set IsQueueInProcess to true, and when it completes it is set back to false. Now, the problem is that whenever multiple event notifications fire simultaneously, multiple ProcessQueue executions start, which I don't want. I want to make sure that there is only one execution of ProcessQueue at any given time.
Given your statement that this event is raised whenever there are any changes to the queue, and that the queue can be used concurrently (i.e. there are multiple producers adding things to it), it seems likely that the best way to address this would be to abandon the event-based behavior altogether. Instead, use BlockingCollection<T> with a thread dedicated to processing the queue via GetConsumingEnumerable(). That method will block the thread as long as the queue is empty, and will allow the thread to remove and process items any time any other thread adds something. The collection itself is thread-safe, so using it you would not require any additional thread synchronization for the handling of the queue itself. (It's possible that processing an item involves thread interactions, but there's nothing in your question describing that aspect, so I can't say one way or the other.)
That said, taking the question literally, the simplest approach would be to include a semaphore:
private readonly SemaphoreSlim _semaphore = new SemaphoreSlim(1);

private async void NotificationQueue_Changed(object sender, EventArgs e)
{
    if (_semaphore.Wait(0))
    {
        await ProcessQueue();
        _semaphore.Release();
    }
}
The above attempts to acquire the semaphore's lock. With a timeout of 0 milliseconds, it will return immediately even if the semaphore could not be acquired. The return value indicates whether the semaphore was successfully acquired or not.
In this way, as long as there is no outstanding queue-processing operation, the current event handler invocation can acquire the semaphore and will call the ProcessQueue() method. When that operation completes, the continuation will release the semaphore. Until that happens, no other invocation of the event handler will be able to acquire the semaphore, and thus will not initiate processing of the queue.
I'll note that nothing here guarantees a solution to threads racing with each other in a way that would ensure the queue is always either empty or has some processing operation acting on it. That's up to you: ensure that the ProcessQueue() method has the synchronization needed to guarantee that if any thread has modified the queue and caused this event to be raised, that thread will not fail to initiate another round of processing should the current round be unable to observe the change.
Or put another way, you need to make sure that for any thread that is going to raise that event, either its change to the queue will be observed by the current processing operation, or that thread will initiate a new one.
There's not enough context in your question for anyone to be able to address that concern specifically. I will just point out that it's a common enough thing for someone to overlook when trying to implement this sort of system. IMHO, all the more reason to just have a dedicated thread using BlockingCollection<T> to consume elements added to the queue. :)
See also the related question How to avoid reentrancy with async void event handlers?. This is a slightly different question, in that the accepted answer causes each invocation of the event handler to result in the operation initiated by the event handler. Your scenario is simpler, since you simply want to skip initiation of a new operation, but you may still find some useful insight there.
I agree with Peter that abandoning event-based notifications is the best solution, and that you should move to a producer/consumer queue. However, I recommend one of the TPL Dataflow blocks instead of BlockingCollection<T>.
In particular, ActionBlock<T> should work quite nicely:
private readonly ActionBlock<T> notificationQueue = new ActionBlock<T>(async t =>
{
    await ProcessQueueItem(t);
});
By default, TPL Dataflow blocks have a concurrency limit of 1.
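A self-contained sketch of that approach might look like the following (the item type and names are illustrative, and it assumes the System.Threading.Tasks.Dataflow package is referenced):

```csharp
using System;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;

// Illustrative: a queue processor built on ActionBlock.
public class NotificationProcessor
{
    private readonly ActionBlock<string> _queue;

    public NotificationProcessor(Func<string, Task> processItem)
    {
        // MaxDegreeOfParallelism defaults to 1, so items are
        // processed one at a time, in the order they were posted.
        _queue = new ActionBlock<string>(processItem);
    }

    // Producers (e.g. the queue-changed event) just post items.
    public void Post(string item) => _queue.Post(item);

    // Call when done producing; the returned task completes
    // once all posted items have been processed.
    public Task ShutdownAsync()
    {
        _queue.Complete();
        return _queue.Completion;
    }
}
```

No semaphore or IsQueueInProcess flag is needed: the block itself serializes the processing.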

Queue function calls to write char for char

What should it do
I'm trying to write some text char for char into a TextBlock. I'm using this code for this:
void WriteTextCharForChar(String text)
{
    new Thread(() =>
    {
        foreach (Char c in text)
        {
            TxtDisplayAppendText(c.ToString());
            Thread.Sleep(rnd.Next(20, 100));
        }
        Thread.CurrentThread.Abort();
    }).Start();
}
The problem
The problem is that the text, of course, gets mixed up when calling this method more than once. What I need is some kind of queue, or a way to wait until the current text has been written to the TextBlock.
Of course I'm open to any other solution to get this working. Thank you!
So there are several options here.
You could add a lock around the work that the thread does, so that there is never more than one running at a time.
This has several problems though:
The items aren't necessarily processed in order
You're creating lots of threads, all of which are spending almost all of their time waiting; this is very wasteful of resources.
You could create a thread safe queue (BlockingCollection would be best) and then have a single thread reading from it and writing out the results while the UI thread just adds to the queue
This also has problems. Most notably you're creating a new thread that's going to be spending basically all of its time waiting around. This is probably better than #1, but not by a lot.
You could avoid using multiple threads entirely and do everything asynchronously. The Task Parallel Library gives you a lot of tools to help with this. This is the best option as it results in the creation of 0 extra threads.
So first we'll create a helper method that will handling the writing of the text itself, so that another thread can be the one to handle "scheduling" these calls. This method will be much easier to write using await. A key point to note is that, rather than blocking the current thread for an unknown period of time, it will use Task.Delay to continue execution at a point in the future without blocking the thread:
private async Task WriteText(string text)
{
    foreach (char c in text)
    {
        TxtDisplayAppendText(c.ToString());
        await Task.Delay(rnd.Next(20, 100));
    }
}
Now for your method. We can manage our queue through what is in effect a linked list of tasks. If we keep a single field of type Task representing the "previous task", each method call can add a continuation to that task and then set itself as the previous task. The next call will set itself as the continuation of that, and so on. Each continuation fires when the previous task finishes, or runs immediately if the previous task has already finished, so this effectively gives us our "queue". Note that since WriteTextCharForChar is being called from the UI thread, these calls are already synchronized, so there's no need to lock around the manipulation of this task.
private Task previousWrite = Task.FromResult(false); // already-completed task

private void WriteTextCharForChar(String text)
{
    previousWrite = previousWrite.ContinueWith(t => WriteText(text))
                                 .Unwrap();
}

C# - Queue management that always runs and dequeues

I need to build a process that listens for new tasks over WCF (async).
Every task gets enqueued (somehow).
What is the best way (both logically and for performance) to loop over the queue and dequeue items?
I thought about:
while (true)
{
    queue.Dequeue();
}
I assume that there are better ways to do that.
Thanks
Have a look at the System.Collections.Concurrent namespace - there is a thread-safe queue implementation, ConcurrentQueue, although I suspect your needs would be better served by BlockingCollection.
A blocking collection is essentially a thread-safe collection useful for producer-consumer scenarios. In your case, the WCF calls act as producers that add to the collection, while a worker thread acts as the consumer that takes queued tasks from it. By using a single consumer (and collection), you can ensure order of execution. If that's not important, you may be able to use multiple consumer threads. (There are also AddToAny and TakeFromAny static overloads that let you use multiple collections - multiple queues - if that is the need.)
The advantage over the while(true) approach is that you avoid a tight loop that just consumes CPU cycles. Apart from being thread-safe, this also solves the issue of synchronization between the queueing and de-queueing threads.
EDIT:
A blocking collection is really very simple to use. See the simple example below: AddTask would be invoked from, say, your WCF methods to queue up tasks, while StartConsumer would be called during service start-up.
public class MyTask { ... }

private BlockingCollection<MyTask> _tasks = new BlockingCollection<MyTask>();

private void AddTask(MyTask task)
{
    _tasks.Add(task);
}

private void StartConsumer()
{
    // I have used the Task API here, but you can just as well launch a new thread instead
    Task.Factory.StartNew(() =>
    {
        while (!_tasks.IsCompleted)
        {
            var task = _tasks.Take();
            ProcessTask(task);
        }
    });
}
While stopping the service, invoke _tasks.CompleteAdding() so that the consumer thread breaks out of the loop. (Note that a blocked Take() throws InvalidOperationException once the collection is marked complete, so GetConsumingEnumerable() is often the more convenient consumer loop.)
Find more examples on MSDN:
http://msdn.microsoft.com/en-us/library/dd997306.aspx
http://msdn.microsoft.com/en-us/library/dd460690.aspx
http://msdn.microsoft.com/en-us/library/dd460684.aspx
Instead of the infinite loop, I would use events to synchronize the queue. Whenever the WCF call is made, add an element to the queue and send out a "AnElementHasBeenAddedEvent".
The Thread executing the queued tasks listens for that event and whenever it receives it, the queue will be emptied.
Make sure there is only one thread that does this job!
Advantages over the while(true) concept: You do not have a thread that constantly loops through the endless loop and thus eats resources. You only do as much work as needed.
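A minimal sketch of that event-driven consumer, assuming a ConcurrentQueue plus an AutoResetEvent as the "AnElementHasBeenAddedEvent" (names are illustrative):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// Illustrative: producers enqueue and signal; one consumer thread drains
// the queue each time the signal fires, instead of spinning in a loop.
public class SignaledQueue
{
    private readonly ConcurrentQueue<Action> _queue = new ConcurrentQueue<Action>();
    private readonly AutoResetEvent _itemAdded = new AutoResetEvent(false);
    private volatile bool _running = true;

    public SignaledQueue()
    {
        new Thread(ConsumeLoop) { IsBackground = true }.Start();
    }

    // Called from the WCF operations (any thread).
    public void Enqueue(Action work)
    {
        _queue.Enqueue(work);
        _itemAdded.Set(); // the "AnElementHasBeenAddedEvent"
    }

    public void Stop()
    {
        _running = false;
        _itemAdded.Set(); // wake the consumer so it can exit
    }

    private void ConsumeLoop()
    {
        while (_running)
        {
            _itemAdded.WaitOne(); // sleeps; no CPU burned while idle
            while (_queue.TryDequeue(out var work))
                work();
        }
    }
}
```

The single consumer thread preserves ordering and does no work at all between signals, which is exactly the advantage over the endless loop described above.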

Awaiting the generation of multiple items in a single thread

(Note: English is not my native language, and I understand that the title of this question is far from good, but I tried my best to make the question itself clear.)
Let's suppose I have an IEnumerable<T> ts that has MANY items and for which each MoveNext() is VERY expensive - let's say ts was generated using a yield return method that makes expensive computations.
Look at this piece of code:
var enumerator = ts.GetEnumerator();
while (true)
{
    T t = await TaskEx.Run<T>(() => enumerator.MoveNext() ?
                                        enumerator.Current :
                                        null);
    if (t == null) break;
    t.DoSomeLightweigthOperation();
}
It consumes the collection ts asynchronously, without blocking the main thread (which, in my case is a UI thread). The problem is: it spawns one thread for each item in ts. Is there a (simple) way to do the same thing using only ONE thread that does the entire job?
So, just to make myself clear, I want to generate these items using only one thread, but I need to get some kind of collections of Tasks (or any other class with a GetAwaiter) so that I can await after the generation of each of these items.
Your existing solution does not spawn one thread for each item. Rather, it creates a thread pool work item that is queued to the thread pool, once for each item. It is possible (even likely) that each MoveNext will actually be done by the same threadpool thread.
So, I think your existing solution will work, given your constraints.
If you can change the enumerable at all, I'd consider IAsyncEnumerator<T>, which has a Task<bool> MoveNext() member. IAsyncEnumerator<T>/IAsyncEnumerable<T> are part of Ix_Experimental-Async. I also wrote a nearly-identical IAsyncEnumerator<T>, which is part of Nito.AsyncEx. Asynchronous enumeration is a closer match to what you're trying to do; there's a Channel9 video that describes asynchronous enumeration (though the API has changed slightly since then).
If all you want to do is run through ts on another thread you could do something like this:
ThreadPool.QueueUserWorkItem(_ =>
{
    foreach (T t in ts)
    {
        t.DoSomeLightweigthOperation();
    }
});
Which just runs through it on a thread from the threadpool.
UPDATE: So the operation does UI work, which means it needs to be queued on the UI thread. You don't say which UI framework you use, but all of them have ways of doing that.
For instance, if you are using WPF you can use the Dispatcher.
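For example, a sketch assuming WPF (ts and DoSomeLightweigthOperation are from the question; Dispatcher.Invoke marshals each operation onto the UI thread):

```csharp
// Fragment only: enumerate on a threadpool thread, but run each
// lightweight operation back on the WPF UI thread.
ThreadPool.QueueUserWorkItem(_ =>
{
    foreach (T t in ts)
    {
        Application.Current.Dispatcher.Invoke(() =>
        {
            t.DoSomeLightweigthOperation(); // now safe to touch the UI
        });
    }
});
```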
