Exit a loop if another thread enters the method - C#

I have a multi-threading issue.
I have a method that is called to refresh several items.
In this method, I iterate over a list of items and refresh one of their properties.
The list has a lot of elements and we have to do some math to compute each property.
The current code for this operation looks like this:
public void AddItemsWithLayoutRefresh(IEnumerable<MyItem> items)
{
    _control.Invoke(() =>
    {
        AddItems(items);
        for (int i = 0; i < _guiItems.Count; i++)
        {
            // The goal is to have a condition here to break the loop
            // and let the next call to RefreshLayout proceed.
            _guiItems[i].Propriety = ComputePropriety(_guiItems[i]);
        }
    });
}
The problem is that I may have four calls, which are currently just blocking on the Invoke.
I have to finish the AddItems method, but everything inside the for loop can be aborted without any issue, as long as I know it will be executed again just afterwards.
But how do I do this in a thread-safe way?
If I add a private bool _isNewRefreshHere;, set it to true before entering the Invoke, and then check it inside the Invoke, I have no guarantee that two calls haven't already reached the Invoke BEFORE I check the flag in the for loop.
So how can I break out of my loop when a new call is made to my method?
Solution
Based on Andrej Mohar's answer, I did the following:
private long m_refreshQueryCount;

public void AddItemsWithLayoutRefresh(IEnumerable<MyItem> items)
{
    Interlocked.Increment(ref m_refreshQueryCount);
    _control.Invoke(() =>
    {
        Interlocked.Decrement(ref m_refreshQueryCount);
        AddItems(items);
        for (int i = 0; i < _guiItems.Count; i++)
        {
            // A newer refresh request is pending: stop early and let it redo the work.
            if (Interlocked.Read(ref m_refreshQueryCount) > 0)
            {
                break;
            }
            _guiItems[i].Propriety = ComputePropriety(_guiItems[i]);
        }
    });
}
This seems to work very nicely.

If I were you, I'd try to make a thread-safe waiting counter. You can use the Interlocked methods Increment and Decrement. These increment or decrement a value as an atomic operation, which is thread-safe. Increment the variable before the Invoke call; this lets you know how many threads are in the waiting queue. Decrement it after the for loop finishes, just before the end of the Invoke block. Inside the for statement, check the number of waiting threads and break if it is greater than 1 (the running call itself counts as one). This way you know exactly how many threads are in the execution chain.
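A minimal sketch of that counting approach (my reading of the answer; the names follow the question's code, and since the decrement happens only after the loop, the running call counts itself and the check is against 1):

private long m_refreshQueryCount;

public void AddItemsWithLayoutRefresh(IEnumerable<MyItem> items)
{
    Interlocked.Increment(ref m_refreshQueryCount);
    _control.Invoke(() =>
    {
        AddItems(items);
        for (int i = 0; i < _guiItems.Count; i++)
        {
            // More than one call in flight: a newer refresh is waiting, so stop early.
            if (Interlocked.Read(ref m_refreshQueryCount) > 1)
            {
                break;
            }
            _guiItems[i].Propriety = ComputePropriety(_guiItems[i]);
        }
        Interlocked.Decrement(ref m_refreshQueryCount);
    });
}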

I would do it in the following way:
private readonly object _refresherLock = new object();
private bool _isNewRefreshHere = false;
private AutoResetEvent _refresher = new AutoResetEvent(true);

public void AddItemsWithLayoutRefresh(IEnumerable<MyItem> items)
{
    lock (_refresherLock)
    {
        if (_isNewRefreshHere)
        {
            return;
        }
        _isNewRefreshHere = true;
    }
    _refresher.WaitOne();
    _isNewRefreshHere = false;
    _control.Invoke(() =>
    {
        AddItems(items);
        for (int i = 0; i < _guiItems.Count && !_isNewRefreshHere; i++)
        {
            _guiItems[i].Propriety = ComputePropriety(_guiItems[i]);
        }
        _refresher.Set();
    });
}
That is:
You can always cancel the current update with a new one.
You cannot queue up more than one update at a time.
You are guaranteed to have no cross-threading conflicts.
You should test that code since I did not. :)

Related

Sequentially call asynchronous methods in a loop

I'm working with some code that has an approach I've never dealt with before, and I am hoping someone might be able to provide a spark of knowledge.
Effectively the class is set up like this:
void Loop()
{
    for (int i = 0; i < 100; i++)
    {
        // need to block calls to this until UpdateComplete is hit
        Update(i);
    }
}

void Update(int i)
{
    // asynchronous - returns immediately
}

void UpdateComplete() // called from another thread
{
    // unblock further calls to Update();
}
The big caveat is that it's not possible to call Update() before UpdateComplete() has 'returned' or been called back.
Ignoring any UI side effects, is there a neat solution to this problem?
I've currently got two strategies - one hacky and one I feel is over-complicated:
1 - Hacky: set a class-level boolean IsBlocked that Update() sets to true and UpdateComplete() sets to false, and inside my for loop (in Loop()) just put a while (IsBlocked) {}.
2 - Over-complicated: get rid of the loop altogether. Call Update(0); from inside Loop() instead, and have UpdateComplete() call Update(1);, Update(2);, and so on.
I guess I'm hoping for something like a Thread.Pause that can then be remotely 'resumed' by another thread (the separate one that calls UpdateComplete()).
Any ideas appreciated!
Async/await can also be used here
TaskCompletionSource<object> tcs = null;

async void Loop()
{
    for (int i = 0; i < 100; i++)
    {
        tcs = new TaskCompletionSource<object>();
        Update(i);
        await tcs.Task;
    }
}

void Update(int i)
{
    // asynchronous - returns immediately
}

void UpdateComplete() // called from another thread
{
    // unblock further calls to Update()
    tcs.TrySetResult(null);
}
Post the updates into the thread pool using QueueUserWorkItem.
Keep a counter of the number of posted items.
Have the main thread wait on an AutoResetEvent.
When each update finishes, decrement the count under a lock.
When the count hits 0, set the AutoResetEvent to wake the main thread.
I didn't post any code because I assumed you can google how to do each bit; if not, comment and I can add some.
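A minimal sketch of those steps might look like the following (my own wiring, not the answerer's: Update and UpdateComplete follow the question, and the count of 100 matches its loop):

private int pendingUpdates;                          // counter of posted items
private readonly object countLock = new object();
private readonly AutoResetEvent allDone = new AutoResetEvent(false);

void Loop()
{
    pendingUpdates = 100;
    for (int i = 0; i < 100; i++)
    {
        int n = i;                                   // capture the loop variable
        ThreadPool.QueueUserWorkItem(_ => Update(n));
    }
    allDone.WaitOne();                               // main thread waits here
}

void Update(int i)
{
    // ... do the actual work ...
    UpdateComplete();
}

void UpdateComplete()                                // runs on a pool thread
{
    lock (countLock)                                 // decrement the count under lock
    {
        pendingUpdates--;
        if (pendingUpdates == 0)
            allDone.Set();                           // wake the main thread
    }
}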

Is there such a synchronization tool as "single-item-sized async task buffer"?

Many times in UI development I handle events in such a way that when an event first comes in, I immediately start processing it; but if a processing operation is already in progress, I wait for it to complete before I process another event. If more than one event occurs before the operation completes, I only process the most recent one.
The way I typically do that: my process method has a loop, and in my event handler I check a field that indicates whether I am currently processing something. If I am, I put my current event arguments in another field that is basically a one-item-sized buffer, and when the current processing pass completes, I check whether there is another event to process, looping until I am done.
Now this seems a bit too repetitive and possibly not the most elegant way to do it, though it otherwise works fine for me. I have two questions, then:
Does what I need to do have a name?
Is there some reusable synchronization type out there that could do that for me?
I'm thinking of adding something to the set of async coordination primitives by Stephen Toub that I included in my toolkit.
So first, we'll handle the case that you described in which the method is always used from the UI thread, or some other synchronization context. The Run method can itself be async to handle all of the marshaling through the synchronization context for us.
If we're running we just set the next stored action. If we're not, then we indicate that we're now running, await the action, and then continue to await the next action until there is no next action. We ensure that whenever we're done we indicate that we're done running:
public class EventThrottler
{
    private Func<Task> next = null;
    private bool isRunning = false;

    public async void Run(Func<Task> action)
    {
        if (isRunning)
            next = action;
        else
        {
            isRunning = true;
            try
            {
                await action();
                while (next != null)
                {
                    var nextCopy = next;
                    next = null;
                    await nextCopy();
                }
            }
            finally
            {
                isRunning = false;
            }
        }
    }

    private static Lazy<EventThrottler> defaultInstance =
        new Lazy<EventThrottler>(() => new EventThrottler());
    public static EventThrottler Default
    {
        get { return defaultInstance.Value; }
    }
}
Because the class is, at least generally, going to be used exclusively from the UI thread, there will usually need to be only one, so I added a convenience property for a default instance; but since it may still make sense for there to be more than one in a program, I didn't make it a singleton.
Run accepts a Func<Task> with the idea that it would generally be an async lambda. It might look like:
public class Foo
{
    public void SomeEventHandler(object sender, EventArgs args)
    {
        EventThrottler.Default.Run(async () =>
        {
            await Task.Delay(1000);
            // do other stuff
        });
    }
}
Okay, so, just to be verbose, here is a version that handles the case where the event handlers are called from different threads. I know you said that you assume they're all called from the UI thread, but I generalized it a bit. This means locking over all access to instance fields of the type in a lock block, but not actually executing the function inside of a lock block. That last part is important, not just for performance (to ensure we're not blocking items from just setting the next field), but also to avoid issues with that action also calling Run, so that it doesn't need to deal with re-entrancy issues or potential deadlocks. This pattern, of doing stuff in a lock block and then responding based on conditions determined in the lock, means setting local variables to indicate what should be done after the lock ends.
public class EventThrottlerMultiThreaded
{
    private object key = new object();
    private Func<Task> next = null;
    private bool isRunning = false;

    public void Run(Func<Task> action)
    {
        bool shouldStartRunning = false;
        lock (key)
        {
            if (isRunning)
                next = action;
            else
            {
                isRunning = true;
                shouldStartRunning = true;
            }
        }

        Action<Task> continuation = null;
        continuation = task =>
        {
            Func<Task> nextCopy = null;
            lock (key)
            {
                if (next != null)
                {
                    nextCopy = next;
                    next = null;
                }
                else
                {
                    isRunning = false;
                }
            }
            if (nextCopy != null)
                nextCopy().ContinueWith(continuation);
        };

        if (shouldStartRunning)
            action().ContinueWith(continuation);
    }
}
Does what I need to do have a name?
What you're describing sounds a bit like a trampoline combined with a collapsing queue. A trampoline is basically a loop that iteratively invokes thunk-returning functions. An example is the CurrentThreadScheduler in the Reactive Extensions. When an item is scheduled on a CurrentThreadScheduler, the work item is added to the scheduler's thread-local queue, after which one of the following things will happen:
If the trampoline is already running (i.e., the current thread is already processing the thread-local queue), then the Schedule() call returns immediately.
If the trampoline is not running (i.e., no work items are queued/running on the current thread), then the current thread begins processing the items in the thread-local queue until it is empty, at which point the call to Schedule() returns.
A collapsing queue accumulates items to be processed, with the added twist that if an equivalent item is already in the queue, then that item is simply replaced with the newer item (resulting in only the most recent of the equivalent items remaining in the queue, as opposed to both). The idea is to avoid processing stale/obsolete events. Consider a consumer of market data (e.g., stock ticks). If you receive several updates for a frequently traded security, then each update renders the earlier updates obsolete. There is likely no point in processing earlier ticks for the same security if a more recent tick has already arrived. Thus, a collapsing queue is appropriate.
In your scenario, you essentially have a trampoline processing a collapsing queue for which all incoming events are considered equivalent. This results in an effective maximum queue size of 1, as every item added to a non-empty queue will result in the existing item being evicted.
Is there some reusable synchronization type out there that could do that for me?
I do not know of an existing solution that would serve your needs, but you could certainly create a generalized trampoline or event loop capable of supporting pluggable scheduling strategies. The default strategy could use a standard queue, while other strategies might use a priority queue or a collapsing queue.
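To make that concrete, here is a rough sketch (my own, not an existing library type) of a single-item collapsing buffer; a trampoline-style loop would repeatedly TryTake from it until it is empty:

// A collapsing "queue" with an effective maximum size of 1:
// offering a new item evicts the pending one.
public class CollapsingSlot<T>
{
    private readonly object gate = new object();
    private bool hasItem;
    private T item;

    public void Offer(T newItem)
    {
        lock (gate)
        {
            item = newItem;     // replace any stale pending item
            hasItem = true;
        }
    }

    public bool TryTake(out T taken)
    {
        lock (gate)
        {
            taken = item;
            bool had = hasItem;
            item = default(T);
            hasItem = false;
            return had;
        }
    }
}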
What you're describing sounds very similar to how TPL Dataflow's BroadcastBlock behaves: it always remembers only the last item that you sent to it. If you combine it with an ActionBlock that executes your action and has capacity only for the item currently being processed, you get what you want (the method needs a better name):
// returns a send delegate
private static Action<T> CreateProcessor<T>(Action<T> executedAction)
{
    var broadcastBlock = new BroadcastBlock<T>(null);
    var actionBlock = new ActionBlock<T>(
        executedAction,
        new ExecutionDataflowBlockOptions { BoundedCapacity = 1 });
    broadcastBlock.LinkTo(actionBlock);
    return item => broadcastBlock.Post(item);
}
Usage could be something like this:
var processor = CreateProcessor<int>(i =>
{
    Console.WriteLine(i);
    Thread.Sleep(i);
});

processor(100);
processor(1);
processor(2);
Output:
100
2

Dispatcher Invoke(...) vs BeginInvoke(...) confusion

I'm confused about why I can't make this test counter application work with two (or more) simultaneously running counter textboxes when I use BeginInvoke on my Dispatcher in the Count() method.
You can solve the issue by replacing the BeginInvoke with an Invoke, but that doesn't resolve my confusion.
Here's the sample code I'm talking about:
public class CounterTextBox : TextBox
{
    private int _number;

    public void Start()
    {
        (new Action(Count)).BeginInvoke(null, null);
    }

    private void Count()
    {
        while (true)
        {
            if (_number++ > 10000) _number = 0;
            this.Dispatcher.BeginInvoke(new Action(UpdateText),
                System.Windows.Threading.DispatcherPriority.Background, null);
        }
    }

    private void UpdateText()
    {
        this.Text = "" + _number;
    }
}
When you use Dispatcher.BeginInvoke, it schedules the given action for execution on the UI thread at a later point in time and then returns control so the current thread can continue executing. Invoke blocks the caller until the scheduled action finishes.
When you use BeginInvoke, your loop is going to run super fast since BeginInvoke returns right away. This means you're adding lots and lots of actions to the message queue, much faster than they can actually be processed, so there's a long time between when you schedule a message and when it actually gets a chance to run.
The actual action that you're running uses the field _number, but _number is being modified by the other thread very quickly while the action sits in the queue. This means that it won't display the value of _number at the time you scheduled the action, but rather whatever it has become as the very tight loop has continued on.
If you use Dispatcher.Invoke instead, it prevents the loop from "getting ahead of itself" and having multiple scheduled events, which ensures that the value it's writing is always the "current" value. Additionally, by forcing each iteration of the loop to wait for the message to be run, it makes the loop a lot less "tight", so it can't run as quickly in general.
If you want to use BeginInvoke, the first thing you really need to do is slow down your loop. If you want it to update the text every second, or every 10 ms, or whatever, then you can use Thread.Sleep to wait the appropriate amount of time.
Next, you need to take a copy of _number before passing it to the Dispatcher so that it displays the value at the time you scheduled it, not the value at the time it is executed:
while (true)
{
    if (_number++ > 10000)
        _number = 0;

    int copy = _number;
    this.Dispatcher.BeginInvoke(new Action(() => UpdateText(copy)),
        System.Windows.Threading.DispatcherPriority.Background, null);

    Thread.Sleep(200);
}

private void UpdateText(int number)
{
    this.Text = number.ToString();
}

C# BackgroundWorker

I have a button that, on its click event, gets some information from the network.
When I get the information, I parse it and add items to a ListBox. All is fine, but when I double-click the button quickly, it seems that two background workers run, and after all the work finishes the items in the list are duplicated.
I want it so that if you click the button while the process of getting information is already running, this thread stops, and only after the first work is completed does the second one begin.
Yes, I know about AutoResetEvent, but when I used it, it helped me only once and never again. I can't get this situation working and hope that you can help me!
Now I've tried to make it even simpler, but with no success: I added a flag field (RefreshDialogs, default false). When the user clicks the button, if the flag is true (meaning work is in progress), nothing happens; when the flag is false, all is fine and we start a new process.
When the BackgroundWorker completes, I change the flag back to false (meaning the user can start a new process).
private void Message_Refresh_Click(object sender, EventArgs e)
{
    if (!RefreshDialogs)
    {
        RefreshDialogs = true;
        if (threadBackgroundDialogs.WorkerSupportsCancellation)
        {
            threadBackgroundDialogs.CancelAsync();
        }
        if (!threadBackgroundDialogs.IsBusy)
        {
            downloadedDialogs = 0;
            threadBackgroundDialogs = new BackgroundWorker();
            threadBackgroundDialogs.WorkerSupportsCancellation = true;
            threadBackgroundDialogs.DoWork += LoadDialogs;
            threadBackgroundDialogs.RunWorkerCompleted += ProcessCompleted;
            threadBackgroundDialogs.RunWorkerAsync();
        }
    }
}

void ProcessCompleted(object sender, RunWorkerCompletedEventArgs e)
{
    RefreshDialogs = false;
}
So you want to keep the second process waiting while the first works, so they don't disturb each other, and after the first one finishes the second one continues?
Crude way: a while loop. This:

if (!RefreshDialogs)
{
    RefreshDialogs = true;

becomes:

while (RefreshDialogs)
{
}
RefreshDialogs = true;

After you set it to false, the second process will jump out of the while. (Note this is extremely inefficient, since both processes will be running all the time; I'm pretty sure the second one will block the first one, but with multitasking it shouldn't. If it blocks, use a Dispatcher thread.)
Elegant way: use a Semaphore:
http://msdn.microsoft.com/de-de/library/system.threading.semaphore%28v=vs.80%29.aspx
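A rough sketch of the semaphore idea (my own example, using SemaphoreSlim with a single slot so that refreshes are serialized; the linked page shows the classic Semaphore class, which works the same way here):

private static readonly SemaphoreSlim _refreshGate = new SemaphoreSlim(1, 1);

private void RefreshDialogsWork()
{
    _refreshGate.Wait();        // a second caller blocks here until the first is done
    try
    {
        // ... get the information from the network and fill the ListBox ...
    }
    finally
    {
        _refreshGate.Release(); // let the next waiting caller proceed
    }
}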
If you find it impossible to have both processes running at the same time, or want another way:
Add an Array/List/int, and when the second process notices that the first process is running (as with your bool), increase the variable; at the end of the process, restart the new process and decrease the variable:

int number;

if (!RefreshDialogs)
{
    RefreshDialogs = true;
    // your code
    if (number > 0)
    {
        number--;
        // restart the process
    }
}
else
{
    number++;
}

I have to admit I like my last proposal the most, since it's highly efficient.
Make your thread blocking. That is easy:

lock (someSharedGlobalObject)
{
    // do work, exit early if cancelled
}

This way other threads will wait until the first thread releases the lock. They will never execute simultaneously and will silently wait until they can continue.
As for other options: why not disable the button when clicked and re-enable it when the BackgroundWorker completes? The only problem is that this does not allow cancelling the current thread; the user has to wait for it to finish. It does make any concurrency go away very easily.
How about this approach?
Create a request queue or counter that is incremented on every button click. Whenever the count is > 0, start the background worker. When the information comes in, decrement the count and check for 0; if it's still > 0, restart the worker. That way your request handling becomes sequential.
In this approach you may face the problem of two threads referencing the count at the same time; for that you can use a lock.
I have followed this approach in my app and it works well; I hope it does the same for you.
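A small sketch of that counter idea (my own illustration; RunWorker is a hypothetical helper that creates and starts the BackgroundWorker as in the question):

private int _pendingRequests;
private readonly object _countLock = new object();

private void Message_Refresh_Click(object sender, EventArgs e)
{
    lock (_countLock)
    {
        _pendingRequests++;
        if (_pendingRequests == 1)
            RunWorker();              // no worker running yet, so start one
    }
}

void ProcessCompleted(object sender, RunWorkerCompletedEventArgs e)
{
    lock (_countLock)
    {
        _pendingRequests--;
        if (_pendingRequests > 0)     // more clicks arrived in the meantime
            RunWorker();              // handle them one at a time
    }
}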
I'm not a Windows Phone expert, but as I see it has support for the TPL, so the following code would read nicely:
private object syncRoot = new object();
private Task latestTask;

public void EnqueueAction(System.Action action)
{
    lock (syncRoot)
    {
        if (latestTask == null)
            latestTask = Task.Factory.StartNew(action);
        else
            latestTask = latestTask.ContinueWith(tsk => action());
    }
}
You can use semaphores:
class TheClass
{
    static SemaphoreSlim _sem = new SemaphoreSlim(3);

    static void Main()
    {
        for (int i = 1; i <= 5; i++)
            new Thread(Enter).Start(i);
    }

    static void Enter(object name)
    {
        Console.WriteLine(name + " wants to enter");
        _sem.Wait();
        Console.WriteLine(name + " has entered!");
        Thread.Sleep(1000 * (int)name);
        Console.WriteLine(name + " is leaving");
        _sem.Release();
    }
}
I found the solution, and thanks to #Giedrius. The flag RefreshingDialogs is reset only when the process is really at the end, after I have added the items to the ListBox. The reason I'm using this flag is that the state of the process changes to complete when the asynchronous operation of getting content from the network (HttpWebRequest, method BeginGetRequestStream) begins, but after the network operation completes I still need to do UI work and more (parse the content and add it to the ListBox). My solution is:
private object syncRoot = new object();
private Task latestTask;

public void EnqueueAction(System.Action action)
{
    lock (syncRoot)
    {
        if (latestTask == null)
        {
            downloadedDialogs = 0;
            latestTask = Task.Factory.StartNew(action);
        }
        else if (latestTask.IsCompleted && !RefreshingDialogs)
        {
            RefreshingDialogs = true;
            downloadedDialogs = 0;
            latestTask = Task.Factory.StartNew(action);
        }
    }
}

private void Message_Refresh_Click(object sender, EventArgs e)
{
    Action ac = new Action(LoadDialogs2);
    EnqueueAction(ac);
}

Multi-threading problem when checking the list Count property

I have a List newJobs. Some threads add items to that list, and another thread removes items from it if it's not empty. I have a ManualResetEvent newJobEvent which is set when items are added to the list and reset when items are removed from it.
Adding items to the list is performed in the following way:

lock (syncLock)
{
    newJobs.Add(job);
}
newJobEvent.Set();
Job removal is performed in the following way:

if (newJobs.Count == 0)
    newJobEvent.WaitOne();
lock (syncLock)
{
    job = newJobs.First();
    newJobs.Remove(job);
    /* do some processing */
}
newJobEvent.Reset();
When the line

job = newJobs.First()

is executed, I sometimes get an exception that the list is empty. I guess that the check

if (newJobs.Count == 0)
    newJobEvent.WaitOne();

should also be inside the lock statement, but I'm afraid of deadlocks on the line newJobEvent.WaitOne().
How can I solve it?
Many thanks, and sorry for the long post!
You are right. Calling WaitOne inside a lock could lead to a deadlock, and the check to see if the list is empty needs to be done inside the lock, otherwise there could be a race with another thread trying to remove an item. Now, your code looks suspiciously like the producer-consumer pattern, which is usually implemented with a blocking queue. If you are using .NET 4.0 then you can take advantage of the BlockingCollection class.
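For instance, a minimal sketch of what that would look like here (Job being the question's type):

// BlockingCollection handles the locking and waiting internally.
private BlockingCollection<Job> newJobs = new BlockingCollection<Job>();

// Producer threads:
newJobs.Add(job);

// Consumer thread: Take blocks until an item is available,
// so no manual event or Count check is needed.
Job job = newJobs.Take();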
However, let me go over a couple of ways you can do it yourself. The first uses a List and a ManualResetEvent to demonstrate how this could be done using the data structures in your question. Notice the use of a while loop in the Take method.
public class BlockingJobsCollection
{
    private List<Job> m_List = new List<Job>();
    private ManualResetEvent m_Signal = new ManualResetEvent(false);

    public void Add(Job item)
    {
        lock (m_List)
        {
            m_List.Add(item);
            m_Signal.Set();
        }
    }

    public Job Take()
    {
        while (true)
        {
            lock (m_List)
            {
                if (m_List.Count > 0)
                {
                    Job item = m_List.First();
                    m_List.Remove(item);
                    if (m_List.Count == 0)
                    {
                        m_Signal.Reset();
                    }
                    return item;
                }
            }
            m_Signal.WaitOne();
        }
    }
}
But this is not how I would do it. I would go with the simpler solution below, which uses Monitor.Wait and Monitor.Pulse. Monitor.Wait is useful because it can be called inside a lock. In fact, it is supposed to be used that way.
public class BlockingJobsCollection
{
    private Queue<Job> m_Queue = new Queue<Job>();

    public void Add(Job item)
    {
        lock (m_Queue)
        {
            m_Queue.Enqueue(item);
            Monitor.Pulse(m_Queue);
        }
    }

    public Job Take()
    {
        lock (m_Queue)
        {
            while (m_Queue.Count == 0)
            {
                Monitor.Wait(m_Queue);
            }
            return m_Queue.Dequeue();
        }
    }
}
Not answering your question directly, but if you are using .NET Framework 4, you can use the new ConcurrentQueue, which does all the locking for you.
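For example (a sketch; note that ConcurrentQueue does not block on an empty queue, so the consumer uses TryDequeue and must decide how to wait between attempts):

using System.Collections.Concurrent;

ConcurrentQueue<Job> newJobs = new ConcurrentQueue<Job>();

// Producer threads:
newJobs.Enqueue(job);

// Consumer thread:
Job job;
if (newJobs.TryDequeue(out job))
{
    // do some processing
}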
Regarding your question:
One scenario that I can think of causing such a problem is the following:
The insertion thread enters the lock, calls newJobs.Add, and leaves the lock.
Context switch to the removal thread. It checks for emptiness, sees an item, enters the locked area, removes the item, and resets the event, which hasn't even been set yet.
Context switch back to the insertion thread; the event is set.
Context switch back to the removal thread. It checks for emptiness, sees no items, waits for the event (which is already set), tries to get the first item... Bang!
Set and reset the event inside the lock and you should be fine.
I don't see why removal, in the case of zero objects, should wait for one to be added and then remove it. That seems to go against the logic of the operation.
