Busy Indicator for multiple async calls - c#

I have a Busy property that is set to true before an async call is made, then set to false when the call finishes. Now that I have 2 async calls, how do I handle this logic? Do I need to lock any variables, or are there other concurrency problems I should look out for?
private bool _busy;
public bool Busy
{
get { return _busy; }
set
{
bool changed = value != _busy;
_busy = value;
if (changed) RaisePropertyChanged("Busy");
}
}
private void loadUser(int userId)
{
Busy = true;
api.GetUser(userId, CancellationToken.None).ContinueWith(t =>
Deployment.Current.Dispatcher.BeginInvoke(() =>
{
Busy = false;
}));
}
private void loadOtherData(int dataId)
{
Busy = true;
api.GetData(dataId, CancellationToken.None).ContinueWith(t =>
Deployment.Current.Dispatcher.BeginInvoke(() =>
{
Busy = false;
}));
}
I know this logic is flawed because Busy is set to false by whichever call finishes first. An idea I have is to use 2 more fields, isUserLoading and isOtherDataLoading, and ensure both are false before setting Busy to false.
I'd like to know if there is a better way of accomplishing this.

If you have two booleans, _isUserLoading and _isOtherDataLoading, that you update in your load methods, then you can just change Busy to this:
public bool Busy
{
    get
    {
        return _isUserLoading || _isOtherDataLoading;
    }
}
Another version including the call to RaisePropertyChanged could work like this:
public bool Busy
{
    get
    {
        return _isUserLoading || _isOtherDataLoading;
    }
}
public bool IsUserLoading
{
    get
    {
        return _isUserLoading;
    }
    set
    {
        bool busy = Busy;
        _isUserLoading = value;
        if (busy != Busy) RaisePropertyChanged("Busy");
    }
}
And of course a similar property for IsOtherDataLoading.
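For completeness, a minimal sketch of how the question's load methods could set those flags instead of Busy directly (assuming the same api calls and dispatcher usage as in the question):
private void loadUser(int userId)
{
    IsUserLoading = true;
    api.GetUser(userId, CancellationToken.None).ContinueWith(t =>
        Deployment.Current.Dispatcher.BeginInvoke(() =>
        {
            // setting the flag re-evaluates Busy and raises the change notification if needed
            IsUserLoading = false;
        }));
}
// loadOtherData would do the same with IsOtherDataLoading.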

Ideally you want to expose an observable property/event from the API stating when it's busy. Assuming you can't make any modifications there, I'd suggest you go with a more generic approach. Something like:
class CountedContext
{
    private readonly object _sync = new object(); // can't lock on an int field, so use a dedicated lock object
    private int workersCount = 0;
    private readonly Action<bool> notifier;

    public CountedContext(Action<bool> notifier) { this.notifier = notifier; }

    public Task<TResult> Execute<TResult>(Func<Task<TResult>> func)
    {
        lock (_sync)
        {
            workersCount++;
            if (workersCount == 1)
                notifier(true);
        }
        var result = func();
        result.ContinueWith(_ =>
        {
            lock (_sync)
            {
                workersCount--;
                if (workersCount == 0)
                {
                    notifier(false);
                }
            }
        });
        return result;
    }
}
In your main class you can use:
// constructor:
countedContext = new CountedContext(state =>
{
...BeginInvoke(() =>
{
Busy = state;
});
});
...
// loadData
countedContext.Execute(() => api.GetData());
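Applied to the original question, the load methods would then just route their calls through the counted context; a rough sketch, assuming the same api signatures as in the question:
private void loadUser(int userId)
{
    // the first in-flight call raises Busy = true; the last one to finish raises Busy = false
    countedContext.Execute(() => api.GetUser(userId, CancellationToken.None));
}
private void loadOtherData(int dataId)
{
    countedContext.Execute(() => api.GetData(dataId, CancellationToken.None));
}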

Related

Upgrading Action-based transaction manager to support async/await

I have an Action-based UndoRedoTransactionManager that looks something like this:
public interface IUndoRedoTransactionManager
{
bool CanDo { get; }
bool CanUndo { get; }
void ClearHistory();
bool Do(Action doAction, Action undoAction, string description);
bool Redo();
void RedoAll();
bool Undo();
void UndoAll();
}
which is used throughout my codebase like this:
var done = false;
_manager.Do(() =>
{
Thread.Sleep(100);
SomeUiWork();
done = true;
}, () =>
{
Thread.Sleep(100);
UndoSomeUiWork();
done = false;
}, "Description");
I am working on some new code in which I want to use my UndoRedoTransactionManager like this:
var done = false;
_manager.Do(async () =>
{
await Task.Delay(100);
SomeUiWork();
done = true;
}, async () =>
{
await Task.Delay(100);
UndoSomeUiWork();
done = false;
}, "Description");
From what I understand, this new code creates async voids which should be avoided when possible. It also wouldn't guarantee that it runs without blocking.
I've read a bunch of blogs but I couldn't understand them well-enough to convert their information to my situation (I am new to async programming, and am probably doing a lot of things wrong).
My question is: How can I expand my Action-based UndoRedoTransactionManager to also support async/await? or How can I best tackle my issue?
I noticed that I could add a new method to my interface:
Task<bool> Do(Func<Task> action, Func<Task> undoAction, string description);
And the compiler will automatically pick this one when using async lambdas (and tell me to await it); however, I then run into interchangeability issues.
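For reference, that is the shape the new call site would take with such an overload in place: the async lambdas bind to the Func<Task> parameters and the whole transaction can be awaited (a sketch of the intended usage, not a solution to the interchangeability problem):
var done = false;
await _manager.Do(async () =>
{
    await Task.Delay(100);
    SomeUiWork();
    done = true;
}, async () =>
{
    await Task.Delay(100);
    UndoSomeUiWork();
    done = false;
}, "Description");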
My full UndoRedoTransactionManager implementation:
public class UndoRedoTransactionManager : IUndoRedoTransactionManager
{
private readonly Stack<UndoRedoTransaction> _done = new Stack<UndoRedoTransaction>();
private readonly Stack<UndoRedoTransaction> _unDone = new Stack<UndoRedoTransaction>();
public bool CanDo => _unDone.Count > 0;
public bool CanUndo => _done.Count > 0;
public void ClearHistory()
{
_done.Clear();
_unDone.Clear();
}
public bool Do(Action action, Action undoAction, string description)
{
var trans = new UndoRedoTransaction(action, undoAction, description);
_unDone.Clear();
trans.Do();
_done.Push(trans);
return true;
}
public bool Redo()
{
if (_unDone.Count == 0) return false;
var act = _unDone.Pop();
act.Do();
_done.Push(act);
return true;
}
public void RedoAll()
{
while (Redo())
{
}
}
public bool Undo()
{
if (_done.Count == 0) return false;
var act = _done.Pop();
act.Undo();
_unDone.Push(act);
return true;
}
public void UndoAll()
{
while (Undo())
{
}
}
private class UndoRedoTransaction
{
private readonly Action _doAction;
private readonly Action _undoAction;
public string Description { get; }
public UndoRedoTransaction(Action doAction, Action undoAction, string description)
{
_doAction = doAction;
_undoAction = undoAction;
Description = description;
}
public void Do()
{
_doAction();
}
public void Undo()
{
_undoAction();
}
}
}

Multithreading: How to check if static class is busy

I have a static class and it has a static function IsDataCorrect() which does a http request.
The function can be called from multiple threads at the same time. I want the first thread to perform the request, and the others should be rejected (meaning they should get false as a return value, not just be blocked!) until half a second after the first thread has finished the request.
After that, the next winning thread should be able to do the next request, others should be rejected, and so on.
This is my approach; could someone please confirm whether it is reasonable:
static class MyClass
{
private static bool IsBusy = false;
private static object lockObject = new object();
public static bool IsDataCorrect(string testString)
{
lock (lockObject)
{
if (IsBusy) return false;
IsBusy = true;
}
var uri = $"https://something.com";
bool htmlCheck = GetDocFromUri(uri, 2);
var t = new Thread(WaitBeforeFree);
t.Start();
//Fast Evaluations
//...
return htmlCheck;
}
private static void WaitBeforeFree()
{
Thread.Sleep(500);
IsBusy = false;
}
}
Your threads accessing the function would still be serialized when checking the IsBusy flag, since only one thread at a time can check it due to the synchronization on lockObject. Instead, you can simply attempt to acquire the lock, and consequently you don't need the flag, since the lock itself serves that purpose. Second, I would drop launching a new thread every time just to sleep and reset the flag, and replace it with a check on a DateTime field.
static class MyClass
{
    private static DateTime NextEntry = DateTime.Now;
    private static ReaderWriterLockSlim timeLock = new ReaderWriterLockSlim();
    private static object lockObject = new object();

    public static bool IsDataCorrect(string testString)
    {
        bool tryEnterSuccess = false;
        try
        {
            try
            {
                timeLock.EnterReadLock();
                if (DateTime.Now < NextEntry) return false;
            }
            finally
            {
                timeLock.ExitReadLock();
            }
            Monitor.TryEnter(lockObject, ref tryEnterSuccess);
            if (!tryEnterSuccess) return false;
            var uri = $"https://something.com";
            bool htmlCheck = GetDocFromUri(uri, 2);
            //Fast Evaluations
            //...
            try
            {
                timeLock.EnterWriteLock();
                NextEntry = DateTime.Now.AddMilliseconds(500);
            }
            finally
            {
                timeLock.ExitWriteLock();
            }
            return htmlCheck;
        }
        finally
        {
            if (tryEnterSuccess) Monitor.Exit(lockObject);
        }
    }
}
This is more efficient because it avoids launching new threads, and the DateTime access is safe yet concurrent, so threads only block when they absolutely have to. Otherwise, everything keeps moving along with minimal resource usage.
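A quick way to exercise that behaviour (a sketch; it assumes GetDocFromUri exists as in the question and just fires a handful of concurrent calls):
// Only one caller should win and perform the request; the rest are rejected
// until 500 ms after the winner finished.
var results = Enumerable.Range(0, 5)
    .AsParallel()
    .Select(_ => MyClass.IsDataCorrect("test"))
    .ToList();
Console.WriteLine(results.Count(r => r)); // at most 1 true, the rest false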
I see you guys solved the problem correctly, but I think there is still room to make it correct, efficient, and simple at the same time :).
How about this way?
EDIT: Edited to make the cool-down ("calming") period easier to follow and part of the example.
public static class ConcurrentCoordinationExtension
{
private static int _executing = 0;
public static bool TryExecuteSequentially(this Action actionToExecute)
{
// compare _executing with zero; if it is zero, set it to 1
// and return the original value as the result:
// zero means we entered successfully, non-zero means somebody else is executing
if (Interlocked.CompareExchange(ref _executing, 1, 0) != 0) return false;
try
{
actionToExecute.Invoke();
return true;
}
finally
{
Interlocked.Exchange(ref _executing, 0);//
}
}
public static bool TryExecuteSequentially(this Func<bool> actionToExecute)
{
// compare _executing with zero; if it is zero, set it to 1
// and return the original value as the result:
// zero means we entered successfully, non-zero means somebody else is executing
if (Interlocked.CompareExchange(ref _executing, 1, 0) != 0) return false;
try
{
return actionToExecute.Invoke();
}
finally
{
Interlocked.Exchange(ref _executing, 0);//
}
}
}
class Program
{
static void Main(string[] args)
{
DateTime last = DateTime.MinValue;
Func<bool> operation= () =>
{
//the cool-down ("calming") condition: skip if the previous run was less than 500 ms ago
if (DateTime.UtcNow - last < TimeSpan.FromMilliseconds(500)) return false;
last = DateTime.UtcNow;
//some stuff you want to process sequentially
return true;
};
operation.TryExecuteSequentially();
}
}

How to buffer a burst of events into fewer resulting actions

I want to reduce multiple events into a single delayed action. After a trigger occurs I expect more similar triggers to come, but I prefer not to repeat the resulting delayed action. The action waits, giving the burst a chance to complete.
The question: How can I do it in an elegant reusable way?
Until now I have used a property to flag the event and trigger a delayed action, like below:
public void SomeMethod()
{
SomeFlag = true; //this will intentionally return to the caller before completing the resulting buffered actions.
}
private bool someFlag;
public bool SomeFlag
{
get { return someFlag; }
set
{
if (someFlag != value)
{
someFlag = value;
if (value)
SomeDelayedMethod(5000);
}
}
}
public async void SomeDelayedMethod(int delay)
{
//some buffered work.
await Task.Delay(delay);
SomeFlag = false;
}
Below is a shorter way, but it is still not generic or reusable. I want something concise that packages the action and the flag together and keeps the current behaviour (returning to the caller before execution is complete, like today). I also need to be able to pass an object reference to this action.
public void SerializeAccountsToConfig()
{
if (!alreadyFlagged)
{
alreadyFlagged = true;
SerializeDelayed(5000, Serialize);
}
}
public async void SerializeDelayed(int delay, Action whatToDo)
{
await Task.Delay(delay);
whatToDo();
}
private bool alreadyFlagged;
private void Serialize()
{
//some buffered work.
//string json = JsonConvert.SerializeObject(Accounts, Formatting.Indented);
//Settings1.Default.Accounts = json;
//Settings1.Default.Save();
alreadyFlagged = false;
}
Here's a thread-safe and reusable solution.
You can create an instance of DelayedSingleAction, and in the constructor you pass the action that you want to have performed. I believe this is thread safe, though there is a tiny risk that it will restart the timer just before commencing the action; I think that risk would exist no matter what the solution is.
public class DelayedSingleAction
{
private readonly Action _action;
private readonly long _millisecondsDelay;
private long _syncValue = 1;
public DelayedSingleAction(Action action, long millisecondsDelay)
{
_action = action;
_millisecondsDelay = millisecondsDelay;
}
private Task _waitingTask = null;
private void DoActionAndClearTask(Task _)
{
Interlocked.Exchange(ref _syncValue, 1);
_action();
}
public void PerformAction()
{
if (Interlocked.Exchange(ref _syncValue, 0) == 1)
{
_waitingTask = Task.Delay(TimeSpan.FromMilliseconds(_millisecondsDelay))
.ContinueWith(DoActionAndClearTask);
}
}
public Task Complete()
{
return _waitingTask ?? Task.FromResult(0);
}
}
See this dotnetfiddle for an example which invokes one action continuously from multiple threads.
https://dotnetfiddle.net/el14wZ
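Usage amounts to constructing it once and calling PerformAction from wherever the events fire; a minimal sketch reusing Serialize and the five-second delay from the question (the AccountsViewModel class name is made up):
public class AccountsViewModel
{
    private readonly DelayedSingleAction _delayedSerialize;

    public AccountsViewModel()
    {
        // Serialize is the buffered work from the question; 5000 ms matches the original delay.
        _delayedSerialize = new DelayedSingleAction(Serialize, 5000);
    }

    // Call this on every triggering event; only the first call in a burst starts the timer.
    public void SerializeAccountsToConfig()
    {
        _delayedSerialize.PerformAction();
    }

    private void Serialize()
    {
        // the buffered work, e.g. writing settings
    }
}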
Since you're interested in Rx, here is a simple console app sample:
static void Main(string[] args)
{
// event source
var burstEvents = Observable.Interval(TimeSpan.FromMilliseconds(50));
var subscription = burstEvents
.Buffer(TimeSpan.FromSeconds(3)) // collect events 3 seconds
//.Buffer(50) // or collect 50 events
.Subscribe(events =>
{
//Console.WriteLine(events.First()); // take only first event
// or process event collection
foreach (var e in events)
Console.Write(e + " ");
Console.WriteLine();
});
Console.ReadLine();
return;
}
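A related Rx operator for this scenario is Throttle, which emits only after the source has been quiet for the given interval, so a whole burst collapses into a single notification (a sketch; events stands for whatever bursty observable you have, since the steady Interval source above would never go quiet):
var subscription = events
    .Throttle(TimeSpan.FromSeconds(3)) // fires once the source has been silent for 3 seconds
    .Subscribe(e => Console.WriteLine("burst ended with " + e));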
Based on the solution proposed by Andrew, here is a more generic solution.
Declaration and instance creation of the delayed action:
public DelayedSingleAction<Account> SendMailD;
Create the instance inside a function or in the constructor (this can be a collection of such actions each working on a different object):
SendMailD = new DelayedSingleAction<Account>(SendMail, AccountRef, 5000);
Repeatedly call this action:
SendMailD.PerformAction();
SendMail is the action whose bursts you want to control. Its signature matches:
public int SendMail(Account A)
{}
Here is the updated class
public class DelayedSingleAction<T>
{
private readonly Func<T, int> actionOnObj;
private T tInstance;
private readonly long millisecondsDelay;
private long _syncValue = 1;
public DelayedSingleAction(Func<T, int> ActionOnObj, T TInstance, long MillisecondsDelay)
{
actionOnObj = ActionOnObj;
tInstance = TInstance;
millisecondsDelay = MillisecondsDelay;
}
private Task _waitingTask = null;
private void DoActionAndClearTask(Task _)
{
Console.WriteLine(string.Format("{0:h:mm:ss.fff} DelayedSingleAction Resetting SyncObject: Thread {1} for {2}", DateTime.Now, System.Threading.Thread.CurrentThread.ManagedThreadId, tInstance));
Interlocked.Exchange(ref _syncValue, 1);
actionOnObj(tInstance);
}
public void PerformAction()
{
if (Interlocked.Exchange(ref _syncValue, 0) == 1)
{
Console.WriteLine(string.Format("{0:h:mm:ss.fff} DelayedSingleAction Starting the timer: Thread {1} for {2}", DateTime.Now, System.Threading.Thread.CurrentThread.ManagedThreadId, tInstance));
_waitingTask = Task.Delay(TimeSpan.FromMilliseconds(millisecondsDelay)).ContinueWith(DoActionAndClearTask);
}
}
public Task Complete()
{
return _waitingTask ?? Task.FromResult(0);
}
}

Threading Barrier Issue

I have a situation where I need one thread to wait until another thread provides it with data. I've created this class:
public class AsyncObjectWait<T1, TriggerType>
{
//fields used below (not shown in the original snippet, but presumably declared like this):
private readonly Queue<TaskCompletionSource<TriggerType>> _waits = new Queue<TaskCompletionSource<TriggerType>>();
private T1 _value;
private void _timeout(TimeSpan timeout, TaskCompletionSource<TriggerType> tcs)
{
System.Threading.Thread.Sleep(timeout);
lock (tcs)
{
if (!tcs.TrySetException(new TimeoutException("Wait Timed Out")))
{
int a = 0;
}
else
{
Console.WriteLine("Timed Out");
}
}
}
public Task<TriggerType> WaitOneAsync(TimeSpan timeout)
{
TaskCompletionSource<TriggerType> wait = new TaskCompletionSource<TriggerType>();
lock (_waits)
{
_waits.Enqueue(wait);
Task.Run(() =>
{
_timeout(timeout, wait);
});
}
return wait.Task;
}
public bool TrySetOne(TriggerType trigger, Converter<TriggerType, T1> converter)
{
TaskCompletionSource<TriggerType> wait;
bool set = false;
lock (_waits)
{
while ((!set) && (!(_waits.Count == 0))) //while we havent completed a task, and as long as we have some more to check.
{
//get the next wait; wait for any other threads to stop using it, then get exclusive access.
lock (_waits.Peek())
{
wait = _waits.Peek();
if (EqualityComparer<T1>.Default.Equals(converter(trigger), _value))
{
set = wait.TrySetResult(trigger); //try and set the result to complete the task. if we cant, try the next task
_waits.Dequeue();
if (!set)
{
continue;
}
else
{
return true;
}
}
else
{
return false;
}
}
}
return false;
}
}
}
This is meant to do that, and also to provide a Task-based interface that doesn't freeze my UI while it's waiting.
It seems to have an issue, though. When the Receive() method is called, it can take up to a second for the task generated by WaitFor to receive the completion info. I haven't been able to find a reason for this.
Could this be due to async method overhead? Is there another, less complicated way to do this?

Multithreading BlockingCollection Alternatives to GetConsumingEnumerable() Producer-Consumer

I have a situation where I have multiple producers and multiple consumers. The producers enter jobs into a queue. I chose BlockingCollection and it works great, since I need the consumers to wait for a job to be found. However, if I use the GetConsumingEnumerable() feature, the order of the items in the collection changes... this is not what I need.
It even says in MSDN http://msdn.microsoft.com/en-us/library/dd287186.aspx
that it does not preserve the order of the items.
Does anyone know an alternative for this situation?
I see that the Take method is available but does it also provide a 'wait' condition for the consumer threads?
It says http://msdn.microsoft.com/en-us/library/dd287085.aspx
'A call to Take may block until an item is available to be removed.' Is it better to use TryTake? I really need the thread to wait and keep checking for a job.
Take blocks the thread until something becomes available.
TryTake, as the name implies, tries to do the same but returns a bool indicating whether it succeeded or failed, allowing for more flexible use:
while (goingOn)
{
    if (q.TryTake(out var item))
    {
        Process(item);
    }
    else
    {
        DoSomething_Usefull_OrNotUseFull_OrEvenSleep();
    }
}
instead of
while (goingOn)
{
    //we'll wait till this ever happens and then we:
    var x = q.Take();
    Process(x);
}
My votes are for TryTake :-)
EXAMPLE:
public class ProducerConsumer<T> {
public struct Message {
public T Data;
}
private readonly ThreadRunner _producer;
private readonly ThreadRunner _consumer;
public ProducerConsumer(Func<T> produce, Action<T> consume) {
var q = new BlockingCollection<Message>();
_producer = new Producer(produce,q);
_consumer = new Consumer(consume,q);
}
public void Start() {
_producer.Run();
_consumer.Run();
}
public void Stop() {
_producer.Stop();
_consumer.Stop();
}
private class Producer : ThreadRunner {
public Producer(Func<T> produce, BlockingCollection<Message> q) : base(q) {
_produce = produce;
}
private readonly Func<T> _produce;
public override void Worker() {
try {
while (KeepRunning) {
var item = _produce();
MessageQ.TryAdd(new Message{Data = item});
}
}
catch (ThreadInterruptedException) {
WasInterrupted = true;
}
}
}
public abstract class ThreadRunner {
protected readonly BlockingCollection<Message> MessageQ;
protected ThreadRunner(BlockingCollection<Message> q) {
MessageQ = q;
}
protected Thread Runner;
protected bool KeepRunning = true;
public bool WasInterrupted;
public abstract void Worker();
public void Run() {
Runner = new Thread(Worker);
Runner.Start();
}
public void Stop() {
KeepRunning = false;
Runner.Interrupt();
Runner.Join();
}
}
class Consumer : ThreadRunner {
private readonly Action<T> _consume;
public Consumer(Action<T> consume,BlockingCollection<Message> q) : base(q) {
_consume = consume;
}
public override void Worker() {
try {
while (KeepRunning) {
Message message;
if (MessageQ.TryTake(out message, TimeSpan.FromMilliseconds(100))) {
_consume(message.Data);
}
else {
//There's nothing in the Q so I have some spare time...
//Excellent moment to update my statisics or update some history to logfiles
//for now we sleep:
Thread.Sleep(TimeSpan.FromMilliseconds(100));
}
}
}
catch (ThreadInterruptedException) {
WasInterrupted = true;
}
}
}
}
USAGE:
[Fact]
public void ConsumerShouldConsume() {
var produced = 0;
var consumed = 0;
Func<int> produce = () => {
Thread.Sleep(TimeSpan.FromMilliseconds(100));
produced++;
return new Random(2).Next(1000);
};
Action<int> consume = c => { consumed++; };
var t = new ProducerConsumer<int>(produce, consume);
t.Start();
Thread.Sleep(TimeSpan.FromSeconds(5));
t.Stop();
Assert.InRange(produced,40,60);
Assert.InRange(consumed, 40, 60);
}
