Handle exceptions with TPL Dataflow blocks - C#

I have a simple TPL Dataflow pipeline which basically does some tasks.
I noticed that when there is an exception in any of the dataflow blocks, it isn't caught in the initial parent caller.
I have added some manual code to check for exceptions, but it doesn't seem like the right approach.
if (readBlock.Completion.Exception != null
|| saveBlockJoinedProcess.Completion.Exception != null
|| processBlock1.Completion.Exception != null
|| processBlock2.Completion.Exception != null)
{
throw readBlock.Completion.Exception;
}
I had a look online to see what the suggested approach is, but didn't see anything obvious.
So I created some sample code below and was hoping to get some guidance on a better solution:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;
namespace TPLDataflow
{
class Program
{
static void Main(string[] args)
{
try
{
//ProcessB();
ProcessA();
}
catch (Exception e)
{
Console.WriteLine("Exception in Process!");
throw new Exception($"exception:{e}");
}
Console.WriteLine("Processing complete!");
Console.ReadLine();
}
private static void ProcessB()
{
Task.WhenAll(Task.Run(() => DoSomething(1, "ProcessB"))).Wait();
}
private static void ProcessA()
{
var random = new Random();
var readBlock = new TransformBlock<int, int>(x =>
{
try { return DoSomething(x, "readBlock"); }
catch (Exception e) { throw e; }
}); //1
var braodcastBlock = new BroadcastBlock<int>(i => i); // ⬅ Here
var processBlock1 = new TransformBlock<int, int>(x =>
DoSomethingAsync(5, "processBlock1")); //2
var processBlock2 = new TransformBlock<int, int>(x =>
DoSomethingAsync(2, "processBlock2")); //3
//var saveBlock =
// new ActionBlock<int>(
// x => Save(x)); //4
var saveBlockJoinedProcess =
new ActionBlock<Tuple<int, int>>(
x => SaveJoined(x.Item1, x.Item2)); //4
var saveBlockJoin = new JoinBlock<int, int>();
readBlock.LinkTo(braodcastBlock, new DataflowLinkOptions
{ PropagateCompletion = true });
braodcastBlock.LinkTo(processBlock1,
new DataflowLinkOptions { PropagateCompletion = true }); //5
braodcastBlock.LinkTo(processBlock2,
new DataflowLinkOptions { PropagateCompletion = true }); //6
processBlock1.LinkTo(
saveBlockJoin.Target1); //7
processBlock2.LinkTo(
saveBlockJoin.Target2); //8
saveBlockJoin.LinkTo(saveBlockJoinedProcess,
new DataflowLinkOptions { PropagateCompletion = true });
readBlock.Post(1); //10
//readBlock.Post(2); //10
Task.WhenAll(processBlock1.Completion,processBlock2.Completion)
.ContinueWith(_ => saveBlockJoin.Complete());
readBlock.Complete(); //12
saveBlockJoinedProcess.Completion.Wait(); //13
if (readBlock.Completion.Exception != null
|| saveBlockJoinedProcess.Completion.Exception != null
|| processBlock1.Completion.Exception != null
|| processBlock2.Completion.Exception != null)
{
throw readBlock.Completion.Exception;
}
}
private static int DoSomething(int i, string method)
{
Console.WriteLine($"Do Something, callng method : { method}");
throw new Exception("Fake Exception!");
return i;
}
private static async Task<int> DoSomethingAsync(int i, string method)
{
Console.WriteLine($"Do SomethingAsync");
throw new Exception("Fake Exception!");
await Task.Delay(new TimeSpan(0, 0, i));
Console.WriteLine($"Do Something : {i}, callng method : { method}");
return i;
}
private static void Save(int x)
{
Console.WriteLine("Save!");
}
private static void SaveJoined(int x, int y)
{
Thread.Sleep(new TimeSpan(0, 0, 10));
Console.WriteLine("Save Joined!");
}
}
}

If you have a pipeline (more or less), then the common approach is to use PropagateCompletion to shut down the pipe. If you have more complex topologies, then you would need to complete blocks by hand.
In your case, you have an attempted propagation here:
Task.WhenAll(
processBlock1.Completion,
processBlock2.Completion)
.ContinueWith(_ => saveBlockJoin.Complete());
But this code will not propagate exceptions. When both processBlock1.Completion and processBlock2.Completion complete, whether they fault or not, saveBlockJoin is completed successfully.
A better solution would be to use await instead of ContinueWith:
async Task PropagateToSaveBlockJoin()
{
try
{
await Task.WhenAll(processBlock1.Completion, processBlock2.Completion);
saveBlockJoin.Complete();
}
catch (Exception ex)
{
((IDataflowBlock)saveBlockJoin).Fault(ex);
}
}
_ = PropagateToSaveBlockJoin();
Using await encourages you to handle exceptions, which you can do by passing them to Fault to propagate the exception.
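With completion propagated this way, the manual Completion.Exception checks at the end of ProcessA can be replaced by awaiting the terminal block (a sketch, assuming ProcessA is changed into an async method):
readBlock.Complete();
try
{
    // A fault anywhere upstream propagates to the last block,
    // and await rethrows the original exception here.
    await saveBlockJoinedProcess.Completion;
}
catch (Exception ex)
{
    Console.WriteLine($"Pipeline failed: {ex.Message}");
    throw;
}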

Propagating errors backward in the pipeline is not supported in TPL Dataflow out of the box, which is especially annoying when the blocks have a bounded capacity. In this case an error in a block downstream may cause the blocks in front of it to block indefinitely. The only solution I know is to use the cancellation feature, and cancel all blocks in case any one of them fails. Here is how it can be done. First create a CancellationTokenSource:
var cts = new CancellationTokenSource();
Then create the blocks one by one, embedding the same CancellationToken in the options of all of them:
var options = new ExecutionDataflowBlockOptions()
{ BoundedCapacity = 10, CancellationToken = cts.Token };
var block1 = new TransformBlock<double, double>(Math.Sqrt, options);
var block2 = new ActionBlock<double>(Console.WriteLine, options);
Then link the blocks together, including the PropagateCompletion setting:
block1.LinkTo(block2, new DataflowLinkOptions { PropagateCompletion = true });
Finally use an extension method to trigger the cancellation of the CancellationTokenSource in case of an exception:
block1.OnFaultedCancel(cts);
block2.OnFaultedCancel(cts);
The OnFaultedCancel extension method is shown below:
public static class DataflowExtensions
{
public static void OnFaultedCancel(this IDataflowBlock dataflowBlock,
CancellationTokenSource cts)
{
dataflowBlock.Completion.ContinueWith(_ => cts.Cancel(), default,
TaskContinuationOptions.OnlyOnFaulted |
TaskContinuationOptions.ExecuteSynchronously, TaskScheduler.Default);
}
}
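For completeness, here is a minimal usage sketch under these assumptions (block1, block2 and cts declared as above, inside an async method):
// Hypothetical driver code, not part of the original answer.
block1.Post(16.0);
block1.Complete();
try
{
    await Task.WhenAll(block1.Completion, block2.Completion);
}
catch (OperationCanceledException)
{
    // Some other block faulted and cancelled the shared token.
}
catch (Exception ex)
{
    // A faulted block surfaces its exception here.
    Console.WriteLine(ex.Message);
}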

At first look, I have only some minor points (not looking at your architecture). It seems to me that you have mixed some newer and some older constructs, and there are some code parts which are unnecessary.
For example:
private static void ProcessB()
{
Task.WhenAll(Task.Run(() => DoSomething(1, "ProcessB"))).Wait();
}
Using the Wait() method, if any exceptions happen, they will be wrapped in a System.AggregateException. In my opinion, this is better:
private static async Task ProcessBAsync()
{
await Task.Run(() => DoSomething(1, "ProcessB"));
}
Using async/await, if an exception occurs, the await rethrows the first exception that is wrapped in the System.AggregateException. This allows you to catch concrete exception types and handle only the cases you really can handle.
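As a quick illustration of the difference (a sketch, assuming a DoSomething variant that throws InvalidOperationException):
// Blocking wait: the failure arrives wrapped in an AggregateException.
try
{
    Task.Run(() => DoSomething(1, "ProcessB")).Wait();
}
catch (AggregateException ae)
{
    Console.WriteLine(ae.InnerException?.Message);
}
// async/await: the original exception type is rethrown, so concrete catches work.
try
{
    await Task.Run(() => DoSomething(1, "ProcessB"));
}
catch (InvalidOperationException ex)
{
    Console.WriteLine(ex.Message);
}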
Another thing is this part of your code:
private static void ProcessA()
{
var random = new Random();
var readBlock = new TransformBlock<int, int>(
x =>
{
try { return DoSomething(x, "readBlock"); }
catch (Exception e)
{
throw e;
}
},
new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 1 }); //1
Why catch an exception only to rethrow it? In this case, the try/catch is redundant. Note also that throw e; rethrows the exception but resets its stack trace.
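If you genuinely need to do something before rethrowing, prefer the bare throw; statement (a sketch with hypothetical logging):
try { return DoSomething(x, "readBlock"); }
catch (Exception e)
{
    Console.WriteLine(e.Message); // hypothetical logging before rethrowing
    throw;                        // preserves the original stack trace, unlike 'throw e;'
}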
And this here:
private static void SaveJoined(int x, int y)
{
Thread.Sleep(new TimeSpan(0, 0, 10));
Console.WriteLine("Save Joined!");
}
It is much better to use await Task.Delay(...). With Task.Delay(...) the waiting thread is not blocked, so your application will not freeze.
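A minimal async rewrite under that suggestion (a sketch; note that the ActionBlock delegate then has to become async as well):
private static async Task SaveJoinedAsync(int x, int y)
{
    await Task.Delay(TimeSpan.FromSeconds(10)); // does not block a thread while waiting
    Console.WriteLine("Save Joined!");
}
// The joined save block then takes an async lambda:
var saveBlockJoinedProcess = new ActionBlock<Tuple<int, int>>(
    x => SaveJoinedAsync(x.Item1, x.Item2));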

Related

C# Parallel Foreach + Async

I'm processing a list of items (200k - 300k); each item's processing time is between 2 and 8 seconds. To gain time, I can process this list in parallel. As I'm in an async context, I use something like this:
public async Task<List<Keyword>> DoWord(List<string> keyword)
{
ConcurrentBag<Keyword> keywordResults = new ConcurrentBag<Keyword>();
if (keyword.Count > 0)
{
try
{
var tasks = keyword.Select(async kw =>
{
return await Work(kw).ConfigureAwait(false);
});
keywordResults = new ConcurrentBag<Keyword>(await Task.WhenAll(tasks).ConfigureAwait(false));
}
catch (AggregateException ae)
{
foreach (Exception innerEx in ae.InnerExceptions)
{
log.ErrorFormat("Core threads exception: {0}", innerEx);
}
}
}
return keywordResults.ToList();
}
The keyword list always contains 8 elements (coming from above), so I process my list 8 by 8. But in this case, I guess that if 7 keywords are processed in 3 seconds and the 8th is processed in 10 seconds, the total time for the 8 keywords will be 10 seconds (correct me if I'm wrong).
How can I approach this with Parallel.ForEach then? I mean: launch 8 keywords, and as soon as 1 of them is done, launch 1 more. That way I'll have 8 items being processed permanently. Any idea?
Another, easier way to do this is to use the AsyncEnumerator NuGet package:
using System.Collections.Async;
public async Task<List<Keyword>> DoWord(List<string> keywords)
{
var keywordResults = new ConcurrentBag<Keyword>();
await keywords.ParallelForEachAsync(async keyword =>
{
try
{
var result = await Work(keyword);
keywordResults.Add(result);
}
catch (AggregateException ae)
{
foreach (Exception innerEx in ae.InnerExceptions)
{
log.ErrorFormat("Core threads exception: {0}", innerEx);
}
}
}, maxDegreeOfParallelism: 8);
return keywordResults.ToList();
}
Here's some sample code showing how you could approach this using TPL Dataflow.
Note that in order to compile this, you will need to add TPL Dataflow to your project via NuGet.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;
namespace Demo
{
class Keyword // Dummy test class.
{
public string Name;
}
class Program
{
static void Main()
{
// Dummy test data.
var keywords = Enumerable.Range(1, 100).Select(n => n.ToString()).ToList();
var result = DoWork(keywords).Result;
Console.WriteLine("---------------------------------");
foreach (var item in result)
Console.WriteLine(item.Name);
}
public static async Task<List<Keyword>> DoWork(List<string> keywords)
{
var input = new TransformBlock<string, Keyword>
(
async s => await Work(s),
// This is where you specify the max number of threads to use.
new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 8 }
);
var result = new List<Keyword>();
var output = new ActionBlock<Keyword>
(
item => result.Add(item), // Output only 1 item at a time, because 'result.Add()' is not threadsafe.
new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 1 }
);
input.LinkTo(output, new DataflowLinkOptions { PropagateCompletion = true });
foreach (string s in keywords)
await input.SendAsync(s);
input.Complete();
await output.Completion;
return result;
}
public static async Task<Keyword> Work(string s) // Stubbed test method.
{
Console.WriteLine("Processing " + s);
int delay;
lock (rng) { delay = rng.Next(10, 1000); }
await Task.Delay(delay); // Simulate load.
Console.WriteLine("Completed " + s);
return await Task.Run( () => new Keyword { Name = s });
}
static Random rng = new Random();
}
}

cancel async task if running

I have the following method, called on several occasions (e.g. on key-up of a textbox), which asynchronously filters items in a listbox.
private async void filterCats(string category,bool deselect)
{
List<Category> tempList = new List<Category>();
//Wait for categories
var tokenSource = new CancellationTokenSource();
var token = tokenSource.Token;
//HERE,CANCEL TASK IF ALREADY RUNNING
tempList= await _filterCats(category,token);
//Show results
CAT_lb_Cats.DataSource = tempList;
CAT_lb_Cats.DisplayMember = "strCategory";
CAT_lb_Cats.ValueMember = "idCategory";
}
and the following task
private async Task<List<Category>> _filterCats(string category,CancellationToken token)
{
List<Category> result = await Task.Run(() =>
{
return getCatsByStr(category);
},token);
return result;
}
and I would like to test whether the task is already running, and if so cancel it and start it with the new value. I know how to cancel a task, but how can I check whether it is already running?
This is the code that I use to do this:
if (_tokenSource != null)
{
_tokenSource.Cancel();
}
_tokenSource = new CancellationTokenSource();
try
{
await loadPrestatieAsync(_bedrijfid, _projectid, _medewerkerid, _prestatieid, _startDate, _endDate, _tokenSource.Token);
}
catch (OperationCanceledException ex)
{
}
And for the procedure call, it is like this (simplified of course):
private async Task loadPrestatieAsync(int bedrijfId, int projectid, int medewerkerid, int prestatieid,
DateTime? startDate, DateTime? endDate, CancellationToken token)
{
await Task.Delay(100, token).ConfigureAwait(true);
try{
//do stuff
token.ThrowIfCancellationRequested();
}
catch (OperationCanceledException ex)
{
throw;
}
catch (Exception Ex)
{
throw;
}
}
I am doing a delay of 100 ms because the same action is triggered rather quickly and repeatedly; a small postponement of 100 ms actually makes the GUI feel more responsive.
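Applied to the filterCats method from the question, the same pattern looks roughly like this (a sketch; _filterTokenSource is a new field introduced here):
private CancellationTokenSource _filterTokenSource;
private async void filterCats(string category, bool deselect)
{
    // Cancel the previous filter run, if any, and start a new one.
    _filterTokenSource?.Cancel();
    _filterTokenSource = new CancellationTokenSource();
    try
    {
        List<Category> tempList = await _filterCats(category, _filterTokenSource.Token);
        CAT_lb_Cats.DataSource = tempList;
        CAT_lb_Cats.DisplayMember = "strCategory";
        CAT_lb_Cats.ValueMember = "idCategory";
    }
    catch (OperationCanceledException)
    {
        // A newer keystroke superseded this search; ignore the cancelled run.
    }
}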
It appears you are looking for a way to get an "autocomplete list" from text entered in a text box, where an ongoing async search is canceled when the text has changed since the search was started.
As was mentioned in the comments, Rx (Reactive Extensions), provides very nice patterns for this, allowing you to easily connect your UI elements to cancellable asynchronous tasks, building in retry logic, etc.
The less-than-90-line program below shows a "full UI" sample (unfortunately excluding any cats ;-). It includes some reporting on the search status.
I have created this using a number of static methods in the RxAutoComplete class, to show how this is achieved in small, documented steps, and how they can be combined to achieve a more complex task.
namespace TryOuts
{
using System;
using System.Linq;
using System.Threading.Tasks;
using System.Windows.Forms;
using System.Reactive.Linq;
using System.Threading;
// Simulated async search service, that can fail.
public class FakeWordSearchService
{
private static Random _rnd = new Random();
private static string[] _allWords = new[] {
"gideon", "gabby", "joan", "jessica", "bob", "bill", "sam", "johann"
};
public async Task<string[]> Search(string searchTerm, CancellationToken cancelToken)
{
await Task.Delay(_rnd.Next(600), cancelToken); // simulate async call.
if ((_rnd.Next() % 5) == 0) // every 5 times, we will cause a search failure
throw new Exception(string.Format("Search for '{0}' failed on purpose", searchTerm));
return _allWords.Where(w => w.StartsWith(searchTerm)).ToArray();
}
}
public static class RxAutoComplete
{
// Returns an observable that pushes the 'txt' TextBox text when it has changed.
static IObservable<string> TextChanged(TextBox txt)
{
return from evt in Observable.FromEventPattern<EventHandler, EventArgs>(
h => txt.TextChanged += h,
h => txt.TextChanged -= h)
select ((TextBox)evt.Sender).Text.Trim();
}
// Throttles the source.
static IObservable<string> ThrottleInput(IObservable<string> source, int minTextLength, TimeSpan throttle)
{
return source
.Where(t => t.Length >= minTextLength) // Wait until we have at least 'minTextLength' characters
.Throttle(throttle) // We don't start when the user is still typing
.DistinctUntilChanged(); // We only fire, if after throttling the text is different from before.
}
// Provides search results and performs asynchronous,
// cancellable search with automatic retries on errors
static IObservable<string[]> PerformSearch(IObservable<string> source, FakeWordSearchService searchService)
{
return from term in source // term from throttled input
from result in Observable.FromAsync(async token => await searchService.Search(term, token))
.Retry(3) // Perform up to 3 tries on failure
.TakeUntil(source) // Cancel pending request if new search request was made.
select result;
}
// Putting it all together.
public static void RunUI()
{
// Our simple search GUI.
var inputTextBox = new TextBox() { Width = 300 };
var searchResultLB = new ListBox { Top = inputTextBox.Height + 10, Width = inputTextBox.Width };
var searchStatus = new Label { Top = searchResultLB.Height + 30, Width = inputTextBox.Width };
var mainForm = new Form { Controls = { inputTextBox, searchResultLB, searchStatus }, Width = inputTextBox.Width + 20 };
// Our UI update handlers.
var syncContext = SynchronizationContext.Current;
Action<Action> onUITread = (x) => syncContext.Post(_ => x(), null);
Action<string> onSearchStarted = t => onUITread(() => searchStatus.Text = (string.Format("searching for '{0}'.", t)));
Action<string[]> onSearchResult = w => {
searchResultLB.Items.Clear();
searchResultLB.Items.AddRange(w);
searchStatus.Text += string.Format(" {0} matches found.", w.Length > 0 ? w.Length.ToString() : "No");
};
// Connecting input to search
var input = ThrottleInput(TextChanged(inputTextBox), 1, TimeSpan.FromSeconds(0.5)).Do(onSearchStarted);
var result = PerformSearch(input, new FakeWordSearchService());
// Running it
using (result.ObserveOn(syncContext).Subscribe(onSearchResult, ex => Console.WriteLine(ex)))
Application.Run(mainForm);
}
}
}

How to stop multiple Parallel.For loops with CancellationToken

I'm developing an app with C# and WPF. I'm using 3 nested Parallel.For loops as shown below. When I Cancel() the token, the loops start throwing ImportAbortedException, BUT I can't catch ImportAbortedException. What I catch is an AggregateException.
What I want is to stop all the Parallel.For loops, catch the ImportAbortedException, and do some other stuff.
Here is the code.
private int _loopCount1 = 100;
private int _loopCount2 = 200;
private int _loopCount3 = 10000;
private CancellationToken _cToken;
private CancellationTokenSource _cSource;
private void Init()
{
_cSource = new CancellationTokenSource();
_cToken = new CancellationToken();
_cToken = _cSource.Token;
try
{
DoTheWork();
}
catch (ImportAbortedException)
{
///
}
catch (Exception)
{
}
}
private void StopAllLoops()
{
_cSource.Cancel();
}
private void DoTheWork()
{
Parallel.For(0, _loopCount1, i =>
{
if (CheckIfCanceled())
throw new ImportAbortedException("process aborted!");
// do a few calculations here.
Parallel.For(0, _loopCount2, j =>
{
if (CheckIfCanceled())
throw new ImportAbortedException("process aborted!");
// do a few calculations here.
Parallel.For(0, _loopCount3, k =>
{
if (CheckIfCanceled())
throw new ImportAbortedException("process aborted!");
// do some other process here.
});
});
});
}
private bool CheckIfCanceled()
{
return _cToken.IsCancellationRequested;
}
Parallel.For has overloads that take a ParallelOptions argument, and ParallelOptions has a CancellationToken property you can set. When that token is canceled, the Parallel.For loop stops and an OperationCanceledException is thrown.
See ParallelOptions on MSDN.
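A minimal sketch of that approach applied to the code from the question (reusing the existing _cToken field; the custom ImportAbortedException is no longer needed):
private void DoTheWork()
{
    var options = new ParallelOptions { CancellationToken = _cToken };
    try
    {
        Parallel.For(0, _loopCount1, options, i =>
        {
            // do a few calculations here.
            Parallel.For(0, _loopCount2, options, j =>
            {
                // do a few calculations here.
                Parallel.For(0, _loopCount3, options, k =>
                {
                    // do some other process here.
                });
            });
        });
    }
    catch (OperationCanceledException)
    {
        // Cancelling _cSource stops all three loops.
    }
    catch (AggregateException ae) when (ae.InnerExceptions.All(e => e is OperationCanceledException)) // requires using System.Linq
    {
        // With nested loops the cancellation can also surface wrapped, depending on timing.
    }
}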
I would avoid using Parallel.For entirely and use Microsoft's Reactive Framework (NuGet "Rx-Main" & "Rx-WPF"). You can use it to neatly handle all of your parallel processing, and you can marshal results back to the UI thread.
Your code would look like this:
private IDisposable DoTheWork()
{
var query =
from i in Observable.Range(0, _loopCount1)
from x in Observable.Start(() => SomeCalculation1(i))
from j in Observable.Range(0, _loopCount2)
from y in Observable.Start(() => SomeCalculation2(i, j))
from k in Observable.Range(0, _loopCount3)
from z in Observable.Start(() => SomeCalculation3(i, j, k))
select new { x, y, z };
return
query
.ObserveOnDispatcher()
.Subscribe(w =>
{
/* Do something with w.x, w.y, w.z */
});
}
You would call it like this:
var subscription = DoTheWork();
And to cancel, you simply do this:
subscription.Dispose();
It's all multi-threaded, UI safe, and can be easily cancelled.

Why TPL Dataflow block.LinkTo does not give any output?

I am quite new to the topic of TPL Dataflow. I tested the following example from the book Concurrency in C#. I can't figure out why there's no output, which should be 2*2-2 = 2.
static void Main(string[] args)
{
//Task tt = test();
Task tt = test1();
Console.ReadLine();
}
static async Task test1()
{
try
{
var multiplyBlock = new TransformBlock<int, int>(item =>
{
if (item == 1)
throw new InvalidOperationException("Blech.");
return item * 2;
});
var subtractBlock = new TransformBlock<int, int>(item => item - 2);
multiplyBlock.LinkTo(subtractBlock,
new DataflowLinkOptions { PropagateCompletion = true });
multiplyBlock.Post(2);
await subtractBlock.Completion;
int temp = subtractBlock.Receive();
Console.WriteLine(temp);
}
catch (AggregateException e)
{
// The exception is caught here.
foreach (var v in e.InnerExceptions)
{
Console.WriteLine(v.Message);
}
}
}
Update 1: I tried another example. I still did not use Complete() on the block, but I thought that when the first block has completed, the result would be passed into the second block automatically.
private static async Task test3()
{
TransformManyBlock<int, int> tmb = new TransformManyBlock<int, int>((i) => { return new int[] {i, i + 1}; });
ActionBlock<int> ab = new ActionBlock<int>((i) => Console.WriteLine(i));
tmb.LinkTo(ab);
for (int i = 0; i < 4; i++)
{
tmb.Post(i);
}
//tmb.Complete();
await ab.Completion;
Console.WriteLine("Finished post");
}
This part of the code:
await subtractBlock.Completion;
int temp = subtractBlock.Receive();
is first (asynchronously) waiting for the subtraction block to complete, and then attempting to retrieve an output from the block.
There are two problems: the source block is never completed, and the code is attempting to retrieve output from a completed block. Once a block has completed, it will not produce any more data.
(I assume you're referring to the example in recipe 4.2, which will post 1, causing the exception, which completes the block in a faulted state).
So, you can fix this test by completing the source block (and the completion will propagate along the link to the subtractBlock automatically), and by reading the output before (asynchronously) waiting for subtractBlock to complete:
multiplyBlock.Complete();
int temp = subtractBlock.Receive();
await subtractBlock.Completion;
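The same applies to the test3 example in the update: the link there does not set PropagateCompletion and tmb is never completed, so await ab.Completion can never return. A sketch of the fix:
// Link with completion propagation so completing tmb also completes ab.
tmb.LinkTo(ab, new DataflowLinkOptions { PropagateCompletion = true });
for (int i = 0; i < 4; i++)
{
    tmb.Post(i);
}
tmb.Complete();      // signal that no more input is coming
await ab.Completion; // completes once all posted items have been processed
Console.WriteLine("Finished post");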

Pause and Resume Subscription on cold IObservable

Using Rx, I desire pause and resume functionality in the following code:
How can I implement Pause() and Resume()?
static IDisposable _subscription;
static void Main(string[] args)
{
Subscribe();
Thread.Sleep(500);
// Second value should not be shown after two seconds:
Pause();
Thread.Sleep(5000);
// Continue and show second value and beyond now:
Resume();
}
static void Subscribe()
{
var list = new List<int> { 1, 2, 3, 4, 5 };
var obs = list.ToObservable();
_subscription = obs.SubscribeOn(Scheduler.NewThread).Subscribe(p =>
{
Console.WriteLine(p.ToString());
Thread.Sleep(2000);
},
err => Console.WriteLine("Error"),
() => Console.WriteLine("Sequence Completed")
);
}
static void Pause()
{
// Pseudocode:
//_subscription.Pause();
}
static void Resume()
{
// Pseudocode:
//_subscription.Resume();
}
Rx Solution?
I believe I could make it work with some kind of Boolean field gating combined with thread locking (Monitor.Wait and Monitor.Pulse).
But is there an Rx operator or some other reactive shorthand to achieve the same aim?
Here's a reasonably simple Rx way to do what you want. I've created an extension method called Pausable that takes a source observable and a second observable of boolean that pauses or resumes the observable.
public static IObservable<T> Pausable<T>(
this IObservable<T> source,
IObservable<bool> pauser)
{
return Observable.Create<T>(o =>
{
var paused = new SerialDisposable();
var subscription = Observable.Publish(source, ps =>
{
var values = new ReplaySubject<T>();
Func<bool, IObservable<T>> switcher = b =>
{
if (b)
{
values.Dispose();
values = new ReplaySubject<T>();
paused.Disposable = ps.Subscribe(values);
return Observable.Empty<T>();
}
else
{
return values.Concat(ps);
}
};
return pauser.StartWith(false).DistinctUntilChanged()
.Select(p => switcher(p))
.Switch();
}).Subscribe(o);
return new CompositeDisposable(subscription, paused);
});
}
It can be used like this:
var xs = Observable.Generate(
0,
x => x < 100,
x => x + 1,
x => x,
x => TimeSpan.FromSeconds(0.1));
var bs = new Subject<bool>();
var pxs = xs.Pausable(bs);
pxs.Subscribe(x => { /* Do stuff */ });
Thread.Sleep(500);
bs.OnNext(true);
Thread.Sleep(5000);
bs.OnNext(false);
Thread.Sleep(500);
bs.OnNext(true);
Thread.Sleep(5000);
bs.OnNext(false);
It should be fairly easy for you to put this in your code with the Pause & Resume methods.
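For example, the Subscribe/Pause/Resume methods from the question could be wired up roughly like this (a sketch; note that in this implementation pushing true pauses and false resumes):
static readonly Subject<bool> _pauser = new Subject<bool>();
static void Subscribe()
{
    var list = new List<int> { 1, 2, 3, 4, 5 };
    _subscription = list.ToObservable()
        .Pausable(_pauser)                  // extension method from this answer
        .SubscribeOn(Scheduler.NewThread)
        .Subscribe(
            p => { Console.WriteLine(p); Thread.Sleep(2000); },
            err => Console.WriteLine("Error"),
            () => Console.WriteLine("Sequence Completed"));
}
static void Pause() => _pauser.OnNext(true);   // true = pause
static void Resume() => _pauser.OnNext(false); // false = resume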
Here it is as an application of IConnectableObservable that I corrected slightly for the newer API (original here):
public static class ObservableHelper {
public static IConnectableObservable<TSource> WhileResumable<TSource>(Func<bool> condition, IObservable<TSource> source) {
var buffer = new Queue<TSource>();
var subscriptionsCount = 0;
var isRunning = System.Reactive.Disposables.Disposable.Create(() => {
lock (buffer)
{
subscriptionsCount--;
}
});
var raw = Observable.Create<TSource>(subscriber => {
lock (buffer)
{
subscriptionsCount++;
if (subscriptionsCount == 1)
{
while (buffer.Count > 0) {
subscriber.OnNext(buffer.Dequeue());
}
Observable.While(() => subscriptionsCount > 0 && condition(), source)
.Subscribe(
v => { if (subscriptionsCount == 0) buffer.Enqueue(v); else subscriber.OnNext(v); },
e => subscriber.OnError(e),
() => { if (subscriptionsCount > 0) subscriber.OnCompleted(); }
);
}
}
return isRunning;
});
return raw.Publish();
}
}
Here is my answer. I believe there may be a race condition around pause/resume; however, this can be mitigated by serializing all activity onto a scheduler (favor serializing over synchronizing).
using System;
using System.Reactive.Concurrency;
using System.Reactive.Disposables;
using System.Reactive.Linq;
using System.Reactive.Subjects;
using Microsoft.Reactive.Testing;
using NUnit.Framework;
namespace StackOverflow.Tests.Q7620182_PauseResume
{
[TestFixture]
public class PauseAndResumeTests
{
[Test]
public void Should_pause_and_resume()
{
//Arrange
var scheduler = new TestScheduler();
var isRunningTrigger = new BehaviorSubject<bool>(true);
Action pause = () => isRunningTrigger.OnNext(false);
Action resume = () => isRunningTrigger.OnNext(true);
var source = scheduler.CreateHotObservable(
ReactiveTest.OnNext(0.1.Seconds(), 1),
ReactiveTest.OnNext(2.0.Seconds(), 2),
ReactiveTest.OnNext(4.0.Seconds(), 3),
ReactiveTest.OnNext(6.0.Seconds(), 4),
ReactiveTest.OnNext(8.0.Seconds(), 5));
scheduler.Schedule(TimeSpan.FromSeconds(0.5), () => { pause(); });
scheduler.Schedule(TimeSpan.FromSeconds(5.0), () => { resume(); });
//Act
var sut = Observable.Create<IObservable<int>>(o =>
{
var current = source.Replay();
var connection = new SerialDisposable();
connection.Disposable = current.Connect();
return isRunningTrigger
.DistinctUntilChanged()
.Select(isRunning =>
{
if (isRunning)
{
//Return the current replayed values.
return current;
}
else
{
//Disconnect and replace current.
current = source.Replay();
connection.Disposable = current.Connect();
//yield silence until the next time we resume.
return Observable.Never<int>();
}
})
.Subscribe(o);
}).Switch();
var observer = scheduler.CreateObserver<int>();
using (sut.Subscribe(observer))
{
scheduler.Start();
}
//Assert
var expected = new[]
{
ReactiveTest.OnNext(0.1.Seconds(), 1),
ReactiveTest.OnNext(5.0.Seconds(), 2),
ReactiveTest.OnNext(5.0.Seconds(), 3),
ReactiveTest.OnNext(6.0.Seconds(), 4),
ReactiveTest.OnNext(8.0.Seconds(), 5)
};
CollectionAssert.AreEqual(expected, observer.Messages);
}
}
}
It just works:
class SimpleWaitPulse
{
static readonly object _locker = new object();
static bool _go;
static void Main()
{ // The new thread will block
new Thread (Work).Start(); // because _go==false.
Console.ReadLine(); // Wait for user to hit Enter
lock (_locker) // Let's now wake up the thread by
{ // setting _go=true and pulsing.
_go = true;
Monitor.Pulse (_locker);
}
}
static void Work()
{
lock (_locker)
while (!_go)
Monitor.Wait (_locker); // Lock is released while we’re waiting
Console.WriteLine ("Woken!!!");
}
}
Please see How to Use Wait and Pulse for more details.
