My goal here is to spool all items/notifications coming from an IObservable<T> for future subscribers.
E.g. if someone subscribes to a message stream, they first receive all messages that arrived prior to the subscription, and then start receiving new messages as they come in. This should happen seamlessly, without repetitions or losses at the "boundary" between old and new messages.
I came up with the following extension method:
public static IObservable<T> WithHistory<T>(this IObservable<T> source)
{
    var accumulator = new BlockingCollection<T>();
    source.Subscribe(accumulator.Add);

    return accumulator
        .GetConsumingEnumerable()
        .ToObservable()
        .SubscribeOn(ThreadPoolScheduler.Instance);
}
As far as I've tested it, it works:
class Generator<T>
{
    event Action<T> onPush;

    public IObservable<T> Items =>
        Observable.FromEvent<T>(d => onPush += d, d => onPush -= d);

    public void Push(T item) => onPush?.Invoke(item);
}
...
private static void Main()
{
    var g = new Generator<int>();
    var ongoingItems = g.Items;
    var allItems = g.Items.WithHistory();

    g.Push(1);
    g.Push(2);

    ongoingItems.Subscribe(x => Console.WriteLine($"Ongoing: got {x}"));
    allItems.Subscribe(x => Console.WriteLine($"WithHistory: got {x}"));

    g.Push(3);
    g.Push(4);
    g.Push(5);

    Console.ReadLine();
}
The result:
Ongoing: got 3
Ongoing: got 4
Ongoing: got 5
WithHistory: got 1
WithHistory: got 2
WithHistory: got 3
WithHistory: got 4
WithHistory: got 5
However, using BlockingCollection<T> seems like overkill. The method above also does not support completion or error handling, and it would deadlock without .SubscribeOn(ThreadPoolScheduler.Instance).
Is there a better way to achieve this, without the described flaws?
The best way to do it is with .Replay():
void Main()
{
    var g = new Generator<int>();
    var ongoingItems = g.Items;
    var allItems = g.Items.Replay().RefCount();

    using (var tempSubscriber = allItems.Subscribe())
    {
        g.Push(1);
        g.Push(2);

        ongoingItems.Subscribe(x => Console.WriteLine($"Ongoing: got {x}"));
        allItems.Subscribe(x => Console.WriteLine($"WithHistory: got {x}"));

        g.Push(3);
        g.Push(4);
        g.Push(5);

        Console.ReadLine();
    }
}
.Replay().RefCount() produces an observable that will keep an internal queue for replaying, as long as there's a subscriber. If you have a persistent subscriber though (like your solution does in the WithHistory method), you have a memory leak. The best way to get around this is to have a temporary subscriber which automatically disconnects after you're no longer interested in the history.
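If you still want a WithHistory-style helper, one way to package the same idea is to hand the caller both the replayed stream and the priming subscription, so the primer can be disposed once the history is no longer needed. This is my own sketch, not part of the answer above; the tuple shape and the Primer name are arbitrary:
public static (IObservable<T> Stream, IDisposable Primer) WithHistory<T>(this IObservable<T> source)
{
    // Replay().RefCount() buffers items for replay while at least one subscription is alive.
    var shared = source.Replay().RefCount();

    // The primer subscription keeps the buffer populated; dispose it to stop buffering.
    var primer = shared.Subscribe();

    return (shared, primer);
}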
I have tried to write a console observable as in the example below, but it doesn't work. There are some issues with the subscriptions. How can I solve them?
static class Program
{
    static async Task Main(string[] args)
    {
        // var observable = Observable.Interval(TimeSpan.FromMilliseconds(1000)).Publish().RefCount(); // works
        // var observable = FromConsole().Publish().RefCount(); // doesn't work
        var observable = FromConsole(); // doesn't work

        observable.Subscribe(Console.WriteLine);
        await Task.Delay(1500);
        observable.Subscribe(Console.WriteLine);

        await new TaskCompletionSource().Task;
    }

    static IObservable<string> FromConsole()
    {
        return Observable.Create<string>(async observer =>
        {
            while (true)
            {
                observer.OnNext(Console.ReadLine());
            }
        });
    }
}
If I use Observable.Interval, it subscribes twice and I get two outputs for one input. If I use either version of FromConsole, I get one subscription and a blocked thread.
To start with, it is usually best to avoid using Observable.Create to create observables - it's certainly there for that purpose, but it can create observables that don't behave like you think they should because of their blocking nature. As you've discovered!
Instead, when possible, use the built-in operators to create observables. And that can be done in this case.
My version of FromConsole is this:
static IObservable<string> FromConsole() =>
    Observable
        .Defer(() =>
            Observable
                .Start(() => Console.ReadLine()))
        .Repeat();
Observable.Start effectively is like Task.Run for observables. It calls Console.ReadLine() for us without blocking.
The Observable.Defer/Repeat pair repeatedly calls Observable.Start(() => Console.ReadLine()). Without the Defer, Observable.Start would be called only once, and Repeat would just return that one string forever.
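A small sketch of that difference (my own illustration, not from the original answer; assume the usual System.Reactive.Linq usings):
// Without Defer: Start runs Console.ReadLine() once and caches the result,
// so Repeat just resubscribes to the same cached value forever.
var sameLineForever = Observable
    .Start(() => Console.ReadLine())
    .Repeat();

// With Defer: every repetition builds a fresh Start, so Console.ReadLine()
// runs again for each repeat.
var newLineEachTime = Observable
    .Defer(() => Observable.Start(() => Console.ReadLine()))
    .Repeat();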
That solves that.
Now, the second issue is that you want to see the value from the Console.ReadLine() output by both subscriptions to the FromConsole() observable.
Due to the way Console.ReadLine works, you are getting values from each subscription, but only one at a time. Try this code:
static async Task Main(string[] args)
{
    var observable = FromConsole();

    observable.Select(x => $"1:{x}").Subscribe(Console.WriteLine);
    observable.Select(x => $"2:{x}").Subscribe(Console.WriteLine);

    await new TaskCompletionSource<int>().Task;
}

static IObservable<string> FromConsole() =>
    Observable
        .Defer(() =>
            Observable
                .Start(() => Console.ReadLine()))
        .Repeat();
When I run that I get this kind of output:
1:ddfd
2:dfff
1:dfsdfs
2:sdffdfd
1:sdfsdfsdf
The reason for this is that each subscription starts up a fresh subscription to FromConsole. So you have two competing calls to Console.ReadLine(); they effectively queue, and each one only receives every other input. Hence the alternation between 1 & 2.
So, to solve this you simply need the .Publish().RefCount() operator pair.
Try this:
static async Task Main(string[] args)
{
    var observable = FromConsole().Publish().RefCount();

    observable.Select(x => $"1:{x}").Subscribe(Console.WriteLine);
    observable.Select(x => $"2:{x}").Subscribe(Console.WriteLine);

    await new TaskCompletionSource<int>().Task;
}

static IObservable<string> FromConsole() =>
    Observable
        .Defer(() =>
            Observable
                .Start(() => Console.ReadLine()))
        .Repeat();
I now get:
1:Hello
2:Hello
1:World
2:World
In a nutshell, it's the combination of the non-blocking FromConsole observable and the use of .Publish().RefCount() that makes this work the way you expect.
The problem is that Console.ReadLine is a blocking method, so the subscription to the FromConsole sequence blocks indefinitely and the await Task.Delay(1500); line is never reached. You can solve this problem by reading from the console asynchronously, offloading the blocking call to a ThreadPool thread:
static IObservable<string> FromConsole()
{
    return Observable.Create<string>(async observer =>
    {
        while (true)
        {
            observer.OnNext(await Task.Run(() => Console.ReadLine()));
        }
    });
}
You can take a look at this question about why there is no better solution than offloading.
As a side note, subscribing to a sequence without providing an onError handler is not a good idea, unless having the process crash with an unhandled exception is acceptable behavior for your app. It is especially problematic with sequences produced by Observable.Create<T> with an async delegate, because it can lead to weird/buggy behavior like this one: Async Create hanging while publishing observable.
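As a minimal illustration of that advice (my own sketch, reusing the FromConsole method above), pass an onError handler so a failure inside the sequence surfaces in your code instead of crashing the process:
FromConsole().Subscribe(
    line => Console.WriteLine(line),
    ex => Console.WriteLine($"Sequence failed: {ex}"), // onError: handle instead of crashing
    () => Console.WriteLine("Sequence completed"));    // onCompleted (optional)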
You need to return an observable without the Publish. You can then subscribe to it and take things from there. Here is an example; when I run it, I can read lines multiple times.
public class Program
{
    static void Main(string[] args)
    {
        FromConsole().Subscribe(x =>
        {
            Console.WriteLine(x);
        });
    }

    static IObservable<string> FromConsole()
    {
        return Observable.Create<string>(async observer =>
        {
            while (true)
            {
                observer.OnNext(Console.ReadLine());
            }
        });
    }
}
The purpose is to do some async work on a scarce resource in an Rx operator, Select for example. Issues arise when observable notifications come in at a faster rate than the time it takes for the async operation to complete.
I have actually solved the problem. My question is: what is the correct terminology for this particular kind of issue? Does it have a name? Is it backpressure? The research I have done so far indicates that this is some kind of pressure problem, but not necessarily backpressure as I understand it. The most relevant resources I found are these:
https://github.com/ReactiveX/RxJava/wiki/Backpressure-(2.0)
http://reactivex.io/documentation/operators/backpressure.html
Now to the actual code. Suppose there is a scarce resource and its consumer. In this case an exception is thrown when the resource is already in use. Please note that this code should not be changed.
public class ScarceResource
{
    private static bool inUse = false;

    public async Task<int> AccessResource()
    {
        if (inUse) throw new Exception("Resource is already in use");

        var result = await Task.Run(() =>
        {
            inUse = true;
            Random random = new Random();
            Thread.Sleep(random.Next(1, 2) * 1000);
            inUse = false;
            return random.Next(1, 10);
        });

        return result;
    }
}

public class ResourceConsumer
{
    public IObservable<int> DoWork()
    {
        var resource = new ScarceResource();
        return resource.AccessResource().ToObservable();
    }
}
Now here is the problem with a naive implementation that consumes the resource. An error is thrown because notifications come in faster than the consumer can process them.
private static void RunIntoIssue()
{
    var numbers = Enumerable.Range(1, 10);

    var observableSequence = numbers
        .ToObservable()
        .SelectMany(n =>
        {
            Console.WriteLine("In observable: {0}", n);
            var resourceConsumer = new ResourceConsumer();
            return resourceConsumer.DoWork();
        });

    observableSequence.Subscribe(n => Console.WriteLine("In observer: {0}", n));
}
With the following code the problem is solved. I slow down processing by using a completed BehaviorSubject in conjunction with the Zip operator. Essentially what this code does is to take a sequential approach instead of a parallel one.
private static void RunWithZip()
{
    var completed = new BehaviorSubject<bool>(true);
    var numbers = Enumerable.Range(1, 10);

    var observableSequence = numbers
        .ToObservable()
        .Zip(completed, (n, c) =>
        {
            Console.WriteLine("In observable: {0}, completed: {1}", n, c);
            var resourceConsumer = new ResourceConsumer();
            return resourceConsumer.DoWork();
        })
        .Switch()
        .Select(n =>
        {
            completed.OnNext(true);
            return n;
        });

    observableSequence.Subscribe(n => Console.WriteLine("In observer: {0}", n));
    Console.Read();
}
Question
Is this backpressure, and if not, is there another term for it?
You're basically implementing a form of locking, or a mutex. Your code can cause backpressure; it's not really handling it.
Imagine if your source wasn't a generator function, but rather a series of data pushes arriving at a constant rate of one every millisecond. It takes you 10 milliseconds to process each one, and your code forces serial processing. This causes backpressure: Zip will queue up the unprocessed data pushes indefinitely until you run out of memory.
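For comparison, here is a hedged sketch (mine, not from the question) of how the same serialization is often written without the Zip/BehaviorSubject feedback loop: wrap each call in Observable.Defer and Concat the inner observables, so the next AccessResource call starts only after the previous one completes. Values that arrive faster than they are processed still queue up inside Concat, so this bounds concurrency, not memory:
var observableSequence = Enumerable.Range(1, 10)
    .ToObservable()
    .Select(n => Observable.Defer(() =>
    {
        Console.WriteLine("In observable: {0}", n);
        return new ResourceConsumer().DoWork();
    }))
    .Concat(); // subscribes to the next inner observable only after the previous one completes

observableSequence.Subscribe(n => Console.WriteLine("In observer: {0}", n));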
I have a remote program that sends an updated measurement every 10 milliseconds over a socket connection. In my client program I have wrapped this socket in an observable that generates these measurements. For my use case it's important that the measurements arrive at 10 millisecond intervals. Of course this is not happening, since network delays make each message arrive a little earlier or later.
So basically, what I have on my remote PC is a program that sends this on a socket connection:
-- is 10 milliseconds
o--o--o--o--o--o--o--o--o--o--...
Which becomes this on my client due to network delays.
o-o---o-o--o---o--o-o--o-o-...
Now, in my observable I want to "normalise" this so it will again emit a value every 10 milliseconds.
--o--o--o--o--o--o--o--o--o--o...
Of course this means I will have to introduce a buffer time during which values are stored before being emitted at 10 millisecond intervals. Is there a way I can accomplish this?
Here is some test code that emits events in the way I described above.
using System;
using System.Collections.Generic;
using System.Reactive.Disposables;
using System.Reactive.Linq;
using System.Threading.Tasks;
using Microsoft.Reactive.Testing;

public class Program
{
    protected static event EventHandler<EventArgs> CancelEvent;

    private static Random random = new Random();

    private static double GetRandomNumber(double minimum, double maximum)
    {
        return random.NextDouble() * (maximum - minimum) + minimum;
    }

    public static void Main()
    {
        var completed = false;
        var scheduler = new TestScheduler();

        var observable = Observable
            .Interval(TimeSpan.FromMilliseconds(7.0), scheduler)
            .SelectMany(e => Observable
                .Return(e, scheduler)
                .Delay(TimeSpan.FromMilliseconds(GetRandomNumber(0.0, 6.0)), scheduler)
            )
            .TimeInterval(scheduler)
            .Select(t => t.Interval.Milliseconds);

        var fromEvent = Observable.FromEventPattern<EventArgs>(
            p => CancelEvent += p,
            p => CancelEvent -= p,
            scheduler
        );

        var cancellable = observable.TakeUntil(fromEvent);

        var results = new List<int>();
        using (cancellable.Subscribe(
            results.Add,
            e => { throw new Exception("No exception is planned! {0}", e); },
            () => { completed = true; }))
        {
            scheduler.AdvanceBy(TimeSpan.FromSeconds(3.5).Ticks);
            CancelEvent(null, new EventArgs());
            scheduler.AdvanceBy(TimeSpan.FromSeconds(3).Ticks);
        }

        Console.WriteLine("Have I completed indeed? {0}", completed);
        Console.WriteLine("What emit time deltas been registered before cancellation?\n\t{0}", string.Join("ms\n\t", results));
    }
}
This is theoretically similar to A way to push buffered events in even intervals.
That solution would look like this:
var source = new Subject<double>();
var bufferTime = TimeSpan.FromMilliseconds(100);

var normalizedSource = source
    .Delay(bufferTime)
    .Drain(x => Observable.Empty<int>().Delay(TimeSpan.FromMilliseconds(10)));
...with Drain defined as follows:
public static class ObservableDrainExtensions
{
    public static IObservable<TOut> Drain<TSource, TOut>(
        this IObservable<TSource> source,
        Func<TSource, IObservable<TOut>> selector)
    {
        return Observable.Defer(() =>
        {
            BehaviorSubject<Unit> queue = new BehaviorSubject<Unit>(new Unit());

            return source
                .Zip(queue, (v, q) => v)
                .SelectMany(v => selector(v)
                    .Do(_ => { }, () => queue.OnNext(new Unit()))
                );
        });
    }
}
However, I think you're going to run into problems with the 10 millisecond requirement. That's too small an interval to schedule reliably. If I remember correctly, any delay of less than 15 ms is ignored by the schedulers and fired immediately. Given that, even if you use a larger interval (I tried 100 ms), you're going to get some variance thanks to OS context switching, etc.
I'm trying to create an Rx operator that seems pretty useful, but surprisingly I've not found any questions on Stack Overflow that match precisely. I'd like to create a variation on Throttle that lets values through immediately if there has been a period of inactivity. My imagined use case is something like this:
I have a dropdown that kicks off a web request when the value is changed. If the user holds down the arrow key and cycles rapidly through the values, I don't want to kick off a request for each value. But if I throttle the stream then the user has to wait out the throttle duration every time they just select a value from the dropdown in the normal manner.
So whereas a normal Throttle looks like this:
I want to create a ThrottleSubsequent that looks like this:
Note that marbles 1, 2, and 6 are passed through without delay because they each follow a period of inactivity.
My attempt at this looks like the following:
public static IObservable<TSource> ThrottleSubsequent<TSource>(this IObservable<TSource> source, TimeSpan dueTime, IScheduler scheduler)
{
    // Create a timer that resets with each new source value
    var cooldownTimer = source
        .Select(x => Observable.Interval(dueTime, scheduler)) // Each source value becomes a new timer
        .Switch();                                            // Switch to the most recent timer

    var cooldownWindow = source.Window(() => cooldownTimer);

    // Pass along the first value of each cooldown window immediately
    var firstAfterCooldown = cooldownWindow.SelectMany(o => o.Take(1));

    // Throttle the rest of the values
    var throttledRest = cooldownWindow
        .SelectMany(o => o.Skip(1))
        .Throttle(dueTime, scheduler);

    return Observable.Merge(firstAfterCooldown, throttledRest);
}
This seems to work, but I'm having a difficult time reasoning about this, and I get the feeling there are some edge cases here where things might get screwy with duplicate values or something. I'd like to get some feedback from more experienced Rx-ers as to whether or not this code is correct, and/or whether there is a more idiomatic way of doing this.
Well, here's a test suite (using nuget Microsoft.Reactive.Testing):
var ts = new TestScheduler();

var source = ts.CreateHotObservable<char>(
    new Recorded<Notification<char>>(200.MsTicks(), Notification.CreateOnNext('A')),
    new Recorded<Notification<char>>(300.MsTicks(), Notification.CreateOnNext('B')),
    new Recorded<Notification<char>>(500.MsTicks(), Notification.CreateOnNext('C')),
    new Recorded<Notification<char>>(510.MsTicks(), Notification.CreateOnNext('D')),
    new Recorded<Notification<char>>(550.MsTicks(), Notification.CreateOnNext('E')),
    new Recorded<Notification<char>>(610.MsTicks(), Notification.CreateOnNext('F')),
    new Recorded<Notification<char>>(760.MsTicks(), Notification.CreateOnNext('G'))
);

var target = source.ThrottleSubsequent(TimeSpan.FromMilliseconds(150), ts);

var expectedResults = ts.CreateHotObservable<char>(
    new Recorded<Notification<char>>(200.MsTicks(), Notification.CreateOnNext('A')),
    new Recorded<Notification<char>>(450.MsTicks(), Notification.CreateOnNext('B')),
    new Recorded<Notification<char>>(500.MsTicks(), Notification.CreateOnNext('C')),
    new Recorded<Notification<char>>(910.MsTicks(), Notification.CreateOnNext('G'))
);

var observer = ts.CreateObserver<char>();
target.Subscribe(observer);
ts.Start();

ReactiveAssert.AreElementsEqual(expectedResults.Messages, observer.Messages);
and using
public static class TestingHelpers
{
    public static long MsTicks(this int i)
    {
        return TimeSpan.FromMilliseconds(i).Ticks;
    }
}
Seems to pass. If you wanted to reduce it, you could turn it into this:
public static IObservable<TSource> ThrottleSubsequent2<TSource>(this IObservable<TSource> source, TimeSpan dueTime, IScheduler scheduler)
{
    return source
        .Publish(_source => _source
            .Window(() => _source
                .Select(x => Observable.Interval(dueTime, scheduler))
                .Switch()
            ))
        .Publish(cooldownWindow =>
            Observable.Merge(
                cooldownWindow
                    .SelectMany(o => o.Take(1)),
                cooldownWindow
                    .SelectMany(o => o.Skip(1))
                    .Throttle(dueTime, scheduler)
            )
        );
}
EDIT:
Publish forces sharing of a subscription. If you have a bad (or expensive) source observable with subscription side-effects, Publish makes sure you only subscribe once. Here's an example where Publish helps:
void Main()
{
    var source = UglyRange(10);

    var target = source
        .SelectMany(i => Observable.Return(i).Delay(TimeSpan.FromMilliseconds(10 * i)))
        .ThrottleSubsequent2(TimeSpan.FromMilliseconds(70), Scheduler.Default) // Works with ThrottleSubsequent2, fails with ThrottleSubsequent
        .Subscribe(i => Console.WriteLine(i));
}

static int counter = 0;

public IObservable<int> UglyRange(int limit)
{
    var uglySource = Observable.Create<int>(o =>
    {
        if (counter++ == 0)
        {
            Console.WriteLine("Ugly observable should only be created once.");
            Enumerable.Range(1, limit).ToList().ForEach(i => o.OnNext(i));
        }
        else
        {
            Console.WriteLine($"Ugly observable should only be created once. This is the {counter}th time created.");
            o.OnError(new Exception($"observable invoked {counter} times."));
        }
        return Disposable.Empty;
    });

    return uglySource;
}
I am new to Rx. I want to know if it is possible to dispatch a message to different subscribers such that they run on different threads. How can an IObservable control this? The plain Subject implementation, as I understand it, calls the subscribers one after the other on a single thread.
public class Subscriber : IObserver<int>
{
    public void OnNext(int a)
    {
        // Do something
    }

    public void OnError(Exception e)
    {
        // Do something
    }

    public void OnCompleted()
    {
    }
}
public static class Program
{
    public static void Main()
    {
        var observable = new <....SomeClass....>();

        var sub1 = new Subscriber();
        var sub2 = new Subscriber();

        observable.Subscribe(sub1);
        observable.Subscribe(sub2);

        // some waiting function
    }
}
If I use Subject as 'SomeClass', then sub2's OnNext() will not be called until sub1's OnNext() has completed. If sub1 takes a lot of time, I don't want it to delay sub2's reception. Can someone tell me how Rx allows this kind of implementation for SomeClass?
The code you have written is almost there for running the observable in parallel. If you write your observer like this:
public class Subscriber : IObserver<int>
{
    public void OnNext(int a)
    {
        Console.WriteLine("{0} on {1} at {2}",
            a,
            Thread.CurrentThread.ManagedThreadId,
            DateTime.Now.ToString());
    }

    public void OnError(Exception e)
    { }

    public void OnCompleted()
    { }
}
Then running this code:
var observable =
    Observable
        .Interval(TimeSpan.FromSeconds(1.0))
        .Select(x => (int)x)
        .Take(5)
        .ObserveOn(Scheduler.ThreadPool);

var sub1 = new Subscriber();
var sub2 = new Subscriber();

observable.Subscribe(sub1);
observable.Subscribe(sub2);

Thread.Sleep(10000);
Will produce the following:
0 on 28 at 2011/10/20 00:13:49
0 on 16 at 2011/10/20 00:13:49
1 on 29 at 2011/10/20 00:13:50
1 on 22 at 2011/10/20 00:13:50
2 on 27 at 2011/10/20 00:13:51
2 on 29 at 2011/10/20 00:13:51
3 on 27 at 2011/10/20 00:13:52
3 on 19 at 2011/10/20 00:13:52
4 on 27 at 2011/10/20 00:13:53
4 on 27 at 2011/10/20 00:13:53
It's already running the subscriptions in parallel on different threads.
The important thing that I used was the .ObserveOn extension method - that's what made this work.
You should keep in mind that observers don't generally share the same instance of observables. Subscribing to an observable effectively wires up a unique "chain" of observable operators from the source of the observable to the observer. This is much the same as calling GetEnumerator twice on an enumerable, you will not share the same enumerator instance, you will get two unique instances.
Now, I want to describe what I mean by a chain. I'm going to give the Reflector.NET extracted code from Observable.Generate & Observable.Where to illustrate the point.
Take this code for example:
var xs = Observable.Generate(0, x => x < 10, x => x + 1, x => x);
var ys = xs.Where(x => x % 2 == 0);
ys.Subscribe(y => { /* produces 0, 2, 4, 6, 8 */ });
Under the hood both Generate & Where each create a new instance of the internal Rx class AnonymousObservable<T>. The constructor for AnonymousObservable<T> takes a Func<IObserver<T>, IDisposable> delegate which it uses whenever it receives a call to Subscribe.
The slightly cleaned up code for Observable.Generate<T>(...) from Reflector.NET is:
public static IObservable<TResult> Generate<TState, TResult>(
    TState initialState,
    Func<TState, bool> condition,
    Func<TState, TState> iterate,
    Func<TState, TResult> resultSelector,
    IScheduler scheduler)
{
    return new AnonymousObservable<TResult>((IObserver<TResult> observer) =>
    {
        TState state = initialState;
        bool first = true;

        return scheduler.Schedule((Action self) =>
        {
            bool flag = false;
            TResult local = default(TResult);
            try
            {
                if (first)
                {
                    first = false;
                }
                else
                {
                    state = iterate(state);
                }
                flag = condition(state);
                if (flag)
                {
                    local = resultSelector(state);
                }
            }
            catch (Exception exception)
            {
                observer.OnError(exception);
                return;
            }
            if (flag)
            {
                observer.OnNext(local);
                self();
            }
            else
            {
                observer.OnCompleted();
            }
        });
    });
}
The Action self parameter is a recursive call that iterates the output values. You'll notice that nowhere in this code does the observer get stored, nor do the values get passed to more than one observer. This code runs once for each new observer.
The slightly cleaned up code for Observable.Where<T>(...) from Reflector.NET is:
public static IObservable<TSource> Where<TSource>(
    this IObservable<TSource> source,
    Func<TSource, bool> predicate)
{
    return new AnonymousObservable<TSource>(observer =>
        source.Subscribe(x =>
        {
            bool flag;
            try
            {
                flag = predicate(x);
            }
            catch (Exception exception)
            {
                observer.OnError(exception);
                return;
            }
            if (flag)
            {
                observer.OnNext(x);
            }
        }, ex => observer.OnError(ex), () => observer.OnCompleted()));
}
Again this code doesn't track multiple observers. It calls Subscribe effectively passing its own code as the observer to the underlying source observable.
You should see that, in my example code above, subscribing to Where creates a subscription to Generate and hence this is a chain of observables. In fact it's chaining subscribe calls on a series of AnonymousObservable objects.
If you have two subscriptions you have two chains. If you have 1,000 subscriptions you have 1,000 chains.
Now, just as a side note - even though there are IObservable<T> and IObserver<T> interfaces - you should very very rarely actually implement these in your own classes. The built-in classes and operators handle 99.99% of all cases. It's a bit like IEnumerable<T> - how often do you need to implement this interface yourself?
Let me know if this helps and if you need any further explanation.
If you have an IObservable and you need its notifications to be delivered on a different thread, you can use the ObserveOn operator.
If you run the code below, it forces the number generator's notifications to be observed in different thread contexts. You can also use the EventLoopScheduler and specify the System.Threading.Thread you want to use, set its priority, set its name, etc.
void Main()
{
    var numbers = Observable.Interval(TimeSpan.FromMilliseconds(100));

    var disposable = new CompositeDisposable()
    {
        numbers.ObserveOn(Scheduler.TaskPool).Subscribe(x => Console.WriteLine("TaskPool: " + Thread.CurrentThread.ManagedThreadId)),
        numbers.ObserveOn(Scheduler.ThreadPool).Subscribe(x => Console.WriteLine("ThreadPool: " + Thread.CurrentThread.ManagedThreadId)),
        numbers.ObserveOn(Scheduler.Immediate).Subscribe(x => Console.WriteLine("Immediate: " + Thread.CurrentThread.ManagedThreadId))
    };

    Thread.Sleep(1000);
    disposable.Dispose();
}
Output
Immediate: 10
ThreadPool: 4
TaskPool: 20
TaskPool: 4
ThreadPool: 24
Immediate: 27
Immediate: 10
TaskPool: 24
ThreadPool: 27
Immediate: 24
TaskPool: 26
ThreadPool: 20
Immediate: 26
ThreadPool: 24
TaskPool: 27
Immediate: 28
ThreadPool: 27
TaskPool: 26
Immediate: 10
Note how I used the CompositeDisposable to dispose of all subscriptions at the end. If you don't do this (in LINQPad, for example), the Observable.Interval will continue to run in memory until you kill the process.
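For the EventLoopScheduler option mentioned above, here is a minimal sketch (my own, not from the original answer; the thread name and priority are arbitrary) that delivers all notifications of one subscription on a single, dedicated background thread:
var eventLoop = new EventLoopScheduler(start => new Thread(start)
{
    Name = "Rx-EventLoop",
    IsBackground = true,
    Priority = ThreadPriority.BelowNormal
});

var subscription = Observable
    .Interval(TimeSpan.FromMilliseconds(100))
    .ObserveOn(eventLoop)
    .Subscribe(x => Console.WriteLine("EventLoop: " + Thread.CurrentThread.Name));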