Code works but seems to run synchronously not asynchronously - c#

static void Main(string[] args)
{
    // do async method for each stock in the list
    Task<IEnumerable<string>> symbols = Helper.getStockSymbols("amex", 0);
    List<Task> tasks = new List<Task>();
    try
    {
        for (int i = 0; i < symbols.Result.Count(); i++)
        {
            if (i < symbols.Result.Count())
            {
                string symbol = symbols.Result.ElementAtOrDefault(i);
                Task t = Task.Run(() => getCalculationsDataAsync(symbol, "amex"));
                tasks.Add(t);
                Task e = t.ContinueWith((d) => getThreadStatus(tasks));
            }
        }
        // don't exit until they choose to
        while (args.FirstOrDefault() != "exit")
        {
            // do nothing
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
}

public static void getThreadStatus(List<Task> taskList)
{
    int count = 0;
    foreach (Task t in taskList)
    {
        if (t.Status == TaskStatus.Running)
        {
            count += 1;
        }
    }
    Console.WriteLine(count + " threads are running.");
}

public static async Task getCalculationsDataAsync(string symbol, string market)
{
    // do calculation here
}
What I'm trying to do in my code is run a new task for each stock in my list and run them all at the same time. I have a 4-core processor, and I believe that means I can only run 4 tasks at once. I tried testing how many tasks were running by inserting the ContinueWith call that you see in my code, which should tell me how many tasks are running. When I run this code, it tells me that 0 tasks are running, so my questions are:
How can I complete my objective by running these tasks at the same exact time?
Why is it telling me that 0 tasks are running? I can only assume this is because the current task is finished and it hasn't started a new one yet if the tasks are running one after the other.

I am not sure why you are seeing no tasks running. Are you sure your Task.Run() is being hit? That is to say, is i < symbols.Result.Count() satisfied?
Regardless, let us try to achieve what you want. Firstly, no, it is not correct to say that because your machine has four physical cores/threads, you can use a maximum of four threads. These are not the same thing. A Google search on this subject will bring a plethora of information your way.
Basically starting a thread the way you have will start background threads on a thread pool, and this thread pool can hold/govern numerous threads see Threading in C# by J. Albahari for more information. Whenever you start a thread, a few hundred microseconds are spent organizing such things as a fresh private local variable stack. Each thread also consumes (by default) around 1 MB of memory. The thread pool cuts these overheads by sharing and recycling threads, allowing multithreading to be applied at a very granular level without a performance penalty. This is useful when leveraging multicore processors to execute computationally intensive code in parallel in “divide-and-conquer” style.
The thread pool also keeps a lid on the total number of worker threads it will run simultaneously. Too many active threads throttle the operating system with administrative burden and render CPU caches ineffective. Once a limit is reached, jobs queue up and start only when another finishes.
Okay, now for your code. Let us say we have a list of stock types
List<string> types = new List<string>() { "AMEX", "AMEC", "BP" };
To dispatch multiple thread pool threads to do work for each of these (not using async/await), you could do something like
foreach (string t in types)
{
    Task.Factory.StartNew(() => DoSomeCalculationForType(t));
}
This will fire off three background thread pool threads and is non-blocking, so this code will return to the caller almost immediately.
If you want to set up post processing you can do this via continuations and continuation chaining. This is all described in the Albahari link I provided above.
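For example, a minimal continuation sketch (reusing the hypothetical DoSomeCalculationForType from above; the messages are illustrative only) might look like this:
Task calc = Task.Factory.StartNew(() => DoSomeCalculationForType("AMEX"));
Task followUp = calc.ContinueWith(t =>
{
    // Runs once the antecedent task has finished (successfully or not).
    Console.WriteLine("AMEX calculation finished with status: " + t.Status);
});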
I hope this helps.
--
Edit, to address the comments:
Beginning with the .NET Framework version 4, the default size of the thread pool for a process depends on several factors, such as the size of the virtual address space. A process can call the ThreadPool.GetMaxThreads method to determine the maximum number of threads.
However, there's something else at play: the thread pool doesn't immediately create new threads in all situations. In order to cope with bursts of small tasks, it limits how quickly it creates new threads. IIRC, it will create one thread every 0.5 seconds if there are outstanding tasks, up to the maximum number of threads. I can't immediately see that figure documented though, so it may well change. I strongly suspect that's what you're seeing though. Try queuing a lot of items and then monitor the number of threads over time.
Basically, let the thread pool dispatch what it wants; its optimizer will do the best job for your circumstances.
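If you want to watch that ramp-up yourself, here is a rough sketch of the experiment suggested above (queue plenty of small blocking items, then sample the busy worker count once per second; the counts and timings are arbitrary):
// Queue a burst of short, blocking work items.
for (int i = 0; i < 200; i++)
{
    Task.Factory.StartNew(() => Thread.Sleep(2000));
}
// Sample how many pool worker threads are busy, once per second.
for (int s = 0; s < 10; s++)
{
    Thread.Sleep(1000);
    int maxWorkers, maxIo, freeWorkers, freeIo;
    ThreadPool.GetMaxThreads(out maxWorkers, out maxIo);
    ThreadPool.GetAvailableThreads(out freeWorkers, out freeIo);
    Console.WriteLine("Busy worker threads: " + (maxWorkers - freeWorkers));
}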

I think the problem is that the tasks have a status of Running either very briefly, or can skip that status altogether and go straight from WaitingForActivation to RanToCompletion.
I modified your program slightly, and I can see the tasks starting and completing, but they don't appear to be in the Running state whenever I check.
static void Main(string[] args)
{
    // do async method for each stock in the list
    Task<IEnumerable<string>> symbols = Task.FromResult(Enumerable.Range(1, 5).Select(e => e.ToString()));
    List<Task> tasks = new List<Task>();
    try
    {
        for (int i = 0; i < symbols.Result.Count(); i++)
        {
            string symbol = symbols.Result.ElementAtOrDefault(i);
            Task t = getCalculationsDataAsync(symbol, "amex", tasks);
            tasks.Add(t);
        }
        Console.WriteLine("Tasks Count:" + tasks.Count());
        Task.WaitAll(tasks.ToArray());
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
}

public static void getThreadStatus(List<Task> taskList)
{
    int count = 0;
    foreach (Task t in taskList)
    {
        Console.WriteLine("Status " + t.Status);
        if (t.Status == TaskStatus.Running)
        {
            count += 1;
            Console.WriteLine("A task is running");
        }
    }
    //Console.WriteLine(count + " threads are running.");
}

public static async Task getCalculationsDataAsync(string symbol, string market, List<Task> tasks)
{
    Console.WriteLine("Starting task");
    var delay = new Random((int)DateTime.Now.Ticks).Next(5000);
    Console.WriteLine("Delay:" + delay);
    await Task.Delay(delay);
    Console.WriteLine("Finished task");
    getThreadStatus(tasks);
}
Output
Starting task
Delay:1784
Starting task
Delay:2906
Starting task
Delay:2906
Starting task
Delay:2906
Starting task
Delay:2906
Tasks Count:5
Finished task
Status WaitingForActivation
Status WaitingForActivation
Status WaitingForActivation
Status WaitingForActivation
Status WaitingForActivation
Finished task
Finished task
Finished task
Status RanToCompletion
Status RanToCompletion
Status WaitingForActivation
Status WaitingForActivation
Status RanToCompletion
Status WaitingForActivation
Status WaitingForActivation
Status WaitingForActivation
Status WaitingForActivation
Status WaitingForActivation
Status WaitingForActivation
Status WaitingForActivation
Status WaitingForActivation
Status WaitingForActivation
Status WaitingForActivation
Finished task
Status RanToCompletion
Status RanToCompletion
Status RanToCompletion
Status RanToCompletion
Status WaitingForActivation

As I describe on my blog, there are two kinds of tasks: Delegate Tasks and Promise Tasks. Only Delegate Tasks ever actually run. Promise Tasks only complete. The task returned by Task.Run is a Promise Task, not a Delegate Task; this is necessary so that Task.Run can understand async code and only complete when the async code is completed.
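To make the distinction concrete, here is a small illustration of my own (not part of the question's code): a delegate task created from a synchronous lambda can be observed in the Running state, while a promise task such as Task.Delay never is.
Task delegateTask = Task.Factory.StartNew(() => Thread.Sleep(1000)); // delegate task: code actually runs on a thread
Task promiseTask = Task.Delay(1000);                                 // promise task: nothing runs, it just completes later
Thread.Sleep(100);
Console.WriteLine(delegateTask.Status); // typically Running
Console.WriteLine(promiseTask.Status);  // WaitingForActivation; never Running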
So, checking for TaskStatus.Running is not going to work the way you want. Instead, you can create a counter:
private static int _count;

static void Main(string[] args)
{
    ...
    for (int i = 0; i < symbols.Result.Count(); i++)
    {
        if (i < symbols.Result.Count())
        {
            string symbol = symbols.Result.ElementAtOrDefault(i);
            Task t = Task.Run(() => getCalculationsDataWithCountAsync(symbol, "amex"));
            tasks.Add(t);
        }
    }
    ...
}

public static async Task getCalculationsDataWithCountAsync(string symbol, string market)
{
    Console.WriteLine(Interlocked.Increment(ref _count) + " threads are running.");
    try
    {
        await getCalculationsDataAsync(symbol, market);
    }
    finally
    {
        Console.WriteLine(Interlocked.Decrement(ref _count) + " threads are running.");
    }
}
Note that I used a separate async "wrapper" method instead of messing with ContinueWith. Code using await instead of ContinueWith more correctly handles a number of edge cases, and IMO is clearer to read.
Also, remember that async and await free up threads while they're "awaiting." So, you can potentially have hundreds of (asynchronous) tasks going simultaneously; this does not imply that there are that many threads.
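To illustrate that last point, here is a small stand-alone sketch (not taken from the question's code; the counts are arbitrary and it assumes the usual System.Linq and System.Collections.Concurrent usings) that starts a thousand awaiting tasks and records how many distinct pool threads they actually touch:
static async Task CountThreadsAsync()
{
    var threadIds = new ConcurrentDictionary<int, bool>();
    var tasks = Enumerable.Range(0, 1000).Select(async _ =>
    {
        await Task.Delay(1000); // no thread is held while this await is pending
        threadIds.TryAdd(Thread.CurrentThread.ManagedThreadId, true);
    });
    await Task.WhenAll(tasks);
    Console.WriteLine("1000 tasks completed using " + threadIds.Count + " distinct threads.");
}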


Why do I seem to have so few threads

I am trying to understand some code (for performance reasons) that is processing tasks from a queue. The code is C# on .NET Framework 4.8 (and I didn't write this stuff).
I have this code creating a timer that, from what I can tell, should use a new thread every 10 seconds:
_myTimer = new Timer(new TimerCallback(OnTimerGo), null, 0, 10000 );
Inside OnTimerGo it calls DoTask(); inside DoTask() it grabs a task off a queue and then does this:
System.Threading.Tasks.Task.Factory.StartNew(ProcessTask, task).ContinueWith(c => DoTask());
My reading of this is that a new thread should start running OnTimerGo every 10 seconds, and that thread should in parallel run ProcessTask on tasks as fast as it can get them from the queue.
I inserted some code to call ThreadPool.GetMaxThreads and ThreadPool.GetAvailableThreads to figure out how many threads were in use. Then I queued up 10,000 things for it to do and let it loose.
I never see more than 4 threads in use at a time. This is running on a c4.4xlarge EC2 instance... so 16 vCPUs and 30 GB of memory. GetMaxThreads and GetAvailableThreads both return over 2k, so I would expect more threads. By looking at the logging I can see that a total of 50-ish different threads (by thread id) end up doing the work over the course of 20 minutes. Since the timer fires every 10 seconds, I would expect 100 threads to be doing the work (or for it to finish sooner).
Looking at the code, the only time a running thread should stop is if it asks for a task from the queue and doesn't get one. Some other logging shows that there are never more than 2 tasks running in a thread. This is probably because the work is pretty fast. So the threads shouldn't be exiting, and I can even see from the logs that many of them end up doing as many as 500 tasks over the 20 minutes.
So... what am I missing here? Are ThreadPool.GetMaxThreads and ThreadPool.GetAvailableThreads not accurate if run from inside a thread? Is something shutting down some of the threads while letting others keep going?
EDIT: adding more code
public static void StartScheduler()
{
    lock (TimerLock)
    {
        if (_timerShutdown == false)
        {
            _myTimer = new Timer(new TimerCallback(OnTimerGo), null, 0, 10);
            const int numberOfSecondsPerMinute = 60;
            const int margin = 1;
            var pollEventsPerMinute = (numberOfSecondsPerMinute / SystemPreferences.TaskPollingIntervalSeconds);
            _numberOfTimerCallsForHeartbeat = pollEventsPerMinute - margin;
        }
    }
}

private static void OnTimerGo(object state)
{
    try
    {
        _lastTimer = DateTime.UtcNow;
        var currentTickCount = Interlocked.Increment(ref _timerCallCount);
        if (currentTickCount == _numberOfTimerCallsForHeartbeat)
        {
            Interlocked.Exchange(ref _timerCallCount, 0);
            MonitoringTools.SendHeartbeatMetric(Heartbeat);
        }
        CheckForTasks();
    }
    catch (Exception e)
    {
        Log.Warn("Scheduler: OnTimerGo exception", e);
    }
}

public static void CheckForTasks()
{
    try
    {
        if (DoTask())
            _lastStart = DateTime.UtcNow;
        _lastStartOrCheck = DateTime.UtcNow;
    }
    catch (Exception e)
    {
        Log.Error("Unexpected exception checking for tasks", e);
    }
}

private static bool DoTask()
{
    Func<DataContext, bool> a = db =>
    {
        var mtid = Thread.CurrentThread.ManagedThreadId;
        int totalThreads = Process.GetCurrentProcess().Threads.Count;
        int maxWorkerThreads;
        int maxPortThreads;
        ThreadPool.GetMaxThreads(out maxWorkerThreads, out maxPortThreads);
        int AvailableWorkerThreads;
        int AvailablePortThreads;
        ThreadPool.GetAvailableThreads(out AvailableWorkerThreads, out AvailablePortThreads);
        int usedWorkerThreads = maxWorkerThreads - AvailableWorkerThreads;
        string usedThreadMessage = $"Thread {mtid}: Threads in Use count: {usedWorkerThreads}";
        Log.Info(usedThreadMessage);
        var taskTypeAndTasks = GetTaskListTypeAndTasks();
        var task = GetNextTask(db, taskTypeAndTasks.Key, taskTypeAndTasks.Value);
        if (_timerShutdown)
        {
            Log.Debug("Task processing stopped.");
            return false;
        }
        if (task == null)
        {
            Log.DebugFormat("DoTask: Idle in thread {0} ({1} tasks running)", mtid, _processingTaskLock);
            return false;
        }
        Log.DebugFormat("DoTask: starting task {2}:{0} on thread {1}", task.Id, mtid, task.Class);
        System.Threading.Tasks.Task.Factory.StartNew(ProcessTask, task).ContinueWith(c => DoTask());
        Log.DebugFormat("DoTask: done ({0})", mtid);
        return true;
    };
    return DbExtensions.WithDbWrite(ctx => a(ctx));
}
The Task.Factory.StartNew by default doesn't create a new thread. It borrows a thread from the ThreadPool instead.
The ThreadPool is intended as a small pool of reusable threads, to help amortize the cost of running frequent and lightweight operations like callbacks, continuations, event handlers, etc. Depleting the ThreadPool of available workers by scheduling too much work on it results in a situation called saturation or starvation. And as you've already figured out, it's not a happy situation to be in.
You can prevent the saturation of the ThreadPool by running your long-running work on dedicated threads instead of ThreadPool threads. This can be done by passing the TaskCreationOptions.LongRunning as argument to the Task.Factory.StartNew:
_ = Task.Factory.StartNew(ProcessTask, task, CancellationToken.None,
TaskCreationOptions.LongRunning,
TaskScheduler.Default).ContinueWith(t => DoTask(), CancellationToken.None,
TaskContinuationOptions.ExecuteSynchronously,
TaskScheduler.Default);
The above code schedules the ProcessTask(task) on a new thread, and after the invocation is completed either successfully or unsuccessfully, the DoTask will be invoked on the same thread. Finally the thread will be terminated. The discard _ signifies that the continuation Task (the task returned by the ContinueWith) is fire-and-forget. Which, to put it mildly, is architecturally suspicious. 😃
In case you are wondering why I pass the TaskScheduler.Default explicitly as argument to StartNew and ContinueWith, check out this link.
My reading of this is that a new thread should start running OnTimerGo every 10 seconds, and that thread should in parralel run ProcessTask on tasks as fast as it can get them from the queue.
Well, that is definitely not what's happening. There is a lot of uncertainty about your code, but it's clear that another DoTask only starts AFTER ProcessTask completes. And that is not parallel execution. Your line of code is this:
System.Threading.Tasks.Task.Factory.StartNew(ProcessTask, task).ContinueWith(c => DoTask());
I suggest you start another DoTask right there, like this:
System.Threading.Tasks.Task.Factory.StartNew(ProcessTask, task);
DoTask();
Make sure your code is ready for parallel execution, though.
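What "ready for parallel execution" means here is, at minimum, that any state shared between concurrent DoTask/ProcessTask invocations has to be synchronized. A generic, hypothetical sketch (the names are made up; none of this is the poster's code) of the kind of guard involved:
// Illustrative only: a plain int counter updated from parallel tasks would race;
// Interlocked (or a lock) keeps the updates safe.
private static int _processed;

private static void ProcessOne(object state)
{
    // ... the actual work for one queued item would go here ...
    Interlocked.Increment(ref _processed); // safe under concurrent callers
}

private static void RunManyInParallel()
{
    var tasks = Enumerable.Range(0, 100)
        .Select(i => Task.Factory.StartNew(ProcessOne, i))
        .ToArray();
    Task.WaitAll(tasks);
    Console.WriteLine(_processed + " items processed with no lost updates.");
}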

Is parallel asynchronous execution where a thread sleeps using multiple threads?

This is the code that I wrote to better understand asynchronous methods. I knew that an asynchronous method is not the same as multithreading, but it does not seem so in this particular scenario:
class Program
{
    static void Main(string[] args)
    {
        Thread.CurrentThread.CurrentCulture = new System.Globalization.CultureInfo("en-US");
        // the line above just makes sure that the console output uses . to represent doubles instead of ,
        ExecuteAsync();
        Console.ReadLine();
    }

    private static async Task ParallelAsyncMethod() // this is the method where async parallel execution is taking place
    {
        List<Task<string>> tasks = new List<Task<string>>();
        for (int i = 0; i < 5; i++)
        {
            tasks.Add(Task.Run(() => DownloadWebsite()));
        }
        var strings = await Task.WhenAll(tasks);
        foreach (var str in strings)
        {
            Console.WriteLine(str);
        }
    }

    private static string DownloadWebsite() // Imitating a website download
    {
        Thread.Sleep(1500); // making the thread sleep for 1500 milliseconds before returning
        return "Download finished";
    }

    private static async void ExecuteAsync()
    {
        var watch = Stopwatch.StartNew();
        await ParallelAsyncMethod();
        watch.Stop();
        Console.WriteLine($"It took the machine {watch.ElapsedMilliseconds} milliseconds" +
            $" or {Convert.ToDouble(watch.ElapsedMilliseconds) / 1000} seconds to complete this task");
        Console.ReadLine();
    }
}
//OUTPUT:
/*
Download finished
Download finished
Download finished
Download finished
Download finished
It took the machine 1537 milliseconds or 1.537 seconds to complete this task
*/
As you can see, the DownloadWebsite method waits for 1.5 seconds and then returns "Download finished". The method called ParallelAsyncMethod adds five of these calls into the "tasks" list and then starts the parallel asynchronous execution. As you can see, I also tracked the amount of time that it takes for the ExecuteAsync method to be executed. The result is always somewhere around 1540 milliseconds. Here is my question: if the DownloadWebsite method required a thread to sleep 5 times for 1500 milliseconds, does that mean that the parallel execution of these methods required 5 different threads? If not, then how come it only took the program 1540 milliseconds to execute and not ~7500 ms?
I knew that an asynchronous method is not the same as multi-threading
That is correct: an asynchronous method releases the current thread whilst I/O occurs, and schedules a continuation after its completion.
Async and threads are completely unrelated concepts.
but it does not seem so in this particular scenario
That is because you explicitly run DownloadWebsite on the ThreadPool using Task.Run, which imitates asynchronous code by returning a Task after instructing the provided delegate to run.
Because you are not waiting for each Task to complete before starting the next, multiple threads can be used simultaneously.
Currently each thread is being blocked, as you have used Thread.Sleep in the implementation of DownloadWebsite, meaning you are actually running 5 synchronous methods on the ThreadPool.
In production code your DownloadWebsite method should be written asynchronously, maybe using HttpClient.GetAsync:
private static async Task<string> DownloadWebsiteAsync()
{
    //...
    await httpClient.GetAsync(//...
    //...
}
In that case, GetAsync returns a Task, and releases the current thread whilst waiting for the HTTP response.
You can still run multiple async methods concurrently, but as the thread is released each time, this may well use less than 5 separate threads and may even use a single thread.
Ensure that you don't use Task.Run with an asynchronous method; this simply adds unnecessary overhead:
var tasks = new List<Task<string>>();
for (int i = 0; i < 5; i++)
{
    tasks.Add(DownloadWebsiteAsync()); // No need for Task.Run
}
var strings = await Task.WhenAll(tasks);
As an aside, if you want to imitate an async operation, use Task.Delay instead of Thread.Sleep as the former is non-blocking:
private static async Task<string> DownloadWebsite() // Imitating a website download
{
    await Task.Delay(1500); // Release the thread for ~1500ms before continuing
    return "Download finished";
}

How to force an ActionBlock to complete fast

According to the documentation:
A dataflow block is considered completed when it is not currently processing a message and when it has guaranteed that it will not process any more messages.
This behavior is not ideal in my case. I want to be able to cancel the job at any time, but the processing of each individual action takes a long time. So when I cancel the token, the effect is not immediate. I must wait for the currently processed item to complete. I have no way to cancel the actions directly, because the API I use is not cancelable. Can I do anything to make the block ignore the currently running action, and complete instantly?
Here is an example that demonstrates my problem. The token is canceled after 500 msec, and the duration of each action is 1000 msec:
static async Task Main()
{
    var cts = new CancellationTokenSource(500);
    var block = new ActionBlock<int>(async x =>
    {
        await Task.Delay(1000);
    }, new ExecutionDataflowBlockOptions() { CancellationToken = cts.Token });
    block.Post(1); // I must wait for this one to complete
    block.Post(2); // This one is ignored
    block.Complete();
    var stopwatch = Stopwatch.StartNew();
    try
    {
        await block.Completion;
    }
    catch (OperationCanceledException)
    {
        Console.WriteLine($"Canceled after {stopwatch.ElapsedMilliseconds} msec");
    }
}
Output:
Canceled after 1035 msec
The desired output would be a cancellation after ~500 msec.
Based on this excerpt from your comment...:
What I want to happen in case of a cancellation request is to ignore the currently running workitem. I don't care about it any more, so why I have to wait for it?
...and assuming you are truly OK with leaving the Task running, you can simply wrap the job you wish to call inside another Task which will constantly poll for cancellation or completion, and cancel that Task instead. Take a look at the following "proof-of-concept" code that wraps a "long-running" task inside another Task "tasked" with constantly polling the wrapped task for completion, and a CancellationToken for cancellation (completely "spur-of-the-moment" code, so you will want to re-adapt it a bit, of course):
public class LongRunningTaskSource
{
    public Task LongRunning(int milliseconds)
    {
        return Task.Run(() =>
        {
            Console.WriteLine("Starting long running task");
            Thread.Sleep(milliseconds);
            Console.WriteLine("Finished long running task");
        });
    }

    public Task LongRunningTaskWrapper(int milliseconds, CancellationToken token)
    {
        Task task = LongRunning(milliseconds);
        Task wrapperTask = Task.Run(() =>
        {
            while (true)
            {
                // Check for completion (you could, of course, do different things
                // depending on whether it is faulted or completed).
                if (!(task.Status == TaskStatus.Running))
                    break;
                // Check for cancellation.
                if (token.IsCancellationRequested)
                {
                    Console.WriteLine("Aborting Task.");
                    token.ThrowIfCancellationRequested();
                }
            }
        }, token);
        return wrapperTask;
    }
}
Using the following code:
static void Main()
{
    LongRunningTaskSource longRunning = new LongRunningTaskSource();
    CancellationTokenSource cts = new CancellationTokenSource(1500);
    Task task = longRunning.LongRunningTaskWrapper(3000, cts.Token);
    // Sleep long enough to let things roll on their own.
    Thread.Sleep(5000);
    Console.WriteLine("Ended Main");
}
...produces the following output:
Starting long running task
Aborting Task.
Exception thrown: 'System.OperationCanceledException' in mscorlib.dll
Finished long running task
Ended Main
The wrapped Task obviously completes in its own good time. If you don't have a problem with that, which is often not the case, hopefully, this should fit your needs.
As a supplementary example, running the following code (letting the wrapped Task finish before time-out):
static void Main()
{
    LongRunningTaskSource longRunning = new LongRunningTaskSource();
    CancellationTokenSource cts = new CancellationTokenSource(3000);
    Task task = longRunning.LongRunningTaskWrapper(1500, cts.Token);
    // Sleep long enough to let things roll on their own.
    Thread.Sleep(5000);
    Console.WriteLine("Ended Main");
}
...produces the following output:
Starting long running task
Finished long running task
Ended Main
So the task started and finished before the timeout and nothing had to be cancelled. Of course, nothing is blocked while waiting. As you probably already know, if you know what is being used behind the scenes in the long-running code, it would be good to clean up if necessary.
Hopefully, you can adapt this example to pass something like this to your ActionBlock.
Disclaimer & Notes
I am not familiar with the TPL Dataflow library, so this is just a workaround, of course. Also, if all you have is, for example, a synchronous method call that you do not have any influence on at all, then you will obviously need two tasks. One wrapper task to wrap the synchronous call and another one to wrap the wrapper task to include continuous status polling and cancellation checks.
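A minimal, hypothetical sketch of that "two task" idea (the helper name WrapWithPolling is made up, and the 10 ms poll interval is arbitrary):
// Wrap a synchronous, non-cancelable call in one task, then wrap that task in a
// second task that polls for completion or cancellation.
public static Task WrapWithPolling(Action synchronousWork, CancellationToken token)
{
    Task inner = Task.Run(synchronousWork); // first wrap: the synchronous call
    return Task.Run(() =>                   // second wrap: the polling task
    {
        while (!inner.IsCompleted)
        {
            token.ThrowIfCancellationRequested(); // gives up on the inner task immediately
            Thread.Sleep(10);                     // avoid a hot spin while polling
        }
    }, token);
}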

Load Test using C# Async Await

I am creating a console program which can test reads and writes to a cache by simulating multiple clients, and have written the following code. Please help me understand:
Is this the correct way to achieve the multi-client simulation?
What more can I do to make it a genuine load test?
void Main()
{
    List<Task<long>> taskList = new List<Task<long>>();
    for (int i = 0; i < 500; i++)
    {
        taskList.Add(TestAsync());
    }
    Task.WaitAll(taskList.ToArray());
    long averageTime = taskList.Average(t => t.Result);
}

public static async Task<long> TestAsync()
{
    // Returns the total time taken using Stop Watch in the same module
    return await Task.Factory.StartNew(() => // Call Cache Read / Write);
}
Adjusted your code slightly to see how many threads we have at a particular time.
static volatile int currentExecutionCount = 0;

static void Main(string[] args)
{
    List<Task<long>> taskList = new List<Task<long>>();
    var timer = new Timer(Print, null, TimeSpan.FromSeconds(1), TimeSpan.FromSeconds(1));
    for (int i = 0; i < 1000; i++)
    {
        taskList.Add(DoMagic());
    }
    Task.WaitAll(taskList.ToArray());
    timer.Change(Timeout.Infinite, Timeout.Infinite);
    timer = null;
    // to check that we have all the tasks executed
    Console.WriteLine("Done " + taskList.Sum(t => t.Result));
    Console.ReadLine();
}

static void Print(object state)
{
    Console.WriteLine(currentExecutionCount);
}

static async Task<long> DoMagic()
{
    return await Task.Factory.StartNew(() =>
    {
        Interlocked.Increment(ref currentExecutionCount);
        // place your code here
        Thread.Sleep(TimeSpan.FromMilliseconds(1000));
        Interlocked.Decrement(ref currentExecutionCount);
        return 4;
    }
    // this should give the scheduler a hint to use new threads rather than pooled ones
    , TaskCreationOptions.LongRunning
    );
}
The result is: inside a virtual machine I see from 2 to 10 threads running simultaneously if I don't use the hint. With the hint, up to 100. And on a real machine I can see 1000 threads at once; Process Explorer confirms this. Some more details on how the hint works would be helpful.
If the system under test is very busy, then apparently your clients have to wait a while before their requests are serviced. Your program does not measure this, because your stopwatch only starts running when the service request starts.
If you also want to measure what happens to the average time before a request is finished, you should start your stopwatch when the request is made, not when the request is serviced.
Your program takes only threads from the thread pool. If you start more tasks than there are threads, some tasks will have to wait before TestAsync starts running. This wait time would be measured if you remembered the time Task.Run was called.
Besides the flaw in the time measurements, how many service requests do you expect simultaneously? Are there enough free threads in your thread pool to simulate this? If you expect about 50 service requests at the same time, and the size of your thread pool is only 20 threads, then you'll never run 50 service requests at the same time. Vice versa: if your thread pool is way bigger than your number of expected simultaneous service requests, then you'll measure longer times than is actually the case.
Consider changing the number of threads in your thread pool, and make sure no one else uses any threads of the pool.
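To make the measurement include that queueing time, one minimal sketch (reusing the placeholder cache call from the question) is to start the stopwatch before the work is handed to the pool:
public static async Task<long> TestWithQueueTimeAsync()
{
    var stopwatch = Stopwatch.StartNew(); // started when the request is made
    await Task.Run(() =>
    {
        // Call Cache Read / Write here
    });
    // Includes any time spent waiting for a free thread-pool thread.
    return stopwatch.ElapsedMilliseconds;
}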

Checking if a thread returned to thread pool

How can I check if a thread returned to the thread pool, using VS C# 2015 debugger?
What's problematic in my case is the fact that it cannot be detected by debugging line by line.
async Task foo()
{
    int y = 0;
    await Task.Delay(5);
    // (1) thread 2000 returns to thread pool here...
    while (y < 5) y++;
}

async Task testAsync()
{
    Task task = foo();
    // (2) ... and here thread 2000 is back from the thread pool, to run the code below. I want
    // to confirm that it was in the thread pool in the meantime, using debugger.
    int i = 0;
    while (i < 100)
    {
        Console.WriteLine("Async 1 before: " + i++);
    }
    await task;
}
In the first line of testAsync running on thread 2000, foo is called. Once it encounters await Task.Delay(5), thread 2000 returns to the thread pool (allegedly; that's what I'm trying to confirm), and the method waits for Task.Delay(5) to complete. In the meantime, control returns to the caller and the first loop of testAsync is executed on thread 2000 as well.
So between two consecutive lines of code, the thread returned to the thread pool and came back from there. How can I confirm this with the debugger? Possibly with the Threads debugger window?
To clarify a bit more what I'm asking: foo is running on thread 2000. There are two possible scenarios:
When it hits await Task.Delay(5), thread 2000 returns to the thread pool for a very short time, and control returns to the caller, at line (2), which will execute on thread 2000 taken from the thread pool. If this is true, you can't detect it easily, because thread 2000 was in the thread pool during the time between two consecutive lines of code.
When it hits await Task.Delay(5), thread 2000 doesn't return to the thread pool, but immediately executes code in testAsync starting from line (2).
I'd like to verify which one is really happening.
There is a major mistake in your assumption:
When it hits await Task.Delay(5), thread 2000 returns to the thread pool
Since you don't await foo() yet, when thread 2000 hits Task.Delay(5) it just creates a new Task and returns to testAsync() (to int i = 0;). It moves on to the while block, and only then do you await task. At this point, if task is not completed yet, and assuming the rest of the code is awaited, thread 2000 will return to the thread pool. Otherwise, if task is already completed, it will synchronously continue from foo() (at while (y < 5) y++;).
EDIT:
what if the main method called testAsync?
When synchronous method calls and waits async method, it must block the thread if the async method returns uncompleted Task:
void Main()
{
    var task = foo();
    task.Wait(); // Will block the thread if foo() is not completed.
}
Note that in the above case the thread is not returning to the thread pool - it is completely suspended by the OS.
Maybe you can give an example of how to call testAsync so that thread 2000 returns to the thread pool?
Assuming thread 2k is the main thread, it cannot return to the thread pool. But you can use Task.Run(()=> foo()) to run foo() on the thread pool, and since the calling thread is the main thread, another thread pool thread will pick up that Task. So the following code:
static void Main(string[] args)
{
    Console.WriteLine("main started on thread {0}", Thread.CurrentThread.ManagedThreadId);
    var testAsyncTask = Task.Run(() => testAsync());
    testAsyncTask.Wait();
}

static async Task testAsync()
{
    Console.WriteLine("testAsync started on thread {0}", Thread.CurrentThread.ManagedThreadId);
    await Task.Delay(1000);
    Console.WriteLine("testAsync continued on thread {0}", Thread.CurrentThread.ManagedThreadId);
}
Produced (on my PC) the following output:
main started on thread 1
testAsync started on thread 3
testAsync continued on thread 4
Press any key to continue . . .
Threads 3 and 4 came from and returned to the thread pool.
You can print out the Thread.CurrentThread.ManagedThreadId to the console. Note that the thread-pool is free to re-use that same thread to run continuations on it, so there's no guarantee that it'll be different:
void Main()
{
    TestAsync().Wait();
}

public async Task FooAsync()
{
    int y = 0;
    await Task.Delay(5);
    Console.WriteLine($"After awaiting in FooAsync: {Thread.CurrentThread.ManagedThreadId}");
    while (y < 5) y++;
}

public async Task TestAsync()
{
    Console.WriteLine($"Before awaiting in TestAsync: {Thread.CurrentThread.ManagedThreadId}");
    Task task = FooAsync();
    int i = 0;
    while (i < 100)
    {
        var x = i++;
    }
    await task;
    Console.WriteLine($"After awaiting in TestAsync: {Thread.CurrentThread.ManagedThreadId}");
}
Another thing you can check is ThreadPool.GetAvailableThreads to determine if another worker has been handed out for use:
async Task FooAsync()
{
    int y = 0;
    await Task.Delay(5);
    Console.WriteLine("Thread-Pool threads after first await:");
    int availableWorkers;
    int availableIo;
    ThreadPool.GetAvailableThreads(out availableWorkers, out availableIo);
    Console.WriteLine($"Available Workers: {availableWorkers}, Available IO: {availableIo}");
    while (y < 1000000000) y++;
}

async Task TestAsync()
{
    int availableWorkers;
    int availableIo;
    ThreadPool.GetAvailableThreads(out availableWorkers, out availableIo);
    Console.WriteLine("Thread-Pool threads before first await:");
    Console.WriteLine($"Available Workers: {availableWorkers}, Available IO: {availableIo}");
    Console.WriteLine("-------------------------------------------------------------");
    Task task = FooAsync();
    int i = 0;
    while (i < 100)
    {
        var x = i++;
    }
    await task;
}
On my machine, this yields:
Thread-Pool threads before first await:
Available Workers: 1023, Available IO: 1000
----------------------------------------------
Thread-Pool threads after first await:
Available Workers: 1022, Available IO: 1000
I'd like to verify which one is really happening.
There is no way to "verify" that with the debugger, because the debugger is made to simulate the logical (synchronous) flow - see Walkthrough: Using the Debugger with Async Methods.
In order to understand what is happening (FYI, it's your case (2)), you need to learn how await works, starting from the "What Happens in an Async Method" section of Asynchronous Programming with Async and Await, Control Flow in Async Programs, and many other sources.
Look at this snippet:
static void Main(string[] args)
{
    Task.Run(() =>
    {
        // Initial thread pool thread
        var t = testAsync();
        t.Wait();
    });
    Console.ReadLine();
}
If we make the lambda async and use await t; instead of t.Wait();, this is the point where the initial thread will be returned to the thread pool. As I mentioned above, you cannot verify that with the debugger. But look at the above code and think logically: we are blocking the initial thread, so if it weren't free, your testAsync and foo methods would not be able to resume. But they do, and this can easily be verified by putting a breakpoint after the await lines.
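For completeness, a sketch of that async variant (the same snippet with the lambda made async, as described above) would be:
static void Main(string[] args)
{
    Task.Run(async () =>
    {
        // Initial thread pool thread
        var t = testAsync();
        await t; // the initial pool thread is released here instead of being blocked
    });
    Console.ReadLine();
}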
