I've made a clean .NET Core 3.0 web application project and I am trying to understand how the ThreadPool works in C#.
namespace TestASPSelf.Controllers
{
public class HomeController : Controller
{
private readonly ILogger<HomeController> _logger;
public static int countThread = 0;
public HomeController(ILogger<HomeController> logger)
{
_logger = logger;
}
public IActionResult Index()
{
int workerThreads;
int portThreads;
ThreadPool.GetMaxThreads(out workerThreads, out portThreads);
Console.WriteLine("\nMaximum worker threads: \t{0}" +
"\nMaximum completion port threads: {1}",
workerThreads, portThreads);
ThreadPool.GetAvailableThreads(out workerThreads,
out portThreads);
Console.WriteLine("\nAvailable worker threads: \t{0}" +
"\nAvailable completion port threads: {1}\n",
workerThreads, portThreads);
Console.WriteLine("countThread = " + countThread);
return View();
}
class Z
{
public static void WaitTest(object o)
{
countThread++;
while (true)
{
Thread.Sleep(1000);
}
}
}
public IActionResult Privacy()
{
for (int i = 0; i < 100; i++)
{
Console.WriteLine("starting thread "+i);
ThreadPool.QueueUserWorkItem(new WaitCallback(Z.WaitTest));
}
return View();
}
}
}
When http://localhost:5000/Home/Privacy is opened, it hangs for some time (about 40-80 seconds), but I can see that the logic of the for loop in it completes almost instantly.
When http://localhost:5000/ is opened after that, it hangs for 40-80 seconds too, and the console shows countThread = 100.
CPU usage of the app is about 5-10% while the threads are running.
I am trying to understand:
1) Why does the ASP controller hang for 40-80 seconds per page when 100 threads are running at 5-10 percent CPU usage? The CPU has plenty of headroom and RAM is free too, so why do the controller methods of the pages hang?
2) How can I create a thread pool in C# with a limited number of running threads? If I understand the method public static bool SetMinThreads(int workerThreads, int completionPortThreads); correctly, it affects all threads of the app globally. How can I create a thread pool object with a limited number of active threads, like ExecutorService in Java? For example, Java thread pool code could look like this:
ExecutorService executor = Executors.newFixedThreadPool(5);
for (int i = 0; i < 10; i++) {
Runnable worker = new WorkerThread("" + i);
executor.execute(worker);
}
executor.shutdown();
while (!executor.isTerminated()) {
}
3) How can I prevent all the ASP controller methods from hanging, and how can I create "truly real" threads, like in Java?
ThreadPool.QueueUserWorkItem(new WaitCallback(Z.WaitTest));
With this you are doing something very wrong. You cause threads in the thread pool to block, and thus the pool is unable to finish processing your request or to process new requests.
At some point a thread from the pool manages to return and process your next request, but then it hangs again due to the overloaded pool.
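If the work genuinely must block for a long time, keep it off the request-serving pool: run it on a dedicated thread, or on a task marked LongRunning so the default scheduler gives it its own thread. A minimal sketch, reusing Z.WaitTest from the question purely for illustration:
using System.Threading;
using System.Threading.Tasks;

// Option 1: a dedicated background thread, outside the pool.
var worker = new Thread(() => Z.WaitTest(null)) { IsBackground = true };
worker.Start();

// Option 2: a LongRunning task, which the default scheduler runs on its own thread
// instead of borrowing a pool worker.
Task.Factory.StartNew(() => Z.WaitTest(null), TaskCreationOptions.LongRunning);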
As for your other questions, please explain what you want to achieve. Your question seems to be trying to solve a problem that is not well understood.
Update: After Arthur's comment.
If you will be downloading files, you should use Tasks and async-await. IO operations do not consume threads (more here).
Create N tasks, each downloading a file, and then await Task.WhenAll. Pseudo code:
List<Task> tasks = new List<Task>();
for (int i = 0; i < filesToDownloadCount; i++)
{
var t = Task.Run(() => { /* ... code to download your file here ... */ });
tasks.Add (t);
}
await Task.WhenAll(tasks);
This approach will give you the best throughput and your bottleneck will be your bandwidth, not the CPU.
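As a concrete illustration, here is a minimal sketch of that pattern using HttpClient; the urls list and the output file names are made up for the example:
using System.Collections.Generic;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class Downloader
{
    static readonly HttpClient client = new HttpClient();

    static async Task DownloadAsync(string url, string path)
    {
        // While the request is in flight, no thread pool thread is blocked here.
        byte[] data = await client.GetByteArrayAsync(url);
        await File.WriteAllBytesAsync(path, data);
    }

    public static async Task DownloadAllAsync(IReadOnlyList<string> urls)
    {
        var tasks = new List<Task>();
        for (int i = 0; i < urls.Count; i++)
            tasks.Add(DownloadAsync(urls[i], $"file{i}.bin"));
        await Task.WhenAll(tasks); // completes when every download has finished
    }
}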
The ThreadPool class has several static methods, including QueueUserWorkItem, which hands the work item to a thread pool worker thread when one is available. If no worker thread is currently available, the work item stays queued until a thread frees up.
using System;
using System.Threading;
class ThreadPoolSample
{
// Background task
static void BackgroundTask(Object stateInfo)
{
Console.WriteLine("Hello! I'm a worker from ThreadPool");
Thread.Sleep(1000);
}
static void BackgroundTaskWithObject(Object stateInfo)
{
Person data = (Person)stateInfo;
Console.WriteLine($"Hi {data.Name} from ThreadPool.");
Thread.Sleep(1000);
}
static void Main(string[] args)
{
// Use ThreadPool for a worker thread
ThreadPool.QueueUserWorkItem(BackgroundTask);
Console.WriteLine("Main thread does some work, then sleeps.");
Thread.Sleep(500);
// Create an object and pass it to ThreadPool worker thread
Person p = new Person("Mahesh Chand", 40, "Male");
ThreadPool.QueueUserWorkItem(BackgroundTaskWithObject, p);
int workers, ports;
// Get maximum number of threads
ThreadPool.GetMaxThreads(out workers, out ports);
Console.WriteLine($"Maximum worker threads: {workers} ");
Console.WriteLine($"Maximum completion port threads: {ports}");
// Get available threads
ThreadPool.GetAvailableThreads(out workers, out ports);
Console.WriteLine($"Availalbe worker threads: {workers} ");
Console.WriteLine($"Available completion port threads: {ports}");
// Set minimum threads
int minWorker, minIOC;
ThreadPool.GetMinThreads(out minWorker, out minIOC);
ThreadPool.SetMinThreads(4, minIOC);
// Get the number of processors available on the machine
int processorCount = Environment.ProcessorCount;
Console.WriteLine($"No. of processors available on the system: {processorCount}");
// Get minimum number of threads
ThreadPool.GetMinThreads(out workers, out ports);
Console.WriteLine($"Minimum worker threads: {workers} ");
Console.WriteLine($"Minimum completion port threads: {ports}");
Console.ReadKey();
}
// Create a Person class
public class Person
{
public string Name { get; set; }
public int Age { get; set; }
public string Sex { get; set; }
public Person(string name, int age, string sex)
{
this.Name = name;
this.Age = age;
this.Sex = sex;
}
}
}
I've coded a method to handle multiple threads for Selenium web browsing. The issue is that right now, for example, if I input 4 tasks and 2 threads, the program says it has finished when it has only finished 2 tasks.
Edit: Basically I want the program to wait for all the tasks to complete. I also want that if one thread finishes while the other is still running and there are tasks left to do, it starts another task immediately instead of waiting for the 2nd thread to finish.
Thanks, and sorry for the code; I made it quickly to show an example of how it works.
class Program
{
static void Main(string[] args)
{
Threads(4, 4);
Console.WriteLine("Program has finished");
Console.ReadLine();
}
static Random ran = new Random();
static int loop;
public static void Threads(int number, int threads)
{
for (int i = 0; i < number; i++)
{
if (threads == 1)
{
generateDriver();
}
else if (threads > 1)
{
start:
if (loop < threads)
{
loop++;
Thread thread = new Thread(() => generateDriver());
thread.Start();
}
else
{
Task.Delay(2000).Wait();
goto start;
}
}
}
}
public static void test(IWebDriver driver)
{
driver.Navigate().GoToUrl("https://google.com/");
int timer = ran.Next(100, 2000);
Task.Delay(timer).Wait();
Console.WriteLine("[" + DateTime.Now.ToString("hh:mm:ss") + "] - " + "Task done.");
loop--;
driver.Close();
}
public static void generateDriver()
{
ChromeOptions options = new ChromeOptions();
options.AddArguments("--disable-dev-shm-usage");
options.AddArguments("--disable-extensions");
options.AddArguments("--disable-gpu");
options.AddArguments("window-size=1024,768");
options.AddArguments("--test-type");
ChromeDriverService service = ChromeDriverService.CreateDefaultService(Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location));
service.HideCommandPromptWindow = true;
service.SuppressInitialDiagnosticInformation = true;
IWebDriver driver = new ChromeDriver(service, options);
test(driver);
}
Manually keeping track of running threads, waiting for them to finish and reusing ones that are already finished is not trivial.
However, the .NET runtime provides ready-made solutions that you should prefer over handling it yourself.
The simplest way to achieve your desired result is to use a Parallel.For loop and set the MaxDegreeOfParallelism, e.g.:
public static void Threads(int number, int threads)
{
Parallel.For(0, number,
new ParallelOptions { MaxDegreeOfParallelism = threads },
_ => generateDriver());
}
If you really want to do it manually you will need to use arrays of Thread (or Task) and keep iterating over them, checking whether they have finished, and if they did, replace them with a new thread. This requires quite a bit more code than the Parallel.For solution (and is unlikely to perform better).
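If you prefer Tasks over Parallel.For, another common pattern is to throttle with a SemaphoreSlim. A rough sketch under that assumption, reusing generateDriver from the question:
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

public static void Threads(int number, int threads)
{
    using (var gate = new SemaphoreSlim(threads)) // at most `threads` tasks run at once
    {
        var tasks = new List<Task>();
        for (int i = 0; i < number; i++)
        {
            gate.Wait(); // block until a slot is free
            tasks.Add(Task.Run(() =>
            {
                try { generateDriver(); }
                finally { gate.Release(); } // free the slot when this task ends
            }));
        }
        Task.WaitAll(tasks.ToArray()); // wait for every task to complete
    }
}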
Inspired by The Little Book of Semaphores, I decided to implement the Producer-Consumer problem using Semaphores.
I specifically want to be able to stop all Worker threads at will.
I've tested my methodology extensively and can't find anything faulty.
The following code is a prototype for testing and can be run as a console application:
using System;
using System.Collections.Concurrent;
using System.Threading;
using NUnit.Framework;
public class ProducerConsumer
{
private static readonly int _numThreads = 5;
private static readonly int _numItemsEnqueued = 10;
private static readonly Semaphore _workItems = new Semaphore(0, int.MaxValue);
private static readonly ManualResetEvent _stop = new ManualResetEvent(false);
private static ConcurrentQueue<int> _queue;
public static void Main()
{
_queue = new ConcurrentQueue<int>();
// Create and start threads.
for (int i = 1; i <= _numThreads; i++)
{
Thread t = new Thread(new ParameterizedThreadStart(Worker));
// Start the thread, passing the number.
t.Start(i);
}
// Wait for half a second, to allow all the
// threads to start and to block on the semaphore.
Thread.Sleep(500);
Console.WriteLine(string.Format("Main thread adds {0} items to the queue and calls Release() {0} times.", _numItemsEnqueued));
for (int i = 1; i <= _numItemsEnqueued; i++)
{
Console.WriteLine("Waking up a worker thread.");
_queue.Enqueue(i);
_workItems.Release(); //wake up 1 worker
Thread.Sleep(2000); //sleep 2 sec so it's clear the threads get unblocked 1 by 1
}
// sleep for 5 seconds to allow threads to exit
Thread.Sleep(5000);
Assert.True(_queue.Count == 0);
Console.WriteLine("Main thread stops all threads.");
_stop.Set();
// wait a while to exit
Thread.Sleep(5000);
Console.WriteLine("Main thread exits.");
Console.WriteLine(string.Format("Last value of Semaphore was {0}.", _workItems.Release()));
Assert.True(_queue.Count == 0);
Console.WriteLine("Press Enter to exit.");
Console.ReadLine();
}
private static void Worker(object num)
{
// Each worker thread begins by requesting the semaphore.
Console.WriteLine("Thread {0} begins and waits for the semaphore.", num);
WaitHandle[] wait = { _workItems, _stop };
int signal;
while (0 == (signal = WaitHandle.WaitAny(wait)))
{
Console.WriteLine("Thread {0} becomes unblocked by Release() and has work to do.", num);
int res;
if (_queue.TryDequeue(out res))
{
Console.WriteLine("Thread {0} dequeues {1}.", num, res);
}
else
{
throw new Exception("this should not happen.");
}
}
if (signal == 1)
Console.WriteLine("Thread {0} was stopped.", num);
Console.WriteLine("Thread {0} exits.", num);
}
}
Now for my question, I'm using WaitHandle.WaitAny(semaphore) under the assumption that when I call Release() on the semaphore, only 1 Worker will be woken up. However, I can't find reassurance in the documentation that this is actually true. Can anyone confirm this is true?
It is indeed interesting that the documentation doesn't explicitly state that, in the case of WaitOne, only one thread will receive the signal. Once you are familiar with multithreading theory this becomes somewhat self-evident.
Yes, a WaitOne call on a Semaphore (and a WaitAny call on a list of WaitHandles that includes a Semaphore) receives the signal in a single thread only. If you want a reference from MSDN, here it is: Semaphore is a child class of WaitHandle, which is:
Encapsulates operating system–specific objects that wait for exclusive access to shared resources.
So yes, unless explicitly stated otherwise, these methods provide exclusive access.
For example, WaitOne on a ManualResetEvent will unblock all waiting threads once the event is set, but the documentation (of Set) is explicit about it:
Notifies one or more waiting threads that an event has occurred.
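To make the difference concrete, here is a small illustrative console sketch (not from the original post): releasing the semaphore once wakes exactly one waiter, while setting the manual-reset event wakes all of them.
using System;
using System.Threading;

class WakeOneVsWakeAll
{
    static readonly Semaphore sem = new Semaphore(0, 10);
    static readonly ManualResetEvent evt = new ManualResetEvent(false);

    static void Main()
    {
        for (int i = 0; i < 3; i++)
        {
            int id = i;
            var t1 = new Thread(() => { sem.WaitOne(); Console.WriteLine("semaphore woke " + id); }) { IsBackground = true };
            t1.Start();
            var t2 = new Thread(() => { evt.WaitOne(); Console.WriteLine("event woke " + id); }) { IsBackground = true };
            t2.Start();
        }
        Thread.Sleep(500); // let all six threads block

        sem.Release();     // exactly one "semaphore woke" line is printed
        evt.Set();         // all three "event woke" lines are printed

        Console.ReadLine();
    }
}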
Even though I have been using the TPL for a long time, it still feels new to me. I want to understand how the TPL relates to the thread pool, so I created a POC in .NET Framework 4.0, which is shown below.
public class CustomData
{
public long CreationTime;
public int Name;
public int ThreadNum;
}
public class TPLSample
{
public int MaxThread = 0;
public void Start()
{
Task[] taskArray = new Task[10000];
for (int i = 0; i < taskArray.Length; i++)
{
taskArray[i] = Task.Factory.StartNew((Object obj) =>
{
var data = new CustomData() { Name = (int)obj, CreationTime = DateTime.Now.Ticks }; // use the state argument, not the captured loop variable
Thread.SpinWait(10000);
data.ThreadNum = Thread.CurrentThread.ManagedThreadId;
if (Thread.CurrentThread.ManagedThreadId > MaxThread)
{
MaxThread = Thread.CurrentThread.ManagedThreadId;
}
Console.WriteLine("Task #{0} created at {1} on thread #{2}.",
data.Name, data.CreationTime, data.ThreadNum);
},
i);
}
Task.WaitAll(taskArray);
Console.WriteLine("Max no of threads {0}", MaxThread);
}
}
I found that only 14 threads were created to do this work!
But why 14? What is the criterion? Can I increase or decrease this number?
How can I change this number? Is it really possible, or is it totally abstracted away from the developer?
From MSDN:
The number of operations that can be queued to the thread pool is limited only by available memory; however, the thread pool limits the number of threads that can be active in the process simultaneously. Beginning with the .NET Framework 4, the default size of the thread pool for a process depends on several factors, such as the size of the virtual address space. A process can call the GetMaxThreads method to determine the number of threads.
Another MSDN:
The TPL may employ various optimizations, especially with large numbers of delegates.
Another SO question about this. Hopefully this will quench your thirst.
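If you want to experiment with how the pool ramps up, you can nudge it with ThreadPool.SetMinThreads and ThreadPool.SetMaxThreads; the exact number of threads you observe is still decided by the runtime. A small sketch:
using System;
using System.Threading;

class PoolTuning
{
    static void Main()
    {
        int minWorker, minIo, maxWorker, maxIo;
        ThreadPool.GetMinThreads(out minWorker, out minIo);
        ThreadPool.GetMaxThreads(out maxWorker, out maxIo);
        Console.WriteLine("defaults: min={0}, max={1}", minWorker, maxWorker);

        // Keep more worker threads ready before the pool starts throttling thread creation.
        ThreadPool.SetMinThreads(32, minIo);

        // Cap how many worker threads the pool may ever use at once.
        ThreadPool.SetMaxThreads(64, maxIo);
    }
}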
I have an app that takes on an unknown number of tasks. The tasks are blocking (they wait on the network), so I'll need multiple threads to keep busy.
Is there an easy way for me to have a big list of tasks and a set of worker threads which pull tasks when they are idle? At the moment I just start a new thread for each task, which is fine, but I'd like some control so that if there are 100 tasks I don't have 100 threads.
Assuming that the network I/O classes that you are dealing with expose Begin/End style async methods, then what you want to do is use the TPL TaskFactory.FromAsync method. As laid out in TPL TaskFactory.FromAsync vs Tasks with blocking methods, the FromAsync method will use async I/O under the covers, rather than keeping a thread busy just waiting for the I/O to complete (which is actually not what you want).
The way that async I/O works is that you have a pool of threads that can handle the result of an I/O when the result is ready, so that if you have 100 outstanding I/Os you don't have 100 threads blocked waiting for them. When the whole pool is busy handling I/O results, subsequent results get queued up automatically until a thread frees up to handle them. Keeping a huge pool of threads waiting like that is a scalability disaster; threads are hugely expensive objects to keep around idling.
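As a small illustration of wrapping a Begin/End pair, here is a sketch using FileStream.BeginRead/EndRead (any class exposing the APM pattern works the same way):
using System.IO;
using System.Threading.Tasks;

class FromAsyncSample
{
    // Wraps the Begin/End pair into a Task<int>, so no thread sits blocked
    // while the read is in flight.
    static Task<int> ReadAsync(FileStream stream, byte[] buffer)
    {
        return Task<int>.Factory.FromAsync(
            stream.BeginRead, stream.EndRead,
            buffer, 0, buffer.Length, null);
    }
}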
Here is an MSDN sample that manages many work items through the ThreadPool:
using System;
using System.Threading;
public class Fibonacci
{
public Fibonacci(int n, ManualResetEvent doneEvent)
{
_n = n;
_doneEvent = doneEvent;
}
// Wrapper method for use with thread pool.
public void ThreadPoolCallback(Object threadContext)
{
int threadIndex = (int)threadContext;
Console.WriteLine("thread {0} started...", threadIndex);
_fibOfN = Calculate(_n);
Console.WriteLine("thread {0} result calculated...", threadIndex);
_doneEvent.Set();
}
// Recursive method that calculates the Nth Fibonacci number.
public int Calculate(int n)
{
if (n <= 1)
{
return n;
}
return Calculate(n - 1) + Calculate(n - 2);
}
public int N { get { return _n; } }
private int _n;
public int FibOfN { get { return _fibOfN; } }
private int _fibOfN;
private ManualResetEvent _doneEvent;
}
public class ThreadPoolExample
{
static void Main()
{
const int FibonacciCalculations = 10;
// One event is used for each Fibonacci object
ManualResetEvent[] doneEvents = new ManualResetEvent[FibonacciCalculations];
Fibonacci[] fibArray = new Fibonacci[FibonacciCalculations];
Random r = new Random();
// Configure and launch threads using ThreadPool:
Console.WriteLine("launching {0} tasks...", FibonacciCalculations);
for (int i = 0; i < FibonacciCalculations; i++)
{
doneEvents[i] = new ManualResetEvent(false);
Fibonacci f = new Fibonacci(r.Next(20,40), doneEvents[i]);
fibArray[i] = f;
ThreadPool.QueueUserWorkItem(f.ThreadPoolCallback, i);
}
// Wait for all threads in the pool to finish the calculations...
WaitHandle.WaitAll(doneEvents);
Console.WriteLine("All calculations are complete.");
// Display the results...
for (int i= 0; i<FibonacciCalculations; i++)
{
Fibonacci f = fibArray[i];
Console.WriteLine("Fibonacci({0}) = {1}", f.N, f.FibOfN);
}
}
}
Here is some code that perpetually generates GUIDs. I've written it to learn about threading. In it you'll notice that I've got a lock around the place where I generate GUIDs and enqueue them, even though the ConcurrentQueue is thread safe. That's because my actual code will need to use NHibernate, so I must make sure that only one thread gets to fill the queue.
While I monitor this code in Task Manager, I notice the process drops the number of threads from 18 (on my machine) to 14 but no less. Is this because my code isn't good?
Also can someone refactor this if they see fit? I love shorter code.
class Program
{
ConcurrentNewsBreaker Breaker;
static void Main(string[] args)
{
new Program().Execute();
Console.Read();
}
public void Execute()
{
Breaker = new ConcurrentNewsBreaker();
QueueSome();
}
public void QueueSome()
{
ThreadPool.QueueUserWorkItem(DoExecute);
}
public void DoExecute(Object State)
{
String Id = Breaker.Pop();
Console.WriteLine(String.Format("- {0} {1}", Thread.CurrentThread.ManagedThreadId, Breaker.Pop()));
if (Breaker.Any())
QueueSome();
else
Console.WriteLine(String.Format("- {0} XXXX ", Thread.CurrentThread.ManagedThreadId));
}
}
public class ConcurrentNewsBreaker
{
static readonly Object LockObject = new Object();
ConcurrentQueue<String> Store = new ConcurrentQueue<String>();
public String Pop()
{
String Result = null;
if (Any())
Store.TryDequeue(out Result);
return Result;
}
public Boolean Any()
{
if (!Store.Any())
{
Task FillTask = new Task(FillupTheQueue, Store);
FillTask.Start();
FillTask.Wait();
}
return Store.Any();
}
private void FillupTheQueue(Object StoreObject)
{
ConcurrentQueue<String> Store = StoreObject as ConcurrentQueue<String>;
lock(LockObject)
{
for(Int32 i = 0; i < 100; i++)
Store.Enqueue(Guid.NewGuid().ToString());
}
}
}
You are using .NET's ThreadPool so .NET/Windows manages the number of threads based on the amount of work waiting to be processed.
While I monitor this code in Task Manager, I notice the process drops the number of threads from 18 (on my machine) to 14 but no less. Is this because my code isn't good?
This does not indicate a problem. 14 is still high, unless you've got a 16-core cpu.
The threadpool will try to adjust and do the work with as few threads as possible.
You should start to worry when the number of threads goes up significantly.
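If you want to watch this behaviour yourself, a tiny illustrative console program (not from the original post) is to queue a batch of blocking work items and print the process thread count; the pool starts with roughly one worker per core and only injects additional threads gradually:
using System;
using System.Diagnostics;
using System.Threading;

class PoolGrowthDemo
{
    static void Main()
    {
        // Queue 50 work items that each block for 10 seconds.
        for (int i = 0; i < 50; i++)
            ThreadPool.QueueUserWorkItem(_ => Thread.Sleep(10000));

        // Print the OS thread count of this process once per second.
        for (int s = 0; s < 20; s++)
        {
            Console.WriteLine("OS threads: " + Process.GetCurrentProcess().Threads.Count);
            Thread.Sleep(1000);
        }
    }
}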