Delegate.BeginInvoke Delay - c#

Sometimes when Delegate.BeginInvoke is called, it takes more than one second before the delegate method starts executing.
What could be the reasons for the delay? I get this issue 1 or 2 times a day in an application which runs continuously.
Please help me.
Thanks!

The thread pool manager makes sure that only as many threads are allowed to execute as you have CPU cores. As soon as one completes, another one that's waiting in the queue is allowed to execute.
Twice a second, it re-evaluates what's going on with the running threads. If they don't complete, it assumes they are blocked and allows another waiting thread to run. On a typical two-core CPU, you'll get two threads running right away, the 3rd thread starts after one second, the 4th thread after 1.5 seconds, and so on.
Well, there's your second. The Q&D fix is to use ThreadPool.SetMinThreads(), but that's the sledgehammer solution. The real issue is that your program is using thread pool threads for long-running tasks. Either because they execute a lot of code or because they block on some kind of I/O request. The latter being the more common case.
The way to solve it is not to use a thread pool thread for such blocking work but to use the Thread class instead. Don't do this if the threads are actually burning CPU cycles, though; you'll slow everything down. It's easy to tell: you'll see 100% CPU load in Taskmgr.exe.
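For example, here is a minimal sketch of that approach; the ProcessRequest name and its body are just placeholders for whatever your delegate currently does:

    using System.Threading;

    class Worker
    {
        // Placeholder for the long-running, I/O-bound work that is currently
        // being run via Delegate.BeginInvoke.
        static void ProcessRequest(object state)
        {
            // ... blocking I/O here ...
        }

        static void StartDedicated(object input)
        {
            // A dedicated thread is not subject to the thread pool's throttling,
            // so it starts right away instead of waiting in the queue.
            var t = new Thread(ProcessRequest);
            t.IsBackground = true;   // don't keep the process alive because of it
            t.Start(input);
        }
    }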

Since you're using Delegate.BeginInvoke, you're indirectly using the ThreadPool. The ThreadPool recycles completed threads and allows them to be reused without going through the expense of constructing new threads and tearing completed threads down.
So when you use Delegate.BeginInvoke, you're adding the method to be invoked to a queue; as soon as the ThreadPool thinks it has an available thread for your task, it will execute. However, if the ThreadPool is out of available threads, you'll be left waiting.
System.Threading.ThreadPool has several properties and methods to show how many threads are available, maximums, etc. I would try monitoring those counts to see if it looks like the ThreadPool is being spread thin.
If that's the case then the best resolution is to ensure that the ThreadPool is only being used for short-lived (small) tasks. If it's being used for long-running tasks then those tasks should be modified to use their own dedicated thread rather than occupying the ThreadPool.
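For example, something like this (a rough sketch; the class and method names are made up) could be called from a timer to log how stretched the pool is:

    using System;
    using System.Threading;

    class PoolMonitor
    {
        // Logs how many worker and completion-port threads are currently in use.
        public static void LogPoolUsage()
        {
            int maxWorkers, maxIo, freeWorkers, freeIo;
            ThreadPool.GetMaxThreads(out maxWorkers, out maxIo);
            ThreadPool.GetAvailableThreads(out freeWorkers, out freeIo);

            Console.WriteLine(
                "Worker threads in use: {0}/{1}, completion-port threads in use: {2}/{3}",
                maxWorkers - freeWorkers, maxWorkers,
                maxIo - freeIo, maxIo);
        }
    }

If the "in use" numbers regularly sit near the maximums, the pool is being starved.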

Can you set the priority of the BeginInvoke?
http://msdn.microsoft.com/en-us/library/system.windows.threading.dispatcherpriority.aspx
Do you have other BeginInvoke calls waiting?
"If multiple BeginInvoke calls are made at the same DispatcherPriority, they will be executed in the order the calls were made."
http://msdn.microsoft.com/en-us/library/ms591206.aspx
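Note that the priority only applies to Dispatcher.BeginInvoke (WPF's dispatcher queue); Delegate.BeginInvoke itself has no priority overload. A rough sketch of the Dispatcher version, assuming you have a Dispatcher to hand:

    using System;
    using System.Windows.Threading;

    class DispatcherExample
    {
        public static void QueueLowPriorityWork(Dispatcher dispatcher)
        {
            // Runs on the dispatcher's thread after higher-priority items
            // (input, layout, rendering) have been handled.
            dispatcher.BeginInvoke(
                DispatcherPriority.Background,
                new Action(() => Console.WriteLine("low-priority UI work")));
        }
    }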

Related

How can I make a thread wait until another thread is waiting (C#)

I have a consumer thread that creates some worker threads. These threads must switch between active and waiting states. When all worker threads are in the waiting state, it means that the current job is done. How can I make the consumer thread wait for all the worker threads to be in the waiting state? I want behavior very similar to calling Thread.Join() on all worker threads; however, I want the threads to keep running for the next job. I cannot create new threads because the jobs come in a tight loop and creating new threads is costly.
As far as I am aware there is no built-in mechanism to do what you wish (Thread.Join is the obvious candidate, but since your threads must keep running, it is not an option).
From the info you provided it sounds like you're really building a state machine, just spread across multiple threads.
I would create a Singleton and have it act as a state machine. Threads could signal their status to the Singleton.
It sounds like you have an indeterminate number of threads, so you would need to put the status of each one in a collection. I would look at the Thread-Safe Collections documentation to find the right fit for how you wish to store your state information.
Hope this helps.
Apologies for the brief answer (may expand later), but you probably want the WaitHandle.WaitAll method, combined with ManualResetEvents. You would pass a ManualResetEvent object into each worker thread when it is created, signal it when the worker becomes idle, and pass the entire set of handles into the WaitHandle.WaitAll method to wake the observing thread when they're all complete. You can also use the timeout feature of this method if you want to periodically run some kind of task while waiting, or perform some kind of operation if the task is taking too long.
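A rough sketch of that idea (the class and method names here are just for illustration):

    using System;
    using System.Threading;

    class IdleCoordinator
    {
        // One event per worker; the worker sets its event when it goes idle
        // and resets it when it picks up new work.
        private readonly ManualResetEvent[] idleEvents;

        public IdleCoordinator(int workerCount)
        {
            idleEvents = new ManualResetEvent[workerCount];
            for (int i = 0; i < workerCount; i++)
                idleEvents[i] = new ManualResetEvent(false);
        }

        // Hand this to worker i when you create it.
        public ManualResetEvent EventFor(int workerIndex)
        {
            return idleEvents[workerIndex];
        }

        // Blocks the consumer until every worker has signalled "idle",
        // or until the timeout expires (the return value tells you which).
        public bool WaitForAllIdle(TimeSpan timeout)
        {
            return WaitHandle.WaitAll(idleEvents, timeout);
        }
    }

Bear in mind that WaitHandle.WaitAll is limited to 64 handles and cannot be called with multiple handles from an STA thread.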
Note that if your worker threads are intended to terminate when the operation is complete (wasn't totally clear if this is the case), it might be more appropriate to spawn them as tasks and use Task.WaitAll instead.
Edit: On a quick re-read, it sounds like you do want to be using tasks rather than trying to re-use full worker threads. Tasks use threads which have been allocated from the thread pool, eliminating that thread creation overhead you were worried about, because the threads will (generally) be ready and waiting for work. You can simply spawn each task and wait for them all to be finished.
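For instance, a small sketch using tasks (requires .NET 4 or later; the ProcessItem method is a stand-in for your per-job work):

    using System.Threading.Tasks;

    class JobRunner
    {
        // Stand-in for the real per-item work.
        static void ProcessItem(int item) { /* ... */ }

        static void RunJob(int[] items)
        {
            var tasks = new Task[items.Length];
            for (int i = 0; i < items.Length; i++)
            {
                int item = items[i];   // capture a copy for the closure
                tasks[i] = Task.Factory.StartNew(() => ProcessItem(item));
            }
            Task.WaitAll(tasks);       // blocks until every task has finished
        }
    }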

Force ThreadPool to start the thread sooner

I use ThreadPool.QueueUserWorkItem for creating a thread on Windows CE (I use .NET Framework 3.5). Sometimes the thread waits for something and starts too late. In the QueueUserWorkItem documentation it says that the delegate will be executed "when a thread pool thread becomes available".
Is there a way to force the ThreadPool to execute my delegate immediately? Would Thread.Start() be a solution for this?
Thank you!
First off, QueueUserWorkItem doesn't create a thread, it merely places a "task" in the ThreadPool's queue for the workers to pick up and execute. In case of saturation (more tasks than available threads), there is no guarantee of when a worker will become available to execute the task. If you want immediate execution use an instance of Thread instead. The only way to improve your odds with the ThreadPool is to increase the number of workers.
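For example (a sketch against the desktop framework; the DoScanWork name is made up, and the Compact Framework's API surface may differ slightly):

    using System.Threading;

    class ScannerWork
    {
        // Stand-in for your real delegate body.
        static void DoScanWork() { /* ... */ }

        // Option 1: queued - runs whenever a pool thread becomes free.
        static void StartQueued()
        {
            ThreadPool.QueueUserWorkItem(delegate { DoScanWork(); });
        }

        // Option 2: dedicated thread - starts right away, at the cost of
        // creating (and later tearing down) a thread.
        static void StartDedicated()
        {
            var t = new Thread(DoScanWork);
            t.IsBackground = true;   // drop this if your platform lacks the property
            t.Start();
        }
    }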
Edit: Just to be clear, if thread pool threads are indeed free, they will pick up work and execute it usually faster than starting a fresh thread.
A ThreadPool has a limited size, so you can't launch as many threads as you want at the same time. If all the threads are busy, then you have to wait for one to become available.
Check the number of threads you want to launch and compare it to the ThreadPool size -> GetMaxThreads()
Then, if you want more threads, just resize the pool with SetMaxThreads(int)
If you start a lot of threads from a pool, you can get into a situation where there is no free thread and your request is queued; that's why it sometimes starts too late. Try to increase the maximum number of worker threads in the pool. Use ThreadPool.SetMaxThreads and ThreadPool.SetMinThreads to configure the pool.
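For instance (the numbers here are arbitrary examples, and on the Compact Framework these methods may not be available, so treat this as a desktop-framework sketch):

    using System.Threading;

    class PoolConfig
    {
        static void ConfigurePool()
        {
            // Keep at least 10 worker and 10 completion-port threads warm so
            // bursts of work don't wait for the pool's slow ramp-up.
            ThreadPool.SetMinThreads(10, 10);

            // Cap the pool so a flood of requests can't create unbounded threads.
            ThreadPool.SetMaxThreads(50, 50);
        }
    }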

Thread: How to re-start thread once completed?

I have a method void DoWork(object input) that takes roughly 5 seconds to complete. I have read that Thread is better suited than ThreadPool for these longer operations but I have encountered a problem.
I click a button which calls threadRun.Start(input) which runs and completes fine. I click the button again and receive the following exception:
Thread is running or terminated; it cannot restart.
Can you not "reuse" a Thread? Should I use ThreadPool? Why is Thread "better suited for longer operations" compared to ThreadPool? If you can't reuse a thread, why use it at all (i.e. what advantages does it offer)?
Can you not "reuse" a Thread?
You can. But you have to code the thread not to terminate but to instead wait for more work. That's what a thread pool does.
Should I use ThreadPool?
If you want to re-use a thread, yes.
Why is Thread "better suited for longer operations" compared to ThreadPool?
Imagine a thread pool that is serving a large number of quick operations. You don't want to have too many threads, because the computer can only do so many things at a time. Each long operation you make the thread pool do ties up a thread from the pool. So the pool either has to have lots of extra threads or may run short of threads. Neither leads to an efficient thread pool design.
For longer operations, the overhead of creating and destroying a thread is very small in comparison to the cost of the operation. So the normal downside of using a thread just for the operation doesn't apply.
If you can't reuse a thread, why use it at all (i.e. what advantages does it offer)?
I'm assuming you mean using a thread dedicated to a job that then terminates over using a thread pool. The advantage is that the number of threads will always equal the number of jobs this way. This means you have to create a thread every time you start a job and destroy a thread every time you finish one, but you never have extra threads nor do you ever run short on threads. (This can be a good thing with I/O bound threads but can be a bad thing if most threads are CPU bound most of the time.)
Thread.Start documentation says:
Once the thread terminates, it cannot be restarted with another call to Start.
Threads are not reusable. I have already faced this problem a while ago, the solution was to create a new Thread instance whenever needed.
It looks like this is by design.
I encountered the same problem and the only solution I could find was to recreate the thread. In my case I wasn't restarting the thread very often so I didn't look any further.
A search now has turned up this thread on social.msdn where the accepted answer states:
a stopped or aborted thread cannot be started again.
MSDN repeats this as well:
trying to restart an aborted thread by calling Start on a thread that has terminated throws a ThreadStateException.
As the message states, you cannot restart the thread. You can simply create a new thread for your next operation. Or, you might consider a design where the background thread keeps working until it completes all of your tasks, rather than launch a new thread for each one.
for(;;){} or while(true){} are useful constructs to 'reuse' a thread. Typically, the thread waits on some synchronization object at the top of these loops. In your example, you could wait on an event or semaphore and signal it from your button OnClick() handler.
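Something along these lines, using the DoWork(object) method from the question (the class and member names are illustrative):

    using System.Threading;

    class ReusableWorker
    {
        private readonly AutoResetEvent workReady = new AutoResetEvent(false);
        private object pendingInput;

        public ReusableWorker()
        {
            var t = new Thread(WorkLoop);
            t.IsBackground = true;    // let the process exit even while the loop waits
            t.Start();
        }

        // Call this from the button's Click handler instead of Thread.Start().
        public void Submit(object input)
        {
            pendingInput = input;
            workReady.Set();          // wake the loop up
        }

        private void WorkLoop()
        {
            while (true)
            {
                workReady.WaitOne();  // sleep until Submit() signals new work
                DoWork(pendingInput); // the roughly 5-second operation
            }
        }

        private void DoWork(object input) { /* ... */ }
    }

This assumes one job is submitted at a time; if clicks can overlap, protect pendingInput with a lock or a queue.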
It's just in background mode. It sounds like you need to use the ThreadPool because re-starting and re-creating Thread objects are very expensive operations. If you have a long running job that may last longer than your main process, then consider the use of a Windows Service.

Is ThreadPool worth it in this scenario?

I have a thread that I fire off every time the user scans a barcode.
Most of the time it is a fairly short running thread. But sometimes it can take a very long time (waiting on an invoke to the GUI thread).
I have read that it may be a good idea to use the ThreadPool for this rather than just creating my own thread for each scan.
But I have also read that if the ThreadPool runs out of threads then it will just wait until some other thread exits (not OK for what I am doing).
So, how likely is it that I am going to run out of threads? And is the benefit of the ThreadPool really worth it? (When I scan it does not seem to take too long for the scan to "run" the thread logic.)
It depends on what you mean by "a very long time" and how common that scenario is.
The MSDN topic "The Managed Thread Pool" offers good guidelines for when not to use thread pool threads:
There are several scenarios in which it is appropriate to create and manage your own threads instead of using thread pool threads:
You require a foreground thread.
You require a thread to have a particular priority.
You have tasks that cause the thread to block for long periods of time. The thread pool has a maximum number of threads, so a large number of blocked thread pool threads might prevent tasks from starting.
You need to place threads into a single-threaded apartment. All ThreadPool threads are in the multithreaded apartment.
You need to have a stable identity associated with the thread, or to dedicate a thread to a task.
Since the user will never scan more than one barcode at a time, the memory costs of the threadpool might not be worth it - I'd stick with a single thread just waiting in the background.
The point of the thread pool is to amortize the cost of creating threads, which are not inexpensive to spin up and tear down. If you have a short-running task, the cost of creating/destroying the thread can be a significant portion of the overall run-time. The maximum number of threads in the thread pool depends on the version of the .NET Framework, typically dozens to hundreds per processor. The number of threads is scaled depending on available work.
Will you run out of threads and have to wait for a thread to become available? It depends on your workload. You can get the maximum number of threads available via ThreadPool.GetMaxThreads(). Chances are (based on the description of your problem) that this number is sufficiently high.
http://msdn.microsoft.com/en-us/library/system.threading.threadpool.getmaxthreads.aspx
Another option would be to manage your own pool of scan threads and assign them work rather than creating a new thread for every scan. Personally I would try the threadpool first and only manage your own threads if it proved necessary. Even better, I would look into async programming techniques in .NET. The methods will be run on the thread pool, but give you a much nicer programming experience than manual thread management.
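As a rough illustration of the async route (requires .NET 4.5 / C# 5 or later; ProcessScan and UpdateUi are placeholders for your own logic):

    using System.Threading.Tasks;

    class ScanHandler
    {
        // Placeholder for the scan-processing logic.
        static string ProcessScan(string barcode) { return barcode.ToUpperInvariant(); }

        // If this is called from the UI thread, the work runs on a thread pool
        // thread and the code after 'await' resumes back on the UI thread,
        // so touching controls there is safe.
        public async Task HandleScanAsync(string barcode)
        {
            string result = await Task.Run(() => ProcessScan(barcode));
            UpdateUi(result);
        }

        void UpdateUi(string result) { /* update a label, grid, etc. */ }
    }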
If most of the time it is short running threads you could use the thread pool or a BackgroundWorker which draws threads from the pool.
An advantage I can see in your case is that the threadpool class puts an upper limit on the number of threads that may be active. Whether you will exhaust system resources depends on the context of your application, and exhausting a modern desktop system is really VERY hard to do.
If the software is used at a supermarket till, it is highly unlikely that you will have more than 5 barcodes being analysed at the same time. If it runs on a back-end server for a whole row of supermarket tills, then perhaps 30-100 concurrent requests might be active.
With this sort of theory crafting it is highly unlikely that you will run out of threads, even on embedded hardware. If you have a dozen or so requests active at a time, and your code works, it's OK to just leave it as it is.
A thread pool is just an abstraction, though, and you could put a queue in the middle that feeds requests onto the thread pool. In that scenario, for the row-of-tills example above, I'd feel comfortable queueing 100-1000 requests against a threadpool with 10 threads.
In .net (and on windows in general), the question should always be reversed: "Is creating a new thread worth it in this scenario?"
Creating a new thread is expensive, and doing it over and over again is almost certainly not worth it. The thread pool is cheap, and really should be the first thing you turn to when you need a new thread.
If you decide to spin up a new thread, soon you will start worrying about re-using the thread if it's already running. Then you will start worrying that sometimes the thread is running but it seems to be taking too long, and so you should make a new one. Then you're going to decide to have a thread not exit immediately upon finishing work, but to wait a little while in case new work comes in. And then... bam! You've created your own thread pool. At which point you should just back up and use the system-provided one.
The folks who mentioned that the thread pool might "run out of threads" were well-intentioned, but they did you a disservice. The limit on the number of threads in the thread pool is quite large. If you run into it, you have other problems.
(And, of course, since .net 2.0, you can set the maximum number of threads, so you can tweak the number if you absolutely have to.)
Others have directed you to MSDN: "The Managed Thread Pool". I will repeat that direction, as the article is good, but in my mind does not sell the thread pool hard enough. :)

Why is the .net ThreadPool used only for short time span tasks?

I've read in many places that the .net ThreadPool is meant for short time span tasks (maybe not more than 3 seconds). In all these mentions I've not found a concrete reason why it should not be used otherwise.
Some people even said that it leads to nasty results if we use it for long-running tasks, and that it can lead to deadlocks.
Can somebody explain, in plain English and with the technical reasons, why we should not use the thread pool for long-running tasks?
To be specific, I'll even give a scenario and would like to know why the ThreadPool should not be used in it, with proper reasons behind it.
Scenario: I need to process some thousands of users' data. Each user's processing data is retrieved from a local database, and using that information I need to connect to an API hosted at some other location; the response from the API will be stored in the local database after processing it.
Can someone explain the pitfalls in this scenario if I use the ThreadPool with a thread limit of 20? The processing time for each user may range from 3 seconds to 1 minute (or more).
The point of the threadpool is to avoid the situation where the time spent creating the thread is longer than the time spent using it. By reusing existing threads, we get to avoid that overhead.
The downside is that the threadpool is a shared resource: if you're using a thread, something else can't. So if you have lots of long-running tasks, you could end up with thread-pool starvation, possibly even leading to deadlock.
Don't forget that your application's code may not be the only code using the thread pool... the system code uses it a lot too.
It sounds like you might want to have your own producer/consumer queue, with a small number of threads processing it. Alternatively, if you could talk to your other service using an asynchronous API, you may find that each bit of processing on your computer would be short-lived.
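A minimal sketch of such a producer/consumer queue, assuming .NET 4's BlockingCollection (the UserRecord type and the method names are invented for illustration):

    using System.Collections.Concurrent;
    using System.Threading;

    // Stand-in for the data pulled from the local database.
    class UserRecord { public int Id; }

    class UserProcessor
    {
        private readonly BlockingCollection<UserRecord> queue =
            new BlockingCollection<UserRecord>();

        // Start a small, fixed number of dedicated consumer threads.
        public void Start(int workerCount)
        {
            for (int i = 0; i < workerCount; i++)
            {
                var t = new Thread(ConsumeLoop);
                t.IsBackground = true;
                t.Start();
            }
        }

        // Producer side: enqueue users as they are read from the database.
        public void Enqueue(UserRecord user) { queue.Add(user); }
        public void FinishedAdding() { queue.CompleteAdding(); }

        private void ConsumeLoop()
        {
            // GetConsumingEnumerable blocks until an item arrives and exits
            // cleanly once CompleteAdding() has been called and the queue drains.
            foreach (UserRecord user in queue.GetConsumingEnumerable())
            {
                CallRemoteApiAndStore(user);   // the 3-second-to-1-minute per-user work
            }
        }

        private void CallRemoteApiAndStore(UserRecord user) { /* ... */ }
    }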
It is related to the way the threadpool scheduler works. It tries hard to ensure that it won't release more waiting threads than you have CPU cores. Which is a good idea: running more threads than cores is wasteful, as Windows spends time context-switching between threads, making the overall time needed to complete the jobs longer.
As soon as a TP thread completes, another one is allowed to run. Two times per second, the TP scheduler steps in when the running threads do not complete. It cannot tell why these threads are taking so much time to get their job done. Half a second is a lot of CPU cycles, a cool billion or so. It therefore assumes that the threads are blocking, waiting for some kind of I/O to complete. Like a dbase query, a disk read, a socket connection attempt, stuff like that.
And it allows another thread to run. You've now got more threads than you have cores. Which isn't really a problem if those original threads are indeed blocking; they're not consuming any CPU cycles.
You can see where this leads: if your thread runs for 3 seconds, then it's creating a bit of a logjam. It delays, but won't block, other TP threads that are waiting to run. If your thread needs to spend so much time because it is constantly blocking, then you are better off creating a regular Thread. And if you really care that the thread does not get delayed by the TP scheduler, then you should use a Thread as well.
The TP scheduler was tinkered with in .NET 4.0, by the way; what I wrote is really only true for earlier releases. The basics are still there, it just uses a smarter scheduling algorithm: based on feedback, it schedules dynamically by measuring throughput. This really only matters if you have a lot of TP threads going.
Two reasons not really touched upon:
The threadpool is used as the normal means of handling I/O callback functions, which are usually supposed to happen very soon after associated I/O operation completes. In general, timeliness is more important with short tasks than long ones, but long-running tasks in the threadpool will delay the execution of notification tasks which could have (and should have) started up, run, and completed quickly.
If a threadpool task becomes blocked until such time as some other threadpool task runs, it may hog a threadpool thread, thus delaying or in some cases blocking altogether the start of that other task (or any others).
Generally, having a threadpool thread acquire a lock (waiting if necessary) isn't a problem. If it's necessary for one threadpool thread to wait for another threadpool thread to release a lock, the fact that the latter thread acquired the lock in the first place implies that it got started. On the other hand, waiting for, e.g., some data to arrive from a connection may cause deadlock if an I/O callback routine is used to flag the arrival of data. If too many threadpool threads are waiting for the I/O callback to signal that data has arrived, the system may decide to defer the callback until one of the threadpool threads completes.
