So far in my experience with Windows Phone 7 application development I have noticed there are several different ways to run an action on an asynchronous thread:
System.Threading.Thread
System.ComponentModel.BackgroundWorker
System.Threading.ThreadPool.QueueUserWorkItem()
I couldn't see any tangible difference between these methods (other than that the first two are more traceable).
Is there anything you guys consider before using any of these methods? Which one would you prefer, and why?
The question is kinda answered but the answers are a little short on detail (IMO).
Let's take each in turn.
System.Threading.Thread
All threads (in the CLR, anyway) are ultimately represented by this class. However, you probably included it to ask when we might want to create an instance ourselves.
The answer is rarely. Ordinarily the day-to-day workhorse for dispatching background tasks is the ThreadPool. However, there are some circumstances where we would want to create our own thread. Typically such a thread would live for most of the app runtime. It would spend most of its life blocked on some wait handle. Occasionally we signal this handle and it comes alive to do something important, but then it goes back to sleep. We don't use a ThreadPool work item for this because we do not countenance the idea that it may queue up behind a large set of outstanding tasks, some of which may themselves (perhaps inadvertently) be blocked on some other wait.
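A minimal sketch of such a long-lived, signal-driven thread (the class and member names here are mine, purely illustrative):

using System.Threading;

class SignalledWorker
{
    private readonly AutoResetEvent _signal = new AutoResetEvent(false);
    private readonly Thread _thread;
    private volatile bool _running = true;

    public SignalledWorker()
    {
        // A dedicated thread that lives for most of the app runtime.
        _thread = new Thread(Run) { IsBackground = true, Name = "SignalledWorker" };
        _thread.Start();
    }

    public void Poke() { _signal.Set(); }   // called from elsewhere when there is work to do

    public void Stop() { _running = false; _signal.Set(); _thread.Join(); }

    private void Run()
    {
        while (_running)
        {
            _signal.WaitOne();                    // spends most of its life blocked here
            if (_running) DoSomethingImportant(); // wakes, does the work, goes back to sleep
        }
    }

    private void DoSomethingImportant() { /* ... */ }
}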
System.ComponentModel.BackgroundWorker
This is a friendly class wrapper around a ThreadPool work item, aimed at the UI-oriented developer who occasionally needs to use a background thread. Having its events dispatched on the UI thread makes it easy to consume.
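For example (a rough sketch; progressBar, statusLabel and DoSomeSliceOfWork stand in for your own UI elements and work):

using System.ComponentModel;

var worker = new BackgroundWorker { WorkerReportsProgress = true };

worker.DoWork += (s, e) =>
{
    // Runs on a ThreadPool thread behind the scenes.
    for (int i = 0; i <= 100; i += 10)
    {
        DoSomeSliceOfWork();        // placeholder for the real work
        worker.ReportProgress(i);
    }
    e.Result = 42;
};

// Both of these events are raised back on the UI thread, so no Invoke gymnastics needed.
worker.ProgressChanged    += (s, e) => progressBar.Value = e.ProgressPercentage;
worker.RunWorkerCompleted += (s, e) => statusLabel.Text = "Done: " + e.Result;

worker.RunWorkerAsync();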
System.Threading.ThreadPool.QueueUserWorkItem
This is the day-to-day workhorse when you have some work you want done on a background thread. It eliminates the expense of allocating and deallocating individual threads to perform each task, and it limits the number of thread instances so that too many operations trying to run in parallel cannot gobble up the available resources.
QueueUserWorkItem is my preferred option for invoking background operations.
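For example, the simplest possible usage (the URL and DownloadAndCache are just placeholders):

using System.Threading;

ThreadPool.QueueUserWorkItem(state =>
{
    var url = (string)state;
    DownloadAndCache(url);   // placeholder for the actual background work
}, "http://example.com/feed.xml");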
It arguably depends on what you are trying to do; you have listed three very different threading models.
Basic threading
Designed for applications with a separate UI thread.
Managed thread pool
Have you read MSDN etc...
http://www.albahari.com/threadin
http://msdn.microsoft.com/en-us/library/aa645740(v=vs.71).aspx
You don't state "what for", but
Basic Thread - quite expensive, not for small jobs
BackgroundWorker - mostly for UI + ProgressBar work
ThreadPool - for small independent jobs
I think the TPL is not supported in Silverlight, which is a pity.
The BackgroundWorker tends to be better to use when your UI needs to be updated as your thread progresses, because it handles invoking the callbacks (such as the ProgressChanged handler) on the UI thread rather than the background thread. The other two don't do this work; it is up to you to do it.
I am coding an application that runs many threads in the background which have to report back to the main thread so it can update a table in the interface. In the past, the worker threads were ordinary separate classes (named Citizen) which I ran from the main thread using something like
new Thread(new ThreadStart(citizen.ProcessActions)).Start();
where the ProcessActions function was the main function which did all the background work. Before actually starting the thread, I would register event handlers so the Citizen threads could log/report some stuff to the interface. Usually there are tens of these Citizen threads (around 50) and they're pretty big classes - each has its own HTTP client and it browses the web.
Is this a good way to manage threads? Probably not, to be frank; I'm pretty sure the threads aren't exiting gracefully - once the ProcessActions function gets done, I remove the event handlers and that's it - and the memory usage keeps rising with each new Citizen started.
What would be the best way to manage many (50+) threads with which you have to communicate often? I believe I wouldn't have to worry much about thread safety for Citizen variables, as I wouldn't be accessing them from any thread but their own.
I think what you're looking for is a thread pool. Here's an MSDN article on them, and they should be available in C# 4.0.
The idea would be to create a thread pool, set its count to some high number (say 50), and then start assigning threads to tasks. If the pool needs to expand, it can, but by declaring a high number up front, you get all the expensive creation of threads out of the way.
It might be beneficial to 'queue' tasks that you want to get done, and assign those tasks as threads become available.
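Something along these lines, reusing your Citizen.ProcessActions from the question (CountdownEvent is just one way to know when everything has finished; it assumes .NET 4 and that 'citizens' is your collection of Citizen objects):

using System.Threading;

var done = new CountdownEvent(citizens.Count);

foreach (var citizen in citizens)
{
    var c = citizen;                 // avoid capturing the loop variable (pre-C# 5 behaviour)
    ThreadPool.QueueUserWorkItem(_ =>
    {
        try { c.ProcessActions(); }
        finally { done.Signal(); }   // count down even if ProcessActions throws
    });
}

done.Wait();                         // or poll, if the main thread has other work to do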
Also, memory leaks can be hard to find, but I would start by testing the simple case: take out all threads (just run one Citizen after another from the main thread) and let it run for a long time. If it's still leaking memory, your thread management isn't the issue.
I'm working on a network-bound application, which is supposed to have a lot (hundreds, may be thousands) of parallel processes.
I'm looking for the best way to implement it.
When I tried setting
ThreadPool.SetMaxThreads(int.MaxValue, int.MaxValue);
and then creating 1000 threads and making them do stuff in parallel, the application's execution became really jumpy.
I've heard somewhere that delegate.BeginInvoke is somehow better than new Thread(...), so I tried it, then opened the app in the debugger, and what I saw was parallel threads.
If I have to create lots and lots of threads, what is the best way to ensure that the application is going to run smoothly?
Have you tried the new await / async pattern in C# 5 / .NET 4.5?
I haven't got sources to hand on how this operates under the hood, but one of the most common use cases of this new feature is waiting for IO-bound work.
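Roughly, an IO-bound call ends up looking like this (HttpClient is just an example; the point is that no thread sits blocked per connection while the request is in flight):

using System.Net.Http;
using System.Threading.Tasks;

static async Task<string> FetchAsync(string url)
{
    using (var client = new HttpClient())
    {
        // The calling thread is released here and the method resumes when the response arrives.
        return await client.GetStringAsync(url);
    }
}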
Threads are not lightweight objects. They are expensive to create and context switch to/from; hence the reason for the Thread Pool (pre-created and recycled). Most common solutions that involve networking or other IO ports utilise lower-level IO Completion Ports (there is a managed library here) to "wait" on a port, but where the thread can continue executing as normal.
BeginInvoke will utilise a Thread Pool thread, so it will be better than creating your own only if a thread is available. This approach, if used too heavily, can immediately result in thread starvation.
Setting such a high thread pool count is not going to work in the long run as threads are too heavy for what it appears you want to do.
Axum, a former Microsoft Research language, used to achieve massive parallelism that would have been suitable for this task. It operated similarly to Stackless Python or Erlang. Lots of concepts from Axum made their way into the parallelism drive into C# 5 and .NET 4.5.
Setting the ThreadPool.SetMaxThreads will only affect how many threads the thread pool has, and it won't make a difference regarding threads you create yourself with new Thread().
Go async (model, not keyword) as suggested by many.
You should follow the advice mentioned in the other answers and comments. As fsimonazzi says, creating new threads directly has nothing to do with the ThreadPool. For a quick test, lower the max worker and completion-port threads and use the ThreadPool.QueueUserWorkItem method. The ThreadPool will decide what your system can handle, queue your tasks, and reuse threads whenever it can.
If your tasks are not compute-bound then you should also utilize asynchronous I/O. You do not want your worker threads to wait for I/O completion; you need those worker threads to return to the pool as quickly as possible and not block on I/O requests.
The Microsoft .NET Base Class Library provides several ways to create a thread and start it. Basically every invocation is very similar to the others providing the same kind of service: create an object representing an execution flow (or more than one), assign it a delegate representing the code to execute and, optionally (depending on the delegate signature), an object to pass as a parameter.
Well, there are two approaches (essentially):
1) Using the System.Threading.Thread class.
Thread curr = new Thread(myfunction); /* In a class, myfunction is a void taking an object */
curr.Start(new Object()); /* Or something else to be downcast */
2) Using the System.Threading.ThreadPool class.
ThreadPool.QueueUserWorkItem(myfunction, new Object()); /* Same philosophy here */
Are there any special reasons why I should use 1) or 2)?
Performance reasons?
Patterns?
What is the best approach?
I have a feeling that the answer is: "It depends on the situation." Could you please list some situations where one approach is better than the other?
Starting a new thread can be a very expensive operation. The thread pool reuses threads and thus amortizes the cost. Unless you need a dedicated thread, the thread pool is the recommended way to go. By using a dedicated thread you have more control over thread specific attributes such as priority, culture and so forth. Also, you should not do long running tasks on the thread pool as it will force the pool to spawn additional threads.
In addition to the options you mention .NET 4 offers some great abstractions for concurrency. Check out the Task and Parallel classes as well as all the new PLINQ methods.
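For instance (CountPrimesBelow, files and Compress are just placeholders):

using System;
using System.Threading.Tasks;

// Run a piece of work on the pool and pick up its result via a continuation.
Task<int> count = Task.Factory.StartNew(() => CountPrimesBelow(1000000));
count.ContinueWith(t => Console.WriteLine("Found {0} primes", t.Result));

// Data parallelism over a collection.
Parallel.ForEach(files, file => Compress(file));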
The Managed Thread Pool has some very good guidelines on when NOT to use the thread pool.
In my experience, you want to create your own thread when you need a persistent, dedicated, long-running thread. For everything else, use asynchronous delegates or something like QueueUserWorkItem, BackgroundWorker, or the Task-related features of .NET 4.0.
Threads in the ThreadPool are background threads; all threads created and started via a new Thread object are foreground threads. A background thread does not keep the managed execution environment running.
Refer to http://msdn.microsoft.com/en-us/library/h339syd0.aspx for more.
In .NET 4.5.2 they added a new method: HostingEnvironment.QueueBackgroundWorkItem.
This appears to be an alternative to ThreadPool.QueueUserWorkItem. Both behave similarly, but there are some nice benefits to using the new method when working in ASP.NET:
The HostingEnvironment.QueueBackgroundWorkItem method lets you schedule small background work items. ASP.NET tracks these items and prevents IIS from abruptly terminating the worker process until all background work items have completed. This method can't be called outside an ASP.NET managed app domain.
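A minimal sketch of using it (SendNotificationEmailAsync is a hypothetical helper):

using System.Web.Hosting;

// Inside an ASP.NET 4.5.2+ app: ASP.NET tracks this item and delays worker
// process shutdown until it completes (or the cancellation token is signalled).
HostingEnvironment.QueueBackgroundWorkItem(async cancellationToken =>
{
    await SendNotificationEmailAsync(cancellationToken);
});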
Using the ThreadPool, you have less control over the threading system. This is a trade-off to simplify the process for you. If the ThreadPool gives you everything you need, feel free to use it. If you need more control over the threads, then of course you need to use the Thread class.
ThreadPool.QueueUserWorkItem() is basically for fire-and-forget scenarios, when the application doesn't depend on whether the operation finishes or not.
Use classic threads for fine-grained control.
You should use ThreadPool.QueueUserWorkItem except in cases of:
You require a foreground thread.
You require a thread to have a particular priority.
You have tasks that cause the thread to block for long periods of time. The thread pool has a maximum number of threads, so a large number of blocked thread pool threads might prevent tasks from starting.
You need to place threads into a single-threaded apartment. All ThreadPool threads are in the multithreaded apartment.
You need to have a stable identity associated with the thread, or to dedicate a thread to a task.
Reference link.
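When one of those cases applies, a dedicated thread gives you the knobs the pool does not expose; a sketch with illustrative names (RunImportLoop is a placeholder method):

using System.Threading;

var thread = new Thread(RunImportLoop)
{
    IsBackground = false,                       // foreground thread keeps the process alive
    Priority = ThreadPriority.BelowNormal,      // explicit priority
    Name = "ImportWorker"                       // stable identity for debugging
};
thread.SetApartmentState(ApartmentState.STA);   // single-threaded apartment, e.g. for COM
thread.Start();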
In my WPF application, I want to do some work on a non-UI thread so that the UI does not become unresponsive. For that I did this:
var caller = new AsyncMethodCaller<Pattern>(this.SetPatternType);
caller.BeginInvoke(_patterns, null, null);
And the delegate is defined as,
public delegate void AsyncMethodCaller<in T>(IEnumerable<T> data);
My question is:
Does BeginInvoke() create a new thread, and does the callback SetPatternType run on it? If so, how long does this thread last?
Is this approach good in general? If not, what is wrong with it? And what potential problem(s) might I face?
I'm using C# 4.0 and Visual Studio 2010.
EDIT:
Also, I need a few guidelines regarding these:
When should I create a new thread myself, and when should I make use of BeginInvoke()? And when should I use DispatcherObject.Dispatcher.BeginInvoke()?
Technically it is not a new thread, it's a thread pool thread, and it might last longer than your call: as soon as it finishes your work it may immediately run some other caller's async work. Check the MSDN articles on asynchronous programming and the ThreadPool to get the complete details.
And depending on your interest, check out I/O completion ports for additional details.
Async programming is generally considered better than synchronous code at the very least, but if you are on .NET 4.0, take a look at the Task Parallel Library.
Based on the question edit: when should I create my own thread?
It is almost always better to use BeginInvoke or async programming than to create your own thread. Only create your own thread when you are sure that you need a dedicated thread doing some task continuously, and you are clear about the synchronization mechanics needed between the threads in your application. Avoid creating new threads as long as you can, unless you have a really compelling reason. You add a thread today and then move on; two years later three other developers see that an additional thread was added for some continuous task, they add a few more, and so on. Trust me, I've seen this happen, so set the right practices (i.e. use async methods) and people will tend to follow them. I've seen applications with 150 threads; does that make sense on a dual-core or quad-core machine? I don't think so.
Just checked all the running processes on my Toshiba laptop for such badly designed apps; Toshiba Bluetooth Manager wins the crown of worst-designed program on my box, using 53 threads. :)
It uses the thread pool - so it won't necessarily create a new thread, but it runs in a different thread to the calling thread (unless that's a thread pool thread itself which happens to finish its task before the delegate invocation is scheduled; it would be pretty unlikely to use the same thread).
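You can verify this yourself on .NET Framework (note that delegate BeginInvoke is not supported on .NET Core / .NET 5+):

using System;
using System.Threading;

Action work = () =>
{
    // Prints True: the delegate runs on a ThreadPool thread.
    Console.WriteLine("Pool thread? {0}, id {1}",
        Thread.CurrentThread.IsThreadPoolThread,
        Thread.CurrentThread.ManagedThreadId);
};

IAsyncResult ar = work.BeginInvoke(null, null);
work.EndInvoke(ar);   // always pair BeginInvoke with EndInvoke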
Dispatcher.CurrentDispatcher is new stuff in WPF (kind of replaces InvokeRequired stuff in WinForms).
You can use the Dispatcher to queue any updates you want made to the GUI, and it has different priorities you can choose from; see this MSDN link.
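For example, from your background thread (resultsList and patterns are placeholders for your own UI element and data):

using System;
using System.Windows;
using System.Windows.Threading;

// Marshal the UI update back onto the UI thread with a chosen priority.
Application.Current.Dispatcher.BeginInvoke(
    DispatcherPriority.Background,
    new Action(() => resultsList.ItemsSource = patterns));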
Scenario
We have a C# .Net Web Application that records incidents. An external database needs to be queried when an incident is approved by a supervisor. The queries to this external database are sometimes taking a while to run. This lag is experienced through the browser.
Possible Solution
I want to use threading to eliminate the perceived hang in the browser. I have used the Thread class before and have heard about ThreadPool, but I just found BackgroundWorker in this post.
MSDN states:
The BackgroundWorker class allows you to run an operation on a separate, dedicated thread. Time-consuming operations like downloads and database transactions can cause your user interface (UI) to seem as though it has stopped responding while they are running. When you want a responsive UI and you are faced with long delays associated with such operations, the BackgroundWorker class provides a convenient solution.
Is BackgroundWorker the way to go when handling long-running queries?
What happens when two or more BackgroundWorker processes are run simultaneously? Are they handled like a pool?
Yes, BackgroundWorker can significantly simplify your threading code for long-running operations. The key is registering for the DoWork, ProgressChanged, and RunWorkerCompleted events. These help you avoid having to have a bunch of synchronization objects passed back and forth with the thread to try to determine the progress of the operation.
Also, I believe the progress events are called on the UI thread, avoiding the need for calls to Control.Invoke to update your UI.
To answer your last question: yes, the threads are allocated from the .NET thread pool, so while you may instantiate as many BackgroundWorker objects as you'd like, you can only run as many concurrent operations as the thread pool will allow.
If you're using .NET 4 (or can use the TPL backport from the Rx Framework), then one nice option is to use a Task created with the LongRunning hint.
This provides many options difficult to accomplish via the ThreadPool or BackgroundWorker, including allowing for continuations to be specified at creation time, as well as allowing for clean cancellation and exception/error handling.
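A rough sketch of what that could look like for a long-running external query (RunExternalQuery, DisplayResults and LogError are placeholders):

using System;
using System.Threading;
using System.Threading.Tasks;

var cts = new CancellationTokenSource();

// LongRunning hints that this work should not tie up a regular pool thread.
var query = Task.Factory.StartNew(
    () => RunExternalQuery(cts.Token),
    cts.Token,
    TaskCreationOptions.LongRunning,
    TaskScheduler.Default);

// Continuations declared up front: one for success, one for failure.
query.ContinueWith(t => DisplayResults(t.Result),
    TaskContinuationOptions.OnlyOnRanToCompletion);
query.ContinueWith(t => LogError(t.Exception),
    TaskContinuationOptions.OnlyOnFaulted);

// Later, if the user gives up waiting:
// cts.Cancel();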
I ran into a similar situation with long-running queries. I used the asynchronous invocation provided by delegates; you can use the BeginInvoke method of the delegate.
BackgroundWorkers are just like any other threads, except they can be killed or quit without exiting the main thread and your application.
The ThreadPool uses a pool of background threads. It is the preferred way to handle most multithreading scenarios because .NET manages the threads for you, and it reuses them instead of creating new ones as needed, which is an expensive process.
Such threading scenarios are great for processor intensive code.
For something like a query that happens externally, you also have the option of asynchronous data access. You can hand off the query request and give it the name of your callback method, which will be called when the query is finished and can then do something with the result (i.e. update the UI status or display the returned data).
.NET has built-in support for asynchronous data querying:
http://www.devx.com/dotnet/Article/26747
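A sketch of the older Begin/End style against SQL Server (the query and connection string are placeholders; on .NET 4.0 and earlier the connection string needs "Asynchronous Processing=true"):

using System.Data.SqlClient;

var conn = new SqlConnection(connectionString);
var cmd = new SqlCommand("SELECT ... FROM ExternalIncidents", conn);   // placeholder query
conn.Open();

cmd.BeginExecuteReader(ar =>
{
    using (var reader = cmd.EndExecuteReader(ar))
    {
        while (reader.Read())
        {
            // report progress / push results back to the page here
        }
    }
    conn.Close();
}, null);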