I've got the following code. As you can see, the background worker searches for files, and in the ProgressChanged event the files are added to a ListView. However, since a lot of files are being added, the UI becomes unresponsive. I could Sleep the thread inside the loops, but I don't think that's good practice. What's the best way to prevent the UI freeze?
To elaborate: listView1 is a control on a form.
void bg_DoWork(object sender, DoWorkEventArgs e)
{
    Stack<string> dirs = new Stack<string>(20);
    dirs.Push(e.Argument.ToString());

    while (dirs.Count > 0)
    {
        string currentDir = dirs.Pop();

        string[] subDirs;
        try { subDirs = System.IO.Directory.GetDirectories(currentDir); }
        catch (UnauthorizedAccessException) { continue; }
        catch (System.IO.DirectoryNotFoundException) { continue; }

        string[] files = null;
        try { files = System.IO.Directory.GetFiles(currentDir); }
        catch (UnauthorizedAccessException) { continue; }
        catch (System.IO.DirectoryNotFoundException) { continue; }

        foreach (var file in files) { bg.ReportProgress(0, file); }
        foreach (string str in subDirs) { dirs.Push(str); }
    }
}

void bg_ProgressChanged(object sender, ProgressChangedEventArgs e)
{
    listView1.Items.Add(e.UserState.ToString());
}
So the issue here is that ReportProgress is actually asynchronous. It doesn't wait for the corresponding UI update to actually be made before it continues doing work. Normally this is great. In most situations there's no compelling reason to slow down your productive work just to wait for UI updates.
There's one exception though. If you call ReportProgress so often that the previous progress update hasn't completed before the next one is added, you end up filling the message queue with requests to update progress. You have a great many files, and getting those lists of files takes very little time - quite a bit less time than it takes to marshal to the UI thread and update the UI.
Because this queue ends up being backed up, any other UI updates need to sit through that long queue before they can do anything.
Batching up the updates and indicating progress less often is one possible solution. It may or may not be acceptable, given your situation. It will almost certainly help, but depending on just how long it takes the UI to be updated based on whatever it is that you're doing, and how quickly you can generate data, it's possible that even this will cause problems. If it works for your specific case though, great.
The other option is to change how you update progress so that your worker waits for the UI update before continuing. Obviously this is something you should avoid unless you need it, because while you'll no longer freeze the UI, your work will take a fair bit longer. There are any number of ways of doing this; the simplest is likely to just use Invoke (not BeginInvoke):
foreach (var file in files)
    listView1.Invoke(new Action(() => listView1.Items.Add(file)));
While calling Invoke from a BackgroundWorker is generally a code smell and should be avoided, this is a somewhat exceptional case.
Note that even if you do end up resorting to Invoke here, I would still suggest batching the invokes so that you add more than one item per invoke. If the number of files in a single directory is sufficiently low, put the whole foreach inside the Invoke; and if your subdirectories tend to have very few files (i.e., they're very deep, not broad), consider putting all of the files into a temporary list until it's large enough to be worth batching into an Invoke. Play around with different approaches based on your data to see what works best.
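As a sketch of that batching idea (the `Batcher` helper is mine, and the batch size of 100 is just a starting point to tune against your own data), split the file list into chunks and issue one Invoke per chunk:

```csharp
using System.Collections.Generic;

static class Batcher
{
    // Split a sequence into chunks of at most 'size' items.
    public static IEnumerable<string[]> Batch(IEnumerable<string> items, int size)
    {
        var buffer = new List<string>(size);
        foreach (var item in items)
        {
            buffer.Add(item);
            if (buffer.Count == size)
            {
                yield return buffer.ToArray();
                buffer.Clear();
            }
        }
        if (buffer.Count > 0)
            yield return buffer.ToArray();   // flush the remainder
    }
}

// In the worker: one Invoke per chunk instead of one per file.
// BeginUpdate/EndUpdate suppresses repainting while the chunk is added.
//
// foreach (string[] chunk in Batcher.Batch(files, 100))
//     listView1.Invoke(new Action(() =>
//     {
//         listView1.BeginUpdate();
//         foreach (string f in chunk) listView1.Items.Add(f);
//         listView1.EndUpdate();
//     }));
```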
bg.ReportProgress() is meant to report the overall progress of the BackgroundWorker back to the UI thread, so you can inform your users as to the progress. However, you're using it to actually add strings to a ListView. You're better off compiling the list of files into an in-memory list and then populating listView1 once when the background worker completes:
public void bg_RunWorkerCompleted(object sender, RunWorkerCompletedEventArgs e)
{
    foreach (var file in MyFileListVar)
    {
        listView1.Items.Add(file);
    }
}
Try to load multiple files (let's say between 10 and 50), then send them back to the UI thread in one call (i.e., one bg.ReportProgress per batch) instead of sending every file separately.
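One way to sketch that batching (the `BatchReporter` helper and the batch size are my own; the `report` callback stands in for `bg.ReportProgress(0, batch)`):

```csharp
using System;
using System.Collections.Generic;

static class BatchReporter
{
    // Collect files and hand them to 'report' in groups of 'batchSize'.
    public static void ReportInBatches(IEnumerable<string> files, int batchSize,
                                       Action<string[]> report)
    {
        var pending = new List<string>(batchSize);
        foreach (var file in files)
        {
            pending.Add(file);
            if (pending.Count == batchSize)
            {
                report(pending.ToArray());
                pending.Clear();
            }
        }
        if (pending.Count > 0)
            report(pending.ToArray());   // flush the remainder
    }
}

// In ProgressChanged the whole batch arrives as e.UserState:
//
// foreach (string file in (string[])e.UserState)
//     listView1.Items.Add(file);
```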
Related
Windows service: I'm generating a set of FileSystemWatcher objects from a list of directories to watch in a config file, with the following requirements:
File processing can be time consuming - events must be handled on their own task threads
Keep handles to the event handler tasks to wait for completion in an OnStop() event.
Track the hashes of uploaded files; don't reprocess if not different
Persist the file hashes to allow OnStart() to process files uploaded while the service was down.
Never process a file more than once.
(Regarding #3, we do get events when there are no changes... most notably because of the duplicate-event issue with FileWatchers)
To do these things, I have two dictionaries - one for the files uploaded, and one for the tasks themselves. Both objects are static, and I need to lock them when adding/removing/updating files and tasks. Simplified code:
public sealed class TrackingFileSystemWatcher : FileSystemWatcher {

    private static readonly object fileWatcherDictionaryLock = new object();
    private static readonly object runningTaskDictionaryLock = new object();
    private readonly Dictionary<int, Task> runningTaskDictionary = new Dictionary<int, Task>(15);
    private readonly Dictionary<string, FileSystemWatcherProperties> fileWatcherDictionary = new Dictionary<string, FileSystemWatcherProperties>();

    // Wired up elsewhere
    private void OnChanged(object sender, FileSystemEventArgs eventArgs) {
        this.ProcessModifiedDatafeed(eventArgs);
    }

    private void ProcessModifiedDatafeed(FileSystemEventArgs eventArgs) {

        lock (TrackingFileSystemWatcher.fileWatcherDictionaryLock) {

            // Read the file and generate hash here

            // Properties if the file has been processed before
            // ContainsNonNullKey is an extension method
            if (this.fileWatcherDictionary.ContainsNonNullKey(eventArgs.FullPath)) {
                try {
                    fileProperties = this.fileWatcherDictionary[eventArgs.FullPath];
                }
                catch (KeyNotFoundException keyNotFoundException) {}
                catch (ArgumentNullException argumentNullException) {}
            }
            else {
                // Create a new properties object
            }

            fileProperties.ChangeType = eventArgs.ChangeType;
            fileProperties.FileContentsHash = md5Hash;
            fileProperties.LastEventTimestamp = DateTime.Now;

            Task task;
            try {
                task = new Task(() => new DatafeedUploadHandler().UploadDatafeed(this.legalOrg, datafeedFileData), TaskCreationOptions.LongRunning);
            }
            catch {
                ..
            }

            // Only lock long enough to add the task to the dictionary
            lock (TrackingFileSystemWatcher.runningTaskDictionaryLock) {
                try {
                    this.runningTaskDictionary.Add(task.Id, task);
                }
                catch {
                    ..
                }
            }

            try {
                task.ContinueWith(t => {
                    try {
                        lock (TrackingFileSystemWatcher.runningTaskDictionaryLock) {
                            this.runningTaskDictionary.Remove(t.Id);
                        }

                        // Will this lock burn me?
                        lock (TrackingFileSystemWatcher.fileWatcherDictionaryLock) {
                            // Persist the file watcher properties to
                            // disk for recovery at OnStart()
                        }
                    }
                    catch {
                        ..
                    }
                });
                task.Start();
            }
            catch {
                ..
            }
        }
    }
}
What's the effect of requesting a lock on the FileSystemWatcher collection in the ContinueWith() delegate when the delegate is defined within a lock on the same object? I would expect it to be fine, that even if the task starts, completes, and enters the ContinueWith() before ProcessModifiedDatafeed() releases the lock, the task thread would simply be suspended until the creating thread has released the lock. But I want to make sure I'm not stepping on any delayed execution landmines.
Looking at the code, I may be able to release the lock sooner, avoiding the issue, but I'm not certain yet... need to review the full code to be sure.
UPDATE
To stem the rising "this code is terrible" comments: there are very good reasons why I catch the exceptions I do, and why I am catching so many of them. This is a Windows service with multi-threaded handlers, and it may not crash. Ever. Which is exactly what it will do if any of those threads has an unhandled exception.
Also, those exception handlers are written for future bulletproofing. The example I've given in comments below would be adding a factory for the handlers... as the code is written today, there will never be a null task, but if the factory were implemented incorrectly, the code could throw an exception. Yes, that should be caught in testing. However, I have junior developers on my team... "May. Not. Crash." (It must also shut down gracefully if there is an unhandled exception, allowing currently-running threads to complete - which we do with an unhandled-exception handler set in main().) We have enterprise-level monitors configured to send alerts when application errors appear in the event log - those exceptions will be logged and will flag us. The approach was a deliberate and discussed decision.
Each possible exception has been carefully considered and assigned to one of two categories - those that apply to a single datafeed and will not shut down the service (the majority), and those that indicate clear programming or other errors that fundamentally render the code useless for all datafeeds. For example, we've chosen to shut the service down if we can't write to the event log, as that's our primary mechanism for indicating datafeeds are not getting processed. The exceptions are caught locally, because the local context is the only place where the decision to continue can be made. Furthermore, allowing exceptions to bubble up to higher levels (1) violates the concept of abstraction, and (2) makes no sense in a worker thread.
I'm surprised at the number of people who argue against handling exceptions. If I had a dime for every try..catch(Exception){do nothing} I see, you'd get your change in nickels for the rest of eternity. I would argue to the death1 that if a call into the .NET framework or your own code throws an exception, you need to consider the scenario that would cause that exception to occur and explicitly decide how it should be handled. My code catches UnauthorizedAccessExceptions in IO operations because, when I considered how that could happen, I realized that adding a new datafeed directory requires permissions to be granted to the service account (it won't have them by default).
I appreciate the constructive input... just please don't criticize simplified example code with a broad "this sucks" brush. The code does not suck - it is bulletproof, and necessarily so.
1 I would only argue a really long time if Jon Skeet disagrees
First, your question: it's not a problem in itself to request a lock inside ContinueWith. If it bothers you that the delegate is defined inside another lock block, don't let it - your continuation will execute asynchronously, at a different time, on a different thread.
Now, the code itself is questionable. Why do you use so many try-catch blocks around statements that can hardly throw exceptions? For example, here:
try {
    task = new Task(() => new DatafeedUploadHandler().UploadDatafeed(this.legalOrg, datafeedFileData), TaskCreationOptions.LongRunning);
}
catch {}
You just create a task - I cannot imagine when this could throw. The same goes for ContinueWith. Here:
this.runningTaskDictionary.Add(task.Id, task);
you could just check whether the key already exists. But even that is not necessary, because task.Id is a unique id for the task instance you have just created. This:
try {
    fileProperties = this.fileWatcherDictionary[eventArgs.FullPath];
}
catch (KeyNotFoundException keyNotFoundException) {}
catch (ArgumentNullException argumentNullException) {}
is even worse. You should not use exceptions like this - don't catch KeyNotFoundException; use the appropriate methods on Dictionary (like TryGetValue).
So to start with, remove all the try-catch blocks, and either use one for the whole method or put them around statements that can really throw exceptions and where you cannot handle the situation otherwise (and you know what to do with the exception thrown).
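For the dictionary lookup specifically, a sketch of the TryGetValue pattern (the `FileSystemWatcherProperties` stand-in here is simplified, and its parameterless constructor is an assumption):

```csharp
using System.Collections.Generic;

// Simplified stand-in for the question's FileSystemWatcherProperties.
class FileSystemWatcherProperties
{
    public string FileContentsHash;
}

static class Lookup
{
    // One lookup, no exception on a miss - replaces the try/catch indexer.
    public static FileSystemWatcherProperties GetOrCreate(
        Dictionary<string, FileSystemWatcherProperties> map, string fullPath)
    {
        FileSystemWatcherProperties props;
        if (!map.TryGetValue(fullPath, out props))
        {
            props = new FileSystemWatcherProperties();  // "create a new properties object"
            map[fullPath] = props;
        }
        return props;
    }
}
```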
Then, your approach to handling filesystem events is not quite scalable or reliable. Many programs generate multiple change events in short intervals when saving changes to a file (and there are other cases where multiple events for the same file arrive in sequence). If you just start processing the file on every event, this can lead to all kinds of trouble. So you might need to throttle the events coming in for a given file and only start processing after a certain delay since the last detected change. That might be a bit advanced, though.
Don't forget to grab a read lock on the file as soon as possible, so that other processes cannot change the file while you are working with it (for example, you might calculate the md5 of a file, then someone changes the file, then you start uploading - now your md5 is invalid). Another approach is to record the last write time, and when it comes to uploading, grab a read lock and check that the file was not changed in between.
What is more important is that there can be a lot of changes at once. Say I copy 1000 files very quickly - you do not want to start uploading them all at once on 1000 threads. You need a queue of files to process, with several threads taking items from that queue. That way thousands of events can happen at once and your upload will still work reliably. Right now you create a new thread for each change event and immediately start the upload (judging by the method names) - this will fail under a serious load of events (and in the cases described above).
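A sketch of such a queue (all names are mine; the `upload` delegate stands in for DatafeedUploadHandler.UploadDatafeed, and the bound of 10000 is an arbitrary choice):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// A bounded queue drained by a fixed number of workers: a burst of
// 1000 change events enqueues 1000 paths instead of starting 1000 threads.
class UploadQueue
{
    private readonly BlockingCollection<string> queue =
        new BlockingCollection<string>(10000);
    private readonly Task[] workers;

    public UploadQueue(int workerCount, Action<string> upload)
    {
        workers = new Task[workerCount];
        for (int i = 0; i < workerCount; i++)
            workers[i] = Task.Run(() =>
            {
                // Each worker blocks until an item is available.
                foreach (string path in queue.GetConsumingEnumerable())
                    upload(path);
            });
    }

    public void Enqueue(string path)
    {
        queue.Add(path);
    }

    public void Shutdown()
    {
        queue.CompleteAdding();   // lets GetConsumingEnumerable finish
        Task.WaitAll(workers);
    }
}
```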
No, it will not burn you. Even if the ContinueWith delegate is inlined onto the thread that was running the new Task(() => new DatafeedUploadHandler()...), it will still get the lock, i.e. no deadlock.
The lock statement uses the Monitor class internally, and it is reentrant: a thread can acquire a lock multiple times if it already owns it. See Multithreading and Locking (Thread-Safe operations).
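As a tiny illustration of that reentrancy (a made-up example, not from the question's code):

```csharp
// lock is reentrant: the same thread can enter the same lock again
// without deadlocking, which is why nested acquisition is safe.
class Reentrancy
{
    private readonly object gate = new object();

    public string Outer()
    {
        lock (gate)
        {
            return Inner();   // re-acquires 'gate' on the same thread
        }
    }

    private string Inner()
    {
        lock (gate)           // second acquisition by the owning thread: OK
        {
            return "reached";
        }
    }
}
```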
And the other case, where the task.ContinueWith starts before ProcessModifiedDatafeed has finished, is as you said: the thread running the ContinueWith would simply have to wait to get the lock.
I would seriously consider moving the task.ContinueWith and the task.Start() outside of the lock once you have reviewed it. And based on your posted code, that is possible.
You should also take a look at ConcurrentDictionary in the System.Collections.Concurrent namespace. It would make the code easier, and you don't have to manage the locking yourself. You are doing a kind of compare-and-update here: if (this.fileWatcherDictionary.ContainsNonNullKey(eventArgs.FullPath)) - i.e. only add if not already in the dictionary, which needs to be one atomic operation. ConcurrentDictionary covers exactly this with GetOrAdd (and AddOrUpdate when you also need to modify an existing entry); maybe you can rewrite the code using those methods. And based on your code, you could safely use a ConcurrentDictionary at least for the runningTaskDictionary.
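A sketch of what that could look like (simplified value types, illustrative names; the hash function is a stand-in):

```csharp
using System.Collections.Concurrent;
using System.Threading.Tasks;

// The two dictionaries from the question, without manual locks.
class WatcherState
{
    private readonly ConcurrentDictionary<int, Task> runningTasks =
        new ConcurrentDictionary<int, Task>();
    private readonly ConcurrentDictionary<string, string> hashByPath =
        new ConcurrentDictionary<string, string>();

    public void Track(Task task)
    {
        runningTasks.TryAdd(task.Id, task);            // add
        task.ContinueWith(t =>
        {
            Task removed;
            runningTasks.TryRemove(t.Id, out removed);  // remove when done
        });
    }

    public string GetOrCreateHash(string path)
    {
        // "Only add if not already present" as one atomic call:
        return hashByPath.GetOrAdd(path, p => ComputeHash(p));
    }

    private static string ComputeHash(string path)
    {
        return "md5-of-" + path;   // stand-in for the real hashing
    }
}
```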
Oh, and TaskCreationOptions.LongRunning literally creates a new thread for every task, which is a fairly expensive operation. The Windows-internal thread pool is intelligent in newer Windows versions and adapts dynamically: it will "see" that you are doing lots of IO work and will spawn new threads as needed and practical.
Greetings
I have not fully followed the logic of this code but are you aware that task continuations and calls to Wait/Result can be inlined onto the current thread? This can cause reentrancy.
This is very dangerous and has burned many.
Also, I don't quite see why you are starting the task delayed; this is a code smell. And why are you wrapping the task creation in a try? That can never throw.
This is clearly a partial answer. But the code looks very tangled to me. If it's this hard to audit, you probably should have written it differently in the first place.
I have a problem here. Is there a way to programmatically find out the moment a certain thread exits? I mean, even the VS debugger gives you that info in its Output window. I know I could simply make my thread raise an event, but this of course leads to a problem when you spawn more than one thread: each of them will raise that event, unless you use some toggle variable to make sure that only the first thread to reach that point in the code raises it.
Here is the beginning of a method that is executed in multiple threads, with the cancellation, pause and exit logic shown. The problem spots are the calls to the event handlers (DownloadComplete, DownloadPaused and DownloadCanceled). As you can see, some of them are executed only once, by the first thread that reaches that point and toggles the related bool variables so other threads won't raise the event again. Another big problem is the thread-exit logic: when a thread doesn't find an item to download, it returns after raising the download-complete event, which of course gets fired by every thread. Both of these approaches are incorrect, and I currently have no idea how to implement my intended behavior. All I need is a clue how to catch the moment when the last of all the spawned threads ends, so I can raise all of the events mentioned above just once.
Please note that I have no access to the types from the System.Threading.Tasks namespace, as my project targets .NET 2.0. This is my very first experience programming something more complex than a class assignment, so I realize this code is most likely all sorts of terrible.
Sorry for my English.
private void PerformDownload()
{
    while (true)
    {
        if (askedToCancel)
        {
            lock (lockObj)
            {
                if (!cancellationPerformed)
                {
                    cancellationPerformed = true;
                    foreach (DownloadItem di in itemsToProcess)
                        if (di.Status == DownloadItemStatus.Prepared)
                            itemsToProcess.UpdateItemStatus(di.URL, DownloadItemStatus.Canceled);
                    DownloadCanceled();
                }
            }
            return;
        }
        if (askedToPause)
        {
            lock (lockObj)
            {
                if (!pausingPerformed)
                {
                    pausingPerformed = true;
                    foreach (DownloadItem di in itemsToProcess)
                        if (di.Status == DownloadItemStatus.Prepared)
                            itemsToProcess.UpdateItemStatus(di.URL, DownloadItemStatus.Paused);
                    DownloadPaused();
                }
            }
            waitHandle.WaitOne();
        }
        DownloadItem currentItem = null;
        lock (lockObj)
        {
            foreach (DownloadItem di in itemsToProcess)
                if (di.Status == DownloadItemStatus.Prepared)
                {
                    currentItem = di;
                    itemsToProcess.UpdateItemStatus(currentItem.URL, DownloadItemStatus.Downloading);
                    break;
                }
        }
        if (currentItem == null)
        {
            DownloadComplete();
            return;
        }
You can use Interlocked.Increment() and Interlocked.Decrement() on a counter, incrementing at the thread's entry point, decrementing when the thread exits. Then if the counter is non-zero, at least one instance of that thread is running.
For example:
private int _threadCounter;
private void ThreadEntryPoint()
{
try
{
Interlocked.Increment(ref _threadCounter);
// Do thread stuff here
}
finally
{
Interlocked.Decrement(ref _threadCounter);
}
}
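Building on that counter, here is a sketch of the "raise the event once, from the last thread out" requirement (class and member names are mine; `CompletionEvents` stands in for calling DownloadComplete(); compatible with .NET 2.0, no Tasks needed):

```csharp
using System.Collections.Generic;
using System.Threading;

// Initialize the counter to the number of threads you spawn; the thread
// that decrements it to zero is, by definition, the last one out, so it
// alone raises the completion event.
class ThreadGroup
{
    private int activeThreads;
    private readonly List<Thread> threads = new List<Thread>();
    public int CompletionEvents;   // stand-in for DownloadComplete()

    public void Start(int count, ThreadStart body)
    {
        activeThreads = count;     // set before any thread can finish
        for (int i = 0; i < count; i++)
        {
            Thread t = new Thread(delegate()
            {
                try { body(); }
                finally
                {
                    // Exactly one thread sees the counter reach zero.
                    if (Interlocked.Decrement(ref activeThreads) == 0)
                        Interlocked.Increment(ref CompletionEvents);
                }
            });
            threads.Add(t);
            t.Start();
        }
    }

    public void JoinAll()
    {
        foreach (Thread t in threads) t.Join();
    }
}
```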
That said, it is IMHO not likely to be the best design for it to matter whether the thread is running or not. That is, whenever I think of how that information might be used, it seems to me there's a better way to address the scenario.
Threads exist for a reason, beyond simply being present. So what your code really ought to care about is whether that reason has been addressed, not whether any thread is running. Without a complete code example, I can't really comment on a specific scenario. But I suspect that tracking the actual thread existence is less useful and will be harder to maintain than a more goal-oriented approach.
The code below runs OK, but I wonder if it is really correct?
if (openFileDialog.ShowDialog() == DialogResult.OK)
{
    Parallel.ForEach(openFileDialog.FileNames, currentFile =>
    {
        try
        {
            StreamReader FileReader = new StreamReader(currentFile);
            do
            {
                URLtextBox.Invoke(new MethodInvoker(delegate
                {
                    URLtextBox.Text += SelectURLfromString(FileReader.ReadLine());
                }));
            }
            while (FileReader.Peek() != -1);
            FileReader.Close();
        }
        catch (System.Security.SecurityException ex)
        {
            ...
        }
        catch (Exception ex)
        {
            ...
        }
    });
}
Otherwise I get either "Cross-thread operation not valid. Control 'URLtextBox' accessed from another thread" or a stuck application.
The code is correct - you need to use Invoke to update controls from outside the GUI thread. However, you are also executing the SelectURLfromString(FileReader.ReadLine()) call on the GUI thread; you should replace that with
string url = SelectURLfromString(FileReader.ReadLine());
URLtextBox.Invoke(new MethodInvoker(delegate
{
    URLtextBox.Text += url;
}));
to keep the work done in the GUI thread to a minimum.
You cannot update UI controls from worker threads safely unless you marshal onto the UI thread.
Take a look at TaskScheduler.FromCurrentSynchronizationContext
How to: Schedule Work on a Specified Synchronization Context
The code is correct, you need the Invoke call so that the control is updated in the GUI thread.
However, there are some other things in the code that don't really make sense:
You are doing parallel operations over a resource that is not parallel. Your threads will be fighting for attention from the disk, which is clearly the bottleneck due to its relatively low speed.
You will read lines from several files and dump them intermixed into a textbox. That might be all right in this specific situation, but in general it gives an unpredictable result.
You are using the += operation to concatenate strings, a method that is notorious for its poor scalability. It might not be a big problem in this case, though, as the disk bottleneck is probably a lot worse.
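One way to sketch a fix for the last two points (the `UrlCollector` name is mine, and `extract` stands in for the asker's SelectURLfromString): build the combined text on the worker thread with a StringBuilder, then do a single UI update per file:

```csharp
using System;
using System.Text;

static class UrlCollector
{
    // Concatenate the extracted URL of every line in O(n) total,
    // instead of repeated string += (which re-copies the whole string).
    public static string CollectUrls(string[] lines, Func<string, string> extract)
    {
        var sb = new StringBuilder();
        foreach (string line in lines)
            sb.Append(extract(line));
        return sb.ToString();
    }
}

// On the worker thread, per file:
//
// string urls = UrlCollector.CollectUrls(File.ReadAllLines(currentFile),
//                                        SelectURLfromString);
// URLtextBox.Invoke(new MethodInvoker(delegate
// {
//     URLtextBox.AppendText(urls);   // one UI update per file
// }));
```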
The Invoke is necessary because controls are bound to the thread that created their associated User32 window (often called an HWND). That said, you could probably optimize a little by reading and processing the contents of the file outside of the Invoke's delegate.
I'm trying to work with threading, and it seems suspiciously difficult to me (I'm probably doing it wrong).
I want to load a file inside a BackgroundWorker and, while that happens, "send" each new line to a separate thread (not the bgWorker). I'm using a BlockingCollection and Add() each line; then I want to Take() the lines and process them in another thread.
Now, everything is straightforward with the BgWorker; but why is it impossible (isn't it?) to just declare a new thread in Form1.cs and have it perform like the BgWorker? In other words, why must you create a separate WorkerClass (http://msdn.microsoft.com/en-us/library/7a2f3ay4(VS.80).aspx)?
I'm asking this because you can access your BlockingCollection fine from within the BackgroundWorker, but you can't do it from a separate WorkerClass (since it's a plain-vanilla separate class). (So what's the point of the BlockingCollection if you can't use it for what it's meant for?)
Also, BgWorkers have a ReportProgress(...) event/method. As far as I know, if you use that msdn example, you don't have squat in your Thread.
What am I missing here? Please help.
PS: Before you jump and tell me that It's not in any way more efficient to send lines to another thread, know that I'm doing this as a learning exercise. Trying to figure out how Threads work in C# and how you sync/communicate between/with them (and/or bgWorkers).
Answering specifically why working with threads is more difficult than working with a background worker....
The BackgroundWorker is actually a mechanism for creating another thread, wrapped up in an easier-to-use package. Working with threads directly is harder because it's closer to the real thing.
For a similar comparison, using System.Net.Mail to send an email is just a simplified way of creating socket connections, etc... Under the hood, the System.Net.Mail classes do the detailed work. Similarly, under the hood, the BackgroundWorker does the detailed work of dealing with the threads.
As a matter of fact, the MSDN documentation for the BackgroundWorker class (updated September 2010) starts out like this:
"Executes an operation on a separate thread."
http://msdn.microsoft.com/en-us/library/system.componentmodel.backgroundworker.aspx
So if the backgroundworker class is supposed to make threading easier, why would people want to work with threads directly? Because of the issue you're having. Sometimes the "friendly wrapper" leads to a loss of fine control.
Edit - added
What you're asking about in the comments is thread synchronization. This article covers it pretty well.
http://msdn.microsoft.com/en-us/magazine/cc164037.aspx
and this article answers "communicating between threads" explicitly.
http://www.devnewsgroups.net/group/microsoft.public.dotnet.framework/topic63233.aspx
To answer your question in the title, yes "normal" threads can act like BackgroundWorker threads. You just have to create more of the wiring code yourself.
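As a sketch of that wiring (all names are illustrative; the `report` callback plays the role of ReportProgress, and in a real form you would marshal it to the UI with Control.BeginInvoke):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// A plain Thread playing the BackgroundWorker role: the producer Add()s
// lines, the consumer Take()s them via GetConsumingEnumerable, and
// reports results through a callback.
class LineProcessor
{
    private readonly BlockingCollection<string> lines =
        new BlockingCollection<string>();
    private readonly Thread consumer;

    public LineProcessor(Action<string> report)
    {
        consumer = new Thread(delegate()
        {
            // Blocks until lines arrive; exits after CompleteAdding().
            foreach (string line in lines.GetConsumingEnumerable())
                report(line.ToUpperInvariant());   // stand-in "processing"
        });
        consumer.IsBackground = true;
        consumer.Start();
    }

    public void Add(string line) { lines.Add(line); }

    public void Finish()
    {
        lines.CompleteAdding();
        consumer.Join();
    }
}
```

The form can own both the BlockingCollection and the thread body (as a lambda or a private method), so no separate WorkerClass is required.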
I wrote a simple application for scanning my music collection using a manually created thread. The main body of the thread is a method that loops over all of the folders under a specified root and fires an event each time it encounters a folder that contains some mp3 files.
I subscribe to this event in the main form of my application and update a DataGridView with the new information.
So the thread is kicked off by the following code:
this.libraryThread = new Thread(new ThreadStart(this.library.Build)) { IsBackground = true };
// Disable all the buttons except for Stop which is enabled
this.EnableButtons(false);
// Find all the albums
this.libraryThread.Start();
The method supplied to ThreadStart does some housekeeping and then calls the method that does the work:
private void FindAlbums(string root)
{
    // Find all the albums
    string[] folders = Directory.GetDirectories(root);
    foreach (string folder in folders)
    {
        if (this.Stop)
        {
            break;
        }
        string[] files = Directory.GetFiles(folder, "*.mp3");
        if (files.Length > 0)
        {
            // Add to library - use first file as being representative of the whole album
            var info = new AlbumInfo(files[0]);
            this.musicLibrary.Add(info);
            if (this.Library_AlbumAdded != null)
            {
                this.Library_AlbumAdded(this, new AlbumInfoEventArgs(info));
            }
        }
        this.FindAlbums(folder);
    }
}
When this method finishes a final LibraryFinished event is fired.
I subscribe to these events in the main form:
this.library.Library_AlbumAdded += this.Library_AlbumAdded;
this.library.Library_Finished += this.Library_Finished;
and in these methods add the new album to the grid:
private void Library_AlbumAdded(object sender, AlbumInfoEventArgs e)
{
    this.dataGridView.InvokeIfRequired(() => this.AddToGrid(e.AlbumInfo));
}
and finish off (which reenables buttons etc.):
private void Library_Finished(object sender, EventArgs e)
{
    this.dataGridView.InvokeIfRequired(() => this.FinalUpdate());
}
As you can see, this is a lot of work that would be a whole lot simpler if I used a BackgroundWorker.
There is a sequence of FORMs (some UI) that should be downloaded using a service.
Currently, this download runs in a single BackgroundWorker thread.
Now, since the performance is slow, we decided to split the FORMs into 2 categories and download them in parallel using another BackgroundWorker on top of the existing thread.
In this scenario, either BackgroundWorker may need to wait for the other to complete.
So, how do I implement that?
I tried AutoResetEvent, but I could not get it to work.
Any help is appreciated.
I don't think that the scenario is really that one BackgroundWorker should wait for another. What you really want is to fire some UI event after (and only after) both of them complete. It's a subtle but important difference; the second version is a lot easier to code.
public class Form1 : Form
{
    private object download1Result;
    private object download2Result;

    private void BeginDownload()
    {
        // Next two lines are only necessary if this is called multiple times
        download1Result = null;
        download2Result = null;
        bwDownload1.RunWorkerAsync();
        bwDownload2.RunWorkerAsync();
    }

    private void bwDownload1_RunWorkerCompleted(object sender,
        RunWorkerCompletedEventArgs e)
    {
        download1Result = e.Result;
        if (download2Result != null)
            DisplayResults();
    }

    private void bwDownload2_RunWorkerCompleted(object sender,
        RunWorkerCompletedEventArgs e)
    {
        download2Result = e.Result;
        if (download1Result != null)
            DisplayResults();
    }

    private void DisplayResults()
    {
        // Do something with download1Result and download2Result
    }
}
Note that those object references should be strongly typed; I just used object because I don't know what you're downloading.
This is really all you need; the RunWorkerCompleted event runs in the foreground thread so you actually don't need to worry about synchronization or race conditions in there. No need for lock statements, AutoResetEvent, etc. Just use two member variables to hold the results, or two boolean flags if the result of either can actually be null.
You should be able to use two AutoResetEvents and the WaitHandle.WaitAll method to wait for both to complete. Call Set on the respective AutoResetEvent object in each RunWorkerCompleted event handler.
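A sketch of that arrangement (the handler wiring is indicated in comments; the waiting happens on a thread-pool thread, since blocking the UI thread would freeze the form):

```csharp
using System.Threading;

// Two events, one per worker; WaitAll blocks until both are signaled.
AutoResetEvent done1 = new AutoResetEvent(false);
AutoResetEvent done2 = new AutoResetEvent(false);

// In bwDownload1's RunWorkerCompleted handler: done1.Set();
// In bwDownload2's RunWorkerCompleted handler: done2.Set();

ThreadPool.QueueUserWorkItem(delegate(object state)
{
    WaitHandle.WaitAll(new WaitHandle[] { done1, done2 });
    // Both downloads finished; marshal back to the UI from here.
});
```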
Jeffrey Richter is THE guru when it comes to multithreading, and he's written an amazing library called the Power Threading Library, which makes tasks like downloading n files asynchronously and continuing after they have all completed (or one, or some) really simple.
Take a little time out to watch the video and learn about it - you won't regret it. The Power Threading Library (which is free and also has Silverlight and Compact Framework versions) makes your code easier to read, which is a big advantage when doing any async work.
Good luck,
Mark
int completedCount = 0;

void threadProc1() { // your thread1 proc
    // do something
    ....
    Interlocked.Increment(ref completedCount);
    while (completedCount < 2) Thread.Sleep(10);
    // now both threads are done
}

void threadProc2() { // your thread2 proc
    // do something
    ....
    Interlocked.Increment(ref completedCount);
    while (completedCount < 2) Thread.Sleep(10);
    // now both threads are done
}
Just use 2 BackgroundWorker objects, and have each one alert the UI when it completes. That way you can display a spinner, progress bar, whatever on the UI and update it as download results come back from the threads. You will also avoid any risks of thread deadlocking, etc.
By the way, just so we are all clear, you should NEVER call a blocking function such as WaitAll from the UI thread. It will cause the UI to completely lock up, which will make your users wonder WTF is going on :)