I need to handle user requests one by one (similar to a queue job).
This is what I have:
Thread checkQueue;
bool IsComplete = true;

protected void Start()
{
    checkQueue = new Thread(CheckQueueState);
    checkQueue.Start();
}

private void CheckQueueState()
{
    while (true)
    {
        if (IsComplete) // has the previous job finished?
        {
            ContinueDoSomething();
            checkQueue.Abort();
            break;
        }
        System.Threading.Thread.Sleep(1000);
    }
}

protected void ContinueDoSomething()
{
    IsComplete = false;
    ...
    ...
    IsComplete = true; //when done, set it back to true
}
Every time there is a new request from a user, the system calls the Start() method and checks whether the previous job is complete; if it is, it proceeds to the next job.
But I am not sure whether this is the correct way to do it.
Is there any improvement, or a better way to do this?
I like usr's suggestion regarding using TPL Dataflow. If you have the ability to add external dependencies to your project (TPL Dataflow is not distributed as part of the .NET framework), then it provides a clean solution to your problem.
If, however, you're stuck with what the framework has to offer, you should have a look at BlockingCollection<T>, which works nicely with the producer-consumer pattern that you're trying to implement.
I've thrown together a quick .NET 4.0 example to illustrate how it can be used in your scenario. It is not very lean because it has a lot of calls to Console.WriteLine(). However, if you take out all the clutter it's extremely simple.
At the center of it is a BlockingCollection<Action>, which gets Action delegates added to it from any thread, and a thread specifically dedicated to dequeuing and executing those Actions sequentially in the exact order in which they were added.
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

namespace SimpleProducerConsumer
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Main thread id is {0}.", Thread.CurrentThread.ManagedThreadId);

            using (var blockingCollection = new BlockingCollection<Action>())
            {
                // Start our processing loop.
                var actionLoop = new Thread(() =>
                {
                    Console.WriteLine(
                        "Starting action loop on thread {0} (dedicated action loop thread).",
                        Thread.CurrentThread.ManagedThreadId);

                    // Dequeue actions as they become available.
                    foreach (var action in blockingCollection.GetConsumingEnumerable())
                    {
                        // Invoke the action synchronously
                        // on the "actionLoop" thread.
                        action();
                    }

                    Console.WriteLine("Action loop terminating.");
                });

                actionLoop.Start();

                // Enqueue some work.
                Console.WriteLine("Enqueueing action 1 from thread {0} (main thread).", Thread.CurrentThread.ManagedThreadId);
                blockingCollection.Add(() => SimulateWork(1));

                Console.WriteLine("Enqueueing action 2 from thread {0} (main thread).", Thread.CurrentThread.ManagedThreadId);
                blockingCollection.Add(() => SimulateWork(2));

                // Let's enqueue it from another thread just for fun.
                var enqueueTask = Task.Factory.StartNew(() =>
                {
                    Console.WriteLine(
                        "Enqueueing action 3 from thread {0} (task executing on a thread pool thread).",
                        Thread.CurrentThread.ManagedThreadId);
                    blockingCollection.Add(() => SimulateWork(3));
                });

                // We have to wait for the task to complete
                // because otherwise we'll end up calling
                // CompleteAdding before our background task
                // has had the chance to enqueue action #3.
                enqueueTask.Wait();

                // Tell our loop (and, consequently, the "actionLoop" thread)
                // to terminate when it's done processing pending actions.
                blockingCollection.CompleteAdding();

                Console.WriteLine("Done enqueueing work. Waiting for the loop to complete.");

                // Block until the "actionLoop" thread terminates.
                actionLoop.Join();

                Console.WriteLine("Done. Press Enter to quit.");
                Console.ReadLine();
            }
        }

        private static void SimulateWork(int actionNo)
        {
            Thread.Sleep(500);
            Console.WriteLine("Finished processing action {0} on thread {1} (dedicated action loop thread).", actionNo, Thread.CurrentThread.ManagedThreadId);
        }
    }
}
And the output is:
0.016s: Main thread id is 10.
0.025s: Enqueueing action 1 from thread 10 (main thread).
0.026s: Enqueueing action 2 from thread 10 (main thread).
0.027s: Starting action loop on thread 11 (dedicated action loop thread).
0.028s: Enqueueing action 3 from thread 6 (task executing on a thread pool thread).
0.028s: Done enqueueing work. Waiting for the loop to complete.
0.527s: Finished processing action 1 on thread 11 (dedicated action loop thread).
1.028s: Finished processing action 2 on thread 11 (dedicated action loop thread).
1.529s: Finished processing action 3 on thread 11 (dedicated action loop thread).
1.530s: Action loop terminating.
1.532s: Done. Press Enter to quit.
Use an ActionBlock<T> from the TPL Dataflow library. Set its MaxDegreeOfParallelism to 1 and you're done.
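For illustration, a minimal sketch of that setup (it assumes the System.Threading.Tasks.Dataflow NuGet package is referenced; WorkItem and ProcessWorkItem are placeholders for your own request type and handler, not part of the original answer):

using System.Threading.Tasks.Dataflow;

// WorkItem and ProcessWorkItem are placeholders for your own request type and handler.
var worker = new ActionBlock<WorkItem>(
    item => ProcessWorkItem(item),
    new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 1 });

worker.Post(new WorkItem());   // enqueue from any thread; items run one at a time, in order

worker.Complete();             // on shutdown: stop accepting new items...
await worker.Completion;       // ...and wait for the queued ones to drain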
Note that ASP.NET worker processes can recycle at any time (e.g. due to a scheduled recycle, memory limit, server reboot or deployment), so queued work might suddenly be lost without notice. I recommend looking into an external queueing solution such as MSMQ (or others) for reliable queues.
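For reference, a rough sketch of what going through MSMQ could look like (requires a reference to System.Messaging; the queue path and the JobRequest type below are illustrative assumptions, not part of the original answer):

using System.Messaging;

// JobRequest is a placeholder for your own serializable message type.
public class JobRequest { public int JobId { get; set; } }

public static class ReliableQueueSketch
{
    // The queue path is an assumption; create the queue once with MessageQueue.Create.
    private const string QueuePath = @".\private$\jobQueue";

    public static void Enqueue(JobRequest request)
    {
        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(JobRequest) });
            queue.Send(request); // held by the MSMQ service, not the worker process, so it survives an app-pool recycle
        }
    }

    public static JobRequest Dequeue()
    {
        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(JobRequest) });
            var message = queue.Receive(); // blocks until a message arrives
            return (JobRequest)message.Body;
        }
    }
}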
Take a look at the Reactive Extensions from Microsoft. This library contains a set of schedulers that follow the semantics you require.
The best fit for your needs is the EventLoopScheduler. It queues up actions and performs them one after another. If it completes an action and there are more items in the queue, it processes those actions sequentially on the same thread until the queue is empty, and then it disposes of the thread; when a new action is queued, it creates a new thread. This makes it very efficient.
The code is super simple and looks like this:
var scheduler = new System.Reactive.Concurrency.EventLoopScheduler();
scheduler.Schedule(() => { /* action here */ });
If you need to have every queued action performed on a new thread then use it like this:
var scheduler = new System.Reactive.Concurrency.NewThreadScheduler();
scheduler.Schedule(() => { /* action here */ });
Very simple.
I am working on some legacy code which repeatedly calls a long running task in a new thread:
var jobList = spGetSomeJobIds.ToList();

jobList.ForEach((jobId) =>
{
    var myTask = Task.Factory.StartNew(() => CallExpensiveStoredProc(jobId),
                                       TaskCreationOptions.LongRunning);
    myTask.Wait();
});
As the calling thread immediately calls Wait and blocks until the task completes, I can't see any point in the Task.Factory.StartNew code. Am I missing something? Is there something about TaskCreationOptions.LongRunning that might add value?
As MSDN says:
Waits for the Task to complete execution.
in addition, there is the following statement:
Wait blocks the calling thread until the task completes.
So myTask.Wait(); looks redundant, as the method CallExpensiveStoredProc returns nothing.
As good practice, it would be better to use the async and await operators when you deal with asynchronous operations such as database calls.
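A rough sketch of that idea (the connection string, stored procedure name and parameter below are placeholders, and CallExpensiveStoredProcAsync is a hypothetical async counterpart of the original method):

using System.Data;
using System.Data.SqlClient;
using System.Threading.Tasks;

// Placeholder names: adjust the connection string, procedure name and parameter to your schema.
private static async Task CallExpensiveStoredProcAsync(int jobId)
{
    using (var connection = new SqlConnection("<your connection string>"))
    using (var command = new SqlCommand("usp_ExpensiveStoredProc", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.AddWithValue("@JobId", jobId);

        await connection.OpenAsync();         // no thread is blocked while connecting
        await command.ExecuteNonQueryAsync(); // or while the procedure runs on the server
    }
}

// Calling code (inside an async method) still runs the jobs one after another,
// but no thread sits blocked while the database does the work:
// foreach (var jobId in jobList)
//     await CallExpensiveStoredProcAsync(jobId);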
UPDATE:
What we have is:
We pass LongRunning, so a new thread is created (this can be seen in the reference source).
Then we call myTask.Wait();. This method simply waits until myTask has finished its work, so all jobList iterations are executed sequentially, not in parallel. So now we need to decide how our jobs should be executed: sequentially (case A) or in parallel (case B).
Case A: Sequential execution of our jobs
If your jobs should be executed sequentially, then a few questions arise:
why use multithreading at all if our code executes sequentially? Our code should be clean and simple, so we can avoid multithreading in this case.
creating a new thread adds overhead. The thread pool tries to determine the optimal number of threads and creates at least one thread per core; when all of the thread pool threads are busy, a queued task might wait (in extreme cases indefinitely) before it actually starts executing.
To sum up, there is no gain in this case from creating a new thread, especially one created with the LongRunning option.
Case B: Parallel execution of our jobs
If our goal is to run all the jobs in parallel, then myTask.Wait(); should be eliminated, because it forces the code to execute sequentially.
Code to test:
var jobs = new List<int>() { 1, 2, 3 };

jobs.ForEach(j =>
{
    var myTask = Task.Factory.StartNew(() =>
    {
        Console.WriteLine($"This is a current number of executing task: { j }");
        Thread.Sleep(5000); // Imitation of long-running operation
        Console.WriteLine($"Executed: { j }");
    }, TaskCreationOptions.LongRunning);

    myTask.Wait();
});

Console.WriteLine($"All jobs are executed");
To conclude for case B: there is still no gain in creating a new thread, especially one created with the LongRunning option, because creating a thread is expensive both in the time it takes and in memory consumption.
Can someone please explain why this creates a deadlock, and how to solve it?
txtLog.AppendText("We are starting the thread" + Environment.NewLine);

var th = new Thread(() =>
{
    Application.Current.Dispatcher.Invoke(new Action(() => // causes deadlock
    {
        txtLog.AppendText("We are inside the thread" + Environment.NewLine); // never gets printed
        // compute some result...
    }));
});

th.Start();
th.Join(); // causes deadlock

// ... retrieve the result computed by the thread
Explanation: I need my secondary thread to compute a result and return it to the main thread. But the secondary thread must also write debug information to the log; the log is in a WPF window, so the thread needs to be able to use Dispatcher.Invoke(). But the moment I call Dispatcher.Invoke, a deadlock occurs, because the main thread is waiting for the secondary thread to finish, since it needs the result.
I need a pattern to solve this. Please help me rewrite this code. (Please write actual code, do not just say "use BeginInvoke"). Thank you.
Also, theoretically, I don't understand one thing: a deadlock can only happen when two threads access two shared resources in different orders. But what are the actual resources in this case? One is the GUI. But what is the other? I can't see it.
And the deadlock is usually solved by imposing the rule that the threads can only lock the resources in a precise order. I've done this already elsewhere. But how can I impose this rule in this case, since I don't understand what the actual resources are?
Short answer: use BeginInvoke() instead of Invoke().
Long answer: change your approach; see the alternative below.
Currently your Thread.Join() causes the main thread to block while it waits for the secondary thread to terminate, but the secondary thread is waiting for the main thread to execute your AppendText action, so your app is deadlocked.
If you change to BeginInvoke(), your secondary thread will not wait until the main thread executes your action. Instead, it queues the invocation and continues. Your main thread will not block on Join(), because this time the secondary thread finishes successfully. Then, once the main thread leaves the current method, it is free to process the queued invocation of AppendText.
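For example, the question's code would look roughly like this with BeginInvoke (a sketch only; the result still has to be handed back some other way, e.g. through a shared field, or use the alternative below):

var th = new Thread(() =>
{
    // Queue the logging call on the UI thread and keep going; no waiting, no deadlock.
    Application.Current.Dispatcher.BeginInvoke(new Action(() =>
    {
        txtLog.AppendText("We are inside the thread" + Environment.NewLine);
    }));

    // compute the result here, outside the dispatcher call
});

th.Start();
th.Join(); // no deadlock: the secondary thread never waits on the main thread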
Alternative:
void DoSomethingCool()
{
    var factory = new TaskFactory(TaskScheduler.FromCurrentSynchronizationContext());

    factory.StartNew(async () =>              // the lambda must be async to use await
    {
        var result = await IntensiveComputing();
        txtLog.AppendText("Result of the computing: " + result);
    });
}

async Task<double> IntensiveComputing()
{
    await Task.Delay(5000);                   // simulate long work without blocking the UI thread
    return 20;
}
This deadlock happens because the UI thread is waiting for the background thread to finish, and the background thread is waiting for the UI thread to become free.
The best solution is to use async:
var result = await Task.Run(async () => {   // the lambda must be async to await inside it
    ...
    await Dispatcher.InvokeAsync(() => ...);
    ...
    return ...;
});
The Dispatcher is trying to execute work in the UI message loop, but that same loop is currently stuck on th.Join, hence they are waiting on each other and that causes the deadlock.
If you start a Thread and immediately Join on it, you definitely have a code smell and should re-think what you're doing.
If you want things to be done without blocking the UI, you can simply await InvokeAsync.
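A minimal sketch of that, assuming it runs inside an async event handler (such as an async void button-click handler) and that ComputeResult is a placeholder for the heavy work:

// ComputeResult is a placeholder for the actual computation.
var result = await Task.Run(() => ComputeResult());        // heavy work off the UI thread
await Dispatcher.InvokeAsync(() =>
    txtLog.AppendText("Result: " + result + Environment.NewLine));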
I had a similar problem which I finally solved in this way:
do
{
    // Force the dispatcher to run the queued operations
    Dispatcher.CurrentDispatcher.Invoke(delegate { }, DispatcherPriority.ContextIdle);
} while (!otherthread.Join(1));
This produces a Join that doesn't block because of GUI-operations on the other thread.
The main trick here is the blocking Invoke with an empty delegate (no-operation), but with a priority setting that is less than all other items in the queue. That forces the dispatcher to work through the entire queue. (The default priority is DispatcherPriority.Normal = 9, so my DispatcherPriority.ContextIdle = 3 is well under.)
The Join() call uses a 1 ms timeout and re-empties the dispatcher queue as long as the join isn't successful.
I really liked #user5770690's answer. I created an extension method that guarantees continued "pumping"/processing in the dispatcher and avoids deadlocks of this kind. I changed it slightly, but it works very well. I hope it helps someone else.
public static Task PumpInvokeAsync(this Dispatcher dispatcher, Delegate action, params object[] args)
{
    var completer = new TaskCompletionSource<bool>();

    // exit if we don't have a valid dispatcher
    if (dispatcher == null || dispatcher.HasShutdownStarted || dispatcher.HasShutdownFinished)
    {
        completer.TrySetResult(true);
        return completer.Task;
    }

    var threadFinished = new ManualResetEvent(false);
    ThreadPool.QueueUserWorkItem(async (o) =>
    {
        await dispatcher?.InvokeAsync(() =>
        {
            action.DynamicInvoke(o as object[]);
        });
        threadFinished.Set();
        completer.TrySetResult(true);
    }, args);

    // The pumping of queued operations begins here.
    do
    {
        // Error condition checking
        if (dispatcher == null || dispatcher.HasShutdownStarted || dispatcher.HasShutdownFinished)
            break;

        try
        {
            // Force the processing of the queue by pumping a new message at lower priority
            dispatcher.Invoke(() => { }, DispatcherPriority.ContextIdle);
        }
        catch
        {
            break;
        }
    }
    while (threadFinished.WaitOne(1) == false);

    threadFinished.Dispose();
    threadFinished = null;

    return completer.Task;
}
I have an application which should finish within 30 minutes. The components of the application are run using the thread pool.
So
//queue first all the components
//when the Collect method for each of the components finishes it will set the event
ManualResetEvent serverEvent = new ManualResetEvent(false);
sectionsCompleted.Add(serverEvent);
ThreadPool.QueueUserWorkItem(serverInfo.Collect,"ServerInfo ");
ManualResetEvent cpuEvent= new ManualResetEvent(false);
sectionsCompleted.Add(cpuEvent);
ThreadPool.QueueUserWorkItem(cpuInfo.Collect,"CPUInfo ");
//then wait for all the components to finish
WaitHandle.WaitAll(sectionsCompleted.ToArray());
So the logic is to run all the components on the ThreadPool and use the ManualResetEvent class to signal the main thread when each component has finished.
Now I want to use an ElapsedEventHandler to make sure that the code finishes gracefully within some time frame (say 30 minutes). So if there are still threads running after 30 minutes, I want to abort them.
So my question is: will the ElapsedEventHandler delegate be called at all, or will the main thread wait at WaitHandle.WaitAll(sectionsCompleted.ToArray())?
Is there any other way I can stop all the threads in a thread pool after some time interval?
If you set up the timer and its event handler, and start the timer before the above code (or at least before the WaitAll), then:
your timer's Elapsed event will fire,
your main thread will wait at the WaitAll,
but you could just as easily do something like:
if (!WaitHandle.WaitAll(sectionsCompleted.ToArray(), TimeSpan.FromMinutes(30)))
{
    // did not finish in the 30 minute timespan, so kill the threads
}
If you do the above, you won't have to worry about synchronising the timer's event handler (which might try to kill a thread just as it completes) with the Main method waiting on the WaitHandles (which might therefore continue while the event handler thinks the thread is being killed).
If you are able to (depending on your .NET version), Tasks would be very well suited to this, as you could use a CancellationToken to end each task gracefully if it has not completed. See MSDN: Task Cancellation for something like the code below. If you can't use Task, you can wire up the same solution yourself; one possible technique is to use more WaitHandles (also shown below).
This approach also lets you move the Wait+Cancel code onto a separate thread, so you can release your UI or main code thread as soon as the worker threads are created. This has the added advantage that you can also signal the single instance of the Wait+Cancel code from the controlling thread to trigger a premature cancellation.
// use the same CancellationTokenSource to create all tasks
var tokenSource2 = new CancellationTokenSource();

// for each task, use the following structure
CancellationToken ct = tokenSource2.Token;

var task = Task.Factory.StartNew(() =>
{
    // Were we already canceled?
    ct.ThrowIfCancellationRequested();

    bool moreToDo = true;

    // make sure any loops and other methods check the ct.IsCancellationRequested regularly
    while (moreToDo)
    {
        if (ct.IsCancellationRequested)
        {
            // Clean up any resources, transactions etc. here, then...
            ct.ThrowIfCancellationRequested();
        }
    }
}, tokenSource2.Token); // Pass same token to StartNew.

// add each task to the tasks list
tasks.Add(task);

// once all tasks created, wait on them and cancel if they overrun
// by passing the token, another thread could even cancel the whole operation ahead of time
if (!Task.WaitAll(tasks.ToArray(), (int)TimeSpan.FromMinutes(30).TotalMilliseconds,
    tokenSource2.Token))
{
    // did not finish in the 30 minute timespan, so kill the threads
    tokenSource2.Cancel();

    try
    {
        // Now wait for the tasks to cancel
        Task.WaitAll(tasks.ToArray());
    }
    catch (AggregateException ae)
    {
        // handle any unexpected task exceptions here
    }
}
Or in .NET 2.0 without Tasks:
// in Main thread ...
ManualResetEvent serverEventCancelled = new ManualResetEvent(false);
cancellationMres.Add(serverEventCancelled);

// Inside the thread, do this regularly - zero timeout returns instantly ...
if (serverEventCancelled.WaitOne(0))
{
    // do cancellation and ...
    // now set the "completed" waithandle (or do something similar to let Main know we are done)
    serverEvent.Set();
    return;
}

// In Main thread ...
if (!WaitHandle.WaitAll(sectionsCompleted.ToArray(), TimeSpan.FromMinutes(30)))
{
    foreach (var cancellationMre in cancellationMres)
    {
        cancellationMre.Set();
    }
    WaitHandle.WaitAll(sectionsCompleted.ToArray());
}
Will the ElapsedEventHandler delegate be called at all?
Yes.
Will the main thread wait at WaitHandle.WaitAll(sectionsCompleted.ToArray())?
Yes.
But you need to signal the corresponding event from within your worker (e.g. at the end of cpuInfo.Collect).
In .NET 4.5, you can also use CancellationTokenSource(TimeSpan) to cancel the work after a period of time.
By the way: you should call WaitHandle.WaitAll(sectionsCompleted.ToArray()) on a non-UI thread, or it will block your UI.
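A rough sketch of both points (how Collect receives the event here is an assumption; the question's code passes a string as the state object, so the wiring would need to change accordingly):

// Inside each component: signal the ManualResetEvent when Collect finishes.
public void Collect(object state)
{
    var doneSignal = (ManualResetEvent)state; // assumption: the event is passed as the state
    try
    {
        // ... gather the information for this section ...
    }
    finally
    {
        doneSignal.Set(); // let the main thread's WaitAll proceed even if collection threw
    }
}

// .NET 4.5: a token source that cancels itself after 30 minutes.
var timeoutSource = new CancellationTokenSource(TimeSpan.FromMinutes(30));
// Pass timeoutSource.Token into the work items and have them check
// IsCancellationRequested periodically instead of being aborted.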
I created ten threads that each request the server separately. Each thread sends its request to the server successfully, but no thread is getting the response.
If a thread is not in an active state (due to operating system scheduling), who will receive the response?
private static void ThreadFunc()
{
    var response = CallServer();
    Console.WriteLine(response.Message);
}
The following line of code is never hit:
Console.WriteLine(response.Message);
When waiting for a response, your thread will never be active. It is waiting after all.
The OS marks the underlying IO as completed and marks the thread as ready. It will be scheduled some time in the future to process the results of the completed IO.
What does "active" even mean? For the purposes of this discussion, threads can be blocked, ready and running. During a wait or IO, a thread is blocked. When that wait completes, it is marked ready. It starts to run at some point in time later when the OS decides to schedule it.
A foreground/application thread must stay alive for background threads to complete. If you're using ThreadPool threads (which Tasks use by default) and your console application's main thread completes without waiting for these background threads to finish, the process terminates.
Here's an example using Tasks:
static void Main()
{
    List<Task> tasks = new List<Task>();

    for (int i = 0; i < 10; ++i)
    {
        tasks.Add(Task.Factory.StartNew(() => ThreadFunc()));
    }

    // need this to keep process alive
    // will continue once all tasks complete
    Task.WaitAll(tasks.ToArray());
}
I have an application in which the user will choose to do a number of tasks along with the maximum number of threads. Each task should run on a separate thread. Here is what I am looking for:
If the user specifies n less than t, where n is the maximum number of threads and t is the number of tasks, the program should run n threads; after they finish, the program should be notified in some way and repeat the loop until all tasks are done.
My question is:
How do I know that all running threads have finished their jobs, so that I can repeat the loop?
I recommend using the ThreadPool for your task. Its algorithm will generally be more efficient than something you can roll by hand.
Now the fun part is getting notified when all of your threads complete. Unless you have really specific needs which make this solution unsuitable, it should be easy enough to implement with the CountdownEvent class, which is a special kind of wait handle that blocks until it has been signaled n times. Here's an example:
using System;
using System.Linq;
using System.Threading;
using System.Diagnostics;

namespace CSharpSandbox
{
    class Program
    {
        static void SomeTask(int sleepInterval, CountdownEvent countDown)
        {
            try
            {
                // pretend this did something more profound
                Thread.Sleep(sleepInterval);
            }
            finally
            {
                // need to signal in a finally block, otherwise an exception may occur and prevent
                // this from being signaled
                countDown.Signal();
            }
        }

        static CountdownEvent StartTasks(int count)
        {
            Random rnd = new Random();
            CountdownEvent countDown = new CountdownEvent(count);

            for (int i = 0; i < count; i++)
            {
                ThreadPool.QueueUserWorkItem(_ => SomeTask(rnd.Next(100), countDown));
            }

            return countDown;
        }

        public static void Main(string[] args)
        {
            Console.WriteLine("Starting. . .");
            var stopWatch = Stopwatch.StartNew();

            using (CountdownEvent countdownEvent = StartTasks(100))
            {
                countdownEvent.Wait();
                // waits until the countdownEvent is signalled 100 times
            }

            stopWatch.Stop();
            Console.WriteLine("Done! Elapsed time: {0} milliseconds", stopWatch.Elapsed.TotalMilliseconds);
        }
    }
}
You probably want to use a Thread Pool for this. You (can) specify the number of threads in the pool, and give it tasks to do. When a thread in the pool is idle, it automatically looks for another task to carry out.
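A rough sketch of that idea (n, t and DoTask are placeholders for the user's choices and the per-task work; note that SetMaxThreads returns false and changes nothing if you ask for fewer threads than there are processors):

// n = maximum number of threads chosen by the user, t = number of tasks,
// DoTask = placeholder for whatever one task does.
ThreadPool.SetMaxThreads(workerThreads: n, completionPortThreads: n);

for (int i = 0; i < t; i++)
{
    int taskNumber = i; // copy the loop variable before capturing it
    ThreadPool.QueueUserWorkItem(_ => DoTask(taskNumber));
}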
If you want to do this without the thread pool, you can use Thread.Join to wait for the threads to complete. That is:
Thread t1 = new Thread(...);
Thread t2 = new Thread(...);
t1.Start();
t2.Start();
// Wait for threads to finish
t1.Join();
t2.Join();
// At this point, all threads are done.
Of course, if this is an interactive application you'd want that to happen in a thread itself. And if you wanted to get fancy, the waiting thread could do the work of one of the threads (i.e. you'd start thread 1 and then the main thread would do the work of the second thread).
If this is an interactive application, then you probably want to make use of BackgroundWorker (which uses the thread pool). If you attach an event handler to the RunWorkerCompleted event, then you will be notified when the worker has completed its task. If you have multiple workers, use a single RunWorkerCompleted event handler and keep track of which workers have signaled. When they've all signaled, your program can go ahead and do whatever else it needs to do.
The example at http://msdn.microsoft.com/en-us/library/system.componentmodel.backgroundworker.aspx should give you a good start.
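A hedged sketch of the bookkeeping described above (workerCount and DoExpensiveWork are placeholders, not taken from the MSDN example):

// workerCount and DoExpensiveWork are placeholders.
int pendingWorkers = workerCount;

for (int i = 0; i < workerCount; i++)
{
    var worker = new BackgroundWorker();
    worker.DoWork += (s, e) => e.Result = DoExpensiveWork(e.Argument);
    worker.RunWorkerCompleted += (s, e) =>
    {
        // RunWorkerCompleted is raised back on the UI thread, so a plain decrement is safe here.
        pendingWorkers--;
        if (pendingWorkers == 0)
        {
            // every worker has signaled; continue with whatever comes next
        }
    };
    worker.RunWorkerAsync(i); // the argument shows up as e.Argument in DoWork
}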
You could check the IsAlive value for each thread; if all of them are false, then you know that all your threads have ended. Additionally, there is a way to have your delegate return its own status.
http://msdn.microsoft.com/en-us/library/system.threading.thread.isalive(v=VS.90).aspx
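A quick sketch of that check, assuming threads is the List<Thread> you started (requires System.Linq):

// threads is assumed to be the List<Thread> you started earlier.
bool anyStillRunning = threads.Any(t => t.IsAlive);
if (!anyStillRunning)
{
    // every thread has ended; safe to start the next batch of tasks
}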