I am working on a small project where I need to make two asynchronous calls, one right after the other.
My code looks something like this:
AsynchronousCall1();
AsynchronousCall2();
The problem I'm having is that both calls take anywhere from one to two seconds to execute and I never know which one will finish last. What I'm looking for is a way to determine who finishes last. If Call1() finishes last, I do one thing. If Call2() finishes last, I do another thing.
This is a simple example of using a lock to ensure that only one thread can enter a piece of code. But it's a general example, which may or may not be best for your application. Add some details to your question to help us find what you're looking for.
void AsynchronousCall1()
{
    // do some work
    Done("1");
}

void AsynchronousCall2()
{
    // do some work
    Done("2");
}

readonly object _exclusiveAccess = new object();
volatile bool _alreadyDone = false;

void Done(string who)
{
    lock (_exclusiveAccess)
    {
        if (_alreadyDone)
            return;

        _alreadyDone = true;
        Console.WriteLine(who + " was here first");
    }
}
I believe the Thread class has members for checking on a specific thread and determining its status (IsAlive and ThreadState, for example). The other option would be to use a BackgroundWorker instead, as that would let you spell out what happens when each thread is finished by creating separate methods.
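For what it's worth, here is a minimal sketch of the BackgroundWorker route; the worker variables, class name and Console output are just illustrative, not code from the question:

using System;
using System.ComponentModel;

class FinishOrderExample
{
    public void Run()
    {
        BackgroundWorker worker1 = new BackgroundWorker();
        worker1.DoWork += (s, e) => { /* the work behind AsynchronousCall1 */ };
        worker1.RunWorkerCompleted += (s, e) => Console.WriteLine("Call1 finished");

        BackgroundWorker worker2 = new BackgroundWorker();
        worker2.DoWork += (s, e) => { /* the work behind AsynchronousCall2 */ };
        worker2.RunWorkerCompleted += (s, e) => Console.WriteLine("Call2 finished");

        // RunWorkerCompleted fires when each worker's DoWork returns,
        // so whichever line prints second identifies the call that finished last.
        worker1.RunWorkerAsync();
        worker2.RunWorkerAsync();
    }
}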
The "thread-unsafe" option would be to use a class variable, and at the end of each thread if it isn't locked / already have the other thread's value, don't set it. Otherwise set it.
Then when in your main method, after the call to wait for all threads to finish, test the class variable.
That will give you your answer as to which thread finished first.
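A rough sketch of that idea, with illustrative names (Worker1 and Worker2 stand in for the two asynchronous calls):

using System;
using System.Threading;

class FirstToFinish
{
    string _firstToFinish;   // the class variable; deliberately not locked ("thread-unsafe")

    void Worker1() { /* do some work */ if (_firstToFinish == null) _firstToFinish = "1"; }
    void Worker2() { /* do some work */ if (_firstToFinish == null) _firstToFinish = "2"; }

    public void Run()
    {
        Thread t1 = new Thread(Worker1);
        Thread t2 = new Thread(Worker2);
        t1.Start();
        t2.Start();
        t1.Join();   // wait for both threads to finish
        t2.Join();
        Console.WriteLine(_firstToFinish + " finished first, so the other one finished last");
    }
}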
You can do this with two ManualResetEvent objects. The idea is to have the main thread initialize both to unsignaled and then call the asynchronous methods. The main thread then does a WaitAny on both objects. When AsynchronousCall1 completes, it signals one of the objects. When AsynchronousCall2 completes, it signals the other. Here's code:
ManualResetEvent Event1 = new ManualResetEvent(false);
ManualResetEvent Event2 = new ManualResetEvent(false);
void SomeMethod()
{
    WaitHandle[] handles = { Event1, Event2 };
    AsynchronousCall1();
    AsynchronousCall2();
    int index = WaitHandle.WaitAny(handles);
    // if index == 0, then Event1 was signaled.
    // if index == 1, then Event2 was signaled.
}
void AsyncProc1()
{
    // does its thing (this is the work started by AsynchronousCall1) and then
    Event1.Set();
}

void AsyncProc2()
{
    // does its thing (this is the work started by AsynchronousCall2) and then
    Event2.Set();
}
There are a couple of caveats here. If both asynchronous methods finish before the call to WaitAny, it will be impossible to say which completed first. Also, if both methods complete very close to one another (i.e. call 1 completes, then call 2 completes before the main thread's wait is released), it's impossible to say which one finished first.
You may want to check out the Blackboard design pattern: http://chat.carleton.ca/~narthorn/project/patterns/BlackboardPattern-display.html. That pattern sets up a common data store and then lets agents (who know nothing about one another -- in this case, your async calls) report their results in that common location. Your blackboard's 'supervisor' would then be aware of which call finished first and could direct your program accordingly.
Background
I am trying to write an application that does the following:
I make a method call to SomeBlockingMethod.
This method blocks until I call SomeUnblockingMethod from another thread.
When SomeUnblockingMethod is called, the routine inside of SomeBlockingMethod will continue.
Note that the first thing I do will be to call SomeBlockingMethod, and only later on will I call SomeUnblockingMethod. I am thinking about using a Monitor.Wait/Monitor.Pulse mechanism to achieve this. The only thing is, when one calls Monitor.Wait, you cannot block initially unless the object involved has already been locked (or at least not as far as I know)... But I want blocking to be the first thing I do... So this leads me to my question...
Question
Is there some way I can implement Monitor.Wait to initially block until a call to Monitor.Pulse is made?
You can use AutoResetEvent instead.
AutoResetEvent ar = new AutoResetEvent(false); // false sets the initial state to not signaled
Then you can use ar.WaitOne() to wait and ar.Set() to signal waiting processes.
You should use Monitor when you want to protect a resource or you have a critical section. If you want to have a signaling mechanism then AutoResetEvent or ManualResetEvent sounds like a better option.
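A minimal sketch of how that fits your two methods (only a sketch; the real methods presumably do more than this):

using System.Threading;

AutoResetEvent ar = new AutoResetEvent(false);

void SomeBlockingMethod()
{
    ar.WaitOne();   // blocks right away; no lock has to be taken first
    // ... continues here once SomeUnblockingMethod has called Set()
}

void SomeUnblockingMethod()
{
    ar.Set();       // releases the thread waiting in SomeBlockingMethod
}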
I don't know what the problem is, but what you want is already how it works:
object _lock = new object();

void SomeBlockingMethod()
{
    lock (_lock)
        Monitor.Wait(_lock);
    ... // here only after pulse
}

void SomeUnblockingMethod()
{
    lock (_lock)
        Monitor.Pulse(_lock);
}
Perhaps you are calling SomeBlockingMethod from multiple places; then you want to use PulseAll. Or perhaps SomeUnblockingMethod is called before SomeBlockingMethod?
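If the pulse really can arrive before the wait, one hedged way to cover that case is to pair Monitor.Wait with a flag checked under the same lock; the _unblocked field below is my addition, not something from your code:

object _lock = new object();
bool _unblocked = false;   // remembers a pulse that arrived before the wait

void SomeBlockingMethod()
{
    lock (_lock)
    {
        while (!_unblocked)       // also guards against spurious wakeups
            Monitor.Wait(_lock);
        _unblocked = false;
    }
    // ... here only after SomeUnblockingMethod has run
}

void SomeUnblockingMethod()
{
    lock (_lock)
    {
        _unblocked = true;
        Monitor.Pulse(_lock);
    }
}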
I have the following code in my console application:
if (Count % 11 >= 5)
{
    GrabberTask = GrabberTask.ContinueWith(Grabber.ExtractSources);
}
The if clause is inside a loop with a counter. I want to know what happens if the if evaluates to true for a second time before GrabberTask starts for the first time. Is there a better way to keep running the same Task when the if condition evaluates to true? I am extracting the image sources from a webpage and storing them in a LinkedList. Sometimes I get duplicate links added to the LinkedList from two different Tasks. How do I prevent that?
Each time you provide a continuation, the whole continuation will be executed.
What you need is some kind of thread synchronization, so that no more than one thread can execute a given operation against some resource.
The simplest one is the lock statement:
public class Test
{
    private static readonly object _syncLock = new object();

    public void MyMethod()
    {
        lock (_syncLock)
        {
            // No more than one thread will be able to work within this
            // protected code block. Others will be blocked until the
            // thread that acquired the lock leaves the block.
        }
    }
}
Now, if you schedule many continuations in some time interval that might all need to execute the same thing, no more than one continuation will be able to execute it at a time.
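Applied to your duplicate-links problem, a rough sketch might look like this; the _sources, _seen and AddSource names are mine, standing in for wherever ExtractSources adds its links (the HashSet alongside the LinkedList is what filters out duplicates):

using System.Collections.Generic;

private static readonly object _syncLock = new object();
private readonly LinkedList<string> _sources = new LinkedList<string>();
private readonly HashSet<string> _seen = new HashSet<string>();

private void AddSource(string url)
{
    lock (_syncLock)
    {
        // only one continuation can be in here at a time, and the HashSet
        // rejects links that have already been added to the LinkedList
        if (_seen.Add(url))
            _sources.AddLast(url);
    }
}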
Say we declare and run a thread in a method that has a while(1) loop. How can one avoid creating and running a second thread when the method is called again?
I only want one thread to be started by this method; every time the method is called again, there should be no new thread creation. Should I check for the thread name, or should I declare the thread as a field of the class?
Any ideas how to do this?
Thanks,
Juergen
It sounds like the thread should indeed be a field of the class - although you'll have to be careful to access it in a thread-safe way, if there could be several threads calling the method to start with.
What do you want to happen the second time - should the method block, or just finish immediately, or perhaps throw an exception? If you don't need to wait for the thread to finish, you might be able to get away with just a flag instead of keeping a reference to the thread itself.
(Note that I've been assuming this is an instance method and you want one extra thread per instance. If that's not the case, you'll have to adjust accordingly.)
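As a sketch of the field-plus-flag idea (the names are mine, and I've picked the "finish immediately" behaviour from the options above):

private readonly object _startLock = new object();
private Thread _worker;   // the single background thread for this instance

public void EnsureStarted()
{
    lock (_startLock)              // makes the check-then-create thread-safe
    {
        if (_worker != null)
            return;                // already running: just return immediately

        _worker = new Thread(() =>
        {
            while (true)
            {
                // the while(1) work goes here
            }
        });
        _worker.IsBackground = true;
        _worker.Start();
    }
}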
Have the method return a singleton, and start the thread in the singleton constructor.
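Roughly like this, as a sketch (LoopWorker is a made-up name):

using System.Threading;

public sealed class LoopWorker
{
    public static readonly LoopWorker Instance = new LoopWorker();   // created exactly once

    private LoopWorker()
    {
        // the thread is started in the singleton constructor, so it can never be started twice
        Thread t = new Thread(Loop) { IsBackground = true };
        t.Start();
    }

    private void Loop()
    {
        while (true)
        {
            // the while(1) work goes here
        }
    }
}

// the method in question would simply return LoopWorker.Instance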
Could you save the synchronization context the first time, check on subsequent times to see if it matches, and post back to it if necessary?
SynchronizationContext syncContext = null;
...
// the "object state" parameter is required so this method matches the callback signature Post expects
public void HiThar(object state)
{
    if (syncContext == null)
    {
        syncContext = SynchronizationContext.Current;
    }
    else
    {
        syncContext.Post(HiThar, state);
    }
}
I have a rather large class which contains plenty of fields (10+), a huge array (100 KB) and some unmanaged resources. Let me explain with an example:
class ResourceIntensiveClass
{
    private object unmanagedResource; // let it be the expensive resource
    private byte[] buffer = new byte[1024 * 100]; // let it be the huge managed memory
    private Action<ResourceIntensiveClass> OnComplete;

    private void DoWork(object state)
    {
        // do long running task
        OnComplete(this); // notify the caller that the task completed so it can reuse the same object for another task
    }

    public void Start(object dataRequiredForCurrentTask)
    {
        ThreadPool.QueueUserWorkItem(DoWork); // initiate the long running work
    }
}
The problem is that the Start method never returns, and after the 10,000th iteration this causes a stack overflow. I could execute the OnComplete delegate on another thread, giving the Start method a chance to return, but that uses extra CPU time and resources, as you know. So what is the best option for me?
Is there a good reason for doing your calculations recursively? It seems like a simple loop would do the trick, thus obviating the need for incredibly deep stacks. This design seems especially problematic as you are relying on main() to set up your recursion.
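For example, a hedged sketch of the looping version, where DoOneIteration is a made-up name for one unit of the long-running work:

private void DoWork(object state)
{
    // run the 10,000 iterations in a flat loop instead of having each
    // completion call Start again; the stack depth then stays constant
    for (int i = 0; i < 10000; i++)
    {
        DoOneIteration(i);   // hypothetical: whatever one unit of the work is
    }

    OnComplete(this);        // a single notification once everything is done
}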
Recursive methods can get out of hand quite fast. Have you looked into using Parallel LINQ?
You could do something like
(your Array).AsParallel().ForAll(item => item.CallMethod());
You could also look into the Task Parallel Library (TPL);
with tasks, you can define an action and a continuation (ContinueWith) task.
The Reactive Framework (Rx), on the other hand, could handle these on-complete events in an async manner.
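A small sketch of the TPL idea (the variable names and Console output are only illustrative):

using System;
using System.Threading.Tasks;

Task work = Task.Factory.StartNew(() =>
{
    // do the long running task
});

Task followUp = work.ContinueWith(t =>
{
    // runs only after the first task completes, scheduled by the TPL
    // rather than by a recursive callback, so no stack is consumed waiting
    Console.WriteLine("first task done");
});

followUp.Wait();   // optional: block until the continuation has run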
Where are you changing the value of taskData so that its length can ever equal currentTaskIndex? Since the tasks you are assigning to the data are never changing, they are being carried out forever...
I would guess that the problem arises from using the pre-increment operator here:
if (c.CurrentCount < 10000)
    c.Start(++c.CurrentCount);
I am not sure of the semantics of pre-increment in C#; perhaps the value passed to the method call is not what you expect.
But since your Start(int) method assigns the input value to this.CurrentCount as its first step anyway, you should be safe replacing this with:
if (c.CurrentCount < 10000)
    c.Start(c.CurrentCount + 1);
There is no point in assigning to c.CurrentCount twice.
If using the threadpool, I assume you are protecting the counters (c.CurrentCount), otherwise concurrent increments will cause more activity, not just 10000 executions.
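For instance, a sketch only (the _currentCount field and OnTaskCompleted name are mine, standing in for however the completion callback is wired up):

using System.Threading;

private int _currentCount;   // shared counter; only ever touched through Interlocked

private void OnTaskCompleted(ResourceIntensiveClass c)
{
    // Interlocked.Increment returns the new value atomically, so two
    // concurrent completions can never observe (and re-queue from) the same count
    int count = Interlocked.Increment(ref _currentCount);
    if (count < 10000)
        c.Start(count);
}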
There's a neat tool called a ManualResetEvent that could simplify life for you.
Place a ManualResetEvent in your class and add a public OnComplete event.
When you instantiate your class, you can wire the OnComplete event up to some spot in your code, or leave it unwired and ignore it.
This would give your custom class a more conventional shape.
When your long process is complete (I'm guessing this is in a thread), simply call the Set method of the ManualResetEvent.
As for running your long method, it should be in a thread that uses the ManualResetEvent in a way similar to below:
private void DoWork(object state)
{
    ManualResetEvent mre = new ManualResetEvent(false);
    Thread thread1 = new Thread(
        () =>
        {
            // do long running task
            mre.Set();
        });
    thread1.IsBackground = true;
    thread1.Name = "Screen Capture";
    thread1.Start();
    mre.WaitOne();
    OnComplete(this); // notify the caller that the task completed so it can reuse the same object for another task
}
What I have is a loop reading some data, and when a set of circumstances is met I need to instantiate a thread. However, the created thread might not complete before the loop criteria are met again and I need to create another thread doing the same thing. This is part of an OCR app and I haven't done much thread work before.
while loop
    if (criteria)
    {
        observer = new BackgroundWorker();
        observer.DoWork += new DoWorkEventHandler(observer_DoObserving);
        observer.RunWorkerAsync();
    }
The observer_DoObserving function calls the OCR app, waits for a response, processes it appropriately, and sets observer = null at the end. So how would I create multiple instances of the 'observer' thread? Of course, instantly I thought of a class structure; is this an appropriate way to do it, or is there another way that is more appropriate for threading?
I hope this makes sense.
Thanks, R.
You could use the thread pool, specifically ThreadPool.QueueUserWorkItem.
while (something)
{
    if (criteria)
    {
        // QueueUserWorkItem also has an overload that allows you to pass data;
        // that data will then be passed into WorkerMethod when it is called
        ThreadPool.QueueUserWorkItem(new WaitCallback(WorkerMethod));
    }
}

// ...

private void WorkerMethod(object state)
{
    // do work here
}
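And if you need to hand data to the worker, the overload mentioned in the comment might be used roughly like this (OcrJob is a made-up type, not part of your code):

// a hypothetical payload type for whatever the worker needs
private class OcrJob
{
    public string ImagePath { get; set; }
}

// the second argument shows up as the 'state' parameter of WorkerMethod
ThreadPool.QueueUserWorkItem(new WaitCallback(WorkerMethod), new OcrJob { ImagePath = "page1.png" });

private void WorkerMethod(object state)
{
    OcrJob job = (OcrJob)state;   // cast back to the type you passed in
    // do work with job here
}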
How you handle this depends in large part on whether the background thread needs to communicate anything to the main thread when it's done. If the background thread really is "fire and forget", then there's no particular reason why you need to maintain a reference to the observer. So you could write:
while loop
{
    if (criteria)
    {
        BackgroundWorker observer = new BackgroundWorker();
        observer.DoWork += new DoWorkEventHandler(observer_DoObserving);
        observer.RunWorkerAsync();
    }
}
The thread does its work and goes away. observer is a local variable that goes out of scope when execution leaves the if block. There's no way that the variable will be overwritten if you have to start another observer thread before the first one is finished.
If you need to keep track of information for individual observers, create an object of some type (a class that you define) that contains information about the worker's state, and pass it to the RunWorkerAsync method. The worker receives it in DoWorkEventArgs.Argument, can modify it and send you progress notifications (see the ProgressChanged event and the ReportProgress method), and can hand it back through DoWorkEventArgs.Result so that you can read it in the RunWorkerCompleted handler via RunWorkerCompletedEventArgs.Result.
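A rough sketch of that, where ObserverState and StartObserver are made-up names:

class ObserverState   // holds whatever per-observer information you care about
{
    public string ImagePath { get; set; }
    public string OcrResult { get; set; }
}

void StartObserver(string imagePath)
{
    BackgroundWorker observer = new BackgroundWorker();
    observer.DoWork += (s, e) =>
    {
        ObserverState st = (ObserverState)e.Argument;   // the object passed to RunWorkerAsync
        // call the OCR app, wait for the response...
        st.OcrResult = "whatever the OCR app returned";
        e.Result = st;                                   // hand it on to RunWorkerCompleted
    };
    observer.RunWorkerCompleted += (s, e) =>
    {
        ObserverState st = (ObserverState)e.Result;
        // process st appropriately here
    };
    observer.RunWorkerAsync(new ObserverState { ImagePath = imagePath });
}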
I am not entirely able to grasp what you are doing exactly, so I may or may not be helpful here.
You seem to be, in part, asking whether it's appropriate to create a class to hold some data indicating the state of a thread or what it's working on. That is entirely appropriate to do, provided the object is not an 'expensive' one to create (don't, for instance, create Exception objects and throw them around all the time).