Wait For Asynchronous Events to Finish in Foreach - c#

How can I send requests to a server in a foreach loop, but have the loop wait for each response to be processed before continuing on to the next request?
My problem: inside the foreach I send many requests, and the loop continues without processing the information I get back in each response.
For example:
foreach (DataLoader.GetInfo items in listBoxFetInfo.Items)
{
    DownloadInfo(items.CompanyName);
}
and
void DownloadInfo(string name)
{
    int requestId = feed.SendCompanyNameRequest(symbol.ToUpper(), sendType, sendTimePeriod, sendDateFrom, DateTime.Now);
}
and
feed.RequestCompanyName += new IFeedEvents_RequestCompanyNameEventHandler(feed_RequestName);
and
void feed_RequestName(int originalRequestId, short ResponseCode, string symbol, short symbolStatusCode, object records)
{
    //save to file
}
I can't use Start Multiple Async Tasks and Process Them As They Complete, because that solution requires a CancellationToken, which I have no way to add (not inside void feed_RequestName).
What other solution is there for this problem?
Also, I cannot change the signatures of:
feed_RequestName(int originalRequestId, short ResponseCode, string symbol, short symbolStatusCode, object records)
and
feed.SendCompanyNameRequest(symbol.ToUpper(), sendType, sendTimePeriod, sendDateFrom, DateTime.Now);

I'm taking a stab in the dark here as to how your library works, but this is what I would do. Since you can't modify the signatures, you can't use Task, as your last question suggested.
What I see here is that you have a method that begins a request and then an event that is called when the request is completed. I'm assuming your request is running on a separate thread or something inside the library where feed is from. What you need to do is wait for that event to complete once before proceeding to the next item. If this isn't what your question is asking or how your library works, let me know.
If you are not executing the requests in parallel (meaning only one is executing at a time and nobody else is using your feed object), you can use the System.Threading.ManualResetEvent class to make your one method depend on the other really easily. What you can do is the following:
1. In your DownloadInfo method, Reset the ManualResetEvent so that whenever somebody calls WaitOne, it will "block" the thread and prevent it from going on. It will wait until somebody else calls Set on the ManualResetEvent.
2. After calling the method that gets your requestId, call WaitOne on the ManualResetEvent. This will wait until "somebody else" calls Set on the ManualResetEvent.
3. In your feed_RequestName method, Set the ManualResetEvent once you are done processing. This will make the WaitOne that you called in step 2 stop blocking and continue. If you store the results of the request in some variables on the object these things live in before calling Set, you can access the results from your DownloadInfo method after the WaitOne.
A short, incomplete example:
class YourClassWhereYouPutTheseThings
{
    private ManualResetEvent _mre = new ManualResetEvent(true);

    //...your other stuff here...

    public YourClassWhereYouPutTheseThings()
    {
        //...your existing stuff
        feed.RequestCompanyName += new IFeedEvents_RequestCompanyNameEventHandler(feed_RequestName);
        //...your existing stuff
    }

    void YourMethod()
    {
        foreach (DataLoader.GetInfo items in listBoxFetInfo.Items)
        {
            DownloadInfo(items.CompanyName);
        }
    }

    void DownloadInfo(string name)
    {
        this._mre.Reset(); //make everyone listening to _mre wait until Set is called
        int requestId = feed.SendCompanyNameRequest(symbol.ToUpper(), sendType, sendTimePeriod, sendDateFrom, DateTime.Now);
        this._mre.WaitOne(); //wait for the feed_RequestName method to be called once
    }

    void feed_RequestName(int originalRequestId, short ResponseCode, string symbol, short symbolStatusCode, object records)
    {
        //save to file
        //store your information on the object somewhere if you would like
        //after saving to the file, set the ManualResetEvent so that the WaitOne() above can proceed
        this._mre.Set();
    }
}
Important note: If your feed is shared among multiple objects which could execute requests at the same time, you need to make sure that you don't interpret requests made by other objects as the request that your DownloadInfo method made. Any request will cause your feed_RequestName method to be called, so you need to figure out which call of that method corresponds to your request. Here is one suggestion on how to make that work:
1. Store the requestId that you get in DownloadInfo somewhere that feed_RequestName can access. This could be problematic if feed_RequestName gets called before your calling method has a chance to store the id; you will need to figure out a way around that race.
2. In feed_RequestName, compare against the stored request id and only call Set on the ManualResetEvent if it matches.
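A sketch of that id check, building on the example above (the _pendingRequestId field is added purely for illustration, and the race just described still applies):

```csharp
// Sketch: only release the waiter when the completed request is ours.
// '_pendingRequestId' is a hypothetical field, not part of the library.
private volatile int _pendingRequestId = -1;

void DownloadInfo(string name)
{
    this._mre.Reset();
    _pendingRequestId = feed.SendCompanyNameRequest(name.ToUpper(), sendType, sendTimePeriod, sendDateFrom, DateTime.Now);
    this._mre.WaitOne();
}

void feed_RequestName(int originalRequestId, short ResponseCode, string symbol, short symbolStatusCode, object records)
{
    //save to file
    if (originalRequestId == _pendingRequestId)
    {
        this._mre.Set(); // this completion belongs to our request
    }
}
```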
Another note: Your library is geared towards executing requests in parallel. Waiting for each request to finish before going to the next one is really inefficient. What you should really do is something like (pseudocode):
DownloadItems:
    Lock the object so that only one person uses the method at a time
    foreach item in requests
        call your start request method and get the id
        put the id into a thread-safe set stored on the object
    wait for the set to become empty

feed_RequestName:
    Do your thing that you've been doing
    Remove the id from the thread-safe set
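Assuming nobody else is using the feed object (so every completion event belongs to this batch), the thread-safe set can collapse into a simple countdown. A sketch, with CountdownEvent standing in for the set:

```csharp
// Sketch: fire all requests, then block until every completion callback
// has signalled. Assumes no other code shares this feed instance.
private CountdownEvent _countdown;

void DownloadItems(List<string> names)
{
    _countdown = new CountdownEvent(names.Count);
    foreach (var name in names)
    {
        feed.SendCompanyNameRequest(name.ToUpper(), sendType, sendTimePeriod, sendDateFrom, DateTime.Now);
    }
    _countdown.Wait(); // "wait for the set to become empty"
}

void feed_RequestName(int originalRequestId, short ResponseCode, string symbol, short symbolStatusCode, object records)
{
    //save to file, as before
    _countdown.Signal(); // one outstanding request done
}
```

If the feed is shared, keep the id set from the pseudocode and only call Signal for ids you put into it.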

Can you use while instead of foreach?
Like:
while (not done processing all requests)
{
    //grab and process next request
}


C# Suspending Thread until Server responds

I am trying to write a function that, when called, returns information from a server back to the caller. What I want is for this function to create a thread that issues the command to the server, and then suspends itself until the server responds with the answer.
public AccountState GetAccount(string key)
{
    AccountState state = null;
    Thread t = new Thread(() =>
    {
        _connection.SomeCommandSentToServer(key);
        accountRequests.TryAdd(key, (Thread.CurrentThread, null));
        //Suspend current thread until ServerReponseHere is called
        Thread.CurrentThread.Suspend();
        //We have been resumed, value should be in accountRequests now
        accountRequests.TryRemove(key, out var item);
        state = item.AccountState;
    });
    t.Start();
    return state;
}
public ConcurrentDictionary<string, (Thread Thread, AccountState AccountState)> accountRequests = new ConcurrentDictionary<string, (Thread Thread, AccountState AccountState)>();

///Once the server is done with the processed command, a call to this function is made
public void ServerReponseHere(string key, AccountState state)
{
    accountRequests.TryGetValue(key, out var item);
    accountRequests.TryUpdate(key, (item.Thread, state), item);
    item.Thread.Resume();
}
My idea is that in a different function, when the server responds back, it calls the ServerReponseHere function shown above.
C# says that Suspend / Resume are deprecated functions however, -- what is a better way to do this?
UPDATE
Clarification about "SomeCommandSentToServer" -- This just sends a command to the server via TCP sockets.
In that call, all that is really happening is a transmission to the server. I'm using a library that uses WinSock2.h call of "Send()" -- Yes I know it is a deprecated library... but the library I'm using requires it.
I have a separate thread that polls input from the server. So I have no way to "await" on this SomeCommandSentToServer -- I would need to await on some sort of call back function (aka the resume function I was mentioning) -- to make this work.
I am unsure how to do that
With all the information available from the question, here is what you should aim for when using the async / await pattern:
public async Task<AccountState> GetAccountAsync(string key)
{
    // The method SomeCommandSentToServerAsync must be changed to support async.
    AccountState state = await _connection.SomeCommandSentToServerAsync(key);
    return state;
}
It is highly unlikely that you need anything else. By that, I mean you will not have to manipulate threads directly, put them in a concurrent dictionary and manually suspend or resume them because it looks horrible from a maintenance perspective ;)
.NET will take care of the threading part, meaning the magic of the async infrastructure will most likely release the current thread (assuming a call is actually made to the server) until the server returns a response.
Then the infrastructure will either use the existing synchronization context -if you are on a UI thread for instance- or grab a thread from the thread pool -if not- to run the rest of the method.
You could even reduce the size of the method a bit more by simply returning a Task with a result of type AccountState:
public Task<AccountState> GetAccountAsync(string key)
{
    // The method SomeCommandSentToServerAsync must be changed to support async.
    return _connection.SomeCommandSentToServerAsync(key);
}
In both examples, you will have to make the callers async as well:
public async Task TheCallerAsync()
{
    // Grab the key from somewhere.
    string key = ...;
    var accountState = await <inst>.GetAccountAsync(key);
    // Do something with the state.
    ...
}
Turning a legacy method into an async method
Now, regarding the legacy SomeCommandSentToServer method. There is a way to await that legacy method. Yes, you can turn that method into an asynchronous method that can be used with the async / await.
Of course, I do not have all the details of your implementation but I hope you will get the idea of what needs to be done. The magical class to do that is called TaskCompletionSource.
What it allows you to do is to give you access to a Task. You create the instance of that TaskCompletionSource class, you keep it somewhere, you send the command and immediately return the Task property of that new instance.
Once you get the result from your polling thread, you grab the instance of TaskCompletionSource, get the AccountState and call SetResult with the account state. This will mark the task as completed and do the resume part you were asking for :)
Here is the idea:
public Task<AccountState> SomeCommandSentToServerAsync(string key)
{
    var taskCompletionSource = new TaskCompletionSource<AccountState>();

    // Find a way to keep the task in some state somewhere
    // so that you can get it in the polling thread.

    // Do the legacy WinSock Send() command.

    return taskCompletionSource.Task;
}

// This would be, I guess, your polling thread.
// Again, I am sure it is not 100% accurate but
// it will hopefully give you an idea of where the key pieces must be.
private void PollingThread()
{
    while (must_still_poll)
    {
        // Waits for some data to be available.
        // Grabs the data.
        if (this_is_THE_response)
        {
            // Get the response and build the account state somehow...
            AccountState accountState = ...

            // Key piece #1
            // Grab the TaskCompletionSource instance somewhere.

            // Key piece #2
            // This is the magic line:
            taskCompletionSource.SetResult(accountState);

            // You can also do the following if something goes wrong:
            // taskCompletionSource.SetException(new Exception());
        }
    }
}
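One way to fill in the "keep it somewhere" pieces is a dictionary keyed by the request key. This is only a sketch under that assumption; pendingRequests and CompleteRequest are illustrative names, not part of the original code:

```csharp
// Sketch: one TaskCompletionSource per outstanding request, keyed by 'key'.
// 'pendingRequests' and 'CompleteRequest' are hypothetical names.
private readonly ConcurrentDictionary<string, TaskCompletionSource<AccountState>> pendingRequests
    = new ConcurrentDictionary<string, TaskCompletionSource<AccountState>>();

public Task<AccountState> SomeCommandSentToServerAsync(string key)
{
    var tcs = new TaskCompletionSource<AccountState>();
    pendingRequests[key] = tcs; // keep it so the polling thread can find it
    // ... do the legacy WinSock Send() here ...
    return tcs.Task;
}

// Called from the polling thread once THE response for 'key' has arrived.
private void CompleteRequest(string key, AccountState accountState)
{
    if (pendingRequests.TryRemove(key, out var tcs))
    {
        tcs.SetResult(accountState); // resumes whoever awaited the task
    }
}
```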

Cancelling async uploading task

I've got an Uploader class with one method, Upload:
public static int Upload(string endpoint, object objectToBeUploaded)
{
    Source.Token.ThrowIfCancellationRequested();
    var repos = new UploadRepository(endpoint);
    return repos.Upload(objectToBeUploaded);
}
The Source is a static CancellationTokenSource available in the project.
I also have a list of endpoints I need to upload a certain object for.
The code in the Form (it's a very small project using WinForms) looks like this:
private async Task UploadObjectAsync(string endpoint, object objectToBeUploaded)
{
    try
    {
        int elementId = await Task.Factory.StartNew(
            () => Uploader.Upload(endpoint, objectToBeUploaded));
        //do something with the returned value..
    }
    catch (OperationCanceledException ex)
    {
        //handle the exception..
    }
}
And then I set the btnUpload.Click handler like this so I can later use it:
this.btnUpload.Click += async (s, e) =>
{
    foreach (var endpoint in endpoints)
    {
        await UploadObjectAsync(endpoint, someObject);
    }
};
The problem is that whenever I start uploading to all the endpoints (how they are obtained is irrelevant) and I decide to cancel the uploading process using Source.Cancel(), the first UploadObjectAsync will always go through, since the Source.Token.ThrowIfCancellationRequested() check in the Upload method has already been passed. The rest of the tasks will be cancelled normally and handled gracefully.
How am I to restructure this code in order to make sure that the first UploadObjectAsync task will also be cancelled?
It is worth mentioning that I also don't have access to the source code of the uploading process itself (service reference) - the repos.Upload(objectToBeUploaded) in my Upload method.
You need to make your UploadRepository.Upload take a CancellationToken.
Especially when that's the one doing the I/O operation - that's when async/await really pays off.
That will also help you get rid of Task.Factory.StartNew, since the Upload method will already return a Task. There will be no need to spin off a task.
In your current setup, given enough time for the tasks to start (and go through your ThrowIfCancellationRequested) you won't be able to cancel any upload. Even if it takes 30 seconds.
Also, you might be interested in: Task.Run
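A sketch of what that could look like, assuming the service reference can be regenerated with task-based async operations (UploadAsync here is a hypothetical name, not the actual generated method):

```csharp
// Hypothetical async variant of the Upload method that flows the token
// all the way down to the I/O call, so an in-flight upload can be cancelled.
public static async Task<int> UploadAsync(
    string endpoint, object objectToBeUploaded, CancellationToken token)
{
    token.ThrowIfCancellationRequested();
    var repos = new UploadRepository(endpoint);
    // Assumes the regenerated service client exposes a task-based operation
    // that itself accepts the token.
    return await repos.UploadAsync(objectToBeUploaded, token);
}
```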
There isn't anything practical you can do. The Upload method doesn't take a token. The first task has already passed the cancellation check by the time you hit the cancel button. You can prove to yourself that the cancel is a timing issue by adding a 10-second sleep ahead of the ThrowIfCancellationRequested call. All tasks would then cancel.
The problem is that you can't stop the process that happens inside the Upload function unless it checks the state of the CancellationToken and terminates itself.
So what you could do is to abort the thread that is executing by doing something like this:
int elementId = await Task.Factory.StartNew(() =>
{
    try
    {
        using (Source.Token.Register(Thread.CurrentThread.Interrupt))
        {
            return Uploader.Upload(endpoint, objectToBeUploaded);
        }
    }
    catch (ThreadInterruptedException ex)
    {
        throw new OperationCanceledException("Upload cancelled", ex);
    }
}, Source.Token);
Using the Source.Token.Register(delegate) function, you cause the token to call that delegate when the token is cancelled. This way the thread that is currently executing the upload should throw an exception right away.
This method only works if the thread enters the WaitSleepJoin state from time to time, because the exception is only raised while the thread is in that state. Have a look at the documentation of the Thread.Interrupt function.
The alternative is to use Thread.Abort and the ThreadAbortException. This will kill your thread in any case, but it may corrupt the internal state of your service, because locks the thread holds won't be released properly. So be very careful using this method.

What's the best way to perform a task not more than once every second?

I'm writing a Windows Store app that will rely on a JSON api. The provider of the API asks that no more than 1 api request is made per second.
So I created a class that would allow me to queue requests on a blocking queue, and on a background thread it would run a loop that resembles the following:
Loop
{
    // this will block until a request is added to the queue
    MyRequest = Queue.Take()

    // Create task to make the api request here.

    Thread.Sleep(1000)
}
This way, it would wait at least one second before trying to Dequeue another request.
I've found that Thread.Sleep is not available for windows store apps. Task.Delay() seems unnecessarily wasteful, since it will create a new task each time it is called.
I feel like there is probably a known way to do this that I'm not aware of?
Thanks,
I know Task.Delay() seems wasteful, but it's recommended by a Microsoft Employee and moderator on MSDN here
Using .Sleep() or an infinite loop like TGH suggests would cause the program to become unresponsive while it waits. If you want it to be responsive while processing your queue, you'd use something like this:
await Task.Delay(1000);
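For example, the queue-draining loop could become an async method built around that delay. A sketch, assuming requests live in a BlockingCollection and SendRequestAsync is a stand-in for the actual API call:

```csharp
// Sketch: drain the queue without blocking any thread, waiting at least one
// second between requests. 'SendRequestAsync' is a hypothetical stand-in for
// the real API call.
async Task ProcessQueueAsync(BlockingCollection<string> queue)
{
    foreach (var request in queue.GetConsumingEnumerable())
    {
        await SendRequestAsync(request);
        await Task.Delay(1000); // stay under the 1-request-per-second limit
    }
}
```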
If you want the task to execute in the background, I would suggest a System.Threading.Timer. Set it up as a one-shot rather than a periodic timer, and refresh it after every request. Something like:
System.Threading.Timer myTimer;

// initialize timer
myTimer = new Timer(TimerProc, null, 1000, Timeout.Infinite);

void TimerProc(object state)
{
    // do request here
    // then reset the timer
    myTimer.Change(1000, Timeout.Infinite);
}
That will have the effect of waiting one second between when one request ends and the next one starts.
If you want ticks to execute on one-second intervals, you have to make sure that they can't overlap. That's easy enough with Monitor. You'll need a lock object, and to make your timer a periodic timer:
object timerLock = new object();

// initialize periodic timer
myTimer = new Timer(TimerProc, null, 1000, 1000);
And your timer proc:
void TimerProc(object state)
{
    if (!Monitor.TryEnter(timerLock))
    {
        // a request is currently being processed
        return;
    }
    try
    {
        // do request here
    }
    catch (expected exceptions)
    {
    }
    Monitor.Exit(timerLock);
}
Note that I don't do the Monitor.Exit in a finally. See Eric Lippert's Locks and exceptions do not mix for the reasons why.

Threading Web requests handled in Main?

I'm writing an application in C#, and I am creating multiple BackgroundWorker threads to grab information from webpages. Despite them being BackgroundWorkers, my GUI Form is becoming unresponsive.
When I am debugging, I pause when the program goes unresponsive, and I can see that I am in the Main Thread, and I am paused on the webpage fetching method. This method is only called from new threads, though, so I can’t figure out why I would be there in the Main Thread.
Does this make any sense? What can I do to make sure the web requests are only being handled in their respective threads?
EDIT: some code and explanation
I am processing a large list of addresses. Each thread will be processing one or more addresses. I can choose how many threads I want to create (I keep it modest :))
//in "Controller" class
public void process()
{
    for (int i = 1; i <= addressList.Count && i <= numthreads; i++)
    {
        BackgroundWorker bw = new BackgroundWorker();
        bw.DoWork += doWork;
        bw.RunWorkerAsync((object)i);
    }
}

public void doWork(object sender, DoWorkEventArgs e)
{
    //create an object that has the web fetching method, call it WorkObject
    //WorkObject keeps a reference to Controller.
    //When it is done getting information, it will send it to Controller to print
    //generate a smaller list of addresses to work on, using e.Argument (should be 'i' from the above 'for' loop)
    WorkObject.workingMethod();
}
When WorkObject is created, it uses “i” to know what thread number it is. It will use this to get a list of web addresses to get information from (from a larger list of addresses which is shared by the main Form, the Controller, and each of the WorkObjects – each thread will process a smaller list of addresses). As it iterates over the list, it will call the “getWebInfo” method.
//in "WorkObject" class
public static WebRequest request;

public void workingMethod()
{
    //iterate over the small list of addresses. For each one,
    getWebInfo(address);
    //process the info a bit...then
    myController.print();
    //note that this isn't a simple "for" loop, it involves event handlers and threading
    //Timers to make sure one is done before going on to the next
}

public string getWebInfo(string address)
{
    request = WebRequest.Create(address);
    WebResponse response = request.GetResponse();
    StreamReader reader = new StreamReader(response.GetResponseStream(), Encoding.UTF8);
    string content = reader.ReadToEnd();
    return content;
}
You should be performing all the web-work in your BackgroundWorker's DoWork event, are you?
Some clipped code may help us to understand what's going on.
In a similar situation I found I had better control by creating my own threads (no BackgroundWorker and no ThreadPool) and throttling the number of active connections. Write the IP addresses to a Queue and then pick each one off, passing it to a helper class where you call its DoSomething method on a new thread. Throttle by incrementing a counter when you launch a thread and decrement it when the thread completes. You can use callback routines from your helper class to signal your UI thread to update the interface or indicate a thread is finished.
Use your UI to vary the throttle limit and watch the Task Manager to see the effect on memory and CPU usage. You can add a maxconnection setting in your app.config to get more simultaneous connections.
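A sketch of that throttling pattern, using a SemaphoreSlim in place of the raw counter described above (FetchAndReport is a hypothetical stand-in for the helper class's DoSomething method):

```csharp
// Sketch: manually created threads, throttled so that only a limited number
// of connections are active at once. 'FetchAndReport' is a made-up name.
var throttle = new SemaphoreSlim(4); // at most 4 active connections
var addresses = new Queue<string>(addressList);

while (addresses.Count > 0)
{
    string address = addresses.Dequeue();
    throttle.Wait(); // blocks until a connection slot frees up
    var t = new Thread(() =>
    {
        try { FetchAndReport(address); }        // do the web work
        finally { throttle.Release(); }          // free the slot when done
    });
    t.IsBackground = true;
    t.Start();
}
```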
I couldn't figure out why the background workers were causing hanging, but instead of using synchronous HTTP requests I switched to asynchronous using this guide:
http://www.developerfusion.com/code/4654/asynchronous-httpwebrequest/
Cleared up all the hanging. Thanks for the responses.

How can I return a response from a WCF call and then do more work?

I have a synchronous web service call that returns a message. I need to quickly return a message that basically says that order was received. I then need to spend a couple of minutes processing the order, but cannot block the service call for that long. So how can I return from the web service, and then do some more stuff? I'm guessing I need to fork some other thread or something before I return, but I'm not sure of the best approach.
string ProcessOrder(Order order)
{
    if (order.IsValid)
    {
        return "Great!";
        //Then I need to process the order
    }
}
You can open a new thread and have it do what you need, while your main thread returns "Great!".
string ProcessOrder(Order order)
{
    if (order.IsValid)
    {
        //Starts a new thread
        ThreadPool.QueueUserWorkItem(th =>
        {
            //Process Order here
        });
        return "Great!";
    }
}
You could start your big amount of work in a separate thread:
public string ProcessOrder(Order order)
{
    if (order.IsValid)
    {
        System.Threading.ParameterizedThreadStart pts = new System.Threading.ParameterizedThreadStart(DoHardWork);
        System.Threading.Thread t = new System.Threading.Thread(pts);
        t.Start(order);
        return "Great!!!";
    }
}

public void DoHardWork(object order)
{
    //Stuff Goes Here
}
Is the work you're doing "important?" I assume it is. You could use a thread, but you'll have to be OK with the possibility that your work might get interrupted if the machine restarts or if the ASP.NET worker process recycles. This would likely lead to the work not getting done even though you already told the client you had accepted it. This might be acceptable or not, depending on your use case.
I would consider taking the work item you receive from the synchronous service request and putting it in a persistent queue. An easy way to do this is to use a transactional MSMQ queue. Your synchronous service puts the work request in the queue, and you have a few worker threads pulling work requests out of the queue. Wrap your queue read and the work in a transaction, and don't commit the transaction until the work is completed. If your machine or process shuts down in the middle of a request, it will be restarted automatically the next time it starts up.
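A rough sketch of that worker loop using System.Messaging (assumptions: a local transactional queue already exists at the path shown, and DoTheWork is an illustrative name for the order processing):

```csharp
// Sketch: a worker thread pulls work requests from a transactional MSMQ
// queue. Because the receive and the work commit together, an interrupted
// work item reappears in the queue on restart. 'DoTheWork' is hypothetical.
using System.Messaging;

var queue = new MessageQueue(@".\private$\orders");

while (true)
{
    using (var tx = new MessageQueueTransaction())
    {
        tx.Begin();
        Message msg = queue.Receive(tx); // blocks until a message arrives
        DoTheWork(msg);                  // process inside the transaction
        tx.Commit();                     // only now is the message removed
    }
}
```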
You could also look at utilizing the PIAB (Policy Injection Application Block) to accomplish work after a method call.
