WebService Async method question - c#

While playing around with the idea of using a web service for my project, I noticed that a couple of members (a method and an event) were automatically created for me. I can see the purpose of the Completed event, but I am not sure where the Async method would be used.
Web method declaration:
[WebMethod]
public string HelloBrad()
{
    return "Hello Brad";
}
Consumption of the service in code-behind:
localhost.Service1 service = new localhost.Service1();
service.HelloBradAsync
service.HelloBradCompleted
service.HelloBrad
Could somebody please explain the usage of the HelloBradAsync method?
Thanks

The async methodology allows your code to continue executing while the server is processing the message. Using your normal service.HelloBrad call, the thread blocks until the web service returns a response.
Instead, if you call HelloBradAsync, execution immediately moves on to your next line of code. When the server is done, the result is delivered through the HelloBradCompleted event.
This is so that the web service call doesn't block your primary thread while it executes, and it is definitely the proper way of doing business. It might require a change in your programming paradigm, but you'll find the benefits outweigh the costs.
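For illustration, a minimal sketch of how the generated pair is typically used, assuming the proxy raises HelloBradCompleted with event args exposing a Result property (as generated proxies usually do):
localhost.Service1 service = new localhost.Service1();

// Subscribe first, then start the call; the calling thread is free while the server works.
service.HelloBradCompleted += (sender, e) =>
{
    // e.Result holds the string returned by the web method ("Hello Brad").
    Console.WriteLine(e.Result);
};

service.HelloBradAsync();   // returns immediately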

Related

That async-ing feeling - httpclient and mvc thread blocking

Dilemma, dilemma...
I've been working up a solution to a problem that uses async calls to the HttpClient library (GetAsync => ConfigureAwait(false) etc). In a console app, my DLL is very responsive, and the mixture of async/await calls and Parallel.ForEach(=>) really makes me glow.
Now for the issue. After moving from this test harness to the target app, things have become problematic. I'm using ASP.NET MVC 4 and have hit a few issues. The main issue really is that calling my process on a controller action actually blocks the main thread until the async actions are complete. I've tried using an async controller pattern, I've tried using Task.Factory, I've tried using new Threads. You name it, I've tried all the flavours - and then some!
Now, I appreciate that the nature of HTTP is not designed to facilitate long processes like this, and there are a number of articles here on SO that say don't do it. However, there are mitigating reasons why I NEED to use this approach. The main reason that I need to run this in MVC is that I actually update the live data cache (on the MVC app) in real time via raising an event in my DLL's code. This means that fragments of the 50-60 data feeds can be pushed out live before the entire async action is complete. Therefore, client apps can receive partial updates within seconds of the async action being instigated. If I were to delegate the process out to a console app that ran the entire process in the background, I'd no longer be able to harness those partial fragment updates, and this is the raison d'être behind the entire choice of this architecture.
Can anyone shed light on a solution that would allow me to mitigate the blocking of the thread, whilst at the same time allowing each async fragment to be consumed by my object model and fed out to the client apps (I'm using SignalR to make these client updates)? A kind of nirvana would be a scenario where an out-of-process cache object could be shared between numerous processes - the cache update could then be triggered and consumed by my MVC process (aka http://devproconnections.com/aspnet-mvc/out-process-caching-aspnet). And so back to reality...
I have also considered using a secondary webservice to achieve this, but would welcome other options before once again over engineering my solution (there are already many moving parts and a multitude of async Actions going on).
Sorry not to have added any code; I'm hoping for practical philosophy/insights rather than code help on this, though I would of course welcome coded examples that illustrate a solution to my problem.
I'll update the question as we move in time, as my thinking process is still maturing on this.
[edit] - for the sake of clarity, the snippet below is my Brothers Grimm code collision (extracted from a larger body of work):
Parallel.ForEach(scrapeDataBases, new ParallelOptions()
{
    MaxDegreeOfParallelism = Environment.ProcessorCount * 15
},
async dataBase =>
{
    await dataBase.ScrapeUrlAsync().ConfigureAwait(false);
    await UpdateData(dataType, (DataCheckerScrape)dataBase);
});
async and Parallel.ForEach do not mix naturally (the async lambda becomes an async void delegate, so ForEach considers each iteration finished at its first await), so I'm not sure what your console solution looks like. Furthermore, Parallel should almost never be used on ASP.NET at all.
It sounds like what you would want is to just use Task.WhenAll.
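For what it's worth, a rough sketch of that shape, reusing the names from the snippet in the question (ScrapeUrlAsync and UpdateData are assumed to be awaitable as shown there):
// Start one task per database; no Parallel.ForEach needed, because the work is I/O-bound.
var scrapeTasks = scrapeDataBases.Select(async dataBase =>
{
    await dataBase.ScrapeUrlAsync().ConfigureAwait(false);
    await UpdateData(dataType, (DataCheckerScrape)dataBase);
});

// Asynchronously wait for every scrape/update pair to finish without blocking the request thread.
await Task.WhenAll(scrapeTasks);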
On a side note, I think your reasoning around background processing on ASP.NET is incorrect. It is perfectly possible to have a separate process that updates the clients via SignalR.
Given that your question is pretty high level without a lot of code, you could try Reactive Extensions (Rx).
Something like:
private IEnumerable<Task<Scraper>> ScrappedUrls()
{
    // Return the 50 to 60 tasks, one for each website, here.
    // I assume they all return the same type.
    // return .ScrapeUrlAsync().ConfigureAwait(false);
    throw new NotImplementedException();
}

public async Task<IEnumerable<ScrapeOdds>> GetOdds()
{
    var results = new Collection<ScrapeOdds>();
    var urlRequest = ScrappedUrls();
    var observableUrls = urlRequest.Select(u => u.ToObservable()).Merge();
    var publisher = observableUrls.Publish();
    var hubContext = GlobalHost.ConnectionManager.GetHubContext<OddsHub>();
    publisher.Subscribe(scraper =>
    {
        // Whatever you do to convert to the result set
        var scrapedOdds = scraper.GetOdds();
        results.Add(scrapedOdds);
        // Update anything else you want when it arrives.
        // Update SignalR here
        hubContext.Clients.All.UpdatedOdds(scrapedOdds);
    });
    // Will fire off the subscriptions and not continue until they are done.
    await publisher;
    return results;
}
Merge will process the results as they come in. You can then update the SignalR hubs, plus whatever else you need to update, as each result arrives. The controller action still has to wait for them all to come in; that's why there is an await on the publisher.
I don't really know whether HttpClient is going to like having 50-60 web calls in flight at once. If it doesn't, you can convert the IEnumerable to an array and break it down into smaller chunks (see the sketch below). There should also be some error checking in there. With Rx you can also tell it to SubscribeOn and ObserveOn different threads, but with everything being pretty much async that shouldn't be necessary.
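If the volume of simultaneous calls does become a problem, one simple way to cap concurrency is a SemaphoreSlim (a sketch independent of the Rx code above; urls and httpClient are placeholders for whatever the scrapers actually use):
var throttle = new SemaphoreSlim(10);   // allow at most 10 requests in flight; tune as needed

var tasks = urls.Select(async url =>
{
    await throttle.WaitAsync().ConfigureAwait(false);
    try
    {
        // Placeholder for the real scrape call.
        return await httpClient.GetStringAsync(url).ConfigureAwait(false);
    }
    finally
    {
        throttle.Release();
    }
});

var pages = await Task.WhenAll(tasks);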

C#: will there be a recursion limit issue with this approach borrowed from javascript

Wondering if an approach that's possible in javascript is OK to use in C#.
In a JavaScript application, when a large number of web requests have to be made one after the other, the UI can be kept responsive by eschewing a for-loop and instead using a stack (an array) and events. The next URL is popped from the array in the success event handler, which keeps making another request until the stack is empty. Can a similar approach be taken in C# with a Stack? Is the following legit?
urlStack ....                            // stack of URLs
myRequestMaker = new WebRequestMaker();  // my custom object
myRequestMaker.ResponseReceived += (s, e) =>
{
    // e contains info whether the request succeeded, failed, or timed out
    url = pop another url from urlStack if it's not empty
    (s as WebRequestMaker).MakeWebRequest(url);
};
url = pop first url from urlStack
myRequestMaker.MakeWebRequest(url);
(The ResponseReceived event is also raised by the WebRequestMaker object when the request times out, BTW.)
In JavaScript, you can hit maximum recursion limits using this approach, and to get around that you can wrap the method invocation made inside the success event handler in a setTimeout. Will invoking myRequestMaker's MakeWebRequest method inside the ResponseReceived event handler run into analogous issues?
Yes, your code will eventually hit a StackOverflowException. You can simulate the setTimeout behaviour using the Task class (in System.Threading.Tasks), which runs code asynchronously:
myRequestMaker.ResponseReceived += (s, e) =>
{
    // e contains info whether the request succeeded, failed, or timed out
    url = pop another url from urlStack if it's not empty
    Task.Run(() => (s as WebRequestMaker).MakeWebRequest(url));
};
(Of course, your urlStack should now be a concurrent collection, e.g. ConcurrentStack<T>, to avoid race conditions.)
Optimally, you would not use the code above, but instead implement a MakeWebRequestAsync method that gets your job done asynchronously. Most built-in long-running/blocking methods in .NET (such as reading and writing from/to streams) already provide such a method, so if your WebRequestMaker is actually using a System.Net.WebClient you can call the async version. More information about tasks and async/await can be found on MSDN.
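For illustration, a minimal async shape that avoids the event/recursion pattern altogether; MakeWebRequestAsync is a hypothetical Task-returning method, and this loop would live inside an async method:
// A plain loop, no recursion: each iteration awaits the previous request before starting the next.
while (urlStack.Count > 0)
{
    string url = urlStack.Pop();
    try
    {
        await myRequestMaker.MakeWebRequestAsync(url);   // hypothetical async version
    }
    catch (Exception)
    {
        // Handle failure/timeout for this URL, then carry on with the rest of the stack.
    }
}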
Yes it will; every function call is pushed onto the call stack and popped when it finishes.
The error you mention is raised when the call stack is full, not just in JavaScript but in most languages (in fact, I don't know of any where it isn't).
You should apply the same logic here as you do in JavaScript.

Calling a webservice async

Long post.. sorry
I've been reading up on this and tried back and forth with different solutions for a couple of days now but I can't find the most obvious choice for my predicament.
About my situation: I am presenting to the user a page that contains a couple of different repeaters showing some info based on the results of a couple of web service calls. I'd like to have the data brought in with an UpdatePanel (which would query the result table once every two or three seconds until it found results), so I'd actually like to render the page first and then show the data when it is "ready".
The page asks a controller for the info to render, and the controller checks a result table to see if there's anything to be found. If the specific data is not found, it calls a method GetData() in WebServiceName.cs. GetData does not return anything but is supposed to start an async operation that gets the data from the web service. The controller returns null and the UpdatePanel waits for the next query.
When that operation is complete, it'll store the data in its relevant place in the DB, where the controller will find it the next time the page asks for it.
The solution I have in place now is to fire up another thread. I will host the page on a shared web server and I don't know if this will cause any problems.
So the current code, which resides on page.aspx:
    Thread t = new Thread(new ThreadStart(CreateService));
    t.Start();
}

void CreateService()
{
    ServiceName serviceName = new ServiceName(user, "12345", "MOVING", "Apartment", "5100", "0", "72", "Bill", "rate_total", "1", "103", "serviceHost", "password");
}
At first I thought the solution was to use Begin[Method] and End[Method], but these don't seem to have been generated. I thought this seemed like a good solution, so I was a little frustrated when they didn't show up. Is there a chance I might have missed a checkbox or something when adding the web references?
I do not want to use [Method]Async since, from what I've understood, this stops the page from rendering until [Method]Completed gets called.
The call I'm going to make is not CPU-intensive; I'm just waiting on a web service sitting on a slow server. From what I understood from this article, http://msdn.microsoft.com/en-us/magazine/cc164128.aspx, making the thread pool bigger is not an option, as this will actually impair performance instead (since I can't throw in a mountain of hardware).
What do you think is the best solution for my current situation? I don't really like the current one (only by gut feeling, but anyway).
Thanks for reading this awfully long post.
Interesting. Until your question, I wasn't aware that VS changed from using Begin/End to Async/Completed when adding web references. I assumed that they would also include Begin/End, but apparently they did not.
You state "GetData does not return anything but is supposed to start an async operation that gets the data from the webservice," so I'm assuming that GetData actually blocks until the "async operation" completes. Otherwise, you could just call it synchronously.
Anyway, there are easy ways to get this working (asynchronous delegates, etc), but they consume a thread for each async operation, which doesn't scale.
You are correct that Async/Completed will block an asynchronous page. (side note: I believe that they will not block a synchronous page - but I've never tried that - so if you're using a non-async page, then you could try that). The method by which they "block" the asynchronous page is wrapped up in SynchronizationContext; in particular, each asynchronous page has a pending operation count which is incremented by Async and decremented after Completed.
You should be able to fake out this count (note: I haven't tried this either ;) ). Just substitute the default SynchronizationContext, which ignores the count:
var oldSyncContext = SynchronizationContext.Current;
try
{
    SynchronizationContext.SetSynchronizationContext(new SynchronizationContext());
    var serviceName = new ServiceName(..);
    // Note: MyMethodCompleted will be invoked in a ThreadPool thread
    // but WITHOUT an associated ASP.NET page, so some global state
    // might be missing. Be careful with what code goes in there...
    serviceName.MethodCompleted += MyMethodCompleted;
    serviceName.MethodAsync(..);
}
finally
{
    SynchronizationContext.SetSynchronizationContext(oldSyncContext);
}
I wrote a class that handles the temporary replacement of SynchronizationContext.Current as part of the Nito.Async library. Using that class simplifies the code to:
using (new ScopedSynchronizationContext(new SynchronizationContext()))
{
    var serviceName = new ServiceName(..);
    // Note: MyMethodCompleted will be invoked in a ThreadPool thread
    // but WITHOUT an associated ASP.NET page, so some global state
    // might be missing. Be careful with what code goes in there...
    serviceName.MethodCompleted += MyMethodCompleted;
    serviceName.MethodAsync(..);
}
This solution does not consume a thread that just waits for the operation to complete. It just registers a callback and keeps the connection open until the response arrives.
You can do this:
var action = new Action(CreateService);
action.BeginInvoke(action.EndInvoke, action);
or use ThreadPool.QueueUserWorkItem.
If using a Thread, make sure to set IsBackground=true.
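For example, a quick sketch of both options using the CreateService method from the question:
// Option 1: let the thread pool run the call (pool threads are background threads by default).
ThreadPool.QueueUserWorkItem(_ => CreateService());

// Option 2: a dedicated thread, marked as background so it can't keep the process alive.
var worker = new Thread(CreateService) { IsBackground = true };
worker.Start();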
There's a great post about fire and forget threads at http://consultingblogs.emc.com/jonathangeorge/archive/2009/09/10/make-methods-fire-and-forget-with-postsharp.aspx
Try using the settings below in your web service:
[WebMethod]
[SoapDocumentMethod(OneWay = true)]
void MyAsyncMethod(parameters)
{
}
But be careful if you use impersonation; we had problems on our side.
I'd encourage a different approach - one that doesn't use update panels. Update panels require an entire page to be loaded, and transferred over the wire - you only want the contents for a single control.
Consider doing a slightly more customized & optimized approach, using the MVC platform. Your data flow could look like:
Have the original request to your web page spawn a thread that goes out and warms your data.
Have a "skeleton" page returned to your client
In said page, have some JavaScript that periodically calls your server asking for the data.
Using MVC, have a controller action that returns a partial view, which is limited to just the control you're interested in (a minimal sketch follows below).
This will reduce your server load (can have a backoff algorithm), reduce the amount of info sent over the wire, and still give a great experience to the client.
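To make that last step concrete, a minimal sketch (controller, action, cache lookup, and partial view names are all hypothetical):
// Controller action returning just the fragment the page polls for.
public class WarmDataController : Controller
{
    public ActionResult Latest()
    {
        var model = ResultCache.TryGet();          // hypothetical lookup into the warmed cache
        if (model == null)
            return new HttpStatusCodeResult(204);  // nothing ready yet; the client polls again later
        return PartialView("_DataControl", model); // renders only the control you care about
    }
}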

ASMX webservices with Silverlight Async confusion

I have a Silverlight 4 web app that needs to communicate with a server by accessing the ASMX web service on the server.
I have a list (yes, the collection) of objects that I need to send (one by one) as a parameter to the service. However, looping through the list and running the method(objecttosend); call will not work, because I need to send them one after another, and Silverlight seems to only support async calls (presumably to not lock up the interface - makes sense).
So I tried this:
public void SendNextPart()
{
    if (partsToSend.Count > 0)
    {
        Part thisPart = partsToSend.Dequeue();
        fuWS.createPartCompleted += new EventHandler<System.ComponentModel.AsyncCompletedEventArgs>(fuWS_createPartCompleted);
        fuWS.createPartAsync(thisPart);
    }
}

Queue<Part> partsToSend = new Queue<Part>();

void fuWS_createPartCompleted(object sender, System.ComponentModel.AsyncCompletedEventArgs e)
{
    SendNextPart();
}
Which, as far as I understand it, will check whether the queue has parts to send, then run the web service (called fuWS) method and remove that part from partsToSend. Once it gets the completed event, it should then run the SendNextPart method again and send the next part.
However, what is happening (I picked this up by watching HttpWatch) is that it sends the first part, then after that it sends 2 parts at once, and then after that more and more, all at once. It is almost as if it is receiving the completed event before it has actually sent to the server and run the method successfully.
Please help, this is bugging the hell out of me, and it completely breaks what I need to do :'(
I don't see the SendNextBuffer method that you're calling in the web service callback event handler. But in any case, at best your code has a race condition. If the web service completes and returns before the partsToSend.RemoveAt line is executed (theoretically possible) then you could be making the next request before you've removed the one you just sent.
So first, you should check to make sure you've included all the code in your example unless you meant for SendNextBuffer to say SendNextPart.
Secondly, you should move the partsToSend.RemoveAt line before the web service call.
Finally, you should probably change the partsToSend list into a Queue<Part> (first in, first out) or Stack<Part> (last in, first out) instead since that is what you're using it as.
Ok, so after using Debug.WriteLine, I realized that I was being an idiot.
Check out this line:
fuWS.createPartCompleted += new EventHandler<System.ComponentModel.AsyncCompletedEventArgs>(fuWS_createPartCompleted);
What this was doing was adding a new event handler every time it had to send a new part. So sending the second part now fired two callbacks, the third even more, and so on, increasing exponentially.
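For reference, the corrected shape is to hook the event exactly once (for example in the constructor of whatever class owns fuWS; the class name below is made up) and leave SendNextPart to just dequeue and send:
public MyUploader()   // hypothetical containing class
{
    // Subscribe exactly once, up front.
    fuWS.createPartCompleted += fuWS_createPartCompleted;
}

public void SendNextPart()
{
    if (partsToSend.Count > 0)
    {
        Part thisPart = partsToSend.Dequeue();
        fuWS.createPartAsync(thisPart);   // no re-subscription here
    }
}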

Multiple asynchronous method calls to method while in a loop

I have spent a whole day trying various ways using 'AddOnPreRenderCompleteAsync' and 'RegisterAsyncTask' but no success so far.
I succeeded in making the call to the DB asynchronous using 'BeginExecuteReader' and 'EndExecuteReader', but that is missing the point. The async handling should not be for the call to the DB, which in my case is fast; it should be for what comes afterwards, during the 'while' loop, while calling an external web service.
I think the simplified pseudo code will explain best:
(Note: the connection string is using 'MultipleActiveResultSets')
private void MyFunction()
{
    // "Select ID, UserName from MyTable"
    // Open connection to DB
    ExecuteReader();
    if (DR.HasRows)
    {
        while (DR.Read())
        {
            // Call external web-service
            // and get current Temperature of each UserName - DR["UserName"].ToString()

            // Update my local DB:
            // Update MyTable set Temperature = ValueFromWebService where UserName = DR["UserName"];
            CmdUpdate.ExecuteNonQuery();
        }
        // Close connection etc
    }
}
Accessing the DB is fast. Getting the returned result from the external web service is slow, and that at least should be handled asynchronously.
If each call to the web service takes just 1 second, assuming I have only 100 users it will take minimum 100 seconds for the DB update to complete, which obviously is not an option.
There eventually should be thousands of users (currently only 2).
Currently everything works, just very synchronously :)
Thoughts to myself:
Maybe my way of approaching this is wrong?
Maybe the entire process should be called asynchronously?
Many thanx
Have you considered spinning this whole thing off into its own thread?
What really is your concern?
Avoiding the long task blocking your application?
If so, you can use a thread (see BackgroundWorker).
Processing several calls to the web service in parallel to speed up the whole thing?
If so, maybe the web service can be called asynchronously, providing a callback. You could also use a ThreadPool or Tasks, but you'll have to manage to wait for all your calls or tasks to complete before proceeding to the DB update, as in the sketch below.
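As a rough illustration of that last point, the overall shape could be something like this (DR is the data reader from the pseudo code above; GetTemperature and UpdateTemperature are hypothetical wrappers around the web service call and the DB update):
// 1. Read the rows first and close the reader/connection.
var users = new List<string>();
while (DR.Read())
    users.Add(DR["UserName"].ToString());

// 2. One task per user; each task makes the slow external call.
var temperatures = new ConcurrentDictionary<string, double>();
var tasks = users.Select(name => Task.Factory.StartNew(() =>
{
    temperatures[name] = GetTemperature(name);   // hypothetical wrapper around the web service call
})).ToArray();

// 3. Wait for every call to finish, then run the DB updates in one pass.
Task.WaitAll(tasks);
foreach (var pair in temperatures)
    UpdateTemperature(pair.Key, pair.Value);     // hypothetical DB update helper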
You should keep the database connection open for as short of a time as possible. Therefore, don't do stuff while iterating through a DataReader. Most application developers prefer to put their actual database access code on a separate layer, and in a case like this, you would return a DataTable or a typed collection to the calling code. Furthermore, if you are updating the same table you are reading from, this could result in locks.
How many users will be executing this method at once, and how often does it need to be refreshed? Are you sure you need to do this from inside the web app? You may consider using a singleton for this, in which case spinning off a couple of worker threads is totally appropriate even if it's in the web app. Another thing to consider is using a Windows Service, which I think would be more appropriate for periodically updating data via a web service, since it doesn't even have anything to do with the current user's session.
I'd say, create a thread for each web request and do something like this:
Extra functions:
int privCompleteThreads = 0;
int OpenThreads = 0;

int CompleteThreads
{
    get { return privCompleteThreads; }
    set { privCompleteThreads = value; CheckDoneOperations(); }
}

void CheckDoneOperations()
{
    if (CompleteThreads == OpenThreads)
    {
        // done!
    }
}
In the main program:
foreach (time i need to open a request)
{
    OpenThreads = OpenThreads + 1;
    // Create thread here
}
Inside the threaded function:
// do your other stuff here
// do this when the operation is done:
CompleteThreads = CompleteThreads + 1;
Now, I'm not sure how reliable this approach would be; it's up to you. But a normal web request shouldn't take a second - your browser doesn't take a second loading this page, does it? Mine loads it as fast as I can hit F5. It's just opening a stream; you could also try opening the web request once and reusing the same instance over and over, and see if that speeds it up at all.
