Is it possible for an HTTP server to receive requests out of order even if they are sent sequentially? - c#

(This discussion might not be specific to C#...)
I have a C# method SendMultipleRequests that sends HTTP POST request 10 times sequentially.
Is it possible for the server to receive requests out of order?
If my understanding is correct, requests sent concurrently (without await) could reach the server out of order. In the example below, however, the client waits for the response to each request before sending the next one, so the server should receive the requests in order.
public async Task SendRequest(int i)
{
    // definition of endpoint is omitted in this example
    var content = new StringContent($"I am {i}-th request");
    await HttpClient.PostAsync(endpoint, content);
}

public async Task SendMultipleRequests()
{
    for (int i = 0; i < 10; i++)
    {
        await SendRequest(i);
    }
}

With await, your app waits for the task returned by PostAsync to finish before it issues the next request. See the docs for PostAsync, which say: "This operation will not block. The returned Task<TResult> object will complete after the whole response (including content) is read." Using await therefore means you only issue the next request after you have read the content of the previous response.
If you remove the await, your code will start all ten tasks at once and work on them in some undefined order, so the server will see the requests in an unspecified order. This can be further exacerbated by the requests taking different routes through the internet, some slower than others. If you do remove the await, capture the returned tasks into a list so you can use something like await Task.WhenAll(list) to wait for them all to complete (unless you really want fire-and-forget, in which case you can assign them to the discard: _ = client.PostAsync(...); keeping the tasks, however, lets you discover and handle any exceptions that arose).
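For illustration, a minimal sketch of the concurrent variant described above, reusing the HttpClient and endpoint members assumed in the question:

public async Task SendMultipleRequestsConcurrently()
{
    var tasks = new List<Task<HttpResponseMessage>>();
    for (int i = 0; i < 10; i++)
    {
        var content = new StringContent($"I am {i}-th request");
        // no await here: all ten requests are started immediately
        tasks.Add(HttpClient.PostAsync(endpoint, content));
    }

    // observe completion (and any exceptions) of all requests;
    // the server may receive them in any order
    await Task.WhenAll(tasks);
}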

Related

HttpClient get request cancellation, in case the operation takes longer than x minutes

I am writing an azure function which synchronizes data every 15 minutes.
I am targeting .NET Core 3.1 for the Azure Function and I am using HttpClient to send a GET request to their API.
var response = await client.GetStringAsync()
It usually takes about 20 seconds to get the results; however, sometimes their server does not process the request correctly, is just busy, takes too long, or returns no data. I've tested in Postman: I get status 200 OK, but it keeps loading for many minutes and nothing happens.
However, if I just cancel the operation and try again, it responds fine.
Now my question is: how can I keep track of how long the client.GetStringAsync() call takes, and cancel the operation if it runs longer than, say, 2 minutes?
If you are using .NET 5, you can use a CancellationTokenSource with a 'cancel after' and pass the token to client.GetStringAsync(), like this:
var cts = new CancellationTokenSource();
cts.CancelAfter(120000); // cancel after 2 minutes

try
{
    var response = await client.GetStringAsync(uri, cts.Token);
}
catch (TaskCanceledException)
{
    Console.WriteLine("\nTask cancelled due to timeout.\n");
}
Wrapping the operation in a try-catch block, as above, lets you see whether the operation was cancelled by the token.
Reference: Cancel async tasks after a period of time (C#)
UPDATE
On .NET Core, you can also specify a timeout on your HttpClient. Note, however, that this timeout applies to all requests sent with that instance of HttpClient. If you wish to do it that way, you can do the following:
client.Timeout = TimeSpan.FromMinutes(2);

try
{
    var response = await client.GetStringAsync(uri);
}
catch (TaskCanceledException)
{
    Console.WriteLine("\nRequest cancelled due to timeout.\n");
}

Wrapping both CPU-bound/IO-bound long-running code into (async) Task

Consider the simple MVC5 controller:
public class DocumentsController {
    // ctor code is omitted

    [HttpPost, Route("api/documents/request/stamp={stamp}")]
    public ActionResult RequestDocuments(string stamp) {
        var documents = this.DocumentsRequestService.RequestByStamp(stamp);
        return new JsonResult(documents);
    }
}
The DocumentsRequestService does these things internally:
it sends a request to a dedicated MSMQ queue (let's call it M) AND synchronously waits for an incoming message on M's response queue:
using (var requestMessage = new Message()) {
    requestMessage.Body = documentStamp;
    requestMessage.Recoverable = true;
    requestMessage.Label = "request";
    requestMessage.ResponseQueue = this.requestedDocumentsResponseQueue;
    requestMessage.Formatter = new XmlMessageFormatter(new Type[] { typeof(String) });

    // send request
    this.requestedDocumentsQueue.Send(requestMessage);

    // synchronously wait for response
    var responseMessage = this.requestedDocumentsResponseQueue.Receive();

    if (responseMessage.Label.EndsWith("success")) {
        return new DocumentsRequestResult(
            success: true,
            matches: parseMatchesList(responseMessage)
        );
    }

    return new DocumentsRequestResult(
        success: false,
        matches: Enumerable.Empty<DocumentsRequestMatch>()
    );
}
the consumer (a Windows Service) of that message makes a specific API call. By 'specific' I mean that we use a third-party means to do it. This call is synchronous and quite long. When the processing ends, the consumer sends a response message to the requesting message's response queue.
when the response arrives at M's response queue, it is time to parse it and return the results to the controller.
From the end user's perspective this task should be blocking, or at least it should look like blocking.
As far as I understand running a Task makes use of parallelization. Whereas using the async-await pair makes the running task asynchronous. It could be helpful if several tasks would run in parallel.
Is it reasonable/possible to incorporate with Tasking/Asynchrony in my case? If yes, then where do I start?
The "asynchrony" of a network call is transparent to the caller. It doesn't matter to the caller whether the implementation is synchronous or asynchronous. Put another way, from a client's perspective, it's always asynchronous.
For example, the HTTP client couldn't care less if RequestDocuments is synchronous or asynchronous; either way, the HTTP client will send a request and receive a response some time later (i.e., asynchronously).
Similarly, the HTTP web server doesn't care whether the Win32 service is implemented synchronously or asynchronously. It just knows that it puts a message on a queue and some time later (i.e., asynchronously) it gets a response message from the queue.
As far as I understand running a Task makes use of parallelization. Whereas using the async-await pair makes the running task asynchronous.
Sort of. Task can be used for either asynchronous or parallel code, a fact that has caused much confusion. However, Task Parallel Library constructs such as Parallel and PLINQ are firmly in the parallel (non-asynchronous) world.
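A minimal contrast between the two worlds, purely for illustration (CrunchNumbers, items, httpClient and uri are placeholder names, not code from the question):

// Parallel (TPL/PLINQ world): CPU-bound work spread across thread-pool threads
Parallel.ForEach(items, item => CrunchNumbers(item));

// Asynchronous: I/O-bound work; no thread is blocked while the request is in flight
var body = await httpClient.GetStringAsync(uri);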
It could be helpful if several tasks would run in parallel.
I believe "concurrently" is the appropriate term here.
First, note that ASP.NET gives you a considerable amount of concurrency for free. If you want to make each request internally concurrent, then you can do so fairly easily via Task.WhenAll. For example, you can change your DocumentsRequestService call to be asynchronous (assuming your message queue API supports async calls):
using (var requestMessage = new Message()) {
    ...
    // send request
    await this.requestedDocumentsQueue.SendAsync(requestMessage);

    // asynchronously wait for response
    var responseMessage = await this.requestedDocumentsResponseQueue.ReceiveAsync();
    ...
}
Then you can call it multiple times simultaneously from a single controller action as such:
public async Task<ActionResult> RequestDocuments(string stamp1, string stamp2) {
    var task1 = this.DocumentsRequestService.RequestByStampAsync(stamp1);
    var task2 = this.DocumentsRequestService.RequestByStampAsync(stamp2);
    var documents = await Task.WhenAll(task1, task2);
    return new JsonResult(documents);
}
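One caveat about the SendAsync/ReceiveAsync calls above: the System.Messaging MessageQueue class does not actually ship Task-based methods, but its BeginReceive/EndReceive pair can be wrapped with Task.Factory.FromAsync. A minimal sketch of such a wrapper (Send has no APM counterpart, so it would stay synchronous):

using System.Messaging;
using System.Threading.Tasks;

static class MessageQueueExtensions
{
    // wraps the APM-style BeginReceive/EndReceive pair in an awaitable Task
    public static Task<Message> ReceiveAsync(this MessageQueue queue)
    {
        return Task.Factory.FromAsync(queue.BeginReceive(), queue.EndReceive);
    }
}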

Sending multiple requests in any order but no more than 1 request/second

In my C# WPF application, when a user presses a button I need to send 10-20 requests to a server. They can be sent in an arbitrary order, but there have to be at least 10 of them because the server returns the results paginated.
Each client (my C# app is a client) has an API key, and the server can only handle 1 request per second per client, otherwise it returns an error.
How can I send those requests properly? Do I necessarily need async and await? Can I send them in parallel, and how? Doesn't async in this case mean that they'll be sent in parallel?
And how can I ensure that only 1 request per second is sent? I gather it's not good to mix threading (which would be Thread.Sleep(1000) in my case) with async/await.
So, you could create a bunch of tasks that stagger the job by a second each time.
Something like:
List<Uri> uris = new List<Uri>(); // fill with uris

var tasks = uris.Select(async (u, i) =>
{
    await Task.Delay(TimeSpan.FromSeconds(i));
    using (var wc = new WebClient())
    {
        return await wc.DownloadStringTaskAsync(u);
    }
});

var results = await Task.WhenAll(tasks);
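The same staggering idea works with HttpClient in place of the older WebClient; a rough sketch, assuming a single shared client instance:

var client = new HttpClient();
List<Uri> uris = new List<Uri>(); // fill with uris

var tasks = uris.Select(async (u, i) =>
{
    // stagger the start of each request by i seconds => at most 1 request/second
    await Task.Delay(TimeSpan.FromSeconds(i));
    return await client.GetStringAsync(u);
});

var results = await Task.WhenAll(tasks);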

How do I remove the delay between HTTP Requests when using Asynchronous actions in ASP.NET?

I am using HttpClient to send a GET request to a server inside of a while loop
while (cycle < maxcycle)
{
    var searchParameters = new ASearchParameters
    {
        Page = cycle++,
        id = getid
    };

    var searchResponse = await Client.SearchAsync(searchParameters);
}
and the SearchAsync contains
public async Task<AuctionResponse> SearchAsync()
{
    var uriString = "Contains a https url with parameters";
    var searchResponseMessage = await HttpClient.GetAsync(uriString);
    return await Deserialize<AuctionResponse>(searchResponseMessage);
}
The thing is, after every request there is a delay before the next request starts.
You can see this in the Fiddler timeline, and Fiddler also shows a "Tunnel To" example.com:443 entry before every request.
Question: why is there a delay, and how can I remove it?
I see two things that are happening here. First, depending on the deserializer, it may take a while to translate your response back into an object. You might want to time that step and see if that's not the majority of your time spent. Second, the SSL handshake (the origin of your "tunnel to") does require a round trip to establish the SSL channel. I thought HttpClient sent a Keep-Alive header by default, but you may want to see if it is A) not being sent or B) being rejected. If you are re-establishing an SSL channel for each request, that could easily take on the order of a hundred ms all by itself (depending upon the server/network load).
If you're using Fiddler, you can enable the ability to inspect SSL traffic to see what the actual request/response headers are.
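If the handshake is indeed being repeated, the usual fix is to reuse one HttpClient (and therefore its pooled connections) for all requests instead of creating a new one per call; a sketch (GetPageAsync and url are illustrative names, not from the question):

// a single long-lived client keeps connections alive, so the TLS
// handshake ("Tunnel To ...") happens once instead of once per request
private static readonly HttpClient Client = new HttpClient();

public async Task<string> GetPageAsync(string url)
{
    return await Client.GetStringAsync(url);
}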
I believe you see this delay for a couple of reasons. Based on the code you provided, all other actions besides the request itself take up some fraction of the time between requests. So deserializing the response will add to a delay.
Also, the delay might be tied to the amount of data that is being returned and processed further down the stack. I tried to recreate the scenario you describe in your question with the following code:
const int MaxNumberOfCycles = 10;

static void Main()
{
    Start().Wait();
}

static async Task Start()
{
    var client = new Client();
    var cycle = 0;
    while (cycle < MaxNumberOfCycles)
    {
        var response = await client.SearchAsync(cycle++);
    }
}

class Client
{
    public async Task<HttpResponseMessage> SearchAsync(int n)
    {
        // parameter 'n' used to vary web service response data
        var url = ... // url removed for privacy
        using (var client = new HttpClient())
        {
            // the response is returned undisposed so the caller can still read it
            return await client.GetAsync(url);
        }
    }
}
With small response sizes I saw no delay between requests. As response sizes increased I began to see slightly longer delays. Here's a screenshot for a series of requests returning 1MB responses:
One thing I noticed about your scenario is that your transfer activity graph shows a solid black line at the end of each request. This line indicates the "time to first byte", meaning that response processing did not even start until the very end of your request.
Another issue you might consider is that Fiddler itself may be causing these delays. I noticed that your responses aren't being streamed by Fiddler, which probably impacts the results. You can read more about response streaming in Fiddler.
I hope some of this information helps...

await for a response matching a request sequence number

I am trying to implement a client library for an application-level network protocol, composed of binary requests/responses over a TCP connection, and I would like the client to be fully asynchronous, relying on the async/await constructs of C# 5.
Each request sent on the NetworkStream contains an associated application-level sequence number. The response to a request must carry the same sequence number, so that the response can be matched with the original request.
When I issue a request R1, the response to R1 can of course come at any time in the future, and responses to other requests may arrive on the line before the actual response to R1.
What I would like to do in the client code of the library is something as simple stupid as
var resp = await SendSomeRequestAsync(req);
SendSomeRequestAsync sends the request asynchronously on the line (I got that part right) and then somehow awaits the associated response (matching the sequence number that was sent in the request), something like this inside SendSomeRequestAsync:
await _ns.WriteAsync(rawBytes, 0, rawBytes.Length); // _ns is a NetworkStream
var resp = await GetResponseMatchingSequenceNumberAsync(sequenceNumber);
I have a loop that is launched when client start connection which is reading incoming responses on the connection asynchronously :
while (true)
{
    Response rsp = await ReadNextResponseAsync(_ns);
    DispatchReceivedResponse(rsp);
}
I can't figure out how to implement GetResponseMatchingSequenceNumberAsync, or whether I am approaching this totally wrong.
Hope my question is clear enough.
Thanks
I've run into exactly this issue, and the following seemed pretty clean:
Create an IDictionary<int, TaskCompletionSource<Response>> and store a TaskCompletionSource<Response> per pending sequence number in it. When a response comes back, find the matching TaskCompletionSource and complete it. I make no claims about the thread safety of this code: the dictionary should probably be a ConcurrentDictionary, or at least accessed under a lock of some sort.
public Task<Response> GetResponseMatchingSequenceNumberAsync(int sequenceNumber)
{
    var tcs = new TaskCompletionSource<Response>();
    pendingTasksDictionary.Add(sequenceNumber, tcs);
    return tcs.Task;
}

private void ResponseHandler(int sequenceNumber, Response response)
{
    var pendingTcs = pendingTasksDictionary[sequenceNumber];
    // remove the entry so completed requests don't accumulate
    pendingTasksDictionary.Remove(sequenceNumber);
    pendingTcs.SetResult(response);
}
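One detail worth adding: register the TaskCompletionSource before writing the request bytes, otherwise a very fast response could arrive before the dictionary entry exists. A sketch using ConcurrentDictionary, following the names used above (Request, SequenceNumber and Serialize are illustrative, not the original poster's code):

// requires System.Collections.Concurrent and System.Threading.Tasks
private readonly ConcurrentDictionary<int, TaskCompletionSource<Response>> pendingTasksDictionary =
    new ConcurrentDictionary<int, TaskCompletionSource<Response>>();

public async Task<Response> SendSomeRequestAsync(Request req)
{
    var tcs = new TaskCompletionSource<Response>();
    // register first so the read loop can always find the entry
    pendingTasksDictionary[req.SequenceNumber] = tcs;

    byte[] rawBytes = Serialize(req); // serialization omitted
    await _ns.WriteAsync(rawBytes, 0, rawBytes.Length);

    return await tcs.Task;
}

private void DispatchReceivedResponse(Response rsp)
{
    // complete and remove the pending request matching this sequence number
    if (pendingTasksDictionary.TryRemove(rsp.SequenceNumber, out var tcs))
    {
        tcs.SetResult(rsp);
    }
}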
