Avoid network throttling when making HTTP requests - C#

I have a very specific problem when making HTTP web requests. To be more specific, the application makes requests 24/7 and updates a table in my database. Since other users are making requests as well, I've run into a situation where, when I make parallel web requests (a Parallel.For loop combined with a ConcurrentBag to speed things up), the other users experience a huge slowdown in the website's performance. At some point the website becomes very slow and unresponsive when both the users and the application are making requests...
So now my question is the following:
How can I limit the number of web requests the application makes at any given moment?
For example, if there are 10000 ports available through which the app can make a web request, I want to be able to tell the application to use, let's say, 10-15 threads at a time to make the requests, and at the same time not slow the website down for users, so that there is no network throttling.
I read a few articles and some people suggested using SemaphoreSlim, but I have no idea how to pair it up with my web request, which looks like the following:
private string MakeHTTPWebRequest(string requestXML)
{
    var request = (HttpWebRequest)WebRequest.Create("https://api.ebay.com/ws/api.dll");
    string GetItemTransactionXMLRequest = null;
    byte[] bytes = null;
    bytes = System.Text.Encoding.ASCII.GetBytes(requestXML);
    ServicePointManager.DefaultConnectionLimit = 9999;
    ServicePointManager.Expect100Continue = false;
    request.Method = "POST";
    request.ContentType = "application/xml";
    request.Accept = "application/xml";
    request.Proxy = null;
    Stream requestStream = request.GetRequestStream();
    requestStream.Write(bytes, 0, bytes.Length);
    requestStream.Close();
    using (var response = (HttpWebResponse)request.GetResponse())
    {
        if (response.StatusCode == HttpStatusCode.OK)
        {
            Stream responseStream = response.GetResponseStream();
            string responseStr = new StreamReader(responseStream).ReadToEnd();
            responseStream.Flush();
            responseStream.Close();
            return responseStr;
        }
        return null;
    }
}
This is how I do it currently:
Parallel.For(0, somelength, i => List.Add(MakeHTTPWebRequest("Some xml request here")));
The method above gives me terrible network throttling. How can I do this in a manner where the application knows when it is causing network throttling, so that it can reduce the number of calls, or wait while a user makes a request and then continue its own requests?
At the same time this raises another question: how can I set the timeout on this web request to unlimited / xxx minutes, so that the app can wait until others are done with their requests and then continue fetching results from the API...
Can someone help me out with this?

You're setting some global variables (the ServicePointManager properties) every time you make an HTTP request. I'd recommend setting them only once, at application startup.
"I want to be able to tell the application to use, let's say, 10-15 threads at a time to make the requests"
The quickest fix would be to pass a ParallelOptions parameter to Parallel.For and set MaxDegreeOfParallelism to 10-15.
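For example, a minimal sketch that keeps your current Parallel.For shape (the results collection is a placeholder for whatever you are adding to now):
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Run at most 10 requests at any one time.
var results = new ConcurrentBag<string>();
var options = new ParallelOptions { MaxDegreeOfParallelism = 10 };

Parallel.For(0, somelength, options, i =>
{
    results.Add(MakeHTTPWebRequest("Some xml request here"));
});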
I would also recommend making the code asynchronous, since this is an I/O-bound operation. That's where you would use SemaphoreSlim for throttling.
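A rough sketch of the SemaphoreSlim approach, assuming the request method is converted to async first (MakeHTTPWebRequestAsync is a hypothetical async version of your method built on GetRequestStreamAsync/GetResponseAsync):
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

// Allow at most 10 requests in flight at any moment.
private static readonly SemaphoreSlim _throttle = new SemaphoreSlim(10);

private async Task<string> MakeThrottledRequestAsync(string requestXML)
{
    await _throttle.WaitAsync();
    try
    {
        // Your existing request logic, rewritten with the *Async APIs, goes here.
        return await MakeHTTPWebRequestAsync(requestXML);
    }
    finally
    {
        _throttle.Release();
    }
}

// Usage: start everything, then await the lot.
// var tasks = Enumerable.Range(0, somelength)
//     .Select(i => MakeThrottledRequestAsync("Some xml request here"));
// string[] results = await Task.WhenAll(tasks);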
"How can I do this in a manner where the application knows when it is causing network throttling, so that it can reduce the number of calls"
That's a much harder problem. You'd have to measure your response times and feed them into a routine that establishes a "normal" response time, then start throttling when response times grow too large. This assumes your application is throttled in roughly the same way a user would be.
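A very rough sketch of that idea, building on the throttled method above (the moving-average weighting, the 2x threshold and the 5-second back-off are all arbitrary placeholders, and the running average is not synchronized):
using System.Diagnostics;

private static double _avgMs = -1; // running average response time

private async Task<string> MakeAdaptiveRequestAsync(string requestXML)
{
    var sw = Stopwatch.StartNew();
    string result = await MakeThrottledRequestAsync(requestXML);
    sw.Stop();

    double ms = sw.Elapsed.TotalMilliseconds;
    _avgMs = _avgMs < 0 ? ms : (_avgMs * 0.9) + (ms * 0.1); // exponential moving average

    if (ms > _avgMs * 2)
    {
        // Responses are getting unusually slow; back off before the next request.
        await Task.Delay(TimeSpan.FromSeconds(5));
    }

    return result;
}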

Related

.Net C# RESTSharp 10 Minute Timeout

I have embedded a browser control into a .NET form and compiled it as a Windows executable. The browser control displays our HTML5 image viewer. The application opens sockets so it can listen for "push" requests from various servers. This allows images to be pushed to individual users' desktops.
When an incoming image push request arrives, the application calls a REST service using RESTSharp to generate a token for the viewer to use to display the image.
As long as requests keep arriving consistently, everything works great. If there is a lull (10 minutes seems to be the time frame), then the RESTSharp request times out. It is almost as though a new instance of the RESTSharp artifacts reuses the old ones in an attempted .NET optimization.
Here is the RESTSharp code I am using:
private async Task<string> postJsonDataToUrl(string lpPostData) {
    IRestClient client = new RestClient(string.Format("{0}:{1}", MstrScsUrlBase, MintScsUrlPort));
    IRestRequest request = new RestRequest(string.Format("{0}{1}{2}", MstrScsUrlContextRoot, MstrScsUrlPath, SCS_GENERATE_TOKEN_URL_PATH));
    request.Timeout = 5000;
    request.ReadWriteTimeout = 5000;
    request.AddParameter("application/json", lpPostData, ParameterType.RequestBody);
    IRestResponse response = await postResultAsync(client, request);
    return response.Content;
} // postJsonDataToUrl

private static Task<IRestResponse> postResultAsync(IRestClient client, IRestRequest request) {
    return client.ExecutePostTaskAsync(request);
} // PostResultAsync
This is the line where the timeout occurs:
IRestResponse response = await postResultAsync(client, request);
I have tried rewriting this using .NET's HttpWebRequest and I get the same problem.
If I lengthen the RESTSharp timeouts, I am able to make calls to the server (using a different client) while the application is "timing out", so I know the server isn't the issue.
The initial version of the code did not have the await/async call structure - that was added in an attempt to get more information about the problem.
I am not getting any errors other than the REST timeout.
I have had limited success with forcing a garbage collection with this call:
GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced);
Any thoughts?
It is possible you are hitting the connection limit for .NET apps, as described in the MS docs:
"By default, an application using the HttpWebRequest class uses a maximum of two persistent connections to a given server, but you can set the maximum number of connections on a per-application basis."
(https://learn.microsoft.com/en-us/dotnet/framework/network-programming/managing-connections)
Closing the connections should help, or you might be able to increase that limit; how to do that is also covered in the doc.
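For example, you can raise the limit once at application startup instead of per request (20 here is just a placeholder; pick a value that suits your server):
// Set once, before any HttpWebRequest is created.
System.Net.ServicePointManager.DefaultConnectionLimit = 20;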
I ended up putting
GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced);
in a timer that fires every 2 minutes. This completely solved my issue.
This is very surprising to me, since my HttpWebRequest code was wrapped in "using" statements, so the resources should have been released properly. I can only conclude that .NET was optimizing the use of the class and trying to reuse a stale instance rather than letting me create a new one from scratch.
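For reference, the timer is just a System.Threading.Timer along these lines (a sketch of the workaround, not something I'd reach for as a first resort):
// Force a full collection every two minutes to work around the stale-connection behaviour.
private static readonly System.Threading.Timer _gcTimer = new System.Threading.Timer(
    _ => GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced),
    null,
    TimeSpan.FromMinutes(2),
    TimeSpan.FromMinutes(2));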
A newer way of doing things, with a recent version of RestSharp:
var body = @"{ ""key"": ""value"" }";
// HTTP package
var request = new RestRequest("https://localhost:5001/api/networkdevices", Method.Put);
request.AddHeader("Content-Type", "application/json");
request.AddHeader("Keep-Alive", ""); // setting "timeout=120" here will work as well
request.Timeout = 120;
request.AddBody(body);
// HTTP call
var client = new RestClient();
RestResponse response = await client.ExecuteAsync(request);
Console.WriteLine(response.Content);

Async web request with timeout for web errors [duplicate]

This question already has answers here:
Timeout behaviour in HttpWebRequest.GetResponse() vs GetResponseAsync()
(2 answers)
Closed 7 years ago.
I've been going at this all day. I am creating web requests using
public async static Task<string> FetchString(string Url)
{
    HttpWebRequest Request = (HttpWebRequest)WebRequest.Create(Url);
    Request.Proxy = null;
    WebResponse Response = await Request.GetResponseAsync();
    Stream DataStream = Response.GetResponseStream();
    if (DataStream == null) return String.Empty;
    StreamReader DataReader = new StreamReader(DataStream);
    return await DataReader.ReadToEndAsync();
}
which works great. The problem, though, is that it sometimes hangs on an HTTP 504 Gateway Timeout. Using Request.Timeout (or any of its three variants) does not time the method out when it hangs on the 504 (edit: Timeout doesn't apply to async methods, great). To combat this, I tried to create a timer that would kill the thread the request was running on, but had no luck with that, though it felt like a workable concept.
How would I be able to asynchronously get the contents of a URL as a string, while still being able to time the request out after, say, five seconds?
In my research I found that this is a duplicate question (sorry, I don't know how this works):
Timeout behaviour in HttpWebRequest.GetResponse() vs GetResponseAsync()
Timeout does not apply to asynchronous HttpWebRequest requests. To quote the docs:
"The Timeout property has no effect on asynchronous requests."
I recommend you use HttpClient instead, which was designed with asynchronous requests in mind.
- Stephen Cleary
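Following that advice, a minimal HttpClient version of the FetchString method above looks roughly like this (a sketch; GetStringAsync throws for non-success status codes such as 504, and the five-second timeout surfaces as a TaskCanceledException):
using System;
using System.Net.Http;
using System.Threading.Tasks;

private static readonly HttpClient _client = new HttpClient
{
    // Applies to the whole operation, including waiting for the response.
    Timeout = TimeSpan.FromSeconds(5)
};

public static async Task<string> FetchString(string Url)
{
    try
    {
        // Throws HttpRequestException for non-success codes such as 504.
        return await _client.GetStringAsync(Url);
    }
    catch (TaskCanceledException)
    {
        // The five-second timeout elapsed.
        return String.Empty;
    }
}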

How do I resend a "failed" WebRequest?

I send POST and GET WebRequests that should tolerate longer periods of the internet being down. The idea is to queue the failed (timed-out) web requests and try to resend them periodically until the internet is up again and all queued WebRequests have been sent.
However, it seems that I cannot just reuse the old WebRequest. Do I need to set it up again?
IAsyncResult result = request.BeginGetResponse (o => {
    callCallback (o);
}, state);
When the request has just been set up using:
var request = HttpWebRequest.Create (String.Format (@"{0}/{1}", baseServiceUrl, path));
request.Method = "GET";
request.ContentType = "application/xml; charset=UTF-8";
request.Headers.Add ("Authority", account.Session);
return request;
it works fine. But after a timeout (and request.Abort();), calling BeginGetResponse() on the same web request just freezes.
You cannot call BeginGetResponse() multiple times on the same HttpWebRequest. I'm not sure whether that's supported on .NET, but it's not possible with Mono's implementation (and not something that would be easy to change).
See also Can I reuse HttpWebRequest without disconnecting from the server? - the underlying network connection will be reused.
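In other words, the retry loop has to build a brand-new request object for every attempt, for example by moving the setup into a small factory method (a sketch reusing the identifiers from the question):
private HttpWebRequest CreateRequest(string path)
{
    // A fresh HttpWebRequest per attempt; the underlying connection is pooled and reused automatically.
    var request = (HttpWebRequest)WebRequest.Create(
        String.Format(@"{0}/{1}", baseServiceUrl, path));
    request.Method = "GET";
    request.ContentType = "application/xml; charset=UTF-8";
    request.Headers.Add("Authority", account.Session);
    return request;
}

// On each retry:
// var request = CreateRequest(path);
// IAsyncResult result = request.BeginGetResponse(o => callCallback(o), state);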

Weird Timeout on 4th HttpWebRequest in Multi-Threaded Application

I have a multi-threaded console application acting as a server. The server spawns a new thread every time a new client connects to the TcpListener:
// code copied from http://www.switchonthecode.com/tutorials/csharp-tutorial-simple-threaded-tcp-server
// blocks until a client has connected to the server
TcpClient client = tcpListener.AcceptTcpClient();
// create a thread to handle communication with the connected client
Thread clientThread = new Thread(new ParameterizedThreadStart(HandleClientComm));
clientThread.Start(client);
The thread makes a number of HttpWebRequests using the following code:
public static HttpWebResponse HttpGet(string pRequestURI, string pArgs)
{
    string requestURI = string.Format("{0}?{1}", pRequestURI, pArgs);
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(requestURI);
    request.Method = "GET";
    request.ContentType = "application/x-www-form-urlencoded";
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    return response;
}
The problem is that I get a timeout on the FOURTH request. It is really weird and I cannot get the hang of it. The code worked just fine when it was in a single-threaded application. I am also making sure to close the response stream using:
response.Close();
The requestURI is correct, because I tried copying and pasting it into my browser. Actually, it doesn't matter what the 4th request is (I've tried different ones); I always get a timeout.
I guess it may be something related to thread limits, but I really don't know how to solve it. Any suggestions would be greatly appreciated.
Thank you.
After a lot of blood, sweat and tears, I managed to solve it.
It was a stupid mistake, really; it turns out there was one place where I was not closing the request.
For some (unknown) reason this does not affect the requests when they are made from a web application, but when issuing the requests from a console application, timeout problems are encountered.
Thank you @arx for your help - much appreciated.
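One way to make that harder to miss is to wrap every response in a using block at the call site, so it is released even when an exception is thrown (a sketch against the HttpGet helper above):
// Dispose the response so the connection returns to the pool
// (by default only two connections per host are allowed).
using (HttpWebResponse response = HttpGet(pRequestURI, pArgs))
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    string body = reader.ReadToEnd();
    // ... use body ...
}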
request.ServicePoint.CloseConnectionGroup(request.ConnectionGroupName);
This line of code resolved my issue.
Thanks

HttpWebRequest.GetRequestStream: What does it do?

Code example:
HttpWebRequest request =
    (HttpWebRequest)HttpWebRequest.Create("http://some.existing.url");
request.Method = "POST";
request.ContentType = "text/xml";
Byte[] documentBytes = GetDocumentBytes();
using (Stream requestStream = request.GetRequestStream())
{
    requestStream.Write(documentBytes, 0, documentBytes.Length);
    requestStream.Flush();
    requestStream.Close();
}
When I do request.GetRequestStream(), there's nothing to send in the request yet. From the name of the method and the IntelliSense it shows ("Get System.IO.Stream to use to write request data"), nothing indicates that this line of code will connect to the remote server.
But it seems it does...
Can anyone explain to me what HttpWebRequest.GetRequestStream() actually does?
Thanks for the enlightenment.
Getting the request stream does not trigger the POST, but closing the stream does. POST data is sent to the server in the following way:
1. A connection is opened to the host.
2. The request line and headers are sent.
3. The POST data is written.
4. The client waits for a response.
The act of flushing and closing the stream is the final step, and once the input stream is closed (i.e. the client has sent what it needs to the server), the server can return a response.
You use GetRequestStream() to synchronously obtain a reference to the upload stream. It is only after you have finished writing to the stream that the actual request is sent.
However, I would suggest that you use the BeginGetRequestStream method instead of GetRequestStream. BeginGetRequestStream runs asynchronously and doesn't block the current thread while the stream is being obtained. You pass a callback and a context to BeginGetRequestStream. In the callback, you can call EndGetRequestStream() to finally grab a reference to the stream, and then repeat the writing steps listed above (for synchronous behavior). Example:
context.Request.BeginGetRequestStream(new AsyncCallback(Foo), context);

public void Foo(IAsyncResult asyncResult)
{
    Context context = (Context)asyncResult.AsyncState;
    try
    {
        HttpWebRequest request = context.Request;
        using (var requestStream = request.EndGetRequestStream(asyncResult))
        using (var writer = new StreamWriter(requestStream))
        {
            // write to the request stream
        }
        request.BeginGetResponse(new AsyncCallback(ProcessResponse), context);
    }
    catch (WebException)
    {
        // handle or log the failure here
    }
}
Be very careful with BeginGetRequestStream. It never times out, so you must add additional logic to your program to recover from situations where GetRequestStream would have thrown a timeout exception.
In general, threads are cheap. The async Begin/End methods of HttpWebRequest are only worth using if you will have 10,000 or more concurrent requests, because implementing timeouts is very tricky and error-prone. In general, using BeginGetRequestStream is premature optimization unless you need significant performance gains.
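If you do go the Begin/End route, one way to add that timeout logic is to register a wait on the IAsyncResult's wait handle and abort the request when it fires (a sketch following the documented HttpWebRequest async pattern; the 30-second value is a placeholder):
using System.Threading;

IAsyncResult asyncResult = request.BeginGetRequestStream(new AsyncCallback(Foo), context);

// Abort the request if no request stream has been obtained within 30 seconds;
// EndGetRequestStream will then throw a WebException in the callback.
ThreadPool.RegisterWaitForSingleObject(
    asyncResult.AsyncWaitHandle,
    (state, timedOut) => { if (timedOut) ((HttpWebRequest)state).Abort(); },
    request,
    30 * 1000,
    true);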
