Weird Timeout on 4th HttpWebRequest in Multi-Threaded Application - c#

I have a multi-threaded console application acting as a server. The server spawns a new thread every time a new client connects to the TcpListener:
//code copied from http://www.switchonthecode.com/tutorials/csharp-tutorial-simple-threaded-tcp-server
//blocks until a client has connected to the server
TcpClient client = tcpListener.AcceptTcpClient();
//create a thread to handle communication with connected client
Thread clientThread = new Thread(new ParameterizedThreadStart(HandleClientComm));
clientThread.Start(client);
The thread makes a number of HttpWebRequests using the following code:
public static HttpWebResponse HttpGet(string pRequestURI, string pArgs)
{
    string requestURI = string.Format("{0}?{1}", pRequestURI, pArgs);
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(requestURI);
    request.Method = "GET";
    request.ContentType = "application/x-www-form-urlencoded";
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    return response;
}
The problem is that I get a timeout on the FOURTH request. It is really weird and I can't figure it out. The code worked just fine in a single-threaded application. I am also making sure to close the response stream using:
response.Close();
The requestURI is correct, because I tried copying and pasting it into my browser. It doesn't actually matter what the fourth request is (I've tried different ones); I always get a timeout.
I guess it may be related to thread limits, but I really don't know how to solve it. Any suggestions would be greatly appreciated.
Thank you.

After a lot of blood, sweat and tears, I managed to solve it.
It was a stupid mistake, really: it turns out there was one place where I was not closing the request.
For some (unknown) reason this doesn't affect requests made from a web application, but when the requests are issued from a console application, timeout problems appear.
Thank you #arx for your help - much appreciated.
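In code form, the fix is to make sure every response is disposed before the next request is made. A minimal sketch of the same helper, rewritten so the caller cannot forget to close anything (the helper returns the body instead of the live response; names are illustrative):

```csharp
// Sketch: same idea as HttpGet above, but with deterministic disposal.
// Every undisposed HttpWebResponse holds one of the (by default) two
// persistent connections per host, so later requests block until timeout.
using System;
using System.IO;
using System.Net;

public static class Http
{
    public static string HttpGetBody(string pRequestURI, string pArgs)
    {
        string requestURI = string.Format("{0}?{1}", pRequestURI, pArgs);
        var request = (HttpWebRequest)WebRequest.Create(requestURI);
        request.Method = "GET";

        // 'using' releases the connection even if reading throws
        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            return reader.ReadToEnd();
        }
    }
}
```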

request.ServicePoint.CloseConnectionGroup(request.ConnectionGroupName);
This line of code resolved my issue
Thanks


.Net C# RESTSharp 10 Minute Timeout

I have embedded a browser control into a .NET form and compiled it as a Windows executable. The browser control displays our HTML5 image viewer. The application opens sockets so it can listen for "push" requests from various servers. This allows images to be pushed to individual users' desktops.
When an incoming image push request comes in, the application calls a REST service using RESTSharp to generate a token for the viewer to use to display the image.
As long as the requests are consistently arriving, everything works great. If there is a lull (10 minutes seems to be the time frame), then the RESTSharp request times out. It is almost as though newly created RESTSharp objects are reusing the old ones in an attempted .NET optimization.
Here is the RESTSharp code I am using:
private async Task<string> postJsonDataToUrl(string lpPostData)
{
    IRestClient client = new RestClient(string.Format("{0}:{1}", MstrScsUrlBase, MintScsUrlPort));
    IRestRequest request = new RestRequest(string.Format("{0}{1}{2}", MstrScsUrlContextRoot, MstrScsUrlPath, SCS_GENERATE_TOKEN_URL_PATH));
    request.Timeout = 5000;
    request.ReadWriteTimeout = 5000;
    request.AddParameter("application/json", lpPostData, ParameterType.RequestBody);
    IRestResponse response = await postResultAsync(client, request);
    return response.Content;
} // postJsonDataToUrl

private static Task<IRestResponse> postResultAsync(IRestClient client, IRestRequest request)
{
    return client.ExecutePostTaskAsync(request);
} // postResultAsync
This is the line where the time out occurs:
IRestResponse response = await postResultAsync(client, request);
I have tried rewriting this using .Net's HttpWebRequest and I get the same problem.
If I lengthen the RESTSharp timeouts, I am able to make calls to the server (using a different client) while the application is "timing out" so I know the server isn't the issue.
The initial version of the code did not have the await async call structure - that was added as an attempt to get more information on the problem.
I am not getting any errors other than the REST timeout.
I have had limited success with forcing a Garbage Collection with this call:
GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced);
Any thoughts?
It is possible you are hitting the connection limit for .NET apps, as described in the Microsoft docs:
"By default, an application using the HttpWebRequest class uses a maximum of two persistent connections to a given server, but you can set the maximum number of connections on a per-application basis."
(https://learn.microsoft.com/en-us/dotnet/framework/network-programming/managing-connections).
Closing the connections should help, or you might be able to increase that limit; that is also covered in the doc.
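Raising that limit is a one-line, process-wide setting; a sketch (the value 20 is illustrative, and it should run once at startup, before any requests are created):

```csharp
// Sketch: raise the default two-persistent-connections-per-server limit.
// ServicePointManager settings apply to all ServicePoints created afterwards.
using System.Net;

public static class HttpSetup
{
    public static void Configure()
    {
        ServicePointManager.DefaultConnectionLimit = 20; // illustrative value
    }
}
```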
I ended up putting
GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced);
in a timer that fired every 2 minutes. This completely solved my issue.
This is very surprising to me, since my HttpWebRequest code was wrapped in "using" statements, so the resources should have been released properly. I can only conclude that .NET was optimizing the use of the class and trying to reuse a stale instance rather than letting me create a new one from scratch.
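The timer workaround described above can be sketched like this. Note it treats the symptom (stale connections only being reclaimed at finalization), not the cause; deterministic disposal of every response is still the real fix:

```csharp
// Sketch: force a full garbage collection on a fixed interval,
// as described in the answer above. Interval is configurable.
using System;
using System.Threading;

public class GcSweeper : IDisposable
{
    private readonly Timer _timer;

    public GcSweeper(TimeSpan interval)
    {
        // Fires first after 'interval', then every 'interval' thereafter
        _timer = new Timer(
            _ => GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced),
            null, interval, interval);
    }

    public void Dispose() => _timer.Dispose();
}
```

Usage, matching the answer's two-minute cadence: `var sweeper = new GcSweeper(TimeSpan.FromMinutes(2));`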
A new way of doing things.
var body = @"{ ""key"": ""value"" }";
// HTTP package
var request = new RestRequest("https://localhost:5001/api/networkdevices", Method.Put);
request.AddHeader("Content-Type", "application/json");
request.AddHeader("Keep-Alive", "");// set "timeout=120" will work as well
request.Timeout = 120;
request.AddBody(body);
// HTTP call
var client = new RestClient();
RestResponse response = await client.ExecuteAsync(request);
Console.WriteLine(response.Content);

Avoid network throttling when making HTTP requests

I have a very specific problem when making HTTP web requests. The application makes requests 24/7 and updates my database table. Since other users are performing requests as well, I've come into a situation where, when I make parallel web requests (a parallel for loop combined with a concurrent bag to speed things up), the other users experience a huge slowdown in the website's performance. At some point the website becomes very slow and unresponsive when both users and the application are making requests...
So now my question is following:
How can I limit the number of web requests the application makes at any given moment?
For example, if there are 10,000 ports available through which the app can make a web request, I want to be able to tell the application to use, let's say, 10-15 threads at a time to make the requests, and at the same time not slow the website down for users, so that there is no network throttling.
I read a few articles, and some people suggested using SemaphoreSlim, but I have no idea how to pair it up with my web request, which looks like the following:
private string MakeHTTPWebRequest(string requestXML)
{
    var request = (HttpWebRequest)WebRequest.Create("https://api.ebay.com/ws/api.dll");
    string GetItemTransactionXMLRequest = null;
    byte[] bytes = null;
    bytes = System.Text.Encoding.ASCII.GetBytes(requestXML);
    ServicePointManager.DefaultConnectionLimit = 9999;
    ServicePointManager.Expect100Continue = false;
    request.Method = "POST";
    request.ContentType = "application/xml";
    request.Accept = "application/xml";
    request.Proxy = null;
    Stream requestStream = request.GetRequestStream();
    requestStream.Write(bytes, 0, bytes.Length);
    requestStream.Close();
    using (var response = (HttpWebResponse)request.GetResponse())
    {
        if (response.StatusCode == HttpStatusCode.OK)
        {
            Stream responseStream = response.GetResponseStream();
            string responseStr = new StreamReader(responseStream).ReadToEnd();
            responseStream.Flush();
            responseStream.Close();
            return responseStr;
        }
        return null;
    }
}
This is how I do it currently:
Parallel.For(0, somelength, i => List.Add(MakeHTTPWebRequest("Some xml request here")));
The method above gives me terrible network throttling. How can I do this in a manner where the application would know if it's causing network throttling, and either reduce the number of calls or wait while a user makes a request and then continue?
At the same time this raises another issue: how can I set the timeout on this web request to unlimited (xxx minutes), so the app can wait until others are done with their requests and then continue fetching results from the API...
Can someone help me out with this ?
You're setting some process-wide settings (the ServicePointManager properties) every time you make an HTTP request. I'd recommend setting them only once, at startup.
I want to be able to tell the application to use, let's say, 10-15 threads at a time to make the requests
The fastest fix would be to just pass a ParallelOptions parameter to Parallel.For, setting MaxDegreeOfParallelism to 10/15.
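That fix can be sketched like this, with a thread-safe ConcurrentBag replacing the plain List (which is not safe to call Add on from multiple threads). The ThrottledLoop wrapper and its names are illustrative, not the asker's code:

```csharp
// Sketch: cap Parallel.For at a fixed degree of parallelism and
// collect results in a thread-safe ConcurrentBag.
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public static class ThrottledLoop
{
    // Runs 'work' for each index with at most 'maxParallel' concurrent
    // invocations, returning all results.
    public static ConcurrentBag<T> Run<T>(int count, int maxParallel, Func<int, T> work)
    {
        var results = new ConcurrentBag<T>();
        var options = new ParallelOptions { MaxDegreeOfParallelism = maxParallel };
        Parallel.For(0, count, options, i => results.Add(work(i)));
        return results;
    }
}
```

Usage with the question's method would look like: `var results = ThrottledLoop.Run(somelength, 10, i => MakeHTTPWebRequest("Some xml request here"));`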
I would also recommend considering making the code asynchronous, since this is an I/O-bound operation. That's where you would use SemaphoreSlim for throttling.
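Pairing SemaphoreSlim with async requests might look like this sketch. FetchAllAsync and fetchAsync are hypothetical stand-ins for the question's MakeHTTPWebRequest; the semaphore admits at most 10 requests at once while the rest await their turn without blocking threads:

```csharp
// Sketch: async throttling with SemaphoreSlim (limit of 10 is illustrative).
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

public static class AsyncThrottle
{
    private static readonly SemaphoreSlim Gate = new SemaphoreSlim(10);

    public static async Task<string[]> FetchAllAsync(
        IEnumerable<string> requests, Func<string, Task<string>> fetchAsync)
    {
        var tasks = requests.Select(async req =>
        {
            await Gate.WaitAsync();   // wait for a free slot
            try { return await fetchAsync(req); }
            finally { Gate.Release(); } // free the slot even on failure
        });
        return await Task.WhenAll(tasks);
    }
}
```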
How can I do this in a manner where the application would know if it's causing network throttling and reduce the number of calls
That's a much harder problem. You'd have to measure your response times and feed them into a routine that establishes a "normal response time", and then starts throttling if the response times start getting too big. This is assuming that your app is throttled similarly to how a user would be.

How do I resend a "failed" WebRequest?

I send POST and GET WebRequests that should survive longer periods of the internet being down. The idea is to queue the failed (timed-out) web requests and retry them periodically until the internet is up again and all queued WebRequests are sent.
However, it seems that I cannot just reuse the old WebRequest. Do I need to set it up again?
IAsyncResult result = request.BeginGetResponse (o => {
callCallback (o);
}, state);
When request is just setup using:
var request = HttpWebRequest.Create (String.Format (@"{0}/{1}", baseServiceUrl, path));
request.Method = "GET";
request.ContentType = "application/xml; charset=UTF-8";
request.Headers.Add ("Authority", account.Session);
return request;
it works fine. But after a timeout (and calling request.Abort()), calling BeginGetResponse() on the same web request just freezes.
You cannot call BeginGetResponse() multiple times on the same HttpWebRequest. I'm not sure whether that's supported on .NET, but it's not possible with Mono's implementation (and not something that would be easy to change).
See also Can I reuse HttpWebRequest without disconnecting from the server? - the underlying network connection will be reused.
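So the retry loop has to rebuild the request for every attempt. A sketch, assuming a factory delegate that recreates the request the same way the question's setup code does (names here are illustrative):

```csharp
// Sketch: never reuse an aborted HttpWebRequest; build a fresh one per attempt.
using System;
using System.Net;
using System.Threading;

public static class RetryingGet
{
    public static WebResponse GetWithRetry(Func<WebRequest> createRequest,
                                           int maxAttempts, TimeSpan delay)
    {
        for (int attempt = 1; ; attempt++)
        {
            var request = createRequest(); // a brand-new request each time
            try
            {
                return request.GetResponse();
            }
            catch (WebException) when (attempt < maxAttempts)
            {
                request.Abort();
                Thread.Sleep(delay); // wait before retrying, e.g. for the network to come back
            }
            // on the final attempt the WebException propagates to the caller
        }
    }
}
```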

Http Requests Timing Out in Production but not Dev

Below is an example of a website that when requested from my local dev environment returns ok but when requested from the production server, the request times out after 15 seconds. The request headers are exactly the same. Any ideas?
http://www.dealsdirect.com.au/p/wall-mounted-fish-tank-30cm/
Here's one thing I want to point out, besides the other information I've already provided. When you call GetResponse, the object that's returned has to be disposed of ASAP. Otherwise stalling will occur; the next call will block and possibly time out, because there's a limit on the number of requests that can go through the HTTP request engine in System.Net concurrently.
// The request doesn't matter, it's not holding on to any critical resources
var request = (HttpWebRequest)WebRequest.Create(url);
// The response however is, these will eventually be reclaimed by the GC
// but you'll run into problems similar to deadlocks if you don't dispose them yourself
// when you have many of them
using (var response = request.GetResponse())
{
// Do stuff with `response` here
}
This is my old answer
This question is really hard to answer without knowing more about the specifics. There's no reason why IIS would behave like this, which leads me to conclude that the problem has to do with something your app is doing, but I know nothing about it. If you can reproduce the problem with a debugger attached, you might be able to track down where it is occurring; if you cannot, there's little I can do to help.
Are you using the ASP.NET Development Server or IIS Express in development?
If this is an issue with proxies, here's a factory method I use to set up HTTP requests where the proxy requires authentication (though I don't believe I ever received a timeout):
HttpWebRequest CreateRequest(Uri url)
{
    var request = (HttpWebRequest)WebRequest.Create(url);
    request.Timeout = 120 * 1000;
    request.CookieContainer = _cookieContainer;
    if (UseSystemWebProxy)
    {
        var proxy = WebRequest.GetSystemWebProxy();
        if (UseDefaultCredentials)
        {
            proxy.Credentials = CredentialCache.DefaultCredentials;
        }
        if (UseNetworkCredentials != null && UseNetworkCredentials.Length > 0)
        {
            var networkCredential = new NetworkCredential();
            networkCredential.UserName = UseNetworkCredentials[0];
            if (UseNetworkCredentials.Length > 1)
            {
                networkCredential.Password = UseNetworkCredentials[1];
            }
            if (UseNetworkCredentials.Length > 2)
            {
                networkCredential.Domain = UseNetworkCredentials[2];
            }
            proxy.Credentials = networkCredential;
        }
        request.Proxy = proxy;
    }
    return request;
}
Try this out Adrian and let me know how it goes.

Difficulty with BugzScout.net from behind a proxy

I'm attempting to use FogBugz's BugzScout to automatically submit unhandled application exceptions to my FogBugz On Demand account. I've written a wrapper class for it and everything appears to be just groovy - on my box. Testing the same code in the production environment, behind a proxy that requires authentication, I have had nothing but issues.
I went to work modifying the BugzScout code to get it to authenticate with the proxy, and after trying many different methods suggested by a Google search, found one that works! But now I'm getting a "connection actively refused" error from FogBugz itself, and I don't know what to do.
Here is the code where BugzScout connects via a .NET WebClient to submit a new case, with my modifications to deal with our proxy. What am I doing that would cause FogBugz to refuse my request? I've removed all non-web-client-related code from the procedure for ease of reading.
public string Submit()
{
    WebClient client = new WebClient();
    WebProxy proxy = new WebProxy();
    proxy.UseDefaultCredentials = true;
    client.Proxy = proxy;
    Byte[] response = client.DownloadData(fogBugzUrl);
    string responseText = System.Text.Encoding.UTF8.GetString(response);
    return (responseText == "") ? this.defaultMsg : responseText;
}
The URL is correct and the case is filled in properly; this has been verified.
EDIT: Additional info.
Using Fogbugz on Demand.
Using the FogBugz.net code in its entirety, with only these additions:
WebProxy proxy = new WebProxy();
proxy.UseDefaultCredentials = true;
client.Proxy = proxy;
Error occurs when attempting to connect to both https://oursite.fogbugz.com/scoutsubmit.asp and http://oursite.fogbugz.com//scoutsubmit.asp (except one says port 443, and the other port 80, obviously)
I don't know anything about web authentication so I can't tell you what kind I'm using- if you tell me where to look I'd be happy to answer that for you.
Got the fix from FogBugz - this is the appropriate network code to get through the proxy authentication and not mis-authenticate with BugzScout.
WebRequest.DefaultWebProxy.Credentials = CredentialCache.DefaultNetworkCredentials;
WebRequest request = WebRequest.Create(fogBugzUrl);
request.ContentType = "application/x-www-form-urlencoded";
request.Method = "POST";
request.Proxy.Credentials = CredentialCache.DefaultNetworkCredentials;
Stream requestStream = request.GetRequestStream();
requestStream.Write(bytes, 0, bytes.Length);
requestStream.Close();
Is your fogbugzUrl using HTTP Basic Authentication? Is it SSL (hosted on On Demand?)
The connection actively refused message would be coming from the web server itself, not really FogBugz.
Can you post the HTTP Status Code?
One thing to note: if you are using FogBugz On Demand, you HAVE to use the https:// URL (not the http one).
