HTTP Requests Timing Out in Production but not Dev - C#

Below is an example of a website that returns OK when requested from my local dev environment, but when requested from the production server the request times out after 15 seconds. The request headers are exactly the same. Any ideas?
http://www.dealsdirect.com.au/p/wall-mounted-fish-tank-30cm/

Here's one thing I want to point out besides what I've already provided. When you call GetResponse, the object that's returned has to be disposed of as soon as possible. Otherwise stalling will occur, or rather the next call will block and possibly time out, because there's a limit to the number of requests that can go through the HTTP request engine in System.Net concurrently.
// The request doesn't matter, it's not holding on to any critical resources
var request = (HttpWebRequest)WebRequest.Create(url);

// The response however is; these will eventually be reclaimed by the GC,
// but you'll run into problems similar to deadlocks if you don't dispose them yourself
// when you have many of them
using (var response = request.GetResponse())
{
    // Do stuff with `response` here
}
This is my old answer:
This question is really hard to answer without knowing more about the specifics. There's no reason why IIS would behave like this, which leads me to conclude that the problem has to do with something your app is doing, but I know nothing about it. If you can reproduce the problem with a debugger attached you might be able to track down where the problem is occurring, but if you cannot do this then there's little I can do to help.
Are you using the ASP.NET Development Server or IIS Express in development?
If this is an issue with proxies, here's a factory method I use to set up HTTP requests where the proxy requires some authentication (though I don't believe I ever received a timeout):
HttpWebRequest CreateRequest(Uri url)
{
    var request = (HttpWebRequest)WebRequest.Create(url);
    request.Timeout = 120 * 1000;
    request.CookieContainer = _cookieContainer;
    if (UseSystemWebProxy)
    {
        var proxy = WebRequest.GetSystemWebProxy();
        if (UseDefaultCredentials)
        {
            proxy.Credentials = CredentialCache.DefaultCredentials;
        }
        if (UseNetworkCredentials != null
            && UseNetworkCredentials.Length > 0)
        {
            var networkCredential = new NetworkCredential();
            networkCredential.UserName = UseNetworkCredentials[0];
            if (UseNetworkCredentials.Length > 1)
            {
                networkCredential.Password = UseNetworkCredentials[1];
            }
            if (UseNetworkCredentials.Length > 2)
            {
                networkCredential.Domain = UseNetworkCredentials[2];
            }
            proxy.Credentials = networkCredential;
        }
        request.Proxy = proxy;
    }
    return request;
}
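For completeness, here's a minimal sketch of how the factory might be used together with the disposal advice above (the fields and flags referenced by CreateRequest are assumed to exist on the surrounding class; System.IO and System.Net are assumed to be imported):
var request = CreateRequest(new Uri("http://www.dealsdirect.com.au/p/wall-mounted-fish-tank-30cm/"));
// Dispose both the response and its stream so the connection is released promptly
using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    string html = reader.ReadToEnd();
    Console.WriteLine("{0}: {1} characters", response.StatusCode, html.Length);
}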
Try this out Adrian and let me know how it goes.

Related

CredentialCache and HttpWebRequest in .NET

I'm having difficulty understanding how web requests and credentials work in .NET.
I have the following method that is executing a request to a SOAP endpoint.
public WebResponse Execute(NetworkCredential Credentials)
{
    HttpWebRequest webRequest = CreateWebRequest(_url, actionUrl);
    webRequest.AllowAutoRedirect = true;
    webRequest.PreAuthenticate = true;
    webRequest.Credentials = Credentials;
    // Add headers and content into the request stream, then begin the request
    // (asyncResult comes from webRequest.BeginGetResponse, elided in the original)
    asyncResult.AsyncWaitHandle.WaitOne();
    return webRequest.EndGetResponse(asyncResult);
}
It works well enough. However, users of my application may have to execute dozens of these requests in short succession, and hundreds over the course of the day. My goal is to implement some of the recommendations I've read about, namely using an HttpClient that exists for the entire lifetime of the application, and using the CredentialCache to store the user's credentials instead of passing them in to each request.
So I'm starting with the CredentialCache.
Following the example linked above, I instantiated a CredentialCache and added my network credentials to it. Note that this is the exact same NetworkCredential object that I was passing to the request earlier.
NetworkCredential credential = new NetworkCredential();
credential.UserName = Name;
credential.Password = PW;
Program.CredCache.Add(new Uri("https://blah.com/"), "Basic", credential);
Then, when I go to send my HTTP request, I get the credentials from the cache, instead of providing the credentials object directly.
public WebResponse Execute(NetworkCredential Credentials)
{
    HttpWebRequest webRequest = CreateWebRequest(_url, actionUrl);
    webRequest.AllowAutoRedirect = true;
    webRequest.PreAuthenticate = true;
    webRequest.Credentials = Program.CredCache;
    // more stuff down here
}
The request now fails with a 401 error.
I am failing to understand this on several levels. For starters, I can't seem to figure out whether or not the CredentialCache has indeed passed the proper credentials to the HTTP request.
I suspect part of the problem might be that I'm trying to use "Basic" authentication. I tried "Digest" as well just as a shot in the dark (which also failed), but I'm sure there must be a way to see what kind of authentication the server is expecting.
I have been combing StackOverflow and MDN trying to read up as much as possible about this, but I am having a difficult time separating the relevant information from the outdated and irrelevant information.
If anyone can help me solve the problem that would be most appreciated, but even links to proper educational resources would be helpful.
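To see what kind of authentication the server is expecting, one option is to inspect the WWW-Authenticate header that comes back with the 401 response. A rough sketch (the endpoint URL is a placeholder):
try
{
    var probe = (HttpWebRequest)WebRequest.Create("https://blah.com/some-endpoint");
    probe.Method = "GET";
    using (probe.GetResponse()) { }
}
catch (WebException ex)
{
    // On a 401 the challenge header names the scheme(s), e.g. "Basic realm=..." or "Negotiate"
    var response = ex.Response as HttpWebResponse;
    if (response != null)
    {
        Console.WriteLine((int)response.StatusCode);
        Console.WriteLine(response.Headers["WWW-Authenticate"]);
    }
}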
According to the documentation, the CredentialCache class is only for SMTP; it explicitly says that it is not for HTTP or FTP requests:
https://msdn.microsoft.com/en-us/library/system.net.credentialcache(v=vs.110).aspx
Which directly contradicts the info in the later API docs. Which one is right I don't know.
You could try using the HttpClient class. The methods and return types are different, so you would need to tweak your other code a bit, but it would look a bit like this:
public class CommsClass
{
    private HttpClient _httpClient;

    public CommsClass(NetworkCredential credentials)
    {
        var handler = new HttpClientHandler { Credentials = credentials };
        _httpClient = new HttpClient(handler);
    }

    public HttpResponseMessage Execute(HttpRequestMessage message)
    {
        var response = _httpClient.SendAsync(message).Result;
        return response;
    }
}
You can do all sorts of other things with the handler and the client, like setting request headers or a base address.
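For example (a sketch only; the credentials, relative URI and header values are made up for illustration), the handler can carry a CredentialCache and the client can hold a base address and default headers:
var cache = new CredentialCache();
cache.Add(new Uri("https://blah.com/"), "Basic", new NetworkCredential("user", "password"));

var handler = new HttpClientHandler
{
    Credentials = cache,      // a CredentialCache is an ICredentials, so it plugs in here
    PreAuthenticate = true
};

var client = new HttpClient(handler)
{
    BaseAddress = new Uri("https://blah.com/")
};
client.DefaultRequestHeaders.Add("Accept", "text/xml");

// Relative URIs now resolve against BaseAddress
var response = client.GetAsync("some/endpoint").Result;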

FTP requests in Windows Service timing out

I have a Windows service that periodically uploads a file to an FTP server. I have set,
ServicePointManager.DefaultConnectionLimit = 100;
and I have,
public void MyMethod(string url,
                     string userName,
                     string password)
{
    try
    {
        var request = (FtpWebRequest)WebRequest.Create(url);
        request.Timeout = GetFtpTimeoutInMinutes() * 60 * 1000;
        request.Credentials = new NetworkCredential(userName, password);
        request.Method = method; // e.g. WebRequestMethods.Ftp.UploadFile, set elsewhere
        request.UseBinary = true;
        request.UsePassive = true;
        request.KeepAlive = false;
        request.GetResponse();
    }
    catch (Exception ex)
    {
        _logger.Log(ex);
    }
}
It works fine for the first 100 or so requests, but after that I continuously get,
System.Net.WebException: The operation has timed out.
at System.Net.FtpWebRequest.CheckError()
at System.Net.FtpWebRequest.GetResponse()
Why is this happening?
Update: I am thinking of moving to HTTP.
request.GetResponse();
Your snippet is very incomplete, but surely the problem starts here. GetResponse() returns an FtpWebResponse object; it is crucial that your code calls Close() on it or disposes it, to ensure that the FTP connection is closed.
If you forget then you will have 100 active connections after downloading 100 files and trip the ServicePointManager.DefaultConnectionLimit. The garbage collector doesn't run often enough to keep you out of trouble. After which any subsequent downloads will die with a timeout. Fix:
using (var response = request.GetResponse())
{
    // Do stuff
    // ...
}
This is the reliable way to ensure that the connection will be closed, even if the code falls over with an exception. If you still have trouble then diagnose with, say, SysInternals' TcpView utility. It is a good way to see active connections.
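Applied to the method from the question, the fix might look like this (a sketch; GetFtpTimeoutInMinutes, method and _logger come from the original code):
public void MyMethod(string url, string userName, string password)
{
    try
    {
        var request = (FtpWebRequest)WebRequest.Create(url);
        request.Timeout = GetFtpTimeoutInMinutes() * 60 * 1000;
        request.Credentials = new NetworkCredential(userName, password);
        request.Method = method;
        request.UseBinary = true;
        request.UsePassive = true;
        request.KeepAlive = false;

        // Dispose the response so the FTP connection is released immediately
        using (var response = (FtpWebResponse)request.GetResponse())
        {
            // optionally inspect response.StatusDescription here
        }
    }
    catch (Exception ex)
    {
        _logger.Log(ex);
    }
}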

What is the best way to download files via HTTP using .NET?

In one of my applications I'm using the WebClient class to download files from a web server. Depending on the web server, the application sometimes downloads millions of documents. It seems that when there are a lot of documents, performance-wise the WebClient doesn't scale up well.
Also, it seems that the WebClient doesn't immediately close the connection it opened to the web server, even after it has successfully downloaded the particular document.
I would like to know what other alternatives I have.
Update:
Also I noticed that for each and every download the WebClient performs the authentication handshake. I was expecting to see this handshake once, since my application only communicates with a single web server. Shouldn't subsequent calls of the WebClient reuse the authentication session?
Update: My application also calls some web service methods, and for these web service calls it seems the authentication session is reused. Also, I'm using WCF to communicate with the web service.
I think you can still use WebClient. However, you are better off using a "using" block as good practice. This will make sure that the object is closed and disposed of:
using (WebClient client = new WebClient())
{
    // Use client
}
I bet you are running into the default limit of 2 connections per server. Try running this code at the beginning of your program:
// Note: constructing a ConnectionManagementElement by hand has no effect on its own;
// the line that actually raises the limit is DefaultConnectionLimit.
var cme = new System.Net.Configuration.ConnectionManagementElement();
cme.MaxConnection = 100;
System.Net.ServicePointManager.DefaultConnectionLimit = 100;
I have noticed the same behavior with the session in another project I was working on. To solve this "problem" I used a static CookieContainer (since the client's session is recognized by a value saved in a cookie).
public static class SomeStatics
{
    private static CookieContainer _cookieContainer;

    public static CookieContainer CookieContainer
    {
        get
        {
            if (_cookieContainer == null)
            {
                _cookieContainer = new CookieContainer();
            }
            return _cookieContainer;
        }
    }
}

public class CookieAwareWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        if (request is HttpWebRequest)
        {
            (request as HttpWebRequest).CookieContainer = SomeStatics.CookieContainer;
            (request as HttpWebRequest).KeepAlive = false;
        }
        return request;
    }
}

// now the code that will download the file
using (WebClient client = new CookieAwareWebClient())
{
    client.DownloadFile("http://address.com/somefile.pdf", @"c:\temp\savedfile.pdf");
}
The code is just an example, inspired by Using CookieContainer with WebClient class and C# get rid of Connection header in WebClient.
The above code will close your connection immediately after the file is downloaded, and it will reuse the authentication.
WebClient is probably the best option. It doesn't close the connection straight away for a reason: so it can use the same connection again, without having to open a new one. If you find that it's not reusing the connection as expected, that's usually because you're not closing the response from the previous request:
var request = WebRequest.Create("...");
// populate parameters
var response = request.GetResponse();
// process response
response.Close(); // <-- make sure you don't forget this!
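As another alternative (not from the original answers, and it requires .NET 4.5 or later), a single long-lived HttpClient can be shared across all downloads so that connections and the authenticated session are reused. A minimal sketch, with the field and method names invented for illustration:
// One HttpClient (and handler with credentials) shared for the application's lifetime
static readonly HttpClient Downloader = new HttpClient(
    new HttpClientHandler
    {
        Credentials = CredentialCache.DefaultCredentials,
        PreAuthenticate = true
    });

static async Task DownloadAsync(Uri url, string path)
{
    using (var stream = await Downloader.GetStreamAsync(url))
    using (var file = File.Create(path))
    {
        await stream.CopyToAsync(file);
    }
}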

How to perform a fast web request in C#

I have an HTTP-based API which I potentially need to call many times. The problem is that I can't get the request to take less than about 20 seconds, though the same request made through a browser is near-instantaneous. The following code illustrates how I have implemented it so far.
WebRequest r = HttpWebRequest.Create("https://example.com/http/command?param=blabla");
var response = r.GetResponse();
One solution would be to make an asynchronous request but I would like to know why it takes so long and if I can avoid it. I have also tried using the WebClient class but I suspect it uses a WebRequest internally.
Update:
Running the following code took about 40 seconds in Release Mode (measured with Stopwatch):
WebRequest g = HttpWebRequest.Create("http://www.google.com");
var response = g.GetResponse();
I'm working at a university where there might be different things in the network configuration affecting the performance, but the direct use of the browser illustrates that it should be near instant.
Update 2:
I uploaded the code to a remote machine and it worked fine so the conclusion must be that the .NET code does something extra compared to the browser or it has problems resolving the address through the university network (proxy issues or something?!).
This problem is similar to another post on StackOverflow:
Stackoverflow-2519655 (HttpWebRequest is extremely slow)
Most of the time the problem is the Proxy property. You should set this property to null, otherwise the object will attempt to search for an appropriate proxy server to use before going directly to the source. Note: this property is turned on by default, so you have to explicitly tell the object not to perform this proxy search.
request.Proxy = null;
using (var response = (HttpWebResponse)request.GetResponse())
{
}
I was having the 30 second delay on 'first' attempt - JamesR's reference to the other post mentioning setting proxy to null solved it instantly!
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(_site.url);
request.Proxy = null; // <-- this is the good stuff
...
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Does your site have an invalid SSL cert? Try adding this
ServicePointManager.ServerCertificateValidationCallback =
    new System.Net.Security.RemoteCertificateValidationCallback(AlwaysAccept);

// ... somewhere AlwaysAccept is defined as:
using System.Security.Cryptography.X509Certificates;
using System.Net.Security;

public bool AlwaysAccept(object sender, X509Certificate certification, X509Chain chain, SslPolicyErrors sslPolicyErrors)
{
    return true;
}
You don't close your response (the object returned by GetResponse). As soon as you hit the number of allowed connections, you have to wait for the earlier ones to time out. Try
using (var response = g.GetResponse())
{
    // do stuff with your response
}

Test if a website is alive from a C# application

I am looking for the best way to test if a website is alive from a C# application.
Background
My application consists of a WinForms UI, a backend WCF service, and a website to publish content to the UI and other consumers. To prevent the situation where the UI starts up and fails to work properly because the WCF service or the website is down, I have added an app startup check to ensure that everything is alive.
The application is being written in C#, .NET 3.5, Visual Studio 2008
Current Solution
Currently I am making a web request to a test page on the website that will in turn test the web site and then display a result.
WebRequest request = WebRequest.Create("http://localhost/myContentSite/test.aspx");
WebResponse response = request.GetResponse();
I am assuming that if there are no exceptions thrown during this call then all is well and the UI can start.
Question
Is this the simplest, right way, or is there some other sneaky call that I don't know about in C#, or a better way to do it?
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
if (response == null || response.StatusCode != HttpStatusCode.OK)
    return false; // treat the site as down
As #Yanga mentioned, HttpClient is probably the more common way to do this now.
HttpClient client = new HttpClient();
var checkingResponse = await client.GetAsync(url);
if (!checkingResponse.IsSuccessStatusCode)
{
    return false;
}
While using WebResponse, please make sure that you close the response stream (i.e. call .Close()), otherwise it will hang after a certain number of repeated executions.
Eg
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(sURL);
HttpWebResponse response = (HttpWebResponse)req.GetResponse();
// your code here
response.Close();
from the NDiagnostics project on CodePlex...
public override bool WebSiteIsAvailable(string Url)
{
    string Message = string.Empty;
    HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(Url);

    // Set the credentials to the current user account
    request.Credentials = System.Net.CredentialCache.DefaultCredentials;
    request.Method = "GET";

    try
    {
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            // Do nothing; we're only testing to see if we can get the response
        }
    }
    catch (WebException ex)
    {
        Message += ((Message.Length > 0) ? "\n" : "") + ex.Message;
    }

    return (Message.Length == 0);
}
Today we can update the answers using HttpClient:
HttpClient Client = new HttpClient();
var result = await Client.GetAsync("https://stackoverflow.com");
int StatusCode = (int)result.StatusCode;
Assuming the WCF service and the website live in the same web app, you can use a "Status" WebService that returns the application status. You probably want to do some of the following:
Test that the database is up and running (good connection string, service is up, etc...)
Test that the website is working (how exactly depends on the website)
Test that WCF is working (how exactly depends on your implementation)
Added bonus: you can return some versioning info on the service if you need to support different releases in the future.
Then, you create a client in the WinForms app for the WebService. If the WS is not responding (i.e. you get some exception on invoke) then the website is down (like a "general error").
If the WS responds, you can parse the result and make sure that everything works, or if something is broken, return more information.
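A minimal sketch of that idea, assuming a hypothetical status page (the address and the "OK" response format are invented for illustration):
bool BackendIsHealthy()
{
    try
    {
        var request = (HttpWebRequest)WebRequest.Create("http://localhost/myContentSite/status.aspx");
        request.Timeout = 5000;
        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            // Suppose the page returns "OK;<version>" when the database, website and WCF checks pass
            string body = reader.ReadToEnd();
            return response.StatusCode == HttpStatusCode.OK && body.StartsWith("OK");
        }
    }
    catch (WebException)
    {
        return false; // treat any failure as "backend is down"
    }
}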
You'll want to check the status code for OK (status 200).
Solution from: How do you check if a website is online in C#?
var ping = new System.Net.NetworkInformation.Ping();
// Ping.Send expects a host name or IP address, not a URL
var result = ping.Send("www.stackoverflow.com");
if (result.Status != System.Net.NetworkInformation.IPStatus.Success)
    return;
