FTP requests in a Windows service timing out - C#

I have a Windows service that periodically uploads a file to an FTP server. I have set,
ServicePointManager.DefaultConnectionLimit = 100;
and I have,
public void MyMethod(string url, string userName, string password, string method)
{
    try
    {
        var request = (FtpWebRequest)WebRequest.Create(url);
        request.Timeout = GetFtpTimeoutInMinutes() * 60 * 1000;
        request.Credentials = new NetworkCredential(userName, password);
        request.Method = method;
        request.UseBinary = true;
        request.UsePassive = true;
        request.KeepAlive = false;
        request.GetResponse();
    }
    catch (Exception ex)
    {
        _logger.Log(ex);
    }
}
It works fine for the first 100 or so requests, but after that I continuously get:
System.Net.WebException: The operation has timed out.
at System.Net.FtpWebRequest.CheckError()
at System.Net.FtpWebRequest.GetResponse()
Why is this happening?
Update: I am considering moving to HTTP.

request.GetResponse();
Your snippet is very incomplete, but surely the problem starts here. GetResponse() returns an FtpWebResponse object; it is crucial that your code calls Close() on it or disposes it to ensure that the FTP connection is closed.
If you forget, then after downloading 100 files you will have 100 active connections and trip the ServicePointManager.DefaultConnectionLimit. The garbage collector doesn't run often enough to keep you out of trouble, and any subsequent downloads will die with a timeout. Fix:
using (var response = request.GetResponse())
{
    // Do stuff
    // ...
}
The using statement is the reliable way to ensure that the connection is closed, even if the code falls over with an exception. If you still have trouble, diagnose with, say, SysInternals' TcpView utility; it is a good way to see active connections.
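To make the fix concrete, here is a sketch of how the upload method from the question could look with the response properly disposed. `GetFtpTimeoutInMinutes()` is carried over from the question's code and assumed to exist:

```csharp
using System.IO;
using System.Net;

public void UploadFile(string url, string userName, string password, byte[] fileBytes)
{
    var request = (FtpWebRequest)WebRequest.Create(url);
    request.Timeout = GetFtpTimeoutInMinutes() * 60 * 1000;
    request.Credentials = new NetworkCredential(userName, password);
    request.Method = WebRequestMethods.Ftp.UploadFile;
    request.UseBinary = true;
    request.UsePassive = true;
    request.KeepAlive = false;

    // Write the payload to the request stream; the using block closes it,
    // which tells the server the upload is complete.
    using (Stream requestStream = request.GetRequestStream())
    {
        requestStream.Write(fileBytes, 0, fileBytes.Length);
    }

    // Disposing the response closes the FTP connection, so it no longer
    // counts against ServicePointManager.DefaultConnectionLimit.
    using (var response = (FtpWebResponse)request.GetResponse())
    {
        // Inspect response.StatusDescription here if needed.
    }
}
```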

Related

How do I resend a "failed" WebRequest?

I send POST and GET WebRequests that should tolerate longer periods of the internet being down. The idea is to queue the failed (timed-out) requests and resend them periodically until the internet is up again and all queued WebRequests have been sent.
However, it seems that I cannot just reuse the old WebRequest. Do I need to set it up again?
IAsyncResult result = request.BeginGetResponse(o => {
    callCallback(o);
}, state);
When the request is just set up using:
var request = HttpWebRequest.Create(String.Format("{0}/{1}", baseServiceUrl, path));
request.Method = "GET";
request.ContentType = "application/xml; charset=UTF-8";
request.Headers.Add("Authority", account.Session);
return request;
it works fine. But after a timeout (and request.Abort();), calling BeginGetResponse() on the same WebRequest just freezes.
You cannot call BeginGetResponse() multiple times on the same HttpWebRequest. I'm not sure whether that's supported on .NET, but it's not possible with Mono's implementation (and not something that'd be easy to change).
See also Can I reuse HttpWebRequest without disconnecting from the server? - the underlying network connection will be reused.
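A minimal sketch of the resend approach, assuming you keep the request parameters (rather than the request object) in your retry queue and construct a fresh HttpWebRequest for every attempt; the setup mirrors the question's code, and `onFailed` is a hypothetical callback for re-queueing:

```csharp
using System;
using System.Net;

HttpWebRequest CreateRequest(string baseServiceUrl, string path, string session)
{
    var request = (HttpWebRequest)WebRequest.Create(
        String.Format("{0}/{1}", baseServiceUrl, path));
    request.Method = "GET";
    request.ContentType = "application/xml; charset=UTF-8";
    request.Headers.Add("Authority", session);
    return request;
}

void TrySend(string baseServiceUrl, string path, string session, Action<string> onFailed)
{
    // A brand-new request object for every attempt.
    var request = CreateRequest(baseServiceUrl, path, session);
    request.BeginGetResponse(ar =>
    {
        try
        {
            using (var response = request.EndGetResponse(ar))
            {
                // Read and process the response here.
            }
        }
        catch (WebException)
        {
            // Don't call BeginGetResponse on `request` again; re-queue the
            // parameters so a later retry builds a fresh request.
            onFailed(path);
        }
    }, null);
}
```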

Cancel HTTPWebRequest when server goes down

I have created a Windows service that calls an API (which returns JSON) using HttpWebRequest.
The API doesn't return anything until it has something to deliver, so I set the timeout quite high and let the request wait until it receives a response.
The problem is that when I turn off or disconnect the server running the API, the HttpWebRequest doesn't stop waiting, so I can't tell that the API server has gone down.
The request code:
HttpWebRequest Request = (HttpWebRequest)HttpWebRequest.Create(Url);
Request.Method = "POST";
Request.ContentType = "application/x-www-form-urlencoded";
Request.ContentLength = PostData.Length;
Request.Timeout = Timeout;
Request.KeepAlive = true;
using (StreamWriter sw = new StreamWriter(Request.GetRequestStream()))
{
    sw.Write(PostData);
}
using (HttpWebResponse Response = (HttpWebResponse)Request.GetResponse())
{
    using (StreamReader sr = new StreamReader(Response.GetResponseStream()))
    {
        ResponseText = sr.ReadToEnd();
    }
}
Is there any way to "break" the request when the requested server goes down?
I have tried using a web browser to call the API server; when I disconnect the server after a while, the browser does return an error to the page.
You could use a background worker that only checks whether the server is online. It has some disadvantages but may work fine.
It is always good to keep the requests asynchronous (See the BeginXXX methods in HttpWebRequest - http://msdn.microsoft.com/en-us/library/system.net.httpwebrequest.aspx).
Using the asynchronous APIs ensures that you are not blocked until you get a response from the server.
In addition to using the asynchronous APIs, you can have a heartbeat request (which could be just a HEAD HTTP request to a ping service on the server that returns an empty body and an HTTP 200 status) to track that the server is alive. If this request times out, the server is not alive, in which case you can cancel, or just 'forget', the pending request.
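A sketch of such a heartbeat check; the `/ping` path is an assumption and should be whatever lightweight endpoint the API actually exposes:

```csharp
using System.Net;

// Returns true if the server answers a HEAD request within timeoutMs,
// false if the request times out or the connection is refused.
bool ServerIsAlive(string baseUrl, int timeoutMs)
{
    try
    {
        var request = (HttpWebRequest)WebRequest.Create(baseUrl + "/ping");
        request.Method = "HEAD";
        request.Timeout = timeoutMs;
        using (var response = (HttpWebResponse)request.GetResponse())
        {
            return response.StatusCode == HttpStatusCode.OK;
        }
    }
    catch (WebException)
    {
        return false; // server considered down; cancel/forget the long-poll
    }
}
```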

HTTP request not responding [duplicate]

Possible Duplicate:
Adjusting HttpWebRequest Connection Timeout in C#
In my code, I am calling a live chat API to get the list of operators, as follows:
HttpWebRequest request = WebRequest.Create("url-here") as HttpWebRequest;
request.Credentials = new NetworkCredential(username, password);
using (HttpWebResponse response = request.GetResponse() as HttpWebResponse)
{
    StreamReader reader = new StreamReader(response.GetResponseStream());
    // do whatever
}
Today the live chat API went down, and because of that our site also went down: the API request kept spinning in a 'not responding' state.
How do I fix this? I do not want to keep waiting until the API responds, because my page will spin as well. Is there something like a timeout, e.g. wait for 2 seconds and then skip the live chat request?
Thanks for the help!
You can use the WebRequest.Timeout property to set how long you'd like your code to wait before timing out:
var request = WebRequest.Create("url-here") as HttpWebRequest;
request.Credentials = new NetworkCredential(username, password);
request.Timeout = 30 * 1000; // wait for 30 seconds (30,000 milliseconds)

Http Requests Timing Out in Production but not Dev

Below is an example of a website that, when requested from my local dev environment, returns OK, but when requested from the production server, the request times out after 15 seconds. The request headers are exactly the same. Any ideas?
http://www.dealsdirect.com.au/p/wall-mounted-fish-tank-30cm/
Here's one thing I want to point out besides what I've already provided. When you call GetResponse, the object that's returned has to be disposed of ASAP. Otherwise stalling will occur, or rather the next call will block and possibly time out, because there's a limit to the number of requests that can go through the HTTP request engine in System.Net concurrently.
// The request doesn't matter; it's not holding on to any critical resources.
var request = (HttpWebRequest)WebRequest.Create(url);

// The response, however, is. Undisposed responses will eventually be reclaimed
// by the GC, but you'll run into problems similar to deadlocks if you don't
// dispose them yourself when you have many of them.
using (var response = request.GetResponse())
{
    // Do stuff with `response` here
}
This is my old answer
This question is really hard to answer without knowing more specifics. There's no reason why IIS would behave like this, which leads me to conclude that the problem has to do with something your app is doing, but I know nothing about it. If you can reproduce the problem with a debugger attached, you might be able to track down where it is occurring; if you cannot do that first, there's little I can do to help.
Are you using the ASP.NET Development Server or IIS Express in development?
If this is an issue with proxies, here's a factory method I use to set up HTTP requests where the proxy requires some authentication (though I don't believe I ever received a timeout):
HttpWebRequest CreateRequest(Uri url)
{
    var request = (HttpWebRequest)WebRequest.Create(url);
    request.Timeout = 120 * 1000;
    request.CookieContainer = _cookieContainer;
    if (UseSystemWebProxy)
    {
        var proxy = WebRequest.GetSystemWebProxy();
        if (UseDefaultCredentials)
        {
            proxy.Credentials = CredentialCache.DefaultCredentials;
        }
        if (UseNetworkCredentials != null
            && UseNetworkCredentials.Length > 0)
        {
            var networkCredential = new NetworkCredential();
            networkCredential.UserName = UseNetworkCredentials[0];
            if (UseNetworkCredentials.Length > 1)
            {
                networkCredential.Password = UseNetworkCredentials[1];
            }
            if (UseNetworkCredentials.Length > 2)
            {
                networkCredential.Domain = UseNetworkCredentials[2];
            }
            proxy.Credentials = networkCredential;
        }
        request.Proxy = proxy;
    }
    return request;
}
Try this out Adrian and let me know how it goes.

C# FtpWebRequest Error 403

I have the following code to retrieve a file via FTP:
try
{
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(serverPath);
    request.KeepAlive = true;
    request.UsePassive = true;
    request.UseBinary = true;
    request.Method = WebRequestMethods.Ftp.DownloadFile;
    request.Credentials = new NetworkCredential(username, password);
    using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
    using (Stream responseStream = response.GetResponseStream())
    using (StreamReader reader = new StreamReader(responseStream))
    using (StreamWriter destination = new StreamWriter(destinationFile))
    {
        destination.Write(reader.ReadToEnd());
        destination.Flush();
    }
    return 0;
}
catch (Exception ex)
{
    MessageBox.Show(ex.Message);
}
This works in most cases, but I have one client machine where I get an exception:
The remote server returned an error 403: Forbidden
Can anyone tell me why this could be the case? It's exactly the same code running on all clients, including the same username and password.
I've encountered the same problem and did some monitoring. First I monitored the traffic of a working FTP connection made via Total Commander (it works, so it is a good reference). Then I monitored my program, and I was shocked by what was happening: somehow an HTTP request was being sent by the FtpWebRequest instance!
The solution was very simple for me. I just added the following line after creating the request instance:
request.Proxy = null;
The only thing I can suggest is installing Wireshark and monitoring exactly what's being transmitted between the client and server, then comparing that between the different machines. If necessary, to make the messages more comparable between FTP and IE, change the request's user agent. Is there any funky networking going on, like IP-based permissions?
