For the project I'm working on, we have a desktop program that contacts an online server for a store. Because it's used in schools, getting the proxy setup right is tricky. Our approach is to let users specify proxy details if they want; otherwise the program uses the ones from IE. To cope with incorrect details being entered, the code tries the user-specified proxy; if that fails, the default one; if that fails, the default one with credentials; and if that fails, null.
The problem I'm having is that when the proxy settings are changed in quick succession (for example, registration fails because the proxy is wrong, so the user changes one tiny thing and tries again seconds later), calls to HttpWebRequest.GetResponse() time out, freezing the program for a good while. Sometimes leaving a minute or two between changes avoids the freeze, but not every time (I just tried again after 10 minutes and it's timing out again).
I can't spot anything in the code that could cause this, though it looks a bit messy. I don't think it's the server refusing the request, unless it's generic server behaviour, as I've tried requests to our server and to others such as google.co.uk.
I'm posting the code in the hope that someone may be able to spot something wrong with it, or knows a much simpler way of doing what we're trying to do.
The tests we run are without any proxy, so the first part is usually skipped. The first time ApplyProxy runs, it works fine and finishes everything in the first try block; the second time, it either times out on the GetResponse in the first try block and then goes through the rest of the code, or it works there and times out on the actual requests made for the registration.
Code:
void ApplyProxy()
{
    Boolean ProxySuccess = true;
    String WebRequestURI = @"http://www.google.co.uk";
    if (UseProxy)
    {
        try
        {
            String ProxyUrl = (ProxyUri.ToLower().Contains("http://")) ?
                ProxyUri :
                "http://" + ProxyUri;
            WebRequest.DefaultWebProxy = new WebProxy(ProxyUrl);
            if (!string.IsNullOrEmpty(ProxyUsername) && !string.IsNullOrEmpty(ProxyPassword))
                WebRequest.DefaultWebProxy.Credentials = new NetworkCredential(ProxyUsername, ProxyPassword);
            HttpWebRequest request = HttpWebRequest.Create(WebRequestURI) as HttpWebRequest;
            request.Method = "GET";
            HttpWebResponse response = request.GetResponse() as HttpWebResponse;
        }
        catch
        {
            ProxySuccess = false;
        }
    }
    if (!ProxySuccess || !UseProxy)
    {
        try
        {
            WebRequest.DefaultWebProxy = WebRequest.GetSystemWebProxy();
            HttpWebRequest request = HttpWebRequest.Create(WebRequestURI) as HttpWebRequest;
            request.Method = "GET";
            HttpWebResponse response = request.GetResponse() as HttpWebResponse;
        }
        catch (Exception e)
        {   // try with credentials
            // make a new proxy from defaults
            WebRequest.DefaultWebProxy = WebRequest.GetSystemWebProxy();
            String newProxyURI = WebRequest.DefaultWebProxy.GetProxy(new Uri(WebRequestURI)).ToString();
            if (newProxyURI == String.Empty)
            {   // check we actually get a result
                WebRequest.DefaultWebProxy = null;
                return;
            }
            // continue
            WebProxy NewProxy = new WebProxy(newProxyURI);
            NewProxy.UseDefaultCredentials = true;
            NewProxy.Credentials = CredentialCache.DefaultCredentials;
            WebRequest.DefaultWebProxy = NewProxy;
            try
            {
                HttpWebRequest request = HttpWebRequest.Create(WebRequestURI) as HttpWebRequest;
                request.Method = "GET";
                HttpWebResponse response = request.GetResponse() as HttpWebResponse;
            }
            catch
            {
                WebRequest.DefaultWebProxy = null;
            }
        }
    }
}
Is it not just a case of needing to set the Timeout property of the HttpWebRequest? It could be that the connection is being made but not serviced (a wrong type of proxy server or a stalled server, for example), in which case the request waits out the full Timeout period before giving up; a shorter timeout may be preferable here.
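For example (a sketch; the 10-second value is arbitrary, not a recommendation, and the default is 100 seconds):
HttpWebRequest request = WebRequest.Create("http://www.google.co.uk") as HttpWebRequest;
request.Method = "GET";
request.Timeout = 10000; // milliseconds; fail fast instead of freezing the UI
HttpWebResponse response = request.GetResponse() as HttpWebResponse;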
Seems to be a programming error on my part. The responses were left open, and evidently either the program or the server doesn't like this. Simply closing each HttpWebResponse once done removes the issue.
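In other words, each test request in ApplyProxy should dispose of its response, along these lines:
HttpWebRequest request = WebRequest.Create(WebRequestURI) as HttpWebRequest;
request.Method = "GET";
using (HttpWebResponse response = request.GetResponse() as HttpWebResponse)
{
    // the response (and its connection) is released when the block exits
}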
Related
I am trying to simulate a person being on my website, in this case from my console application.
I can connect to it using an HttpWebRequest, but it doesn't show as a person being on the website in my dashboard. However, when I visit my website manually through a web browser, my dashboard system (WordPress) says that someone is online.
So my question is: how do I accomplish the same thing? Would I have to create a Socket connection, or is this possible using KeepAlive? I think the issue is that my request isn't on the page for long enough; it connects and gets the response, but it doesn't actually keep a connection open, if that makes any sense.
That's just my theory, so please correct me if I'm wrong.
public static bool isServerOnline()
{
    Boolean ret = false;
    try
    {
        HttpWebRequest req = (HttpWebRequest)HttpWebRequest.Create("https://arcticinnovative.com");
        req.CookieContainer = cookieContainer; // <= HERE
        req.Method = "HEAD";
        req.KeepAlive = false;
        HttpWebResponse resp = (HttpWebResponse)req.GetResponse();
        if (resp.StatusCode == HttpStatusCode.OK)
        {
            // HTTP 200 - Internet connection available, server online
            ret = true;
        }
        resp.Close();
        return ret;
    }
    catch (WebException we)
    {
        // Exception - connection not available
        Debug.Print("InternetUtils - isServerOnline - " + we.Status);
        return false;
    }
}
According to the documentation found here, you can set KeepAlive to true in order to maintain a persistent connection.
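A minimal sketch of that (note that KeepAlive already defaults to true on HttpWebRequest, so setting it is mostly explicit documentation):
HttpWebRequest req = (HttpWebRequest)WebRequest.Create("https://arcticinnovative.com");
req.Method = "HEAD";
req.KeepAlive = true; // ask for a persistent connection that later requests can reuse
using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
{
    // subsequent requests to the same host can reuse the open connection
}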
I'm facing an issue where, after retrying a request, my POST data somehow gets lost.
Code sample below. (Please note that request.Timeout = 1 is set for testing purposes, to reproduce the behaviour shown in the code below):
// post_data_final getting
private void request_3()
{
    for (int i = 1; i <= 5; i++)
    {
        byte[] byteArray = Encoding.ASCII.GetBytes(post_data_final);
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(site_URI);
        request.Method = "POST";
        // some headers info
        request.Timeout = 1;
        request.ContentLength = byteArray.Length;
        using (Stream os = request.GetRequestStream())
        {
            os.Write(byteArray, 0, byteArray.Length);
        }
        try
        {
            HttpWebResponse response = (HttpWebResponse)request.GetResponse();
            // some code about response
        }
        catch (WebException wex)
        {
            if (wex.Status == WebExceptionStatus.Timeout)
            {
                continue;
            }
            // some additional checks
        }
    }
}
The strange part is that the first request (up until the request timeout error) goes through fine. Subsequent requests go out without POST data, although the content length is still counted properly (i.e. it stays the same as in the previous request).
Updated:
post_data_final is built in a separate function. It is not used (except through byteArray) or changed inside request_3().
The request works fine inside the for loop as long as no Timeout exception occurs; in that case the loop makes any number of valid requests. As soon as I get a Timeout exception, the next request goes out without POST data.
The source code has been edited for those who think recursion is a bad idea; the edited code still doesn't work.
Any suggestions are appreciated.
I can't find anything wrong in your code, so please provide more details, as the comments mentioned.
private void request_3()
{
    bool sendData = true;
    int numberOfTimeouts = 0;
    // The encoding only needs to be done once, unless you alter post_data_final after each timeout.
    byte[] dataToSend = Encoding.ASCII.GetBytes(post_data_final);
    while (sendData && numberOfTimeouts < MAX_NUMBER_OF_TIMEOUTS)
    {
        // A request instance cannot be resent, so build a fresh one per attempt.
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(site_URI);
        request.Method = "POST";
        // request.Timeout = 1000 * 15; would mean 15 seconds.
        using (Stream outputStream = request.GetRequestStream())
            outputStream.Write(dataToSend, 0, dataToSend.Length);
        try
        {
            HttpWebResponse response = (HttpWebResponse)request.GetResponse();
            if (response != null)
                processResponse(response);
            else
            {
                // You should handle this case as well.
            }
            sendData = false;
        }
        catch (WebException wex)
        {
            if (wex.Status == WebExceptionStatus.Timeout)
                numberOfTimeouts++;
            else
                throw;
        }
    }
}
The issue was caused by Fiddler2, a traffic-intercepting tool (an analogue of Wireshark).
The requested site uses HTTPS. For debugging purposes I had installed Fiddler2, along with its certificate, to be able to see all incoming and outgoing traffic. When I turned Fiddler2 off and added some additional logging to the console, I found that the requests were in fact valid (i.e. the POST body data was still present after the first request).
So with Fiddler2 running, the code above fails whenever a Timeout exception occurs; without Fiddler2, everything works fine under the same circumstances and with the same code.
I didn't dig deeply into Fiddler2, but it seems the issue lies in some incompatibility between VS2010 and Fiddler2's internal proxy when error status codes are involved; the same setup (Fiddler2 included) worked fine for success codes (i.e. 2xx-3xx) in point 2 under the "Updates" area.
Thanks everyone for giving this some attention.
I'm using a function to check if an external url exists. Here's the code with the status messages removed for clarity.
public static bool VerifyUrl(string url)
{
    url.ThrowNullOrEmpty("url");
    if (!(url.StartsWith("http://") || url.StartsWith("https://")))
        return false;
    var uri = new Uri(url);
    var webRequest = HttpWebRequest.Create(uri);
    webRequest.Timeout = 5000;
    webRequest.Method = "HEAD";
    HttpWebResponse webResponse;
    try
    {
        webResponse = (HttpWebResponse)webRequest.GetResponse();
        webResponse.Close();
    }
    catch (WebException)
    {
        return false;
    }
    if (string.Compare(uri.Host, webResponse.ResponseUri.Host, true) != 0)
    {
        string responseUri = webResponse.ResponseUri.ToString().ToLower();
        if (responseUri.IndexOf("error") > -1 || responseUri.IndexOf("404.") > -1 || responseUri.IndexOf("500.") > -1)
            return false;
    }
    return true;
}
I've run a test over some external URLs and found that about 20 out of 100 come back as errors. If I add a user agent, the error rate drops to around 14%.
The errors coming back are "forbidden" (which adding a user agent resolves for about 6%), "service unavailable", "method not allowed", "not implemented" or "connection closed".
Is there anything I can do to my code so that more of them, preferably all, give a valid response confirming their existence?
Alternatively, is there code that can be purchased to do this more effectively?
UPDATE - 14th Nov 12
After following advice from previous respondents, I'm now in a situation where I have a single domain that returns Service Unavailable (503). The example I have is www.marksandspencer.com.
When I use the HTTP sniffer web-sniffer.net, as opposed to the one recommended in this thread, it works, returning the data from a GET request, but I can't work out what I need to do to make it work in my code.
I finally got to the point of being able to validate all the URLs without exception.
Firstly, I took Davio's advice. Some domains return an error on a HEAD request, so I have included a retry for those scenarios, creating a new GET request for the second attempt (sketched below).
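A minimal sketch of that retry (simplified from the real code; it reuses the uri and timeout from VerifyUrl above):
// Try HEAD first; some servers reject it (e.g. 405 Method Not Allowed),
// in which case retry the same URL with a fresh GET request.
var headRequest = (HttpWebRequest)WebRequest.Create(uri);
headRequest.Method = "HEAD";
headRequest.Timeout = 5000;
try
{
    using (headRequest.GetResponse()) { }
}
catch (WebException)
{
    // A request object cannot be reused, so build a new one for the GET.
    var getRequest = (HttpWebRequest)WebRequest.Create(uri);
    getRequest.Method = "GET";
    getRequest.Timeout = 5000;
    using (getRequest.GetResponse()) { }
}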
Secondly, the Amazon scenario. Amazon was intermittently returning a 503 error for its own site and permanent 503 errors for sites hosted on the Amazon framework.
After some digging, I found that adding the following Accept header to the request resolved both; it is the Accept string used by Firefox.
var request = (HttpWebRequest)HttpWebRequest.Create(uri);
request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
I'm calling a web service (my own web service) like this:
var request = WebRequest.Create(Options.ServerUri + Options.AccountId + "/integration/trip") as HttpWebRequest;
request.Timeout = 20000; // 20 seconds should be plenty, no need for 100 seconds
request.ContentType = "application/json";
request.Headers.Add(HttpRequestHeader.Authorization, "Basic " + Convert.ToBase64String(Encoding.UTF8.GetBytes(Options.LoginName + ":" + Options.Password)));
request.Method = "POST";
var serializedData = (new JavaScriptSerializer()).Serialize(trip);
var bytes = Encoding.UTF8.GetBytes(serializedData);
request.ContentLength = bytes.Length;
var os = request.GetRequestStream();
os.Write(bytes, 0, bytes.Length);
os.Close();
request.GetResponse();
LoggingAndNotifications.LogAndNotify(string.Format("Success uploading trip: {0}", trip.TripId), false);
return true;
This code is called repeatedly to post new objects. After about 3 calls I start getting timeouts on request.GetResponse().
There are no errors on the server side and nothing in the Event Log. It feels like "something" is stopping me from repeatedly hitting the service. What should I look for? Could it be the company firewall, or is something wrong with my code?
I think the issue is that you are not closing the response. Try editing your code as follows:
var response = request.GetResponse() as HttpWebResponse;
response.Close();
You should close the response as per the example in the doco.
WebRequest myRequest = WebRequest.Create("http://www.contoso.com");
// Return the response.
WebResponse myResponse = myRequest.GetResponse();
// Code to use the WebResponse goes here.
// Close the response to free resources.
myResponse.Close();
Hmm. The doco also says
Any public static (Shared in Visual Basic) members of this type are
thread safe. Any instance members are not guaranteed to be thread
safe.
You should probably take a lock of some kind, for example:
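A minimal sketch, assuming the uploads can run from multiple threads (the gate object name is hypothetical):
// One shared gate so only one thread posts to the service at a time.
private static readonly object _requestGate = new object();

// ...around the call site:
lock (_requestGate)
{
    using (var response = (HttpWebResponse)request.GetResponse())
    {
        // read/handle the response here
    }
}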
Are you sure this is not caused by server side bugs?
It seems strange; as far as I know, WebRequest on .NET 4 is based on IOCP at a lower layer. Maybe you can try releasing the request/response resources after each loop.
Since GetResponse() returns a stream, if you don't read from it, the real data may never actually transfer from the server to the client. (I found this when trying to parse a response using Peek(), which always returned an invalid value until Read() was called.)
So try reading it to the end, or just close it, for example:
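A minimal sketch of draining and disposing of the response after each call:
using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    // Reading to the end ensures the data actually transfers,
    // and both using blocks release the stream and the connection.
    string body = reader.ReadToEnd();
}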
I created a RESTful web service (WCF) where I check credentials on each request. One of my clients is an Android app, and everything seems to be great on the server side: I get a request, and if it has the proper header, I process it, and so on.
Now I've created a client app that uses this service. This is how I do a GET:
// Create the web request
var request = WebRequest.Create(Context.ServiceURL + uri) as HttpWebRequest;
if (request != null)
{
    request.ContentType = "application/json";
    // Add authentication to request
    request.Credentials = new NetworkCredential(Context.UserName, Context.Password);
    // Get response
    using (var response = request.GetResponse() as HttpWebResponse)
    {
        // Get the response stream
        if (response != null)
        {
            var reader = new StreamReader(response.GetResponseStream());
            // Console application output
            var s = reader.ReadToEnd();
            var serializer = new JavaScriptSerializer();
            var returnValue = (T)serializer.Deserialize(s, typeof(T));
            return returnValue;
        }
    }
}
So, this code gets my resource and deserializes it. As you can see, I'm passing credentials in my call.
Then, when debugging on the server side, I noticed that I get 2 requests every time: one without the authentication header, to which the server sends back a response, and then a second request that comes back with credentials. I think this is bad for my server; I'd rather not make the extra round trip. How should I change the client so it doesn't happen? See the Fiddler screenshot.
EDIT:
This is the Java code I use from Android; it doesn't make the double call:
MyHttpResponse response = new MyHttpResponse();
HttpClient client = mMyApplication.getHttpClient();
try
{
    HttpGet request = new HttpGet(serviceURL + url);
    request.setHeader(new BasicHeader(HTTP.CONTENT_TYPE, "application/json"));
    request.addHeader("Authorization", "Basic " + Preferences.getAuthorizationTicket(mContext));
    ResponseHandler<String> handler = new BasicResponseHandler();
    response.Body = client.execute(request, handler);
    response.Code = HttpURLConnection.HTTP_OK;
    response.Message = "OK";
}
catch (HttpResponseException e)
{
    response.Code = e.getStatusCode();
    response.Message = e.getMessage();
    LogData.InsertError(mContext, e);
}
The initial request never includes the Basic header for authentication. Additionally, since a realm is specified, the client has to get that from the server first. So it asks once, "hey, I need this stuff," and the server replies, "who are you? The realm for answering is 'secure area'" (because the realm means something here). Just because you added credentials here:
request.Credentials = new NetworkCredential(Context.UserName, Context.Password);
doesn't mean they are guaranteed to be attached to every request.
Then you respond with the username/password (in this case you're doing BASIC so it's base64 encoded as name:password) and the server decodes it and says "ok, you're all clear, here's your data".
This is going to happen on a regular basis, and there's not a lot you can do about it. I would also suggest turning on HTTPS, since the authentication is otherwise happening in plain text. (What you show actually seems to be over an intranet, but if you do go over the internet, make it HTTPS.)
Here's a link to Wikipedia that might help you further: http://en.wikipedia.org/wiki/Basic_access_authentication
OK, I got it. I set the Authorization header manually instead of using request.Credentials:
request.Headers.Add(HttpRequestHeader.Authorization, "Basic " + Convert.ToBase64String(Encoding.UTF8.GetBytes(Context.UserName + ":" + Context.Password)));
Now I see only single requests, as expected.
As an option, you can use the PreAuthenticate property of HttpClientHandler. This requires only a couple more lines:
var client = new HttpClient(new HttpClientHandler
{
    Credentials = yourCredentials,
    PreAuthenticate = true
});
With this approach, only the first request is sent without credentials; all subsequent requests include them.
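Usage is then unchanged (a sketch, assuming the ServiceURL and uri from the question):
// With PreAuthenticate, the handler caches the credentials after the first
// challenge and attaches the Authorization header to later requests up front.
var result = await client.GetStringAsync(Context.ServiceURL + uri);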