Retrieve response time of a web request within a catch - C#

First of all, thanks for the help you can give me.
Okay, first, excuse my little knowledge, I'm learning, so forgive my ignorance.
I have a method that calls a service, and if any response outside the 200 range is received, it goes to the catch block. Within that catch, I need to retrieve, from the header (or somewhere else) of that response, the time it took the service to respond.
I can't use a timer or anything like that (that was my first thought).
The thing is, I don't know how to recover this data from the exception.
Thank you very much!
{
try
{
// Reading the HTTP response
using (HttpWebResponse webResponse = httpWebRequest.GetResponse() as HttpWebResponse)
{
// Call to the endpoint
}
}
catch (WebException ex)
{
// Here I need to retrieve the time it took for the service to respond (with the error)
}
return response;
}
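A note on the original question: when the server actually answered (for example with a 4xx or 5xx), the WebException thrown by GetResponse() carries that response in ex.Response, so the status code and headers can be read inside the catch. The elapsed time, however, is not part of the exception; it is only available if the server itself sends it in a custom header. A minimal sketch, where X-Response-Time is a hypothetical header name, not a standard one:

```csharp
using System;
using System.Net;

static string DescribeFailure(WebException ex)
{
    // ex.Response is only populated when the server actually answered
    // (e.g. with a 4xx/5xx status); for timeouts or DNS failures it is null.
    if (ex.Response is HttpWebResponse resp)
    {
        // Headers are readable here. "X-Response-Time" is a hypothetical
        // header that only exists if the server chooses to send it.
        string timing = resp.Headers["X-Response-Time"];
        return $"HTTP {(int)resp.StatusCode}, timing header: {timing ?? "not sent"}";
    }
    return $"No response received ({ex.Status})";
}

// Demo with an exception that carries no response at all:
Console.WriteLine(DescribeFailure(new WebException("simulated failure")));
```

If the server sends no such header, measuring on the client (as in the Stopwatch example below) is the only option.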

Adding an example to what kshkarin mentioned:
var watch = System.Diagnostics.Stopwatch.StartNew();
try
{
var request = WebRequest.Create("http://www.google.com");
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
using (Stream answer = response.GetResponseStream())
{
// do something
watch.Stop();
Console.WriteLine($"Success at {watch.ElapsedMilliseconds} ms");
}
}
}
catch (WebException e)
{
// Any WebException lands here (timeout, DNS failure, non-2xx status, ...),
// so check e.Status if you only care about timeouts.
watch.Stop();
Console.WriteLine($"Error occurred at {watch.ElapsedMilliseconds} ms \n {e}");
}

Indeed, you should set a timeout on the web request and detect it by checking the WebException's Status for WebExceptionStatus.Timeout (GetResponse() throws a WebException on timeout, not a TimeoutException).
In a simple scenario you can also record DateTime.Now before the request and compute the elapsed time afterwards.
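A minimal sketch of that DateTime approach; the URL is a placeholder, and the timing is done entirely on the client, since the exception itself carries no timing data:

```csharp
using System;
using System.Net;

DateTime start = DateTime.UtcNow; // UtcNow avoids surprises from clock adjustments
try
{
    var request = (HttpWebRequest)WebRequest.Create("http://www.example.com"); // placeholder URL
    request.Timeout = 5000;
    using (var response = (HttpWebResponse)request.GetResponse())
    {
        Console.WriteLine($"Success after {(DateTime.UtcNow - start).TotalMilliseconds:F0} ms");
    }
}
catch (WebException ex)
{
    // Measure in the catch; the elapsed time covers the failed attempt.
    TimeSpan elapsed = DateTime.UtcNow - start;
    Console.WriteLine($"Failed after {elapsed.TotalMilliseconds:F0} ms: {ex.Status}");
}
```

Stopwatch is still the more precise tool when it is available; DateTime subtraction is simply the lighter-weight fallback described above.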

Related

How to detect HTTPRequest Failure

I am sending 100000 requests, and in order to check whether all of them were sent successfully, I have developed a simple web page that counts the number of requests received.
The problem is that the receiver counts fewer than 50000, and I also cannot detect which of them failed in order to send them again, since the sender gets StatusCode = OK for all of them and no exception is detected.
I also tried it after removing webReq.Method = "HEAD", but it had no effect.
Any hints are appreciated.
Here is the sender's code:
try
{
var content = new MemoryStream();
var webReq = (HttpWebRequest)WebRequest.Create(url);
webReq.Method = "HEAD";
using (WebResponse response = await webReq.GetResponseAsync())
{
HttpWebResponse res = (HttpWebResponse)response;
if (res.StatusCode != HttpStatusCode.OK)
{
UnsuccessfulURLsPhase1.Add(url);
}
}
}
catch (Exception e)
{
UnsuccessfulURLsPhase1.Add(url);
}
This is receiver's code:
protected void Page_Load(object sender, EventArgs e)
{
try
{
if (!IsPostBack)
{
counter1++;
txtCounter.Text = counter1.ToString();
}
}
catch (Exception ex)
{
Debug.WriteLine("\nException raised!");
Debug.WriteLine("Source :{0} ", ex.Source);
Debug.WriteLine("Message :{0} ", ex.Message);
}
}
You're sending these out as quickly as possible? IANA suggests (and recent editions of Windows respect) using the range 49152 to 65535 for ephemeral ports, which are ports that are reserved to receive the reply from IP socket connections. This means that there are 16383 ports available, each of which must be left in a TIME_WAIT state for (IIRC) 120 seconds after the connection is closed.
In perfect conditions (and with routing equipment that can sustain thousands of simultaneous connections... a cheap SOHO router will probably overheat and become unreliable as it runs out of memory), you're going to be limited to a maximum of around 16000 requests every two minutes.
In practice, HttpWebRequest (and therefore WebClient) will maintain only a specific number of connections to a specific host, and pipeline requests over those connections, so without tweaking ServicePointManager, or the ServicePoint associated with the host you're trying to hit, you're going to have an awful lot of queueing to squeeze 100000 requests through those connections. It's likely that you'll hit timeouts somewhere down this path.
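A sketch of the ServicePointManager tweak mentioned above; the limit of 100 is an arbitrary illustrative value, not a recommendation, and the URL is a placeholder:

```csharp
using System;
using System.Net;

// Raise the per-host connection limit before issuing any requests;
// the default for HttpWebRequest is 2 connections per host.
ServicePointManager.DefaultConnectionLimit = 100;

// Or tweak only the ServicePoint associated with one host:
ServicePoint sp = ServicePointManager.FindServicePoint(new Uri("http://www.example.com"));
sp.ConnectionLimit = 100;
```

This only widens the client-side bottleneck; the ephemeral-port and server-side limits described above still apply.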
Your Page_Load is swallowing the exceptions, so the server always returns 200 OK.
You have to let the exception be thrown or you have to explicitly set the response status when an error occurs.
I found in How to send a Status Code 500 in ASP.Net and still write to the response? that TrySkipIisCustomErrors should be set.
Something like
protected void Page_Load(object sender, EventArgs e)
{
try
{
if (!IsPostBack)
{
counter1++;
txtCounter.Text = counter1.ToString();
}
}
catch (Exception ex)
{
Debug.WriteLine("\nException raised!");
Debug.WriteLine("Source :{0} ", ex.Source);
Debug.WriteLine("Message :{0} ", ex.Message);
// Raise the exception
// throw;
// or assign the correct status and status code
Response.Clear();
Response.TrySkipIisCustomErrors = true;
Response.ContentType = "text/plain";
Response.StatusCode = (int)HttpStatusCode.InternalServerError;
Response.Write(ex.Message);
// Send the output to the client.
Response.Flush();
}
}
(Hope it helps; it's been a long time since I have done anything in WebForms :S )

The connection was closed unexpectedly C# after a long running time

Hi, I was making a crawler for a site. After about 3 hours of crawling, my app stopped on a WebException. Below is my code in C#. client is a predefined WebClient object that is disposed every time gameDoc has been processed. gameDoc is an HtmlDocument object (from HtmlAgilityPack).
while (retrygamedoc)
{
try
{
gameDoc.LoadHtml(client.DownloadString(url)); // this line caused the exception
retrygamedoc = false;
}
catch
{
client.Dispose();
client = new WebClient();
retrygamedoc = true;
Thread.Sleep(500);
}
}
I tried the code below (to keep the WebClient fresh) from this answer:
while (retrygamedoc)
{
try
{
using (WebClient client2 = new WebClient())
{
gameDoc.LoadHtml(client2.DownloadString(url)); // this line causes the exception
retrygamedoc = false;
}
}
catch
{
retrygamedoc = true;
Thread.Sleep(500);
}
}
but the result is still the same. Then I used StreamReader, and the result stayed the same! Below is my code using StreamReader.
while (retrygamedoc)
{
try
{
// using native to check the result
HttpWebRequest webreq = (HttpWebRequest)WebRequest.Create(url);
string responsestring = string.Empty;
using (HttpWebResponse response = (HttpWebResponse)webreq.GetResponse()) // this causes the exception
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
responsestring = reader.ReadToEnd();
}
gameDoc.LoadHtml(responsestring); // parse the string we already downloaded instead of downloading it again
retrygamedoc = false;
}
catch
{
retrygamedoc = true;
Thread.Sleep(500);
}
}
What should I do and check? I am so confused, because I am able to crawl some pages on the same site, and then after about 1000 results it causes the exception. The message from the exception is only The request was aborted: The connection was closed unexpectedly. and the status is ConnectionClosed.
PS. the app is a desktop form app.
update :
Now I am skipping those values and setting them to null so that the crawling can go on. But if the data is really needed, I still have to fill in the crawling result manually, which is tiring because the result contains thousands of records. Please help me.
example :
It is as if you had downloaded about 1300 records from the website, and then the application stopped, saying The request was aborted: The connection was closed unexpectedly., while your internet connection was still up and running at a good speed.
ConnectionClosed may indicate (and probably does) that the server you're downloading from is closing the connection. Perhaps it is noticing a large amount of requests from your client and is denying you additional service.
Since you can't control server-side shenanigans, I'd recommend you have some sort of logic to retry the download a bit later.
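One way to sketch that retry-later logic is an exponential backoff between attempts; the attempt count and delays below are arbitrary choices:

```csharp
using System;
using System.Threading;

// Runs an action, retrying with exponentially growing delays on failure.
static T RetryWithBackoff<T>(Func<T> action, int maxAttempts = 5, int baseDelayMs = 500)
{
    for (int attempt = 0; ; attempt++)
    {
        try
        {
            return action();
        }
        catch (Exception) when (attempt < maxAttempts - 1)
        {
            // Wait 500 ms, 1 s, 2 s, 4 s, ... before the next try;
            // the last failure propagates to the caller.
            Thread.Sleep(baseDelayMs * (1 << attempt));
        }
    }
}

// Demo: an action that fails twice, then succeeds on the third attempt.
int tries = 0;
string result = RetryWithBackoff(() =>
{
    tries++;
    if (tries < 3) throw new Exception("transient failure");
    return "ok";
}, maxAttempts: 5, baseDelayMs: 10);
Console.WriteLine($"Succeeded after {tries} attempts: {result}");
```

In the crawler above it could be used as, for example, gameDoc.LoadHtml(RetryWithBackoff(() => client.DownloadString(url))), which replaces the fixed 500 ms sleep with progressively longer waits that are friendlier to a rate-limiting server.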
I got this error because the server returned a 404.

Intermittent Repeated Timeouts from WebRequest.GetResponse connecting to Jenkins/Hudson

I am kicking off parameterized Jenkins builds from a c# application.
The URLs are valid (I can pull one from the log and run it with no issue). At certain points all web requests will time out, no matter how long the timeout is set for (I've gone up to 90 seconds) or how many times it is run.
This is intermittent, and at other times I have no issues at all.
while (count < 5)
{
try
{
log.WriteEntry("RunningJenkinsBuild- buildURL=" + buildUrl, EventLogEntryType.Information);
WebRequest request = WebRequest.Create(buildUrl);
request.GetResponse();
return;
}
catch (WebException ex)
{
log.WriteEntry("Timeout- wait 15 seconds and try again-" + ex.Message, EventLogEntryType.Error);
Thread.Sleep(15000);
count++;
}
catch (Exception ex2)
{
log.WriteEntry(ex2.Message, EventLogEntryType.Error);
return;
}
}
This cleared it up. 'Using' helped it out.
WebRequest request = WebRequest.Create(buildUrl);
request.Timeout = 10000;
using (WebResponse response = request.GetResponse()) { }
Thread.Sleep(5000);

Why does Http Web Request and IWebProxy work at weird times

Another question about Web proxy.
Here is my code:
IWebProxy Proxya = System.Net.WebRequest.GetSystemWebProxy();
Proxya.Credentials = CredentialCache.DefaultNetworkCredentials;
HttpWebRequest rqst = (HttpWebRequest)WebRequest.Create(targetServer);
rqst.Proxy = Proxya;
rqst.Timeout = 5000;
try
{
rqst.GetResponse();
}
catch(WebException wex)
{
connectErrMsg = wex.Message;
proxyworks = false;
}
This code hangs for a minute or two the first time it is called. After that, on successive calls, it works sometimes but not others. It also never hits the catch block.
Now the weird part. If I add a MessageBox.Show(msg) call in the first section of code before the GetResponse() call this all will work every time without hanging. Here is an example:
try
{
// ========Here is where I make the call and get the response========
System.Windows.Forms.MessageBox.Show("Getting Response");
// ========This makes the whole thing work every time========
rqst.GetResponse();
}
catch(WebException wex)
{
connectErrMsg = wex.Message;
proxyworks = false;
}
I'm baffled about why it is behaving this way. I don't know if the timeout is not working (it's in milliseconds, not seconds, so it should time out after 5 seconds, right?) or what is going on. The most confusing thing is that the message box call makes it all work without hanging.
So any help and suggestions on what is happening is appreciated. These are the kind of bugs that drive me absolutely out of my mind.
EDIT and CORRECTION:
OK, so I've been testing this, and the problem occurs when I try to download data from the URI that I am getting a response from. I test connectivity using the GetResponse() method of a WebRequest, but download the data with a WebClient. Here is the code for that:
public void LoadUpdateDataFromNet(string url, IWebProxy wProxy)
{
//Create web client
System.Net.WebClient webClnt = new System.Net.WebClient();
//set the proxy settings
webClnt.Proxy = wProxy;
webClnt.Credentials = wProxy.Credentials;
byte[] tempBytes;
//download the data and put it into a stream for reading
try
{
tempBytes = webClnt.DownloadData(url); // <--HERE IS WHERE IT HANGS
}
catch (WebException wex)
{
MessageBox.Show("NEW ERROR: " + wex.Message);
return;
}
//Code here that uses the downloaded data
}
The WebRequest and WebClient are both accessing the same URL which is a web path to an XML file and the proxy is the same one created in the method at the top of this post. I am testing to see if the created IWebProxy is valid for the specified path and file and then downloading the file.
The first piece of code I put above and this code using the WebClient are in separate classes and are called at different times, yet using a message box in the first bit of code still makes the whole thing run fine, which confuses me. I'm not sure what is happening here, or why message boxes and running/debugging in Visual Studio make the program run OK. Suggestions?
So, I figured out the answer to the problem. The timeout for the web request is still 5 seconds, but for some reason, if the response is not closed explicitly, consecutive web requests hang. Here is the code now:
IWebProxy Proxya = System.Net.WebRequest.GetSystemWebProxy();
//to get default proxy settings
Proxya.Credentials = CredentialCache.DefaultNetworkCredentials;
Uri targetserver = new Uri(targetAddress);
Uri proxyserver = Proxya.GetProxy(targetserver);
HttpWebRequest rqst = (HttpWebRequest)WebRequest.Create(targetserver);
rqst.Proxy = Proxya;
rqst.Timeout = 5000;
try
{
//Get response to check for valid proxy and then close it
WebResponse wResp = rqst.GetResponse();
//===================================================================
wResp.Close(); //HERE WAS THE PROBLEM. ADDING THIS CALL MAKES IT WORK
//===================================================================
}
catch(WebException wex)
{
connectErrMsg = wex.Message;
proxyworks = false;
}
Still not sure exactly how calling the message box was making everything work, but it doesn't really matter at this point. The whole thing works like a charm.

Different performance between Java and c# code when testing URL

When running the following Java code, I get very accurate and consistent results in determining if the web page I'm testing is up.
protected synchronized boolean checkUrl(HttpURLConnection connection){
boolean error = false;
//HttpURLConnection connection = null;
GregorianCalendar calendar = new GregorianCalendar();
try{
if(connection != null){
connection.connect();
//200 is the expected HTTP_OK response
error = processResponseCode(connection.getResponseCode());
connection.disconnect();
} else{
error = false;
}
}catch(java.net.UnknownHostException uhe){
... }
catch(Exception e){
... }
return error;
}
The closest match to the Java pattern in C# has a much higher rate of false positives (mostly due to timeouts, which have a default period of 100000 ms).
protected bool connectedToUrl = false;
response = null;
HttpWebRequest webreq = (HttpWebRequest)WebRequest.Create(this.getUri());
webreq.Credentials = CredentialCache.DefaultCredentials;
WebResponse res = null;// webreq.GetResponse();
try
{
WebRequest request = WebRequest.Create(this.getUri()) as WebRequest;
request.Credentials = CredentialCache.DefaultCredentials;
if (request != null)
{
// Get response
res = webreq.GetResponse();
connectedToUrl = processResponseCode(res);
}
else
{
logger.Fatal(getFatalMessage());
string error = string.Empty;
}
}
catch (Exception e)
{
throw e;
}
return connectedToUrl;
}
I have tried various patterns in c# to match the effectiveness of the quoted Java code, to no avail.
Any ideas?
I believe this is because you're not closing any of the response objects.
Simply change this:
res = webreq.GetResponse();
connectedToUrl = processResponseCode(res);
to
using (WebResponse res = webreq.GetResponse())
{
connectedToUrl = processResponseCode(res);
}
(Remove the declaration from earlier.)
Until you've closed/disposed of the response (or it's been finalized), it holds onto the connection. You can only have a certain number of connections to any one host at a time (2 by default, I believe), hence the timeouts. When you dispose of the response, it allows another request to use the same connection.
Also this:
catch (Exception e)
{
throw e;
}
Does nothing but destroy the stack trace of an exception that's being bubbled upwards. If you have error handling elsewhere in your code, I suggest removing the try/catch block. Otherwise you should log the exception and move on. Don't just catch it to throw it.
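A small sketch of that difference; the method names are made up for illustration:

```csharp
using System;

static void Inner() => throw new InvalidOperationException("boom");

static void RethrowPreserving()
{
    try { Inner(); }
    catch (Exception)
    {
        // "throw;" rethrows the current exception and keeps the
        // original stack trace, so the Inner() frame stays visible.
        throw;
    }
}

static void RethrowResetting()
{
    try { Inner(); }
    catch (Exception e)
    {
        // "throw e;" restarts the stack trace at this line;
        // the Inner() frame is lost to callers.
        throw e;
    }
}

try { RethrowPreserving(); }
catch (InvalidOperationException e)
{
    // The trace printed here still includes the original throw site.
    Console.WriteLine(e.StackTrace);
}
```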
I think you're missing the GregorianCalendar in the C# version :-)
Why do you have two Request Objects in the C# version?