Keeping a session when using HttpWebRequest - C#

In my project I'm using a C# client application and a Tomcat 6 web application server.
I wrote this snippet in the C# client:
public bool isServerOnline()
{
    Boolean ret = false;
    try
    {
        HttpWebRequest req = (HttpWebRequest)HttpWebRequest.Create(VPMacro.MacroUploader.SERVER_URL);
        req.Method = "HEAD";
        req.KeepAlive = false;
        HttpWebResponse resp = (HttpWebResponse)req.GetResponse();
        if (resp.StatusCode == HttpStatusCode.OK)
        {
            // HTTP 200 - Internet connection available, server online
            ret = true;
        }
        resp.Close();
        return ret;
    }
    catch (WebException we)
    {
        // Exception - connection not available
        Log.e("InternetUtils - isServerOnline - " + we.Status);
        return false;
    }
}
Every time I invoke this method, I get a new session on the server side.
I suppose it's because I should be using HTTP cookies in my client, but I don't know how to do that. Can you help me?

You must use a CookieContainer and keep the instance between calls.
private CookieContainer cookieContainer = new CookieContainer();

public bool isServerOnline()
{
    Boolean ret = false;
    try
    {
        HttpWebRequest req = (HttpWebRequest)HttpWebRequest.Create(VPMacro.MacroUploader.SERVER_URL);
        req.CookieContainer = cookieContainer; // <= HERE
        req.Method = "HEAD";
        req.KeepAlive = false;
        HttpWebResponse resp = (HttpWebResponse)req.GetResponse();
        if (resp.StatusCode == HttpStatusCode.OK)
        {
            // HTTP 200 - Internet connection available, server online
            ret = true;
        }
        resp.Close();
        return ret;
    }
    catch (WebException we)
    {
        // Exception - connection not available
        Log.e("InternetUtils - isServerOnline - " + we.Status);
        return false;
    }
}
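Any other request you make to the same server should reuse that same cookieContainer instance, so the session cookie Tomcat issues (JSESSIONID) is sent back each time. A minimal sketch (the GetFromServer method and its relativeUrl parameter are hypothetical, not part of your code; StreamReader needs System.IO):
public string GetFromServer(string relativeUrl)
{
    // Reusing the same CookieContainer means the JSESSIONID cookie returned by
    // the first response is attached to every later request, so Tomcat sees one session.
    HttpWebRequest req = (HttpWebRequest)WebRequest.Create(VPMacro.MacroUploader.SERVER_URL + relativeUrl);
    req.CookieContainer = cookieContainer; // same instance as in isServerOnline()
    req.Method = "GET";
    using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
    using (StreamReader reader = new StreamReader(resp.GetResponseStream()))
    {
        return reader.ReadToEnd();
    }
}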

Related

Check to see if site is online, limit the timeout

I'm trying to make a function that checks whether a site is online or not, but I'm having some problems with the timeout. I want to limit it to a maximum of 3 seconds; if there is no response within 3 seconds, I should consider the page offline.
My try:
class OnlineCheck
{
    public static bool IsOnline(string url)
    {
        try
        {
            WebClient webclient = new WebClient();
            webclient.Headers.Add(HttpRequestHeader.KeepAlive, "1000");
            webclient.OpenRead(url);
        }
        catch { return false; }
        return true;
    }
}
WebClient doesn't support timeouts, but you can use HttpWebRequest!
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(Endpoint);
request.Timeout = 3000;
request.GetResponse();
If you want to check that the site is online, you are not really interested in the content of the page, just in getting a response. To make that more efficient, you should only request the HTTP headers. Here is a quick example of how you could do it:
// Contains() below needs: using System.Linq;
private static IEnumerable<HttpStatusCode> onlineStatusCodes = new[]
{
    HttpStatusCode.Accepted,
    HttpStatusCode.Found,
    HttpStatusCode.OK,
    // add more codes as needed
};

private static bool IsSiteOnline(string url, int timeout)
{
    HttpWebRequest request = WebRequest.Create(url) as HttpWebRequest;
    if (request != null)
    {
        request.Method = "HEAD"; // get headers only
        request.Timeout = timeout;
        using (var response = request.GetResponse() as HttpWebResponse)
        {
            return response != null && onlineStatusCodes.Contains(response.StatusCode);
        }
    }
    return false;
}
Use HttpWebRequest rather than WebClient; the HttpWebRequest class has a Timeout property.
You can try this code:
System.Net.WebRequest r = System.Net.WebRequest.Create("http://www.google.com");
r.Timeout = 3000;
System.Net.WebProxy proxy = new System.Net.WebProxy("<proxy address>");
System.Net.NetworkCredential credentials = new System.Net.NetworkCredential();
credentials.Domain = "<domain>";
credentials.UserName = "<login>";
credentials.Password = "<pass>";
proxy.Credentials = credentials;
r.Proxy = proxy;
try
{
    System.Net.WebResponse rsp = r.GetResponse();
}
catch (Exception)
{
    MessageBox.Show("Is not available");
    return;
}
MessageBox.Show("Available!");
static bool isOnline(string URL)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(URL);
    request.Timeout = 3000;
    try
    {
        using (WebResponse resp = request.GetResponse())
        {
            return true;
        }
    }
    catch (WebException e)
    {
        // e.Response is null for timeouts and connection failures, so check it first
        var errorResponse = e.Response as HttpWebResponse;
        if (errorResponse != null && errorResponse.StatusCode == HttpStatusCode.NotFound)
        {
            return false;
        }
    }
    return true;
}

HttpWebRequest gives timeouts until restarted

I am working on a desktop application developed in C# (.NET).
This application connects to a remote server using HttpWebRequest. If my PC is disconnected from the internet for any reason and then reconnected, my application keeps getting request timeouts from HttpWebRequest until I restart the whole application; however, if I add a new thread to the application after the network disconnect, it works fine.
Is there any way to reset the network connection, or can anyone tell me how this works?
// My code is:
public String request(String add, String post, int time, String reff, int id, int rwtime)
{
    try
    {
        if (rwtime == 0)
        {
            rwtime = 100000;
        }
        string result = "";
        string location = "";
        // Create the web request
        HttpWebRequest req = WebRequest.Create(add) as HttpWebRequest;
        req.ReadWriteTimeout = rwtime;
        req.KeepAlive = true;
        req.Headers.Add(HttpRequestHeader.AcceptEncoding, "gzip,deflate");
        req.Accept = "application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5";
        req.ContentType = "application/x-www-form-urlencoded";
        req.Timeout = time;
        req.Referer = reff;
        req.AllowAutoRedirect = false;
        req.CookieContainer = statictk.cc[id];
        req.PreAuthenticate = true;
        if (post != "")
        {
            req.Method = "POST";
            string postData = post;
            ASCIIEncoding encoding = new ASCIIEncoding();
            byte[] byte1 = encoding.GetBytes(postData);
            // Set the content type of the data being posted.
            req.ContentType = "application/x-www-form-urlencoded";
            // Set the content length of the string being posted.
            req.ContentLength = byte1.Length;
            Stream newStream = req.GetRequestStream();
            newStream.Write(byte1, 0, byte1.Length);
            newStream.Close();
        }
        else
        {
            req.Method = "GET";
        }
        // Get the response
        try
        {
            HttpWebResponse response = req.GetResponse() as HttpWebResponse;
            // Get the response stream
            location = response.GetResponseHeader("Location");
            if (location == "")
            {
                Stream responseStream = response.GetResponseStream();
                if (response.ContentEncoding.ToLower().Contains("gzip"))
                    responseStream = new GZipStream(responseStream, CompressionMode.Decompress);
                else if (response.ContentEncoding.ToLower().Contains("deflate"))
                    responseStream = new DeflateStream(responseStream, CompressionMode.Decompress);
                StreamReader reader = new StreamReader(responseStream, Encoding.Default);
                // Read the whole contents and return as a string
                result = reader.ReadToEnd();
            }
            else
            {
                result = location;
            }
            response.Close();
            if (result == "") result = "retry";
            return result;
        }
        catch (Exception e)
        {
            log.store("errorinresponce", e.Message);
            if (statictd.status[id] != "removed")
            {
                return "retry";
            }
            else
            {
                return "error";
            }
        }
    }
    catch (Exception f)
    {
        log.store("Networkerrorretry", f.Message);
        if (f.Message == "The operation has timed out")
        {
            return "retry";
        }
        string ans = MessageBox.Show("There was a Network Error..Wish to Retry ?\nError msg : " + f.Message, "Title", MessageBoxButtons.YesNo).ToString();
        if (ans == "Yes")
            return "retry";
        else
        {
            Invoketk.settxt(id, "Not Ready");
            return "error";
        }
    }
}
It sounds like your application is missing some error handling. A disconnect can happen at any time, and your application should be able to handle it. Surround the network code with a try-catch block and catch the different kinds of exceptions. Depending on which exception was thrown, you can then decide whether to reconnect to the server silently or to show an error message.
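As a rough sketch of that idea, reusing identifiers from the code above (log.store and the "retry"/"error" return values), you could add a WebException catch clause ahead of the generic catch (Exception f) and inspect its Status, retrying only for transient network failures:
catch (WebException we)
{
    switch (we.Status)
    {
        case WebExceptionStatus.Timeout:
        case WebExceptionStatus.ConnectFailure:
        case WebExceptionStatus.NameResolutionFailure:
            // Transient network problem after a disconnect: log it and retry silently.
            log.store("Networkerrorretry", we.Message);
            return "retry";
        default:
            // Anything else: surface it to the user instead of retrying blindly.
            return "error";
    }
}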

How to check if remote image exists with C#

public void BuildImg()
{
    // The two different images as strings.
    string url1 = "http://remoteimage.com/image.jpg";
    string url2 = "http://remoteimage.com/image2.jpg";
    try
    {
        // Check to see if url1 exists or not
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url1);
        request.Credentials = System.Net.CredentialCache.DefaultCredentials;
        HttpWebResponse response = (HttpWebResponse)request.GetResponse();
        myImg.Visible = true;
        myImg.ImageUrl = url1;
    }
    catch (Exception ex)
    {
        // Check to see if url2 exists or not
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url2);
        request.Credentials = System.Net.CredentialCache.DefaultCredentials;
        HttpWebResponse response;
        try
        {
            response = request.GetResponse() as HttpWebResponse;
        }
        catch (WebException exc)
        {
            response = exc.Response as HttpWebResponse;
        }
        // Set myImg to show if url2 exists
        myImg.Visible = true;
        myImg.ImageUrl = url2;
        // If the response is missing or returns 404, hide myImg
        if (response == null || response.StatusCode == HttpStatusCode.NotFound)
        {
            myImg.Visible = false;
        }
    }
}
var arr = new[]
{
    "http://example.com/image.jpg",
    "http://example.com/image2.jpg",
    // ...
};
myImg.ImageUrl = arr.FirstOrDefault(i => CheckUrlExistence(i));

static bool CheckUrlExistence(string url)
{
    try
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Credentials = CredentialCache.DefaultCredentials;
        request.Method = "HEAD";
        var response = (HttpWebResponse)request.GetResponse();
        return response.StatusCode == HttpStatusCode.OK;
    }
    catch (WebException ex)
    {
        var code = ((HttpWebResponse)ex.Response)?.StatusCode; // NotFound, etc.
        return false;
    }
}

Can I check if a file exists at a URL?

I know that locally, on my filesystem, I can check whether a file exists:
if(File.Exists(path))
Can I check at a particular remote URL?
If you're attempting to verify the existence of a web resource, I would recommend using the HttpWebRequest class. This will allow you to send a HEAD request to the URL in question. Only the response headers will be returned, even if the resource exists.
var url = "http://www.domain.com/image.png";
HttpWebResponse response = null;
var request = (HttpWebRequest)WebRequest.Create(url);
request.Method = "HEAD";

try
{
    response = (HttpWebResponse)request.GetResponse();
}
catch (WebException ex)
{
    /* A WebException will be thrown if the status of the response is not `200 OK` */
}
finally
{
    // Don't forget to close your response.
    if (response != null)
    {
        response.Close();
    }
}
Of course, if you want to download the resource when it exists, it would most likely be more efficient to send a GET request instead (by not setting the Method property to "HEAD", or by using the WebClient class).
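A minimal sketch of that GET-based approach with WebClient, reusing the example URL from above: if the download succeeds you already have the bytes, and a 404 (or any other failure) surfaces as a WebException.
byte[] data = null;
using (var client = new WebClient())
{
    try
    {
        // GET the resource; if it exists, its content comes back in one call.
        data = client.DownloadData("http://www.domain.com/image.png");
    }
    catch (WebException)
    {
        // 404, timeout, DNS failure, etc. all end up here, so data stays null.
    }
}
bool exists = data != null;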
If you want to just copy & paste Justin's code and get a method to use, here's how I've implemented it:
using System.Net;

public class MyClass
{
    static public bool URLExists(string url)
    {
        bool result = false;
        WebRequest webRequest = WebRequest.Create(url);
        webRequest.Timeout = 1200; // milliseconds
        webRequest.Method = "HEAD";
        HttpWebResponse response = null;
        try
        {
            response = (HttpWebResponse)webRequest.GetResponse();
            result = true;
        }
        catch (WebException webException)
        {
            Debug.Log(url + " doesn't exist: " + webException.Message);
        }
        finally
        {
            if (response != null)
            {
                response.Close();
            }
        }
        return result;
    }
}
I'll keep his observation:
If you want to download the resource, and it exists, it would be more efficient to send a GET request instead by not setting the Method property to "HEAD" or by using the WebClient class.
Below is a simplified version of the code:
public bool URLExists(string url)
{
    bool result = true;
    WebRequest webRequest = WebRequest.Create(url);
    webRequest.Timeout = 1200; // milliseconds
    webRequest.Method = "HEAD";
    try
    {
        webRequest.GetResponse();
    }
    catch
    {
        result = false;
    }
    return result;
}
If you are using a UNC path or a mapped drive, File.Exists will work fine.
If you are using a web address (HTTP, FTP, etc.) you are better off using WebClient - you will get a WebException if it doesn't exist.
public static bool UrlExists(string file)
{
    bool exists = false;
    HttpWebResponse response = null;
    var request = (HttpWebRequest)WebRequest.Create(file);
    request.Method = "HEAD";
    request.Timeout = 5000; // milliseconds
    request.AllowAutoRedirect = false;
    try
    {
        response = (HttpWebResponse)request.GetResponse();
        exists = response.StatusCode == HttpStatusCode.OK;
    }
    catch
    {
        exists = false;
    }
    finally
    {
        // Close your response.
        if (response != null)
            response.Close();
    }
    return exists;
}
I had the same problem to solve in ASP.NET Core, and I solved it with HttpClient:
private async Task<bool> isFileExist(string url)
{
    using (HttpClient client = new HttpClient())
    {
        var response = await client.GetAsync(url);
        return response.StatusCode == System.Net.HttpStatusCode.OK;
    }
}
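If you only need to know whether the file is there, a HEAD request avoids downloading the body, and the usual guidance is to reuse a single HttpClient rather than creating one per call. A sketch under those assumptions (the sharedClient field and the IsFileExistAsync name are just for illustration):
private static readonly HttpClient sharedClient = new HttpClient();

private async Task<bool> IsFileExistAsync(string url)
{
    // HEAD returns only the status line and headers, not the file itself.
    using (var request = new HttpRequestMessage(HttpMethod.Head, url))
    using (var response = await sharedClient.SendAsync(request))
    {
        return response.IsSuccessStatusCode;
    }
}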
My version:
public bool IsUrlExist(string url, int timeOutMs = 1000)
{
    WebRequest webRequest = WebRequest.Create(url);
    webRequest.Method = "HEAD";
    webRequest.Timeout = timeOutMs;
    try
    {
        var response = webRequest.GetResponse();
        /* response is `200 OK` */
        response.Close();
    }
    catch
    {
        /* Any other response */
        return false;
    }
    return true;
}
WebRequest can wait a long time (ignoring the timeout you set) when no proxy is configured, so I switched to RestSharp for this:
var client = new RestClient(url);
var request = new RestRequest(Method.HEAD);
request.Timeout = 5000;
var response = client.Execute(request);
var result = response.StatusCode == HttpStatusCode.OK;
Thanks for all the answers.
I would like to add my implementation, which includes a default result when errors occur, for specific cases like mine:
private bool HTTP_URLExists(String vstrURL, bool vResErrorDefault = false, int vTimeOut = 1200)
{
    bool vResult = false;
    WebRequest webRequest = WebRequest.Create(vstrURL);
    webRequest.Timeout = vTimeOut; // milliseconds
    webRequest.Method = "HEAD";
    HttpWebResponse response = null;
    try
    {
        response = (HttpWebResponse)webRequest.GetResponse();
        if (response.StatusCode == HttpStatusCode.OK) vResult = true;
        else if (response.StatusCode == HttpStatusCode.NotFound) vResult = false;
        else vResult = vResErrorDefault;
    }
    catch (WebException ex)
    {
        if (ex.Status == WebExceptionStatus.ProtocolError && ex.Response != null)
        {
            var resp01 = (HttpWebResponse)ex.Response;
            if (resp01.StatusCode == HttpStatusCode.NotFound)
            {
                vResult = false;
            }
            else
            {
                vResult = vResErrorDefault;
            }
        }
        else
        {
            vResult = vResErrorDefault;
        }
    }
    finally
    {
        // Don't forget to close your response.
        if (response != null)
        {
            response.Close();
        }
    }
    return vResult;
}
Another version with a configurable timeout:
public bool URLExists(string url, int timeout = 5000)
{
    ...
    webRequest.Timeout = timeout; // milliseconds
    ...
}
This works for me:
bool HaveFile(string url)
{
    try
    {
        using (WebClient webClient = new WebClient())
        {
            webClient.DownloadString(url);
        }
        return true;
    }
    catch (Exception)
    {
        return false;
    }
}

WebRequest and System.Net.WebException on 404, slow?

I am using a WebRequest to check whether a web page or media file (image) exists. On GetResponse I get a System.Net.WebException. I ran through 100 links and it feels like it's going slower than it should. Is there a way to avoid this exception or handle it more gracefully?
static public bool CheckExist(string url)
{
    HttpWebRequest wreq = null;
    HttpWebResponse wresp = null;
    bool ret = false;
    try
    {
        wreq = (HttpWebRequest)WebRequest.Create(url);
        wreq.KeepAlive = true;
        wresp = (HttpWebResponse)wreq.GetResponse();
        ret = true;
    }
    catch (System.Net.WebException)
    {
    }
    finally
    {
        if (wresp != null)
            wresp.Close();
    }
    return ret;
}
Try setting
wreq.Method = "HEAD";
after the KeepAlive line. If the web server you are calling handles it properly, that tells it not to return any body content, which should save some time.
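Putting that together, here is a sketch of CheckExist with the HEAD method set and the response disposed via using. A 404 still raises a WebException, but its Response property (when present) carries the failed status if you ever need to distinguish "not found" from a network failure:
static public bool CheckExist(string url)
{
    try
    {
        HttpWebRequest wreq = (HttpWebRequest)WebRequest.Create(url);
        wreq.KeepAlive = true;
        wreq.Method = "HEAD"; // headers only, no body
        using (HttpWebResponse wresp = (HttpWebResponse)wreq.GetResponse())
        {
            return wresp.StatusCode == HttpStatusCode.OK;
        }
    }
    catch (WebException we)
    {
        // we.Response (when not null) holds the failed HttpWebResponse, e.g. a 404.
        var errorResponse = we.Response as HttpWebResponse;
        if (errorResponse != null)
            errorResponse.Close();
        return false;
    }
}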
