How to detect trusted sites using HttpWebRequest in a web service? - C#

I have a C# console application targeting the .NET Framework 2.0, built with Visual Studio 2005. The odd thing is that the exact same code works on my PC but fails on the server (where it runs as a web service).
When I call HttpWebRequest.GetResponse on a URL, the PC version works fine, but the web service version returns an error:
(404) Not Found. The URL is in IE's Trusted Sites zone.
URLs in the Internet and Local Intranet zones are detected successfully; only Trusted Sites fail.
I don't know why.
This is the code:
private bool getSiteConnStatus(string url)
{
    bool result = true;
    Uri uri = new Uri(url);
    try
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
        WebProxy SCProxy = new WebProxy("123.123.123.123");
        SCProxy.Credentials = new NetworkCredential();
        request.Proxy = SCProxy;
        HttpWebResponse response = (HttpWebResponse)request.GetResponse();
        if (response == null || response.StatusCode != HttpStatusCode.OK)
        {
            result = false;
        }
        response.Close();
    }
    catch (Exception ex)
    {
        result = false;
    }
    return result;
}
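Before the fix below, one way to narrow down where the 404 comes from is to stop swallowing the exception and look at the failed response. The following is only a diagnostic sketch, not part of the original code (the probeSite name and the console logging are illustrative); it prints the status code and body of the error response, which usually makes it obvious whether the proxy or the target site produced the 404.
// Diagnostic sketch, not part of the original code. Requires System,
// System.IO and System.Net. Logs the status code and body of the failed
// response so a proxy-generated 404 can be told apart from one returned
// by the target site.
private void probeSite(string url, IWebProxy proxy)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.Proxy = proxy;
    try
    {
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            Console.WriteLine("OK: " + response.StatusCode);
        }
    }
    catch (WebException ex)
    {
        HttpWebResponse errorResponse = ex.Response as HttpWebResponse;
        if (errorResponse == null)
        {
            Console.WriteLine("No response at all: " + ex.Status);
            return;
        }
        Console.WriteLine("Error status: " + errorResponse.StatusCode);
        using (StreamReader reader = new StreamReader(errorResponse.GetResponseStream()))
        {
            // An HTML error page from the proxy here usually means the proxy,
            // not the site, produced the 404.
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}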
~~~~~~~~~~
I've solved it.
The proxy information has to change depending on which site is being called, so I added the lines marked below.
Reference site
private bool getSiteConnStatus(string url)
{
    bool result = true;
    Uri uri = new Uri(url);
    try
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
        // --- added lines: pick the proxy per target host ---
        WebProxy SCProxy;
        if (request.Address.Host == "test.net")
        {
            SCProxy = new WebProxy("111.111.111.111", 8080);
        }
        else
        {
            SCProxy = new WebProxy("123.123.123.123", true);
        }
        request.Proxy = SCProxy;
        // --- end of added lines ---
        HttpWebResponse response = (HttpWebResponse)request.GetResponse();
        if (response == null || response.StatusCode != HttpStatusCode.OK)
        {
            result = false;
        }
        response.Close();
    }
    catch (Exception ex)
    {
        result = false;
    }
    return result;
}

Try accessing the URL from a browser on the server. If it works, the browser is configured to communicate with the outside world over the given ports.
Check your firewall policies and whether they block connections from specific user-agent strings.
Also verify proxy authentication. When you try it from the browser (I assume IE here), the browser may supply credentials automatically, for example via NTLM, where your program might not.
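If proxy authentication turns out to be the difference, a minimal sketch of sending the current Windows account to the proxy (assuming the proxy accepts NTLM/Negotiate; the address is just the placeholder from the question) looks like this:
// Sketch only: pass the process's Windows credentials to the proxy,
// which is roughly what IE does automatically for NTLM authentication.
// "123.123.123.123" is the placeholder proxy address from the question.
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
WebProxy authenticatedProxy = new WebProxy("123.123.123.123");
authenticatedProxy.Credentials = CredentialCache.DefaultCredentials; // instead of new NetworkCredential()
request.Proxy = authenticatedProxy;
request.UseDefaultCredentials = true; // also authenticate to the site itself, if it requires Windows auth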

Related

Is there another solution to WebResponse 403 error?

There are many posts about WebResponse 403 errors, but my situation is a little different. I have created a console application that runs as a scheduled task on my server. The console application passes user emails in a WebRequest and waits for the WebResponse to receive a URI with the returned parameters. The code below worked perfectly a few days ago, but one of the other programmers added a new parameter for a return web address. I know for a fact that this is what causes the 403 error, because if I paste the URI with the new parameter into IE, it works. But since I have a console application, a return web address is something I cannot provide, at least I don't think so.
Unfortunately the programmer said he cannot change it back, but that there is a way to receive the URI or the entire page content and process it that way. I still have no clue what he was talking about, because StreamReader requires a WebResponse, and so do pretty much all other solutions I could think of.
Even though I get a 403 error, the response still contains the URI with the parameters I need, because I can see it in IE's address bar. So all I need is the response URI. I would appreciate any help you have to offer. Below is the method giving me problems.
String employeeInfo = "";
try
{
    HttpWebRequest request = (HttpWebRequest)WebRequest
        .Create("http://example.com/subsub.aspx?instprod=xxx&vabid=emailaddress");
    using (HttpWebResponse webResponse =
        (HttpWebResponse)request.GetResponse()) //Error occurs here. 403 Forbidden
    {
        Uri myUri = new Uri(webResponse.ResponseUri.ToString());
        String queryParamerter = myUri.Query;
        employeeInfo = HttpUtility.ParseQueryString(queryParamerter).Get("vres");
        if (employeeInfo != "N/A")
        {
            return employeeInfo;
        }
        else
        {
            employeeInfo = "0";
            return employeeInfo;
        }
    }
}
catch (WebException)
{
    employeeInfo = "0";
    return employeeInfo;
}
Let's follow Jim Mischel's idea. We'll handle the WebException and use the Response property of the exception.
String employeeInfo = "";
try
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://example.com/subsub.aspx?instprod=xxx&vabid=emailaddress");
    using (HttpWebResponse webResponse = (HttpWebResponse)request.GetResponse()) //Error occurs here. 403 Forbidden
    {
        Uri myUri = new Uri(webResponse.ResponseUri.ToString());
        String queryParamerter = myUri.Query;
        employeeInfo = HttpUtility.ParseQueryString(queryParamerter).Get("vres");
        if (employeeInfo != "N/A")
        {
            return employeeInfo;
        }
        else
        {
            employeeInfo = "0";
            return employeeInfo;
        }
    }
}
catch (WebException ex)
{
    // ex.Response can be null (e.g. on a timeout or DNS failure),
    // so check for that before looking at the status code.
    HttpWebResponse response = ex.Response as HttpWebResponse;
    if (response == null || response.StatusCode != HttpStatusCode.Forbidden)
    {
        throw;
    }
    // The 403 response still carries the redirected URI, so the query
    // string can be parsed from it just like in the success path.
    Uri myUri = new Uri(response.ResponseUri.ToString());
    String queryParamerter = myUri.Query;
    employeeInfo = HttpUtility.ParseQueryString(queryParamerter).Get("vres");
    if (employeeInfo != "N/A")
    {
        return employeeInfo;
    }
    else
    {
        employeeInfo = "0";
        return employeeInfo;
    }
}

What is the best way to use .PAC in C# when request is redirected

Background
My application has to use the proxy server specified by the user.
Right now, I check whether the input proxy string contains ".pac". If it does, I resolve the real proxy from the PAC file (http://www.codeproject.com/Articles/12168/Using-PAC-files-proxy); otherwise, I just wrap the input proxy in a WebProxy.
public HttpWebRequest CreateRequest(string url, string proxy)
{
    var request = WebRequest.Create(url);
    if (proxy.ToLower().Contains(".pac"))
    {
        string realProxy = GetProxyFromPac(url, proxy);
        request.Proxy = (string.IsNullOrEmpty(realProxy))
            ? new WebProxy()
            : new WebProxy(realProxy, true);
    }
    else
    {
        request.Proxy = new WebProxy(proxy, true);
    }
    return request as HttpWebRequest;
}
And use the request like this.
var request = CreateRequest(url, proxy);
var response = request.GetResponse() as HttpWebResponse;
Problem
When the server redirects the URL to another URL, I get a request timeout.
Example:
The input URL is URL and the input proxy is PAC_P.
GetProxyFromPac(URL, PAC_P) returns P.
The server redirects to REAL_URL, and GetProxyFromPac(REAL_URL, PAC_P) returns PP.
I found that this happens because I set the proxy of the request to P, and that proxy is used for all URLs, including the redirected one, but REAL_URL is not reachable through P.
I need to resolve PP from REAL_URL and PAC_P, and use PP for the request to REAL_URL.
The first solution in my head is to get a new proxy every time the request is redirected and manually request to the redirected url.
var request = CreateRequest(url, proxy);
request.AllowAutoRedirect = false;
var response = request.GetResponse() as HttpWebResponse;
while (response.StatusCode == HttpStatusCode.Redirect ||
       response.StatusCode == HttpStatusCode.Moved)
{
    request = CreateRequest(response.Headers["Location"], proxy);
    request.AllowAutoRedirect = false; // keep handling redirects manually
    response = request.GetResponse() as HttpWebResponse;
}
Question
I think there should be an elegant way to handle this. It seems to be an extremely general case. Do you have any idea?
You can implement your own IWebProxy class like this:
public class MyPacScriptProxy : IWebProxy
{
    protected string PacScriptUrl;

    public MyPacScriptProxy(string url)
    {
        PacScriptUrl = url;
    }

    public ICredentials Credentials { get; set; }

    public Uri GetProxy(Uri dest)
    {
        // you can return your GetProxyFromPac(dest, PacScriptUrl) result here
        if (dest.Host.EndsWith(".net"))
        {
            return null; // bypass proxy for .net websites
        }
        return new Uri("http://localhost:8888");
    }

    public bool IsBypassed(Uri host)
    {
        return false;
    }
}
then use it like this:
var request = WebRequest.Create("http://www.google.com");
request.Proxy = new MyPacScriptProxy("http://localhost/proxy.pac");
Hope this helps :)

Redirect to https after ensuring ssl availability

I'm using this code to redirect
If Not Request.IsSecureConnection Then
    Dim url As String = Request.Url.ToString().Replace("http:", "https:")
    Response.Redirect(url)
    Exit Sub
End If
Now I want to redirect to HTTPS only if SSL is available on my website.
So how can I check SSL availability?
The function from https://stackoverflow.com/a/5378470/2012977 can help check the availability of a URL. See the example below.
public bool UrlExists(string url)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.Timeout = 15000;
    request.Method = "HEAD";
    try
    {
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            return response.StatusCode == HttpStatusCode.OK;
        }
    }
    catch (WebException)
    {
        return false;
    }
}
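To tie this back to the redirect, a rough sketch (the helper above is C#, so this usage example is too; Page_Load and its placement in a page class are assumptions, not from the original post) would probe the HTTPS variant of the current URL and only redirect when that probe succeeds:
// Sketch only: redirect to HTTPS only when the HTTPS variant of the current
// URL actually answers. Assumes UrlExists from above is available in the
// same page class.
protected void Page_Load(object sender, EventArgs e)
{
    if (!Request.IsSecureConnection)
    {
        string httpsUrl = Request.Url.ToString().Replace("http:", "https:");
        if (UrlExists(httpsUrl))
        {
            Response.Redirect(httpsUrl);
        }
    }
}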

WebRequest and protocol agnostic URL

In a web application I have to first check if an image exists and then display this image or a dummy image.
I use the following code and it works for URLs like:
"http://www.somedomain.com/niceimage.png"
"https://www.somedomain.com/niceimage.png".
public virtual bool WebResourceExists(string url)
{
    WebHeaderCollection headers = null;
    WebResponse response = null;
    try
    {
        WebRequest request = WebRequest.Create(url);
        request.Method = "HEAD";
        response = request.GetResponse();
        headers = response.Headers;
        bool result = int.Parse(headers["Content-Length"]) > 0;
        return result;
    }
    catch (System.Net.WebException)
    {
        return false;
    }
    catch (Exception e)
    {
        _log.Error(e);
        return false;
    }
    finally
    {
        if (response != null)
        {
            response.Close();
        }
    }
}
In some places the method is called with protocol-agnostic URLs like "//www.somedomain.com/niceimage.png".
For such URLs an exception is thrown:
System.InvalidCastException: Unable to cast object of type
'System.Net.FileWebRequest' to type 'System.Net.HttpWebRequest'
Is there a way to use protocol-agnostic URLs other than just prepending "http:" to the URL?
Protocol-agnostic URLs are resolved by the browser using the current protocol, and are used to avoid making HTTP requests from an HTTPS page.
Code executing on the server doesn't really have a concept of a "current protocol". Whilst ASP.NET can determine whether the current request was issued over HTTP or HTTPS, the WebRequest classes are not restricted to ASP.NET applications, so they cannot rely on this.
You will need to specify the protocol. Whether you use HTTP or HTTPS will depend on whether you're concerned about third parties eavesdropping on the connection between your server and "www.somedomain.com".
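If the check runs inside an ASP.NET request, one option (a sketch under that assumption; the ResolveScheme helper is mine, not part of the original code) is to complete the URL with the scheme of the current request before creating the WebRequest:
// Sketch only: turn a protocol-agnostic URL into an absolute one before
// calling WebRequest.Create. Uses the scheme of the current ASP.NET request
// when available and falls back to "https" otherwise.
private static string ResolveScheme(string url)
{
    if (url.StartsWith("//"))
    {
        string scheme = (System.Web.HttpContext.Current != null)
            ? System.Web.HttpContext.Current.Request.Url.Scheme // "http" or "https"
            : "https"; // assumed default outside a request
        return scheme + ":" + url;
    }
    return url;
}
WebResourceExists could then call ResolveScheme(url) before WebRequest.Create, which also avoids the FileWebRequest cast error, since a bare "//host/path" on its own is treated as a file URI (hence the FileWebRequest in the exception above).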
What about a two-step process: check the HTTP version, and if it doesn't exist, check HTTPS. I've quickly hacked together a basic example of how this could work, but I'm not able to test it properly, so it might need some tidying up/refactoring!
public virtual bool WebResourceExists(string url)
{
    WebHeaderCollection headers = null;
    WebResponse response = null;
    try
    {
        if (url.StartsWith("//"))
        {
            url = "http:" + url;
        }
        WebRequest request = WebRequest.Create(url);
        request.Method = "HEAD";
        response = request.GetResponse();
        headers = response.Headers;
        bool result = int.Parse(headers["Content-Length"]) > 0;
        return result;
    }
    catch (System.Net.WebException)
    {
        // The HTTP attempt failed; retry once over HTTPS.
        if (url.StartsWith("http://"))
        {
            url = url.Replace("http://", "https://");
        }
        else
        {
            return false;
        }
        try
        {
            WebRequest request = WebRequest.Create(url);
            request.Method = "HEAD";
            response = request.GetResponse();
            headers = response.Headers;
            bool result = int.Parse(headers["Content-Length"]) > 0;
            return result;
        }
        catch (System.Net.WebException)
        {
            return false;
        }
    }
    catch (Exception e)
    {
        _log.Error(e);
        return false;
    }
    finally
    {
        if (response != null)
        {
            response.Close();
        }
    }
}

Verify file exists on website

I am attempting to make a simple function that verifies that a specific file exists on a website.
The web request's method is set to HEAD so I can get the file length instead of downloading the entire file, but I get an "Unable to connect to the remote server" exception.
How can I verify a file exists on a website?
WebRequest w;
WebResponse r;
w = WebRequest.Create("http://website.com/stuff/images/9-18-2011-3-42-16-PM.gif");
w.Method = "HEAD";
r = w.GetResponse();
Edit: my bad, it turns out my firewall was blocking HTTP requests, which I found after checking the log.
It didn't prompt me for an exception rule, so I assumed it was a bug.
I've tested this and it works fine:
private bool testRequest(string urlToCheck)
{
    var wreq = (HttpWebRequest)WebRequest.Create(urlToCheck);
    //wreq.KeepAlive = true;
    wreq.Method = "HEAD";
    HttpWebResponse wresp = null;
    try
    {
        wresp = (HttpWebResponse)wreq.GetResponse();
        return (wresp.StatusCode == HttpStatusCode.OK);
    }
    catch (Exception exc)
    {
        System.Diagnostics.Debug.WriteLine(String.Format("url: {0} not found", urlToCheck));
        return false;
    }
    finally
    {
        if (wresp != null)
        {
            wresp.Close();
        }
    }
}
Try it with this URL: http://www.centrosardegna.com/images/losa/losaabbasanta.png then modify the image name and it will return false. ;-)
try
{
    WebRequest request = HttpWebRequest.Create("http://www.microsoft.com/NonExistantFile.aspx");
    request.Method = "HEAD"; // Just get the document headers, not the data.
    request.Credentials = System.Net.CredentialCache.DefaultCredentials;
    // This may throw a WebException:
    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    {
        if (response.StatusCode == HttpStatusCode.OK)
        {
            // If no exception was thrown until now, the file exists and we
            // are allowed to read it.
            MessageBox.Show("The file exists!");
        }
        else
        {
            // Some other HTTP response - probably not good.
            // Check its StatusCode and handle it.
        }
    }
}
catch (WebException ex)
{
    // Cast the WebResponse so we can check the StatusCode property.
    // ex.Response can be null (e.g. when the connection itself fails).
    HttpWebResponse webResponse = ex.Response as HttpWebResponse;
    // Determine the cause of the exception, was it 404?
    if (webResponse != null && webResponse.StatusCode == HttpStatusCode.NotFound)
    {
        MessageBox.Show("The file does not exist!");
    }
    else
    {
        // Handle differently...
        MessageBox.Show(ex.Message);
    }
}
