Test if a website is alive from a C# application

I am looking for the best way to test if a website is alive from a C# application.
Background
My application consists of a WinForms UI, a backend WCF service, and a website that publishes content to the UI and other consumers. To prevent the situation where the UI starts up and then fails to work properly because the WCF service or the website is down, I have added an app startup check to ensure that everything is alive.
The application is being written in C#, .NET 3.5, Visual Studio 2008
Current Solution
Currently I am making a web request to a test page on the website, which in turn tests the site and returns a result.
WebRequest request = WebRequest.Create("http://localhost/myContentSite/test.aspx");
WebResponse response = request.GetResponse();
I am assuming that if there are no exceptions thrown during this call then all is well and the UI can start.
Question
Is this the simplest, correct way, or is there some other call in C# that I don't know about, or a better way to do it?

You'll also want to check the status code on the response rather than relying on the absence of exceptions alone (note that GetResponse() throws a WebException for most non-success status codes):
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
if (response.StatusCode != HttpStatusCode.OK)
{
    // the site responded, but not with 200 OK
}
As @Yanga mentioned, HttpClient is probably the more common way to do this now:
using (HttpClient client = new HttpClient())
{
    var checkingResponse = await client.GetAsync(url);
    if (!checkingResponse.IsSuccessStatusCode)
    {
        return false;
    }
}

When using WebResponse, make sure that you close the response (i.e. call .Close()); otherwise, after enough repeated executions the requests will hang, because the pool of available connections gets exhausted.
For example:
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(sURL);
HttpWebResponse response = (HttpWebResponse)req.GetResponse();
// your code here
response.Close();

from the NDiagnostics project on CodePlex...
public override bool WebSiteIsAvailable(string Url)
{
    string Message = string.Empty;
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(Url);
    // Set the credentials to the current user account
    request.Credentials = System.Net.CredentialCache.DefaultCredentials;
    request.Method = "GET";
    try
    {
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            // Do nothing; we're only testing to see if we can get the response
        }
    }
    catch (WebException ex)
    {
        Message += ((Message.Length > 0) ? "\n" : "") + ex.Message;
    }
    return (Message.Length == 0);
}

Today we can update the answers to use HttpClient():
var client = new HttpClient();
var result = await client.GetAsync("https://stackoverflow.com");
int statusCode = (int)result.StatusCode;
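For an app startup check it can also help to wrap this in a helper with a short timeout, so a dead site fails fast instead of waiting out HttpClient's default 100-second timeout. A minimal sketch; the SiteChecker name and the timeout handling are illustrative, not from any answer above:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class SiteChecker
{
    // Returns true only if the site answered with a 2xx status within the timeout.
    public static async Task<bool> IsAliveAsync(string url, TimeSpan timeout)
    {
        using (var client = new HttpClient { Timeout = timeout })
        {
            try
            {
                using (var response = await client.GetAsync(url))
                {
                    return response.IsSuccessStatusCode;
                }
            }
            catch (HttpRequestException)
            {
                return false; // DNS failure, connection refused, etc.
            }
            catch (TaskCanceledException)
            {
                return false; // request timed out
            }
        }
    }
}
```

Usage at startup might look like `bool ok = await SiteChecker.IsAliveAsync("http://localhost/myContentSite/test.aspx", TimeSpan.FromSeconds(5));`.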

Assuming the WCF service and the website live in the same web app, you can use a "Status" WebService that returns the application status. You probably want to do some of the following:
Test that the database is up and running (good connection string, service is up, etc...)
Test that the website is working (how exactly depends on the website)
Test that WCF is working (how exactly depends on your implementation)
Added bonus: you can return some versioning info on the service if you need to support different releases in the future.
Then, you create a client for the WebService in the WinForms app. If the WS is not responding (i.e. you get an exception on invoke), then the website is down (a "general error").
If the WS responds, you can parse the result and make sure that everything works, or if something is broken, return more information.
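A rough sketch of the client side of that idea, assuming a hypothetical status page that returns a payload like "OK|1.2.3" (the payload format and names here are invented for illustration; the real endpoint would run its own database/WCF checks server-side):

```csharp
using System;
using System.IO;
using System.Net;

public static class StatusClient
{
    // Parses the assumed "OK|<version>" payload from the status page.
    public static bool TryParseStatus(string payload, out string version)
    {
        version = null;
        var parts = (payload ?? "").Split('|');
        if (parts.Length == 2 && parts[0] == "OK")
        {
            version = parts[1];
            return true;
        }
        return false; // something server-side reported broken
    }

    public static bool TryGetStatus(string statusUrl, out string version)
    {
        version = null;
        try
        {
            var request = WebRequest.Create(statusUrl);
            using (var response = request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                return TryParseStatus(reader.ReadToEnd(), out version);
            }
        }
        catch (WebException)
        {
            return false; // no response at all: the "general error" case
        }
    }
}
```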

You'll want to check the status code for OK (status 200).

Solution from: How do you check if a website is online in C#?
Note that Ping.Send takes a host name, not a URL, and only tests ICMP reachability of the host, not whether the web server is actually serving pages:
var ping = new System.Net.NetworkInformation.Ping();
var result = ping.Send("www.stackoverflow.com");
if (result.Status != System.Net.NetworkInformation.IPStatus.Success)
    return;

Related

Adding SSL cert causes 404 only in browser calls

I am working in an internal corporate environment. We have created a WebAPI installed in IIS on port 85. We call it from another MVC HelperApp on port 86. It all works as expected. Now we want to tighten security, so we add an SSL cert in IIS on port 444 and bind it to our API.
Initially we test it with Postman, SoapUI, and a C# console app and it all works. Now we try calling it from our MVC HelperApp and it returns a 404 sometimes.
Deeper debugging: I put the code into a C# DLL (see below). Using the console app, I call Dll.PostAPI and it works as expected. Then I call that same Dll.PostAPI from the MVC HelperApp and it won't work. When I step through the code, I get as far as the line await client.PostAsync(url, data); and the code bizarrely ends; it doesn't return and it doesn't throw an exception. The same happens for Post and Get. I figure it makes the call and nothing is returned: no response and no error.
Also, if I change the url to "https://httpbin.org/post" or to the open HTTP port 85 on IIS, it will work. I have concluded that the C# code is not the problem (but I'm open to being wrong).
Therefore I have come to the conclusion that for some reason the port or cert is refusing calls from browsers.
We are looking at:
- the "Subject Alternative Name", but all the examples show www addresses, which we are not using;
- the "Friendly Name" set at cert creation;
- CORS (Cross-Origin Resource Sharing).
These are all subjects we lack knowledge in.
This is the calling code used exactly the same in the console app and the web app:
var lib = new HttpsLibrary.ApiCaller();
lib.makeHttpsCall();
This is what's in the DLL that gets called:
public async Task<string> makeHttpsCall()
{
    try
    {
        List<Quote> quotes = new List<Quote>();
        quotes.Add(CreateDummyQuote());
        var json = JsonConvert.SerializeObject(quotes);
        var data = new StringContent(json, Encoding.UTF8, "application/json");
        var url = "https://httpbin.org/post"; //this works in Browser
        //url = "https://thepath:444//api/ProcessQuotes"; //444 DOES NOT WORK in browsers only. OK in console app.
        //url = "http://thepath:85/api/ProcessQuotes"; //85 works.
        var client = new HttpClient();
        var response = await client.PostAsync(url, data); //<<<this line never returns when called from browser.
        //var response = await client.GetAsync(url); //same outcome for Get or Post
        var result = await response.Content.ReadAsStringAsync();
        return result;
    }
    catch (Exception ex)
    {
        throw;
    }
}
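One detail worth checking, offered as a guess rather than a confirmed diagnosis: the calling code above (lib.makeHttpsCall();) discards the returned Task&lt;string&gt; instead of awaiting it, so the caller continues immediately and any exception thrown inside the method is lost, which would look exactly like "the code ends, no response and no error". A sketch of consuming an async method so its result and failures are observed (the helper name is invented; ConfigureAwait(false) is the standard way to avoid deadlocking on the ASP.NET synchronization context when a caller blocks on the task):

```csharp
using System;
using System.Threading.Tasks;

public static class AsyncCallHelper
{
    // Awaiting the task means both the result and any exception are observed,
    // unlike a fire-and-forget call that silently discards them.
    public static async Task<string> CallAndObserveAsync(Func<Task<string>> makeHttpsCall)
    {
        // Don't resume on the captured ASP.NET context; that capture is what
        // deadlocks when a caller higher up blocks with .Result or .Wait().
        return await makeHttpsCall().ConfigureAwait(false);
    }
}
```

In the MVC action this would mean `string result = await new HttpsLibrary.ApiCaller().makeHttpsCall();` rather than calling the method and ignoring the task.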

C# Check If Xampp Server/Localhost is Running

I'm using PHP to develop a freight application, but it has to be a WinForms application. Using Visual C#, I'm displaying web pages in a WebBrowser control, and the files are on the local server. Xampp needs to start all its services before my application runs so the PHP, etc. can be served. This works well.
The problem is: if the Xampp services aren't running, the browser shows the error "Navigation to the webpage was canceled". The reason for this is obvious (no server running).
So what I would like to do in C# is check whether the server is running, i.e. whether http://127.0.0.1/site is available. I don't want the employees to see the navigation error, so if the Xampp services aren't running or localhost is not accessible, I want to show a message box instead of attempting to load the page; if it is available, do nothing.
You probably want to craft a web request in .NET and read the response code, then use conditional logic to choose your action. Note that GetResponse() throws a WebException for error status codes such as 404, as well as when nothing is listening at all, so the check belongs in a try/catch:
WebRequest request = WebRequest.Create("http://127.0.0.1/site");
// If required by the server, set the credentials.
request.Credentials = CredentialCache.DefaultCredentials;
try
{
    // Get the response.
    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    {
        // the server responded; carry on and load the page
    }
}
catch (WebException)
{
    // server not running or page not found: show the message box here
}
Pertinent documentation: WebRequest Class Documentation
My code is untested but it should do what you need.
With help from Daniel Lane I got it to work.
using System.IO;
using System.Web;
using System.Net;

WebRequest request = WebRequest.Create("http://127.0.0.1/site");
try
{
    using (WebResponse response = request.GetResponse())
    {
        //Load Webpage
    }
}
catch (WebException erf)
{
    using (WebResponse response = erf.Response)
    {
        var errorForm = new error();
        errorForm.Show();
        this.Close();
    }
}
Everything from "WebRequest..." onward is inside an event handler (Load).
Use HttpWebRequest to try to pull some page from your website and if successful you can also check the StatusCode on the HttpWebResponse. Should be 200 or similar.
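Pulling the pieces of this thread together, a compact helper for the Xampp case might look like the sketch below; the 3-second timeout is an arbitrary choice so the check fails fast when nothing is listening:

```csharp
using System;
using System.Net;

public static class LocalServerCheck
{
    // GetResponse() throws a WebException both when nothing is listening
    // and for HTTP error codes, so one try/catch covers "Xampp not running".
    public static bool IsSiteAvailable(string url)
    {
        try
        {
            var request = WebRequest.Create(url);
            request.Timeout = 3000; // milliseconds; the default is 100 seconds
            using (request.GetResponse())
            {
                return true;
            }
        }
        catch (WebException)
        {
            return false;
        }
    }
}
```

Usage: `if (!LocalServerCheck.IsSiteAvailable("http://127.0.0.1/site")) { /* show the message box */ }`.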

Getting headers from WebRequest

Technically I'm asking a question for a friend who writes VB, but I post as C# since more people are on it. And I personally know neither.
I'm helping him connect to a Mobile Backend as a Service, although the way he has set it up, he is connecting on behalf of someone loading his own web page built with ASP.NET (I think).
I'm connecting to the service just fine using Python. But he is getting a 422 server response. I would like to compare the request header & content difference between his and mine.
According to Chris Doggett's post on this page down below, you can't get the headers until the request is actually sent. However, as soon as request.GetResponse() is called, Visual Studio (or the Express edition, not sure) seems to halt there as if on a breakpoint, reporting a 422 error and some error message from the browser. So he can't get to the next line, where he wishes to print out the headers.
Two questions:
Is that some sort of debugger setting turned on? I thought a 422 response is still a response, and the program shouldn't just stop there.
How do I print out the content as well, not just the headers? Preferably, I want to print out the entire request as text. There is some data sent as JSON, and I don't think that belongs to the headers, but I'm not so sure.
The Create method will return an HttpWebRequest for an http/https URL. The 422 status code indicates that you are sending incorrectly formed data to the server. GetResponse() will throw a WebException because you don't receive status code 200.
To get the actual headers and content of the error response, you need to handle the exception:
using System;
using System.IO;
using System.Net;

private static void Main(string[] args)
{
    WebRequest request = WebRequest.Create("http://google.com/12345"); //generate 404
    try
    {
        WebResponse response = request.GetResponse();
    }
    catch (WebException ex)
    {
        HttpWebResponse errorResponse = ex.Response as HttpWebResponse;
        if (errorResponse == null)
            throw; //errorResponse not of type HttpWebResponse
        string responseContent = "";
        using (StreamReader r = new StreamReader(errorResponse.GetResponseStream()))
        {
            responseContent = r.ReadToEnd();
        }
        Console.WriteLine("The server at {0} returned {1}", errorResponse.ResponseUri, errorResponse.StatusCode);
        Console.WriteLine("With headers:");
        foreach (string key in errorResponse.Headers.AllKeys)
        {
            Console.WriteLine("\t{0}:{1}", key, errorResponse.Headers[key]);
        }
        Console.WriteLine(responseContent);
    }
    Console.ReadLine();
}

Http Requests Timing Out in Production but not Dev

Below is an example of a website that returns OK when requested from my local dev environment, but when requested from the production server the request times out after 15 seconds. The request headers are exactly the same. Any ideas?
http://www.dealsdirect.com.au/p/wall-mounted-fish-tank-30cm/
Here's one thing I want to point out besides the other stuff I've already provided. When you call GetResponse, the object that's returned has to be disposed of ASAP. Otherwise stalling will occur: the next call will block and possibly time out, because there's a limit to the number of concurrent requests that can go through the HTTP request engine in System.Net.
// The request doesn't matter; it's not holding on to any critical resources
var request = (HttpWebRequest)WebRequest.Create(url);
// The response however is: these will eventually be reclaimed by the GC,
// but you'll run into problems similar to deadlocks if you don't dispose
// them yourself when you have many of them
using (var response = request.GetResponse())
{
    // Do stuff with `response` here
}
This is my old answer
This question is really hard to answer without knowing more specifics. There's no reason why IIS would behave like this, which leads me to conclude that the problem has to do with something your app is doing, but I know nothing about it. If you can reproduce the problem with a debugger attached, you might be able to track down where it is occurring; if you cannot, there's little I can do to help.
Are you using the ASP.NET Development Server or IIS Express in development?
If this is an issue with proxies, here's a factory method I use to set up HTTP requests where the proxy requires authentication (though I don't believe I ever received a timeout):
HttpWebRequest CreateRequest(Uri url)
{
    var request = (HttpWebRequest)WebRequest.Create(url);
    request.Timeout = 120 * 1000;
    request.CookieContainer = _cookieContainer;
    if (UseSystemWebProxy)
    {
        var proxy = WebRequest.GetSystemWebProxy();
        if (UseDefaultCredentials)
        {
            proxy.Credentials = CredentialCache.DefaultCredentials;
        }
        if (UseNetworkCredentials != null
            && UseNetworkCredentials.Length > 0)
        {
            var networkCredential = new NetworkCredential();
            networkCredential.UserName = UseNetworkCredentials[0];
            if (UseNetworkCredentials.Length > 1)
            {
                networkCredential.Password = UseNetworkCredentials[1];
            }
            if (UseNetworkCredentials.Length > 2)
            {
                networkCredential.Domain = UseNetworkCredentials[2];
            }
            proxy.Credentials = networkCredential;
        }
        request.Proxy = proxy;
    }
    return request;
}
Try this out Adrian and let me know how it goes.

What is the best way to download files via HTTP using .NET?

In one of my applications I'm using the WebClient class to download files from a web server. Depending on the web server, the application sometimes downloads millions of documents. When there are a lot of documents, performance-wise the WebClient doesn't seem to scale up well.
It also seems that the WebClient doesn't immediately close the connection it opened to the web server, even after it has successfully downloaded the particular document.
I would like to know what other alternatives I have.
Update:
Also, I noticed that for each and every download WebClient performs the authentication handshake. I was expecting to see this handshake once, since my application only communicates with a single web server. Shouldn't subsequent WebClient calls reuse the authentication session?
Update: My application also calls some web service methods, and for these calls the authentication session does seem to be reused. I'm using WCF to communicate with the web service.
I think you can still use WebClient. However, you are better off using a "using" block as good practice. This will make sure that the object is closed and disposed of:
using(WebClient client = new WebClient()) {
// Use client
}
I bet you are running into the default limit of 2 connections per server. Try running this line at the beginning of your program:
// Raise the default limit of 2 concurrent connections per server
System.Net.ServicePointManager.DefaultConnectionLimit = 100;
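The same limit can also be raised declaratively in app.config instead of in code, if that suits the deployment better; this is the standard System.Net connectionManagement setting, shown here applying to all servers via address="*":

```xml
<configuration>
  <system.net>
    <connectionManagement>
      <add address="*" maxconnection="100" />
    </connectionManagement>
  </system.net>
</configuration>
```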
I have noticed the same behavior with the session in another project I was working on. To solve this "problem" I did use a static CookieContainer (since the session of the client is recognized by a value saved in a cookie).
public static class SomeStatics
{
    private static CookieContainer _cookieContainer;

    public static CookieContainer CookieContainer
    {
        get
        {
            if (_cookieContainer == null)
            {
                _cookieContainer = new CookieContainer();
            }
            return _cookieContainer;
        }
    }
}

public class CookieAwareWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        if (request is HttpWebRequest)
        {
            (request as HttpWebRequest).CookieContainer = SomeStatics.CookieContainer;
            (request as HttpWebRequest).KeepAlive = false;
        }
        return request;
    }
}
//now the code that will download the file
using (WebClient client = new CookieAwareWebClient())
{
    client.DownloadFile("http://address.com/somefile.pdf", @"c:\temp\savedfile.pdf");
}
The code is just an example, inspired by Using CookieContainer with WebClient class and C# get rid of Connection header in WebClient.
The above code will close your connection immediately after the file is downloaded, and it will reuse the authentication session.
WebClient is probably the best option. It doesn't close the connection straight away for a reason: so it can use the same connection again without having to open a new one. If you find that it's not reusing the connection as expected, that's usually because you're not calling Close() on the response from the previous request:
var request = WebRequest.Create("...");
// populate parameters
var response = request.GetResponse();
// process response
response.Close(); // <-- make sure you don't forget this!
