C# Check If XAMPP Server/Localhost is Running

I'm using PHP to develop a freight application, but it has to be a Windows Forms application. Using Visual C#, I'm displaying web pages in a webBrowser control, and the files live on the local server. XAMPP needs to start all of its services before my application runs so that the PHP pages can be served. This works well.
The problem: if the XAMPP services aren't running, the browser control shows the error "Navigation to the webpage was canceled". The reason is obvious (no server running).
So what I would like to do in C# is check whether the server is running, i.e. whether http://127.0.0.1/site is reachable. I don't want employees to see the browser error, so if the XAMPP services aren't running or localhost is not accessible, I want to show an error MessageBox instead of attempting to load the page; if it is available, do nothing else.

You probably want to craft a web request in .NET and read the response code, then use conditional logic to choose your action.
WebRequest request = WebRequest.Create("http://127.0.0.1/site");
// If required by the server, set the credentials.
request.Credentials = CredentialCache.DefaultCredentials;
// Get the response.
// Note: GetResponse() throws a WebException when the server is unreachable
// or returns an error status, so in practice wrap this in a try/catch.
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
if (response.StatusCode == HttpStatusCode.NotFound)
{
    // show the MessageBox here
}
Pertinent documentation: the WebRequest class documentation.
My code is untested, but it should do what you need.

With help from Daniel Lane I got it to work.
using System.Net;

WebRequest request = WebRequest.Create("http://127.0.0.1/site");
try
{
    using (WebResponse response = request.GetResponse())
    {
        // Load the web page
    }
}
catch (WebException erf)
{
    // The response on the exception may be null; the using block handles that.
    using (WebResponse response = erf.Response)
    {
        var errorForm = new error();
        errorForm.Show();
        this.Close();
    }
}
Everything from "WebRequest..." onward runs in the form's Load event.

Use HttpWebRequest to try to pull a page from your website; if the request succeeds, you can also check the StatusCode on the HttpWebResponse. It should be 200 or similar.
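A minimal sketch of that check against the asker's local URL (the HEAD method and the MessageBox text are my additions; GetResponse() throws a WebException on connection failure or an error status, so the catch is what actually detects the server being down):
bool serverUp = false;
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://127.0.0.1/site");
request.Method = "HEAD"; // we only care about the status, not the body
try
{
    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    {
        serverUp = response.StatusCode == HttpStatusCode.OK;
    }
}
catch (WebException)
{
    // Connection refused (XAMPP not running) or an error status code.
    serverUp = false;
}
if (!serverUp)
{
    MessageBox.Show("The local server is not running. Please start the XAMPP services.");
}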

Related

Why does my WebClient return a 404 error most of the time, but not always?

I want to get information about a Microsoft Update in my program. However, the server returns a 404 error about 80% of the time. I boiled the problematic code down to this console application:
using System;
using System.Net;

namespace WebBug
{
    class Program
    {
        static void Main(string[] args)
        {
            while (true)
            {
                try
                {
                    WebClient client = new WebClient();
                    Console.WriteLine(client.DownloadString("https://support.microsoft.com/api/content/kb/3068708"));
                }
                catch (Exception ex)
                {
                    Console.WriteLine(ex.Message);
                }
                Console.ReadKey();
            }
        }
    }
}
When I run the code, I have to get through the loop a few times until I get an actual response:
The remote server returned an error: (404) Not found.
The remote server returned an error: (404) Not found.
The remote server returned an error: (404) Not found.
<div kb-title title="Update for customer experience and diagnostic telemetry [...]
I can open and force-refresh (Ctrl + F5) the link in my browser as often as I want, and it loads fine every time.
The problem occurs on two different machines with two different internet connections.
I've also tested this case using the Html Agility Pack, with the same result.
The problem does not occur with other websites. (The root https://support.microsoft.com works fine 100% of the time.)
Why do I get this weird result?
Cookies. It's because of cookies.
As I started to dig into this problem I noticed that the first time I opened the site in a new browser I got a 404, but after refreshing (sometimes once, sometimes a few times) the site continued to work.
That's when I busted out Chrome's Incognito mode and the developer tools.
There wasn't anything too fishy with the network: there was a simple redirect to the https version if you loaded http.
But what I did notice was that the cookies changed. On the first load the site set only a few cookies; after a refresh (or a few), several more cookie entries were added. (The original answer showed before-and-after screenshots of the browser's cookie list here; the relevant values appear in the code below.)
The site must be trying to read those extra cookies, not finding them, and "blocking" you. This might be a bot-prevention device or bad programming, I'm not sure.
Anyways, here's how to make your code work. This example uses the HttpWebRequest/Response, not WebClient.
string url = "https://support.microsoft.com/api/content/kb/3068708";

// This holds all the cookies we need to add.
// The values match the ones observed in the browser's cookie list.
CookieContainer cookieJar = new CookieContainer();
cookieJar.Add(new Cookie("SMCsiteDir", "ltr", "/", ".support.microsoft.com"));
cookieJar.Add(new Cookie("SMCsiteLang", "en-US", "/", ".support.microsoft.com"));
cookieJar.Add(new Cookie("smc_f", "upr", "/", ".support.microsoft.com"));
cookieJar.Add(new Cookie("smcexpsessionticket", "100", "/", ".microsoft.com"));
cookieJar.Add(new Cookie("smcexpticket", "100", "/", ".microsoft.com"));
cookieJar.Add(new Cookie("smcflighting", "wwp", "/", ".microsoft.com"));

HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
// Attach the cookie container...
request.CookieContainer = cookieJar;
// ...and now go to the internet, fetching back the contents.
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
using (StreamReader sr = new StreamReader(response.GetResponseStream()))
{
    string site = sr.ReadToEnd();
}
If you remove the request.CookieContainer = cookieJar;, it will fail with a 404, which reproduces your issue.
Most of the legwork for the code example came from this post and this post.

HttpWebRequest status code returns 200 instead of 404

I'm trying this code:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://www.goo4le.com/");
request.Method = "HEAD";
using (HttpWebResponse response = request.GetResponse() as HttpWebResponse)
{
    Console.Write((int)response.StatusCode);
}
goo4le is a non-existent domain, so it's supposed to return 404. Instead it returns a 200 status.
I think it's because my broadband provider uses a custom 404 page; that page is what I see when I enter goo4le.com in my browser.
Can someone tell me how to get the real HTTP status instead of my browser's status?
I actually don't get any status code when running this; I get a DNS error saying it can't look up the domain.
I imagine you are exactly right about the ISP; they may be doing this via DNS redirection, given that I don't see the behaviour you describe. You could solve this by using a DNS server other than the one your ISP provides; try Google's 8.8.8.8 and 8.8.4.4 (https://developers.google.com/speed/public-dns/).
This is from their FAQs:

How is Google Public DNS different from my ISP's DNS service or other open DNS resolvers? How can I tell if it is better?

Open resolvers and your ISP all offer DNS resolution services. We invite you to try Google Public DNS as your primary or secondary DNS resolver along with any other alternate DNS services. There are many things to consider when identifying a DNS resolver that works for you, such as speed, reliability, security, and validity of responses. Unlike Google Public DNS, some ISPs and open resolvers block, filter, or redirect DNS responses.

How does Google Public DNS handle non-existent domains?

If you issue a query for a domain name that does not exist, Google Public DNS always returns an NXDOMAIN record, as per the DNS protocol standards. The browser should show this response as a DNS error. If, instead, you receive any response other than an error message (for example, you are redirected to another page), this could be the result of the following: A client-side application such as a browser plug-in is displaying an alternate page for a non-existent domain. Some ISPs may intercept and replace all NXDOMAIN responses with responses that lead to their own servers. If you are concerned that your ISP is intercepting Google Public DNS requests or responses, you should contact your ISP.
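If you can't change the machine's DNS settings, one option (my suggestion, not from the original answers) is to ask Google's DNS-over-HTTPS JSON API directly what an honest resolver would say. This sketch assumes https://dns.google/resolve is reachable from your network and does only a crude string check on the returned JSON (a Status of 3 means NXDOMAIN):
using (var client = new WebClient())
{
    // Query Google's resolver over HTTPS, bypassing the ISP's DNS server.
    string json = client.DownloadString("https://dns.google/resolve?name=www.goo4le.com&type=A");
    // Crude check; a real JSON parser would be more robust.
    bool nxdomain = json.Contains("\"Status\":3");
    Console.WriteLine(nxdomain
        ? "NXDOMAIN: the domain does not exist, whatever the ISP page says."
        : "The domain resolves; an HTTP request should reach a real server.");
}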
You can try disabling redirects; unfortunately I can't test this myself, since I'm with another provider (if that's indeed the problem).
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://www.goo4le.com/");
request.Method = "HEAD";
request.AllowAutoRedirect = false;
using (HttpWebResponse response = request.GetResponse() as HttpWebResponse)
{
    Console.Write((int)response.StatusCode);
}
If this works, it will most likely throw an exception, since GetResponse() throws a WebException for any 4xx or 5xx status code.
If you want to catch it, try this:
try
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://www.goo4le.com/");
    request.Method = "HEAD";
    request.AllowAutoRedirect = false;
    using (HttpWebResponse response = request.GetResponse() as HttpWebResponse)
    {
        Console.Write((int)response.StatusCode);
    }
}
catch (WebException e)
{
    // In this case it was a status-code exception (not status 200).
    // WebException.Response is a WebResponse, so cast it to read the status.
    if (e.Response != null) Console.Write((int)((HttpWebResponse)e.Response).StatusCode);
    else throw;
}

Why would C# HttpWebRequest return 500 error on ResponseStream, but not with PHP?

I would be very grateful if anyone could help me out with this problem. I've got some C# code which reads in the contents of a web page for parsing later on. The code is:
private StringReader ReadInUrl(string url)
{
    string result = string.Empty;
    System.Net.HttpWebRequest request = (System.Net.HttpWebRequest)System.Net.WebRequest.Create(url);
    request.Method = "GET";
    using (var stream = request.GetResponse().GetResponseStream())
    using (var reader = new StreamReader(stream, Encoding.UTF8))
    {
        result = reader.ReadToEnd();
    }
    return new StringReader(result);
}
The code works fine with most pages but throws 'The remote server returned an error: (500) Internal Server Error.' with some. An example of a page that throws the error is http://www.thehut.com/blu-ray/harry-potter-collection-years-1-6/10061821.html.
What confuses me is that I can view the page fine in a web browser, and I can also grab the contents of the file using PHP's fopen and fread and then parse it in PHP.
I really need to be able to do this in C#, and I'm stumped as to why this is happening. Could anyone let me know why I can read the page with PHP but not with C#, and whether there is a setting in C# that could get around this issue? Any answers gratefully received!
The web site drops requests that don't specify a user agent, so you need to set one. I would also recommend using WebClient instead of HttpWebRequest, HttpWebResponse, StreamReader, StringReader and company:
class Program
{
    public static void Main()
    {
        using (var client = new WebClient())
        {
            client.Headers[HttpRequestHeader.UserAgent] = "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.13) Gecko/20101203 Firefox/3.6.13";
            string result = client.DownloadString("http://www.thehut.com/blu-ray/harry-potter-collection-years-1-6/10061821.html");
            Console.WriteLine(result);
        }
    }
}
It's kinda shorter, and it works.
I suspect that PHP includes some header that WebRequest doesn't include by default, and the server fails when it's missing. I've just reproduced this myself, and it really is an internal server error at thehut.com. Here's the server-side exception as far as it's shown in the HTML that's returned:
org.apache.jasper.JasperException: java.lang.NullPointerException
org.apache.jasper.servlet.JspServletWrapper.handleJspException(JspServletWrapper.java:486)
org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:416)
org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:342)
org.apache.jasper.servlet.JspServlet.service(JspServlet.java:267)
javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
org.apache.jasper.runtime.JspRuntimeLibrary.include(JspRuntimeLibrary.java:968)
org.apache.jsp.hut.errors.error_jsp._jspService(error_jsp.java:71)
org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70)
javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:374)
org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:342)
org.apache.jasper.servlet.JspServlet.service(JspServlet.java:267)
javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
com.thehut.elysium.filter.SiteFilter.forwardToErrorPage(Unknown Source)
com.thehut.elysium.filter.SiteFilter.doFilter(Unknown Source)
com.thehut.elysium.filter.SlowRequestFilter.doFilter(Unknown Source)
com.thehut.elysium.filter.SetCharacterEncodingFilter.doFilter(Unknown Source)
Not terribly helpful, but it does basically confirm that it's a server-side issue - and it certainly sounds suspiciously like a header which the servlet assumes will be present.
You could try making the same request from your PHP code, and see what headers it uses (using Wireshark). Add those headers to the .NET WebRequest one at a time, and see what it needs before it starts working.
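As a starting point while you compare captures, these are the headers most often involved; the values below are illustrative guesses, not the ones PHP actually sends:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.Method = "GET";
// Add one header at a time and retest, per the suggestion above.
request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; rv:12.0) Gecko/20100101 Firefox/12.0";
request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
request.Headers[HttpRequestHeader.AcceptLanguage] = "en-GB,en;q=0.5";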

What is the best way to download files via HTTP using .NET?

In one of my applications I'm using the WebClient class to download files from a web server. Depending on the web server, the application sometimes downloads millions of documents. When there are a lot of documents, WebClient doesn't seem to scale well performance-wise.
It also seems that WebClient doesn't immediately close the connection it opened to the web server, even after it has successfully downloaded the particular document.
I would like to know what other alternatives I have.
Update: I also noticed that WebClient performs the authentication handshake for each and every download. I was expecting to see this handshake once, since my application only communicates with a single web server. Shouldn't subsequent WebClient calls reuse the authentication session?
Update: My application also calls some web service methods, and for these calls the authentication session does seem to be reused. I'm using WCF to communicate with the web service.
I think you can still use WebClient. However, you are better off using a "using" block as good practice. This will make sure that the object is closed and disposed of:
using (WebClient client = new WebClient())
{
    // Use client
}
I bet you are running into the default limit of 2 connections per server. Try running this at the beginning of your program:
// The one line that matters; constructing a ConnectionManagementElement by
// hand, as sometimes suggested, has no effect unless it is wired into the
// application's config.
System.Net.ServicePointManager.DefaultConnectionLimit = 100;
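On the repeated authentication handshake: if the server uses Windows (NTLM/Negotiate) authentication, HttpWebRequest has two opt-in settings that can help. This is a sketch of the general idea with a placeholder URL, not something verified against the asker's server:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://yourserver/doc1.pdf");
request.Credentials = CredentialCache.DefaultCredentials;
// Send credentials proactively (helps for Basic/Digest schemes).
request.PreAuthenticate = true;
// Keep the NTLM-authenticated connection open and share it across requests.
request.UnsafeAuthenticatedConnectionSharing = true;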
I noticed the same behavior with the session in another project I was working on. To solve this "problem" I used a static CookieContainer (since the client's session is recognized by a value saved in a cookie).
public static class SomeStatics
{
    private static CookieContainer _cookieContainer;

    public static CookieContainer CookieContainer
    {
        get
        {
            if (_cookieContainer == null)
            {
                _cookieContainer = new CookieContainer();
            }
            return _cookieContainer;
        }
    }
}

public class CookieAwareWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        if (request is HttpWebRequest)
        {
            (request as HttpWebRequest).CookieContainer = SomeStatics.CookieContainer;
            (request as HttpWebRequest).KeepAlive = false;
        }
        return request;
    }
}

// Now the code that will download the file:
using (WebClient client = new CookieAwareWebClient())
{
    client.DownloadFile("http://address.com/somefile.pdf", @"c:\temp\savedfile.pdf");
}
The code is just an example, inspired by "Using CookieContainer with WebClient class" and "C# get rid of Connection header in WebClient".
The above code will close your connection immediately after the file is downloaded, and it will reuse the authentication.
WebClient is probably the best option. It doesn't close the connection straight away for a reason: so it can reuse the same connection for the next request, without having to open a new one. If you find that it's not reusing connections as expected, that's usually because you're not Close()ing the response from the previous request:
var request = WebRequest.Create("...");
// populate parameters
var response = request.GetResponse();
// process response
response.Close(); // <-- make sure you don't forget this!
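Equivalent, and harder to get wrong, is a using block, which closes the response even if an exception escapes while you process it:
var request = WebRequest.Create("...");
using (var response = request.GetResponse())
{
    // process response; Dispose() closes it for you
}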

Test if a website is alive from a C# application

I am looking for the best way to test whether a website is alive from a C# application.
Background
My application consists of a WinForms UI, a backend WCF service, and a website that publishes content to the UI and other consumers. To prevent the situation where the UI starts up and then fails to work properly because the WCF service or the website is down, I have added an app-startup check to ensure that everything is alive.
The application is being written in C#, .NET 3.5, Visual Studio 2008.
Current Solution
Currently I am making a web request to a test page on the website, which will in turn test the website and then display a result.
WebRequest request = WebRequest.Create("http://localhost/myContentSite/test.aspx");
WebResponse response = request.GetResponse();
I am assuming that if no exceptions are thrown during this call then all is well and the UI can start.
Question
Is this the simplest, correct way, or is there some sneakier call I don't know about in C#, or a better way to do it?
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
if (response == null || response.StatusCode != HttpStatusCode.OK)
{
    // treat the site as down
}
As #Yanga mentioned, HttpClient is probably the more common way to do this now.
HttpClient client = new HttpClient();
var checkingResponse = await client.GetAsync(url);
if (!checkingResponse.IsSuccessStatusCode)
{
return false;
}
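For a startup check you probably also want a timeout, so a dead server fails fast instead of hanging the UI; HttpClient.Timeout covers that (the five seconds below is an arbitrary choice):
HttpClient client = new HttpClient();
client.Timeout = TimeSpan.FromSeconds(5); // fail fast if the site is down
try
{
    var checkingResponse = await client.GetAsync(url);
    return checkingResponse.IsSuccessStatusCode;
}
catch (HttpRequestException)
{
    return false; // connection-level failure
}
catch (TaskCanceledException)
{
    return false; // request timed out
}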
While using WebResponse, please make sure that you close the response stream (i.e. call .Close()), otherwise repeated executions will leave connections open and can eventually hang the machine.
E.g.:
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(sURL);
HttpWebResponse response = (HttpWebResponse)req.GetResponse();
// your code here
response.Close();
From the NDiagnostics project on CodePlex...
public override bool WebSiteIsAvailable(string Url)
{
    string Message = string.Empty;
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(Url);

    // Set the credentials to the current user account.
    request.Credentials = System.Net.CredentialCache.DefaultCredentials;
    request.Method = "GET";

    try
    {
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            // Do nothing; we're only testing whether we can get the response.
        }
    }
    catch (WebException ex)
    {
        Message += ((Message.Length > 0) ? "\n" : "") + ex.Message;
    }
    return (Message.Length == 0);
}
Today we can update the answers using HttpClient:
HttpClient client = new HttpClient();
var result = await client.GetAsync("https://stackoverflow.com");
int statusCode = (int)result.StatusCode;
Assuming the WCF service and the website live in the same web app, you can use a "Status" WebService that returns the application status. You probably want to do some of the following:
Test that the database is up and running (good connection string, service is up, etc...)
Test that the website is working (how exactly depends on the website)
Test that WCF is working (how exactly depends on your implementation)
Added bonus: you can return some versioning info on the service if you need to support different releases in the future.
Then, you create a client in the WinForms app for the WebService. If the WS is not responding (i.e. you get an exception on invoke), then the website is down (a "general error").
If the WS responds, you can parse the result and make sure that everything works, or if something is broken, return more information.
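A sketch of what such a status contract might look like (all names here are illustrative, not from the original answer):
[ServiceContract]
public interface IStatusService
{
    [OperationContract]
    StatusInfo GetStatus();
}

[DataContract]
public class StatusInfo
{
    [DataMember] public bool DatabaseOk { get; set; }
    [DataMember] public bool WebSiteOk { get; set; }
    [DataMember] public string Version { get; set; } // the versioning bonus
}
If calling GetStatus() throws, treat the whole application as down; if it returns, inspect the individual flags.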
You'll want to check the status code for OK (status 200).
Solution from: How do you check if a website is online in C#?
var ping = new System.Net.NetworkInformation.Ping();
// Ping takes a host name or IP address, not a URL.
var result = ping.Send("www.stackoverflow.com");
if (result.Status != System.Net.NetworkInformation.IPStatus.Success)
    return;
Note that a ping only proves the host is reachable, not that the web server or application on it is actually working.
