Ignore invalid SSL (version 3) certificate with XmlTextReader https request - c#

I am trying to create an XML document from an HTTPS web request, but I am having trouble getting it to work when the site has an invalid certificate. I want my application to not care about the certificate; I want it to live its life on the edge without fear!
Here is the initial code I had, which I have used many times before to get what I want from a standard HTTP (non-SSL) request:
XmlDocument xml = new XmlDocument();
XmlTextReader reader = new XmlTextReader("https://www.example.com");
xml.Load(reader);
With the site having an invalid SSL certificate I am now getting the following error:
The request was aborted: Could not create SSL/TLS secure channel.
Now I have done my Googling and tried a number of promising solutions, but none of them seem to help.
One I tried here on SO looked good but didn't seem to work; I added the accepted answer's line of code directly before my code above, as it wasn't clear where it should go.
In case it makes any difference, my code is in a class library and I am testing via a console app. I am also using .NET 4.
Here is my latest attempt (which does not work):
XmlDocument xml = new XmlDocument();
ServicePointManager.ServerCertificateValidationCallback += new System.Net.Security.RemoteCertificateValidationCallback((s, ce, ch, ssl) => true);
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(urlCommand);
request.Credentials = CredentialCache.DefaultCredentials;
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    using (Stream receiveStream = response.GetResponseStream())
    {
        XmlTextReader reader = new XmlTextReader(receiveStream);
        xml.Load(reader);
    }
}

OK, so I have found the solution. We had opted to try disabling SSL on the server for testing and noticed it was using SSL 3. After another Google search I found the additional code that fixes the issue (the important part is setting the SecurityProtocolType):
ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3;
ServicePointManager.ServerCertificateValidationCallback += new System.Net.Security.RemoteCertificateValidationCallback((s, ce, ch, ssl) => true);
XmlDocument xml = new XmlDocument();
XmlTextReader reader = new XmlTextReader(urlCommand);
xml.Load(reader);

Hm, maybe XmlTextReader uses a different mechanism for accessing HTTPS URLs.
Try making the web request with an HttpWebRequest and pass response.GetResponseStream() to the XML text reader (and leave the ServicePointManager.ServerCertificateValidationCallback override where you had it).

Reason for this error:
The site you are calling is not presenting a valid certificate.
You could try the following:
For a quick turnaround you could try accessing the URL over http://_____ instead; it's not recommended, but for testing you could try it.
Since the certificate is not valid, you could write something like the code below.
Add this line where you are requesting to download the XML or making the HTTP request:
//Add Mock certificate validation
ServicePointManager.ServerCertificateValidationCallback = new RemoteCertificateValidationCallback(OnValidationCallback);
Add the following method somewhere global (or otherwise accessible to that code):
public static bool OnValidationCallback(object sender, X509Certificate cert, X509Chain chain, SslPolicyErrors errors)
{
    return true;
}
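Putting the two pieces together with the question's original XmlTextReader code might look something like the following minimal sketch (the wrapper class name is mine, not part of the answer):
using System.Net;
using System.Net.Security;
using System.Security.Cryptography.X509Certificates;
using System.Xml;

public static class InsecureXmlLoader
{
    // Accepts every certificate -- fine for testing, dangerous in production.
    public static bool OnValidationCallback(object sender, X509Certificate cert, X509Chain chain, SslPolicyErrors errors)
    {
        return true;
    }

    public static XmlDocument Load(string url)
    {
        // Register the mock validation before the first HTTPS request is made.
        ServicePointManager.ServerCertificateValidationCallback =
            new RemoteCertificateValidationCallback(OnValidationCallback);

        XmlDocument xml = new XmlDocument();
        using (XmlTextReader reader = new XmlTextReader(url))
        {
            xml.Load(reader);
        }
        return xml;
    }
}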

Related

How to detect SSL Policy errors with .net?

I'm totally new to handling policy errors when making web requests, so I'm a little bit confused at this point.
I have a task to call a web service, but not allow the call to be made if the server I'm calling has an invalid certificate.
So I created a method to call a site with an invalid certificate, using ServerCertificateValidationCallback to prevent the call from being made if the certificate is invalid.
What I need is a quick walkthrough on how to detect the invalid certificate inside my handler. I would have thought that the call to "revoked.badssl.com" would have caused the sslPolicyErrors argument to be something other than None, but is this not the case? I see no difference at this point between calling badssl and my other URL that has a valid certificate.
For example: if https://pinning-test.badssl.com/ is opened in Chrome it shows "ERR_SSL_PINNED_KEY_NOT_IN_CERT_CHAIN" (although IE shows the page). How do I get at the information that makes Chrome deem the certificate invalid, so I can, if I want to, also handle it as invalid in my code?
This is my code I'm trying with at the moment:
ServicePointManager.ServerCertificateValidationCallback += (sender, cert, chain, sslPolicyErrors) =>
{
    if (someError?!) // <-- what do I check here?
        return false;
    return true;
};
using (HttpClient client = DefaultHttpClient())
{
    Uri uri = new Uri("https://revoked.badssl.com/");
    string jsonObj = "{}";
    var content = new StringContent(jsonObj, Encoding.UTF8, "application/json");
    HttpResponseMessage response = client.PostAsync(uri, content).Result;
}
By default the revocation check is not performed. You need to set it on the ServicePointManager class for your application to check it.
System.Net.ServicePointManager.CheckCertificateRevocationList = true;
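To actually see why a certificate is rejected inside the handler, a sketch along these lines (the logging is mine) inspects the SslPolicyErrors flags and the chain status once the revocation check is turned on:
// Enable the revocation check before any HTTPS request is made.
ServicePointManager.CheckCertificateRevocationList = true;

ServicePointManager.ServerCertificateValidationCallback += (sender, cert, chain, sslPolicyErrors) =>
{
    if (sslPolicyErrors == SslPolicyErrors.None)
        return true; // certificate passed every check

    // Report why the certificate failed, e.g. RemoteCertificateChainErrors for a revoked certificate.
    Console.WriteLine("Certificate rejected: {0}", sslPolicyErrors);

    if (chain != null)
    {
        foreach (X509ChainStatus status in chain.ChainStatus)
            Console.WriteLine("  Chain status: {0} - {1}", status.Status, status.StatusInformation);
    }

    return false; // refuse the connection
};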

How to understand the difference between Windows/.NET and Linux/Mono with a .NET WebRequest and a PFX certificate?

I'm currently working with an API that uses client certificate authentication. And I have a simple block of code that works under Linux/Mono. When executing under Windows/.NET, I receive a 200, but the response content hints that I need a certificate to make this call.
ServicePointManager.ServerCertificateValidationCallback = (sender, certificate, chain, sslPolicyErrors) => true;
var x509 = new X509Certificate2("foo.pfx", "test");
var request = (HttpWebRequest)WebRequest.Create("https://domain.com:8081");
request.Method = "POST";
request.ClientCertificates.Add(x509);
const string data = "{\"foo\":\"bar\"}";
var postdata = Encoding.ASCII.GetBytes(data);
request.ContentLength = data.Length;
var myStream = request.GetRequestStream();
myStream.Write(postdata, 0, postdata.Length);
var response = (HttpWebResponse)request.GetResponse();
Console.WriteLine(new StreamReader(response.GetResponseStream()).ReadToEnd());
The same foo.pfx is used in both cases. Does anyone know how I can explain the difference in results?
Is there some redirection?
If yes, MSDN says:
The Authorization header is cleared on auto-redirects and HttpWebRequest automatically tries to re-authenticate to the redirected location. In practice, this means that an application can't put custom authentication information into the Authorization header if it is possible to encounter redirection. Instead, the application must implement and register a custom authentication module. The System.Net.AuthenticationManager and related class are used to implement a custom authentication module. The AuthenticationManager.Register method registers a custom authentication module.
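If you want to confirm whether a redirect is what is interfering, a sketch built from the question's own code (not tested against your API) disables automatic redirection so you can see any 3xx response and its Location header yourself:
var x509 = new X509Certificate2("foo.pfx", "test");
var request = (HttpWebRequest)WebRequest.Create("https://domain.com:8081");
request.Method = "POST";
request.ClientCertificates.Add(x509);
request.AllowAutoRedirect = false; // stop HttpWebRequest from silently following a redirect

const string data = "{\"foo\":\"bar\"}";
var postdata = Encoding.ASCII.GetBytes(data);
request.ContentLength = postdata.Length;
using (var body = request.GetRequestStream())
{
    body.Write(postdata, 0, postdata.Length);
}

using (var response = (HttpWebResponse)request.GetResponse())
{
    // A 301/302/307 here means the server redirected you; the client certificate
    // would have to be attached to a fresh request aimed at the Location header.
    Console.WriteLine("{0} {1}", (int)response.StatusCode, response.Headers["Location"]);
}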

How to port a curl command to RestSharp? How to troubleshoot?

I have some working curl commands to a web service, and now I want to move them to a C# program. I am using RestSharp, and trying with the simplest of the web service calls, but I just keep getting a generic error message, and I am a bit stumped about how to troubleshoot it.
Is there a way to see the headers, and exact URL, that is being sent, and the headers being received?
The curl example is basically this:
curl --user user:pw https://example.com/api/version
And my C# code is:
var client = new RestClient("https://example.com");
client.Authenticator = new HttpBasicAuthenticator("user", "pw");
var request = new RestRequest ("api/version");
var response = client.Execute(request);
Console.WriteLine (response.Content);
Console.WriteLine (response.StatusCode);
Console.WriteLine (response.ErrorMessage);
This gives me:
RestSharp.RestRequest
0
Error getting response stream (Write: The authentication or decryption has failed.): SendFailure
I am using Mono, on Linux. Would that be related? But I could find a few (more advanced) questions with the mono tag on StackOverflow, so it should work. (?)
If it was actually a problem with the username/password, I would get a 403 status, instead of a zero status, I assume?
P.S. In case it matters, the rest of my script is:
using System;
using System.Net;
using RestSharp;

namespace webtest
{
    class MainClass
    {
        public static void Main (string[] args)
        {
            ...(above code)
        }
    }
}
Regarding troubleshooting
So far I can suggest:
Try commenting out the Authenticator line to see if anything changes (in my case it did not)
Try http://google.com
Try https://google.com
That was enough for me to see that http URLs work, https URLs fail.
(If you need more troubleshooting, and are using https, the sender parameter shown below contains various fields about the request being sent to the remote server.)
Regarding porting curl commands
By default curl on linux uses the certificates it finds in /etc/ssl/certs. The blanket equivalent for Mono is to do mozroots --import --ask-remove, which will import all certificates (see Mono security FAQ).
Another way to do it is by putting this at the very top of your program:
ServicePointManager.ServerCertificateValidationCallback +=
    (sender, certificate, chain, sslPolicyErrors) => {
        //Console.WriteLine(certificate.ToString());
        return true;
    };
The commented-out line can be used to report the certificate to the user, interactively get their approval, or check the certificate fingerprint against the expected one. Simply returning true means all certificates are trusted and go unchecked.
Bonus: Cert checks
Here is one way to check for a specific certificate:
ServicePointManager.ServerCertificateValidationCallback +=
    (sender, certificate, chain, sslPolicyErrors) => {
        if (((System.Net.HttpWebRequest)sender).Host.EndsWith("google.com")) {
            if (certificate.GetCertHashString() == "83BD2426329B0B69892D227B27FD7FBFB08E3B5E") {
                return true;
            }
            Console.WriteLine("Uh-oh, google.com cert fingerprint ({0}) is unexpected. Cannot continue.", certificate.GetCertHashString());
            return false;
        }
        Console.WriteLine("Unexpected SSL host, not continuing.");
        return false;
    };

How to perform a fast web request in C#

I have an HTTP-based API which I potentially need to call many times. The problem is that I can't get the request to take less than about 20 seconds, though the same request made through a browser is near instantaneous. The following code illustrates how I have implemented it so far.
WebRequest r = HttpWebRequest.Create("https://example.com/http/command?param=blabla");
var response = r.GetResponse();
One solution would be to make an asynchronous request but I would like to know why it takes so long and if I can avoid it. I have also tried using the WebClient class but I suspect it uses a WebRequest internally.
Update:
Running the following code took about 40 seconds in Release Mode (measured with Stopwatch):
WebRequest g = HttpWebRequest.Create("http://www.google.com");
var response = g.GetResponse();
I'm working at a university where the network configuration might affect performance, but using the browser directly shows that the request should be near instant.
Update 2:
I uploaded the code to a remote machine and it worked fine so the conclusion must be that the .NET code does something extra compared to the browser or it has problems resolving the address through the university network (proxy issues or something?!).
This problem is similar to another post on Stack Overflow: HttpWebRequest is extremely slow (question 2519655).
Most of the time the problem is the Proxy property. You should set this property to null; otherwise the request will attempt to detect an appropriate proxy server to use before going directly to the source. Note: this detection is on by default, so you have to explicitly tell the request not to perform the proxy search.
request.Proxy = null;
using (var response = (HttpWebResponse)request.GetResponse())
{
}
I was having the 30-second delay on the 'first' attempt; JamesR's reference to the other post mentioning setting the proxy to null solved it instantly!
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(_site.url);
request.Proxy = null; // <-- this is the good stuff
...
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
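If you want to see the difference for yourself, a small sketch with a Stopwatch (using the URL from the question's update) makes the effect of request.Proxy = null easy to measure:
// Requires System.Diagnostics for Stopwatch and System.Net for HttpWebRequest.
var watch = Stopwatch.StartNew();

var request = (HttpWebRequest)WebRequest.Create("http://www.google.com");
request.Proxy = null; // skip automatic proxy detection

using (var response = (HttpWebResponse)request.GetResponse())
{
    Console.WriteLine("{0} in {1} ms", response.StatusCode, watch.ElapsedMilliseconds);
}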
Does your site have an invalid SSL cert? Try adding this
using System.Security.Cryptography.X509Certificates;
using System.Net.Security;

ServicePointManager.ServerCertificateValidationCallback = new System.Net.Security.RemoteCertificateValidationCallback(AlwaysAccept);
//... somewhere AlwaysAccept is defined as:
public bool AlwaysAccept(object sender, X509Certificate certification, X509Chain chain, SslPolicyErrors sslPolicyErrors)
{
    return true;
}
You don't dispose of your response. As soon as you hit the limit of allowed connections, you have to wait for the earlier ones to time out. Try:
using (var response = g.GetResponse())
{
    // do stuff with your response
}

Difficulty with BugzScout.net from behind a proxy

I'm attempting to use FogBugz's BugzScout in order to automatically submit unhandled application exceptions to my FogBugz On Demand account. I've written up a wrapper class for it and everything appears to be just groovy - on my box. Testing the same code in the production environment, behind a proxy that requires authentication, I have had nothing but issues.
I went to work modifying the BugzScout code in order to get it to authenticate with the proxy, and after trying many different methods suggested via a Google search, found one that works! But now I'm getting a "Connection actively refused" error from FogBugz itself, and I don't know what to do.
Here is the code where BugzScout connects via a .NET WebClient to submit a new case, with my modifications to deal with our proxy. What am I doing that would cause FogBugz to refuse my request? I've removed all non-WebClient-related code from the procedure for ease of reading.
public string Submit()
{
    WebClient client = new WebClient();
    WebProxy proxy = new WebProxy();
    proxy.UseDefaultCredentials = true;
    client.Proxy = proxy;
    Byte[] response = client.DownloadData(fogBugzUrl);
    string responseText = System.Text.Encoding.UTF8.GetString(response);
    return (responseText == "") ? this.defaultMsg : responseText;
}
The URL is correct and the case is filled in properly; this has been verified.
EDIT: Additional info.
Using FogBugz On Demand.
Using the FogBugz.net code in its entirety, with only these additions:
WebProxy proxy = new WebProxy();
proxy.UseDefaultCredentials = true;
client.Proxy = proxy;
Error occurs when attempting to connect to both https://oursite.fogbugz.com/scoutsubmit.asp and http://oursite.fogbugz.com//scoutsubmit.asp (except one says port 443, and the other port 80, obviously)
I don't know anything about web authentication, so I can't tell you what kind I'm using; if you tell me where to look, I'd be happy to answer that for you.
Got the fix from FogBugz; this is the appropriate network code to get through the proxy authentication and not mis-authenticate with BugzScout.
WebRequest.DefaultWebProxy.Credentials = CredentialCache.DefaultNetworkCredentials;
WebRequest request = WebRequest.Create(fogBugzUrl);
request.ContentType = "application/x-www-form-urlencoded";
request.Method = "POST";
request.Proxy.Credentials = CredentialCache.DefaultNetworkCredentials;
Stream requestStream = request.GetRequestStream();
requestStream.Write(bytes, 0, bytes.Length);
requestStream.Close();
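For context, bytes in that snippet is the URL-encoded POST body that scoutsubmit.asp expects; a rough sketch of how it might be built and how the reply could be read back after the code above runs (the field name below is a placeholder, not the real ScoutSubmit parameter list):
// Placeholder body, built before the request code above runs -- substitute the actual ScoutSubmit fields.
string postBody = "Description=" + Uri.EscapeDataString("Unhandled exception details go here");
byte[] bytes = Encoding.UTF8.GetBytes(postBody);

// After requestStream.Close(), read BugzScout's reply:
using (WebResponse response = request.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    // The original Submit() falls back to defaultMsg when this comes back empty.
    string responseText = reader.ReadToEnd();
    Console.WriteLine(responseText);
}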
Is your fogbugzUrl using HTTP Basic Authentication? Is it SSL (hosted on On Demand?)
The connection actively refused message would be coming from the web server itself, not really FogBugz.
Can you post the HTTP Status Code?
One thing to note if you are using FogBugz On Demand is that you HAVE to use the https:// URL (not the http URL).
