C# Enforcing HttpWebRequest to use Tls12 instead of SSLv3

I have an app that makes use of a web service and acquires data via JSON. Everything worked fine for quite a long time, up until the recent discoveries about SSLv3 being vulnerable to man-in-the-middle attacks and server owners turning SSLv3 off for good. My application started to have problems connecting and returned the error "Request was aborted: cannot establish secure SSL/TLS connection". I tried to look for a solution and found advice saying to add this code before creating the web request:
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
ServicePointManager.ServerCertificateValidationCallback = delegate {
    return true;
};
Unfortunately, no luck here: the app behaves exactly as before, and I have no clue whether this code does nothing or there is still some problem on the server side. The error information is pretty vague, and I have trouble figuring out where things go wrong.
Here is my code:
...
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.ContentType = GetRequestContentType();
request.Method = method.ToString();
request.Credentials = GetCredential(url);
request.PreAuthenticate = true;
CookieContainer cookieContainer = new CookieContainer();
request.CookieContainer = cookieContainer;
...
I want to ask how to make Tls12 the default and ensure that the requests I make from my end use the desired protocol.
If I confirm that my app works fine on my end, is there a way to get more detailed information from the server response and pinpoint the precise reason for the error?
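(What I have in mind is something like the sketch below: catching the WebException from GetResponse() and reading whatever error body the server returned, assuming the server returns one at all.)
try
{
    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    {
        // ... handle the normal response ...
    }
}
catch (WebException ex)
{
    // For protocol errors the server's response (and its body) is attached
    // to the exception; for TLS-level failures ex.Response will be null.
    if (ex.Response != null)
    {
        using (StreamReader reader = new StreamReader(ex.Response.GetResponseStream()))
        {
            Console.WriteLine("{0}: {1}", ex.Status, reader.ReadToEnd());
        }
    }
    else
    {
        Console.WriteLine(ex.Status); // e.g. SecureChannelFailure for SSL/TLS problems
    }
}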
Thanks for all answers and suggestions.
EDIT
The second part of the question is solved. I found this tool, http://www.telerik.com/download/fiddler, which pretty much lets you see what is going on with outgoing and incoming data. It can also decrypt SSL connections, and enabling that option makes my application start working. I assume Fiddler does something that makes communication between my app and the destination host possible, but I still have no idea what that could be, or how to make my app handle these connections properly by itself.

Desperation made me inspect the whole source code (the part responsible for getting data off the internet was third-party, and as long as it worked there was no reason to change it), and I discovered that the line
request.Credentials = GetCredential(url);
called a method whose body contained
ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3;
So all my attempts to change that value before creating the HttpWebRequest were being overwritten. Changing the SecurityProtocolType to Tls12 makes it all work now.
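For anyone hitting the same thing: SecurityProtocol is a flags enum, so (on .NET 4.5 or later, where Tls12 is available) you can enable several protocols at once and let the handshake negotiate; just make sure nothing later in the code reassigns the property, as GetCredential did in my case:
// Allow TLS 1.0/1.1/1.2 and let the server pick the strongest; set this
// once, before the first request, and watch for code that overwrites it.
ServicePointManager.SecurityProtocol =
    SecurityProtocolType.Tls12 | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls;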

Related

Sending post to creation factory on RTC but getting a 'GET' response

I've hit a strange bit of behaviour, and I'm pretty sure it's related to my code rather than the RTC instance I'm working with.
I've got a web request set up and configured:
var cookies = new CookieContainer();
var request = (HttpWebRequest)WebRequest.Create(getCreationFactoryUri);
var xmlString = getRDF.ToString();
request.CookieContainer = cookies;
request.Accept = "application/rdf+xml";
request.Method = "POST";
request.ContentType = "application/rdf+xml";
request.Headers.Add("OSLC-Core-Version", "2.0");
request.Timeout = 40000;
request.KeepAlive = true;
byte[] bytes = Encoding.ASCII.GetBytes(xmlString);
request.ContentLength = bytes.Length;
Stream dataStream = request.GetRequestStream();
dataStream.Write(bytes, 0, bytes.Length);
dataStream.Close();
This is passed to another method, written based on an RTC example, that uses forms authentication for RTC.
Under the OSLC v2 spec, I'm using a creation factory URL to post to. I know the URL is fine because I've set up a call using RESTClient in Firefox. I added the headers that are needed (Content-Type: application/rdf+xml, Accept: application/rdf+xml, OSLC-Core-Version: 2.0) and used the generated XML that my code is trying to pass. My manual call works perfectly and the ticket is created.
In my logs I captured the response from RTC, which is a list of tickets rather than a response showing my ticket as being created. I can re-create this behavior by doing a GET on the creation factory URL I'm using to create an event ticket.
So although I know I'm sending a POST to the creation factory (I debugged to check that my web request method was 100% set to 'POST') RTC instead returns a list of tickets and I can only conclude somewhere my request is treated as a 'GET'.
As a test I changed my request to use PUT instead of POST. This isn't permitted for use on the creation factory URL and in testing it indeed throws an error. So I'm totally miffed as to why RTC isn't creating my ticket, but instead treating my request as a GET and returning a list of tickets.
Anyone have any ideas?
Thanks.
If the server is using form authentication, as you state, then I expect what is happening is that the POST results in an HTTP redirection to the authentication form. Even if your other code is handling that authentication (which it sounds like it is), the result of that authentication will be an HTTP redirection to the URL of the original request. However, that redirection is likely to result in a GET to that URL, not the original POST. (Also, I don't believe the redirection after authentication is 100% reliable if your requests are multi-threaded.)
The jazz.net information on form authentication says "After authentication succeeded, you always have to replay the original request at least once to get to the protected resource. More replays may be required if the first replay led to another set of redirections and the original request had a non-GET method."
So if your code received an authentication challenge, you will need to re-send the original POST.
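As a rough sketch of that replay (the header name and the j_security_check endpoint follow the standard Jazz form-auth flow, but verify them against your server; serverUri, user, and password are placeholders):
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
if (response.Headers["X-com-ibm-team-repository-web-auth-msg"] == "authrequired")
{
    // Authenticate against the form-login servlet, reusing the same cookie container...
    var login = (HttpWebRequest)WebRequest.Create(serverUri + "/j_security_check");
    login.Method = "POST";
    login.ContentType = "application/x-www-form-urlencoded";
    login.CookieContainer = cookies;
    byte[] body = Encoding.UTF8.GetBytes(
        "j_username=" + Uri.EscapeDataString(user) +
        "&j_password=" + Uri.EscapeDataString(password));
    login.ContentLength = body.Length;
    using (Stream s = login.GetRequestStream())
        s.Write(body, 0, body.Length);
    using (login.GetResponse()) { }

    // ...then rebuild and re-send the original POST (HttpWebRequest
    // instances are single-use, so a fresh request object is needed).
}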
I believe the reason the RESTClient plug-in in your browser works the first time is that it is sending the cookies from your previous log-in to the RTC web UI in the browser. (I had this experience recently, and also found it very confusing.)
Also, if you are not preserving cookies between requests to RTC in your client app, then you will meet an auth challenge for every request. If you preserve cookies between calls from your client app (how you do that will depend on your client library - I'm not familiar with the code in your examples) then my experience is that you won't receive an auth challenge for every request. (However, you still need to be able to handle an auth challenge on every request - including POSTs - otherwise it may fail intermittently if the session times out just before you send a POST).

How to use proxy like browser OR CredentialCache.DefaultCredentials different between XP and 7

I am able to fix a problem with a client where they cannot authenticate through a proxy by doing the following:
var proxy = WebRequest.GetSystemWebProxy();
proxy.Credentials = CredentialCache.DefaultNetworkCredentials;
service.Proxy = proxy;
This works fine on Windows XP; however, on Windows 7 I get a 407 (proxy not authenticated) exception. Does anybody know what the difference is and, more importantly, what I need to do to get this to work on both operating systems?
UPDATE
I am having the users check the following:
In the Registry Editor, can you go to HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon and let me know what the value is for CachedLogonsCount?
In the Start box, type in Group Policy and an option to Edit Group Policy should pop up; click on it. Then go to Computer Configuration\Administrative Templates\System\User Profiles\Delete cached copies of roaming profiles and let me know if it is configured and, if so, what it is set to.
UPDATE FOR BOUNTY
So, I added the bounty. I can take a solution from here, or just an alternate means of getting through a proxy on Windows 7...
Another Update
I am not sure if this is useful or not, but we are also doing the following:
service.PreAuthenticate = true;
service.Url = "myurl";
service.Credentials = new NetworkCredential(txt_UserName.Text, txt_Password.Text);
My temporary solution
This is not really a solution, but it works for now. I am using the app.config to set the proxy to the default, with a bypass list so that the proxy is not even used. This is only doable because the proxy does not currently have a strong firewall. For other clients, I need to get the above to work.
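For reference, the app.config workaround looks roughly like this (the bypass pattern below is a placeholder, not our real host):
<configuration>
  <system.net>
    <defaultProxy enabled="true" useDefaultCredentials="true">
      <bypasslist>
        <!-- placeholder: regex for hosts that should skip the proxy -->
        <add address="internal\.example\.com" />
      </bypasslist>
    </defaultProxy>
  </system.net>
</configuration>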
This piece of code works for me on XP, Win7 and 2008
var webProxy = new WebProxy(WebRequest.DefaultWebProxy.GetProxy(new Uri({TheURLoftheService})));
webProxy.Credentials = CredentialCache.DefaultCredentials;
webProxy.UseDefaultCredentials = true;
service.Proxy = webProxy;
Actually, it looks like they "fixed" it in Win7 :) Can you confirm that both client and server are specifying HTTP 1.1?
Now let's discuss why the browser works in this scenario. IE uses WinINet under the hood rather than WinHTTP. If we look at the network traces, we see that IE sends HTTP/1.1, but the proxy replies with HTTP/1.0. IE still accepts this behavior, because in the internet scenario there are countless clients and servers which still use HTTP/1.0.
WinHTTP strictly requires HTTP/1.1 compliance for keeping the connection alive, and HTTP Keep-Alives are not supported in the HTTP/1.0 protocol. The HTTP Keep-Alive feature was introduced in the HTTP/1.1 protocol as per RFC 2616. The server or the proxy which expects the keep-alive should also implement the protocol correctly. WinHTTP on Windows 7 and Windows 2008 R2 is strict, in terms of security, with respect to protocol compliance. The ideal solution is to change the server/proxy to use the right protocol and be RFC compliant.
http://blogs.msdn.com/b/httpcontext/archive/2012/02/21/changes-in-winhttp-on-windows-7-and-onwards-wrto-http-1-0.aspx
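If changing the server/proxy isn't an option, a client-side workaround that is sometimes suggested (a sketch only; behavior can vary by proxy) is to stop relying on HTTP/1.1 keep-alive semantics on the request:
request.ProtocolVersion = HttpVersion.Version10; // match the proxy's HTTP/1.0 replies
request.KeepAlive = false;                       // don't require keep-alive support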
Will this work?
I am using this to set the proxy; so far we have not encountered an error on any Windows platform:
Uri address = new Uri("http://your-webservice-address");
// Get the user's current network credentials
ICredentials credentials = CredentialCache.DefaultCredentials;
NetworkCredential credential = credentials.GetCredential(address, "Basic");
// Get the HttpWebRequest
HttpWebRequest request = WebRequest.Create(address) as HttpWebRequest;
// The network credential should be included on the request to avoid network issues when calling the web service
request.Proxy = WebRequest.DefaultWebProxy;
request.Credentials = new NetworkCredential(credential.UserName, credential.Password, credential.Domain);
It's hard to say based on the code you've given. I'd suspect that it's either your IE settings or your proxy variables.
Check http://social.msdn.microsoft.com/Forums/en/netfxnetcom/thread/61b71194-1758-4c7b-89fe-91be7363db13 it may help.

C# Sending an HttpWebRequest without the servername

Basically, what I've been trying to do is download a file off a server. The server sends a redirect automatically, which is fine, but by packet-sniffing a program that does successfully download the file, I've found that the headers (for the second request) are:
GET /path/to/file.txt
...
Host: server.com
Rather than the request currently being generated (which I thought was standard):
GET www.server.com/path/to/file.txt
Using the normal HttpWebRequest method results in a 500 server error, and I get exceptions thrown when trying to use just the relative path, as one would expect.
Using AllowAutoRedirect does not work for this scenario because the cookies are not handled properly, but even if I handle them manually the same error occurs.
How does one go about doing this (preferably without sockets :D)?
To be honest, I'm really not sure what you are asking, but you mentioned cookie troubles. As a total shot in the dark guess, are you setting the CookieContainer on your WebRequest?
request.CookieContainer = new CookieContainer();
request.AllowAutoRedirect = true;

HttpWebRequest doesn't work except when fiddler is running

This is probably the weirdest problem I have run into. I have a piece of code that submits a POST to a URL. The code neither works nor throws any exceptions when Fiddler isn't running; however, when Fiddler is running, the code posts the data successfully. I have access to the post page, so I know whether the data has been posted or not. This probably sounds like nonsense, but it's the situation I am running into, and I am very confused.
byte[] postBytes = new ASCIIEncoding().GetBytes(postData);
HttpWebRequest req = (HttpWebRequest)WebRequest.Create("http://myURL");
req.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.224 Safari/534.10";
req.Accept = "application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5";
req.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.3");
req.Headers.Add("Accept-Language", "en-US,en;q=0.8");
req.Method = "POST";
req.ContentType = "application/x-www-form-urlencoded";
req.ContentLength = postBytes.Length;
req.CookieContainer = cc;
Stream s = req.GetRequestStream();
s.Write(postBytes, 0, postBytes.Length);
s.Close();
If you don't call GetResponseStream() then you can't close the response. If you don't close the response, then you end up with a socket in a bad state in .NET. You MUST close the response to prevent interference with your later request.
Close the HttpWebResponse after getting it.
I had the same problem; then I started closing the responses after each request and, boom, no need to have Fiddler running.
This is a sketch of the synchronous flow:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
// ... configure the request and write the POST body ...
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
// ... read the response, e.g. via response.GetResponseStream() ...
response.Close();
I had a similar problem recently. Wireshark showed the HttpWebRequest never leaving the client machine unless Fiddler was running. I tried removing proxy settings, but that didn't fix the problem for me. I tried everything from setting the request to HttpVersion.Version10, to enabling/disabling SendChunked, KeepAlive, and a host of other settings. None of them worked.
Ultimately, I just checked if .Net detected a proxy and had the request attempt to ignore it. That fixed my issue with request.GetResponse() throwing an immediate exception.
IWebProxy proxy = request.Proxy;
if (request.Proxy != null)
{
Console.WriteLine("Removing proxy: {0}", proxy.GetProxy(request.RequestUri));
request.Proxy = null;
}
In my case, when I had the same situation (POST only works when Fiddler is running), the code was sending the POST from an application running on IIS Express in a development environment behind a proxy to an external server. Apparently, even if you have proxy settings configured in Internet Options, the environment IIS is running in may not have access to them. In my work environment I simply had to update web.config with the path to our proxy's configuration script. You may need to tweak other proxy settings; in that case your friend is this MSDN page that explains what they are: http://msdn.microsoft.com/en-us/library/sa91de1e.aspx.
Ultimately I included the following in the application's web.config and then the POST went through.
<configuration>
<system.net>
<defaultProxy>
<proxy scriptLocation="http://example.com:81/proxy.script" />
</defaultProxy>
</system.net>
</configuration>
Well, I faced a similar problem a few weeks back, and the reason was that when Fiddler is running it changes the proxy settings so that requests pass through Fiddler, but when it's closed the proxy setting somehow remains and thus doesn't allow your request to go out onto the internet.
I fixed it by setting IE's and Firefox's network settings not to use any proxy, and it worked.
Try this; it may be the same problem...
I ran into the same problem with Python - requests to a local server were failing with a 404, but when I ran them with Fiddler running they worked correctly.
The real clue to the problem here is that Fiddler works by acting as a proxy for HTTP traffic so that all requests from the local machine go through Fiddler rather than straight out into the network.
In the exact situation I was in, I was making requests to a local server; regular traffic passes through a proxy, and in the Local Area Network (LAN) Settings for the network connection, in the proxy server pane, the "Bypass proxy server for local addresses" option was checked.
My suspicion is that "Bypass proxy server for local addresses" is not necessarily picked up by the programming language, but the proxy server details are. Fiddler is aware of that policy, so requests through Fiddler work but requests direct from the programming language don't.
By setting the proxy for the request for the local server to nothing, it worked correctly from code. Obviously, that could be a gotcha if you find yourself moving from an internal to external server during deployment.
I faced the same scenario: I was POSTing to an endpoint behind Windows Authentication.
Fiddler keeps a pool of open connections, but your C# test or PowerShell script does not when it runs without Fiddler.
So you can make the test/script also maintain a pool of open authenticated connections, by setting the property UnsafeAuthenticatedConnectionSharing to true on your HttpWebRequest. Read more about it in the Microsoft KB. In both cases in that article, you can see that they are making two requests: the first one is a simple GET or HEAD to get the authentication header (to complete the handshake), and the second one is the POST, which will use the header obtained before.
Apparently you cannot (sadly) do the handshake directly with POST HTTP requests.
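A rough sketch of that two-request pattern (url is a placeholder; check the KB article for the authoritative version):
// Complete the Windows-auth handshake with a cheap HEAD first, keeping
// the authenticated connection open for reuse.
var handshake = (HttpWebRequest)WebRequest.Create(url);
handshake.Method = "HEAD";
handshake.Credentials = CredentialCache.DefaultNetworkCredentials;
handshake.UnsafeAuthenticatedConnectionSharing = true;
using (handshake.GetResponse()) { }

// The real POST then reuses the connection authenticated above.
var post = (HttpWebRequest)WebRequest.Create(url);
post.Method = "POST";
post.Credentials = CredentialCache.DefaultNetworkCredentials;
post.UnsafeAuthenticatedConnectionSharing = true;
// ... write the POST body and call post.GetResponse() as usual ...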
Always use the using construct; it makes sure all resources are released after the call:
using (HttpWebResponse responseClaimLines = (HttpWebResponse)requestClaimLines.GetResponse())
{
    using (StreamReader reader = new StreamReader(responseClaimLines.GetResponseStream()))
    {
        responseEnvelop = reader.ReadToEnd();
    }
}
Add the following entries to the web.config file:
<system.net>
  <connectionManagement>
    <add address="*" maxconnection="30"/>
  </connectionManagement>
</system.net>
I found the solution in increasing the default number of connections:
ServicePointManager.DefaultConnectionLimit = 10000;

C# maintaining session over HTTPS on the client

I need to log in to a website and perform an action. The website is REST-based, so I can easily log in by doing this (the login info is included as a query string on the URL, so I don't need to set the credentials):
CookieContainer cookieJar = new CookieContainer();
HttpWebRequest firstRequest = (HttpWebRequest) WebRequest.Create(loginUrl);
firstRequest.CookieContainer = cookieJar;
firstRequest.KeepAlive = true;
firstRequest.Method = "POST";
HttpWebResponse firstResponse = (HttpWebResponse)firstRequest.GetResponse();
That works and logs me in. I get a cookie back to maintain the session and it's stored in the cookieJar shown above. Then I do a second request such as this:
HttpWebRequest secondRequest = (HttpWebRequest) WebRequest.Create(actionUrl);
secondRequest.Method = "POST";
secondRequest.KeepAlive = true;
secondRequest.CookieContainer = cookieJar;
WebResponse secondResponse = secondRequest.GetResponse();
And I ensure I assign the cookies to the new request. But for some reason this doesn't appear to work: I get back an error telling me "my session has timed out or expired", and the second request is made right after the first, so it's not a timing issue.
I've used Fiddler to examine the HTTP headers, but I'm finding that difficult since this is HTTPS. (I know I can decrypt it, but it doesn't seem to work well.)
I can take my URLs for this REST service and paste them into Firefox and it all works fine, so it must be something I'm doing wrong and not the other end of the connection.
I'm not very familiar with HTTPS. Do I need to do something else to maintain my session? I thought the cookie would be it, but perhaps there is something else I need to maintain across the two requests?
Here are the headers returned when I send in the first request (except I changed the cookie to protect the innocent!):
X-DB-Content-length=19
Keep-Alive=timeout=15, max=50
Connection=Keep-Alive
Transfer-Encoding=chunked
Content-Type=text/html; charset=WINDOWS-1252
Date=Mon, 16 Nov 2009 15:26:34 GMT
Set-Cookie:MyCookie stuff goes here
Server=Oracle-Application-Server-10g
Any help would be appreciated, I'm running out of ideas.
I finally got it working after decrypting the HTTP traffic from my program.
The cookie I'm getting back doesn't list the Path attribute, so .NET takes the current path and assigns that as the path on the cookie, including the current page. I.e., if it was at http://mysite/somepath/somepage.htm, it would set the cookie path to /somepath/somepage.htm. This is a bug, as it should be assigned to "/", which is what all web browsers do. (Hope they fix this.)
After noticing this, I grabbed the cookie, modified the Path property, and everything works fine now.
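In code, the fix looked roughly like this (a sketch; firstResponse and cookieJar are from the snippets above):
// Re-add each session cookie with Path = "/" so it is sent on subsequent
// requests instead of being scoped to the login page's own path.
foreach (Cookie c in firstResponse.Cookies)
{
    cookieJar.Add(new Cookie(c.Name, c.Value, "/", c.Domain));
}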
Anyone else with a problem like this should check out Fiddler. .NET uses the Windows certificate store, so to decrypt HTTPS traffic from your program you will need to follow the instructions here: http://www.fiddler2.com/Fiddler/help/httpsdecryption.asp. You will also need to turn on decryption under the Options\HTTPS tab of Fiddler.
From MSDN:
When a user moves back and forth between secure and public areas, the ASP.NET-generated session cookie (or URL if you have enabled cookie-less session state) moves with them in plaintext, but the authentication cookie is never passed over unencrypted HTTP connections as long as the Secure cookie property is set.
So basically, the cookie can be passed over both HTTP and HTTPS if the 'Secure' property is set to 'false'.
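In ASP.NET terms, whether the forms-authentication cookie gets the Secure property is controlled by requireSSL in web.config; as a sketch (loginUrl is a placeholder, and setting requireSSL to false lets the cookie travel over both HTTP and HTTPS, at an obvious security cost):
<system.web>
  <authentication mode="Forms">
    <!-- requireSSL="false" allows the auth cookie over plain HTTP as well -->
    <forms loginUrl="Login.aspx" requireSSL="false" />
  </authentication>
</system.web>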
See also: how can I share an asp.net session between http and https
