I have a console application that I wrote for .NET/Windows that I suddenly needed on my Unix system. Mono has, for the most part, been hugely successful at providing this for me.
There is however a small issue:
The application issues many HttpWebRequests as it runs, and for a small portion of these, Mono is returning an error:
Error getting response stream (Write: The authentication or decryption has failed.): SendFailure
This error message seems to indicate an SSL error. However, this application does not issue any request to SSL-secured URIs (i.e. all URIs are http://).
The main code in question is as follows:
HttpWebRequest req = WebRequest.Create(url) as HttpWebRequest;
req.UserAgent = UserAgent;
req.AuthenticationLevel = AuthenticationLevel.None;
req.AllowAutoRedirect = true;
req.KeepAlive = true;
req.CachePolicy = new RequestCachePolicy(RequestCacheLevel.NoCacheNoStore);
req.Timeout = timeout;
if (useCompression) {
    req.Headers.Add(HttpRequestHeader.AcceptEncoding, "gzip,deflate");
}
Edit: For the purposes of running this code, you can define dummy variables as follows:
string url = "http://example.com";
string UserAgent = "WhateverBot";
int timeout = 5000;
bool useCompression = true;
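For reference, actually issuing the request would look roughly like this (a sketch of the part the snippet omits; the response handling here is my assumption, not part of the original code):

using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
using (StreamReader reader = new StreamReader(resp.GetResponseStream()))
{
    // On Mono, the SendFailure error above surfaces from GetResponse().
    string body = reader.ReadToEnd();
}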
It should be noted that the code works without any problem on Windows/.NET.
I also had this problem when running Mono on Windows, even when the URL is not HTTPS. Let me say this loudly: IT'S NOT REDIRECTED.
The solution has been to use the hack:
System.Net.ServicePointManager.ServerCertificateValidationCallback += delegate
{
    return true;
};
My guess is that Mono does not look in the Windows keystore for certificates, so it finds none, the HttpWebXXXX machinery is not initialized correctly, and it then fails incorrectly.
My guess is that anyone taking the trouble to reproduce your error will know what they are doing with Mono (unlike me, for example), will therefore have their environment set up correctly, and so will be unable to reproduce this real-world problem.
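If returning true unconditionally feels too permissive, a narrower variant of the same hack (a sketch; limiting the bypass to a specific host is my addition, not part of the original workaround):

System.Net.ServicePointManager.ServerCertificateValidationCallback +=
    (sender, certificate, chain, errors) =>
    {
        // Accept anything that validates normally.
        if (errors == System.Net.Security.SslPolicyErrors.None)
            return true;
        // Only bypass validation for a host you explicitly trust (hypothetical host).
        var req = sender as System.Net.HttpWebRequest;
        return req != null && req.RequestUri.Host == "example.com";
    };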
Related
I am trying to patch a .NET web application that, after years of working, started failing to get UPS shipping quotes, which is impacting web business dramatically. After much trial and error, I found the following code, which works just fine in a console application:
static string FindUPSPlease()
{
    string post_data = "<xml data string>";
    string uri = "https://onlinetools.ups.com/ups.app/xml/Rate";

    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
    request.Method = "POST";
    request.KeepAlive = false;
    request.ProtocolVersion = HttpVersion.Version10;

    byte[] postBytes = Encoding.ASCII.GetBytes(post_data);
    request.ContentType = "application/x-www-form-urlencoded";
    request.ContentLength = postBytes.Length;

    Stream requestStream = request.GetRequestStream();
    requestStream.Write(postBytes, 0, postBytes.Length);
    requestStream.Close();

    // Get the response and send it to the console.
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    Console.WriteLine(new StreamReader(response.GetResponseStream()).ReadToEnd());
    Console.WriteLine(response.StatusCode);

    return "done";
}
This runs in Visual Studio just fine and gets a nice little response from UPS that the XML is, of course, malformed.
But, if I paste this function into the web application without changing a single character, an exception is thrown on request.GetRequestStream():
Authentication failed because the remote party has closed the transport stream.
I tried it in a couple of different places in the application, with the same result.
What is there about the web application environment that would affect the request?
It turns out to be a TLS issue. I guess the console app uses a higher TLS version by default than the web application, although none was specified. So all you have to do is add the following line(s) of code sometime prior to making the request:
using System.Net;
...
System.Net.ServicePointManager.SecurityProtocol |= SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
That was all it took, though I spent an enormous amount of time getting there.
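In a web application, one plausible place to put it (my suggestion, assuming a Global.asax exists; this placement is not part of the original fix) is Application_Start, so it runs before any outbound request:

// Global.asax.cs (sketch)
protected void Application_Start(object sender, EventArgs e)
{
    // Opt in to TLS 1.1/1.2 before the first HTTPS call is made.
    System.Net.ServicePointManager.SecurityProtocol |=
        System.Net.SecurityProtocolType.Tls11 | System.Net.SecurityProtocolType.Tls12;
}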
Here is the response from UPS on the issue:
Effective January 18, 2018, UPS will only accept TLS 1.1 and TLS 1.2 security protocols... 100% of requests from customers who are on TLS 1.0 while using production URLS (onlinetools.ups.com/tool name) will be rejected.
Anyway, hope this helps someone.
Jim
Can you try setting the Credentials on your request object, like the following?
request.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
Try setting the default credentials, or check whether a proxy server is set and pass it as in the example below.
The example is given for WebClient.
I was having a problem with setting the default credentials because a proxy was enabled on the server, so I passed the proxy URL and port, with credentials that can access it.
using (System.Net.WebClient web = new System.Net.WebClient())
{
    //IWebProxy defaultWebProxy = WebRequest.DefaultWebProxy;
    //defaultWebProxy.Credentials = CredentialCache.DefaultCredentials;
    //web.Proxy = defaultWebProxy;

    // proxyURL, proxyPort, proxyUserId, proxyPassword and URL are supplied elsewhere.
    var proxyURI = new Uri(string.Format("{0}:{1}", proxyURL, proxyPort));

    // Set credentials
    System.Net.ICredentials credentials = new System.Net.NetworkCredential(proxyUserId, proxyPassword);

    // Set proxy; the second argument bypasses the proxy for local addresses.
    web.Proxy = new System.Net.WebProxy(proxyURI, true, null, credentials);

    web.Headers.Add("Content-Type", "application/x-www-form-urlencoded");
    var result = web.UploadString(URL, "");
    return result;
}
I am working on an API-based app in Xamarin using the HttpWebRequest class. I have to send a request to the URL
http://example.com/APIRequest/Request?Parts=33333|N|2014|ABCD
But when I watch this request in Fiddler, it shows the URL as
http://example.com/APIRequest/Request?Parts=33333%7CN%7C2014%7CABCD
The problem is that the server does not understand this encoded URL and returns errors, which is beyond my control.
Earlier, in a .NET 2.0 C# application, I was using
Uri url = new Uri(rawurl, true);
But the second parameter (dontEscape) has been deprecated in the .NET 4.0 profile that MonoTouch on Xamarin uses, so it either gives an error or simply does nothing.
I have tried everything I could think of (UrlDecode, HtmlDecode, double decoding, even Java's URLDecoder), but nothing has worked; Fiddler always shows the encoded URL.
Please suggest how to overcome this problem, or an alternative to the old new Uri(url, true) overload.
UPDATE:
After hours and hours, I think I have found the culprit. The problem is this:
when I use new Uri(url, true), it passes the unescaped URL containing | (pipe) to WebRequest.Create, but if I remove true it passes the encoded URL, which produces a result; unfortunately the server doesn't understand that, so I get an error.
Uri ourUri = new Uri(url, true);
myHttpWebResponse1 = (System.Net.HttpWebResponse)request.GetResponse();
But it may be a bug that request.GetResponse() stops working without throwing any exception, and the process hangs, when I use a | (pipe) in the URL.
Any possible solution to that?
My complete function is given below (modified with a hardcoded URL):
public static string getURLCustom(string GETurl, string GETreferal)
{
    GETurl = "http://example.com/?req=111111|wwww|N|2014|asdwer4";
    GETreferal = "";

    Uri ourUri = new Uri(GETurl.Trim(), true);
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(ourUri);
    request.Method = "GET";
    request.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 6.1; ru; rv:1.9.2.3) Gecko/20100401 Firefox/4.0 (.NET CLR 3.5.30729)";
    request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
    request.KeepAlive = true;
    request.CookieContainer = loginCookie; // stored after login
    request.ContentType = "application/x-www-form-urlencoded";
    request.Referer = GETreferal;
    request.AllowAutoRedirect = true;

    HttpWebResponse myHttpWebResponse1 = (HttpWebResponse)request.GetResponse();
    StreamReader postreqreader1 = new StreamReader(myHttpWebResponse1.GetResponseStream());
    return postreqreader1.ReadToEnd();
}
And yes, this code works perfectly in the .NET 2.0 Windows application, but not in the Xamarin MonoTouch app.
It seems the server you are connecting to does not support internationalized resource identifiers (IRIs).
IRI parsing is enabled by default since Mono 3.10 (see the Mono 3.10 release notes).
You can disable it on your client application by doing:
FieldInfo iriParsingField = typeof (Uri).GetField ("s_IriParsing",
BindingFlags.Static | BindingFlags.GetField | BindingFlags.NonPublic);
if (iriParsingField != null)
iriParsingField.SetValue (null, false);
You can also disable IRI parsing by setting the environment variable MONO_URI_IRIPARSING to false.
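If you take the environment-variable route from managed code instead, it has to be in effect before the first Uri is constructed in the process (a sketch; this assumes the runtime reads the variable lazily on first use of Uri):

// At the very top of Main, before any System.Uri is created:
Environment.SetEnvironmentVariable("MONO_URI_IRIPARSING", "false");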
I have an HTTP-based API which I potentially need to call many times. The problem is that I can't get the request to take less than about 20 seconds, though the same request made through a browser is near-instantaneous. The following code illustrates how I have implemented it so far:
WebRequest r = HttpWebRequest.Create("https://example.com/http/command?param=blabla");
var response = r.GetResponse();
One solution would be to make an asynchronous request, but I would like to know why it takes so long and whether I can avoid it. I have also tried using the WebClient class, but I suspect it uses WebRequest internally.
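For reference, an asynchronous version would look roughly like this (a sketch; it moves the wait off the calling thread but does not shorten the delay itself):

WebRequest r = WebRequest.Create("https://example.com/http/command?param=blabla");
r.BeginGetResponse(ar =>
{
    // Runs on a thread-pool thread once the response headers arrive.
    using (var response = r.EndGetResponse(ar))
    {
        // consume the response here
    }
}, null);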
Update:
Running the following code took about 40 seconds in Release Mode (measured with Stopwatch):
WebRequest g = HttpWebRequest.Create("http://www.google.com");
var response = g.GetResponse();
I'm working at a university where different things in the network configuration might affect performance, but direct use of the browser shows that it should be near-instant.
Update 2:
I uploaded the code to a remote machine and it worked fine, so the conclusion must be that the .NET code does something extra compared to the browser, or that it has problems resolving the address through the university network (proxy issues or something?!).
This problem is similar to another question on Stack Overflow:
HttpWebRequest is extremely slow (Stack Overflow question 2519655)
Most of the time the problem is the Proxy property. You should set this property to null; otherwise the object will attempt to search for an appropriate proxy server to use before going directly to the source. Note: this behavior is turned on by default, so you have to explicitly tell the object not to perform the proxy search.
request.Proxy = null;
using (var response = (HttpWebResponse)request.GetResponse())
{
}
I was getting the 30-second delay on the 'first' attempt; JamesR's reference to the other post mentioning setting the proxy to null solved it instantly!
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(_site.url);
request.Proxy = null; // <-- this is the good stuff
...
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Does your site have an invalid SSL cert? Try adding this
using System.Net.Security;
using System.Security.Cryptography.X509Certificates;

// ...

ServicePointManager.ServerCertificateValidationCallback =
    new RemoteCertificateValidationCallback(AlwaysAccept);

// ... where AlwaysAccept is defined as:
public bool AlwaysAccept(object sender, X509Certificate certification, X509Chain chain, SslPolicyErrors sslPolicyErrors)
{
    return true;
}
You don't close your response. As soon as you hit the number of allowed connections, you have to wait for the earlier ones to time out. Try
using (var response = g.GetResponse())
{
    // do stuff with your response
}
I'm attempting to use FogBugz's BugzScout to automatically submit unhandled application exceptions to my FogBugz On Demand account. I've written a wrapper class for it, and everything appears to be just groovy on my box. Testing the same code in the production environment, behind a proxy that requires authentication, I have had nothing but issues.
I went to work modifying the BugzScout code to get it to authenticate with the proxy, and after trying many of the different methods a Google search suggested, I found one that works! But now I'm getting a "connection actively refused" error from FogBugz itself, and I don't know what to do.
Here is the code where BugzScout connects via a .NET WebClient to submit a new case, with my modifications to deal with our proxy. What am I doing that would cause FogBugz to refuse my request? I've removed all non-web-client-related code from the procedure for ease of reading.
public string Submit()
{
    WebClient client = new WebClient();

    WebProxy proxy = new WebProxy();
    proxy.UseDefaultCredentials = true;
    client.Proxy = proxy;

    // fogBugzUrl and defaultMsg are fields set elsewhere in the class.
    Byte[] response = client.DownloadData(fogBugzUrl);
    string responseText = System.Text.Encoding.UTF8.GetString(response);
    return (responseText == "") ? this.defaultMsg : responseText;
}
The URL is correct and the case is filled in properly; this has been verified.
EDIT: Additional info.
Using FogBugz On Demand.
Using the FogBugz.net code in its entirety, with only these additions:
WebProxy proxy = new WebProxy();
proxy.UseDefaultCredentials = true;
client.Proxy = proxy;
Error occurs when attempting to connect to both https://oursite.fogbugz.com/scoutsubmit.asp and http://oursite.fogbugz.com//scoutsubmit.asp (except one says port 443, and the other port 80, obviously)
I don't know anything about web authentication, so I can't tell you what kind I'm using; if you tell me where to look, I'd be happy to answer that for you.
Got the fix from FogBugz. This is the appropriate network code to get through the proxy authentication and not mis-authenticate with BugzScout:
WebRequest.DefaultWebProxy.Credentials = CredentialCache.DefaultNetworkCredentials;

WebRequest request = WebRequest.Create(fogBugzUrl);
request.ContentType = "application/x-www-form-urlencoded";
request.Method = "POST";
request.Proxy.Credentials = CredentialCache.DefaultNetworkCredentials;

// 'bytes' is the URL-encoded form payload prepared earlier (not shown here).
Stream requestStream = request.GetRequestStream();
requestStream.Write(bytes, 0, bytes.Length);
requestStream.Close();
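For completeness, the response would then be read along these lines (my addition, not part of the fix FogBugz supplied):

using (WebResponse response = request.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    // Judging by the Submit() wrapper above, an empty body indicates success.
    string responseText = reader.ReadToEnd();
}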
Is your fogBugzUrl using HTTP Basic Authentication? Is it SSL (hosted on On Demand)?
The "connection actively refused" message would be coming from the web server itself, not really FogBugz.
Can you post the HTTP Status Code?
One thing to note if you are using FogBugz On Demand: you HAVE to use the https:// URL (not the http one).
My use case is this: I want to call out to a web service, and if I am behind a proxy server that requires authentication, I want to just use the default credentials...
WebRequest.DefaultWebProxy.Credentials = CredentialCache.DefaultCredentials;
Otherwise I'll simply make the call. It would be very nice to determine whether authentication is required up front, rather than handle the exception after I attempt the call.
Ideas?
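One way to probe for a configured proxy up front (a sketch using IWebProxy.GetProxy, which returns the destination URI itself when no proxy applies; note this tells you a proxy is configured, not whether it will demand authentication):

Uri target = new Uri("https://example.com/service"); // hypothetical endpoint
IWebProxy systemProxy = WebRequest.GetSystemWebProxy();
bool usesProxy = !systemProxy.IsBypassed(target)
                 && systemProxy.GetProxy(target) != target;
if (usesProxy)
    WebRequest.DefaultWebProxy.Credentials = CredentialCache.DefaultCredentials;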
It was only after I had first deployed my app that I realised some users were behind firewalls... off to work to test it. Rather than test for a '407 Proxy Authentication Required', I just do the same proxy setup whether it might be needed or not...
System.Net.HttpWebRequest req = (System.Net.HttpWebRequest)System.Net.WebRequest.Create(uri.AbsoluteUri);
//HACK: add proxy
IWebProxy proxy = WebRequest.GetSystemWebProxy();
proxy.Credentials = System.Net.CredentialCache.DefaultCredentials;
req.Proxy = proxy;
req.PreAuthenticate = true;
//HACK: end add proxy
req.AllowAutoRedirect = true;
req.MaximumAutomaticRedirections = 3;
req.UserAgent = "Mozilla/6.0 (MSIE 6.0; Windows NT 5.1; DeepZoomPublisher.com)";
req.KeepAlive = true;
req.Timeout = 3 * 1000; // 3 seconds
I'm not sure what the relative advantages/disadvantages are (try{}/catch{} without the proxy first, versus just always using the above), but this code now seems to work for me both at work (authenticating proxy) and at home (none).
System.Net.WebProxy has a property called UseDefaultCredentials that may be what you want (but I have to admit a bit of ignorance here); see the documentation for that property.
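For example (the proxy address and port here are placeholders):

WebProxy proxy = new WebProxy("http://proxy.example.com:8080"); // placeholder address
proxy.UseDefaultCredentials = true; // forward the current user's default credentials
WebRequest.DefaultWebProxy = proxy;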
Actually, it seems this isn't an issue after all. Previously I was setting the auth like so...
WebProxy proxy = new WebProxy(@"http://<myProxyAddress>:8080");
proxy.Credentials = new NetworkCredential(<myUserName>, <myPassword>, <myDomain>);
WebRequest.DefaultWebProxy = proxy;
This was fine when I was behind the proxy, but it threw an error when there was no proxy, so of course I expected the same error above since I was still setting the same credentials. But you know what they say about assuming things... in fact there is no error at all with setting the default creds; all is sweet.
If you want to check for the Proxy settings in IE, you could also peek into the registry: check the HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings branch of the registry tree - lots of options and settings there. Most notably: ProxyEnable (a DWORD, 0 = no proxy, 1 = proxy enabled).
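A minimal sketch of reading those values (ProxyServer is the companion value holding the address; error handling omitted):

using Microsoft.Win32;

using (RegistryKey key = Registry.CurrentUser.OpenSubKey(
    @"Software\Microsoft\Windows\CurrentVersion\Internet Settings"))
{
    object enabled = key != null ? key.GetValue("ProxyEnable") : null; // DWORD: 0 = off, 1 = on
    object server  = key != null ? key.GetValue("ProxyServer") : null; // e.g. "myproxy:8080"
    bool proxyEnabled = enabled is int && (int)enabled == 1;
    Console.WriteLine(proxyEnabled ? "Proxy: " + server : "No proxy configured");
}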
It looks like if you leave the Proxy stuff alone, .NET should just use the IE proxy settings, which seems like the most "correct" way of dealing with proxies...