WebRequest converting URL to encoded form in Xamarin (MonoTouch) - C#

I am working on an API-based app in Xamarin using the HttpWebRequest class. I have to send a request to the URL
http://example.com/APIRequest/Request?Parts=33333|N|2014|ABCD
But when I inspect this request in Fiddler, it shows the URL as
http://example.com/APIRequest/Request?Parts=33333%7CN%7C2014%7CABCD
Now the problem is that the server does not understand this encoded URL and returns errors, and the server is beyond my control.
Earlier, in a .NET 2.0 C# application, I was using
Uri url = new Uri(rawurl, true);
But the second parameter has been deprecated in the .NET 4.0 / MonoTouch profile available in Xamarin, so it either gives an error or simply does nothing.
I have tried everything I could think of, like UrlDecode, HtmlDecode, double decoding, and even Java's URLDecoder, but nothing has worked; Fiddler always shows the encoded URL.
Please suggest how to overcome this problem, or an alternative to the old new Uri(url-string, true) overload.
UPDATE:
After spending hours and hours, I have probably found the culprit. The problem is:
When I use new Uri(url, true), it passes the unescaped URL containing | (pipe) to WebRequest.Create, but if I remove true it passes the encoded URL, which produces a result that the server unfortunately doesn't understand, so I get an error.
Uri ourUri = new Uri(url, true);
myHttpWebResponse1 = (System.Net.HttpWebResponse)request.GetResponse();
But it may be a bug that request.GetResponse() stops working without throwing any exception and the process hangs if I use | (pipe) in the URL.
Any possible solution to that?
My complete function is given below (modified with a hardcoded URL):
public static string getURLCustom(string GETurl, string GETreferal)
{
    GETurl = "http://example.com/?req=111111|wwww|N|2014|asdwer4";
    GETreferal = "";
    Uri ourUri = new Uri(GETurl.Trim(), true);
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(ourUri);
    request.Method = "GET";
    request.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 6.1; ru; rv:1.9.2.3) Gecko/20100401 Firefox/4.0 (.NET CLR 3.5.30729)";
    request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
    request.KeepAlive = true;
    request.CookieContainer = loginCookie; // stored after login
    request.ContentType = "application/x-www-form-urlencoded";
    request.Referer = GETreferal;
    request.AllowAutoRedirect = true;
    HttpWebResponse myHttpWebResponse1 = (HttpWebResponse)request.GetResponse();
    StreamReader postreqreader1 = new StreamReader(myHttpWebResponse1.GetResponseStream());
    return postreqreader1.ReadToEnd();
}
And yes, this code works perfectly in a .NET 2.0 Windows application but not in the Xamarin MonoTouch app.

It seems the server you are connecting to does not support Internationalized Resource Identifiers (IRI).
IRI parsing has been enabled by default since Mono 3.10 (see the Mono 3.10 release notes).
You can disable it in your client application by doing:
// Flip the private static s_IriParsing flag on System.Uri via reflection.
FieldInfo iriParsingField = typeof(Uri).GetField("s_IriParsing",
    BindingFlags.Static | BindingFlags.GetField | BindingFlags.NonPublic);
if (iriParsingField != null)
    iriParsingField.SetValue(null, false);
You can also disable IRI parsing by setting the environment variable MONO_URI_IRIPARSING to false.
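As a minimal sketch, both approaches can be wrapped in a single helper that runs once at start-up (for a MonoTouch app, for example, at the top of FinishedLaunching in the AppDelegate). Note the assumptions: the field name s_IriParsing is an internal detail of the runtime and may change between versions, and setting the environment variable from inside the process only helps if no Uri has been constructed yet.

using System;
using System.Reflection;

static class IriWorkaround
{
    public static void DisableIriParsing()
    {
        // Mono reads MONO_URI_IRIPARSING at start-up; setting it here only works
        // if System.Uri has not been touched yet in this process.
        Environment.SetEnvironmentVariable("MONO_URI_IRIPARSING", "false");

        // Fall back to flipping the private static flag on System.Uri via reflection.
        FieldInfo iriParsingField = typeof(Uri).GetField("s_IriParsing",
            BindingFlags.Static | BindingFlags.GetField | BindingFlags.NonPublic);
        if (iriParsingField != null)
            iriParsingField.SetValue(null, false);
    }
}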

Related

https request fails only in .net web app

I am trying to patch a .NET web application that, after years of working, started failing to get UPS shipping quotes, which is impacting web business dramatically. After much trial and error, I found the following code, which works just fine in a console application:
static string FindUPSPlease()
{
    string post_data = "<xml data string>";
    string uri = "https://onlinetools.ups.com/ups.app/xml/Rate";

    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
    request.Method = "POST";
    request.KeepAlive = false;
    request.ProtocolVersion = HttpVersion.Version10;

    byte[] postBytes = Encoding.ASCII.GetBytes(post_data);
    request.ContentType = "application/x-www-form-urlencoded";
    request.ContentLength = postBytes.Length;

    Stream requestStream = request.GetRequestStream();
    requestStream.Write(postBytes, 0, postBytes.Length);
    requestStream.Close();

    // get response and send to console
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    Console.WriteLine(new StreamReader(response.GetResponseStream()).ReadToEnd());
    Console.WriteLine(response.StatusCode);
    return "done";
}
This runs in Visual Studio just fine and gets a nice little response from UPS that the XML is, of course, malformed.
But, if I paste this function into the web application without changing a single character, an exception is thrown on request.GetRequestStream():
Authentication failed because the remote party has closed the transport stream.
I tried it in a couple of different places in the application with the same result.
What is there about the web application environment that would affect the request?
It turns out to be a TLS issue. I guess the console app uses a higher protocol version by default than the web application, although none was specified. So all you have to do is add the following line(s) of code at some point prior to making the request:
using System.Net;
...
System.Net.ServicePointManager.SecurityProtocol |= SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
That was all it took, though I spent an enormous amount of time getting there.
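In a web application, a natural place for this line is Application_Start in Global.asax, so it runs once before any outgoing request; a sketch, assuming your project has that bootstrap point:

protected void Application_Start()
{
    // Opt in to TLS 1.1/1.2 once, before any outgoing HTTPS request is made.
    System.Net.ServicePointManager.SecurityProtocol |=
        System.Net.SecurityProtocolType.Tls11 | System.Net.SecurityProtocolType.Tls12;
}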
Here is the response from UPS on the issue:
Effective January 18, 2018, UPS will only accept TLS 1.1 and TLS 1.2 security protocols... 100% of requests from customers who are on TLS 1.0 while using production URLS (onlinetools.ups.com/tool name) will be rejected.
Anyway, hope this helps someone.
Jim
Can you try setting the Credentials on your request object like the following?
request.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
Try setting the default credentials, or check whether a proxy server is configured and pass it as in the example below.
The example is given for WebClient.
I was having a problem setting the default credentials, as a proxy was enabled on the server, so I passed the proxy URL and port along with credentials that can access it.
using (System.Net.WebClient web = new System.Net.WebClient())
{
    // proxyURL, proxyPort, proxyUserId, proxyPassword and URL come from your own configuration.
    //IWebProxy defaultWebProxy = WebRequest.DefaultWebProxy;
    //defaultWebProxy.Credentials = CredentialCache.DefaultCredentials;
    //web.Proxy = defaultWebProxy;
    var proxyURI = new Uri(string.Format("{0}:{1}", proxyURL, proxyPort));

    // Set credentials
    System.Net.ICredentials credentials = new System.Net.NetworkCredential(proxyUserId, proxyPassword);

    // Set proxy
    web.Proxy = new System.Net.WebProxy(proxyURI, true, null, credentials);
    web.Headers.Add("Content-Type", "application/x-www-form-urlencoded");

    var result = web.UploadString(URL, "");
    return result;
}
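The original question uses HttpWebRequest rather than WebClient; the equivalent there is roughly the following sketch (same hypothetical proxy variables as above, and url is whatever address you are requesting):

HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
// Route the request through an authenticated proxy...
request.Proxy = new System.Net.WebProxy(proxyURI, true, null,
    new System.Net.NetworkCredential(proxyUserId, proxyPassword));
// ...or, when no explicit proxy is involved, just supply the default credentials.
request.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;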

Is there a method to send HttpWebRequest to download all URLs and handle all exceptions?

I am working on a MultiThreadingDownloader over HTTP, so I have to use HttpWebRequest to make partial (range) requests. The application works well for almost all URLs, but sometimes, when trying to get the response, it throws an exception or misbehaves (e.g. ContentLength returns -1, TLS/SSL errors, cookie-required links).
I don't have enough knowledge about the client-server relationship to handle all of these cases.
I am currently using this method:
public static HttpWebRequest SendRequest(string url)
{
    HttpWebRequest req = WebRequest.Create(url) as HttpWebRequest;
    req.AllowAutoRedirect = true;
    req.Accept = "*/*";
    req.Method = "GET";
    req.UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)";
    req.ServicePoint.ConnectionLimit = 8;
    req.ServicePoint.Expect100Continue = true;
    req.ProtocolVersion = HttpVersion.Version10;
    ServicePointManager.ServerCertificateValidationCallback += (sender, cert, chain, sslPolicyErrors) => true;
    ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12 | SecurityProtocolType.Ssl3;
    return req;
}
What should an HttpWebRequest contain to get the HttpWebResponse successfully for all URL sources? As I said, my method does not work for all of them. Detailed documentation or a code sample would help me.
The server is at liberty not to provide a content length; this is useful for streaming data. Here you can't do much about it: you can't support segmented downloads for such URLs, and there is no way to discover this situation other than by trying.
It's the same with HTTPS and cookies: the server is free to reply with anything it wants and to require any input it likes.
You will need to handle all of these cases specially; they are allowed by the HTTP protocol. HttpWebRequest does not have much built in to help you. The only helpful features in this regard are following redirects and decompression. This is on by default, so you probably did not even notice it is there.
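For example, before splitting a download into segments you can probe the URL with a one-byte ranged request and fall back to a single-connection download when the server does not honour ranges or does not report a length. A minimal sketch (the helper name SupportsRangeRequests is mine, not part of the framework):

static bool SupportsRangeRequests(string url)
{
    HttpWebRequest probe = (HttpWebRequest)WebRequest.Create(url);
    probe.Method = "GET";
    probe.AddRange(0, 0); // ask for the first byte only
    try
    {
        using (HttpWebResponse resp = (HttpWebResponse)probe.GetResponse())
        {
            // 206 Partial Content means the Range header was honoured;
            // a plain 200 (or ContentLength == -1) means download on a single connection.
            return resp.StatusCode == HttpStatusCode.PartialContent;
        }
    }
    catch (WebException)
    {
        return false; // treat any failure as "no segmented download for this URL"
    }
}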

WebRequest.GetResponse() returns 404 on valid URL

I'm trying to scrape a web page via a C# application, but it keeps responding with
"The remote server returned an error: (404) Not Found."
The web page is accessible through a browser, but the app keeps failing. Any help appreciated.
var d = DateTime.UtcNow.Date;
var AddressString = @"http://www.booking.com/searchresults.html?src=searchresults&si=ai%2Cco%2Cci%2Cre%2Cdi&ss={0}&checkin_monthday={1}&checkin_year_month={2}&checkout_monthday={3}&checkout_year_month={4}";
var URi = String.Format(AddressString, "Prague", d.Day, d.Year + "-" + d.Month, d.Day + 1, d.Year + "-" + d.Month);

var request = (HttpWebRequest)WebRequest.Create(URi);
request.Timeout = 5000;
request.UserAgent = "Fiddler"; // I tried to set the next three properties so they are not null
request.Credentials = CredentialCache.DefaultCredentials;
request.Proxy = WebProxy.GetDefaultProxy();

try
{
    var response = (HttpWebResponse)request.GetResponse();
}
catch (WebException e)
{
    var response = (HttpWebResponse)e.Response; // e.Response contains the web page, but it is incomplete
    StreamReader sr = new StreamReader(response.GetResponseStream());
    HtmlDocument doc = new HtmlDocument();
    doc.Load(sr);
    var a = doc.DocumentNode.SelectNodes("div[@class='resut-details']"); // fails, as not all desired nodes are in the response
}
EDIT:
Hi guys, thanks for the suggestions.
I added the header "Accept-Encoding: gzip,deflate,sdch" according to David Martin's reply, but it didn't help on its own.
I used Fiddler to try to get any information about the problem, but I saw that app for the first time and it didn't make me any smarter. On the other hand, I changed request.UserAgent to the one sent by my browser ("User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/34.0.1847.131 Safari/537.36") and voilà, I am not getting the 404 exception anymore, but the document is not readable, as it is filled with characters like ¿½O~���G�. I tried setting request.TransferEncoding = "UTF-8", but to enable this property, request.SendChunked must be set to true, which ends in
ProtocolViolationException
Additional information: Content-Length or Chunked Encoding cannot be set for an operation that does not write data.
EDIT 2:
I'm forgetting something and I can't figure out what. I'm getting an encoded response of some kind and need to decode it first to read it correctly. Even in Fiddler, when I want to see the response, I need to confirm decoding to inspect the result. After I decode it in Fiddler, I get just what I want to get into my application...
So, after trying the suggestions from Jon Skeet and David Martin, I got somewhere further and found the relevant answer to a new question in another topic. If anyone ever looks for something similar, the answer is here:
.NET: Is it possible to get HttpWebRequest to automatically decompress gzip'd responses?
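For reference, the key change described there is to let HttpWebRequest decompress the gzip/deflate response itself instead of reading the compressed bytes; a minimal sketch applied to the request from the question above:

var request = (HttpWebRequest)WebRequest.Create(URi);
request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/34.0.1847.131 Safari/537.36";
// Have the framework undo gzip/deflate transparently, so the response stream is plain text.
request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;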

Mono: HttpWebRequest SSL Errors on non-SSL URI

I have a console application that I wrote for .NET/Windows that I suddenly had the need for on my unix system. Mono has, for the most part, been hugely successful at providing this for me.
There is however a small issue:
The application issues many HttpWebRequests as it runs, and for a small portion of these, Mono is returning an error:
Error getting response stream (Write: The authentication or decryption has failed.): SendFailure
This error message seems to indicate an SSL error. However, this application does not issue any requests to SSL-secured URIs (i.e. all URIs are http://).
The main code in question is as follows:
HttpWebRequest req = WebRequest.Create(url) as HttpWebRequest;
req.UserAgent = UserAgent;
req.AuthenticationLevel = AuthenticationLevel.None;
req.AllowAutoRedirect = true;
req.KeepAlive = true;
req.CachePolicy = new RequestCachePolicy(RequestCacheLevel.NoCacheNoStore);
req.Timeout = timeout;
if (useCompression) {
    req.Headers.Add(HttpRequestHeader.AcceptEncoding, "gzip,deflate");
}
Edit: For the purposes of running this code, you can define dummy variables as follows:
string url = "http://example.com";
string UserAgent = "WhateverBot";
int timeout = 5000;
bool useCompression = true;
It should be noted that the code works without any problem on Windows/.NET.
I also had this problem when running Mono on Windows, even when the URL is not HTTPS. Let me say this loudly: IT'S NOT REDIRECTED.
The solution has been to use this hack:
System.Net.ServicePointManager.ServerCertificateValidationCallback += delegate
{
return true;
};
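If you rely on this workaround, it is safer to scope it than to accept every certificate. A sketch under the assumption that only one known host actually needs the bypass (example.com is a placeholder):

System.Net.ServicePointManager.ServerCertificateValidationCallback =
    (sender, certificate, chain, sslPolicyErrors) =>
    {
        // Accept anything that validates normally; only bypass validation
        // for the one host this workaround is actually needed for.
        if (sslPolicyErrors == System.Net.Security.SslPolicyErrors.None)
            return true;
        var req = sender as System.Net.HttpWebRequest;
        return req != null && req.RequestUri.Host == "example.com";
    };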
My guess is that Mono does not look in the Windows certificate store, so it does not find any certificates, the HttpWeb* classes are not initialized correctly, and they then fail in a misleading way.
My guess is also that anyone taking the trouble to reproduce your error will know what they are doing with Mono (unlike me, for example), will therefore have their environment set up correctly, and will be unable to reproduce this real-world problem.

Logging in to eBay using HttpWebRequest fails due to 'The browser you are using is rejecting cookies' response

I'm trying to log in to my eBay account using the following code:
string signInURL = "https://signin.ebay.com/ws/eBayISAPI.dll?co_partnerid=2&siteid=0&UsingSSL=1";
string postData = String.Format("MfcISAPICommand=SignInWelcome&userid={0}&pass={1}", "username", "password");
string contentType = "application/x-www-form-urlencoded";
string method = "POST";
string userAgent = "Mozilla/5.0 (Windows; U; MSIE 7.0; Windows NT 6.0; en-US)";
CookieContainer cookieContainer = new CookieContainer();
HttpWebRequest req = (HttpWebRequest)HttpWebRequest.Create(signInURL);
req.CookieContainer = cookieContainer;
req.Method = method;
req.ContentType = contentType;
req.UserAgent = userAgent;
ASCIIEncoding encoding = new ASCIIEncoding();
byte[] loginDataBytes = encoding.GetBytes(postData);
req.ContentLength = loginDataBytes.Length;
Stream stream = req.GetRequestStream();
stream.Write(loginDataBytes, 0, loginDataBytes.Length);
stream.Close();
HttpWebResponse res = (HttpWebResponse)req.GetResponse();
StreamReader xsr = new StreamReader(res.GetResponseStream());
String responseText = xsr.ReadToEnd();
Obviously substituting my real username and password. When I look at the string responseText, I see that part of the response from eBay is
The browser you are using is rejecting cookies.
Any ideas what I'm doing wrong?
P.S. And yes, I am also using the eBay API, but this is for something slightly different than what I want to do with the API.
You're doing a direct HTTP request. The eBay site has functionality to talk to a browser (probably to store the session cookie). Unless you make the request code smart enough to use cookies correctly, it won't work. You'll probably have to use the Internet Explorer object instead.
Before doing the POST you need to download the page with the form you are submitting in your code, take the cookie it gives you, put it in your CookieContainer (making sure you get the path right), and send it back up with your request.
To clarify, while you might be POSTing the correct data, you are not sending the cookie that needs to go with it. You will get this cookie from the login page.
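A rough sketch of that two-step flow, reusing one CookieContainer for both requests (signInURL, postData and the encoding are taken from the question; the exact form fields eBay expects may differ):

CookieContainer cookieContainer = new CookieContainer();

// Step 1: GET the sign-in page so the server can set its session cookie in the container.
HttpWebRequest getReq = (HttpWebRequest)WebRequest.Create(signInURL);
getReq.CookieContainer = cookieContainer;
getReq.Method = "GET";
using (getReq.GetResponse()) { }

// Step 2: POST the login form with the same container, so the cookie goes back up with it.
HttpWebRequest postReq = (HttpWebRequest)WebRequest.Create(signInURL);
postReq.CookieContainer = cookieContainer;
postReq.Method = "POST";
postReq.ContentType = "application/x-www-form-urlencoded";
byte[] body = Encoding.ASCII.GetBytes(postData);
postReq.ContentLength = body.Length;
using (Stream s = postReq.GetRequestStream())
    s.Write(body, 0, body.Length);
HttpWebResponse postRes = (HttpWebResponse)postReq.GetResponse();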
You need to intercept the HTTP traffic to see exactly what happened. I use Fiddler2; it is a good tool for debugging HTTP, so I can tell who is wrong, my application or the remote web server.
Using Fiddler, you can see the request headers and the response headers with their cookies, as well as the response content. It sits in the middle between your app and eBay.
Based on my experience, I think the eBay cookie sent to you is not being sent back to the eBay server. Fiddler will show whether or not that is the case.
Another thing: the response cookie you receive should be sent back with the next request by using the same CookieContainer.
You should note that CookieContainer has a bug in its .Add(Cookie) and .GetCookies(uri) methods. You may not be using them directly, but internal code might.
See the details and fix here:
http://dot-net-expertise.blogspot.com/2009/10/cookiecontainer-domain-handling-bug-fix.html
CallMeLaNN
