C# WebClient problem using a URL with URL-encoded Cyrillic chars

I'm trying to load a file from a web server with a request URL that contains a parameter with Cyrillic chars.
But I can't get this to work in C#, even if I URL-encode the parameter.
When I open the page in IE with
http://translate.google.com/translate_tts?tl=ru&q=ЗДРАВСТВУЙТЕ
the server does not respond.
Using the URL-encoded version
http://translate.google.com/translate_tts?tl=ru&q=%d0%97%d0%94%d0%a0%d0%90%d0%92%d0%a1%d0%a2%d0%92%d0%a3%d0%99%d0%a2%d0%95
the server responds as expected.
Now my problem:
I want to download the MP3 from C# ...
var url = string.Format("http://translate.google.com/translate_tts?tl=ru&q={0}",
    Server.UrlEncode("ЗДРАВСТВУЙТЕ"));
var client = new System.Net.WebClient();
var res = client.DownloadData(url);
And this does NOT work with Cyrillic chars. I always get a zero-byte answer, just like the first, non-encoded request.
When I send "normal" chars, the code above works fine.
So ... I'm doing something wrong.
Any hints? Tips? Solutions?
Thanks
Michael

You have to set the user-agent for the WebClient - this works:
string url = "http://translate.google.com/translate_tts?tl=ru&q=ЗДРАВСТВУЙТЕ";
WebClient client = new WebClient();
client.Headers.Add("user-agent",
    "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)");
var res = client.DownloadData(url);
From the MSDN documentation:
A WebClient instance does not send optional HTTP headers by default. If your request requires an optional header, you must add the header to the Headers collection. For example, to retain queries in the response, you must add a user-agent header. Also, servers may return 500 (Internal Server Error) if the user agent header is missing.

Try adding
client.Encoding = System.Text.Encoding.UTF8;
I don't use the user-agent header, but this works for me:
WebClient client = new WebClient();
client.Encoding = System.Text.Encoding.UTF8;
string response = client.DownloadString(_url);
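Putting the two answers together, a minimal sketch might look like the following. It uses `Uri.EscapeDataString` to percent-encode the UTF-8 bytes of the Cyrillic text (producing the `%D0%97...` form the server accepts); the user-agent value here is just an illustrative placeholder, not necessarily one the server requires.

```csharp
using System;
using System.Net;
using System.Text;

class TtsDownload
{
    static void Main()
    {
        // Uri.EscapeDataString percent-encodes the UTF-8 bytes of the text.
        string q = Uri.EscapeDataString("ЗДРАВСТВУЙТЕ");
        string url = "http://translate.google.com/translate_tts?tl=ru&q=" + q;

        using (var client = new WebClient())
        {
            client.Encoding = Encoding.UTF8;                      // fallback text encoding
            client.Headers.Add("user-agent", "Mozilla/5.0 (compatible)");
            byte[] mp3 = client.DownloadData(url);                // MP3 bytes on success
            Console.WriteLine(mp3.Length);
        }
    }
}
```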

Related

WebClient returning 403 error only for this website?

I am trying to download a file from these links using the C# WebClient, but I am getting a 403 error.
https://www.digikey.com/product-search/download.csv?FV=ffe00035&quantity=0&ColumnSort=0&page=5&pageSize=500
https://www.digikey.com/product-search/download.csv?FV=ffe00035&quantity=0&ColumnSort=0&page=4&pageSize=500
I tried different user agents, accept-encoding headers, etc.
I also tried replacing https with http in the URL, but with no success.
When I paste these URLs into Chrome, Firefox, or IE, I can download the file; sometimes it gives a 403 error, and after I change https to http it downloads. But no success with WebClient.
I tried Fiddler to inspect the traffic, with no success.
Can someone try this on your system and help solve the problem?
Here is my code:
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
WebClient client = new WebClient();
Uri request_url = new Uri("https://www.digikey.com/product-search/download.csv?FV=ffe00035&quantity=0&ColumnSort=0&page=5&pageSize=500");
//tried http also: http://www.digikey.com/product-search/download.csv?FV=ffe00035&quantity=0&ColumnSort=0&page=5&pageSize=500
client.Headers.Add("user-agent", "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:25.0) Gecko/20100101 Firefox/25.0");
client.DownloadFile(request_url, @"E:\123.csv");
I know there are many threads on this topic; I tried all of them with no success, so please don't mark this as a duplicate. Please try it on your system; it's fewer than 10 lines of code.
Note: the same code is working for other websites, only for this website it is giving error.
As I mentioned in my comment the issue here is that the server is expecting a cookie (specifically 'i10c.bdddb') to be present and is giving a 403 error when it's not. However, the cookie is sent with the 403 response. So you can make an initial junk request that will fail but give you the cookie. After this you can then proceed as normal.
Through some trial and error I was able to get the CSV using the code below:
System.Net.ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12 | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls;
CookieContainer cookieContainer = new CookieContainer();
Uri baseUri = new Uri("https://www.digikey.com");
using (HttpClientHandler handler = new HttpClientHandler() { CookieContainer = cookieContainer })
using (HttpClient client = new HttpClient(handler) { BaseAddress = baseUri })
{
    //The User-Agent is required (what values work would need to be tested)
    client.DefaultRequestHeaders.Add("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:67.0) Gecko/20100101 Firefox/67.0");

    //Make our initial junk request that will fail but get the cookie
    HttpResponseMessage getCookiesResponse = await client.GetAsync("/product-search/download.csv");

    //Check if we actually got cookies
    if (cookieContainer.GetCookies(baseUri).Count > 0)
    {
        //Try getting the data
        HttpResponseMessage dataResponse = await client.GetAsync("product-search/download.csv?FV=ffe00035&quantity=0&ColumnSort=0&page=4&pageSize=500");
        if (dataResponse.StatusCode == HttpStatusCode.OK)
        {
            Console.Write(await dataResponse.Content.ReadAsStringAsync());
        }
    }
    else
    {
        throw new Exception("Failed to get cookies!");
    }
}
Notes
Even with the right cookie, if you don't send a User-Agent header the server will return a 403. I'm not sure what the server expects in terms of a user agent; I just copied the value my browser sends.
In the check to see whether cookies have been set, it would be a good idea to verify you actually have the 'i10c.bdddb' cookie instead of just checking that there are any cookies at all.
This is just a quick bit of sample code, so it's not the cleanest. You may want to look into FormUrlEncodedContent to send the page number and other parameters.
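Following up on that note, a small hypothetical helper that checks for the specific 'i10c.bdddb' cookie (the name comes from the answer above; everything else here is illustrative) could look like this, called with the same `cookieContainer` and `baseUri` as in the sample:

```csharp
using System;
using System.Net;

static class DigikeyCookies
{
    // Returns true only if the specific session cookie is present,
    // not merely *some* cookie.
    public static bool HasSessionCookie(CookieContainer jar, Uri baseUri)
    {
        foreach (Cookie c in jar.GetCookies(baseUri))
        {
            if (c.Name == "i10c.bdddb")
                return true;
        }
        return false;
    }
}
```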
I tested with your URL and was able to reproduce your error. Any request I try with the querystring parameter quantity=0 seems to fail with an HTTP 403 error.
I would suggest requesting a quantity greater than zero.
An HTTP 403 status code means Forbidden, so there is a problem with your credentials, and it doesn't look like you're sending any. If you add them to your headers, this should work fine:
client.Headers.Add("Authorization", "token");
or send them like this:
client.UseDefaultCredentials = true;
client.Credentials = new NetworkCredential("username", "password");
Most likely the links work in web browsers because you have already authenticated, and the browser is sending the credentials/token.
I have this issue with Digi-key too.
The solution for me is to turn off my VPN service.

Download string data from both http and https servers in MonoTouch (detecting whether a host works with https or not)

I'm using MonoTouch to develop my iOS application for iOS 6+. The basis of the application is downloading some data from servers that the user enters.
These servers may work with http or https. I use the code below for downloading:
System.Net.HttpWebRequest _HttpWebRequest = (System.Net.HttpWebRequest)System.Net.HttpWebRequest.Create(Url);
_HttpWebRequest.AllowWriteStreamBuffering = true;
_HttpWebRequest.UserAgent = "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)";
_HttpWebRequest.Timeout = 15000;
_HttpWebRequest.Method = "GET";
System.Net.WebResponse _WebResponse = _HttpWebRequest.GetResponse();
System.IO.Stream _WebStream = _WebResponse.GetResponseStream();
var bytes = HelperMethods.ReadToEnd(_WebStream);
_WebResponse.Close();
return System.Text.Encoding.UTF8.GetString(bytes);
It works when the servers are http, but when they are https I have to add https:// before the host name for it to work. So how can I detect whether a host works with https or http before sending requests?
You cannot; and there is nothing whatsoever that says http://example.com and https://example.com need to represent the same information, although by convention they almost always do. If you are content to assume the two are equivalent, you'll just need to try both (or at least decide which to try first).
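The "try both" approach can be sketched as below. The probe is injected as a delegate so the fallback policy can be tested without a network; the `TryHead` probe, timeout, and host handling are all assumptions (some servers reject HEAD, in which case a GET probe would be needed).

```csharp
using System;
using System.Net;

static class SchemeDetector
{
    // Returns "https" or "http", whichever answers first; null if neither does.
    public static string Detect(string host, Func<string, bool> probe)
    {
        foreach (string scheme in new[] { "https", "http" })
        {
            if (probe(scheme + "://" + host + "/"))
                return scheme;
        }
        return null;
    }

    // Example real probe (assumes the server allows HEAD requests).
    public static bool TryHead(string url)
    {
        try
        {
            var req = (HttpWebRequest)WebRequest.Create(url);
            req.Method = "HEAD";       // cheap probe, no response body
            req.Timeout = 15000;
            using (req.GetResponse()) { }
            return true;
        }
        catch (WebException)
        {
            return false;              // this scheme did not answer
        }
    }
}
```

A caller would use `SchemeDetector.Detect(host, SchemeDetector.TryHead)` once and cache the result per host.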

HttpRequestHeader Content encoding issue

I am using the code snippet below to download an HTTP response to a local file.
Sometimes the content at the URL is multi-lingual (Chinese, Japanese, Thai data, etc.).
I am using the ContentEncoding header to specify that my content is in UTF-8 encoding, but this has no effect on my local output file, which is generated in ASCII. Because of this, the multi-lingual data is corrupted. Any help?
using (var webClient = new WebClient())
{
    webClient.Credentials = CredentialCache.DefaultCredentials;
    webClient.Headers.Add(HttpRequestHeader.UserAgent, "Mozilla/4.0");
    webClient.Headers.Add(HttpRequestHeader.ContentEncoding, "utf-8");
    webClient.DownloadFile(url, @"c:\temp\tempfile.htm");
}
The ContentEncoding header is not used to specify the character set; it's used by the client to say what kind of encoding (compression) it supports.
The client can't tell the server what character set to send. The server sends its data along with header fields that say what character set is being used. Typically it's in the Content-Type header and looks like: text/html; charset=UTF-8.
When you're using WebClient, you want to set the Encoding property as a fallback so that if the server doesn't identify the character set, your default will be used. For example:
WebClient client = new WebClient();
client.Encoding = Encoding.UTF8;
string s = client.DownloadString(DownloadUrl);
See http://www.informit.com/guides/content.aspx?g=dotnet&seqNum=800 for a bit more information.
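Since DownloadFile just copies the server's raw bytes to disk, one way to address the original question is to download the text (with the Encoding fallback set as above) and write the file as UTF-8 yourself. This is a sketch; the helper name and paths are illustrative.

```csharp
using System.IO;
using System.Net;
using System.Text;

static class Utf8Saver
{
    public static void SavePageAsUtf8(string url, string path)
    {
        using (var client = new WebClient())
        {
            client.Encoding = Encoding.UTF8;            // fallback if server omits charset
            string html = client.DownloadString(url);   // bytes decoded to a string
            File.WriteAllText(path, html, Encoding.UTF8);  // re-encoded as UTF-8 on disk
        }
    }
}
```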

C# WebClient cannot get a response over the https protocol

When I try to load HTML from a server over https, it returns error code 500, but when I open the same link in a browser it works fine. Is there any way to do this? I'm using WebClient and am also sending user-agent information to the server:
HttpWebRequest req1 = (HttpWebRequest)WebRequest.Create("https://mobile.unibet.com/");
req1.UserAgent = @"Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1.5) Gecko/20091102 Firefox/3.5.5";
var response1 = req1.GetResponse();
var responsestream1 = response1.GetResponseStream();
David is correct; this generally happens when the server expects some headers that are not passed through, in your case Accept.
This code works now:
string requestUrl = "https://mobile.unibet.com/unibet_index.t";
var request = (HttpWebRequest)WebRequest.Create(requestUrl);
request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
request.UserAgent = "Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)";
using (var response = request.GetResponse() as HttpWebResponse)
{
    using (var sr = new StreamReader(response.GetResponseStream()))
    {
        var responsestring = sr.ReadToEnd();
        if (!string.IsNullOrEmpty(responsestring))
        {
            Console.WriteLine(responsestring);
        }
    }
}
This should probably be a comment, but there's not enough room in a comment for all the questions. I don't think the question has enough information to answer with any level of confidence.
A 500 error means a problem at the server. The short answer is that the browser is sending some content that the WebClient is not.
The WebClient may not be sending headers that are expected by the server. Does the server require authentication? Is this a page from a company that you've contracted with, which perhaps provided you with credentials or an API key? Do you need to add an HTTP Authorization header?
If this is something you're doing with a company you've got a partnership with, you should be able to ask them to help trace why you're getting the 500 error. Otherwise, you may need to provide us with a code sample and more details so we can offer more suggestions.

Logging in to eBay using HttpWebRequest fails due to 'The browser you are using is rejecting cookies' response

I'm trying to log in to my eBay account using the following code:
string signInURL = "https://signin.ebay.com/ws/eBayISAPI.dll?co_partnerid=2&siteid=0&UsingSSL=1";
string postData = String.Format("MfcISAPICommand=SignInWelcome&userid={0}&pass={1}", "username", "password");
string contentType = "application/x-www-form-urlencoded";
string method = "POST";
string userAgent = "Mozilla/5.0 (Windows; U; MSIE 7.0; Windows NT 6.0; en-US)";
CookieContainer cookieContainer = new CookieContainer();
HttpWebRequest req = (HttpWebRequest)HttpWebRequest.Create(signInURL);
req.CookieContainer = cookieContainer;
req.Method = method;
req.ContentType = contentType;
req.UserAgent = userAgent;
ASCIIEncoding encoding = new ASCIIEncoding();
byte[] loginDataBytes = encoding.GetBytes(postData);
req.ContentLength = loginDataBytes.Length;
Stream stream = req.GetRequestStream();
stream.Write(loginDataBytes, 0, loginDataBytes.Length);
stream.Close();
HttpWebResponse res = (HttpWebResponse)req.GetResponse();
StreamReader xsr = new StreamReader(res.GetResponseStream());
String responseText = xsr.ReadToEnd();
Obviously substituting my real username and password. When I look at the string responseText, I see that part of the response from eBay is
The browser you are using is rejecting cookies.
Any ideas what I'm doing wrong?
P.S. And yes, I am also using the eBay API, but this is for something slightly different than what I want to do with the API.
You're doing a direct HTTP request. The eBay site has functionality to talk to a browser (probably to store the session cookie). Unless you make the request code smart enough to use cookies correctly, it won't work. You'll probably have to use the Internet Explorer object instead.
Before doing the POST you need to download the page with the form that you are submitting in your code, take the cookie they give you, put it in your CookieContainer (making sure you get the path right) and post it back up in your request.
To clarify, while you might be POSTing the correct data, you are not sending the cookie that needs to go with it. You will get this cookie from the login page.
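That advice can be sketched as follows: an initial GET so the login page's Set-Cookie lands in the container, then the POST reusing the SAME CookieContainer. This is a fragment meant to drop into the question's code, so `signInURL` and `postData` are the variables defined there; error handling is omitted.

```csharp
CookieContainer cookies = new CookieContainer();

// Step 1: fetch the login page; its Set-Cookie response is captured in 'cookies'.
HttpWebRequest getReq = (HttpWebRequest)WebRequest.Create(signInURL);
getReq.CookieContainer = cookies;
using (getReq.GetResponse()) { }

// Step 2: POST the credentials; the captured cookie is sent automatically
// because this request shares the same container.
HttpWebRequest postReq = (HttpWebRequest)WebRequest.Create(signInURL);
postReq.CookieContainer = cookies;
postReq.Method = "POST";
postReq.ContentType = "application/x-www-form-urlencoded";
byte[] body = Encoding.ASCII.GetBytes(postData);
postReq.ContentLength = body.Length;
using (Stream s = postReq.GetRequestStream())
    s.Write(body, 0, body.Length);
```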
You need to intercept the HTTP traffic to see exactly what happened. I use Fiddler2; it is a good tool for debugging HTTP, so I can tell which side is wrong, my application or the remote web server.
Using Fiddler, you can see the request headers, the response headers with their cookies, and the response content. It sits in the middle, between your app and eBay.
Based on my experience, I think the eBay cookie sent to you is not being sent back to the eBay server. Fiddler will show whether that is the case.
Another thing: the response cookie you receive should be sent back with the next request, using the same CookieContainer.
You should note that CookieContainer has a bug in its .Add(Cookie) and .GetCookies(uri) methods. You may not be using them directly, but internal code might.
See the details and fix here:
http://dot-net-expertise.blogspot.com/2009/10/cookiecontainer-domain-handling-bug-fix.html
CallMeLaNN