What URL encoding does a web browser use when submitting data to a server?
In my application I use HttpUtility.UrlEncode(string data),
but I do not get the same result as I get with a web browser.
My application submits some text data to a forum.
When I submit the data with a web browser (Google Chrome), the exact text appears on the server, but when I submit it with my application, some characters show up differently.
So is it necessary that any data submitted to a server be URL encoded?
---Edit---
data = HttpUtility.UrlEncode(textBox1.Text);
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.CookieContainer = cookieContainer;
request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.56 Safari/536.5";
request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
request.Headers.Add("Accept-Charset", "ISO-8859-2,utf-8;q=0.7,*;q=0.7");
request.Headers.Add("Accept-Encoding", "gzip, deflate");
request.Headers.Add("Accept-Language", "pl,en-us;q=0.7,en;q=0.3");
request.Method = "POST";
byte[] byteArray = Encoding.ASCII.GetBytes(data);
request.ContentType = "application/x-www-form-urlencoded";
request.ContentLength = byteArray.Length;
Stream dataStream = request.GetRequestStream();
dataStream.Write(byteArray, 0, byteArray.Length);
dataStream.Close();
and the data in textBox1 looks like this:
¦=¦=¦¦=¦=¦
RUNTIME………..: 2h:13m:15s
RELEASE SIZE……: 16,2GB
VIDEO CODEC…….: x264, 2pass,
You generally only have to URL-encode if you want to include a URL reserved character (?, &, etc.) in a URL parameter.
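The altered characters described in the question may also come from the ASCIIEncoding step rather than from URL encoding itself: characters such as '…' and '¦' are not ASCII and get replaced with '?'. Browsers normally percent-encode form data in the page's charset, which is usually UTF-8. A minimal sketch of building a UTF-8 form body (the field name `message` is hypothetical):

```csharp
using System;
using System.Text;
using System.Web; // HttpUtility

class FormEncodingSketch
{
    static void Main()
    {
        // Sample text containing non-ASCII characters like the question's data.
        string text = "RELEASE SIZE……: 16,2GB";

        // Percent-encode using UTF-8 (HttpUtility.UrlEncode defaults to UTF-8).
        string encoded = HttpUtility.UrlEncode(text, Encoding.UTF8);

        // The encoded string is ASCII-safe, so these bytes survive intact
        // in an application/x-www-form-urlencoded POST body.
        byte[] body = Encoding.UTF8.GetBytes("message=" + encoded);

        Console.WriteLine(encoded);
    }
}
```

With Encoding.ASCII, every character outside the 7-bit range is lost before the request is even sent, which would match the symptom of the server showing different characters.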
I used the Fiddler extension RequestToCode to replay a POST from logging into Yahoo.
When I run the code, I can see in Fiddler that the login was successful and there are 10 cookies in the response.
In my code though, the response.Cookies had a count of 0.
So I updated my HttpWebRequest and set:
request.CookieContainer = new CookieContainer();
When I run the code again and look at it in Fiddler I see the login failed because the response navigates to a failed login url.
My ultimate goal is to get the cookies from the login attempt to use in a later Get request to Yahoo.
Why is setting the cookie container causing a failure?
Maybe it's because you are initializing a new CookieContainer on every request.
Declare a variable that is shared between requests: CookieContainer cookies = new CookieContainer();
Now your new requests will use the same CookieContainer. For example:
var request = (HttpWebRequest)WebRequest.Create("https://www.yahoo.com/");
request.CookieContainer = cookies;
request.Method = "GET";
request.UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.99 Safari/537.36";
request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8";
request.Headers.Add("accept-language", "en,hr;q=0.9");
request.Headers.Add("accept-encoding", "");
request.Headers.Add("Upgrade-Insecure-Requests", "1");
WebResponse response = request.GetResponse();
StreamReader reader = new StreamReader(response.GetResponseStream(), Encoding.UTF8);
string responseFromServer = reader.ReadToEnd();
reader.Close();
response.Close();
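To reach the stated goal of reusing the login cookies in a later GET, the same container can be passed to both requests. A minimal sketch under assumed conditions: the login URL and form field names below are hypothetical, not Yahoo's actual endpoint.

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;

class CookieReuseSketch
{
    // One container shared by every request in the session.
    static readonly CookieContainer cookies = new CookieContainer();

    static void Main()
    {
        // 1) POST the login form (URL and field names are placeholders).
        var login = (HttpWebRequest)WebRequest.Create("https://login.yahoo.com/");
        login.Method = "POST";
        login.ContentType = "application/x-www-form-urlencoded";
        login.CookieContainer = cookies; // response cookies are stored here

        byte[] body = Encoding.UTF8.GetBytes("username=user&passwd=secret");
        login.ContentLength = body.Length;
        using (Stream s = login.GetRequestStream())
            s.Write(body, 0, body.Length);

        using (var response = (HttpWebResponse)login.GetResponse())
            Console.WriteLine("Cookies collected: " + cookies.Count);

        // 2) A later GET with the same container sends those cookies back.
        var page = (HttpWebRequest)WebRequest.Create("https://www.yahoo.com/");
        page.CookieContainer = cookies; // same container, same session
        using (var response = (HttpWebResponse)page.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
            Console.WriteLine(reader.ReadToEnd().Length);
    }
}
```

The key design point is that the CookieContainer, not the request object, is the session: as long as every request is handed the same instance, cookies set by one response are replayed on the next request automatically.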
I am performing the following HttpWebRequest:
private static void InvokeHealthCheckApi()
{
    var webRequest = (HttpWebRequest)WebRequest.Create(Url);
    webRequest.Method = "GET";
    webRequest.KeepAlive = true;
    webRequest.AllowAutoRedirect = true;
    webRequest.ContentType = "application/json";
    webRequest.UserAgent =
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/42.0.2311.135 Safari/537.36";
    webRequest.CookieContainer = cookieJar;

    using (HttpWebResponse response = webRequest.GetResponse() as HttpWebResponse)
    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
    {
        File.AppendAllText("C:\\httpResponse1.txt", response.Headers.ToString());
        File.AppendAllText("C:\\httpResponse2.html", reader.ReadToEnd());
    }
}
The response from the request comes back as a web page that reads:
"Script is disabled. Click Submit to continue."
(Submit Button)
After clicking the submit button I get a prompt that reads:
"Do you want to open or save healthcheck.json (297 bytes) from fr74a87d9.work.corp.net?"
After clicking the Open button I receive the json data that I am expecting to receive.
My question is: how do I parse the response to get to the JSON data that I need? Is it normal to get a web page as the initial response and to have to drill down to get the JSON? StreamReader can't parse the response because it's a web page, not JSON data.
I found a resolution to my problem here: http://blogs.microsoft.co.il/applisec/2013/06/03/passive-federation-client/
When I try to get an HTML page, I get this error:
The underlying connection was closed: The connection was closed unexpectedly
I think the site I'm fetching uses some IP-based protection.
WebClient single_page_client = new WebClient();
single_page_client.Headers.Add("user-agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; .NET CLR 1.0.3705;)");
string cat_page_single = single_page_client.DownloadString(the_url);
How can I do it?
What about using a proxy with WebClient?
EDIT
If I use this code, it works. Why?
HttpWebRequest webrequest = (HttpWebRequest)WebRequest.Create(current_url);
webrequest.KeepAlive = true;
webrequest.Method = "GET";
webrequest.ContentType = "text/html";
webrequest.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
//webrequest.Connection = "keep-alive";
webrequest.Host = "cat.sabresonicweb.com";
webrequest.Headers.Add("Accept-Language", "en-US,en;q=0.5");
webrequest.UserAgent = "Mozilla/5.0 (Windows NT 6.1; rv:18.0) Gecko/20100101 Firefox/18.0";
HttpWebResponse webresponse = (HttpWebResponse)webrequest.GetResponse();
Console.Write(webresponse.StatusCode);
Stream receiveStream = webresponse.GetResponseStream();
Encoding enc = System.Text.Encoding.GetEncoding(1252);//1252
StreamReader loResponseStream = new StreamReader(receiveStream, enc);
string current_page = loResponseStream.ReadToEnd();
loResponseStream.Close();
webresponse.Close();
The first response does not include a header indicating the length of the result, so the server closes the connection when it finishes.
The second response includes the length header; the client reads the designated number of bytes and then closes the connection gracefully (under client-side control instead of a server-driven disconnect).
-or-
The url you sent caused an error on the server. Is there an error in the server log or event viewer?
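Another visible difference between the two snippets is the set of request headers: WebClient sends almost nothing by default, while the working HttpWebRequest sends a full browser-like header set. A sketch that copies those same headers onto a WebClient before downloading (the URL is a placeholder for the page being fetched):

```csharp
using System;
using System.Net;

class WebClientHeadersSketch
{
    static void Main()
    {
        // Placeholder for the question's the_url variable.
        string the_url = "https://cat.sabresonicweb.com/";

        using (var client = new WebClient())
        {
            // Mirror the headers sent by the working HttpWebRequest.
            client.Headers.Add("User-Agent",
                "Mozilla/5.0 (Windows NT 6.1; rv:18.0) Gecko/20100101 Firefox/18.0");
            client.Headers.Add("Accept",
                "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8");
            client.Headers.Add("Accept-Language", "en-US,en;q=0.5");

            string page = client.DownloadString(the_url);
            Console.WriteLine(page.Length);
        }
    }
}
```

If the server's protection keys on these headers, sending the same set from WebClient may avoid the dropped connection; this is an assumption to verify against the actual server, not a guaranteed fix.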
My friend is using C# to write a simple program that requests a web page.
However, he encountered a problem when trying to request one specific page.
He has already tried setting all the headers and cookies in the request, but he still gets a timeout exception.
The example webpage is https://my.ooma.com
Here is the code:
string url = "https://my.ooma.com";
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.Timeout = 30000;
request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.52 Safari/536.5";
request.Method = "GET";
request.CookieContainer = new CookieContainer();
request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
request.Headers.Add("Accept-Charset:ISO-8859-1,utf-8;q=0.7,*;q=0.3");
request.Headers.Add("Accept-Encoding:gzip,deflate,sdch");
request.Headers.Add("Accept-Language:en-US,en;q=0.8");
request.KeepAlive = true;
WebResponse myResponse = request.GetResponse();
StreamReader sr = new StreamReader(myResponse.GetResponseStream());
string result = sr.ReadToEnd();
sr.Close();
myResponse.Close();
All the headers are the same as when using Chrome to browse the page.
And he didn't see any cookies being set when using the Chrome developer tools.
Can anyone successfully request the page using C#?
Thanks a lot.
Sorry for being late.
The following code snippet should work just fine. I also tried it with your old URL that had "getodds.xgi" in it, and it also worked fine.
The server uses the Secure Sockets Layer (SSL) protocol only for connections that use the HTTPS scheme.
You don't need to set a UserAgent or any headers just to get a response.
ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3;
WebRequest request = WebRequest.Create("http://my.ooma.com/");
string htmlResponse = string.Empty;
using (WebResponse response = request.GetResponse())
{
    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
    {
        htmlResponse = reader.ReadToEnd();
        reader.Close();
    }
    response.Close();
}
I have an HttpWebRequest with a StreamReader that works fine without a WebProxy. When I use a WebProxy, the StreamReader reads strange characters instead of the actual HTML. Here is the code:
HttpWebRequest req = (HttpWebRequest)WebRequest.Create("https://URL");
req.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.224 Safari/534.10";
req.Accept = "application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5";
req.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.3");
req.Headers.Add("Accept-Encoding", "gzip,deflate,sdch");
req.Headers.Add("Accept-Language", "en-US,en;q=0.8");
req.Method = "GET";
req.CookieContainer = new CookieContainer();
WebProxy proxy = new WebProxy("proxyIP:proxyPort");
proxy.Credentials = new NetworkCredential("proxyUser", "proxyPass");
req.Proxy = proxy;
HttpWebResponse res = (HttpWebResponse)req.GetResponse();
StreamReader reader = new StreamReader(res.GetResponseStream());
string html = reader.ReadToEnd();
Without the WebProxy, the variable html holds the expected HTML string from the URL. But with the WebProxy, html holds a value like this:
"�\b\0\0\0\0\0\0��]r����s�Y����\0\tP\"]ki���ػ��-��X�\0\f���/�!�HU���>Cr���P$%�nR�� z�g��3�t�~q3�ٵȋ(M���14&?\r�d:�ex�j��p������.��Y��o�|��ӎu�OO.�����\v]?}�~������E:�b��Lן�Ԙ6+�l���岳�Y��y'ͧ��~#5ϩ�it�2��5��%�p��E�L����t&x0:-�2��i�C���$M��_6��zU�t.J�>C-��GY��k�O�R$�P�T��8+�*]HY\"���$Ō�-�r�ʙ�H3\f8Jd���Q(:�G�E���r���Rܔ�ڨ�����W�<]$����i>8\b�p� �\= 4\f�> �&��$��\v��C��C�vC��x�p�|\"b9�ʤ�\r%i��w#��\t�r�M�� �����!�G�jP�8.D�k�Xʹt�J��/\v!�r��y\f7<����\",\a�/IK���ۚ�r�����ҿ5�;���}h��+Q��IO]�8��c����n�AGڟu2>�
Since you are passing
req.Headers.Add("Accept-Encoding", "gzip,deflate,sdch");
I would say your proxy compresses the stream before sending it back to you.
Inspect the headers of the response to check the encoding.
Just use gzip to decompress it.
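A sketch of the manual approach: check the response's Content-Encoding header and wrap the stream in a GZipStream (or DeflateStream) before reading. The URL is a placeholder for the question's "https://URL". Alternatively, setting HttpWebRequest.AutomaticDecompression lets the framework decompress transparently, in which case the Accept-Encoding header should not be added by hand.

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Net;
using System.Text;

class DecompressResponseSketch
{
    static void Main()
    {
        // Placeholder URL standing in for the question's target site.
        var req = (HttpWebRequest)WebRequest.Create("https://example.com/");
        req.Headers.Add("Accept-Encoding", "gzip,deflate");

        using (var res = (HttpWebResponse)req.GetResponse())
        {
            Stream body = res.GetResponseStream();

            // Because Accept-Encoding was added manually, the framework does
            // not decompress for us; do it by hand based on Content-Encoding.
            string encoding = (res.ContentEncoding ?? "").ToLowerInvariant();
            if (encoding.Contains("gzip"))
                body = new GZipStream(body, CompressionMode.Decompress);
            else if (encoding.Contains("deflate"))
                body = new DeflateStream(body, CompressionMode.Decompress);

            using (var reader = new StreamReader(body, Encoding.UTF8))
                Console.WriteLine(reader.ReadToEnd());
        }
    }
}
```

The garbled string shown above (starting with the gzip-typical "\b\0\0\0" bytes) is consistent with a compressed body being read as text, which is why decompressing before the StreamReader fixes it.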