I have to work with JSON data from an API (in my Windows app), and I am trying to make a POST request using WebClient.UploadString().
Below is my code, but it throws an error. I have tried various options, but I am not able to send the JSON as a string.
string result = "";
string url = "https://30prnabicq-dsn.algolia.net/1/indexes/*/queries?x-algolia-agent=Algolia for vanilla JavaScript (lite) 3.24.12;JS Helper 2.24.0;vue-instantsearch 1.5.0&x-algolia-application-id=30PRNABICQ&x-algolia-api-key=dcccebe87b846b64f545bf63f989c2b1";
string json = "{\"requests\":[{\"indexName\":\"vacatures\",\"params\":\"query=&hitsPerPage=20&page=0&highlightPreTag=__ais-highlight__&highlightPostTag=__/ais-highlight__&facets=[\"category\",\"contract\",\"experienceNeeded\",\"region\"]&tagFilters=\"}]}";
using (var client = new WebClient())
{
client.Headers[HttpRequestHeader.Host] = "30prnabicq-dsn.algolia.net";
client.Headers[HttpRequestHeader.UserAgent] = "Mozilla/5.0 (Windows NT 6.3; Win64; x64; rv:60.0) Gecko/20100101 Firefox/60.0";
client.Headers[HttpRequestHeader.Accept] = "application/json";
client.Headers[HttpRequestHeader.AcceptLanguage] = "en-US,en;q=0.5";
client.Headers[HttpRequestHeader.AcceptEncoding] = "gzip, deflate, br";
client.Headers[HttpRequestHeader.Referer] = "https://bouwjobs.be/";
client.Headers[HttpRequestHeader.ContentType] = "application/json";
client.Headers[HttpRequestHeader.ContentLength] = "249";
client.Headers[HttpRequestHeader.Origin] = "https://bouwjobs.be";
client.Headers[HttpRequestHeader.Connection] = "keep-alive";
client.Headers[HttpRequestHeader.CacheControl] = "max-age=0";
result = client.UploadString(url, "POST", json);
return result;
}
Please guide me in correcting my code.
Note: I have included some restricted headers in my code, but even after commenting those out it still throws an error.
You do not seem to be uploading valid JSON with WebClient. The double quotes inside your inner facets array mean that your params string ends early. Remove the quotes from it:
string json = "{\"requests\":[{\"indexName\":\"vacatures\",\"params\":\"query=&hitsPerPage=20&page=0&highlightPreTag=__ais-highlight__&highlightPostTag=__/ais-highlight__&facets=[category,contract,experienceNeeded,region]&tagFilters=\"}]}";
This is valid JSON and should work fine.
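Putting the pieces together, a corrected version of the whole call might look like the sketch below: the facets quotes are removed, the header enum is CacheControl (no spaces), and the restricted Host/Connection/Content-Length headers are left out because WebClient sets them itself. This assumes the Algolia URL and API keys from the question are still valid.

```csharp
using System;
using System.Net;

class Program
{
    static void Main()
    {
        string url = "https://30prnabicq-dsn.algolia.net/1/indexes/*/queries?x-algolia-application-id=30PRNABICQ&x-algolia-api-key=dcccebe87b846b64f545bf63f989c2b1";
        // Quotes removed from the facets array so the params string is not cut short.
        string json = "{\"requests\":[{\"indexName\":\"vacatures\",\"params\":\"query=&hitsPerPage=20&page=0&facets=[category,contract,experienceNeeded,region]&tagFilters=\"}]}";

        using (var client = new WebClient())
        {
            client.Headers[HttpRequestHeader.ContentType] = "application/json";
            client.Headers[HttpRequestHeader.Accept] = "application/json";
            // Note: HttpRequestHeader.CacheControl, not "Cache - Control".
            // Restricted headers (Host, Connection, Content-Length) are
            // managed by WebClient and must not be set manually.
            client.Headers[HttpRequestHeader.CacheControl] = "max-age=0";

            string result = client.UploadString(url, "POST", json);
            Console.WriteLine(result);
        }
    }
}
```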
I tried to get the source of a particular site page using the code below, but it failed.
I was able to get the page source in 1-2 seconds using a WebBrowser or WebDriver, but HttpWebRequest failed.
I also tried putting the actual browser cookie into the HttpWebRequest, but that failed too.
(Exception: The operation has timed out)
I wonder why it failed and want to learn from the failure.
Thank you in advance!
string Html = String.Empty;
CookieContainer cc = new CookieContainer();
HttpWebRequest req = (HttpWebRequest)WebRequest.Create("https://www.coupang.com/");
req.Method = "GET";
req.Host = "www.coupang.com";
req.UserAgent = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.157 Safari/537.36";
req.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3";
req.Headers.Add("Accept-Language", "ko-KR,ko;q=0.9,en-US;q=0.8,en;q=0.7");
req.CookieContainer = cc;
using (HttpWebResponse res = (HttpWebResponse)req.GetResponse())
using (StreamReader str = new StreamReader(res.GetResponseStream(), Encoding.UTF8))
{
Html = str.ReadToEnd();
}
Removing req.Host from your code should do the trick.
According to the documentation:
If the Host property is not set, then the Host header value to use in an HTTP request is based on the request URI.
You already set the URI in (HttpWebRequest)WebRequest.Create("https://www.coupang.com/"), so I don't think setting it again is necessary.
Please let me know if it helps.
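Under that assumption, the request reduces to the sketch below; the Host line is simply gone, and everything else is unchanged from the question:

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;

class Program
{
    static void Main()
    {
        var cc = new CookieContainer();
        var req = (HttpWebRequest)WebRequest.Create("https://www.coupang.com/");
        req.Method = "GET";
        // No req.Host here: the Host header is derived from the request URI.
        req.UserAgent = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.157 Safari/537.36";
        req.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
        req.Headers.Add("Accept-Language", "ko-KR,ko;q=0.9,en-US;q=0.8,en;q=0.7");
        req.CookieContainer = cc;

        using (var res = (HttpWebResponse)req.GetResponse())
        using (var reader = new StreamReader(res.GetResponseStream(), Encoding.UTF8))
        {
            string html = reader.ReadToEnd();
            Console.WriteLine(html.Length);
        }
    }
}
```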
I'm trying to download the HTML string of a website. The website has the following URL:
https://www.gastrobern.ch/de/service/aus-weiterbildung/wirtekurs/234/?oid=1937&lang=de
First I tried to do a simple WebClient Request:
var wc = new WebClient();
string websitenstring = "";
websitenstring = wc.DownloadString("http://www.gastrosg.ch/default.asp?id=3020000&siteid=1&langid=de");
But the website string was empty. Then I read in some posts that I have to send some additional header information:
var wc = new WebClient();
string websitenstring = "";
wc.Headers[HttpRequestHeader.Accept] = "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8";
wc.Headers[HttpRequestHeader.AcceptEncoding] = "gzip, deflate, br";
wc.Headers[HttpRequestHeader.AcceptLanguage] = "de-DE,de;q=0.9,en-US;q=0.8,en;q=0.7";
wc.Headers[HttpRequestHeader.CacheControl] = "max-age=0";
wc.Headers[HttpRequestHeader.Host] = "www.gastrobern.ch";
wc.Headers[HttpRequestHeader.Upgrade] = "www.gastrobern.ch";
wc.Headers[HttpRequestHeader.UserAgent] = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/69.0.3497.100 Safari/537.36";
websitenstring = wc.DownloadString("https://www.gastrobern.ch/de/service/aus-weiterbildung/wirtekurs/234/?oid=1937&lang=de");
I tried this, but got no answer. Then I also tried to set some cookies:
wc.Headers.Add(HttpRequestHeader.Cookie,
"CFID=10609582;" +
"CFTOKEN=32721418;" +
"_ga=GA1.2.37" +
"_ga=GA1.2.379124242.1539000256;" +
"_gid=GA1.2.358798732.1539000256;" +
"_dc_gtm_UA-1237799-1=1;");
But this also didn't work. I also found out that the browser somehow makes multiple requests, while my C# application makes just one and shows only the first response headers.
But I don't know how I can make a follow-up request. I'm thankful for every answer.
Try HttpClient instead.
Here is an example of how to use it:
public async static Task<string> GetString(string url)
{
HttpClient client = new HttpClient();
// Way around to avoid Deadlock
HttpResponseMessage message = await client.GetAsync(url).ConfigureAwait(false);
return await message.Content.ReadAsStringAsync().ConfigureAwait(false);
}
To call this method:
string dataFromServer = GetString("https://www.gastrobern.ch/de/service/aus-weiterbildung/wirtekurs/234/?oid=1937&lang=de").Result;
I checked here, and dataFromServer contains the HTML content of that page.
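If the caller can be made async, awaiting the helper is safer than blocking on .Result, which can deadlock on a UI thread. A minimal console sketch under that assumption (the helper is the same GetString as above, reproduced so the snippet compiles on its own):

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class Program
{
    // Same helper as in the answer above.
    static async Task<string> GetString(string url)
    {
        using (var client = new HttpClient())
        {
            HttpResponseMessage message = await client.GetAsync(url).ConfigureAwait(false);
            return await message.Content.ReadAsStringAsync().ConfigureAwait(false);
        }
    }

    static async Task Main()
    {
        // Awaiting instead of calling .Result keeps the calling context free,
        // so there is no deadlock risk in WinForms/WPF event handlers either.
        string html = await GetString("https://www.gastrobern.ch/de/service/aus-weiterbildung/wirtekurs/234/?oid=1937&lang=de");
        Console.WriteLine(html.Length);
    }
}
```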
This is a pretty standard implementation of HttpWebRequest; whenever I pass a certain URL to get the HTML, it comes back with nothing but special characters. An example of what comes back is below.
Now this site uses SSL, so I'm wondering if that has something to do with it, but I've never had this problem before and I've used this with other SSL sites.
�
ServicePointManager.ServerCertificateValidationCallback = new System.Net.Security.RemoteCertificateValidationCallback(AcceptAllCertifications);
var request = (HttpWebRequest)WebRequest.Create(url);
using (var response = (HttpWebResponse)request.GetResponse())
{
Stream data = response.GetResponseStream();
HtmlDocument hDoc = new HtmlDocument();
using (StreamReader readURLContent = new StreamReader(data))
{
html = readURLContent.ReadToEnd();
hDoc.LoadHtml(html);
}
}
I can't really find anything on this specific issue, so I'm kind of lost. If anybody could point me in the right direction, that would be awesome.
Edit: here's an image of what it looks like since I can't copy paste it
My guess is that the response is compressed. If you use a web debugger like Charles or Fiddler, you can see how the requests are structured and what data they contain, which makes it a lot easier to replicate the HTTP requests later when programming them. Try the following code.
try
{
string webAddr = url;
var httpWebRequest = (HttpWebRequest)WebRequest.Create(webAddr);
httpWebRequest.ContentType = "text/html; charset=utf-8";
httpWebRequest.UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:58.0) Gecko/20100101 Firefox/58.0";
httpWebRequest.AllowAutoRedirect = true;
httpWebRequest.Method = "GET";
httpWebRequest.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
var httpResponse = (HttpWebResponse)httpWebRequest.GetResponse();
using (var streamReader = new StreamReader(httpResponse.GetResponseStream(), Encoding.UTF8))
{
var responseText = streamReader.ReadToEnd();
doc.LoadHtml(responseText);
}
}
catch (WebException ex)
{
Console.WriteLine(ex.Message);
}
The code sets the encoding on the request. You can also set the encoding on the StreamReader when reading the response. And automatic decompression is enabled, so gzip- or deflate-compressed responses are inflated before you read them.
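For what it's worth, HttpClient exposes the same decompression switch through its handler; a minimal sketch, using example.com as a stand-in for the real site:

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        // HttpClientHandler carries the same AutomaticDecompression flags,
        // so compressed responses arrive as readable text.
        var handler = new HttpClientHandler
        {
            AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
        };

        using (var client = new HttpClient(handler))
        {
            string html = await client.GetStringAsync("https://example.com/");
            Console.WriteLine(html.Length);
        }
    }
}
```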
The main problem, I think, is that I am trying to get the output of a PHP script on an SSL-protected website. Why doesn't the following code work?
string URL = "https://mtgox.com/api/0/data/ticker.php";
HttpWebRequest myRequest =
(HttpWebRequest)WebRequest.Create(URL);
myRequest.Method = "GET";
WebResponse myResponse = myRequest.GetResponse();
StreamReader _sr = new StreamReader(myResponse.GetResponseStream(),
System.Text.Encoding.UTF8);
string result = _sr.ReadToEnd();
//Console.WriteLine(result);
result = result.Replace('\n', ' ');
_sr.Close();
myResponse.Close();
Console.WriteLine(result);
It hangs with an unhandled WebException: The operation has timed out.
You're hitting the wrong URL. SSL is https://, but you're hitting http:// (note the missing s). The site does redirect to the SSL version of the page, but your code is apparently not following that redirect.
I added myRequest.UserAgent = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11"; and everything started working.
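With that one line added, the whole request looks like the sketch below. Note that the mtgox.com endpoint is long defunct, so this stands in for any HTTPS endpoint; the pattern (set a User-Agent, since some servers silently drop requests without one) is what matters:

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;

class Program
{
    static void Main()
    {
        // Defunct endpoint kept from the question; substitute any HTTPS URL.
        string url = "https://mtgox.com/api/0/data/ticker.php";

        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "GET";
        // Some servers time out or reject requests that carry no User-Agent.
        request.UserAgent = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11";

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream(), Encoding.UTF8))
        {
            string result = reader.ReadToEnd().Replace('\n', ' ');
            Console.WriteLine(result);
        }
    }
}
```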
Could someone help me with parsing a website?
I have parsed lots of sites, but this one is interesting: the inner code is generated dynamically by a PHP file. So I tried to use WebClient like this:
WebClient client = new WebClient();
string postData = "getProducts=1&category=340&brand=0";
byte[] byteArray = Encoding.UTF8.GetBytes(postData);
client.Headers.Add("POST", "/ajax.php HTTP/1.1");
client.Headers.Add("Host", site);
client.Headers.Add("Connection", "keep-alive");
client.Headers.Add("Origin", "http://massup.ru");
client.Headers.Add("X-Requested-With", "XMLHttpRequest");
client.Headers.Add("User-Agent", "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/536.11 (KHTML, like Gecko) Chrome/20.0.1132.57 Safari/536.11");
client.Headers.Add("Accept", "*/*");
client.Headers.Add("Content-Type", "application/x-www-form-urlencoded");
client.Headers.Add("Content-length", byteArray.Length.ToString());
client.Headers.Add("Referer", "http://massup.ru/category/proteini");
client.Headers.Add("Accept-Encoding", "gzip,deflate,sdch");
client.Headers.Add("Accept-Language", "ru-RU,ru;q=0.8,en-US;q=0.6,en;q=0.4");
client.Headers.Add("Accept-Charset", "windows-1251,utf-8;q=0.7,*;q=0.3");
client.Headers.Add("Cookie", "cart=933a71dfee2baf8573dfc2094a937f0d; r_v=YToyOntpOjA7YTo3OntzOjU6Im1vZGVsIjtzOjI2OiIxMDAlIFdoZXkgUHJvdGVpbiA5MDgg0LPRgCI7czozOiJ1cmwiO3M6MzQ6Im11bHRpcG93ZXItMTAwLXdoZXktcHJvdGVpbi05MDgtZ3IiO3M6NToiYnJhbmQiO3M6MTA6Ik11bHRpcG93ZXIiO3M6ODoiY2F0ZWdvcnkiO3M6Mzk6ItCh0YvQstC%2B0YDQvtGC0L7Rh9C90YvQtSDQuNC30L7Qu9GP0YLRiyI7czo5OiJzY2F0ZWdvcnkiO3M6Mzc6ItCh0YvQstC%2B0YDQvtGC0L7Rh9C90YvQuSDQuNC30L7Qu9GP0YIiO3M6NToicHJpY2UiO3M6MToiMCI7czo0OiJpY29uIjtzOjM3OiJodHRwOi8vbWFzc3VwLnJ1L2ltYWdlcy9pY29uXzQ3NTIuanBnIjt9aToxO2E6Nzp7czo1OiJtb2RlbCI7czoxNzoiTWF0cml4IDIuMCA5ODQg0LMiO3M6MzoidXJsIjtzOjE2OiJtYXRyaXgtMi0wLTk4NC1nIjtzOjU6ImJyYW5kIjtzOjc6IlN5bnRyYXgiO3M6ODoiY2F0ZWdvcnkiO3M6Mzk6ItCh0YvQstC%2B0YDQvtGC0L7Rh9C90YvQtSDQuNC30L7Qu9GP0YLRiyI7czo5OiJzY2F0ZWdvcnkiO3M6Mzc6ItCh0YvQstC%2B0YDQvtGC0L7Rh9C90YvQuSDQuNC30L7Qu9GP0YIiO3M6NToicHJpY2UiO3M6NDoiMTE5MCI7czo0OiJpY29uIjtzOjM3OiJodHRwOi8vbWFzc3VwLnJ1L2ltYWdlcy9pY29uXzEwMDguanBnIjt9fQ%3D%3D; PHPSESSID=933a71dfee2baf8573dfc2094a937f0d");
Stream data = client.OpenRead("http://massup.ru/ajax.php");
StreamReader reader = new StreamReader(data);
string s = reader.ReadToEnd();
Console.WriteLine(s);
data.Close();
reader.Close();
But it gives me an error!
Could someone help me with this kind of parsing?
See my answer to C# https login and download file, which has working code that correctly handles HTTP POSTs, then clean up what you have based on it. After you've done that, if you still need help, post your updated code and a clearer description of what specific errors or exceptions you are seeing.
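A minimal sketch of how WebClient expects a form POST to be made, assuming the massup.ru endpoint still accepts these fields (the session cookie and most browser headers from the question can usually be dropped):

```csharp
using System;
using System.Collections.Specialized;
using System.Net;
using System.Text;

class Program
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            client.Headers[HttpRequestHeader.UserAgent] = "Mozilla/5.0";
            client.Headers["X-Requested-With"] = "XMLHttpRequest";

            // UploadValues issues a real POST with an
            // application/x-www-form-urlencoded body and sets
            // Content-Type and Content-Length itself; "POST" is the HTTP
            // method argument, not a header as in the question's code.
            var form = new NameValueCollection
            {
                { "getProducts", "1" },
                { "category", "340" },
                { "brand", "0" }
            };

            byte[] responseBytes = client.UploadValues("http://massup.ru/ajax.php", "POST", form);
            Console.WriteLine(Encoding.UTF8.GetString(responseBytes));
        }
    }
}
```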