C# - How to get cookies from a website using xNet

I want to submit emails to this website: https://www.bitdefender.com/site/Facebook/redFreeDownload?nfo%5Bemail%5D=agytnpbu%40gmx.com&nfo%5BreGoogle%5D=auto-detect&g-recaptcha-response=&nfo%5Bhash_page%5D=tsmd-2016-facebook&nfo%5Bsystem-2016%5D=active
There is a captcha, but it doesn't work.
I used Fiddler to capture all the information I need. However, I don't know how to read the cookies so that I can submit emails again and again.
How can I do it? The cookies:
__cfduid=d7132cd01781606b71aa23e43dca589bc1536917322; PHPSESSID=a4ujho0plamjc6nq4n31pqu627; _country=il; AWSALB=xAs9RL/e6nVR17J9qYZnooEQedeMW48ZEgx8no+xyhkQxhCjSsrcnc1l/LpjrfL8vBNXdA40agM6Zk3e4i84DGCDXd/7TqGaaYrb5zeyGHbxy8nBZqIGiKUKtKLV; bd112=3ZAxb4MwEIX%2FiyU6hRAgUIUqilRF6dq9ripjH2AFc5Z9lEZV%2Fnup24GhS9Zup3ffu3d6L59sdD2rWEdkK57wZJqmda1JQQODAreWaHjiNQFPTkJCjXjmiQN1cgBHnIYehToMDUbFIxih%2B6g47kV7ocHWY7TdtObj%2B8SdMPbhh3LwhNj2EMCRMFZAICkQbexACkuyE%2FPkLQ4e9gtvJ3z3ZkUbzOSNirNNWsbN72ML0l88gQnrECRJvwNbMdIGPM0cq9IiL3fpfZ5l19UtNTw7NEh6%2Fm1WrAOhlsrfGfnutox%2FWvW2vL5%2BAQ%3D%3D
My code:
using (var request = new HttpRequest())
{
    request.Proxy = HttpProxyClient.Parse("127.0.0.1:8888");
    request.IgnoreProtocolErrors = true;
    request.KeepAlive = true;
    request.ConnectTimeout = 10000;
    request.AllowAutoRedirect = false;
    request.AddParam("nfo[email]", "kirtchukdaniel@gmail.com");
    request.AddParam("nfo[reGoogle]", "auto-detect");
    request.AddParam("g-recaptcha-response", "");
    request.AddParam("nfo[hash_page]", "tsmd-2016-facebook");
    request.AddParam("nfo[system-2016]", "active");
    request.UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.140 Safari/537.36 Edge/17.17134";
    request.AddHeader("Host ", "www.bitdefender.com");
    request.Referer = "https://www.bitdefender.com/site/Facebook/redFreeDownload?nfo%5Bemail%5D=agytnpbu%40gmx.com&nfo%5BreGoogle%5D=auto-detect&g-recaptcha-response=&nfo%5Bhash_page%5D=tsmd-2016-facebook&nfo%5Bsystem-2016%5D=active";
    var attempt = request.Get("https://www.bitdefender.com/site/Promotions/spreadPromotions/").ToString();
    Console.WriteLine(attempt);
}
Console.ReadKey();

The answer is simple:
request.Cookies = new CookieDictionary();
Also remove this line (or it can throw an exception):
request.AddHeader("Host ", "www.bitdefender.com");
// =>
// request.AddHeader("Host ", "www.bitdefender.com");
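For reference, a minimal sketch of reading the collected cookies back out afterwards (assuming the consolidated xNet namespace and that CookieDictionary derives from Dictionary<string, string>, as in common xNet builds):
using System;
using xNet; // some builds use xNet.Net instead

class CookieDump
{
    static void Main()
    {
        using (var request = new HttpRequest())
        {
            // An empty dictionary makes xNet record Set-Cookie headers
            // automatically across every request on this instance.
            request.Cookies = new CookieDictionary();
            request.Get("https://www.bitdefender.com/");

            // The stored cookies can then be enumerated and re-sent
            // on the next request without parsing them by hand.
            foreach (var cookie in request.Cookies)
                Console.WriteLine(cookie.Key + "=" + cookie.Value);
        }
    }
}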

Related

Can't download web page in .net

I wrote a batch job that parses HTML pages from gearbest.com to extract item data (example link, link).
It worked until 2-3 weeks ago, when the site was updated.
Since then I can't download the pages to parse, and I don't understand why.
Before the update I made the request with the following code, using HtmlAgilityPack:
HtmlWeb web = new HtmlWeb();
HtmlDocument doc = null;
doc = web.Load(url); // this is the point where the exception is thrown
I also tried without the library, adding some data to the request:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://it.gearbest.com/tv-box/pp_009940949913.html");
request.Credentials = CredentialCache.DefaultCredentials;
request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36";
request.ContentType = "text/html; charset=UTF-8";
request.CookieContainer = new CookieContainer();
request.Headers.Add("accept-language", "it-IT,it;q=0.9,en-US;q=0.8,en;q=0.7");
request.Headers.Add("accept-encoding", "gzip, deflate, br");
request.Headers.Add("upgrade-insecure-requests", "1");
request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8";
WebResponse response = request.GetResponse(); // exception thrown here
The exception is:
IOException: Unable to read data from the transport connection
SocketException: The connection could not be established.
If I try to request the main page (https://it.gearbest.com) it works.
What's the problem in your opinion?
For some reason the server doesn't like the provided user agent. If you omit setting UserAgent, everything works fine:
HttpWebRequest request = (HttpWebRequest) WebRequest.Create("https://it.gearbest.com/tv-box/pp_009940949913.html");
request.Credentials = CredentialCache.DefaultCredentials;
//request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36";
request.ContentType = "text/html; charset=UTF-8";
Another solution is to set request.Connection to a random string (but not keep-alive or close):
request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36";
request.Connection = "random value";
It also works, but I cannot explain why.
These two settings might also be worth a try (from https://stackoverflow.com/a/16140621/1302730):
request.KeepAlive = false;
request.ProtocolVersion = HttpVersion.Version10;
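Putting the user-agent fix together, here is a minimal end-to-end sketch of the working variant (the product URL is the one from the question and may no longer resolve):
using System;
using System.IO;
using System.Net;

class GearbestFetch
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("https://it.gearbest.com/tv-box/pp_009940949913.html");

        // Deliberately no UserAgent: the server appears to reset
        // connections that present the Chrome string above.
        request.ContentType = "text/html; charset=UTF-8";
        request.CookieContainer = new CookieContainer();
        request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            string html = reader.ReadToEnd();
            Console.WriteLine(html.Length); // page downloaded, ready for HtmlAgilityPack
        }
    }
}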

Why do I keep getting 401 from my WebRequest code in C#

If I use a regular URL it works fine, but if I use the Google Domains update URL I get a 401 error. This is my first try at a C# application.
HttpWebRequest request = WebRequest.Create("https://UUUUUUUUUUUUU:PPPPPPPPPPPPP@domains.google.com/nic/update?hostname=subdomain.example.com") as HttpWebRequest;
//request.Accept = "application/xrds+xml";
request.UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.102 Safari/537.36 Viv/1.97.1246.7";
request.UseDefaultCredentials = true;
request.PreAuthenticate = true;
request.Credentials = CredentialCache.DefaultCredentials;
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
WebHeaderCollection header = response.Headers;
var encoding = ASCIIEncoding.ASCII;
using (var reader = new System.IO.StreamReader(response.GetResponseStream(), encoding))
{
string responseText = reader.ReadToEnd();
//responseddns = responseText;
MessageBox.Show(responseText);
}
If I use http://example.com/getip.php it works fine and I can see the output.
You cannot use
> `CredentialCache.DefaultCredentials`
here: those are your Windows credentials, and the URL is on domains.google.com's domain, which won't accept them.
You need to enter your Google credentials explicitly, or else directly use
http://example.com/getip.php as you did before.
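Google Domains expects HTTP Basic authentication with the per-host username/password it generates for Dynamic DNS. A hedged sketch of sending those credentials explicitly (UUUUUUUUUUUUU/PPPPPPPPPPPPP are the placeholders from the question):
using System;
using System.IO;
using System.Net;
using System.Text;

class DdnsUpdate
{
    static void Main()
    {
        // Placeholders: use the credentials Google Domains generates
        // for the Dynamic DNS record, not your Google account password.
        string user = "UUUUUUUUUUUUU";
        string pass = "PPPPPPPPPPPPP";

        var request = (HttpWebRequest)WebRequest.Create("https://domains.google.com/nic/update?hostname=subdomain.example.com");

        // Send Basic credentials up front instead of DefaultCredentials.
        string token = Convert.ToBase64String(Encoding.ASCII.GetBytes(user + ":" + pass));
        request.Headers.Add("Authorization", "Basic " + token);
        request.UserAgent = "MyDdnsClient/1.0"; // any descriptive client name

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd()); // e.g. "good 1.2.3.4"
        }
    }
}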

HttpWebResponse - The Operation timed out

I am trying to download and save the favicon for various websites. For the majority, the following code works. However, I have a problem with some URLs, for example:
https://www.bestbuy.com/favicon.ico
https://www.macys.com/favicon.ico
I can open these URLs in my default browser (Firefox) without any problems.
This is the code I'm using for the HttpWebRequest, and where I get the exception:
HttpWebRequest request = WebRequest.Create(uri) as HttpWebRequest;
request.Timeout = 10000;
request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
request.Headers.Add("Upgrade-Insecure-Requests", "1");
request.CookieContainer = new CookieContainer();
request.UserAgent = "Application name here";
response = request.GetResponse() as HttpWebResponse;
Any ideas why the example URLs time out (again, most work fine)?
You're getting blocked by your user agent. Send something a browser would send. I used this:
Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/62.0.3202.75 Safari/537.36
HttpWebRequest request = WebRequest.Create(uri) as HttpWebRequest;
request.Timeout = 10000;
request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
request.Headers.Add("Upgrade-Insecure-Requests", "1");
request.CookieContainer = new CookieContainer();
request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/62.0.3202.75 Safari/537.36";
response = request.GetResponse() as HttpWebResponse;
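Since the goal is to save the favicon, a short follow-up sketch for writing the downloaded bytes to disk (the output filename is just an illustration):
// Once the response arrives, copy the icon bytes straight to a file.
using (var response = request.GetResponse() as HttpWebResponse)
using (var body = response.GetResponseStream())
using (var file = System.IO.File.Create("favicon.ico"))
{
    body.CopyTo(file);
}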

HttpRequest Header Response C#

I have a problem using the response of HttpRequest(): I get the response, but only the HTML, not the headers, and the key I am searching for is in a header. This is my code:
HttpRequest rq = new HttpRequest();
rq.Cookies = new CookieDictionary();
rq.UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/59.0.3071.115 Safari/537.36";
rq.AllowAutoRedirect = true;
rq.IgnoreProtocolErrors = true;
rq.ConnectTimeout = TimeOut;
rq.KeepAlive = true;
var str = rq.Get("url").ToString();
if (str.Contains("404"))
{
}
I hope you can help me.
I found the answer, thanks for your help:
var req = rq.Get("url");
if (req.StatusCode.ToString().Contains("NotFound"))
{
}
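For completeness, a hedged sketch of reading a response header with xNet (assuming HttpResponse exposes a string indexer for header names, as in common xNet builds; "X-Some-Key" is a hypothetical header name):
HttpResponse response = rq.Get("url");

// The status code is an enum, so it can be compared directly
// instead of matching on its string form.
if (response.StatusCode == HttpStatusCode.NotFound)
{
    // handle 404 here
}

// Headers live on the HttpResponse object, not in the body string.
string headerValue = response["X-Some-Key"];
Console.WriteLine(headerValue);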

C# - The underlying connection was closed: An unexpected error occurred on a send

First of all, I think my issue is different from the other topics on Stack Overflow, since I've already tried the solutions in them.
I'm using .NET 4.5:
HttpWebRequest MainRequest = (HttpWebRequest)WebRequest.Create(url);
WebHeaderCollection myWebHeaderCollection = MainRequest.Headers;
MainRequest.Method = "GET";
MainRequest.Host = "aro.example.com";
MainRequest.Timeout = 20000;
MainRequest.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8";
myWebHeaderCollection.Add("Upgrade-Insecure-Requests", "1");
MainRequest.UserAgent = "Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/48.0.2564.103 Safari/537.36";
myWebHeaderCollection.Add("Accept-Encoding", "gzip, deflate, sdch");
myWebHeaderCollection.Add("Accept-Language", "en-US,en;q=0.8");
MainRequest.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
MainRequest.ServicePoint.Expect100Continue = false;
MainRequest.CookieContainer = new CookieContainer();
HttpWebResponse MainResponse = (HttpWebResponse)MainRequest.GetResponse();
This throws the exception "The underlying connection was closed: An unexpected error occurred on a send".
I added:
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
I get no exception now, but also no response until the timeout occurs.
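One detail worth checking, as a hedged note: SecurityProtocol is process-wide, so it should be set once at startup, before the first request is created, and combining flags lets the TLS handshake negotiate a version instead of forcing one:
using System.Net;

// Set once, before any HttpWebRequest is created in the process.
ServicePointManager.SecurityProtocol =
    SecurityProtocolType.Tls12 | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls;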
