Clear context after HttpWebRequest - C#

We use an HttpWebRequest to send a query to a web service with a NetworkCredential, like this:
Byte[] bytes = Encoding.UTF8.GetBytes("...");
HttpWebRequest request = (HttpWebRequest)WebRequest.CreateHttp(strURL);
request.Method = "POST";
request.ContentLength = bytes.Length;
request.ContentType = "application/xml";
request.Timeout = 10000;
request.KeepAlive = false;
NetworkCredential cred = new NetworkCredential
{
    UserName = "...",
    Password = "..."
};
request.Credentials = cred;
WebRequest.DefaultWebProxy.Credentials = CredentialCache.DefaultCredentials;
using (Stream requestStream = request.GetRequestStream())
{
    requestStream.Write(bytes, 0, bytes.Length);
}
HttpWebResponse resp = request.GetResponse() as HttpWebResponse;
To test it, we created a small console project with just this request and constant data (always the same).
After two successful requests, the third one consistently hangs and ends in a timeout.
But if we restart our application, we can immediately issue two requests again before the third one freezes.
Do you have any idea of the cause? It looks like some state needs to be purged before we can continue.
-- EDIT --
I tried to start each request on a separate thread (using Thread or Task), with the same result.

I changed everything to use HttpClient and now everything works fine.
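For context, the described symptom (two requests succeed, then the third hangs until the timeout) is what you typically see when the HttpWebResponse objects are never disposed: .NET allows only two connections per host by default (the same limit mentioned in the Parallel.For answer further down), so the third request waits for a connection that never gets released. Below is a minimal sketch of an HttpClient-based replacement, with placeholder URL and credentials and a single client instance shared across requests; it is an illustration, not the poster's actual code:
// One HttpClient for the whole application lifetime (do not create one per request).
private static readonly HttpClient client = new HttpClient(new HttpClientHandler
{
    // Placeholder credentials, matching the original request.Credentials.
    Credentials = new NetworkCredential("...", "...")
})
{
    Timeout = TimeSpan.FromSeconds(10)
};

public static async Task<string> PostXmlAsync(string strURL, string xml)
{
    using (var content = new StringContent(xml, Encoding.UTF8, "application/xml"))
    using (HttpResponseMessage response = await client.PostAsync(strURL, content))
    {
        response.EnsureSuccessStatusCode();
        // Disposing the response and reusing one client keeps the connection pool healthy.
        return await response.Content.ReadAsStringAsync();
    }
}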

Related

Problem with getting the response of an HTTP web request

You can see my code below. It is supposed to show the user ID of an Instagram user in the response, but it shows "�".
private void button18_Click(object sender, EventArgs e)
{
    string username = ""; // your username
    string password = ""; // your password
    HttpWebRequest httpWebRequest = (HttpWebRequest)WebRequest.Create("https://i.instagram.com/api/v1/accounts/login/");
    httpWebRequest.Headers.Add("X-IG-Connection-Type", "WiFi");
    httpWebRequest.Method = "POST";
    httpWebRequest.ContentType = "application/x-www-form-urlencoded; charset=UTF-8";
    httpWebRequest.Headers.Add("X-IG-Capabilities", "AQ==");
    httpWebRequest.Accept = "*/*";
    httpWebRequest.UserAgent = "Instagram 10.9.0 Android (23/6.0.1; 944dpi; 915x1824; samsung; SM-T185; gts210velte; qcom; en_GB)";
    httpWebRequest.Headers.Add("Accept-Encoding", "gzip, deflate");
    httpWebRequest.Headers.Add("Accept-Language", "en;q=1, ru;q=0.9, ar;q=0.8");
    httpWebRequest.Headers.Add("Cookie", "mid=qzejldb8ph9eipis9e6nrd1n457b;csrftoken=a7nd2ov9nbxgqy473aonahi58y21i8ee");
    httpWebRequest.Host = "i.instagram.com";
    httpWebRequest.KeepAlive = true;
    byte[] bytes = new ASCIIEncoding().GetBytes("ig_sig_key_version=5&signed_body=5128c31533802ff7962073bb1ebfa9972cfe3fd9c5e3bd71fe68be1d02aa92c8.%7B%22username%22%3A%22"+ username + "%22%2C%22password%22%3A%22"+ password +"%22%2C%22_uuid%22%3A%22D26E6E86-BDB7-41BE-8688-1D60DE60DAF6%22%2C%22_uid%22%3A%22%22%2C%22device_id%22%3A%22android-a4d01b84202d6018%22%2C%22_csrftoken%22%3A%22a7nd2ov9nbxgqy473aonahi58y21i8ee%22%2C%22login_attempt_count%22%3A%220%22%7D");
    httpWebRequest.ContentLength = (long)bytes.Length;
    using (Stream requestStream = httpWebRequest.GetRequestStream())
    {
        requestStream.Write(bytes, 0, bytes.Length);
    }
    string result = new StreamReader(((HttpWebResponse)httpWebRequest.GetResponse()).GetResponseStream()).ReadToEnd();
    textBox1.Text = result;
}
It's answered in a comment by @Dour, but I'm going to add a bit of detail that isn't mentioned.
The following line tells the server: Hey server! Please send me a compressed response (to reduce its size).
httpWebRequest.Headers.Add("Accept-Encoding", "gzip, deflate");
So the response you're getting is a compressed response.
Does that mean you should just remove this line?
Removing it will fix your problem, but that isn't the correct way to do it.
It's better to get a reduced-size response, because it takes less time to download from the server.
What you really need to do is make HttpWebRequest handle the decompression for you.
You can do that by setting the AutomaticDecompression property.
httpWebRequest.AutomaticDecompression = DecompressionMethods.Deflate | DecompressionMethods.GZip;
Note: With the previous line, you do NOT need to set Accept-Encoding yourself. HttpWebRequest will send it automatically for you. And when you call GetResponse, HttpWebRequest will handle the decompression process for you.
Aside from your problem:
I suggest that you use a using statement when getting the response, because with your current code I don't think it will get disposed.
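As an illustration, here is a sketch of the response-handling part with AutomaticDecompression enabled and the response, stream, and reader all wrapped in using blocks (the request setup from the question is assumed, minus the manual Accept-Encoding header):
httpWebRequest.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
// ... set up the rest of the request as above, without adding Accept-Encoding yourself ...
using (HttpWebResponse response = (HttpWebResponse)httpWebRequest.GetResponse())
using (Stream responseStream = response.GetResponseStream())
using (StreamReader reader = new StreamReader(responseStream))
{
    // The body arrives gzip/deflate-compressed but is decompressed transparently here.
    textBox1.Text = reader.ReadToEnd();
}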

Do I have to update the cookies every time?

I am using my C# app to send requests to a web page. Before I send any request I have to be logged in to this page. I want to avoid logging in every time I want to do some work, so I am storing the cookies in a SQL Server database in a VARBINARY column.
Let's say I am sending 50 POST requests every day:
private string getRequest(string url, string postData = "")
{
    try
    {
        System.Net.ServicePointManager.Expect100Continue = true;
        StreamReader reader;
        var request = WebRequest.Create(url) as HttpWebRequest;
        request.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.14) Gecko/20080404 Firefox/2.0.0.14";
        request.CookieContainer = Cookie;
        request.AllowAutoRedirect = true;
        request.ContentType = "application/x-www-form-urlencoded";
        request.Headers.Add("X-Requested-With", "XMLHttpRequest");
        postData = Uri.EscapeUriString(postData);
        request.Method = "POST";
        request.ContentLength = postData.Length;
        Stream requestStream = request.GetRequestStream();
        Byte[] postBytes = Encoding.ASCII.GetBytes(postData);
        requestStream.Write(postBytes, 0, postBytes.Length);
        requestStream.Close();
        HttpWebResponse response = request.GetResponse() as HttpWebResponse;
        foreach (Cookie tempCookie in response.Cookies)
        {
            Cookie.Add(tempCookie);
        }
        reader = new StreamReader(response.GetResponseStream());
        string readerReadToEnd = reader.ReadToEnd();
        response.Close();
        database.updateCookie(AccountId, Cookie); // here I am updating the cookies
        return readerReadToEnd;
    }
    catch { return ""; }
}
Do I really need to update the cookies after every request? Or could I update my cookies only once, after sending all 50 POST requests?
I am asking because sometimes my cookies last a couple of days, and sometimes they die after 1 minute. I don't really know why, or how to avoid that.
Do I have to store the newest version of the cookies, or can I use the same ones every time?
Each site decides how long particular cookies are valid, so you need to get that information for each site individually to do it correctly. Paying attention to the expiration on each cookie (in the "Set-Cookie" headers of the response) may give you initial guidance on when a cookie is guaranteed to expire.
Common expiration/validity ranges:
authentication cookies - from 10 minutes to 1 day
ASP.Net session state - from 20 minutes
CSRF protection - possibly updated on every response
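If it helps to make that concrete, here is a small sketch that inspects the expirations of the cookies you already hold (the URL is a placeholder, and Cookie is the CookieContainer field from the question), so you can decide whether a stored set is still worth reusing:
Uri site = new Uri("https://example.com"); // placeholder for the site you post to
CookieContainer storedCookies = Cookie;    // the container from the question's code
foreach (Cookie c in storedCookies.GetCookies(site))
{
    // Expires == DateTime.MinValue usually means a session cookie with no explicit expiration.
    string expires = c.Expires == DateTime.MinValue ? "session" : c.Expires.ToString();
    Console.WriteLine(c.Name + ": expired=" + c.Expired + ", expires=" + expires);
}
In practice, if the cookies that matter are short-lived (session or CSRF tokens), saving the container after every request is the safer choice; if they are long-lived authentication cookies, updating the database once per batch is usually enough.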

Writing a cookie locally

I let my program (C#) log in to a website, and I get the correct buffered cookie information. Yet I get a 401, session timed out, when I want to retrieve the data behind my login.
So I figured the website must not be able to retrieve that cookie info. I can't figure out how to store it so that the website can retrieve it.
WebRequest req = WebRequest.Create(Url);
req.ContentType = "application/x-www-form-urlencoded";
req.Method = "POST";
byte[] bytes = Encoding.ASCII.GetBytes(Gegevens);
req.ContentLength = bytes.Length;
using (Stream os = req.GetRequestStream())
{
os.Write(bytes, 0, bytes.Length);
}
WebResponse resp = req.GetResponse();
cookieHeader = resp.Headers["Set-cookie"];
cookieHeader contains the correct info. Thanks in advance.
You need to assign a CookieContainer to your web request and use this very same CookieContainer for the following requests, too.
See MSDN for reference.
You could (if you want to persist the cookies after closing your application) get the list of cookies from the CookieContainer and serialize them. When the application opens again, you could deserialize them and rebuild the CookieContainer.
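As an illustration of that idea, here is a sketch of persisting a CookieContainer to JSON and rebuilding it later. It assumes .NET 6+ (for CookieContainer.GetAllCookies) and System.Text.Json; on older frameworks you would collect the cookies per domain via GetCookies(uri) instead:
using System.Collections.Generic;
using System.IO;
using System.Net;
using System.Text.Json;

public record SavedCookie(string Name, string Value, string Path, string Domain);

public static class CookiePersistence
{
    public static void Save(CookieContainer container, string file)
    {
        var saved = new List<SavedCookie>();
        foreach (Cookie c in container.GetAllCookies()) // GetAllCookies requires .NET 6+
            saved.Add(new SavedCookie(c.Name, c.Value, c.Path, c.Domain));
        File.WriteAllText(file, JsonSerializer.Serialize(saved));
    }

    public static CookieContainer Load(string file)
    {
        var container = new CookieContainer();
        foreach (var c in JsonSerializer.Deserialize<List<SavedCookie>>(File.ReadAllText(file)))
            container.Add(new Cookie(c.Name, c.Value, c.Path, c.Domain));
        return container;
    }
}
A plain name/value/path/domain snapshot like this is easy to maintain, but note that it drops attributes such as expiration and Secure, so treat it as a starting point.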
From the comment you provided, I'm going to hazard a guess and say you aren't properly adding your login cookies to your next WebRequest. Cookie handling with a WebRequest object is a bit difficult, so I recommend using HttpWebRequest and HttpWebResponse, which have built-in cookie parsing. You only have to change a couple of lines of code here and there:
Building a request (using the same example in your question)
CookieContainer cookies = new CookieContainer();
// When using HttpWebRequest or HttpWebResponse, you need to cast for it to work properly
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
req.CookieContainer = cookies;
req.ContentType = "application/x-www-form-urlencoded";
req.Method = "POST";
byte[] bytes = Encoding.ASCII.GetBytes(Gegevens);
req.ContentLength = bytes.Length;
using (Stream os = req.GetRequestStream())
{
os.Write(bytes, 0, bytes.Length);
}
// Cast here as well
using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
{
// Code related to the web response goes in here
}
Right now, your cookie information is saved in the CookieContainer object. It can be reused later in your code to validate your login. You don't need to write this cookie information to the disk if you don't need to.
Building a request with cookie information
(Pretty much the same as above, but you're not adding POST data and you're using a GET request)
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
req.CookieContainer = cookies; // This is where you add your login cookies to the current request
req.Method = "GET";
using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
{
// Code related to the web response goes here
}
Hopefully this will get you on the right track :)

Send multiple WebRequests in Parallel.For

I want to send multiple WebRequests. I used a Parallel.For loop to do that, but the loop runs once and the second time it gives an error while getting the response.
Error:
The operation has timed out
Code :
Parallel.For(0, 10, delegate(int i)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(new Uri("http://www.mysite.com/service"));
    string dataToSend = "Data";
    byte[] buffer = System.Text.Encoding.GetEncoding(1252).GetBytes(dataToSend);
    request.Method = "POST";
    request.ContentType = "application/x-www-form-urlencoded";
    request.ContentLength = buffer.Length;
    request.Host = "www.mysite.com";
    Stream requestStream = request.GetRequestStream();
    requestStream.Write(buffer, 0, buffer.Length);
    requestStream.Close();
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
});
Most likely the problem is that you need to call response.Close() after you're done processing the response.
In addition to what Jim Mischel said about calling Close on the response, you also need to factor in that, by default, .NET limits an application to only two active HTTP connections per domain at once. To change this you can set System.Net.ServicePointManager.DefaultConnectionLimit programmatically or set the same thing via config using the <system.net><connectionManagement> configuration section.
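Putting both suggestions together, here is a sketch of the loop with the response disposed and the per-host connection limit raised (the URL and payload are the question's placeholders):
// Raise the per-host connection limit once, before issuing any requests (the default is 2).
System.Net.ServicePointManager.DefaultConnectionLimit = 20;

Parallel.For(0, 10, delegate(int i)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(new Uri("http://www.mysite.com/service"));
    byte[] buffer = System.Text.Encoding.GetEncoding(1252).GetBytes("Data");
    request.Method = "POST";
    request.ContentType = "application/x-www-form-urlencoded";
    request.ContentLength = buffer.Length;

    using (Stream requestStream = request.GetRequestStream())
    {
        requestStream.Write(buffer, 0, buffer.Length);
    }

    // Disposing the response releases the connection back to the pool.
    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    {
        // process the response here
    }
});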

HttpWebRequest to HTTPS server works in Fiddler but NOT from Visual Studio

Here is the scenario. I have written code that uses a digital certificate to GET cookies from a secured URL, and then to POST data back to another URL using the retrieved cookies and the same digital certificate. The GET works and the cookies are retrieved, but the POST comes back with an error 500. I installed Fiddler to see what was going on... the POST looks fine... the cookies are present. I used the feature in Fiddler that allows creating a request via drag and drop. POSTing the exact same request recorded from the C# code works!
What is Fiddler doing that Visual Studio is not? It must be doing something if Fiddler can POST the data but Visual Studio gets an error 500. Here is the code:
X509Certificate cert = new X509Certificate("mycert.pfx", "certpassword");
HttpWebRequest req = (HttpWebRequest)WebRequest.Create("https://servertoGETcookies/fileUpload.html");
req.CookieContainer = new CookieContainer();
req.Method = "GET";
req.ClientCertificates.Add(cert);
HttpWebResponse Response = (HttpWebResponse)req.GetResponse();
CookieCollection ck = req.CookieContainer.GetCookies(req.RequestUri);
string strcookie = ck[0].Value;
string strcookie2 = ck[1].Value;
Response.Close();
req = (HttpWebRequest)WebRequest.Create("https://servertoPOSTdatawithcookies/main");
req.CookieContainer = new CookieContainer();
Cookie auth = new Cookie("_wl_authcookie_", strcookie2);
Cookie jsess = new Cookie("JSESSIONID", strcookie);
auth.Domain = "server";
jsess.Domain = "server";
req.CookieContainer.Add(auth);
req.CookieContainer.Add(jsess);
req.ClientCertificates.Add(cert);
req.Method = "POST";
Byte[] data = ReadByteArrayFromFile("filewithdatatoPOST.txt");
req.ContentLength = data.Length;
Stream myStream = req.GetRequestStream();
myStream.Write(data, 0, data.Length);
myStream.Close();
HttpWebResponse Response2 = (HttpWebResponse)req.GetResponse();
Stream strm = Response2.GetResponseStream();
StreamReader sr2 = new StreamReader(strm);
Response2.Close();
Does your code work if you set
req.ServicePoint.Expect100Continue = false;
on all your WebRequests?
Solved!!! It wasn't related to Expect100Continue.
After weeks of troubleshooting and 6 different programmers, I figured it out. I'm not sure if this is always true, but in this scenario the problem was that the URL we were getting cookies from:
HttpWebRequest req = (HttpWebRequest)WebRequest.Create("https://servertoGETcookies/fileUpload.html");
was not the same as the URL that we were posting the data back to:
req = (HttpWebRequest)WebRequest.Create("https://servertoPOSTdatawithcookies/main");
Getting the cookies from, and posting back to, the same URL fixed the issue:
HttpWebRequest req = (HttpWebRequest)WebRequest.Create("https://servertoGETandPOSTcookies/main");
req = (HttpWebRequest)WebRequest.Create("https://servertoGETandPOSTcookies/main");
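For what it's worth, here is a sketch (using the question's placeholder URLs and its ReadByteArrayFromFile helper) of the same flow with a single shared CookieContainer, so the cookies collected by the GET are re-sent on the POST automatically instead of being copied into new Cookie objects by hand:
CookieContainer jar = new CookieContainer();
X509Certificate cert = new X509Certificate("mycert.pfx", "certpassword");

HttpWebRequest getReq = (HttpWebRequest)WebRequest.Create("https://servertoGETandPOSTcookies/main");
getReq.CookieContainer = jar;
getReq.ClientCertificates.Add(cert);
using (getReq.GetResponse()) { } // the cookies land in "jar"

HttpWebRequest postReq = (HttpWebRequest)WebRequest.Create("https://servertoGETandPOSTcookies/main");
postReq.CookieContainer = jar;   // same container: the cookies are sent again automatically
postReq.ClientCertificates.Add(cert);
postReq.Method = "POST";
byte[] data = ReadByteArrayFromFile("filewithdatatoPOST.txt");
postReq.ContentLength = data.Length;
using (Stream myStream = postReq.GetRequestStream())
{
    myStream.Write(data, 0, data.Length);
}
using (HttpWebResponse response = (HttpWebResponse)postReq.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    string body = reader.ReadToEnd();
}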
