(C#, HttpWebRequest) Timeout property is not working

The request is not stopped after the period of time set in Timeout has passed.
I'd like to add a way to stop the request automatically once a set period has elapsed, since the server fails to respond from time to time.
Please find the source below.
=========================================================================
String urlSetting = "http://www.test.com";
HttpWebRequest request = WebRequest.Create(urlSetting) as HttpWebRequest;
request.UserAgent = "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)";
request.Headers.Add("Accept-Language", "en"); // the language header belongs in Headers, not inside the UserAgent string
request.MediaType = "text/html";
request.Method = System.Net.WebRequestMethods.Http.Get;
request.ContentType = "text/xml; charset=utf-8";
request.Timeout = 3000;
request.Proxy = new WebProxy() { UseDefaultCredentials = true };
HttpWebResponse webResponse = (HttpWebResponse)request.GetResponse();
=========================================================================
I am using GetResponse() to call the server, but sometimes the server does not respond.
I added "request.Timeout = 3000;" to stop the request once three seconds have passed,
but that part doesn't work.
The Internet environment here goes through a proxy, as shown in the proxy settings screenshot.
I wonder if there is another way to make Timeout work in a proxy environment.
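One detail that often explains this: Timeout only covers connecting and waiting for the response headers (GetResponse and GetRequestStream). Time spent reading the response body is governed by ReadWriteTimeout, which defaults to 300 seconds, so a server that accepts the connection but then stalls mid-body will hang far longer than 3 seconds. A minimal sketch, assuming the hang happens while reading the body (same placeholder URL as above):
// requires: using System; using System.IO; using System.Net;
String urlSetting = "http://www.test.com";
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(urlSetting);
request.Timeout = 3000;          // applies to GetResponse()/GetRequestStream()
request.ReadWriteTimeout = 3000; // applies to reads on the response stream
request.Proxy = new WebProxy() { UseDefaultCredentials = true };
using (HttpWebResponse webResponse = (HttpWebResponse)request.GetResponse())
using (StreamReader reader = new StreamReader(webResponse.GetResponseStream()))
{
    string body = reader.ReadToEnd(); // a stalled read now throws after 3 seconds
}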

Related

GET method returns root URL content

I am trying to get the content of this URL: https://www.eganba.com/index.php?p=Products&ctg_id=2000&sort_type=rel-desc&view=0&page=1
But as a result of the following code, the response contains the content of the home page, https://www.eganba.com, instead.
In addition, when I request the first URL via the Postman application, the response is correct.
Do you have any idea?
WebRequest request = WebRequest.Create("https://www.eganba.com/index.php?p=Products&ctg_id=2000&sort_type=rel-desc&view=0&page=1");
request.Method = "GET";
request.Headers["X-Requested-With"] = "XMLHttpRequest";
WebResponse response = request.GetResponse();
Use the WebClient class, which is in System.Net. I think this code gives you what you need; it returns the page's HTML:
using (WebClient client = new WebClient())
{
    client.Headers.Add("user-agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)");
    client.Headers.Add("accept", "text/html");
    var htmlCode = client.DownloadString("https://www.eganba.com/?p=Products&ctg_id=2000&sort_type=rel-desc&view=0&page=1");
    var result = htmlCode.Contains("Stokta var"); // Contains already returns a bool; no ternary needed
}
Hope it helps.
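As a side note, on current .NET versions WebClient is marked obsolete and HttpClient is the recommended replacement. A rough equivalent of the snippet above (a sketch; CheckStockAsync is a made-up name):
// requires: using System.Net.Http; using System.Threading.Tasks;
static async Task<bool> CheckStockAsync()
{
    using (HttpClient client = new HttpClient())
    {
        // TryAddWithoutValidation avoids the strict User-Agent format checks
        client.DefaultRequestHeaders.TryAddWithoutValidation("User-Agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)");
        client.DefaultRequestHeaders.Add("Accept", "text/html");
        string htmlCode = await client.GetStringAsync("https://www.eganba.com/?p=Products&ctg_id=2000&sort_type=rel-desc&view=0&page=1");
        return htmlCode.Contains("Stokta var");
    }
}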

Do I have to update the cookies every time?

I am using my C# app to send requests to a webpage. Before I send any request I have to be logged in to this webpage. I want to avoid logging in every time I want to do some work, so I am storing the cookies in a SQL Server database in a VARBINARY column.
Let's say I am sending 50 POST requests every day:
private string getRequest(string url, string postData = "")
{
    try
    {
        System.Net.ServicePointManager.Expect100Continue = true;
        StreamReader reader;
        var request = WebRequest.Create(url) as HttpWebRequest;
        request.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.14) Gecko/20080404 Firefox/2.0.0.14";
        request.CookieContainer = Cookie;
        request.AllowAutoRedirect = true;
        request.ContentType = "application/x-www-form-urlencoded";
        request.Headers.Add("X-Requested-With", "XMLHttpRequest");
        postData = Uri.EscapeUriString(postData);
        request.Method = "POST";
        Byte[] postBytes = Encoding.ASCII.GetBytes(postData);
        request.ContentLength = postBytes.Length; // byte count, not character count
        Stream requestStream = request.GetRequestStream();
        requestStream.Write(postBytes, 0, postBytes.Length);
        requestStream.Close();
        HttpWebResponse response = request.GetResponse() as HttpWebResponse;
        foreach (Cookie tempCookie in response.Cookies)
        {
            Cookie.Add(tempCookie);
        }
        reader = new StreamReader(response.GetResponseStream());
        string readerReadToEnd = reader.ReadToEnd();
        response.Close();
        database.updateCookie(AccountId, Cookie); // here I am updating the cookies
        return readerReadToEnd;
    }
    catch { return ""; }
}
Do I really need to update the cookies after every request? Or could I update them only once, after sending all 50 POST requests?
I am asking because sometimes my cookies last a couple of days, and sometimes they die after one minute. I don't really know why, or how to avoid that.
Do I have to store the newest version of the cookies, or can I use the same ones every time?
Each site decides how long particular cookies are valid, so you need to check that for each site individually. Paying attention to the "expires" attribute of each cookie (in the "Set-Cookie" response headers) gives you initial guidance on when a cookie is guaranteed to expire.
Common expiration/validity ranges:
authentication cookies - from 10 minutes to 1 day
ASP.NET session state - from 20 minutes
CSRF protection - possibly updated on every response
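If you want to decide at runtime whether the stored cookies are still usable, you can inspect the container before reusing it. A minimal sketch (HasLiveCookies and the example URI are hypothetical, not from the question):
// requires: using System; using System.Net;
static bool HasLiveCookies(CookieContainer container, Uri site)
{
    foreach (Cookie c in container.GetCookies(site))
    {
        if (!c.Expired) // Expired turns true once the cookie's Expires timestamp has passed
            return true;
    }
    return false;
}
// usage: reuse the saved container only while it still holds a live cookie
// if (!HasLiveCookies(Cookie, new Uri("http://example.com"))) { /* log in again and refresh */ }
Note that session cookies (no explicit expiration) never report Expired = true, and a server can still invalidate a cookie early on its side, so treat this as a first-pass check only.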

"The underlying connection was closed: The connection was closed unexpectedly"

When I try to get an HTML page I get this error:
The underlying connection was closed: The connection was closed unexpectedly
I think the site I'm fetching uses some IP-based protection.
WebClient single_page_client = new WebClient();
single_page_client.Headers.Add("user-agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; .NET CLR 1.0.3705;)");
string cat_page_single = single_page_client.DownloadString(the_url);
How can I do it?
What about using a proxy with WebClient?
EDIT
If I use this code instead, it works. Why?
HttpWebRequest webrequest = (HttpWebRequest)WebRequest.Create(current_url);
webrequest.KeepAlive = true;
webrequest.Method = "GET";
webrequest.ContentType = "text/html";
webrequest.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
//webrequest.Connection = "keep-alive";
webrequest.Host = "cat.sabresonicweb.com";
webrequest.Headers.Add("Accept-Language", "en-US,en;q=0.5");
webrequest.UserAgent = "Mozilla/5.0 (Windows NT 6.1; rv:18.0) Gecko/20100101 Firefox/18.0";
HttpWebResponse webresponse = (HttpWebResponse)webrequest.GetResponse();
Console.Write(webresponse.StatusCode);
Stream receiveStream = webresponse.GetResponseStream();
Encoding enc = System.Text.Encoding.GetEncoding(1252); // Windows-1252
StreamReader loResponseStream = new StreamReader(receiveStream, enc);
string current_page = loResponseStream.ReadToEnd();
loResponseStream.Close();
webresponse.Close();
The first request does not get a header indicating the length of the result, so the server simply closes the connection when it finishes sending.
The second request gets the length header: the client reads the designated number of bytes and then closes the connection gracefully (under client-side control instead of a server-driven disconnect).
-or-
The URL you sent caused an error on the server. Is there an error in the server log or Event Viewer?
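If you would rather stay with WebClient, note that its GetWebRequest method is virtual, so you can subclass it and set the same HttpWebRequest properties the working snippet uses. A sketch, assuming KeepAlive and the browser-like headers are what made the difference (the class name is made up):
// requires: using System; using System.Net;
class BrowserLikeWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        HttpWebRequest request = (HttpWebRequest)base.GetWebRequest(address);
        request.KeepAlive = true; // assumption: the default WebClient request lacked this
        request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
        request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; rv:18.0) Gecko/20100101 Firefox/18.0";
        return request;
    }
}
// usage:
// string cat_page_single = new BrowserLikeWebClient().DownloadString(the_url);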

C# request for a webpage fails but succeeds in a web browser (headers and cookies checked)

My friend is using C# to write a simple program that requests a webpage.
However, he encounters a problem when trying to request one specific webpage.
He has already tried setting all the headers and cookies in the request, but he still gets a timeout exception.
The example webpage is https://my.ooma.com
Here is the code:
string url = "https://my.ooma.com";
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.Timeout = 30000;
request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.52 Safari/536.5";
request.Method = "GET";
request.CookieContainer = new CookieContainer();
request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
request.Headers.Add("Accept-Charset:ISO-8859-1,utf-8;q=0.7,*;q=0.3");
request.Headers.Add("Accept-Encoding:gzip,deflate,sdch");
request.Headers.Add("Accept-Language:en-US,en;q=0.8");
request.KeepAlive = true;
WebResponse myResponse = request.GetResponse();
StreamReader sr = new StreamReader(myResponse.GetResponseStream());
string result = sr.ReadToEnd();
sr.Close();
myResponse.Close();
All the headers are the same as when browsing the webpage with Chrome.
And he didn't see any cookies being set when checking with the Chrome developer tools.
Can anyone successfully request the page using C#?
Thanks a lot.
Sorry for being late.
The following code snippet should work just fine. I also tried it with your old URL that had "getodds.xgi" in it, and it worked fine as well.
ServicePointManager.SecurityProtocol tells the service point to use the secure sockets layer (SSL) protocol only for connections that use the Secure Hypertext Transfer Protocol (HTTPS) scheme.
You don't need to set any UserAgent or headers if the goal is just to get a response.
ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3;
WebRequest request = WebRequest.Create("http://my.ooma.com/");
string htmlResponse = string.Empty;
using (WebResponse response = request.GetResponse())
{
    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
    {
        htmlResponse = reader.ReadToEnd();
        reader.Close();
    }
    response.Close();
}
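One caveat: SSL 3.0 has been deprecated for years and most servers now reject it outright, so if the snippet above fails today, forcing TLS 1.2 instead is the usual fix (one line, assuming .NET Framework 4.5 or later):
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;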

How do I perform a simple GET request with cookies in C#

Everything I find is either about a POST request or does not assume cookies.
I have a URL like this:
http://page.com/find/1,1,1,find.html?advanced=1&param1=val1&param2[]=val2
When put into a browser, this will direct me to a search results page. Now I would like to replicate it in a C# program. I have this so far:
WebRequest req = WebRequest.Create(url);
((HttpWebRequest)req).UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:10.0.2) Gecko/20100101 Firefox/10.0.2";
req.Method = "GET";
WebResponse response = req.GetResponse();
When I run it, it returns a "please login" page, as expected. But there is a problem with one of the parameters. This is the response URL:
http://page.com/login.html?ref=find/1,1,1,find.html?advanced=1&param1=val1&param2=Array
So, two questions: what might have happened to param2? And how do I add cookies to this?
EDIT: Managed to set the cookies by casting to HttpWebRequest.
As devio said, you should use HttpWebRequest. I did a quick-and-dirty test to check it.
Prepare the cookies to send. I made them available to the whole localhost domain:
HttpWebRequest rq = (HttpWebRequest)WebRequest.Create("http://localhost/test.php");
rq.CookieContainer = new CookieContainer();
rq.CookieContainer.Add(new Cookie("test", "xxxx", "/", "localhost"));
Your server-side script should set cookies to make them available in the response, and then you can read them back:
HttpWebResponse resp = (HttpWebResponse)rq.GetResponse();
foreach (Cookie c in resp.Cookies) // CookieCollection is non-generic, so name the Cookie type explicitly
{
    Debug.WriteLine("{0}: {1}", c.Name, c.Value); // requires using System.Diagnostics;
}
You can use this code piece:
Cookie SessionCookie = new Cookie("{CookieName}", "{CookieValue}")
{
    Domain = "{domain you want to hit}", Path = "/", Expired = false, HttpOnly = true
};
CookieContainer SessionCookieHolder = new CookieContainer();
SessionCookieHolder.Add(SessionCookie); // add the cookie to the container (not to itself)
try
{
    HttpWebRequest WebReq = (HttpWebRequest)WebRequest.Create("{URL}");
    WebReq.CookieContainer = SessionCookieHolder; // attach the container, not the single cookie
    WebReq.Method = "GET"; // or "POST"/"HEAD", depending on the request type
    WebReq.KeepAlive = true;
    HttpWebResponse Resp = (HttpWebResponse)WebReq.GetResponse();
}
catch (Exception ex)
{
    string ExceptionReader = ex.Message;
}
Cookies are stored in the HttpWebRequest.CookieContainer and HttpWebResponse.Cookies properties.
For subsequent requests, you need to add the response's cookies to the request's cookie container.
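In practice the simplest way to do that is to reuse one CookieContainer instance for every request in the session; the container records Set-Cookie headers from each response automatically and replays the matching cookies on later requests. A sketch using the URLs from the question as placeholders:
// requires: using System.Net;
CookieContainer session = new CookieContainer();

HttpWebRequest login = (HttpWebRequest)WebRequest.Create("http://page.com/login.html");
login.CookieContainer = session;
login.GetResponse().Close(); // any cookies set here land in 'session'

HttpWebRequest search = (HttpWebRequest)WebRequest.Create("http://page.com/find/1,1,1,find.html?advanced=1&param1=val1&param2[]=val2");
search.CookieContainer = session; // same container, so the login cookies are sent back
WebResponse searchResponse = search.GetResponse();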
