How to access an external Live Meeting URL via HttpWebRequest - C#

I'm trying to access an external Live Meeting URL using HttpWebRequest and am getting a 401 Unauthorized error. The same code works on my local system.
Code:
HttpWebRequest myReq = (HttpWebRequest)WebRequest.Create(PostingUrl);
CredentialCache CredMCCache = new CredentialCache();
myReq.PreAuthenticate = true;
CredMCCache.Add(new System.Uri(PostingUrl), "Basic", new System.Net.NetworkCredential("username", "password"));
myReq.Credentials = CredMCCache;
myReq.KeepAlive = true;
myReq.UserAgent = "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.04506.30; .NET CLR 3.0.04506.648; .NET CLR 3.5.21022)";
myReq.Accept = "*/*";
myReq.Headers.Add("Accept-Language", "en-us");
myReq.Headers.Add("Accept-Encoding", "gzip, deflate");
WebProxy proxyObject = new WebProxy("proxy url with port", false);
myReq.Proxy = proxyObject;
myReq.Proxy.Credentials = CredentialCache.DefaultNetworkCredentials;
myReq.Method = "GET";
HttpWebResponse myResp = null;
// Get the response from the conference center
myResp = (HttpWebResponse)myReq.GetResponse();
I am getting the error on the GetResponse() line above. Any pointers would be helpful.

Why are you setting the proxy, e.g.
myReq.Proxy = proxyObject;
Do you need to do this? If you are indeed going through a corporate proxy, you shouldn't need to set the proxy on the HttpWebRequest; it will pick up the settings (if any) from IE.
Secondly, are you trying to use Basic authentication to authenticate with the remote server? It looks like you are, so use this instead to set the authentication details in the header:
string authInfo = userName + ":" + userPassword;
authInfo = Convert.ToBase64String(Encoding.Default.GetBytes(authInfo));
myReq.Headers["Authorization"] = "Basic " + authInfo;

Related

How to read HTML source of a page that requires NTLM authentication

I need to get the HTML source of the web page.
This web page is a part of the web site that requires NTLM authentication.
This authentication is silent because Internet Explorer can use Windows log-in credentials.
Is it possible to reuse this silent authentication (i.e. reuse Windows log-in credentials), without making the user enter his/her credentials manually?
The options I have tried are below.
string url = @"http://myWebSite";
//works fine
System.Diagnostics.Process.Start("IExplore.exe", url);
InternetExplorer ie = null;
ie = new SHDocVw.InternetExplorer();
ie.Navigate(url);
//Works up to here, but I do not know how to read the HTML source with SHDocVw
NHtmlUnit.WebClient webClient = new NHtmlUnit.WebClient(BrowserVersion.INTERNET_EXPLORER_8);
HtmlPage htmlPage = webClient.GetHtmlPage(url);
string ghjg = htmlPage.WebResponse.ContentAsString; // Error 401
System.Net.WebClient client = new System.Net.WebClient();
client.Credentials = CredentialCache.DefaultNetworkCredentials;
client.Proxy.Credentials = CredentialCache.DefaultCredentials;
// DefaultNetworkCredentials and DefaultCredentials are empty
client.Headers.Add("user-agent", "Mozilla/5.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/4.0; GTB7.4; InfoPath.2; SV1; .NET CLR 3.3.69573; WOW64; en-US)");
string reply = client.DownloadString(url); // Error 401
HttpWebRequest request = HttpWebRequest.Create(url) as HttpWebRequest;
IWebProxy proxy = request.Proxy;
// If a default proxy is configured, attach the logged-on user's credentials to it.
if (proxy != null)
{
// Use the default credentials of the logged on user.
proxy.Credentials = CredentialCache.DefaultNetworkCredentials;
// DefaultNetworkCredentials are empty
}
request.UserAgent = "Mozilla/5.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/4.0; GTB7.4; InfoPath.2; SV1; .NET CLR 3.3.69573; WOW64; en-US)";
request.Accept = "*/*";
HttpWebResponse response = request.GetResponse() as HttpWebResponse;
Stream stream = response.GetResponseStream(); // Error 401
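For reference, the usual way to get this silent NTLM pass-through with HttpWebRequest is to attach the current Windows credentials to the request itself, not just to the proxy. Note that DefaultNetworkCredentials always looks empty when you inspect it; the user name and password are deliberately not exposed, but they are still used during the NTLM handshake. A minimal sketch, assuming the 401 comes from the target server rather than a proxy:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
// Send the logged-on user's Windows credentials during the NTLM handshake.
request.Credentials = CredentialCache.DefaultNetworkCredentials;
// (Equivalently: request.UseDefaultCredentials = true;)
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    string html = reader.ReadToEnd(); // the page source
}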

c# screen scraping and getting all the cookies for secured access to a website

I'm trying to access a website through a C# program. Three cookies seem to be needed to access the website, yet I only receive two in my cookie container, so when I try to access other parts of the website I can't. I first do a GET and then a POST, because from the Chrome Dev tools it looked like the site sets the first two cookies on a GET and then sets the third on the login POST. The POST returns a 302 Moved Temporarily followed immediately by a redirect, which I believe is why I can't obtain the last cookie. Can anyone shed any light?
cookieJar = new CookieContainer();
string formParams = string.Format("USERNAME={0}&PASSWORD={1}", username, password);
Console.Write(" \n 1st count before anything : " + cookieJar.Count + "\n"); // 0 cookies
//First go to the login page to obtain cookies
HttpWebRequest loginRequest = (HttpWebRequest)HttpWebRequest.Create("https://server.com/login/login.jsp");
loginRequest.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
//.Connection = "keep-alive";
loginRequest.Method = "GET";
loginRequest.UseDefaultCredentials = true;
loginRequest.CookieContainer = cookieJar;
loginRequest.AllowAutoRedirect = false;
HttpWebResponse loginResponse = (HttpWebResponse)loginRequest.GetResponse();
Console.Write(" \n 2nd count after first response : " + cookieJar.Count + "\n"); // Only 2 are recorded.
//Create another request to actually log into website
HttpWebRequest doLogin = (HttpWebRequest)HttpWebRequest.Create("https://server.com/login/login.jsp");
doLogin.Method = "POST";
doLogin.ContentType = "application/x-www-form-urlencoded";
doLogin.AllowAutoRedirect = false;
byte[] bytes = Encoding.ASCII.GetBytes(formParams);
doLogin.ContentLength = bytes.Length;
using (Stream os = doLogin.GetRequestStream())
{
os.Write(bytes, 0, bytes.Length);
}
doLogin.CookieContainer = cookieJar;
doLogin.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/27.0.1453.110 Safari/537.36";
doLogin.Referer = "https://server.com/login/login.jsp";
HttpWebResponse Response = (HttpWebResponse)doLogin.GetResponse();
Console.Write(" \n 3rd count after second repsonse : " + cookieJar.Count + "\n"); // still two
HttpWebRequest had a problem with cookies.
The problem was that a cookie assigned to "server.com" would be changed to ".server.com"; however, "server.com" did not match ".server.com".
If you are using a framework version older than (I think it is) 3, you are probably hitting this problem.
The workaround is to use e.g. "www.server.com" in your request; this will match cookies assigned to ".server.com".
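A minimal sketch of that workaround (the host names are placeholders for whatever your site actually uses):
// Request via "www.server.com" so cookies stored under ".server.com"
// match the request host.
CookieContainer cookieJar = new CookieContainer();
HttpWebRequest req = (HttpWebRequest)WebRequest.Create("https://www.server.com/login/login.jsp");
req.CookieContainer = cookieJar;
using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
{
    // Inspect what actually landed in the container for this URI.
    foreach (Cookie c in cookieJar.GetCookies(new Uri("https://www.server.com/")))
        Console.WriteLine(c.Name + " = " + c.Value + " (domain: " + c.Domain + ")");
}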

HttpWebRequest cookie not being set in POST

I'm trying to post a form, but I keep getting an error when I look at the returned HTML code, saying:
Cookies must be enabled in order to submit the form
I use the following code to submit the form:
string postData = "submit_hidden=submit_hidden&captcha_item_key=" + captcha_item_key + "&security_code=" + lastCaptcha + "&submit=Submit";
byte[] _data = _enc.GetBytes(postData); // _enc is an Encoding instance, e.g. Encoding.UTF8
_wReq = (HttpWebRequest)WebRequest.Create(urlen);
_wReq.Referer = urlen;
_wReq.CookieContainer = myContainer;
_wReq.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
_wReq.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.3");
_wReq.Headers.Add("Accept-Language", "sv-SE,sv;q=0.8,en-US;q=0.6,en;q=0.4");
_wReq.KeepAlive = true;
_wReq.UserAgent = "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 2.0.50727)";
_wReq.Method = "POST";
_wReq.ContentType = "application/x-www-form-urlencoded";
_wReq.ContentLength = _data.Length;
System.IO.Stream _outStream = _wReq.GetRequestStream();
_outStream.Write(_data, 0, _data.Length);
_outStream.Close();
_wResp = (HttpWebResponse)_wReq.GetResponse();
_sr = new System.IO.StreamReader(_wResp.GetResponseStream());
_respStr = _sr.ReadToEnd();
_sr.Close();
_wResp.Close();
I am saving the cookie into a cookie container, so I'm not sure why it's not working.
JavaScript is not the problem, as I've tried to submit the form from a browser without JavaScript activated.
I also checked the submission with Fiddler, and the only thing that differs between a browser and my method is that the cookie for some reason does not get set properly.
Any ideas?
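One thing that often causes exactly this symptom: the session cookie is issued by the page that serves the form, so if you never GET that page with the same CookieContainer, the POST arrives with no cookie at all and the server concludes cookies are disabled. A sketch of priming the container first, reusing the question's urlen and myContainer names:
// Prime the CookieContainer with a GET before the POST, so the session
// cookie set on the form page is sent back with the submission.
CookieContainer myContainer = new CookieContainer();
HttpWebRequest getReq = (HttpWebRequest)WebRequest.Create(urlen);
getReq.CookieContainer = myContainer;
using (HttpWebResponse getResp = (HttpWebResponse)getReq.GetResponse()) { }
// ...then build the POST exactly as above, reusing the same myContainer.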

Performing HTTP methods from a Windows application in C#

There are many sites which call a script upon form submission and pass in parameters using HTTP POST or GET; using a web debugger, I have found the parameters being passed. Now I wish to do the same thing from my Windows application in C#. How can I achieve this?
I am currently using the HttpWebRequest and HttpWebResponse classes in C#, but it is a pain, as I have to write explicit code for each page I try to load and work with. For example, I am passing a username and password to a PHP page and taking the response, which will send a cookie and a page in return, based on which I identify whether the user has logged in or not.
HttpWebRequest loginreq = createreq("http://www.indyarocks.com/mobile/index.php");
String logintext = "username=" + TxtUsrname.Text + "&pass=" + TxtPasswd.Password + "&button.x=0&button.y=0";
loginreq.ContentLength = logintext.Length;
StreamWriter writerequest = new StreamWriter(loginreq.GetRequestStream());
writerequest.Write(logintext);
writerequest.Close();
HttpWebResponse getloginpageresponse = (HttpWebResponse)loginreq.GetResponse();
cookie = getloginpageresponse.Cookies[0];
BinaryFormatter bf1 = new BinaryFormatter();
Stream f1 = new FileStream("E:\\cookie.dat", FileMode.OpenOrCreate);
bf1.Serialize(f1, cookie);
f1.Close();
string nexturl = getloginpageresponse.Headers[HttpResponseHeader.Location];
StreamReader readresponse = new StreamReader(getloginpageresponse.GetResponseStream());
if (nexturl == "p_mprofile.php")
{
MessageBox.Show("Login Successful");
GrpMsg.IsEnabled = true;
}
else if (nexturl == "index.php?msg=1")
{
MessageBox.Show("Invalid Credentials Login again");
}
This is my createreq method:
private HttpWebRequest createreq(string url)
{
HttpWebRequest temp = (HttpWebRequest)WebRequest.Create(url);
temp.Method = "POST";
temp.UserAgent = "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0; SLCC1; .NET CLR 2.0.50727; Media Center PC 5.0; .NET CLR 3.0.04506; .NET CLR 3.5.21022; FDM)";
temp.KeepAlive = true;
temp.ContentType = "application/x-www-form-urlencoded";
temp.CookieContainer = new CookieContainer();
temp.AllowAutoRedirect = false;
return temp;
}
Am I on the right track? Is there any better way to do it?
You should use System.Net.WebClient.
You can use it to make a request with any method and headers that you'd like, and get the resulting page with a simple stream read.
There's a simple example on the MSDN page, but some sample code for using it might look like:
WebClient webclient = new WebClient();
using (StreamReader reader = new StreamReader(webclient.OpenRead("http://www.google.com")))
{
string result = reader.ReadToEnd();
// Parse web page here
}
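Since the question is specifically about POSTing form fields, WebClient.UploadValues is probably the closer fit; a sketch with a placeholder URL and field names:
// POST form fields with WebClient.UploadValues (URL and fields are placeholders).
using (WebClient client = new WebClient())
{
    var form = new System.Collections.Specialized.NameValueCollection();
    form["username"] = "user";
    form["pass"] = "secret";
    byte[] responseBytes = client.UploadValues("http://example.com/index.php", "POST", form);
    string response = Encoding.UTF8.GetString(responseBytes);
}
One caveat: WebClient has no built-in cookie handling, so for the login-cookie part of the question you would still need HttpWebRequest with a CookieContainer (or a WebClient subclass that attaches one).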

Why does my HttpWebRequest return 400 Bad Request?

The following code fails with a 400 Bad Request exception. My network connection is good and I can go to the site in a browser, but I cannot get this URI with HttpWebRequest.
private void button3_Click(object sender, EventArgs e)
{
WebRequest req = HttpWebRequest.Create(@"http://www.youtube.com/");
try
{
//returns a 400 bad request... Any ideas???
WebResponse response = req.GetResponse();
}
catch (WebException ex)
{
Log(ex.Message);
}
}
First, cast the WebRequest to an HttpWebRequest like this:
HttpWebRequest req = (HttpWebRequest)HttpWebRequest.Create(@"http://www.youtube.com/");
Then, add this line of code:
req.UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 1.1.4322; .NET CLR 2.0.50727)";
Set UserAgent and Referer in your HttpWebRequest:
var request = (HttpWebRequest)WebRequest.Create(@"http://www.youtube.com/");
request.Referer = "http://www.youtube.com/"; // optional
request.UserAgent =
"Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0; WOW64; " +
"Trident/4.0; SLCC1; .NET CLR 2.0.50727; Media Center PC 5.0; " +
".NET CLR 3.5.21022; .NET CLR 3.5.30729; .NET CLR 3.0.30618; " +
"InfoPath.2; OfficeLiveConnector.1.3; OfficeLivePatch.0.0)";
try
{
var response = (HttpWebResponse)request.GetResponse();
using (var reader = new StreamReader(response.GetResponseStream()))
{
var html = reader.ReadToEnd();
}
}
catch (WebException ex)
{
Log(ex);
}
There could be many causes for this problem. Do you have any more details about the WebException?
One cause, which I've run into before, is a bad user agent string. Some websites (Google, for instance) check that requests come from known user agents to prevent automated bots from hitting their pages.
In fact, you may want to check that the user agreement for YouTube does not preclude you from doing what you're doing. If it does, then what you're doing may be better accomplished by going through approved channels such as web services.
Maybe you've got a proxy server running, and you haven't set the Proxy property of the HttpWebRequest?
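If that turns out to be the case, a quick test with a placeholder proxy address:
var request = (HttpWebRequest)WebRequest.Create("http://www.youtube.com/");
request.Proxy = new WebProxy("http://yourproxy:8080", true); // placeholder address
request.Proxy.Credentials = CredentialCache.DefaultNetworkCredentials;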
