I'm having an issue with HttpWebRequest accepting cookies; this is my code:
HttpWebRequest req= (HttpWebRequest)WebRequest.Create("https://www.companyabc.com/security?action=authenticate");
req.CookieContainer = new CookieContainer();
req.CookieContainer.Add(new Uri("https://www.companyabc.com"), new CookieCollection());
string postData = "account_id=xxxx&password=xxxx";
req.KeepAlive = true;
byte[] send = Encoding.Default.GetBytes(postData);
req.Method = "POST";
req.ContentType = "application/x-www-form-urlencoded";
req.ContentLength = send.Length;
Stream sout = req.GetRequestStream();
sout.Write(send, 0, send.Length);
sout.Flush();
sout.Close();
The response I'm getting is:
Sorry...
We have detected that your browser is not set up to allow Session Cookies. Our platform uses cookies to help enhance your overall user experience. You cannot log in without them.
Please enable Session Cookies and try again. Contact us at ....
What am I doing wrong? Thank you in advance.
FYI, I can access the web page and log in without any issues from a browser. The issue comes when I try to automate the process.
Sounds like the URL you're talking to is expecting some specific headers. Try setting the User-Agent header to Mozilla/5.0 (Windows NT 6.3; rv:36.0) Gecko/20100101 Firefox/36.0 (the default Firefox user agent).
Failing that, have a look at the request in Firefox (open the developer tools, and switch to the 'Network' tab before you press the login button) and see what other headers the request has.
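For example, here is a minimal sketch of the request above with the User-Agent set and the response actually read, so the CookieContainer can pick up any cookies the server sends back (the URL and form fields are the ones from the question):
CookieContainer cookies = new CookieContainer();
HttpWebRequest req = (HttpWebRequest)WebRequest.Create("https://www.companyabc.com/security?action=authenticate");
req.CookieContainer = cookies;
req.Method = "POST";
req.ContentType = "application/x-www-form-urlencoded";
req.UserAgent = "Mozilla/5.0 (Windows NT 6.3; rv:36.0) Gecko/20100101 Firefox/36.0";
byte[] send = Encoding.Default.GetBytes("account_id=xxxx&password=xxxx");
req.ContentLength = send.Length;
using (Stream sout = req.GetRequestStream())
{
    sout.Write(send, 0, send.Length);
}
// Reading the response is what lets the CookieContainer capture the Set-Cookie headers;
// reuse the same CookieContainer for the next request in the session.
using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
using (StreamReader reader = new StreamReader(resp.GetResponseStream()))
{
    string body = reader.ReadToEnd();
}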
Related
I am trying to programmatically connect to the Find My network (part of iCloud) to locate devices attached to my Apple ID.
This should be possible using web requests. For example, I POST a request with my credentials to:
https://setup.icloud.com/setup/ws/1/login
string jsonreq = Newtonsoft.Json.JsonConvert.SerializeObject(temp); //temp is an object with login credentials
byte[] dataStream = Encoding.UTF8.GetBytes(jsonreq);
WebRequest webRequest = WebRequest.Create("https://setup.icloud.com/setup/ws/1/login");
webRequest.Method = "POST";
webRequest.Headers.Set("Origin", "https://www.icloud.com");
webRequest.ContentLength = dataStream.Length;
Stream newStream = webRequest.GetRequestStream();
// Attach the data.
newStream.Write(dataStream, 0, dataStream.Length);
newStream.Close();
WebResponse webResponse = webRequest.GetResponse();
// read server url, dsid, Cookie from response
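For reference, the commented step can be done roughly like this (a sketch; the exact JSON fields iCloud returns are not spelled out here):
using (StreamReader reader = new StreamReader(webResponse.GetResponseStream()))
{
    string json = reader.ReadToEnd();                      // body containing the server url, dsid, ...
    string setCookie = webResponse.Headers["Set-Cookie"];  // cookies to replay on later requests
}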
This works for the first step, but now I'm facing the problem that two-factor authentication is required for logging in.
Can someone help me with, or show me, the next steps in the login process via web requests with two-factor authentication? (Of course I receive the login authentication code on my Apple devices.)
I am trying to patch a .NET web application that, after years of working, started failing to get UPS shipping quotes, which is impacting web business dramatically. After much trial and error, I found the following code that works just fine in a console application:
static string FindUPSPlease()
{
string post_data = "<xml data string>";
string uri = "https://onlinetools.ups.com/ups.app/xml/Rate";
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
request.Method = "POST";
request.KeepAlive = false;
request.ProtocolVersion = HttpVersion.Version10;
byte[] postBytes = Encoding.ASCII.GetBytes(post_data);
request.ContentType = "application/x-www-form-urlencoded";
request.ContentLength = postBytes.Length;
Stream requestStream = request.GetRequestStream();
requestStream.Write(postBytes, 0, postBytes.Length);
requestStream.Close();
// get response and send to console
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Console.WriteLine(new StreamReader(response.GetResponseStream()).ReadToEnd());
Console.WriteLine(response.StatusCode);
return "done";
}
This runs in Visual Studio just fine and gets a nice little response from UPS that the XML is, of course, malformed.
But, if I paste this function into the web application without changing a single character, an exception is thrown on request.GetRequestStream():
Authentication failed because the remote party has closed the transport stream.
I tried it in a couple of different places in the application with the same result.
What is there about the web application environment that would affect the request?
It turns out to be a TLS issue. I guess the console app uses a higher protocol by default than the web application, although none was specified. So, all you have to do is add the following line(s) of code sometime prior to making the request:
using System.Net;
...
System.Net.ServicePointManager.SecurityProtocol |= SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
That was all it took, though I spent an enormous amount of time getting there.
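For an ASP.NET application, one common place to put it is application startup (a sketch, assuming a Global.asax; any code path that runs before the first outbound request will do):
using System.Net;

public class Global : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        // Opt in to TLS 1.1/1.2 once, before any HTTPS requests go out.
        ServicePointManager.SecurityProtocol |= SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
    }
}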
Here is the response from UPS on the issue:
Effective January 18, 2018, UPS will only accept TLS 1.1 and TLS 1.2 security protocols... 100% of requests from customers who are on TLS 1.0 while using production URLS (onlinetools.ups.com/tool name) will be rejected.
Anyway, hope this helps someone.
Jim
Can you try setting the Credentials on your request object like the following:
request.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
Try setting the default credentials, or check whether a proxy server is configured and pass it as in the example below.
The example is given for WebClient.
I was having a problem with setting the default credentials, as a proxy was enabled on the server, so I passed the proxy URL and port along with credentials that can access it.
using (System.Net.WebClient web = new System.Net.WebClient())
{
//IWebProxy defaultWebProxy = WebRequest.DefaultWebProxy;
//defaultWebProxy.Credentials = CredentialCache.DefaultCredentials;
//web.Proxy = defaultWebProxy;
var proxyURI = new Uri(string.Format("{0}:{1}", proxyURL, proxyPort));
//Set credentials
System.Net.ICredentials credentials = new System.Net.NetworkCredential(proxyUserId, proxyPassword);
//Set proxy
web.Proxy = new System.Net.WebProxy(proxyURI, true, null, credentials);
web.Headers.Add("Content-Type", "application/x-www-form-urlencoded");
var result = web.UploadString(URL, "");
return result;
}
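The same proxy setup also works with HttpWebRequest, if that is what you are using (a sketch; proxyURL, proxyPort, proxyUserId, proxyPassword and URL are placeholders as above):
var proxyUri = new Uri(string.Format("{0}:{1}", proxyURL, proxyPort));
var proxyCredentials = new System.Net.NetworkCredential(proxyUserId, proxyPassword);
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(URL);
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.Proxy = new System.Net.WebProxy(proxyUri, true, null, proxyCredentials);
// Or, to go through whatever proxy the machine is already configured with:
// request.Proxy = WebRequest.DefaultWebProxy;
// request.Proxy.Credentials = CredentialCache.DefaultCredentials;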
When it comes to web development, I know very very little...
I have found some code and explanations on the following site: https://dev.twitter.com/docs/auth/implementing-sign-twitter
Ultimately, I want to implement login with Twitter, but I am having trouble rewriting those POST web requests into a C# HttpWebRequest form that I can reuse in the rest of our apps. If we examine the first web request made...
POST /oauth/request_token HTTP/1.1
User-Agent: themattharris' HTTP Client
Host: api.twitter.com
Accept: */*
Authorization:
OAuth oauth_callback="http%3A%2F%2Flocalhost%2Fsign-in-with-twitter%2F",
oauth_consumer_key="cChZNFj6T5R0TigYB9yd1w",
oauth_nonce="ea9ec8429b68d6b77cd5600adbbb0456",
oauth_signature="F1Li3tvehgcraF8DMJ7OyxO4w9Y%3D",
oauth_signature_method="HMAC-SHA1",
oauth_timestamp="1318467427",
oauth_version="1.0"
I want to transform that into a working HttpWebRequest.
Thus far. My code looks like this...
HttpWebRequest httpReq = (HttpWebRequest)WebRequest.Create("https://api.twitter.com/oauth/request_token");
ASCIIEncoding encoding = new ASCIIEncoding();
httpReq.Method = "POST";
httpReq.ContentType = "application/x-www-form-urlencoded";
httpReq.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
This is unfortunately as far as I got... I don't know how these requests work. I need to include the rest of the data and make the call, but I am stuck. Any help would be greatly appreciated.
Try this:
ASCIIEncoding encoder = new ASCIIEncoding();
byte[] data = encoder.GetBytes(serializedObject); // the data you want to send
HttpWebRequest request = WebRequest.Create("https://api.twitter.com/oauth/request_token") as HttpWebRequest;
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.ContentLength = data.Length;
request.GetRequestStream().Write(data, 0, data.Length);
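Note that the request in the question also carries an OAuth Authorization header, which you will need to add as well. A sketch (the values below are just the sample ones from the question; in practice oauth_nonce, oauth_timestamp and oauth_signature must be computed per the OAuth 1.0a spec for every request):
request.Headers.Add("Authorization",
    "OAuth oauth_callback=\"http%3A%2F%2Flocalhost%2Fsign-in-with-twitter%2F\", " +
    "oauth_consumer_key=\"cChZNFj6T5R0TigYB9yd1w\", " +
    "oauth_nonce=\"ea9ec8429b68d6b77cd5600adbbb0456\", " +
    "oauth_signature=\"F1Li3tvehgcraF8DMJ7OyxO4w9Y%3D\", " +
    "oauth_signature_method=\"HMAC-SHA1\", " +
    "oauth_timestamp=\"1318467427\", " +
    "oauth_version=\"1.0\"");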
Also a possible duplicate (similar question): Why I get 411 Length required error?
I want to automate the download of an .exe that is offered from a link, from the client side. I can get the first redirect from http://go.microsoft.com/fwlink/?LinkID=149156 to http://www.microsoft.com/getsilverlight/handlers/getsilverlight.ashx. Please click and check how it works: fwlink -> .ashx -> .exe. I want to get the direct link to the .exe.
But the response returns 404 when requesting the web handler through code, whereas in a browser it actually downloads.
Can anyone suggest how to automate the download from the above link? The code I am using to follow the redirect is this:
public static string GetLink(string url)
{
HttpWebRequest httpWebRequest = WebRequest.Create(url) as HttpWebRequest;
httpWebRequest.Method = "HEAD";
httpWebRequest.AllowAutoRedirect = false;
// httpWebRequest.ContentType = "application/octet-stream";
//httpWebRequest.Headers.Add("content-disposition", "attachment; filename=Silverlight.exe");
HttpWebResponse httpWebResponse = httpWebRequest.GetResponse() as HttpWebResponse;
if (httpWebResponse.StatusCode == HttpStatusCode.Redirect)
{
return httpWebResponse.GetResponseHeader("Location");
}
else
{
return null;
}
}
Just tested this out and it will download the file.
WebClient client = new WebClient();
client.Headers.Add("user-agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)");
client.DownloadFile(url, "Filename.exe");
You just needed to add the user-agent, as the particular Silverlight download depends on what browser you are running; if it can't detect one, it will fail.
Change the user-agent to something that will trigger the appropriate download you want.
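If you specifically want the direct link to the .exe rather than downloading it, the HEAD/redirect approach from the question may work once a User-Agent is set (a sketch based on that code; not verified against the handler):
public static string GetLink(string url)
{
    HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
    req.Method = "HEAD";
    req.AllowAutoRedirect = false;
    // Without a User-Agent the handler cannot pick a download and returns 404.
    req.UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)";
    using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
    {
        return resp.StatusCode == HttpStatusCode.Redirect
            ? resp.GetResponseHeader("Location")
            : null;
    }
}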
I'm trying to log in to my eBay account using the following code:
string signInURL = "https://signin.ebay.com/ws/eBayISAPI.dll?co_partnerid=2&siteid=0&UsingSSL=1";
string postData = String.Format("MfcISAPICommand=SignInWelcome&userid={0}&pass={1}", "username", "password");
string contentType = "application/x-www-form-urlencoded";
string method = "POST";
string userAgent = "Mozilla/5.0 (Windows; U; MSIE 7.0; Windows NT 6.0; en-US)";
CookieContainer cookieContainer = new CookieContainer();
HttpWebRequest req = (HttpWebRequest)HttpWebRequest.Create(signInURL);
req.CookieContainer = cookieContainer;
req.Method = method;
req.ContentType = contentType;
req.UserAgent = userAgent;
ASCIIEncoding encoding = new ASCIIEncoding();
byte[] loginDataBytes = encoding.GetBytes(postData);
req.ContentLength = loginDataBytes.Length;
Stream stream = req.GetRequestStream();
stream.Write(loginDataBytes, 0, loginDataBytes.Length);
stream.Close();
HttpWebResponse res = (HttpWebResponse)req.GetResponse();
StreamReader xsr = new StreamReader(res.GetResponseStream());
String responseText = xsr.ReadToEnd();
Obviously I'm substituting my real username and password. When I look at the string responseText, I see that part of the response from eBay is:
The browser you are using is rejecting cookies.
Any ideas what I'm doing wrong?
P.S. And yes, I am also using the eBay API, but this is for something slightly different than what I want to do with the API.
You're doing a direct HTTP request. The eBay site expects to talk to a browser (probably to store the session cookie). Unless you make the request code smart enough to handle cookies correctly, it won't work. You'll probably have to use the Internet Explorer object instead.
Before doing the POST you need to download the page with the form that you are submitting in your code, take the cookie they give you, put it in your CookieContainer (making sure you get the path right) and post it back up in your request.
To clarify, while you might be POSTing the correct data, you are not sending the cookie that needs to go with it. You will get this cookie from the login page.
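A sketch of that flow, reusing the variables from the code in the question (eBay may also require hidden form fields from the sign-in page, which are not handled here):
CookieContainer cookieContainer = new CookieContainer();

// 1. GET the sign-in page first so eBay's session cookie lands in the container.
HttpWebRequest getReq = (HttpWebRequest)WebRequest.Create(signInURL);
getReq.CookieContainer = cookieContainer;
getReq.UserAgent = userAgent;
using (HttpWebResponse getRes = (HttpWebResponse)getReq.GetResponse())
using (StreamReader reader = new StreamReader(getRes.GetResponseStream()))
{
    reader.ReadToEnd(); // the form page; parse any hidden fields from here if needed
}

// 2. POST the credentials with the same CookieContainer so the cookie goes back up.
HttpWebRequest postReq = (HttpWebRequest)WebRequest.Create(signInURL);
postReq.CookieContainer = cookieContainer;
postReq.Method = "POST";
postReq.ContentType = contentType;
postReq.UserAgent = userAgent;
byte[] body = Encoding.ASCII.GetBytes(postData);
postReq.ContentLength = body.Length;
using (Stream s = postReq.GetRequestStream())
{
    s.Write(body, 0, body.Length);
}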
You need to intercept the HTTP traffic to see exactly what happened. I use Fiddler2; it is a good tool for debugging HTTP, so I can tell who is at fault, my application or the remote web server.
Using Fiddler you can see the request headers, the response headers with their cookies, and the response content. It sits in the middle, between your app and eBay.
Based on my experience, I think the cookie eBay sends you is not being sent back to the eBay server. Fiddler will show whether or not that is the case.
Another thing: the response cookies you receive should be sent back with the next request by reusing the same CookieContainer.
Note that CookieContainer has a bug in its .Add(Cookie) and .GetCookies(uri) methods. You may not be using them directly, but internal code might.
See the details and fix here:
http://dot-net-expertise.blogspot.com/2009/10/cookiecontainer-domain-handling-bug-fix.html
CallMeLaNN