I am experiencing some strange behavior with a Windows Phone 8 app that I am building, and I hope someone here has experience with it.
I am reading a website using a normal HttpWebRequest and expecting a cookie in the response. However, somehow I am not getting the Set-Cookie header back in my WebResponse. I have created the same functionality under WPF and it works as expected - it returns the Set-Cookie header in the response.
I have also tried looking at the CookieContainer of the response, but it is also empty.
Here is the code I am using. Note: the same piece of code (without the async parts) works correctly in WPF and returns the Set-Cookie header. I can post that version as well if necessary:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://www.mysite.com/login");
request.Method = "POST"; // HttpWebRequest.Method is a string, not an HttpMethod
request.AllowAutoRedirect = false; // normally there is a redirect in place

string postData = "username=1234&password=2345";
var data = Encoding.UTF8.GetBytes(postData);
request.ContentType = "application/x-www-form-urlencoded";
request.ContentLength = data.Length;

using (var stream = await Task.Factory.FromAsync<Stream>(request.BeginGetRequestStream, request.EndGetRequestStream, null))
{
    await stream.WriteAsync(data, 0, data.Length);
}

using (var response = await Task.Factory.FromAsync<WebResponse>(request.BeginGetResponse, request.EndGetResponse, null))
{
    return response.Headers["Set-Cookie"];
}
As a result, I am getting some response headers back (such as Content-Type and server-specific ones) but not the Set-Cookie one.
I've done some more tests, and the Set-Cookie header is omitted only on the Windows Phone emulator. When debugging on an actual device, the header is received as expected.
It is still pretty strange to me why the emulator behaves this way. I have seen many posts about issues with HttpOnly cookies in the emulator, but none with a concrete reason.
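For reference, the CookieContainer variant I mentioned above looks roughly like this (a sketch; the container has to be attached before the request is sent, and if HttpOnly cookies are the culprit, they would only ever show up here rather than in the raw headers):

var uri = new Uri("https://www.mysite.com/login");
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
request.Method = "POST";
request.CookieContainer = new CookieContainer(); // must be assigned before sending

// ... write the form data exactly as in the code above ...

using (var response = await Task.Factory.FromAsync<WebResponse>(request.BeginGetResponse, request.EndGetResponse, null))
{
    // Cookies parsed into the container, keyed by the request URI
    foreach (Cookie cookie in request.CookieContainer.GetCookies(uri))
    {
        System.Diagnostics.Debug.WriteLine("{0} = {1}", cookie.Name, cookie.Value);
    }
}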
UPDATE:
Testing on the 8.0.10322 emulator works just fine - cookies are handled correctly. It looks as though the default phone emulator does something fishy with the cookies.
I am trying to patch a .NET web application that, after years of working, started failing to get UPS shipping quotes, which is impacting web business dramatically. After much trial and error, I found the following code, which works just fine in a console application:
static string FindUPSPlease()
{
    string post_data = "<xml data string>";
    string uri = "https://onlinetools.ups.com/ups.app/xml/Rate";

    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
    request.Method = "POST";
    request.KeepAlive = false;
    request.ProtocolVersion = HttpVersion.Version10;

    byte[] postBytes = Encoding.ASCII.GetBytes(post_data);
    request.ContentType = "application/x-www-form-urlencoded";
    request.ContentLength = postBytes.Length;

    Stream requestStream = request.GetRequestStream();
    requestStream.Write(postBytes, 0, postBytes.Length);
    requestStream.Close();

    // get response and send to console
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    Console.WriteLine(new StreamReader(response.GetResponseStream()).ReadToEnd());
    Console.WriteLine(response.StatusCode);

    return "done";
}
This runs in Visual Studio just fine and gets a nice little response from UPS that the XML is, of course, malformed.
But, if I paste this function into the web application without changing a single character, an exception is thrown on request.GetRequestStream():
Authentication failed because the remote party has closed the transport stream.
I tried it in a couple of different places in the application, with the same result.
What is there about the web application environment that would affect the request?
It turns out to be a TLS issue. I guess the console app uses a newer TLS version by default than the web application, although neither specified one. So, all you have to do is add the following line(s) of code sometime prior to making the request:
using System.Net;
...
System.Net.ServicePointManager.SecurityProtocol |= SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
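In my case the caller is a web application, so one convenient place for this (assuming an ASP.NET app with a Global.asax; adjust to your setup) is Application_Start, which runs once before any outbound requests are made:

using System.Net;

public class Global : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        // Opt in to TLS 1.1/1.2 without clearing whatever is already enabled
        // (the Tls11/Tls12 enum values require .NET 4.5 or later)
        ServicePointManager.SecurityProtocol |= SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
    }
}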
That was all it took, though I spent an enormous amount of time getting there.
Here is the response from UPS on the issue:
Effective January 18, 2018, UPS will only accept TLS 1.1 and TLS 1.2 security protocols... 100% of requests from customers who are on TLS 1.0 while using production URLS (onlinetools.ups.com/tool name) will be rejected.
Anyway, hope this helps someone.
Jim
Can you try setting the Credentials on your request object, like the following?
request.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
Try setting the default credentials, or check whether there is a proxy server set and pass it in, as in the example below.
The example is given for WebClient.
I was having a problem with setting default credentials, as a proxy was enabled on the server, so I passed the proxy URL and port along with credentials that can access it.
using (System.Net.WebClient web = new System.Net.WebClient())
{
    //IWebProxy defaultWebProxy = WebRequest.DefaultWebProxy;
    //defaultWebProxy.Credentials = CredentialCache.DefaultCredentials;
    //web.Proxy = defaultWebProxy;

    var proxyURI = new Uri(string.Format("{0}:{1}", proxyURL, proxyPort));

    // Set credentials
    System.Net.ICredentials credentials = new System.Net.NetworkCredential(proxyUserId, proxyPassword);

    // Set proxy
    web.Proxy = new System.Net.WebProxy(proxyURI, true, null, credentials);

    web.Headers.Add("Content-Type", "application/x-www-form-urlencoded");
    var result = web.UploadString(URL, "");
    return result;
}
I am getting very frustrated. I am using .NET to import a bunch of leads into vTiger 6.2.0. I am using a simple web request, of which I have seen plenty of working examples, and I have also seen it working from my dev environment, though I would intermittently get the error response shown below from the service. I have now moved over to using live data (still from my machine). There is no reason for the imports to fail, as they all contain a last name.
I have scoured the internet for any relevant information and have come up blank.
I am making the POST request in the following way...
byte[] bytes = Encoding.ASCII.GetBytes(requestParameters);
WebRequest request = WebRequest.Create(url);
request.ContentLength = bytes.Length;
request.ContentType = "application/x-www-form-urlencoded";
request.Method = "POST";

using (var swRequest = request.GetRequestStream())
{
    swRequest.Write(bytes, 0, bytes.Length);
}
The request URL (minus the domain) looks like this: /webservice.php?operation=create
The requestParameters variable looks like this (anonymised the data where there was data):
operation=create&sessionName=660f54e47e39358ce&element={"leadsource":"blaah","cf_757":"blaah","salutationtype":"0.","firstname":"blaah","lastname":"blaah","email":"blaah","phone":"blaah","description":"blaah","cf_833":"blaah","emailoptout":"1","cf_761":"1","cf_759":"1","cf_763":"1","assigned_user_id":"19x5"}&elementType=Leads
In the response I get back, I receive the error below for nearly every record.
MANDATORY_FIELDS_MISSING: lastname does not have a value.
Last name and assigned user are the only mandatory values needed, and the assigned user is the id returned from the login.
Is anybody able to provide some help on this? I have spent hours on it with no success.
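One thing I have not been able to rule out is whether the element JSON needs to be URL-encoded before being written to the application/x-www-form-urlencoded body - if it is sent raw, the server might fail to parse individual fields such as lastname. For reference, a sketch of building the body that way (the session id and field values are the same placeholders as above):

// Build the element JSON, then escape it so the quotes and braces
// survive form decoding on the server side
string element = "{\"lastname\":\"blaah\",\"assigned_user_id\":\"19x5\"}";
string requestParameters = string.Format(
    "operation=create&sessionName={0}&element={1}&elementType=Leads",
    Uri.EscapeDataString("660f54e47e39358ce"),
    Uri.EscapeDataString(element));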
This is not a subject I am strong in, so I apologize ahead of time if I say something ridiculous.
I have developed an HTTP service using Mule. I have it functioning perfectly when I connect directly to the service and send data using a test harness I wrote in C#.
As the final part of my testing, I need to send it to an HTTPS URL that is supposed to "decrypt" the message and forward it to my service. When I send a message to the HTTPS URL, it gets forwarded to my service, but the message contents appear empty and therefore it does not get processed. I understand that I may have to add some "encryption" to my test harness, but I have been researching how to do this all day and nothing I have found answers my question.
Here is an example of the code I am using for the simple HTTP request:
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(ConfigurationManager.AppSettings["HttpDestination"]);
req.Method = "POST";
req.ContentType = "application/x-www-form-urlencoded";
req.ContentLength = data.Length;

using (Stream strm = req.GetRequestStream())
{
    strm.Write(data, 0, data.Length);
}

HttpWebResponse response = (HttpWebResponse)req.GetResponse();
What do I need to change here to make this work?
Here is the solution that I discovered here. I needed to add the following line:
req.ProtocolVersion = System.Net.HttpVersion.Version10;
Without this, a timeout was occurring when getting the request stream, and the content was never being sent - only the headers.
Do I need to just slap some random garbage data into a WebRequest object to get past the HTTP status code 411 restriction on IIS?
I have an HttpPost action method in an MVC 3 app that consumes a POST request with all the relevant information passed in the querystring (no body needed).
[HttpPost] public ActionResult SignUp(string email) { ... }
It worked great from Visual Studio's built-in web host, Cassini. Unfortunately, once the MVC code was live on IIS [7.5 on 2008 R2], the server started pitching back an HTTP error code when I hit it from my outside C# form app.
The remote server returned an error:
(411) Length Required.
Here is the calling code:
WebRequest request = WebRequest.Create("http://somewhere.com/signup/?email=a#b.com");
request.Method = "POST";
using (WebResponse response = request.GetResponse())
using (Stream responseStream = response.GetResponseStream())
using (StreamReader responseReader = new StreamReader(responseStream))
{
    // Do something with responseReader.ReadToEnd();
}
Turns out you can get this to go through by simply slapping an empty content length on the request before you send it.
WebRequest request = WebRequest.Create("http://somewhere.com/signup/?email=a#b.com");
request.Method = "POST";
request.ContentLength = 0;
Not sure how explicitly giving an empty length vs. implying one makes a difference, but IIS was happy after I did it. There are probably other ways around this, but this seems simple enough.
I believe you are required to set a Content-Length header anytime you post a request to a web server:
http://msdn.microsoft.com/en-us/library/system.web.httprequest.contentlength.aspx
You could try a GET request to test it.
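For example, the same call as a GET (a sketch; a GET carries no body, so no Content-Length comes into play):

WebRequest request = WebRequest.Create("http://somewhere.com/signup/?email=a@b.com");
request.Method = "GET"; // no request body, so IIS will not demand a length

using (WebResponse response = request.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    // Inspect the response to confirm the endpoint is reachable
    string body = reader.ReadToEnd();
}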
I'm trying to log in to my eBay account using the following code:
string signInURL = "https://signin.ebay.com/ws/eBayISAPI.dll?co_partnerid=2&siteid=0&UsingSSL=1";
string postData = String.Format("MfcISAPICommand=SignInWelcome&userid={0}&pass={1}", "username", "password");
string contentType = "application/x-www-form-urlencoded";
string method = "POST";
string userAgent = "Mozilla/5.0 (Windows; U; MSIE 7.0; Windows NT 6.0; en-US)";

CookieContainer cookieContainer = new CookieContainer();

HttpWebRequest req = (HttpWebRequest)WebRequest.Create(signInURL);
req.CookieContainer = cookieContainer;
req.Method = method;
req.ContentType = contentType;
req.UserAgent = userAgent;

ASCIIEncoding encoding = new ASCIIEncoding();
byte[] loginDataBytes = encoding.GetBytes(postData);
req.ContentLength = loginDataBytes.Length;

using (Stream stream = req.GetRequestStream())
{
    stream.Write(loginDataBytes, 0, loginDataBytes.Length);
}

HttpWebResponse res = (HttpWebResponse)req.GetResponse();
StreamReader xsr = new StreamReader(res.GetResponseStream());
String responseText = xsr.ReadToEnd();
Obviously substituting my real username and password. When I look at the string responseText, I see that part of the response from eBay is
The browser you are using is rejecting cookies.
Any ideas what I'm doing wrong?
P.S. And yes, I am also using the eBay API, but this is for something slightly different than what I want to do with the API.
You're making a direct HTTP request. The eBay site has functionality to talk to a browser (probably to store the session cookie). Unless you make the request code smart enough to use cookies correctly, it won't work. You'll probably have to use the Internet Explorer object instead.
Before doing the POST, you need to download the page with the form that you are submitting in your code, take the cookie they give you, put it in your CookieContainer (making sure you get the path right), and post it back up in your request.
To clarify: while you might be POSTing the correct data, you are not sending the cookie that needs to go with it. You will get this cookie from the login page.
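A sketch of that flow, assuming the form lives at the same sign-in URL as in the question (the key point is that both requests share one CookieContainer):

CookieContainer jar = new CookieContainer();

// Step 1: GET the login page so the server's cookie lands in the container
HttpWebRequest getReq = (HttpWebRequest)WebRequest.Create(signInURL);
getReq.CookieContainer = jar;
using (getReq.GetResponse()) { }

// Step 2: POST the form with the same container so the cookie goes back up
HttpWebRequest postReq = (HttpWebRequest)WebRequest.Create(signInURL);
postReq.CookieContainer = jar;
postReq.Method = "POST";
// ... write postData and read the response exactly as in the question ...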
You need to intercept the HTTP traffic to see exactly what is happening. I use Fiddler2; it is a good tool for debugging HTTP, so I can tell which side is at fault: my application or the remote web server.
Using Fiddler, you can see the request headers, the response headers with their cookies, and the response content. It sits in the middle, between your app and eBay.
Based on my experience, I think the eBay cookie sent to you is not being sent back to the eBay server. Fiddler will show whether or not that is the case.
Another thing: the response cookie you receive should be sent back with the next request, by using the same CookieContainer.
Note that CookieContainer has a bug in its .Add(Cookie) and .GetCookies(uri) methods. You may not be using them directly, but internal code might.
See the details and fix here:
http://dot-net-expertise.blogspot.com/2009/10/cookiecontainer-domain-handling-bug-fix.html
CallMeLaNN