Basically, what I've been trying to do is download a file from a server. The server automatically sends a redirect, which is fine, but by packet-sniffing a program that does successfully download the file, I've found that the headers for the second request are:
GET /path/to/file.txt
...
Host: server.com
rather than the request my code currently generates (what I thought was the standard form):
GET www.server.com/path/to/file.txt
Using HttpWebRequest in the normal way results in a 500 server error, and, as one would expect, exceptions are thrown when I try to use just the relative path.
Using AllowAutoRedirect does not work in this scenario because the cookies are not handled properly, and even when I handle the redirect manually the same error occurs.
How does one go about doing this (preferably without sockets :D)?
To be honest, I'm really not sure what you are asking, but you mentioned cookie troubles. As a total shot in the dark: are you setting the CookieContainer on your WebRequest?
request.CookieContainer = new CookieContainer();
request.AllowAutoRedirect = true;
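If AllowAutoRedirect keeps mishandling the cookies, a fallback is to follow the redirect yourself while sharing one CookieContainer across both requests. A minimal sketch, assuming firstUrl stands in for the original download URL:

var cookies = new CookieContainer();

// First request: don't auto-follow, so we can capture the Location header.
var first = (HttpWebRequest)WebRequest.Create(firstUrl);
first.CookieContainer = cookies;
first.AllowAutoRedirect = false;

string location;
using (var response = (HttpWebResponse)first.GetResponse())
{
    location = response.Headers["Location"]; // may be a relative path
}

// Resolve a relative Location against the original URI, then re-request
// with the same cookie jar so any session cookie carries over.
var second = (HttpWebRequest)WebRequest.Create(new Uri(first.RequestUri, location));
second.CookieContainer = cookies;
using (var response2 = (HttpWebResponse)second.GetResponse())
using (var stream = response2.GetResponseStream())
{
    // ... read file.txt from the stream ...
}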
I've hit a strange bit of behaviour, and I'm pretty sure it's related to my code rather than the RTC instance I'm working with.
I've got a web request setup and configured:
var cookies = new CookieContainer();
var request = (HttpWebRequest)WebRequest.Create(getCreationFactoryUri);
var xmlString = getRDF.ToString();

request.CookieContainer = cookies;
request.Accept = "application/rdf+xml";
request.Method = "POST";
request.ContentType = "application/rdf+xml";
request.Headers.Add("OSLC-Core-Version", "2.0");
request.Timeout = 40000;
request.KeepAlive = true;

// Write the RDF payload into the request body.
byte[] bytes = Encoding.ASCII.GetBytes(xmlString);
request.ContentLength = bytes.Length;
using (Stream dataStream = request.GetRequestStream())
{
    dataStream.Write(bytes, 0, bytes.Length);
}
This is passed to another method, written based on an RTC example, which uses forms authentication against RTC.
Under the OSLC v2 spec, I'm using a creation factory URL to POST to. I know the URL is fine because I've set up a call using RESTClient in Firefox, added the headers that are needed (Content-Type: application/rdf+xml, Accept: application/rdf+xml, OSLC-Core-Version: 2.0), and used the generated XML that my code is trying to pass. My manual call works perfectly and the ticket is created.
In my logs I captured the response from RTC, which is a list of tickets rather than a response showing my ticket as being created. I can re-create this behavior by doing a GET on the creation factory URL I'm using to create an event ticket.
So although I know I'm sending a POST to the creation factory (I debugged to check that my web request's method was 100% set to 'POST'), RTC returns a list of tickets, and I can only conclude that somewhere along the way my request is being treated as a 'GET'.
As a test I changed my request to use PUT instead of POST. PUT isn't permitted on the creation factory URL, and in testing it indeed throws an error. So I'm totally miffed as to why RTC isn't creating my ticket but is instead treating my request as a GET and returning a list of tickets.
Anyone have any ideas?
Thanks.
If the server is using form authentication, as you state, then I expect what is happening is that the POST results in an HTTP redirection to the authentication form. Even if your other code is handling that authentication (which it sounds like it is), the result of that authentication will be an HTTP redirection to the URL of the original request. However, that redirection is likely to result in a GET to that URL, not the original POST. (Also, I don't believe the redirection after authentication is 100% reliable if your requests are multi-threaded.)
The jazz.net information on form authentication says "After authentication succeeded, you always have to replay the original request at least once to get to the protected resource. More replays may be required if the first replay led to another set of redirections and the original request had a non-GET method."
So if your code receives an authentication challenge, you will need to re-send the original POST.
I believe the reason the RESTClient plug-in in your browser works the first time is that it is sending the cookies from your previous log-in to the RTC web UI in the browser. (I had this experience recently, and also found it very confusing.)
Also, if you are not preserving cookies between requests to RTC in your client app, you will meet an auth challenge on every request. If you do preserve cookies between calls (how you do that will depend on your client library; I'm not familiar with the code in your examples), then in my experience you won't receive an auth challenge on every request. (However, you still need to be able to handle an auth challenge on any request, including POSTs, otherwise the app may fail intermittently if the session times out just before you send a POST.)
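A rough sketch of that detect-and-replay logic, using the names from the question's code. The header check below is how Jazz servers typically signal a form-auth challenge, and Authenticate/RepostTicket are hypothetical helpers standing in for your existing auth code; treat all of this as an assumption to verify against your RTC version:

var response = (HttpWebResponse)request.GetResponse();

// Jazz form auth typically flags a challenge with this response header.
if (response.Headers["X-com-ibm-team-repository-web-auth-msg"] == "authrequired")
{
    response.Close();

    // 1. Authenticate (e.g. POST j_username/j_password to j_security_check),
    //    storing the session cookie in the same CookieContainer.
    Authenticate(cookies); // hypothetical helper

    // 2. Replay the ORIGINAL POST with a fresh request and the same RDF body,
    //    because the post-auth redirect downgrades it to a GET.
    response = RepostTicket(cookies, getCreationFactoryUri, xmlString); // hypothetical helper
}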
I have an app that makes use of a web service and acquires data via JSON. All was working fine for quite a long time, up until the recent discoveries about SSLv3 being vulnerable to man-in-the-middle attacks, after which server owners started turning SSLv3 off for good. My application started to have problems connecting and returned the error "Request was aborted: cannot establish secure SSL/TLS connection". Looking for a solution, I found advice saying I should add this code before creating the web request:
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
ServicePointManager.ServerCertificateValidationCallback = delegate {
    // Note: returning true unconditionally disables certificate validation.
    return true;
};
Unfortunately, no luck here: the app acts the same as before, and I have no clue whether this code does nothing or there is still some problem with the server. The error information is pretty vague, and I have trouble figuring out where things go wrong.
Here is my code
...
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.ContentType = GetRequestContentType();
request.Method = method.ToString();
request.Credentials = GetCredential(url);
request.PreAuthenticate = true;
CookieContainer cookieContainer = new CookieContainer();
request.CookieContainer = cookieContainer;
...
I want to ask how to set Tls12 as the default and make sure that the requests I send use the desired protocol.
If I confirm that my end works fine, is there a way to get more detailed information from the server response and pinpoint the precise reason for the error?
Thanks for all answers and suggestions.
EDIT
The second part of the question is solved. I found this tool, http://www.telerik.com/download/fiddler, which lets you see exactly what is going on with outgoing and incoming data. It can also decrypt SSL connections, and enabling that option makes my application start working. I assume the tool does something that makes communication between my app and the destination host possible, but I still have no idea what that could be, or how to make my app handle these connections properly by itself.
Out of desperation I inspected the whole source code (the part responsible for getting data off the internet was third-party, and as long as it worked there was no reason to change it), and I discovered that the line
request.Credentials = GetCredential(url);
called a method whose body contained
ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3;
So all my attempts to change that value before creating the HttpWebRequest were being overwritten. Changing the SecurityProtocolType to Tls12 makes it all work now.
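For reference, ServicePointManager.SecurityProtocol is a process-wide setting, so the last assignment wins; set it once at application startup, before any request is created. OR-ing several protocols, as in this sketch, keeps older servers reachable (Tls11 and Tls12 require .NET 4.5; the particular combination is just a suggestion):

// Set once at startup, before the 3rd-party code can downgrade it.
ServicePointManager.SecurityProtocol =
    SecurityProtocolType.Tls12 | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls;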
I am hitting up a server with the following code and am encountering a ServerProtocolViolation error:
// Prepare the web request
HttpWebRequest request = (HttpWebRequest) WebRequest.Create(url + queryString);
// execute the request
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Does anyone know how to work around this kind of error?
This error means that the webserver that you're sending the request to isn't conforming to the HTTP standard.
Other than fixing the server or rewriting HttpWebRequest to be more generous, there isn't much you can do.
What URL are you requesting, and what's the text of the exception?
EDIT: If you request the URL in Fiddler, you'll see that the server didn't return any headers. You should contact the owner of the server and complain.
As a workaround, if you run Fiddler while sending the request, Fiddler will fix the response and allow HttpWebResponse to parse it.
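Another commonly cited workaround for this exception (not mentioned above, so verify it fits your case) is relaxing .NET's header parsing via the application config; note that it applies process-wide:

<!-- App.config -->
<configuration>
  <system.net>
    <settings>
      <httpWebRequest useUnsafeHeaderParsing="true" />
    </settings>
  </system.net>
</configuration>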
Just a quick note on another reason that I experienced:
If you configure your request to use a proxy server (i.e. through the HttpWebRequest.Proxy property) and you use the wrong proxy port, there is also a chance you will see this error.
In my case, I configured http://127.0.0.1/ as the proxy but had the actual proxy server running on http://127.0.0.1:808/ instead (i.e. port "808" instead of "80").
If this is the case for you, try using no proxy, or of course configure the correct proxy port.
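A quick sketch of both options (the proxy address is just the example from above):

// Point at the proxy with the correct port spelled out:
request.Proxy = new WebProxy("http://127.0.0.1:808/");

// Or take the proxy out of the equation entirely:
request.Proxy = null;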
I'm narrowing in on an underlying problem related to two prior questions.
Basically, I've got a URL that works just fine when I fetch it manually (paste it into a browser), but produces a different result when I run it through some code (using HttpWebRequest).
The URL (example):
http://208.106.250.207:8192/announce?info_hash=-%CA8%C1%C9rDb%ADL%ED%B4%2A%15i%80Z%B8%F%C&peer_id=01234567890123456789&port=6881&uploaded=0&downloaded=0&left=0&compact=0&no_peer_id=0&event=started
The code:
String uri = BuildURI(); //Returns the above URL
HttpWebRequest req = (HttpWebRequest)HttpWebRequest.Create(uri);
req.Proxy = new WebProxy();
WebResponse resp = req.GetResponse();
Stream stream = resp.GetResponseStream();
... Parse the result (which is an error message from the server claiming the url is incorrect) ...
So, how can I GET from a server given a URL? I'm obviously doing something wrong here, but can't tell what.
Either a fix for my code or an alternative approach that actually works would be fine. I'm not wedded to the HttpWebRequest method at all.
I recommend you use Fiddler to trace both the "paste in web browser" call and the HttpWebRequest call.
Once traced, you will be able to see any differences between them, whether in the request URL, the headers, etc.
It may actually be worth pasting the raw requests from both (obtained from Fiddler) here, if you can't see anything obvious.
Well, the only place they might differ is in the HTTP headers that get transmitted, in particular the User-Agent.
Also, why are you using a WebProxy? That is not really necessary, and it most likely is not used by your browser.
The rest of your code is fine; just make sure you set up the HTTP headers correctly.
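For example, to mimic the browser (the User-Agent string below is purely illustrative; copy whatever your browser actually sends):

req.UserAgent = "Mozilla/5.0 (Windows NT 6.1; rv:12.0) Gecko/20100101 Firefox/12.0";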
I would suggest that you get yourself a copy of WireShark and examine the communication that happens between your browser and the server that you are trying to access. Doing so will be rather trivial using WireShark and it will show you the exact HTTP message that is being sent from the browser.
Then take a look at the communication that goes on between your C# application and the server (again using WireShark) and then compare the two to find out what exactly is different.
If the communication is a pure HTTP GET (i.e. there is no HTTP message body involved) and the URL is correct, then the only two things I can think of are (see the sketch after this list):
make sure that you are sending the right protocol version (i.e. HTTP/1.0 or HTTP/1.1 or whatever it is that you should be sending);
make sure that you are sending all the required HTTP headers correctly, and obviously that you are not sending any HTTP headers that you shouldn't be sending.
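A sketch of both knobs on HttpWebRequest (the values here are assumptions; match whatever your packet capture shows the browser sending):

req.ProtocolVersion = HttpVersion.Version11; // or HttpVersion.Version10
req.Accept = "*/*";                          // mirror the browser's headers here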
There could be something wrong with the URL. Instead of using a string, it's usually better to use an instance of System.Uri:
String url = BuildURI(); //Returns the above URL
Uri uri = new Uri(url);
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(uri);
req.Proxy = new WebProxy();
using (WebResponse resp = req.GetResponse()) {
    using (Stream stream = resp.GetResponseStream()) {
        // whatever
    }
}
I think you need to see exactly what's flowing to your server in the HTTP request. It does sound likely that the headers are interestingly different.
You can introduce some kind of debugging proxy between your request and the server (for example, RAD has such a capability in the box).
I need to log in to a website and perform an action. The website is REST-based, so I can easily log in by doing this (the login info is included as a query string on the URL, so I don't need to set the credentials):
CookieContainer cookieJar = new CookieContainer();
HttpWebRequest firstRequest = (HttpWebRequest) WebRequest.Create(loginUrl);
firstRequest.CookieContainer = cookieJar;
firstRequest.KeepAlive = true;
firstRequest.Method = "POST";
HttpWebResponse firstResponse = (HttpWebResponse)firstRequest.GetResponse();
That works and logs me in. I get a cookie back to maintain the session and it's stored in the cookieJar shown above. Then I do a second request such as this:
HttpWebRequest secondRequest = (HttpWebRequest) WebRequest.Create(actionUrl);
secondRequest.Method = "POST";
secondRequest.KeepAlive = true;
secondRequest.CookieContainer = cookieJar;
WebResponse secondResponse = secondRequest.GetResponse();
And I ensure I assign the cookies to the new request. But for some reason this doesn't appear to work: I get back an error telling me "my session has timed out or expired", and the second request is made right after the first, so it's not a timing issue.
I've used Fiddler to examine the HTTP headers, but I'm finding that difficult since this is HTTPS. (I know I can decrypt it, but that doesn't seem to work well.)
I can take my URLs for this REST service and paste them into Firefox and it all works fine, so it must be something I'm doing wrong and not the other end of the connection.
I'm not very familiar with HTTPS. Do I need to do something else to maintain my session? I thought the cookie would be it, but perhaps there is something else I need to maintain across the two requests?
Here are the headers returned when I send in the first request (except I changed the cookie to protect the innocent!):
X-DB-Content-length: 19
Keep-Alive: timeout=15, max=50
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: text/html; charset=WINDOWS-1252
Date: Mon, 16 Nov 2009 15:26:34 GMT
Set-Cookie: MyCookie stuff goes here
Server: Oracle-Application-Server-10g
Any help would be appreciated, I'm running out of ideas.
I finally got it working after decrypting the HTTP traffic from my program.
The cookie I'm getting back doesn't set the Path attribute, so .NET takes the current path, including the current page, and assigns that as the cookie's path. I.e., if the request was for http://mysite/somepath/somepage.htm, it would set the cookie path to /somepath/somepage.htm. This is a bug, as the path should be assigned "/", which is what all web browsers do. (I hope they fix this.)
After noticing this, I grabbed the cookie, modified the path property, and everything works fine now.
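A sketch of that fix-up, assuming the cookieJar and loginUrl names from the question above. Re-adding a copy of the cookie with a root path is safer than mutating the one inside the container, since the container indexes cookies by path:

// Copy each session cookie into a fresh jar with Path "/".
var fixedJar = new CookieContainer();
foreach (Cookie c in cookieJar.GetCookies(new Uri(loginUrl)))
{
    fixedJar.Add(new Cookie(c.Name, c.Value, "/", c.Domain));
}
// Then use fixedJar (instead of cookieJar) on the second request.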
Anyone else with a problem like this should check out Fiddler. .NET uses the Windows certificate store, so to decrypt HTTPS traffic from your program you will need to follow the instructions here: http://www.fiddler2.com/Fiddler/help/httpsdecryption.asp . You will also need to turn on decryption under the Options\HTTPS tab of Fiddler.
From MSDN:
When a user moves back and forth between secure and public areas, the ASP.NET-generated session cookie (or URL if you have enabled cookie-less session state) moves with them in plaintext, but the authentication cookie is never passed over unencrypted HTTP connections as long as the Secure cookie property is set.
So basically, the cookie can be passed over both HTTP and HTTPS if the 'Secure' property is set to 'false'.
See also: how can I share an asp.net session between http and https