I'm not sure why, but in my ASP.NET project, when I get an exception on GetRequestStream() in the code block below, I cannot move the execution point to another line in the function, as I normally can when I get an exception.
The error is: "Unable to set the next statement to this location. The next statement cannot be set to another function."
Is there something I can do to allow this?
static public CookieContainer login(string user, string pass)
{
    var cookie = new CookieContainer();
    var request = (HttpWebRequest)HttpWebRequest.Create(@"https://www.somesite.com/users/login");
    request.CookieContainer = cookie;
    {
        var postData = string.Format("ref=http://www.somesite.com/&username={0}&password={1}", user, pass);
        request.Method = "POST";
        request.ContentType = "application/x-www-form-urlencoded";
        request.ContentLength = postData.Length;
        using (Stream writeStream = request.GetRequestStream())
        {
            UTF8Encoding encoding = new UTF8Encoding();
            byte[] bytes = encoding.GetBytes(postData);
            writeStream.Write(bytes, 0, bytes.Length);
        }
    }
    var resp = request.GetResponse();
    return cookie;
}
You need to disable Just-in-time debugging.
See: http://msdn.microsoft.com/en-us/library/09yze4a9.aspx
You should try putting a breakpoint at the line where the exception occurs.
Another option is to put in a try/catch in your code, and place a breakpoint in the catch.
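For example, a quick sketch of that around the GetRequestStream() call from the question (using the same variable names; this is just an illustration, not your exact code):
try
{
    using (Stream writeStream = request.GetRequestStream())
    {
        byte[] bytes = new UTF8Encoding().GetBytes(postData);
        writeStream.Write(bytes, 0, bytes.Length);
    }
}
catch (WebException ex)
{
    // Put a breakpoint on the next line; when it is hit you can inspect
    // ex.Status and ex.Response without having to move the current statement.
    System.Diagnostics.Debug.WriteLine(ex.Status);
    throw;
}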
In Visual Studio 2010, this seems to be a security restriction. You can circumvent it by Tools / Options / Debugging / Edit and Continue / Check: Enable while remote debugging or debugging an application running under another user account.
This worked when I was debugging an ASP.NET application.
Where are you trying to place the cursor?
The message seems to say that you are trying to place it in a different function. If you want to place the cursor in the calling function, you might be able to move to the return cookie; statement, press F10 twice, and then move your cursor in the calling function (where you should be by then).
Related
I have the following Silverlight code. It uploads a file (postData) to a backend through the given url (url).
public async Task<HttpFileUploadResponse> PostFile(string url, byte[] postData, string fileName, bool isPDF) {
    HttpWebRequest request = null;
    Uri uri = new Uri(url);
    request = (HttpWebRequest)WebRequest.Create(uri);
    request.Method = "POST";
    request.ContentType = "multipart/form-data";

    using (Stream writeStream = await request.GetRequestStreamAsync()) {
        await writeStream.WriteAsync(postData, 0, postData.Length);
    }

    using (HttpWebResponse response = (HttpWebResponse)await request.GetResponseAsync())
    using (Stream responseStream = response.GetResponseStream())
    using (StreamReader readStream = new StreamReader(responseStream, System.Text.Encoding.UTF8)) {
        return ParseServerResponse(await readStream.ReadToEndAsync());
    }
}
The problem with this method is that it freezes Internet Explorer when the line await request.GetRequestStreamAsync() is reached.
Although similar questions have been resolved, such as:
Application hangs on GetRequestStream() after first request
Why does HttpWebRequest.GetResponse() hang when trying to update an OData service?
I've followed the suggested steps, like setting the ContentLength, but the result is the same.
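For reference, this is roughly what that looked like in my method (just a sketch; it assumes ContentLength is settable on the Silverlight HTTP stack being used):
request.Method = "POST";
request.ContentType = "multipart/form-data";
request.ContentLength = postData.Length; // declare the body size before asking for the stream

using (Stream writeStream = await request.GetRequestStreamAsync()) {
    await writeStream.WriteAsync(postData, 0, postData.Length);
}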
I've also tried changing the URL to one for a different backend that currently works with other Silverlight components we have. Those components also use await request.GetRequestStreamAsync(), and they work perfectly.
I've also tried changing the ContentType. My coworkers have used Fiddler to inspect the request, but apparently nothing is being sent (I suppose because it freezes).
Any idea about what's happening here?
EDIT:
I decompiled AsynCompatLibExtensions.cs and TaskExtensions.cs and copied the GetRequestStreamAsync code into a custom cloned class, with logging added between the instructions.
I recompiled the solution using this class, and the strange thing is that I'm not seeing any logs. So I suppose it's freezing when it enters GetRequestStreamAsync, before it reaches any line of code inside.
Any idea?
I'm calling an API hosted on an Apache server to post data, using HttpWebRequest to perform the POST in C#.
The API is exposed on both a normal HTTP port and a secure (HTTPS) port on the server. When I call the HTTP URL it works perfectly fine. However, when I call the HTTPS URL I get a time-out exception (at the GetRequestStream() call). Any insights? I'm using VS 2010, .NET Framework 3.5 and C#. Here is the code block:
string json_value = jsonSerializer.Serialize(data);
HttpWebRequest request = (HttpWebRequest)System.Net.WebRequest.Create("https://server-url-xxxx.com");
request.Method = "POST";
request.ProtocolVersion = System.Net.HttpVersion.Version10;
request.ContentType = "application/x-www-form-urlencoded";
byte[] buffer = Encoding.ASCII.GetBytes(json_value);
request.ContentLength = buffer.Length;
System.IO.Stream reqStream = request.GetRequestStream();
reqStream.Write(buffer, 0, buffer.Length);
reqStream.Close();
EDIT:
The console program suggested by Peter works fine. But when I add the data (in JSON format) that needs to be posted to the API, it throws an "operation timed out" exception. Here is the code I added to the console-based application that triggers the error:
byte[] buffer = Encoding.ASCII.GetBytes(json_value);
request.ContentLength = buffer.Length;
I ran into the same issue, and it seems to be solved for me now. I went through all my code and made sure to call webResponse.Close() and/or responseStream.Close() for every HttpWebResponse object. The documentation indicates that you can close either the stream or the HttpWebResponse object; calling both is not harmful, so I did. Not closing the responses may cause the application to run out of connections for reuse, and as far as I can observe in my code this affects HttpWebRequest.GetRequestStream.
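As a rough sketch of the pattern I mean (the url variable is just a placeholder):
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
// ... set the method and headers, write the request body ...

HttpWebResponse webResponse = (HttpWebResponse)request.GetResponse();
Stream responseStream = webResponse.GetResponseStream();
using (StreamReader reader = new StreamReader(responseStream))
{
    string body = reader.ReadToEnd();
}
// Closing both is harmless; not closing them can exhaust the pooled connections
// and make a later GetRequestStream() call block or time out.
responseStream.Close();
webResponse.Close();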
I don't know if this will help with your specific problem, but you should consider disposing of some of those objects when you are finished with them. I was doing something like this recently, and wrapping things up in using statements seemed to clean up a bunch of timeout exceptions for me.
using (var reqStream = request.GetRequestStream())
{
    if (reqStream == null)
    {
        return;
    }

    // do whatever
}
Also check these things:
Is the server serving https in your local dev environment?
Have you set up your bindings *.443 (https) properly?
Do you need to set credentials on the request?
Is it your application pool account accessing the https resources or is it your account being passed through?
Have you thought about using WebClient instead?
using (WebClient client = new WebClient())
{
    using (Stream stream = client.OpenRead("https://server-url-xxxx.com"))
    using (StreamReader reader = new StreamReader(stream))
    {
        MessageBox.Show(reader.ReadToEnd());
    }
}
EDIT:
Make a request from a console application:
internal class Program
{
    private static void Main(string[] args)
    {
        new Program().Run();
        Console.ReadLine();
    }

    public void Run()
    {
        var request = (HttpWebRequest)System.Net.WebRequest.Create("https://server-url-xxxx.com");
        request.Method = "POST";
        request.ProtocolVersion = System.Net.HttpVersion.Version10;
        request.ContentType = "application/x-www-form-urlencoded";

        using (var reqStream = request.GetRequestStream())
        {
            // nothing to write for this connectivity test; closing the stream sends an empty body
        }

        using (var response = request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}
Try this:
WebRequest req = WebRequest.Create("https://server-url-xxxx.com");
req.Method = "POST";
string json_value = jsonSerializer.Serialize(data); //Body data
ServicePointManager.Expect100Continue = false;
using (var streamWriter = new StreamWriter(req.GetRequestStream()))
{
    streamWriter.Write(json_value);
    streamWriter.Flush();
    streamWriter.Close();
}
HttpWebResponse resp = req.GetResponse() as HttpWebResponse;
Stream GETResponseStream = resp.GetResponseStream();
StreamReader sr = new StreamReader(GETResponseStream);
var response = sr.ReadToEnd(); //Response
resp.Close(); //Close response
sr.Close(); //Close StreamReader
Also review the URI:
Reserved characters: sending reserved characters in the URI can cause problems: ! * ' ( ) ; : @ & = + $ , / ? # [ ]
URI length: you should not exceed 2000 characters.
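If you do need to send reserved characters in the query string, escaping them keeps the URI valid; a small sketch (the values are just examples):
string email = "user@example.com"; // example value containing reserved characters
string url = "https://server-url-xxxx.com/api?email=" + Uri.EscapeDataString(email);
var req = (HttpWebRequest)WebRequest.Create(url);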
I ran into this, too. I wanted to simulate hundreds of users with a console app. When simulating only one user, everything was fine, but with more users the timeout exception appeared all the time.
The timeout occurs because, by default, ConnectionLimit = 2 for a ServicePoint (i.e. per website).
Very good article to read: https://venkateshnarayanan.wordpress.com/2013/04/17/httpwebrequest-reuse-of-tcp-connections/
What you can do is:
1) create more ConnectionGroups within a ServicePoint, because the ConnectionLimit applies per ConnectionGroup, or
2) simply increase the connection limit.
See my solution:
private HttpWebRequest CreateHttpWebRequest<U>(string userSessionID, string method, string fullUrl, U uploadData)
{
    HttpWebRequest req = (HttpWebRequest)HttpWebRequest.Create(fullUrl);
    req.Method = method;                     // GET PUT POST DELETE
    req.ConnectionGroupName = userSessionID; // We make separate connection groups for each user session. Within a group, connections can be reused.
    req.ServicePoint.ConnectionLimit = 10;   // The default value of 2 per connection group always caused me a "Timeout exception", because a user makes 1-3 concurrent WebRequests within a second.
    req.ServicePoint.MaxIdleTime = 5 * 1000; // (5 sec) default was 100000 (100 sec). Max idle time for a connection within a ConnectionGroup before it is closed instead of reused.

    Log("Statistics: The sum of connections of all connection groups within the ServicePoint: " + req.ServicePoint.CurrentConnections); // just for statistics

    if (uploadData != null)
    {
        req.ContentType = "application/json";
        SerializeToJson(uploadData, req.GetRequestStream());
    }
    return req;
}
/// <summary>Serializes and writes obj to the requestStream and closes the stream. Uses JSON serialization from System.Runtime.Serialization.</summary>
public void SerializeToJson(object obj, Stream requestStream)
{
    DataContractJsonSerializer json = new DataContractJsonSerializer(obj.GetType());
    json.WriteObject(requestStream, obj);
    requestStream.Close();
}
You may want to set the Timeout property; see http://www.codeproject.com/Tips/69637/Setting-timeout-property-for-System-Net-WebClient
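If you are using HttpWebRequest directly rather than WebClient, the equivalent properties are roughly (the values are only examples):
var request = (HttpWebRequest)WebRequest.Create("https://server-url-xxxx.com");
request.Timeout = 30000;          // applies to GetRequestStream()/GetResponse()
request.ReadWriteTimeout = 30000; // applies to reads/writes on the request/response streams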
Ok, I've been racking my brain on this one solo for too long. I've been unable to crack it even with hours spent on this and many other sites.
Essentially, I'm trying to strip some data from a webpage behind a LogIn page using WebRequest/Response. (I have gotten this to work using a WebBrowser control with some layered events which navigate to the different web pages but it's causing some problems when trying to refactor - not to mention it's been stated that using a hidden form to do the work is 'bad practice'.)
This is what I have:
string formParams = string.Format("j_username={0}&j_password={1}", userName, userPass);
string cookieHeader;
WebRequest request = WebRequest.Create(_signInPage);
request.ContentType = "text/plain";
request.Method = "POST";
byte[] bytes = Encoding.ASCII.GetBytes(formParams);
request.ContentLength = bytes.Length;
using (Stream os = request.GetRequestStream())
{
    os.Write(bytes, 0, bytes.Length);
}
WebResponse response = request.GetResponse();
cookieHeader = response.Headers["Set-Cookie"];

WebRequest getRequest = WebRequest.Create(sessionHistoryPage);
getRequest.Method = "GET";
getRequest.Headers.Add("Cookie", cookieHeader);
WebResponse getResponse = getRequest.GetResponse();
try
{
    using (StreamReader sr = new StreamReader(getResponse.GetResponseStream()))
    {
        textBox1.AppendText(sr.ReadToEnd());
    }
}
catch (Exception ex)
{
    MessageBox.Show(ex.Message);
    throw;
}
So far, I'm able to get to the proper page from the first link but when I go to the second, it sends me back to the login page as if I didn't log in.
The problem may lie in the cookies not being captured correctly, but I'm a novice, so maybe I'm just doing it wrong. It captures the cookies sent back from the POST (JSESSIONID and S2V); however, when we go to the GET, the Firefox Web Console shows that the browser sends JSESSIONID, S2V and a SPRING_SECURITY_REMEMBER_ME_COOKIE, which I believe is the cookie set when I check the "Remember Me" box on the login form.
I've tried many different ways of doing this using the resources of SO but I have yet to get to the page I need. So, for the sake of the hair I have left, I've decided to ask for help on good ole SO. (This is one of those things I don't want to let up on - stubborn like that sometimes)
If someone wants the actual address of the site I'm trying to log into, I'd be more than happy to send it to a couple people in a private message.
Here is the code I changed to reflect a suggested answer by Wal:
var request = (HttpWebRequest)WebRequest.Create(sessionHistoryPage);
request.Credentials = new NetworkCredential(userName, userPass);
request.CookieContainer = new CookieContainer();
request.PreAuthenticate = true;
WebResponse getResponse = request.GetResponse();
try
{
    using (StreamReader sr = new StreamReader(getResponse.GetResponseStream()))
    {
        textBox1.AppendText(sr.ReadToEnd());
    }
}
catch (Exception ex)
{
    MessageBox.Show(ex.Message);
    throw;
}
This suggestion, at least the way I implemented it, didn't work.
As Krizz suggested, I changed the code to use a CookieContainer, transferring the cookies from one request to the other; however, the response just gives me back the original login page as if I hadn't logged in.
Are there certain sites that just WILL NOT allow this type of behavior?
Final Solution
The final solution was proposed by Adrian Iftode, who pointed out that the website I'm trying to log in to might not allow authentication without a valid session, so adding a GET at the beginning of the process allowed me to get that cookie.
Thanks for all your help guys!
I was doing some sort of cookie transfer for a website written in PHP.
Clearly you are passing the cookie, but maybe it is like the situation I ran into:
var phpsessid = response.Headers["Set-Cookie"].Replace("; path=/", String.Empty);
The Set-Cookie header contains other related info about the cookie, and possibly instructions for other cookies. I had one cookie with its info (Path) plus the session id, which I needed to send back to the server so the server would know I am the same client that did the GET request.
The new request had to include this cookie:
request.Headers["Cookie"] = phpsessid;
You already do this, but make sure that you send the cookies you receive back to the server.
Considering session and authentication, there are two cookies: one for the session and one for authentication, and some servers/applications might not allow authentication without a valid session. What I mean is that you might need to pass the session cookie too. So the steps would be:
Do first a GET request to obtain the session cookie.
Next do the POST request to authenticate and get the auth cookie.
Use both cookies to navigate to the protected pages.
Also check this question; it doesn't show the entire class, but the idea is to keep the CookieContainer in the same class, add the new cookies from POST/GET requests, and assign it to each new request, like @Krizz answered.
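A minimal sketch of those three steps sharing a single CookieContainer (the URLs and form fields follow the question; everything else is illustrative):
var cookies = new CookieContainer();

// 1. GET the login page first so the server issues a session cookie
var getLogin = (HttpWebRequest)WebRequest.Create(_signInPage);
getLogin.CookieContainer = cookies;
getLogin.GetResponse().Close();

// 2. POST the credentials; the session cookie goes with the request,
//    and the auth cookie comes back into the same container
var login = (HttpWebRequest)WebRequest.Create(_signInPage);
login.CookieContainer = cookies;
login.Method = "POST";
login.ContentType = "application/x-www-form-urlencoded";
byte[] body = Encoding.ASCII.GetBytes(string.Format("j_username={0}&j_password={1}", userName, userPass));
login.ContentLength = body.Length;
using (var s = login.GetRequestStream())
{
    s.Write(body, 0, body.Length);
}
login.GetResponse().Close();

// 3. Navigate to the protected page with both cookies in the container
var history = (HttpWebRequest)WebRequest.Create(sessionHistoryPage);
history.CookieContainer = cookies;
using (var historyResponse = history.GetResponse())
using (var reader = new StreamReader(historyResponse.GetResponseStream()))
{
    Console.WriteLine(reader.ReadToEnd());
}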
Try using CookieContainer, which is a class that keeps the cookie context across several requests. You simply create an instance of it and assign it to each WebRequest.
Therefore, modifying your code:
string formParams = string.Format("j_username={0}&j_password={1}", userName, userPass);
string cookieHeader;
var cookies = new CookieContainer(); // added this line
var request = WebRequest.Create(_signInPage) as HttpWebRequest; // modified line
request.CookieContainer = cookies; // added this line
request.ContentType = "text/plain";
request.Method = "POST";
byte[] bytes = Encoding.ASCII.GetBytes(formParams);
request.ContentLength = bytes.Length;
using (Stream os = request.GetRequestStream())
{
    os.Write(bytes, 0, bytes.Length);
}
request.GetResponse(); // removed some code here, no need to read response manually

var getRequest = WebRequest.Create(sessionHistoryPage) as HttpWebRequest; // modified line
getRequest.CookieContainer = cookies; // added this line
getRequest.Method = "GET";
WebResponse getResponse = getRequest.GetResponse();
try
{
    using (StreamReader sr = new StreamReader(getResponse.GetResponseStream()))
    {
        textBox1.AppendText(sr.ReadToEnd());
    }
}
catch (Exception ex)
{
    MessageBox.Show(ex.Message);
    throw;
}
I'm calling a web service (my own web service) like this:
var request = WebRequest.Create(Options.ServerUri + Options.AccountId + "/integration/trip") as HttpWebRequest;
request.Timeout = 20000; // 20 seconds should be plenty, no need for 100 seconds
request.ContentType = "application/json";
request.Headers.Add(HttpRequestHeader.Authorization, "Basic " + Convert.ToBase64String(Encoding.UTF8.GetBytes(Options.LoginName + ":" + Options.Password)));
request.Method = "POST";
var serializedData = (new JavaScriptSerializer()).Serialize(trip);
var bytes = Encoding.UTF8.GetBytes(serializedData);
request.ContentLength = bytes.Length;
var os = request.GetRequestStream();
os.Write(bytes, 0, bytes.Length);
os.Close();
request.GetResponse();
LoggingAndNotifications.LogAndNotify(string.Format("Success uploading trip: {0}", trip.TripId), false);
return true;
This code is called repeatedly to post new objects. After about 3 calls I start getting timeouts on request.GetResponse().
There are no errors on the server side, and nothing in the Event Log. It feels like "something" stops me from repeatedly hitting the service. What should I look for? Could it be the company firewall? Or is something wrong with my code?
I think the issue is that you are not closing the response. Try editing your code as follows:
var response = request.GetResponse() as HttpWebResponse;
response.Close();
You should close the response as per the example in the doco.
WebRequest myRequest = WebRequest.Create("http://www.contoso.com");
// Return the response.
WebResponse myResponse = myRequest.GetResponse();
// Code to use the WebResponse goes here.
// Close the response to free resources.
myResponse.Close();
Hmm. The doco also says
Any public static (Shared in Visual Basic) members of this type are
thread safe. Any instance members are not guaranteed to be thread
safe.
You should probably use a lock of some kind.
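A minimal sketch of what I mean (the lock object and method signature are made up for illustration):
private static readonly object _uploadLock = new object();

public void PostTrip(HttpWebRequest request, byte[] bytes)
{
    // Serialize access so only one request/response pair is in flight at a time.
    lock (_uploadLock)
    {
        using (var os = request.GetRequestStream())
        {
            os.Write(bytes, 0, bytes.Length);
        }
        using (var response = request.GetResponse())
        {
            // disposing the response releases the pooled connection for reuse
        }
    }
}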
Are you sure this is not caused by server-side bugs?
It seems strange; as far as I know, WebRequest on .NET 4 is based on IOCP at a lower layer. Maybe you can try releasing the web request/response resources after each loop.
Since GetResponse() returns a stream, if you don't read from it the real data may not be transferred from the server to the client side. (I found this when I was trying to parse a response: I used Peek(), and it always returned an invalid value until Read() was called.)
So, try to read it or just close it.
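For example, with the request from the question, something like this (just a sketch):
using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    reader.ReadToEnd(); // drain the body so the connection can go back to the pool
}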
This code is for an outlook plugin. We're trying to POST to a page and are getting this error:
The remote server returned an error: (422) Unprocessable Entity.
The C# code is here:
webClient.Headers.Add("Content-Type", "application/x-www-form-urlencoded");
ASCIIEncoding asciiEncoding = new System.Text.ASCIIEncoding();
Byte[] postData = asciiEncoding.GetBytes("email=e2@email.com&password=hunter2");
char[] resultHTML = asciiEncoding.GetChars(webClient.UploadData("http://url", "POST", postData));
string convertedResultHTML = new string(resultHTML);
Any idea what could be causing this?
POST data must be URL-encoded prior to being sent out on the wire as ASCII if you are sending characters outside the ASCII range. You should try something like:
Byte[] postData = asciiEncoding.GetBytes(HttpUtility.UrlEncode("email=e2@email.com&password=hunter2"));
Because of its limited functionality, I avoid using WebClient and use WebRequest instead. The code below:
does not expect an HTTP 100 status code to be returned,
creates a CookieContainer to store any cookies we pick up,
sets the Content-Length header, and
UrlEncodes each value in the post data.
Give the following a try and see if it works for you.
System.Net.ServicePointManager.Expect100Continue = false;
System.Net.CookieContainer cookies = new System.Net.CookieContainer();

// this first request just ensures we have a session cookie, if one exists
System.Net.WebRequest req = System.Net.WebRequest.Create("http://localhost/test.aspx");
((System.Net.HttpWebRequest)req).CookieContainer = cookies;
req.GetResponse().Close();

// this request submits the data to the server
req = System.Net.WebRequest.Create("http://localhost/test.aspx");
req.ContentType = "application/x-www-form-urlencoded";
req.Method = "POST";
((System.Net.HttpWebRequest)req).CookieContainer = cookies;
string parms = string.Format("email={0}&password={1}",
    System.Web.HttpUtility.UrlEncode("e2@email.com"),
    System.Web.HttpUtility.UrlEncode("hunter2"));
byte[] bytes = System.Text.Encoding.ASCII.GetBytes(parms);
req.ContentLength = bytes.Length;

// perform the POST
using (System.IO.Stream os = req.GetRequestStream())
{
    os.Write(bytes, 0, bytes.Length);
}

// read the response
string response;
using (System.Net.WebResponse resp = req.GetResponse())
{
    if (resp == null) return;
    using (System.IO.StreamReader sr = new System.IO.StreamReader(resp.GetResponseStream()))
    {
        response = sr.ReadToEnd().Trim();
    }
}
// the variable response holds the results of the request...
Credits: Hanselman, Simon (SO Question)
This is the RoR application telling you that you have not formed a request that it can handle; the destination script exists (otherwise you'd see a 404), the request is being handled (otherwise you'd get a 400 error) and it's been encoded correctly (or you'd get a 415 error) but the actual instruction can't be carried out.
Looking at it, you seem to be loading some email information. The RoR application could be telling you that the username and password is wrong, or that the user doesn't exist, or something else. It's up to the RoR application itself.
I think the code itself is good; it's just that the app at the other end isn't happy about doing what you ask it. Are you missing something else in the request information, like a command? (e.g. command=getnetemails&email=e2@email.com&password=hunter2) Are you sure the email/password combination you are passing is good?
See here for more on the 422 error.
Add the below line above your code.
System.Net.ServicePointManager.Expect100Continue = false;
Are you trying to access a page that requires authentication?
It was solved by returning XML instead of just unstructured text on the RoR side.