I have a WPF app (it could just as well be any WinForms app) that tries to log in to a standard MVC 5 website using an HttpClient.
Normally I can log in successfully with a call to PostAsync(), passing the UserName and Password parameters in an HttpContent.
However, when I add [ValidateAntiForgeryToken] to my controller's Login (POST) action, the PostAsync() call fails with an Internal Server Error.
I have tried collecting the "__RequestVerificationToken" from a simple GET request and sending it with my POST request by adding it to the POST parameters, the request headers, or the HttpClientHandler's CookieContainer (or any combination of the three), but I still get error 500 from the server.
I know it can (apparently) be done with HttpWebRequest, but I don't know what I'm missing when using an HttpClient. I also don't know what exactly went wrong on the server side, or how to check that, since the code never reaches my controller method.
Has anyone else tried this, by any chance?
EDIT 1:
I'm adding the raw data sent by the browser for both GET and POST:
GET http://localhost:57457/Account/Login HTTP/1.1
Accept: text/html, application/xhtml+xml, */*
Referer: http://localhost:57457/Account/Login
Accept-Language: en-US
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
DNT: 1
Host: localhost:57457
Cookie: NavigationTreeViewState=%5b%7b%27N0_1%27%3a%27T%27%2c%27N0%27%3a%27T%27%7d%2c%27N0_1_2%27%2c%7b%7d%5d; style=default; __RequestVerificationToken=Bak42Ga5sHJitYlmut6OgvmqXNmP7kKQRNaMSsLMAUh86iHGGmz5pnNfz_soKu46Wax9sG23arPOTnSh1bvaWyWqQ9NH4GJxFmendW8VFTg1
RESPONSE:
HTTP/1.1 200 OK
Cache-Control: private
Content-Type: text/html; charset=utf-8
Content-Encoding: gzip
Vary: Accept-Encoding
Server: Microsoft-IIS/8.0
X-AspNetMvc-Version: 5.2
X-Frame-Options: SAMEORIGIN
X-AspNet-Version: 4.0.30319
X-SourceFiles: =?UTF-8?B?RDpcRUJTLkNvZGVcUHJvamVjdHNcQ1ZSUE9TX1dlYlNpdGVcQ1ZSUE9TX1dlYlNpdGVcQWNjb3VudFxMb2dpbg==?=
X-Powered-By: ASP.NET
Date: Thu, 04 Dec 2014 10:00:00 GMT
Content-Length: 1734
[View page content]
POST http://localhost:57457/Account/Login HTTP/1.1
Accept: text/html, application/xhtml+xml, */*
Referer: http://localhost:57457/Account/Login
Accept-Language: en-US
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko
Content-Type: application/x-www-form-urlencoded
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
Content-Length: 180
DNT: 1
Host: localhost:57457
Pragma: no-cache
Cookie: NavigationTreeViewState=%5b%7b%27N0_1%27%3a%27T%27%2c%27N0%27%3a%27T%27%7d%2c%27N0_1_2%27%2c%7b%7d%5d; style=default; __RequestVerificationToken=Bak42Ga5sHJitYlmut6OgvmqXNmP7kKQRNaMSsLMAUh86iHGGmz5pnNfz_soKu46Wax9sG23arPOTnSh1bvaWyWqQ9NH4GJxFmendW8VFTg1
__RequestVerificationToken=Bak42Ga5sHJitYlmut6OgvmqXNmP7kKQRNaMSsLMAUh86iHGGmz5pnNfz_soKu46Wax9sG23arPOTnSh1bvaWyWqQ9NH4GJxFmendW8VFTg1&UserName=test&Password=test
RESPONSE:
HTTP/1.1 400 Bad request (user/password for testing purposes only)
Cache-Control: private
Content-Type: text/html; charset=utf-8
Server: Microsoft-IIS/8.0
X-AspNetMvc-Version: 5.2
X-Frame-Options: SAMEORIGIN
X-AspNet-Version: 4.0.30319
X-SourceFiles: =?UTF-8?B?RDpcRUJTLkNvZGVcUHJvamVjdHNcQ1ZSUE9TX1dlYlNpdGVcQ1ZSUE9TX1dlYlNpdGVcQWNjb3VudFxMb2dpbg==?=
X-Powered-By: ASP.NET
Date: Thu, 04 Dec 2014 10:00:00 GMT
Content-Length: 4434
[View page content]
EDIT 2:
This is what my app sends for GET and POST:
GET http://localhost:57457/Account/Login HTTP/1.1
Host: localhost:57457
Connection: Keep-Alive
POST http://localhost:57457/Account/Login HTTP/1.1
Content-Type: application/x-www-form-urlencoded
Host: localhost:57457
Cookie: __RequestVerificationToken=df9nBSP_J1IiLrv84RwrkmvbYBrnH4iqv97wRvz6HMPLWBhgI4XzGeAFcschovHwD8mTtHU6xrmVxz1Ku96_BaoB79le_vLTcrgGemU4gjc1
Content-Length: 163
Expect: 100-continue
__RequestVerificationToken=df9nBSP_J1IiLrv84RwrkmvbYBrnH4iqv97wRvz6HMPLWBhgI4XzGeAFcschovHwD8mTtHU6xrmVxz1Ku96_BaoB79le_vLTcrgGemU4gjc1&UserName=test&Password=test
And finally this is the error:
[HttpAntiForgeryException (0x80004005): Validation of the provided anti-forgery token failed. The cookie "__RequestVerificationToken" and the form field "__RequestVerificationToken" were swapped.]
Thanks!
You'd probably need to include the ASP.NET session ID cookie with your requests.
EDIT:
OK, you're right, it is not the session ID, but you do need two tokens to send back to your POST action.
I think what you're doing wrong is using the same value for both tokens; they should be different, although both tokens are named __RequestVerificationToken.
The token grabbed from the cookie should be sent back as a cookie, and the token grabbed from the form field goes back as a form field.
It's because your POST from your application is missing the anti-forgery token generated by HtmlHelper.AntiForgeryToken().
You'll need to load a page that renders HtmlHelper.AntiForgeryToken() in its view from your WPF application. Then take the value of the hidden input element named __RequestVerificationToken and attach it to your login POST request to the server.
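Roughly, the whole flow with HttpClient could look like the sketch below. The URL and field names come from the question; the regex-based token extraction is just an assumption about how the hidden input is rendered (an HTML parser would be more robust). The cookie half of the token pair is returned automatically by the CookieContainer, while the form half goes into the POST body:

using System;
using System.Collections.Generic;
using System.Net;
using System.Net.Http;
using System.Text.RegularExpressions;
using System.Threading.Tasks;

class AntiForgeryLoginSketch
{
    static async Task Main()
    {
        var cookies = new CookieContainer();
        var handler = new HttpClientHandler { CookieContainer = cookies, UseCookies = true };
        using (var client = new HttpClient(handler))
        {
            const string loginUrl = "http://localhost:57457/Account/Login";

            // 1. GET the login page. The anti-forgery *cookie* token lands in the
            //    CookieContainer and is sent back automatically with the POST.
            string html = await client.GetStringAsync(loginUrl);

            // 2. Extract the *form field* token from the hidden input rendered by
            //    Html.AntiForgeryToken(). A real HTML parser would be more robust.
            Match match = Regex.Match(html,
                @"name=""__RequestVerificationToken""[^>]*value=""([^""]+)""");
            string formToken = match.Groups[1].Value;

            // 3. POST the form token together with the credentials. Do not copy the
            //    form token into the cookie (or vice versa); the two values differ.
            var form = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["__RequestVerificationToken"] = formToken,
                ["UserName"] = "test",
                ["Password"] = "test"
            });
            HttpResponseMessage response = await client.PostAsync(loginUrl, form);
            Console.WriteLine(response.StatusCode);
        }
    }
}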
Related
The context is an ASP.NET Core 2.1 website hosted in a Docker container over plain HTTP, behind an Nginx reverse proxy exposing HTTPS on port 443 only.
The website is accessed from the outside over HTTPS; it redirects to an STS website over HTTPS, which redirects back to /signin-wsfed over HTTPS.
However, the Location returned in the response from /signin-wsfed is HTTP.
Here is the request:
POST https://core-mydocker.####/signin-wsfed HTTP/1.1
Accept: image/gif, image/jpeg, image/pjpeg, application/x-ms-application, application/xaml+xml, application/x-ms-xbap, */*
Referer: https://sts-mydocker.####/Pages/Email/Default.aspx?wtrealm=https%3a%2f%2fcore-mydocker.####%2f&wa=wsignin1.0&wreply=https%3a%2f%2fcore-mydocker.####%2fsignin-wsfed&wctx=#####
Accept-Language: fr-FR,fr;q=0.8,en-GB;q=0.6,en;q=0.4,ja;q=0.2
User-Agent: Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.2; WOW64; Trident/7.0)
Content-Type: application/x-www-form-urlencoded
Accept-Encoding: gzip, deflate
Host: core-mydocker.####
Content-Length: 10612
Connection: Keep-Alive
Cache-Control: no-cache
and the response:
HTTP/1.1 302 Found
Server: nginx/1.12.2
Date: Thu, 21 Feb 2019 09:39:34 GMT
Content-Length: 0
Connection: keep-alive
Cache-Control: no-cache
Pragma: no-cache
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Location: http://core-mydocker.####/Authenticate
Set-Cookie: .AspNetCore.Correlation.WsFederation.######=; expires=Thu, 01 Jan 1970 00:00:00 GMT; path=/; HTTPOnly; Securesignin-wsfed; httponly
Set-Cookie: FedAuth=#######=/; HTTPOnly; Secure; httponly
Access-Control-Allow-Credentials: true
Access-Control-Allow-Origin: https://mydocker.####
Since HTTP is inaccessible from the outside, this causes an error.
How does Microsoft.AspNetCore.Authentication.WsFederation determine the response location, considering that every parameter in the earlier requests (wtrealm, wreply, ...) is HTTPS?
I found the solution to this problem; it is related to Forwarded Headers.
It turns out that configuring Nginx as follows:
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
and then configuring ASP.NET Core as follows:
app.UseForwardedHeaders(new ForwardedHeadersOptions
{
    ForwardedHeaders = ForwardedHeaders.XForwardedFor | ForwardedHeaders.XForwardedProto
});
is not enough. The resolution is provided in this post: proto headers not working
It turns out that, by default, the KnownNetworks collection only trusts 127.0.0.1 (loopback). In a Docker environment where each container has its own IP, that is bound to be incorrect, and therefore the Forwarded Headers are ignored.
As suggested by the accepted answer, changing the code to the following fixed the problem:
var forwardingOptions = new ForwardedHeadersOptions()
{
    ForwardedHeaders = ForwardedHeaders.XForwardedFor | ForwardedHeaders.XForwardedProto
};
forwardingOptions.KnownNetworks.Clear(); // it's loopback by default
forwardingOptions.KnownProxies.Clear();
app.UseForwardedHeaders(forwardingOptions);
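For completeness, here is roughly where this sits in Startup.Configure. This is only a sketch: the placement relative to authentication is an assumption based on general ASP.NET Core middleware guidance rather than something stated above (UseForwardedHeaders needs to run before anything that reads Request.Scheme):

using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.HttpOverrides;

public class Startup
{
    public void Configure(IApplicationBuilder app)
    {
        var forwardingOptions = new ForwardedHeadersOptions
        {
            ForwardedHeaders = ForwardedHeaders.XForwardedFor | ForwardedHeaders.XForwardedProto
        };
        // Containers are not on the loopback network, so the default KnownNetworks
        // entry (127.0.0.1) would cause the X-Forwarded-* headers to be ignored.
        forwardingOptions.KnownNetworks.Clear();
        forwardingOptions.KnownProxies.Clear();

        // Run this before anything that reads Request.Scheme, so the WS-Federation
        // handler sees "https" and emits an https Location on /signin-wsfed.
        app.UseForwardedHeaders(forwardingOptions);

        app.UseAuthentication();
        app.UseMvc();
    }
}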
I have a C# .NET application that writes a zipped file back to the client for download. However, the browser either does not receive the file or rejects it, without showing any notification. I have tried this in both Firefox and Chrome.
I have captured the request and response from the client and server using Fiddler:
Request:
POST http://localhost:62526/Reports/_Report_RewardLetters HTTP/1.1
Host: localhost:62526
Connection: keep-alive
Content-Length: 228
Accept: */*
Origin: http://localhost:62526
X-Requested-With: XMLHttpRequest
User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/48.0.2564.109 Safari/537.36
Content-Type: application/x-www-form-urlencoded; charset=UTF-8
DNT: 1
Referer: http://localhost:62526/Reports
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.8
Response:
HTTP/1.1 200 OK
Cache-Control: private
Transfer-Encoding: chunked
Content-Type: application/octet-stream
Server: Microsoft-IIS/10.0
content-dispostion: filename=Letter.zip
X-AspNet-Version: 4.0.30319
X-SourceFiles: =?UTF-8?B?QzpcVXNlcnNcc2FpbmlfaFxTb3VyY2VcUmVwb3NcUmVzZWFyY2hPZmZpY2VEYXNoYm9hcmRcUmVzZWFyY2hPZmZpY2VEYXNoYm9hcmRcUmVwb3J0c1xfUmVwb3J0X1Jld2FyZExldHRlcnM=?=
X-Powered-By: ASP.NET
Date: Thu, 11 Feb 2016 23:13:22 GMT
....Truncated the file contents.....
My code:
Response.Clear();
Response.ClearHeaders();
Response.ClearContent();
Response.AddHeader("content-dispostion", "filename=MyFile.zip");
Response.ContentType = "application/octet-stream";
Response.Flush();
Response.WriteFile(myfile);
Response.Flush();
Response.End();
I have tried numerous combinations of Response.Flush(), Response.Clear(), HttpContext.ApplicationInstance.CompleteRequest(), Response.BinaryWrite(), Response.TransmitFile(), etc., but none of them seem to work. Additionally, my code includes the necessary checks to confirm that the file exists.
From the Fiddler captures, I think there is something wrong with the encoding or with the server's response for the file being sent to the client, which causes the browser to reject the file without any notification.
Thanks for your help!
Just a thought: do you need the Response.Flush() calls? Setting your headers, writing your file, and then calling Response.End() should be enough.
Also, set the ContentType to "application/zip, application/octet-stream".
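For what it's worth, here is a rough sketch of what I mean, assuming myfile is the full path to the generated .zip on disk (as in your code); the header value and file name are illustrative only:

Response.Clear();
Response.ClearHeaders();
Response.ContentType = "application/octet-stream";
// Full header name plus an explicit "attachment" disposition.
Response.AddHeader("Content-Disposition", "attachment; filename=MyFile.zip");
Response.TransmitFile(myfile);  // streams the file without buffering it all in memory
Response.End();                 // no intermediate Flush() calls needed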
I had made a mistake that was causing all the fuss: the view was using an Ajax form to make the request instead of a normal HTML form.
I fixed the problem by changing my controller and view according to the information here:
http://geekswithblogs.net/rgupta/archive/2014/06/23/downloading-file-using-ajax-and-jquery-after-submitting-form-data.aspx
Download Excel file via AJAX MVC
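For context on why the Ajax form was the problem: the browser will not save a file that comes back on an XMLHttpRequest response, so the download has to be triggered by a regular form post or a redirect. A hypothetical MVC action along those lines (the action name and file path are invented for illustration) could look like this:

using System.Web.Mvc;

public class ReportsController : Controller
{
    // Reached via a plain Html.BeginForm post (or window.location), not an Ajax form,
    // so the browser treats the response as a file download.
    [HttpPost]
    public ActionResult RewardLetters()
    {
        var path = Server.MapPath("~/App_Data/Letter.zip"); // hypothetical location
        return File(path, "application/octet-stream", "Letter.zip");
    }
}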
The request headers are missing Accept-Encoding: gzip, deflate, but the response headers include Content-Encoding: gzip. Does this cause the compression handling to fail? If so, how can I avoid it?
Request URL: http://something.com/something.js
Request Method: GET
Status Code: 200 OK
Request Headers
Accept: */*
Referer: somthing.comsomthing.aspx
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/38.0.2125.111 Safari/537.36
X-DevTools-Emulate-Network-Conditions-Client-Id: 2D3ED9B5-95BD-4984-9EEE-405C2889F11E
Response Headers
Accept-Ranges: bytes
Content-Encoding: gzip
Content-Length: 884
Content-Type: application/x-javascript
Date: Tue, 28 Oct 2014 11:09:13 GMT
ETag: "0ac99ce3e9fcf1:0"
Last-Modified: Mon, 14 Jul 2014 08:37:12 GMT
Server: Microsoft-IIS/8.0
Vary: Accept-Encoding
X-Powered-By: ASP.NET
From RFC 7231:
A request without an Accept-Encoding header field implies that the user agent has no preferences regarding content-codings. Although this allows the server to use any content-coding in a response, it does not imply that the user agent will be able to correctly process all encodings.
In short: if you specify no Accept-Encoding, it's legal (though ill-advised) for the server to send you compressed content. There doesn't appear to be a solid, reliable way to tell a web server that it should definitely not compress. You can try Accept-Encoding: *;q=0 or Accept-Encoding: identity, but support for this is not universal across web servers, and proxies can mess things up as well.
In the end, you are probably better off simply handling compressed content if it comes back as such; there is no good reason for a client not to support compression, and libraries for this are freely available.
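If the client is your own code rather than a browser, the usual approach is to let the HTTP library undo the compression for you. A sketch in .NET (not taken from the question, purely for illustration) using HttpClientHandler.AutomaticDecompression:

using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class GzipClientSketch
{
    static async Task Main()
    {
        var handler = new HttpClientHandler
        {
            // Sends Accept-Encoding: gzip, deflate and transparently decompresses
            // any compressed response body before it reaches your code.
            AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
        };
        using (var client = new HttpClient(handler))
        {
            string js = await client.GetStringAsync("http://something.com/something.js");
            Console.WriteLine(js.Length);
        }
    }
}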
I have a WCF contract like this:
[OperationContract]
[WebInvoke(Method = "POST")]
string Import_CSV();
And then the Import_CSV() method successfully accepts a file upload from HttpContext.Current.Request.Files...
The Import_CSV() method returns a string with message of success or the specific error that occurred.
However, if there are multiple errors during the server-side operation I perform on the uploaded file, my goal is to generate a CSV file and serve it to the user to download.
So I create the CSV content, and now I want to offer it to the user as a download.
Setting the StatusCode to (int)HttpStatusCode.OK, I return an empty string (as on success) and use Response.Write to serve the newly generated CSV file like this:
var context = HttpContext.Current;
context.Response.ContentType = "text/csv";
context.Response.AddHeader("Content-Disposition", "attachment;filename=ImportErrors.csv");
context.Response.AddHeader("Pragma", "no-cache");
context.Response.AddHeader("Cache-Control", "no-cache");
foreach (var line in csvLines)
{
    context.Response.Write(line);
    context.Response.Write(Environment.NewLine);
    context.Response.Flush();
}
context.ApplicationInstance.CompleteRequest();
In Internet Explorer everything seems to work as planned, but Firefox and Chrome don't seem to know what to do with this data. I have tried setting different HTTP status codes and different MIME types for the content, but I suspect I am headed down the wrong road.
Any suggestions?
I asked about a smaller piece of this puzzle recently on Stack Overflow here: Limit of 88 bytes on response.write?
The answer was very helpful, but now my question is about WCF: returning both text and a file with this service call, and making it work in more than just IE. Thanks!
Headers when used in chrome:
request:
POST /services/PTService.svc/Import_CSV HTTP/1.1
Host: inf18
Connection: keep-alive
Content-Length: 565
Origin: http://inf18
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.97 Safari/537.11
Content-Type: multipart/form-data; boundary=----WebKitFormBoundarykpa0b9lHil712wdL
Accept: */*
Referer: http://inf18/dashboard/ManageRecipient.aspx
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
Cookie: ASP.NET_SessionId=o50tom0nkjfmglumcmrbtlds
response:
HTTP/1.1 200 OK
Cache-Control: no-cache
Pragma: no-cache
Transfer-Encoding: chunked
Content-Type: text/csv; charset=utf-8
Content-Encoding: gzip
Vary: Accept-Encoding
Server: Microsoft-IIS/7.5
Content-Disposition: attachment;filename=ImportErrors.csv
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
Date: Tue, 18 Dec 2012 18:30:37 GMT
Headers when used in IE: (IE9 Compat. view, IE9 Standards)
request:
POST /services/PTService.svc/Import_CSV HTTP/1.1
Accept: application/x-ms-application, image/jpeg, application/xaml+xml, image/gif, image/pjpeg, application/x-ms-xbap, application/vnd.ms-excel, application/vnd.ms-powerpoint, application/msword, */*
Referer: http://inf18/dashboard/ManageRecipient.aspx
Accept-Language: en-US
User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E; MDDR)
Content-Type: multipart/form-data; boundary=---------------------------7dc180222c0fb0
Accept-Encoding: gzip, deflate
Host: inf18
Content-Length: 598
Connection: Keep-Alive
Pragma: no-cache
Cookie: ASP.NET_SessionId=e2iwodwyy2muxoj4zhpisl5k
response:
HTTP/1.1 200 OK
Cache-Control: no-cache
Pragma: no-cache
Transfer-Encoding: chunked
Content-Type: text/csv; charset=utf-8
Content-Encoding: gzip
Vary: Accept-Encoding
Server: Microsoft-IIS/7.5
Content-Disposition: attachment;filename=ImportErrors.csv
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
Date: Tue, 18 Dec 2012 18:33:34 GMT
Have you tried adding the following lines before you write to the Response?
/******************************************/
HttpContext.Current.Response.ClearHeaders();
HttpContext.Current.Response.Clear();
/******************************************/
HttpContext.Current.Response.ContentType = "text/csv";
HttpContext.Current.Response.AddHeader("Content-Disposition", "attachment;filename=ImportErrors.csv");
//rest of your code sample
I'm downloading a site for its content using a web crawler I wrote with the Microsoft WebBrowser control.
Part of the site's content is sent only after some kind of verification from the client side; my guess is that it involves cookies / session cookies.
When I try to download the page from my crawler, I see (with Fiddler's help) that the inner link for the AJAX call sends 'false' for one of its parameters, and the data is not received.
When I perform the same action from any browser, Fiddler shows that the parameter is sent as '1'.
After a day of testing, any lead would be appreciated: is there a way to manipulate this parameter? Plant cookies? Any other ideas?
Following khunj's answer, I'm adding the headers from IE and from my WebBrowser control.
In both captures I removed fields that have the same value.
From IE:
GET /feed/prematch/1-1-234562-8527419630-1-2.dat HTTP/1.1
x-requested-with: XMLHttpRequest
Referer: http://www.mySite.com/ref=12345
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/4.0 (compatible; MSIE 8.0)
Connection: Keep-Alive
Cookie: __utma=1.1088924975.1299439925.1299976891.1300010848.14;
__utmz=1.1299439925.1.1.utmcsr=(direct)|utmccn=
(direct)|__utmb=2.1.10.1300010848; __utmc=136771054; user_cookie=63814658;
user_hash=58b923a5a234ecb78b7cc8806a0371c5; user_time=1297166428; infobox_8=1;
user_login_id=12345; mySite=5e1c0u8g6qh41o2798ua2bfbi3
HTTP/1.1 200 OK
Date: Sun, 13 Mar 2011 10:07:38 GMT
Server: Apache
Last-Modified: Sun, 13 Mar 2011 10:07:25 GMT
ETag: "26a6d9-19df-49e5a5c9ed140"
Accept-Ranges: bytes
Content-Length: 6623
Cache-Control: max-age=0, no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: Wed, 11 Jan 1984 05:00:00 GMT
Connection: close
Content-Type: text/plain
Content-Encoding: gzip
From WebBrowser:
GET /feed/prematch/1-1-234562-8527419630-false-2.dat HTTP/1.1
x-requested-with: XMLHttpRequest
Referer: http://www.mySite.com/ref=12345
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/4.0 (compatible; MSIE 7.0)
Connection: Keep-Alive
Cookie: __utma=1.1782626598.1299416994.1299974912.1300011023.129;
__utmb=2.1.10.1300011023; __utmz=1.1299416994.1.1.utmcsr=
(direct)|utmccn=(direct)|__utmc=136771054; user_cookie=65192487;
user_hash=6425034050442671103fdd614e4a2932; user_time=1299416986;
user_full_time_zone=37;user_login_id=12345; mySite=q9qlqqm9bunm9siho32tdqdjo0
HTTP/1.1 404 Not Found
Date: Sun, 13 Mar 2011 10:10:33 GMT
Server: Apache
Content-Length: 313
Connection: close
Content-Type: text/html; charset=iso-8859-1
Thanks in advance,
Oz.
Well, the server is obviously treating the request from your crawler differently. Since you already have Fiddler involved, what is different in your request headers when you make the request from IE versus your crawler? The reason I say IE is that the WebBrowser control uses the same engine as IE to do its work.
The way I solved my problem was to use Fiddler as a proxy and define a custom rule so that whenever the PathAndQuery property contains the site address, 'false' is replaced with '1'.
Not the most elegant solution, but it fits my problem.
I learned the most from these 2 pages:
FiddlerScript CookBook
A page that explains the specific CustomRules.js file and the field I needed to edit
Thanks for the help,
Oz.