.AUTH Cookie Gets Dropped During Forms Authentication in ASP.NET - C#

I have been stuck on a .AUTH cookie bug for a while now with a website that uses ASP.NET forms authentication to log users in. The site requests authentication information from a remote server. Our website does not work with the response from the production server, but the response from the staging server works just fine. These servers are supposed to be identical.
I looked into the problem for a while and found that, for some reason, the forms authentication process drops the .AUTH cookie when the response from the production authentication server is used and a GET request is sent to Microsoft-IIS/10.0. I have attached the request headers and responses for both the production and the staging server. If anyone could suggest a potential cause, that would be great.
This is the working one:
request
response
This is the broken one:
request
response
As you can tell, the requests are pretty much the same, but the second response is missing the .AUTH cookie.
Please let me know what could potentially be causing the issue; I am pretty new to ASP.NET.

Could this be a Cross-Origin Resource Sharing (CORS) issue?
If the domain specified on the cookie is not the same as the back end's domain, then the cookie will not be shared.
You might also find this answer useful if that turns out to be the problem.
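If the cookie's domain does turn out to be the mismatch, one option is to issue the forms authentication cookie with an explicit domain. A minimal sketch, assuming a standard FormsAuthentication login inside a page or controller; the user name and ".example.com" domain are placeholders:
// Issue the .AUTH cookie with an explicit Domain so the browser will send it back
// to the host it is actually talking to (requires System.Web and System.Web.Security).
HttpCookie authCookie = FormsAuthentication.GetAuthCookie("userName", false);
authCookie.Domain = ".example.com";   // placeholder - must match the site's host
Response.Cookies.Add(authCookie);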

Related

Request using .NET HttpClient does not receive a response while via browser it does - possible SSL issue?

I'm developing an app that accesses a website over HTTPS.
Requesting manually via an incognito window works.
Requesting via the app using HttpClient, it never receives a response.
I confirmed, via Fiddler, that the headers are exactly the same between the manual and the app request. Via the browser, a response comes back within 1 second. Via the app, the request just hangs until it eventually times out with an HTTP 504 error. It is the exact same URL and the exact same HTTP headers, checked one by one.
It is not a deadlock issue, because in that case Fiddler would at least show the request as completed. Instead, the request stays open until it errors out with HTTP 504.
Is it possible that the problem lies in the SSL handshake? That is where I noticed differences (in Fiddler). Both are TLS 1.2 (version 3.3). The request from the browser has a SessionID, while in the one from the app the SessionID is empty. It is also missing some other data, such as ALPN.
This is the only thing I can think of, because the request is otherwise exactly the same.
Any help is appreciated!
Thank you.
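One thing worth ruling out (just an assumption, since the question doesn't show the client code) is the TLS version the .NET client offers; on older .NET Framework versions the default protocol list may not include TLS 1.2, which can make a handshake stall in exactly this way. A rough sketch of pinning TLS 1.2 before the request; the URL is a placeholder:
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class TlsProbe
{
    static async Task Main()
    {
        // Force TLS 1.2 for outgoing connections (relevant on .NET Framework 4.5/4.6,
        // where TLS 1.2 may not be offered by default).
        ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;

        using (var client = new HttpClient())
        {
            client.Timeout = TimeSpan.FromSeconds(30);
            string body = await client.GetStringAsync("https://example.com/");  // placeholder URL
            Console.WriteLine(body.Length);
        }
    }
}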

Preventing Cross-Site Request Forgery Attacks With a Windows Client Application

I am developing a web application with Web API 2.
My client application is a Windows application developed in C# on .NET 4.0.
The client application sends some JSON data to the Web API application, and the application stores the data in a database.
Now the issue is that someone could send requests by some means other than my application and push junk data to the server. I have authentication on the server, but it isn't enough; I need some kind of token to handle this.
After some searching I found this article and read it, but the client there is a web application.
Could I use this method in my Windows client app? How?
Bottom line: You shouldn't need to.
By definition, CSRF attacks can only affect client applications that share cookies across domains. For example, if you visit www.bank.com in your browser and then open another tab to www.evil.com, and www.bank.com does not protect against CSRF, then www.evil.com may be able to POST a form submission to www.bank.com while you are logged in and transfer money by forging the request to the form's action URL on the transfer-money page.
If your client is a Windows application, the HTTP client should not have cookies stored for any other service other than your web API.
Note that the above only applies to when cookies are used as the session management mechanism (i.e. not Kerberos, NTLM, Basic Auth, etc).
"I have authentication on the server but it isn't enough"
This should be enough, as an attacker cannot forge an HTTP request to your API that will be sent along with the victim's cookies; the cookies are kept separate because they live in different web client instances. It is much like being logged into Google in Chrome and then accessing Google in Firefox - you will not share the same logged-in session.
Of course, protect your API with HTTPS so the information is encrypted in transit. Note that this does not protect against decompilation of your source code, which is not easy to prevent. At the end of the day you cannot trust clients that are not under your control. You can make it difficult, but not impossible, for someone to work out or change what is being sent to your API.
Cross-site anti-forgery tokens are a form of authentication. They authenticate the client sending the request: the client has to visit a certain page to get the token from the server, so the request cannot come from a client that has never visited that page and somehow just sends random data to the server.
The purpose of authentication is for the server to authenticate the client (the other way around is also possible, but let's set that aside for the moment). You set up the system so that it is very difficult for others to pretend to be your Windows Forms app. Note that it can be made very difficult, but it is theoretically always possible to fake; the aim is to set up authentication such that an attacker considers an attack impractical.
This authentication should not be mixed up with the authentication that verifies the human user. They are different. An app can provide a UI for human users to log in, but that app may not have been written by you. So you need to authenticate two things:
the request actually comes from your app; if that succeeds, then
the human user is who he claims to be; otherwise
reject the request
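As an illustration of that two-step idea from a desktop client, here is a rough sketch; the base address, the api/token and api/data routes, and the X-Request-Token header are all made up for the example:
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class ApiClient
{
    static async Task Main()
    {
        using (var client = new HttpClient { BaseAddress = new Uri("https://api.example.com/") })
        {
            // Step 1: fetch a token that proves this client talked to the token endpoint first.
            string token = await client.GetStringAsync("api/token");   // hypothetical route

            // Step 2: send the actual data together with that token (plus whatever user auth you use).
            var request = new HttpRequestMessage(HttpMethod.Post, "api/data")   // hypothetical route
            {
                Content = new StringContent("{\"value\":42}", Encoding.UTF8, "application/json")
            };
            request.Headers.Add("X-Request-Token", token);   // hypothetical header name

            HttpResponseMessage response = await client.SendAsync(request);
            Console.WriteLine(response.StatusCode);
        }
    }
}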

How does HttpContext.Current.User.Identity.Name work? And how secure is it?

I am using HttpContext.Current.User.Identity.Name to get the user name of the logged-in user. I would like to know how this works (using NTLM v2 / Kerberos) and how secure it is. Can a user pretend to be someone else?
Basically, from a security point of view, is there something I should be worried about, or anything I should improve?
If you are authenticating using Windows authentication (which, given your mention of NTLM/Kerberos, it appears you are), then what happens is roughly as follows:
IE sends a request with no authentication header to your web server.
IIS refuses the request with a 401 response code and tells the browser which authentication scheme it wants (in this case Negotiate, which tries Kerberos first and then falls back to NTLM).
The Kerberos handshake takes place over multiple connections, and the ticket is validated against AD.
IIS passes the ticket down to ASP.NET which, in the process of building the Request object, populates the principal on the thread assigned to the request with the identity details from the ticket.
When you access HttpContext.User you see the principal for the current thread.
It's secure. It's basically the same authentication used when you connect to a Windows server via file shares or anything else that uses Kerberos. IIS and Windows itself do the vast majority of the work; ASP.NET just gives you a nice way to query the results.
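For reference, the values the ticket ends up populating can be inspected directly; a small sketch from inside any ASP.NET page or controller (assuming Windows authentication is enabled and anonymous access is disabled in IIS; requires System.Web):
// Inspect the identity that Windows authentication populated.
var identity = HttpContext.Current.User.Identity;
if (identity.IsAuthenticated)
{
    string user = identity.Name;                  // e.g. DOMAIN\jsmith
    string scheme = identity.AuthenticationType;  // "Negotiate", "NTLM" or "Kerberos"
}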

FormsAuthentication Fails To Authenticate After Changing Domain

I'm having an interesting problem with FormsAuthentication on a client project I'm trying to assist with. Here's the problem:
The domain of the web app was changed from .companyA.com to .companyB.com, and I set up an IIS redirect to send anyone attempting to go to .companyA.com over to .companyB.com. That works just fine.
Now I can't log in to the site. I did some digging and found that authCookies are defined in the web.config, so I changed the authCookie domains to match .companyB.com. I was still not able to log in.
I did some more digging and found that a SQL Reporting server was set up. I changed the domain in the config files of the reporting server to match .companyB.com. Still can't log in.
For general troubleshooting of this type of issue, is there anywhere else I could look? I've been put on this as a firefight, so I have limited domain knowledge, and I can't open the solution in VS because the only VS they have available is too old for the solution.
EDIT: OK, after further digging, I found out that the user failing to authenticate wasn't the main issue. There was a problem connecting to the reporting service DB, and the exception was being swallowed; thankfully it showed up in the event viewer. Unfortunately this still leaves me with problems: I can't figure out why SQL won't authenticate me anymore, or how changing domain names could possibly lead to that.
Debugging this is really simple. Just get any HTTP debugger (Fiddler will do) and run your application. You will see a list of requests; pay attention to the response that sets the auth cookie and then to what happens in the subsequent requests.
You will probably see that the cookie is set but goes missing further on. This could be because the cookie is issued for domain A (you will see the cookie's domain in the debugger) and the browser is not delivering it to domain B (which is exactly what the browser is supposed to do; it will never carry cookies over to other domains).
Either way, an HTTP debugger will be a great help here.
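If scripting the check is easier than running a debugger in your environment, the same information can be pulled out in code; a rough sketch where the login URL and form field names are placeholders for whatever the site actually posts:
using System;
using System.Collections.Generic;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class AuthCookieDump
{
    static async Task Main()
    {
        var jar = new CookieContainer();
        using (var client = new HttpClient(new HttpClientHandler { CookieContainer = jar }))
        {
            // Replay the login POST against the new domain (placeholder URL and fields).
            var form = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["UserName"] = "user",
                ["Password"] = "pass"
            });
            await client.PostAsync("https://www.companyB.com/Account/Login", form);

            // Print every cookie the server issued and, crucially, the domain each is scoped to.
            foreach (Cookie c in jar.GetCookies(new Uri("https://www.companyB.com/")))
                Console.WriteLine($"{c.Name}  domain={c.Domain}  path={c.Path}");
        }
    }
}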

Easiest way to get web page source code from pages that require logins -- C#

So I play an online game that's web based and I'd like to automate certain things in it using C#. The problem is that I can't simply use WebClient.DownloadData() because I need to be logged in to actually receive the source. The other alternative was to use the built-in web browser control, but that doesn't give me access to the source code. Any suggestions?
I don't think NetworkCredentials will work in all cases. This only works with "Basic" or "Negotiate" authentication.
I've done this before with an internal website for some load testing, but it sounds like you are trying to "game" the game. For that reason I won't go into details, but the login to the site is probably being done in the form of an HTTP POST when you hit the login button.
You'd have to trap the POST request and replicate it in your code, and make sure your implementation maintains the session state as well, because if the game site is written well at all, it will make sure the current session has logged in before doing anything game-related.
You can set the login credentials on the WebClient using its Credentials property before calling DownloadData:
// The WebClient presents these credentials when the server issues an HTTP authentication challenge.
WebClient client = new WebClient();
client.Credentials = new NetworkCredential("username", "password");
byte[] data = client.DownloadData("http://example.com/page");   // placeholder URL
EDIT: As mjmarsh points out, this will only work for sites that use a challenge-response method of authentication as part of a single request (I'm so used to dealing with this at work that I hadn't considered the other types!). If the site uses forms authentication (or indeed any other form of authentication), this method will not work, because the authentication is not part of a single request - multiple requests are needed, and you will have to handle them yourself.
Network credentials will not work, as mjmarsh has already pointed out.
While web scraping we come across a lot of pages where a login is needed. One approach I use is to install Fiddler and monitor the POST and GET traffic while manually logging in to the site. This shows you how the browser performs the login. Then you need to recreate the same process in code.
For example, most web sites use a cookie to mark the session as authenticated. So you can POST the UserName and Password to the web site with your credentials and record the cookie. That cookie can then be used to access any further pages on the site.
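A rough sketch of that flow with HttpClient; the URLs and form field names are placeholders that you would replace with whatever the captured login POST actually contains:
using System;
using System.Collections.Generic;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class Scraper
{
    static async Task Main()
    {
        // The CookieContainer records the session cookie set by the login response.
        var handler = new HttpClientHandler { CookieContainer = new CookieContainer() };
        using (var client = new HttpClient(handler))
        {
            var loginForm = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["username"] = "me",
                ["password"] = "secret"
            });
            await client.PostAsync("https://game.example.com/login", loginForm);   // placeholder URL

            // The recorded cookie is sent automatically, so this page now returns the logged-in source.
            string html = await client.GetStringAsync("https://game.example.com/members/page");
            Console.WriteLine(html.Length);
        }
    }
}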
Please check the following link to learn more about advanced web scraping:
http://krishnan.co.in/blog/post/Web-Scraping-Yahoo-Mail.aspx
In that blog post you will find how to authenticate to a Yahoo account and then read the page after authentication.
