I'm working on a WPF application that is the desktop counterpart of a website.
The idea is that the application should have the same functionality as the website, but not all features will be available in the first release. Because of this, I want to put a link on the features that are not available yet, and when the user clicks that link I want to take them to the website, to the page for that functionality.
So far I can do this without trouble: using the shell command I can open the default browser and send the request to the resource I need on the website.
Now, the tricky part is that I want to use the credentials the user entered in the desktop application to authenticate with the website, so the user doesn't have to log in again. I was thinking of sending the credentials encrypted in a header, but I don't know how to do this: how can I send a header to the web browser from my application?
Any ideas on how to do this?
BTW, the website is using Forms Authentication.
Thanks.
You could try passing your forms authentication cookie as part of the web request.
Uri uri = new Uri("http://services.mysite.com/document/documentservice.svc");
HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(uri);
// Read the forms authentication cookie from the current request.
// (Note: HttpContext.Current is only available in server-side ASP.NET code.)
HttpCookie cookie = HttpContext.Current.Request.Cookies[FormsAuthentication.FormsCookieName];
Cookie authenticationCookie = new Cookie(
    FormsAuthentication.FormsCookieName,
    cookie.Value,
    cookie.Path,
    HttpContext.Current.Request.Url.Authority);
// Attach the cookie to the outgoing request.
webRequest.CookieContainer = new CookieContainer();
webRequest.CookieContainer.Add(authenticationCookie);
WebResponse myResponse = webRequest.GetResponse();
Stream stream = myResponse.GetResponseStream();
I am working on a download manager and trying to download cookie-protected content using HttpWebRequest. I want to integrate my application with Chrome so that I can get the necessary cookie headers and values from the browser.
But first I need to know whether cookies are required to download a given piece of content, and which cookies they are. I can't find any helpful resource on this topic.
This is what I imagine:
HttpWebRequest req = WebRequest.Create(url) as HttpWebRequest;
// At first: determine whether cookies are necessary.
// If they are: get the required cookie headers.
// Then add the cookies to the request:
CookieContainer cc = new CookieContainer();
Cookie c1 = new Cookie("name1", "value1");
Cookie c2 = new Cookie("name2", "value2");
CookieCollection ccollection = new CookieCollection();
ccollection.Add(c1);
ccollection.Add(c2);
cc.Add(new Uri(url), ccollection);
req.CookieContainer = cc;
// Get response and other stuff...
How can I do these steps?
The cookies required to get content from a server are specified by that server in the HTTP response's "Set-Cookie" header. The generic scenario is:
1. The client makes an HTTP request to the server (this could be a login page, or a download page).
2. The server responds with an HTTP response that contains "Set-Cookie" header(s).
3. The client remembers all those cookies.
4. The client uses the cookies stored in step 3 for all subsequent requests to the same server.
Now, considering your scenario of integrating with Chrome, I imagine that the initial requests (steps 1 to 3) will not be done by your application, but by Chrome itself. The cookies will be stored in Chrome's cookie store. So what your application needs to do is get from Chrome all the cookies for the domain you want to download from, and include those cookies in your request (step 4).
See the chrome.cookies documentation on how to use the Chrome API to interact with its cookie store, and the Set-Cookie docs from Mozilla for an in-depth description of how cookies are specified in the HTTP response.
Try to capture the cookies from the first request (a login page, maybe) and add all of them to the next request (the download request). Something like below.
public void MakeRequest()
{
    var container = new CookieContainer();
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://example.com/loginpage");
    request.Method = WebRequestMethods.Http.Get;
    request.CookieContainer = container;
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();

    // Once you have read the response, add every cookie sent in its headers
    // to 'container' so they can be forwarded with the second request.
    foreach (Cookie cookie in response.Cookies)
    {
        container.Add(cookie);
    }

    HttpWebRequest downRequest = (HttpWebRequest)WebRequest.Create("http://example.com/downloadpage");
    downRequest.Method = WebRequestMethods.Http.Get;
    downRequest.Proxy = null;
    // As the cookies have been added, this request should now succeed.
    downRequest.CookieContainer = container;
    response = (HttpWebResponse)downRequest.GetResponse();
    var stream = response.GetResponseStream();
}
Hope this helps.
You definitely should integrate cookies, because web sites that need to identify the user store data in cookies;
without that token you can't perform the download.
Which cookies to use is site dependent; you can't guess whether the web site needs them or not, or which ones are needed.
If you are using .NET 4.5+, consider using the WebRequest.CreateHttp static method:
https://msdn.microsoft.com/fr-fr/library/ff382788(v=vs.110).aspx
Keep track of the CookieContainer; it will eventually be populated with new cookies from the response (the Set-Cookie response header).
Note: cookies are linked to a domain (e.g. stackoverflow.com).
I suggest you install a cookie extension in your Chrome browser to play with.
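A sketch of that pattern, with a single CookieContainer shared across requests. The URLs here are placeholders, and this assumes the server actually sets a cookie on the first request:

```csharp
using System;
using System.IO;
using System.Net;

class CookieReuseSketch
{
    static void Main()
    {
        // One container shared across requests; the framework records any
        // Set-Cookie headers it receives and replays them automatically.
        var container = new CookieContainer();

        // .NET 4.5+: CreateHttp returns an HttpWebRequest directly, no cast needed.
        HttpWebRequest first = WebRequest.CreateHttp("http://example.com/login");
        first.CookieContainer = container;
        using (first.GetResponse())
        {
            // After this point 'container' holds the cookies the server set.
        }

        HttpWebRequest download = WebRequest.CreateHttp("http://example.com/file");
        download.CookieContainer = container; // cookies for example.com are sent back here
        using (var response = (HttpWebResponse)download.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}
```

The point of reusing one container is that you never have to copy cookies by hand between the two requests.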
This depends on the actual download you want to perform and the requirements of the server. Most servers will allow the download regardless of cookies.
However, you can always send cookies just in case. Which cookies do you need? Here are some rules for how browsers do it:
Cookies have Domain and Path attributes. Domain also applies to subdomains.
So, if a request is made for http://foo.bar.com/some/path, the following cookies will be sent:
- Those with domain com, bar.com or foo.bar.com and no path
- The same as above, but with paths like /some or /some/path etc.
The browser will not send cookies from other domains, nor from the domains listed above when their path is not a prefix of the request's path.
So, you would have to search for cookies in the same way, depending on the URL of the file you want to download.
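The matching rules above can be sketched as two small predicates. This is an illustration only, not a full RFC 6265 implementation (it ignores the Secure flag and public-suffix rules):

```csharp
using System;

// A minimal sketch of the domain/path matching rules described above.
static class CookieMatch
{
    // A cookie with Domain=bar.com is sent to bar.com and any subdomain of it.
    public static bool DomainMatches(string cookieDomain, string requestHost) =>
        requestHost == cookieDomain || requestHost.EndsWith("." + cookieDomain);

    // A cookie with Path=/some is sent for /some and anything beneath it.
    public static bool PathMatches(string cookiePath, string requestPath) =>
        requestPath == cookiePath ||
        (requestPath.StartsWith(cookiePath) &&
         (cookiePath.EndsWith("/") || requestPath[cookiePath.Length] == '/'));

    static void Main()
    {
        Console.WriteLine(DomainMatches("bar.com", "foo.bar.com")); // True
        Console.WriteLine(DomainMatches("bar.com", "other.com"));   // False
        Console.WriteLine(PathMatches("/some", "/some/path"));      // True
        Console.WriteLine(PathMatches("/some", "/somewhere"));      // False
    }
}
```

A download manager would apply these predicates to the browser's cookie store to pick the cookies worth sending.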
I would have added a comment instead, but I don't have enough rep. You seem to be on the right track. Mateusz Radny's suggestion to use EditThisCookie will let you explore the cookies you have in your browser, but for any particular web server you don't usually* need to work out which cookie is which (so you really don't need to care which is the login cookie).
The protocol for cookies is that the browser sends back the cookies that the web server previously set for that particular website. It might help to read a bit more about cookies: https://en.wikipedia.org/wiki/HTTP_cookie (probably jump to Implementation, and then read up a bit on Privacy and third-party cookies, because you really don't want to share cookies from one website with other websites).
So, assuming you want to emulate what the browser would send, make sure your download manager sends the same set of cookies that Chrome received from that particular website, and it should work. Also avoid caching the cookies in your code: the browser will update them (e.g. delete expired cookies), so you should always fetch them from the browser each time you need them.
*Note: sometimes a cookie is marked for certain types of connections only, or for use with a particular domain/sub-domain or URI path. If set, you should only send those cookies back when the attributes match the connection you are trying to make. Please research this separately (latest RFC spec: https://www.rfc-editor.org/rfc/rfc6265).
PS: the web server may send back new or updated cookies as part of the download request you make via your download manager. If you really want to be thorough, these should be copied from your download manager back into Chrome's cookie set (though I'm not familiar with the Chrome API, so I'm not sure how hard that would be).
I have an Azure website running with active directory enabled. I am able to log in to the website just fine with various accounts by going through the Azure login page.
I get a 401 when attempting to call an API on that website from a .net client with any account I try.
var request = WebRequest.Create(url);
request.Method = "GET";
request.Credentials = new NetworkCredential("username@ourdomain.com", "password");
request.GetResponse();
Is there a way to hit the website without going through the actual Azure login page?
You're going to have to use an Azure Active Directory Authentication Library (ADAL). See https://msdn.microsoft.com/en-us/library/azure/dn151135.aspx
See for example the code samples on Web Application to Web API. This one looks like it would be useful for your application: https://github.com/AzureADSamples/WebApp-WebAPI-OAuth2-UserIdentity-Dotnet
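A minimal ADAL sketch of the non-interactive username/password flow. This assumes the ADAL v2 NuGet package (Microsoft.IdentityModel.Clients.ActiveDirectory); the tenant, resource URI, and client ID below are placeholders you would replace with values from your AAD app registration:

```csharp
using System;
using System.Net;
using Microsoft.IdentityModel.Clients.ActiveDirectory; // ADAL NuGet package

class AdalClientSketch
{
    static void Main()
    {
        // Placeholders: your tenant, the App ID URI of the protected website,
        // and the client ID of a native-app registration in that tenant.
        var authContext = new AuthenticationContext(
            "https://login.windows.net/yourtenant.onmicrosoft.com");
        var userCredential = new UserCredential("username@ourdomain.com", "password");
        AuthenticationResult result = authContext.AcquireToken(
            "https://yoursite.azurewebsites.net", // resource: the API you are calling
            "your-client-id-guid",                // client ID from the app registration
            userCredential);

        // Attach the token as a Bearer header instead of using NetworkCredential.
        var request = (HttpWebRequest)WebRequest.Create(
            "https://yoursite.azurewebsites.net/api/values");
        request.Method = "GET";
        request.Headers["Authorization"] = "Bearer " + result.AccessToken;
        request.GetResponse();
    }
}
```

The 401 in your snippet happens because NetworkCredential performs HTTP-level auth, while AAD expects an OAuth bearer token acquired from the login endpoint.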
I am trying to use the MindMeister API (as documented here: http://www.mindmeister.com/developers/authentication) to build a desktop application. So far, I am able to generate an authentication URL as documented in their developer's guide with an API key and shared secret, which ends up looking something like this: http://www.mindmeister.com/services/auth/?api_key=abc123&perms=delete&api_sig=zxy987
If I copy and paste that URL into my browser, it takes me to their login page. Once I log in, it says my application has been authenticated and I can proceed, which then allows me to start using the different REST API methods. I would like to navigate to that authentication URL and log in to MindMeister programmatically, without having to copy and paste the URL into the browser.
So far I have tried something like this
string authenticate
    = @"http://www.mindmeister.com/services/auth/?"
    + "api_key=abc123&perms=delete&api_sig=zxy987";
WebRequest request = WebRequest.Create(authenticate);
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
// response.ResponseUri == "https://www.mindmeister.com/account/login"
WebRequest request2 = WebRequest.Create(response.ResponseUri);
request2.Credentials = new NetworkCredential("username", "password");
HttpWebResponse response2 = (HttpWebResponse)request2.GetResponse();
but this does not work.
Can I get some guidance on how to accomplish what I want? I have next to no experience with WebRequest or HttpWebResponse, as I basically just copied and pasted other solutions from Stack Overflow.
From the documentation, it appears that logging in this way is not officially supported. Keep in mind that most services that operate using an API key can and will revoke that key if they don't like the way it's being used.
However, the login might not be needed every time a user starts your application.
It appears that if you get a frob using the mm.auth.getFrob method and pass it as a parameter during the login, you can then use that same frob with the mm.auth.getToken method.
You can then store this token somewhere on the user's desktop and use it for method calls in the future.
This should keep working until the user actively revokes your application's permissions.
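A rough sketch of that flow. The REST endpoint shape and parameter names are assumptions based on the MindMeister docs: api_sig must be computed according to their signing rules, and the frob/token must be parsed out of the returned XML.

```csharp
using System;
using System.IO;
using System.Net;

class MindMeisterTokenSketch
{
    // Calls a MindMeister REST method and returns the raw XML response.
    // Endpoint and parameter names are assumptions from the developer docs.
    static string CallRest(string method, string apiKey, string apiSig, string extra)
    {
        string url = "http://www.mindmeister.com/services/rest"
                   + "?method=" + method
                   + "&api_key=" + apiKey
                   + "&api_sig=" + apiSig
                   + extra;
        var request = WebRequest.Create(url);
        using (var response = request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            return reader.ReadToEnd(); // XML containing the frob or the token
        }
    }

    static void Main()
    {
        // 1. Get a frob (parse it out of the returned XML).
        string frobXml = CallRest("mm.auth.getFrob", "abc123", "sig1", "");

        // 2. Send the user to the auth URL with the frob appended, and let
        //    them log in once in the browser.

        // 3. Exchange the frob for a token, store the token locally, and
        //    reuse it until the user revokes access.
        string tokenXml = CallRest("mm.auth.getToken", "abc123", "sig2", "&frob=THE_FROB");
    }
}
```

The one-time browser login in step 2 is unavoidable here; the token is what lets you skip it on subsequent runs.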
I have a small conundrum: I have an ASP.NET application that's using forms-based authentication. Inside the application, I have a web service that checks User.IsInRole("somerole"), which works fine with AJAX calls from the application, since the user is logged in and the AJAX calls come from his logged-in browser.
Now, I want to make a fat client call the web services (a C# console client for starters), but I can't figure out how to pass the credential information to it.
I've looked at doing something like the following to no avail:
SomeWebService svc = new SomeWebService();
svc.Credentials = new NetworkCredential("formsusername","formspassword","");
String returnValue = svc.CallMyWebMethod();
Can anyone out there show me the trick to this? :-)
Thanks!
Forms Authentication works by having the client send a cookie along with each request. This cookie is emitted by the server when the client successfully authenticates by sending the correct credentials.
So here are the steps that you need to do in your console application in order to authenticate the user using forms authentication:
Send an HTTP POST request to some web page, passing the username and password. In response, the web server will give you the authentication cookie (the Set-Cookie HTTP response header), which you need to capture. That's usually your login page.
When calling your web service, you need to pass this cookie (the Cookie HTTP request header). In order to set a cookie along with the request, you will have to override the GetWebRequest method on the client proxy class that was generated for you:
protected override WebRequest GetWebRequest(Uri uri)
{
    var request = (HttpWebRequest)base.GetWebRequest(uri);
    // The container is null by default on a fresh HttpWebRequest.
    if (request.CookieContainer == null)
    {
        request.CookieContainer = new CookieContainer();
    }
    // The cookie must carry a path and domain, or CookieContainer.Add will throw.
    request.CookieContainer.Add(
        new Cookie(
            ".ASPXAUTH",
            "THE VALUE YOU RETRIEVED WHEN YOU SENT YOUR FIRST LOGON REQUEST",
            "/",
            "example.com" // the domain of your web service
        )
    );
    return request;
}
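The first step (the logon POST that yields the cookie value) might be sketched like this. The login URL and form field names are placeholders for your own page, and a WebForms login page may additionally require hidden fields such as __VIEWSTATE:

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;

class LogonSketch
{
    // Returns the .ASPXAUTH value issued by a forms-authentication login page.
    // URL and form field names below are placeholders for your own login page.
    static string GetAuthCookie(string user, string password)
    {
        var container = new CookieContainer();
        var request = (HttpWebRequest)WebRequest.Create("http://example.com/Login.aspx");
        request.Method = "POST";
        request.ContentType = "application/x-www-form-urlencoded";
        request.CookieContainer = container; // Set-Cookie responses land here

        byte[] body = Encoding.UTF8.GetBytes(
            "username=" + Uri.EscapeDataString(user) +
            "&password=" + Uri.EscapeDataString(password));
        using (Stream s = request.GetRequestStream())
        {
            s.Write(body, 0, body.Length);
        }

        using (request.GetResponse()) { }

        // Pull the forms authentication ticket out of the container.
        Cookie auth = container.GetCookies(new Uri("http://example.com/"))[".ASPXAUTH"];
        return auth != null ? auth.Value : null;
    }
}
```

The returned value is what you would plug into the GetWebRequest override above in place of the placeholder string.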
As the title says,
Does ASP.NET C# HttpWebRequest support HTTPS authentication via username:password@www.website.com?
Almost everything I find regarding authentication appends the username and password after the website. Will System.Net.HttpWebRequest work with the username and password before the website address?
[EDIT]
Let me explain the scenario: this is for an application I am developing that makes an HTTP POST to a website. We are already using HTTPS, and the website's server has my IP whitelisted. At the request of someone from the 'website', they want to use username:password@www.website.com in addition, for authentication.
Adding credentials in the URL never worked for me. This is how I'm currently doing it:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(apiUrl);
// Register Basic credentials for this URL.
System.Net.CredentialCache credentialCache = new System.Net.CredentialCache();
credentialCache.Add(
    new System.Uri(apiUrl),
    "Basic",
    new System.Net.NetworkCredential(basicAuthUserName, basicAuthPassword)
);
// Send the Authorization header with the first request instead of
// waiting for a 401 challenge.
request.PreAuthenticate = true;
request.Credentials = credentialCache;
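If you'd rather not rely on PreAuthenticate, an equivalent approach (assuming the server accepts standard HTTP Basic authentication) is to build the Authorization header yourself; the URL here is a placeholder:

```csharp
using System;
using System.Net;
using System.Text;

class BasicAuthSketch
{
    // Builds the value of a Basic Authorization header from a username and password.
    public static string BasicHeader(string user, string password)
    {
        return "Basic " + Convert.ToBase64String(
            Encoding.ASCII.GetBytes(user + ":" + password));
    }

    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("https://www.website.com/api");
        request.Method = "POST";
        // This is the same header that username:password@www.website.com implies.
        request.Headers["Authorization"] = BasicHeader("user", "pass");
        Console.WriteLine(BasicHeader("user", "pass")); // Basic dXNlcjpwYXNz
    }
}
```

Setting the header explicitly guarantees it is sent on the very first request, which some servers require.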