I'm using HttpWebRequest to get data from websites. Currently I want to get data from a website that is secured by a CAS system. I'm sending and receiving headers via HttpWebRequest (C#) and all is OK. Unfortunately, when I request certain pages I get the message "Automatically logged out.". I checked the headers with LiveHttpHeaders (a Firefox add-on), and the headers sent by my C# app and by Firefox are identical.
Example (headers from Firefox):
https://www.example.com/balance/ > Moved temporarily to:
https://www.example.com/cas/login?service=https%3A%2F%2Fwww.example.com%3A443%2Fbalance%2Fj_spring_cas_security_check
In a web browser everything works; in my C# app it doesn't.
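For reference, a minimal sketch of the request side, assuming the CAS session is tracked by cookies that have to survive the redirect (the URL and header values are placeholders based on the example above; needs System.Net and System.IO):

// Share one CookieContainer across requests so the session cookie set by the
// CAS login is re-sent on later requests; without it, Set-Cookie values are dropped.
var cookies = new CookieContainer();

var request = (HttpWebRequest)WebRequest.Create("https://www.example.com/balance/");
request.CookieContainer = cookies;
request.AllowAutoRedirect = true;            // follow the redirect to /cas/login
request.UserAgent = "Mozilla/5.0";           // mirror the browser headers captured in LiveHttpHeaders
request.Accept = "text/html";

using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    string html = reader.ReadToEnd();
}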
I have a solution with two ASP.NET Core MVC projects. One project (Client) makes a request to the other (Server) using HttpClient. When the action in Server receives the request, I want to get the URL of the thing that sent it. Every article I have read presents Request.Headers["Referer"] as the solution, but in my case Headers does not contain a "referer" key (or "referrer").
When receiving the request in Server, how should I find the URL of the Client that sent it?
That is how you get the referring URL for a request. But the referer isn't the thing that sent the request. The Referer header gets set by the browser when a person clicks a link on one website to go to another website. When the browser makes that request to the new website, it will typically include the Referer header containing the URL of the prior website.
The receiving server can't get the URL of the "client" making the request; remember, a typical web browser client isn't at any URL. Typically, all the receiving server can get is the client's IP address.
Since you have control of the client software, if you wanted you could have the client put whatever info you want into a request header before it's sent to the server, and the server could then read that info out of the header.
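As a rough sketch (the header name "X-Caller-Url" and the URLs here are just made-up examples):

// Client side, inside an async method:
var client = new HttpClient();
client.DefaultRequestHeaders.Add("X-Caller-Url", "https://localhost:5001/");
var response = await client.GetAsync("https://localhost:6001/api/values");

// Server side, inside a controller action:
var caller = Request.Headers["X-Caller-Url"].ToString();   // empty string if the header wasn't sent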
If you're using HttpClient, then it is up to the code making the request to add that header; it isn't added automatically in this case. So: change the code - or request that the code is changed - to add the header and value that you expect. If you are proxying a request through, you might take the value from the current request's Referer header and add that.
Even in the general case of a browser making the request as part of a normal page cycle, you can't rely on it: the Referer header is often deliberately not sent, depending on the browser version, configuration, whether you're going between different domains, whether it is HTTPS or not, and rel attributes on an <a href=...> such as "noreferrer".
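If you do decide to send it explicitly from the calling code, a minimal sketch (URLs are placeholders) might look like:

// Inside an async method; HttpClient never sets Referer for you.
var client = new HttpClient();
var request = new HttpRequestMessage(HttpMethod.Get, "https://localhost:6001/api/values");
request.Headers.Referrer = new Uri("https://localhost:5001/some/page");
var response = await client.SendAsync(request);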
I have an application that generates a web request to Facebook Graph API to get a share count from an external page. I have been using this code for over a year without issue, and suddenly, the share count is not working when the request is made from .NET. However, if I make the request from a web browser, it works just fine. My code is as follows:
// Ask the Graph API for the og_object engagement (share count) of the external link.
string fbLink = "https://graph.facebook.com/?id=" + externalLink + "&fields=og_object%7Bengagement%7D&access_token=<token_removed>";
WebClient client = new WebClient();
string fbString = client.DownloadString(fbLink);
This code still appears to be working fine, in that the request is made and FB responds with no errors. In fact, it responds with the correct page id and details. However, the share count is zero.
Here is where it gets a little bit weird. On my localhost development machine, the code works fine and returns the proper share count. However, if I run the code on my actual server (an AWS EC2 instance), the share count shows zero.
If I open Chrome and run the request from the browser, the share count displays as expected.
If I open Internet Explorer 11, and run the request from the browser, the counter shows zero. HOWEVER, if I log in to Facebook from IE11, and then run the request to FB Graph API, the response shows the correct page count.
This is very confusing to me, as it appears the reason the counter has stopped working has to do with cookies, or maybe with the browser being logged into FB. This should not be the case, as I am using an app token ID, and I wouldn't expect to need to be logged into FB in order to make a request to the Graph API.
Does anybody have any ideas why my request/code in .NET worked just fine for a year and a half, and just stopped working? Or why the requests work fine on my localhost and not my live server?
After spending considerable time on this, I have fixed the issue. There is an FB authentication cookie that was being transmitted with the web browser's request. The cookie name was "xs" and the value was a long string that is used as a session id for my specific login. If I create this cookie in my web request in C# code, I get the proper response with the correct number of shares.
WebClient client = new WebClient();
// Attach the "xs" session cookie before making the Graph API request.
client.Headers.Add("Cookie", "xs=<removed>;");
string fbString = client.DownloadString(fbLink);
I have no idea why I have to do this, and only on my EC2 server. Nowhere in FB's documentation does it say you have to spoof a valid logged-in authentication cookie in order to obtain correct share count results from a request to its Graph API, but there you have it. A workaround, at least.
Currently I have a Slack button in my WPF application that opens a webpage and asks the user for access.
System.Diagnostics.Process.Start("https://slack.com/oauth/authorize?scope=client&client_id=XXXXXXXXXXXXXXXXXXXXXX");
After authorizing, the page gets redirected to a URL that has a generated code in its parameters, which I need later on to get a token. The problem is how to get this code. For now I have set the redirect URL to www.slack.com, and the following URL is generated.
https://slack.com/?code=8XXXXXXXXXXXXXXX.XXXXXXXXXXXXXX5&state=
How do I get the code back into my application? I am using the following, but I am not getting the response I need, and it executes before the user can even authorize.
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(URL);
// This only echoes the URL the request was created with; it never sees Slack's redirect.
var response = req.RequestUri.ToString();
Alternative solutions and suggestions for implementing this authorization in a C# desktop application would also be welcome.
As part of the OAuth process, Slack will call your application using the redirect URL and return the code parameter. So the redirect URL needs to point to your application, not to slack.com.
You will need to read the URL parameters that your application has been called with. In C# that can be done with code = Request.QueryString["code"];
Your C# application needs to run as an ASP script on a web server that is accessible from the Internet, so that Slack can reach it.
In order to use Slack for authentication, your application needs to implement the complete OAuth process as described here.
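A rough sketch of what that callback endpoint could look like (MVC-style; the client id and secret are placeholders, and the JSON parsing is left out):

// Hypothetical action that the redirect URL points to. Slack calls it with ?code=...
public async Task<ActionResult> SlackCallback(string code)
{
    using (var http = new HttpClient())
    {
        // Exchange the temporary code for an access token.
        var url = "https://slack.com/api/oauth.access" +
                  "?client_id=XXXX&client_secret=YYYY&code=" + code;
        string json = await http.GetStringAsync(url);   // response JSON contains access_token
        return Content(json, "application/json");
    }
}

For a purely desktop flow, another common pattern is to point the redirect URL at a small HTTP listener on localhost instead of a public web server, but the sketch above follows the approach described in this answer.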
I'm working on an application allowing users to sign in and register using Google and Yahoo through OpenID using ASP.NET MVC4, and the DotNetOpenAuth library. Google is working fine, and Yahoo was working fine for a few months as well until a few days ago.
For some reason, using my local version of IE 11, after authenticating with Yahoo, two responses are sent back to the web server, and each is validated in its own separate thread. One response is determined to be valid, and the other response is determined to be invalid because the first response is already validated. The responses are then sent back to the user, and depending on which one is sent first, two very different outcomes can occur.
Using Chrome and Firefox works correctly; Yahoo returns only one response. Different versions of IE on other machines (including 11) work correctly as well. Using Fiddler, I've verified that the correct requests are being sent out. I've tried clearing my cache, disabling any add-ons, and changing document and browser modes using the dev tools, with no luck. Is there anything that could be causing two responses to be sent back?
The basic code to send the request is below. The config file is using all default values.
// Create the OpenID request against Yahoo's endpoint.
OpenIdRelyingParty openid = new OpenIdRelyingParty();
IAuthenticationRequest request = openid.CreateRequest(Identifier.Parse("https://me.yahoo.com"));
// Ask Yahoo to return the user's email address.
var fields = new ClaimsRequest();
fields.Email = DemandLevel.Require;
request.AddExtension(fields);
// Redirect the user to Yahoo to authenticate.
return request.RedirectingResponse.AsActionResult();
It turns out that the problem was that my request told Yahoo to redirect back to an unencrypted connection after authentication. If I tell Yahoo to return to an https URL rather than http, everything works correctly, and I only get one request coming back to the application.
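For reference, a minimal sketch using the CreateRequest overload that pins the realm and return-to URL explicitly (the example.com URLs are placeholders for the application's own https addresses):

OpenIdRelyingParty openid = new OpenIdRelyingParty();
IAuthenticationRequest request = openid.CreateRequest(
    Identifier.Parse("https://me.yahoo.com"),
    new Realm("https://www.example.com/"),                        // https realm
    new Uri("https://www.example.com/Account/OpenIdCallback"));   // https return-to URL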
I'm using HttpWebRequest to pull down XML and to POST data back to a 'WebService', and I'm getting a 401 on the POST.
When creating the requests I've added Credentials, and I've now also tried a credential cache and setting PreAuthenticate to true, but I'm still getting the 401! :(
Watching the HTTP traffic on the router, I see the GET make an unauthenticated request first: it hits the 401, then makes an authenticated GET and is allowed through. When I watch the POST, I see it hit the 401... and it doesn't even try an authenticated POST.
This happens only on mobile phones (Compact Framework 3.5 and 2.0 on Windows Mobile 6.1). The same .exe works perfectly on any desktop machine.
What am I missing? Please help!
Try setting the header manually:
http://devproj20.blogspot.com/2008/02/assigning-basic-authorization-http.html
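Something along these lines (a sketch only; username, password and url are placeholders) attaches the Basic credentials up front instead of waiting for the 401 challenge:

// Build the Basic Authorization value by hand and send it on the first request.
string credentials = Convert.ToBase64String(
    System.Text.Encoding.ASCII.GetBytes(username + ":" + password));

HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.Method = "POST";
request.Headers["Authorization"] = "Basic " + credentials;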