I'm trying to call an API from a controller using HttpClient, and the API does not recognize the user as authenticated and logged in. When calling the API from JS I have no issue. I noticed HttpClient was only sending HTTP/1.1, so I upgraded to HTTP/2 by setting the DOTNET_SYSTEM_NET_HTTP_USESOCKETSHTTPHANDLER flag, but this made no difference. I have tried all combinations of the HttpClientHandler properties, including UseCookies, and the request is never authenticated.
using (var handler = new HttpClientHandler { UseDefaultCredentials = true })
{
    using (var httpClient = new HttpClient(handler))
    {
        var response = httpClient.GetStringAsync(new Uri("https://localhost:64366/api/")).Result;
    }
}
I will move to token-based auth in the future, but for now I would like to understand why there is a difference between calling the API from C# vs. JS. This is all HTTPS on localhost using ASP.NET Core 2.2.
The difference between JS and C# is that browsers attach cookies to requests automatically, while in C# you have to attach the cookies manually, as juunas mentioned.
To obtain and use the authentication cookie you can use the following pattern:
CookieContainer cookies = new CookieContainer(); // this container saves cookies from responses and sends them with requests
var handler = new HttpClientHandler
{
    CookieContainer = cookies
};
var client = new HttpClient(handler);

string authUrl = ""; // your auth url
string anyUrl = "";  // any url that requires you to be authenticated

var authContent = new FormUrlEncodedContent(
    new List<KeyValuePair<string, string>> {
        new KeyValuePair<string, string>("login", "log_in"),
        new KeyValuePair<string, string>("password", "pass_word")
    }
);

// cookies will be set by this request
HttpResponseMessage auth = await client.PostAsync(authUrl, authContent);
auth.EnsureSuccessStatusCode(); // retrieving the result is not required, but you will know if something goes wrong during authentication

// and here the retrieved cookies will be used
string result = await client.GetStringAsync(anyUrl);
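In the original scenario (a controller calling the API on behalf of a user who is already signed in with cookie authentication), another option is to forward the incoming authentication cookie instead of logging in again. A minimal sketch, assuming the default ASP.NET Core Identity cookie name .AspNetCore.Identity.Application and the localhost API URL from the question; adjust both to your setup:

// Sketch only: forward the caller's auth cookie to the downstream API.
// The cookie name is an assumption (default Identity scheme); check your app's actual cookie name.
var authCookie = HttpContext.Request.Cookies[".AspNetCore.Identity.Application"];
var handler = new HttpClientHandler { UseCookies = false }; // attach the Cookie header manually
using (var client = new HttpClient(handler))
{
    var request = new HttpRequestMessage(HttpMethod.Get, "https://localhost:64366/api/");
    request.Headers.Add("Cookie", $".AspNetCore.Identity.Application={authCookie}");
    var response = await client.SendAsync(request);
    response.EnsureSuccessStatusCode();
    var body = await response.Content.ReadAsStringAsync();
}

Note that UseCookies is set to false here so the handler does not strip the manually added Cookie header.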
I have two web API applications developed in .NET Core. I need to import JSON data from the second application into the first. However, I have a security issue: I need to secure the access to the external API. How should I securely manage the connection between these two APIs?
For example, I need to secure the access to the URL in the code below, i.e. securely access the covid API without another authentication.
PS: I'm using JWT token authentication in both applications.
Best regards.
using (var client = new HttpClient())
{
    string url = "https://covid19.mathdro.id/api";
    var response = await client.GetAsync(url);
    string responseAsString = await response.Content.ReadAsStringAsync();
    result = JsonConvert.DeserializeObject<CovidResult>(responseAsString);
}
If both APIs are protected by the same access token, then you can read the Authorization header from the first request and pass it on to the second request.
Something like this to read the header:
var authHeader = context.Request.Headers["Authorization"].ToString();
You should end up with authHeader equal to "Bearer ey...(a bunch of base64)"
Then add the auth header to the client:
var request = new HttpRequestMessage
{
    RequestUri = new Uri("https://covid19.mathdro.id/api"),
    Method = HttpMethod.Get,
};
...
// authHeader already includes the scheme, e.g. "Bearer ey..."
request.Headers.Authorization = AuthenticationHeaderValue.Parse(authHeader);
var task = client.SendAsync(request);
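Put together, a rough sketch of a controller action that forwards the caller's bearer token to the downstream API. The action name, route, and use of CovidResult are assumptions based on the question, and this only works if the second API accepts the same token (same issuer/audience):

// Sketch only: forwards the incoming "Authorization: Bearer ..." header as-is.
[HttpGet]
public async Task<IActionResult> GetCovidData()
{
    var authHeader = Request.Headers["Authorization"].ToString();

    using (var client = new HttpClient())
    {
        var request = new HttpRequestMessage(HttpMethod.Get, "https://covid19.mathdro.id/api");
        request.Headers.Authorization = AuthenticationHeaderValue.Parse(authHeader);

        var response = await client.SendAsync(request);
        var json = await response.Content.ReadAsStringAsync();
        var result = JsonConvert.DeserializeObject<CovidResult>(json);
        return Ok(result);
    }
}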
I'm trying to transmit some data from my application to a specific web service using HttpClient. To do that I first have to log in to the web service and receive the cookie (that's the authentication method used by the web service). I do it like this:
Uri uri = new Uri("login_uri"); // your login URL
CookieContainer CookieContainer_login = new CookieContainer();
HttpClientHandler ch = new HttpClientHandler
{
    AllowAutoRedirect = true,
    CookieContainer = CookieContainer_login,
    UseCookies = true
};
HttpClient client = new HttpClient(ch);
List<KeyValuePair<string, string>> pairs = new List<KeyValuePair<string, string>>
{
    new KeyValuePair<string, string>("user", "test"),
    new KeyValuePair<string, string>("password", "test"),
    new KeyValuePair<string, string>("loginSource", "0")
};
FormUrlEncodedContent content = new FormUrlEncodedContent(pairs);
System.Threading.Tasks.Task<HttpResponseMessage> response = client.PostAsync(uri, content);
It works; I see the message about a successful login via Fiddler. Now, in order to use the web service (another Uri), for example to send a POST request, I have to pass the cookies received during the login process to that request. Since I am storing the cookies in the CookieContainer called CookieContainer_login, I thought I could simply use the same client and only change the Uri in the PostAsync method, or create a new client with the same HttpClientHandler and CookieContainer. Unfortunately, it didn't work. Actually, I found out that my CookieContainer is empty, even after the login process.
I tried to recreate this with HttpWebRequest, like so:
string url_login = "login_uri";
string logparam = "user=test&password=test&loginSource=0";
HttpWebRequest loginRequest = (HttpWebRequest)WebRequest.Create(url_login);
loginRequest.ContentType = "application/x-www-form-urlencoded";
loginRequest.Accept = "text/xml";
loginRequest.Method = "POST";
loginRequest.CookieContainer = CookieContainer_login;
byte[] byteArray = Encoding.UTF8.GetBytes(logparam);
loginRequest.ContentLength = byteArray.Length;
Stream dataStream_login = loginRequest.GetRequestStream();
dataStream_login.Write(byteArray, 0, byteArray.Length);
It works, and I also receive the successful login message; moreover, when I check the CookieContainer count, it shows 3 cookies stored after login. Now my question is: why are there no cookies in the CookieContainer with HttpClient, but there are with HttpWebRequest? How do I get the cookies with HttpClient as well?
Okay, I managed to solve my problem, and hopefully my answer will be useful to someone with a similar issue. In my case the mistake was in the invocation of the PostAsync method. It's an asynchronous method, so it needs the await operator, which I was missing. The proper invocation should look like this:
HttpResponseMessage response = await client.PostAsync(uri, content);
Now all the cookies are stored in my CookieContainer.
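For completeness, a sketch of the corrected flow (the URLs and payload are placeholders): await the login POST so the Set-Cookie headers get stored in the container, then reuse the same client, which now sends those cookies automatically.

CookieContainer cookieContainer = new CookieContainer();
var handler = new HttpClientHandler
{
    AllowAutoRedirect = true,
    CookieContainer = cookieContainer,
    UseCookies = true
};
var client = new HttpClient(handler);

var loginContent = new FormUrlEncodedContent(new List<KeyValuePair<string, string>>
{
    new KeyValuePair<string, string>("user", "test"),
    new KeyValuePair<string, string>("password", "test"),
    new KeyValuePair<string, string>("loginSource", "0")
});

// Awaiting guarantees the response (and its Set-Cookie headers) has been processed
// into the CookieContainer before the next request is sent.
var loginResponse = await client.PostAsync(new Uri("https://example.com/login"), loginContent);
loginResponse.EnsureSuccessStatusCode();

// The same handler/container now attaches the login cookies automatically.
var postResponse = await client.PostAsync(new Uri("https://example.com/service"), new StringContent("payload"));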
How to set cookies in the MS-Edge based Microsoft.Toolkit.Forms.UI.Controls.WebView?
I need to send an authentication token cookie to the website I'm navigating to.
What I've tried:
Passing a cookie header to the Navigate method: the header won't be passed to the website (verified with Fiddler). Other headers (like "MyCustomHeader" in the example below) are passed to the site, though.
string cookieHeader = cookieContainer.GetCookieHeader(siteUri);
var headers = new Dictionary<string, string>();
headers.Add("Cookie", "MyAuthCookie=MyAuthToken; Domain=.somesite.net; Path=/");
headers.Add("MyCustomHeader", "MyCustomHeader-Value");
_browser.Navigate(siteUri, HttpMethod.Get, headers: headers);
Setting the cookie in CookieManager before calling WebView.Navigate:
var siteUri = new Uri("http://www.somesite.net/");
var filter = new Windows.Web.Http.Filters.HttpBaseProtocolFilter();
var cookieManager = filter.CookieManager;
var cookie = new Windows.Web.Http.HttpCookie("MyAuthCookie", siteUri.Host, "/");
cookie.Value = "MyAuthToken";
cookieManager.SetCookie(cookie);
webView.Navigate(siteUri);
This also does not work when calling NavigateWithHttpRequestMessage on Microsoft.Toolkit.Win32.UI.Controls.Interop.WinRT.WebViewControlHost (via reflection) instead of WebView.Navigate.
It also does not work when requesting the same URL by HttpClient before calling WebView.Navigate:
using (var client = new Windows.Web.Http.HttpClient(filter))
{
    var result = client.GetAsync(siteUri).AsTask().Result;
    result.EnsureSuccessStatusCode();
}
webView.Navigate(siteUri);
That way, the cookie header is only sent with the HttpClient's request, but not with the subsequent WebView.Navigate request. I guess the reason for this could be that WebView runs in its own process.
Is there any way to pass the cookie to the website? Note that the cookie does not originate from the site. The authentication token is retrieved from some other system, and needs to be passed to the website.
Sorry for the awful title, I'm not really sure how to phrase my issue in a short title format.
I'm trying to communicate with an external API. I make a basic authentication request to that API and get an x-csrf-token and a session token back from the API.
I then make another request to that API, now using the x-csrf-token as a header and attaching the session token to the header as "cookie".
The team that maintains the API sent me an example project that handles all of the above, and it looks like this:
public static async Task<string> Send(string apiname, string value)
{
    // Fetch the authorization tokens from SAP
    HttpClient client = new HttpClient();
    client.BaseAddress = new Uri(basePath);
    client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", System.Convert.ToBase64String(System.Text.Encoding.UTF8.GetBytes(user + ":" + password)));
    client.DefaultRequestHeaders.Add("x-csrf-token", "Fetch");

    string csrfToken = "";
    string sessionCookie = "";

    HttpResponseMessage response = await client.GetAsync(string.Empty);
    IEnumerable<string> values;
    if (response.Headers.TryGetValues("x-csrf-token", out values))
    {
        csrfToken = values.FirstOrDefault();
    }
    if (response.Headers.TryGetValues("set-cookie", out values))
    {
        sessionCookie = values.Where(s => s.StartsWith("SAP_SESSION")).FirstOrDefault();
    }

    // Reinstantiate the HttpClient, adding the tokens we just got from SAP
    client = new HttpClient();
    client.DefaultRequestHeaders.Add("x-csrf-token", csrfToken);
    client.DefaultRequestHeaders.Add("cookie", sessionCookie);
    client.BaseAddress = new Uri(basePath);
    client.DefaultRequestHeaders.Accept.Clear();
    client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

    // Have to parse the string this way otherwise it'll break the dates
    JToken token;
    using (var sr = new StringReader(value))
    using (var jr = new JsonTextReader(sr) { DateParseHandling = DateParseHandling.None })
    {
        token = JToken.ReadFrom(jr);
    }

    HttpResponseMessage response2 = await client.PostAsJsonAsync(apiname, token);
    string responseBody = await response2.Content.ReadAsStringAsync();
    return responseBody;
}
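For reference, this helper might be invoked along the following lines (the API name and JSON payload here are made-up placeholders, not part of the original project):

// Hypothetical usage; "Orders" and the payload are placeholders.
string reply = await Send("Orders", "{ \"PostingDate\": \"2019-01-31T00:00:00\" }");
Console.WriteLine(reply);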
This all works great as a .NET Core Web API (and also as a .NET Core console app).
Interestingly enough (in my opinion, anyway), when I use the exact same code in a .NET 4.7.2 project, it doesn't append the "cookie" header properly, so I'm getting an unauthorized redirect back from the API.
To be absolutely sure that I didn't change any code, I started from scratch with a brand new .NET Core 2.0 console app and a brand new .NET 4.7.2 console app, copy-pasted the exact same code, and installed the same NuGet packages (Newtonsoft.Json and Microsoft.AspNet.WebApi.Client). I inspected my web traffic with Fiddler: in .NET Core the cookie is appended properly and everything works, but in .NET 4.7.2 the API returns a redirect to authenticate.
HttpClient will eat the custom cookie header if you do not set UseCookies to false:
using (var handler = new HttpClientHandler { UseCookies = false })
using (var client = new HttpClient(handler) { BaseAddress = new Uri(Path) })
{
    client.DefaultRequestHeaders.Add("cookie", cookieValue);
}
It will try to use the cookie container and at the same time ignore any custom cookie headers, which is very frustrating behavior if you ask me.
.NET Framework uses a CookieContainer. So does .NET Core; it's perhaps a better implementation than what you are doing now, and better supported.
Please see the CookieContainer docs.
A small example:
var cookieContainer = new CookieContainer();
var handler = new HttpClientHandler
{
    CookieContainer = cookieContainer,
    UseCookies = true
};
var client = new HttpClient(handler);
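Applied to the SAP example above, that would mean putting the extracted session cookie into the container (rather than adding a raw "cookie" header) so the handler attaches it for you on both .NET Framework and .NET Core. A rough sketch, reusing basePath, csrfToken, and sessionCookie from the code earlier; CookieContainer.SetCookies expects the raw Set-Cookie value:

// Sketch: store the extracted SAP session cookie in the container so the handler sends it automatically.
var cookieContainer = new CookieContainer();
var handler = new HttpClientHandler
{
    CookieContainer = cookieContainer,
    UseCookies = true
};
var client = new HttpClient(handler) { BaseAddress = new Uri(basePath) };

// sessionCookie holds the raw Set-Cookie value captured from the first response.
cookieContainer.SetCookies(new Uri(basePath), sessionCookie);
client.DefaultRequestHeaders.Add("x-csrf-token", csrfToken);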
Edit: I apologize for the confusion: I'm attempting to scrape a site that I did not write. I'm not writing an ASP app, only trying to scrape one.
After making a post request to a login page, I attempt to read the cookies from another page. I do not see all the cookies I expect, however. My code is as follows:
// Downloads login cookies for subsequent requests
public async Task<CookieContainer> loginCookies()
{
    var cookies = new CookieContainer();
    var handler = new HttpClientHandler {
        UseCookies = true,
        AllowAutoRedirect = true,
        CookieContainer = cookies
    };
    var client = new HttpClient(handler);

    var loginUri = new Uri("https://connect.example.edu/login.aspx");
    var credentials = new Dictionary<string, string> {
        {"example$txtUsername", this.Username},
        {"example$txtPassword", this.Password}
    };
    var formCredentials = new FormUrlEncodedContent(credentials);
    await client.PostAsync(loginUri, content: formCredentials);

    var pointsUri = new Uri("https://info.example.edu/");
    Console.WriteLine("COOKIES:");
    foreach (Cookie cookie in cookies.GetCookies(pointsUri))
        Console.WriteLine($"{cookie.Name} --> {cookie.Value}");

    return cookies;
}
I believe the error is a result of loginUri and pointsUri having different subdomains. The info I need to scrape exists at the pointsUri page, but the login exists at loginUri.
One cookie that I'm missing in particular is ASP.NET_SessionId.
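One way to test the subdomain hypothesis is to dump the container for both hosts; GetCookies only returns cookies whose domain and path match the given Uri, so a session cookie scoped to connect.example.edu would only show up for loginUri. A quick check along these lines:

// Sketch: compare which cookies are visible for each subdomain.
foreach (var uri in new[] { loginUri, pointsUri })
{
    Console.WriteLine($"Cookies for {uri.Host}:");
    foreach (Cookie cookie in cookies.GetCookies(uri))
        Console.WriteLine($"  {cookie.Name} --> {cookie.Value}");
}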