RESTful API works in Postman but not in C# [Magento integration] - c#

I am integrating with Magento 2 using its RESTful APIs. When I send the request from Postman it works like a charm, but from C# code it returns an "Unauthorized 401" exception.
However, it was working in the C# code earlier and then suddenly stopped.
I have tried every approach I could (WebRequest, HttpClient and RestSharp) and the same exception is returned.
I also used Fiddler 4 to capture and compare the requests, used the Fiddler-to-C# plugin to extract C# code, and even used the RestSharp code generated by Postman; the same exception is returned every time.
The remote server returned an error: (401) Unauthorized.
//Calls request functions sequentially.
private string MakeRequests()
{
HttpWebResponse response;
if (Request_hatolna_co(out response))
{
//Success, possibly uses response.
string responseText = ReadResponse(response);
response.Close();
return responseText;
}
else
{
//Failure, cannot use response.
return "";
}
}
private static string ReadResponse(HttpWebResponse response)
{
using (Stream responseStream = response.GetResponseStream())
{
Stream streamToRead = responseStream;
if (response.ContentEncoding.ToLower().Contains("gzip"))
{
streamToRead = new GZipStream(streamToRead, CompressionMode.Decompress);
}
else if (response.ContentEncoding.ToLower().Contains("deflate"))
{
streamToRead = new DeflateStream(streamToRead, CompressionMode.Decompress);
}
using (StreamReader streamReader = new StreamReader(streamToRead, Encoding.UTF8))
{
return streamReader.ReadToEnd();
}
}
}
private bool Request_hatolna_co(out HttpWebResponse response)
{
response = null;
try
{
//Create a request to URL.
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://MAGENTO.co/index.php/rest//V1/orders/items?searchCriteria[filter_groups][0][filters][0][field]=item_id&searchCriteria[filter_groups][0][filters][0][value]=1");
//Set request headers.
request.KeepAlive = true;
request.Headers.Set(HttpRequestHeader.Authorization, "Bearer xxxxxxxxxxxxxxxxxxxxxx");
request.Headers.Add("Postman-Token", #"1181fa03-4dda-ae84-fd31-9d6fbd035614");
request.Headers.Set(HttpRequestHeader.CacheControl, "no-cache");
request.UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/75.0.3770.100 Safari/537.36";
request.ContentType = "application/json";
request.Accept = "*/*";
request.Headers.Set(HttpRequestHeader.AcceptEncoding, "gzip, deflate");
request.Headers.Set(HttpRequestHeader.AcceptLanguage, "en-US,en;q=0.9,ar;q=0.8,la;q=0.7");
request.Headers.Set(HttpRequestHeader.Cookie, @"store=default; private_content_version=f16533d4f181d42a1b3f386fa6d2cdf1");
//Get response to request.
response = (HttpWebResponse)request.GetResponse();
}
catch (WebException e)
{
//ProtocolError indicates a valid HTTP response, but with a non-200 status code (e.g. 304 Not Modified, 404 Not Found)
if (e.Status == WebExceptionStatus.ProtocolError) response = (HttpWebResponse)e.Response;
else return false;
}
catch (Exception)
{
if (response != null) response.Close();
return false;
}
return true;
}

Why is the Postman-Token header being set in the C# code? Remove it and then try again.

The problem was in the URL: the Magento server's admin had switched the site from [HTTP] to [HTTPS].
That explains the difference between [Postman, Insomnia, or any other API client] and the C# code: the API clients follow the HTTP-to-HTTPS change transparently, while the C# request kept using the old http:// URL and arrived at the server without valid authorization, so it got 401. Pointing the C# code at the https:// URL directly fixed it.
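A minimal sketch of the corrected call, assuming the same placeholder host and token as in the question; the only real changes are the https:// scheme, enabling TLS 1.2 for older .NET Framework defaults, and dropping the Postman-specific headers:
// Older .NET Framework versions may not negotiate TLS 1.2 by default.
ServicePointManager.SecurityProtocol |= SecurityProtocolType.Tls12;
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://MAGENTO.co/index.php/rest/V1/orders/items?searchCriteria[filter_groups][0][filters][0][field]=item_id&searchCriteria[filter_groups][0][filters][0][value]=1");
request.Headers.Set(HttpRequestHeader.Authorization, "Bearer xxxxxxxxxxxxxxxxxxxxxx");
request.Accept = "application/json";
using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    string json = reader.ReadToEnd();
}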

Related

What are the reasons an HttpWebRequest might take half a minute to get a response?

I'm developing an app for a school project which uses data from CoinGecko's free public API. The way I'm getting the data is with the HttpWebRequest class. At first, I was able to use the API fast, without any problems, but recently the requests have been getting stuck at the GetResponse() function for a long time. Even pinging the API takes about 30 seconds to complete. I've read a lot of posts regarding this problem, and none of the suggestions worked.
try
{
string uri = "https://api.coingecko.com/api/v3/ping";
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
request.Method = "GET";
request.Accept =
"text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng," +
"*/*;q=0.8,application/signed-exchange;v=b3;q=0.9";
request.ContentType = "application/json";
request.Headers.Add("accept-encoding", "gzip, deflate, br");
request.Headers.Add("accept-language", "en-US,en;q=0.9");
request.Headers.Add("cache-control", "max-age=0");
request.Headers.Add("if-none-match", "W/\"c1f074e1bdf979ac5dc291f2c0acf9e4\"");
request.Headers.Add("sec-ch-ua", "\" Not A;Brand\"; v = \"99\", \"Chromium\"; v = \"96\", \"Google Chrome\"; v = \"96\"");
request.Headers.Add("sec-ch-ua-mobile", "?0");
request.Headers.Add("sec-ch-ua-platform", "\"Windows\"");
request.Headers.Add("sec-fetch-dest", "document");
request.Headers.Add("sec-fetch-mode", "navigate");
request.Headers.Add("sec-fetch-site", "none");
request.Headers.Add("sec-fetch-user", "?1");
request.Headers.Add("upgrade-insecure-requests", "1");
request.UserAgent =
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko)" +
" Chrome/97.0.4692.71 Safari/537.36 Edg/97.0.1072.55";
request.Proxy = null;
using HttpWebResponse response = (HttpWebResponse)request.GetResponse();
if (response.StatusCode != HttpStatusCode.OK && response.StatusCode != HttpStatusCode.Created)
{
throw new ApplicationException("REST client error code: "
+ response.StatusCode.ToString());
}
using Stream stream = response.GetResponseStream();
if (stream == null)
{
throw new ApplicationException("Error: Null response");
}
using StreamReader reader = new StreamReader(stream);
return reader.ReadToEnd();
}
catch (Exception ex)
{
return ex.ToString();
}
It works in the browser, so I copied all the browser request headers and added them to the code, but they didn't help. I have also tried increasing the number of connections, playing around with all the security protocol types, and disabling my firewall. Could it be that the API is detecting and slowing down my requests because they're not "human"?
Set Windows to prioritise IPv4 by typing the following in the command line:
netsh interface ipv6 set prefixpolicy ::ffff:0:0/96 46 4
Source
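If the slowdown is caused by the host resolving to an IPv6 address that your network can't reach quickly, a quick diagnostic (a sketch, not part of the fix itself) is to check what DNS returns for the API host from C#:
// List the resolved addresses; if only IPv6 entries appear, or IPv4 is missing,
// the prefix-policy change above is the likely fix.
foreach (var address in System.Net.Dns.GetHostAddresses("api.coingecko.com"))
{
    Console.WriteLine($"{address} ({address.AddressFamily})");
}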

I'm trying a POST API request with a raw body to get a token, but I need to understand what is wrong

My method to get the tokenKey is:
HttpWebRequest httpRequest = (HttpWebRequest)WebRequest.Create(url);
httpRequest.Method = "POST";
httpRequest.UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/65.0.3325.181 Safari/537.36 OPR/52.0.2871.99";
string tokenResponse = null;
HttpClient client = new HttpClient();
HttpResponseMessage response = null;
try
{
client.DefaultRequestHeaders.Add("x-api-key", "key");
if (method.Equals("POST"))
{
httpRequest.Accept = "application/json";
httpRequest.ContentType = "application/json";
var data = #"{""username"":#"""+ login + #""",""password"" :#"""+ password + #"""}";
using (var streamWriter = new StreamWriter(httpRequest.GetRequestStream()))
{
streamWriter.Write(data);
}
var httpResponse = (HttpWebResponse)httpRequest.GetResponse();
using (var streamReader = new StreamReader(httpResponse.GetResponseStream()))
{
tokenResponse = streamReader.ReadToEnd();
}
}
}
catch (Exception ex)
{
throw new Exception(ex.Message);
}
return tokenResponse;
After var httpResponse = (HttpWebResponse)httpRequest.GetResponse(); I get this error message: The underlying connection was closed: Unexpected error on a send.
The x-api-key is fine, but I'm not posting the real username and password here.
The same request works easily in Postman.
Can anyone help me understand where I'm going wrong?
Could this be a case of mismatched TLS versions? See this answer:
C# HttpWebRequest The underlying connection was closed: An unexpected error occurred on a send
Also, it looks like your HttpWebRequest won't have the correct headers, because you are setting them on the HttpClient object instead; that HttpClient isn't actually used for this call at all.
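A minimal sketch of the TLS suggestion above, assuming .NET Framework where older versions don't negotiate TLS 1.2 by default; it also moves the x-api-key header onto the HttpWebRequest that is actually sent:
// Enable TLS 1.2 (keeping any already-enabled protocols) before creating the request.
ServicePointManager.SecurityProtocol |= SecurityProtocolType.Tls12;
HttpWebRequest httpRequest = (HttpWebRequest)WebRequest.Create(url);
httpRequest.Method = "POST";
httpRequest.Accept = "application/json";
httpRequest.ContentType = "application/json";
httpRequest.Headers.Add("x-api-key", "key"); // header belongs on the request that is actually sent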
See the documentation on how to use HttpWebRequest and how to write the body to the request properly. I think the problem is with the request stream.

Spring WebSecurityConfigurerAdapter permitAll() does not allow REST POST requests from a C# client?

I have this setup in my WebSecurityConfigurerAdapter to allow my client application to send POST requests to the "/commands/" path on the server:
@Override
protected void configure(HttpSecurity http) throws Exception {
http.authorizeRequests()
.antMatchers("/").permitAll()
.antMatchers("/commands/**").permitAll()
.antMatchers("/files/**").authenticated()
.and().
formLogin();
}
GET requests are fine; however, a CSRF token seems to be required for POST requests with this setup. I get the following result if I don't log in:
{
"timestamp": 1497904660159,
"status": 403,
"error": "Forbidden",
"message": "Could not verify the provided CSRF token because your session was not found.",
"path": "/commands/add"
}
If I log in and attach the cookies from the login request in the C# client code, I get the following error:
{
"timestamp":1497897646380,
"status":403,
"error":"Forbidden",
"message":"Could not verify the provided CSRF token because your session was not found.",
"path":"/commands/add"
}
My C# client code for the POST looks like this:
public String SendJsonCommandByPost(String url, string data)
{
try
{
WebRequest req = HttpWebRequest.Create(url);
req.Proxy = null;
req.Method = "POST";
req.Timeout = TIMEOUT;
((HttpWebRequest)req).CookieContainer = myCookieContainer;
PrintCookies(myCookieContainer);
req.Headers.Add("X-CSRF-TOKEN", _csrftoken);
req.ContentType = "application/json";
((HttpWebRequest)req).UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.874.121 Safari/535.2";
byte[] postdata = Encoding.UTF8.GetBytes(data);
req.ContentLength = postdata.Length;
Stream stream = req.GetRequestStream();
stream.Write(postdata, 0, postdata.Length);
stream.Flush();
stream.Close();
string source;
Console.WriteLine(req.Headers);
using (HttpWebResponse response = (HttpWebResponse)req.GetResponse())
{
using (StreamReader reader = new StreamReader(req.GetResponse().GetResponseStream()))
{
source = reader.ReadToEnd();
}
req.GetResponse().Close();
return source;
}
}
catch (Exception exp)
{
Console.WriteLine(exp);
if (exp is WebException)
{
var webexp = (WebException)exp;
Console.WriteLine(webexp.Response.Headers);
TextReader reader = new StreamReader(webexp.Response.GetResponseStream());
Console.WriteLine(reader.ReadToEnd());
}
return null;
}
}
May I know what could cause this kind of issue? Thank you!
Add this line:
http.csrf().disable();
CSRF protection is enabled by default, so your POST requests are being blocked. Try this; it works for me.
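If you'd rather keep CSRF protection enabled, the client has to send the CSRF token together with the session cookie the token was issued under. A minimal C# sketch of that idea (loginUrl and commandUrl are hypothetical placeholders; _csrftoken is the token field from the question); the key point is reusing one CookieContainer for both the login request and the POST:
// One shared cookie container, so the JSESSIONID issued during login
// is sent again with the POST; the CSRF token is only valid for the
// session it was issued in.
var cookies = new CookieContainer();
var loginRequest = (HttpWebRequest)WebRequest.Create(loginUrl);
loginRequest.CookieContainer = cookies;
// ... perform the form login here and capture _csrftoken from the response ...
var postRequest = (HttpWebRequest)WebRequest.Create(commandUrl);
postRequest.Method = "POST";
postRequest.ContentType = "application/json";
postRequest.CookieContainer = cookies;               // same container, same session
postRequest.Headers.Add("X-CSRF-TOKEN", _csrftoken);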

C# - Connection: keep-alive Header is Not Being Sent During HttpWebRequest redirect

I have a big problem when using HttpWebRequest to get content from my blog.
First, let's look at the code.
The request is made by a C# console application:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://localhost/MyBlog2014/Security/Login");
request.ProtocolVersion = HttpVersion.Version10;
var data = "Email=myemail&Password=1234567&keepMeOn=false";
byte[] send = Encoding.Default.GetBytes(data);
request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8";
request.Headers.Add("Accept-Language: ro-RO,ro;q=0.8,en-US;q=0.6,en;q=0.4");
request.Headers.Add("Cache-Control: max-age=0");
request.KeepAlive = true;
request.UnsafeAuthenticatedConnectionSharing = true;
request.ServicePoint.ConnectionLimit = 1;
request.ServicePoint.UseNagleAlgorithm = false;
request.ServicePoint.Expect100Continue = false;
request.Method = "POST";
request.UserAgent =
"Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/36.0.1985.143 Safari/537.36";
request.ContentType = "application/x-www-form-urlencoded";
request.ContentLength = send.Length;
request.AllowAutoRedirect = true;
Stream sout = request.GetRequestStream();
sout.Write(send, 0, send.Length);
sout.Flush();
sout.Close();
Console.WriteLine("\nThe HTTP request Headers for the first request are: \n{0}", request.Headers);
IAsyncResult result = (IAsyncResult)request.BeginGetResponse(new AsyncCallback(RespCallback), request);
Callback method
static void RespCallback(IAsyncResult asynchronousResult)
{
try
{
HttpWebRequest request = (HttpWebRequest)asynchronousResult.AsyncState;
WebResponse res = request.GetResponse();
var stream = new StreamReader(res.GetResponseStream());
Console.WriteLine(stream.ReadToEnd());
stream.Close();
res.Close();
Console.WriteLine();
}
catch (WebException e)
{
//do something
}
catch (Exception e)
{
//do something
}
}
Now, this code doesn't work the way I want. It reaches my Login method, saves the email in the session object, and then redirects to the Index action, and here my big problem appears: in the Index method the Session object is null.
Let's look at the code.
Login method
[HttpPost]
public ActionResult Login(LoginDto login)
{
// if the email and password are OK, save the email in the session
// then redirect to Index
// else show error message(s): return View(login);
}
Index method
[AuthorizeUser(false)]
public ActionResult Index(int pag = 1)
{
//I built the model
return View(new HomePageModel
{
CUrrentPage = pag,
Articles = model,
MaxPage = totalpagesArticles
});
}
So I don't receive the content from the Index method because I am not authorized.
The code that saves the email in the session object is this:
public void SaveDetaliiUser(SessionUserDetails userDetails)
{
HttpContext.Current.Session[SessionDetaliiUtilizator] = userDetails;
}
Before writing here I searched the net for a solution to my problem, and the following links didn't help me:
How to send KeepAlive header correctly in c#?
C# - Connection: keep-alive Header is Not Being Sent During HttpWebRequest
Keep a http connection alive in C#?
http://social.msdn.microsoft.com/Forums/en-US/87bc7029-ce23-438a-a767-f7c32dcc63a7/how-to-keep-connection-live-while-using-httpwebrequest?forum=netfxnetcom
Thank you in advance,
Marian
Finally, I solved the problem. It seems that to keep the session during the redirect I had to set request.CookieContainer = new CookieContainer();
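In other words, the ASP.NET session cookie has to be stored and resent on the redirected request. A minimal sketch of where that line goes in the original request setup (everything else unchanged):
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://localhost/MyBlog2014/Security/Login");
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.AllowAutoRedirect = true;
// Without a CookieContainer the ASP.NET_SessionId cookie issued by the Login action
// is discarded, so the redirected GET to Index starts a brand-new, empty session.
request.CookieContainer = new CookieContainer();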

Scraping a dynamic page with cookies

I am trying to scrape this page for a set of zipcodes.
https://www.chase.com/mortgage/loan-officer/search-results.html#action-search;zipcode-11747;lastname-;language-
If you put that in your browser you will get results; however, trying to do so in code fails.
First I tried:
HttpWebRequest request = (HttpWebRequest)System.Net.WebRequest.Create(URI);
var resp = (HttpWebResponse)request.GetResponse();
var sr = new System.IO.StreamReader(resp.GetResponseStream());
string page = sr.ReadToEnd().Trim();
but the code below, generated by a Fiddler plugin, didn't work either; no results are returned. So what exactly am I missing?
private void MakeRequests()
{
HttpWebResponse response;
string responseText;
if (Request_www_chase_com(out response))
{
responseText = ReadResponse(response);
response.Close();
}
}
private static string ReadResponse(HttpWebResponse response)
{
using (Stream responseStream = response.GetResponseStream())
{
Stream streamToRead = responseStream;
if (response.ContentEncoding.ToLower().Contains("gzip"))
{
streamToRead = new GZipStream(streamToRead, CompressionMode.Decompress);
}
else if (response.ContentEncoding.ToLower().Contains("deflate"))
{
streamToRead = new DeflateStream(streamToRead, CompressionMode.Decompress);
}
using (StreamReader streamReader = new StreamReader(streamToRead, Encoding.UTF8))
{
return streamReader.ReadToEnd();
}
}
}
private bool Request_www_chase_com(out HttpWebResponse response)
{
response = null;
try
{
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://www.chase.com/mortgage/loan-officer/search-results.html");
request.KeepAlive = true;
request.Headers.Set(HttpRequestHeader.CacheControl, "max-age=0");
request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8";
request.UserAgent = "Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/32.0.1700.76 Safari/537.36";
request.Headers.Add("DNT", #"1");
request.Referer = "https://mail.google.com/mail/u/0/?shva=1";
request.Headers.Set(HttpRequestHeader.AcceptEncoding, "gzip,deflate,sdch");
request.Headers.Set(HttpRequestHeader.AcceptLanguage, "en-US,en;q=0.8");
request.Headers.Set(HttpRequestHeader.Cookie, @"v1st=3B46E5CCD302C2DE; marketlist=68|90|152|170|198; chasezip=zipcode=11577&county=Nassau&state=NY; ASP.NET_SessionId=kwybehscfioasswbl20wb14f; PC_1_0=n%3Dundefined|u%3Dundefined|l%3Dundefined|zip%3D11577|lastUpdate%3D2014-01-24|lastSent%3D2014-01-24|home%3Dpersonal|; SessionPersistence=CLICKSTREAMCLOUD%3A%3DvisitorId%3D%7CPROFILEDATA%3A%3D%7CSURFERINFO%3A%3Dbrowser%3DChrome%2COS%3DWindows%2Cresolution%3D1366x768%7C; fsr.s=%7B%22v2%22%3A-2%2C%22v1%22%3A1%2C%22rid%22%3A%22d464cf6-82273859-c860-572f-2944b%22%2C%22to%22%3A5%2C%22c%22%3A%22https%3A%2F%2Fwww.chase.com%2Fmortgage%2Floan-officer%2Fsearch-results.html%23action-search%3Bzipcode-11747%3Blastname-%3Blanguage-%22%2C%22pv%22%3A12%2C%22lc%22%3A%7B%22d18%22%3A%7B%22v%22%3A12%2C%22s%22%3Atrue%7D%7D%2C%22cd%22%3A18%2C%22sd%22%3A18%2C%22f%22%3A1390649574789%7D");
request.IfModifiedSince = DateTime.Parse("Fri, 24 Jan 2014 20:18:51 GMT");
response = (HttpWebResponse)request.GetResponse();
}
catch (WebException e)
{
if (e.Status == WebExceptionStatus.ProtocolError) response = (HttpWebResponse)e.Response;
else return false;
}
catch (Exception)
{
if (response != null) response.Close();
return false;
}
return true;
}
To make this work you'd need to parse the HTML and then download and run the JavaScript. Instead of writing your own browser, use a WebBrowser control to load the page and then scrape its inner HTML.
The page uses AJAX to build the results, so all you will see in the raw response is the initial HTML.
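A minimal WinForms sketch of the WebBrowser approach suggested above; it assumes an STA thread (the control requires one) and simply dumps the rendered HTML once the document has loaded. The AJAX-built results may still need an extra delay or polling before they appear in the DOM:
[STAThread]
static void Main()
{
    var url = "https://www.chase.com/mortgage/loan-officer/search-results.html#action-search;zipcode-11747;lastname-;language-";
    var browser = new System.Windows.Forms.WebBrowser { ScriptErrorsSuppressed = true };
    browser.DocumentCompleted += (s, e) =>
    {
        // Initial document is ready here; poll browser.Document for the AJAX results if needed.
        Console.WriteLine(browser.DocumentText);
        System.Windows.Forms.Application.ExitThread();
    };
    browser.Navigate(url);
    System.Windows.Forms.Application.Run(); // pump messages so navigation and scripts can run
}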
