Invalid PUT method from WebForms to Web API 2 (Azure) - C#

I have a Web API on my Azure server and I'm making calls to it from an ASP.NET WebForms website.
I can perform GET with no trouble. Now for the PUT, it's giving me this error:
The page you are looking for cannot be displayed because an invalid method (HTTP verb) is being used
I was not able to DELETE either. I've seen other topics where people disable WebDAV and related handlers on their IIS servers and it works. But how do I do that on Azure?
Below is my code for the PUT:
// Use await consistently instead of mixing .Result with await,
// which can deadlock in an ASP.NET WebForms synchronization context.
HttpResponseMessage response = await client.GetAsync("api/People/" + id);
if (response.IsSuccessStatusCode)
{
    var yourcustomobjects = await response.Content.ReadAsAsync<People>();
    Uri peopleUrl = response.Headers.Location;
    yourcustomobjects.name = "Bob";
    response = await client.PutAsJsonAsync(peopleUrl, yourcustomobjects);
    tbDebug.Text += await response.Content.ReadAsStringAsync();
}

Alright, I grew tired of trying to fix this issue by enabling PUT. So what I did instead was write a GET that makes the needed change in the database.
Cheers
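For what it's worth, the WebDAV change people mention for on-premises IIS can also be applied to an Azure Web App, since App Service on Windows runs IIS and picks up the web.config deployed with the project. A minimal sketch of the commonly cited fix (assuming a standard ASP.NET Web API web.config), removing the WebDAV module and handler so PUT and DELETE requests reach Web API:

<!-- web.config: remove WebDAV so IIS passes PUT and DELETE through to Web API -->
<configuration>
  <system.webServer>
    <modules>
      <remove name="WebDAVModule" />
    </modules>
    <handlers>
      <remove name="WebDAV" />
    </handlers>
  </system.webServer>
</configuration>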

Related

Adding SSL cert causes 404 only in browser calls

I am working in an internal corporate environment. We have created a Web API installed on IIS on port 85. We call this from another MVC HelperApp on port 86. It all works as expected. Now we want to tighten security, so we add an SSL cert to IIS on port 444 and bind it to our API.
Initially we test it with Postman, SoapUI, and a C# console app, and it all works. But when we call it from our MVC HelperApp it sometimes returns a 404.
Deeper debugging: I put the code into a C# DLL (see below). Using the console app I call the Dll.PostAPI and it works as expected. Now I call that same Dll.PostAPI from the MVC HelperApp and it won't work. When I step through the code I make it as far as this line, await client.PostAsync(url, data);, and the code bizarrely ends; it doesn't return and it doesn't throw an exception. Same for POST and GET. I figure it makes the call and nothing comes back, no response and no error.
Also, if I change the url to "https://httpbin.org/post" or to the open HTTP port 85 on IIS, it works. I have concluded that the C# code is not the problem (but I'm open to being wrong).
Therefore I have come to the conclusion that, for some reason, the port or cert is refusing calls from browsers.
We are looking at:
the "Subject Alternative Name", but all the examples show WWW addresses, which we are not using;
the "Friendly Name" on the cert creation;
and CORS (Cross-Origin Resource Sharing).
These are all subjects we lack knowledge in.
This is the calling code, used exactly the same way in the console app and the web app:
var lib = new HttpsLibrary.ApiCaller();
lib.makeHttpsCall();
This is what's in the DLL that gets called:
public async Task<string> makeHttpsCall()
{
    try
    {
        List<Quote> quotes = new List<Quote>();
        quotes.Add(CreateDummyQuote());

        var json = JsonConvert.SerializeObject(quotes);
        var data = new StringContent(json, Encoding.UTF8, "application/json");

        var url = "https://httpbin.org/post";              // this works in a browser
        //url = "https://thepath:444//api/ProcessQuotes";  // 444 DOES NOT WORK from browsers only; OK in the console app
        //url = "http://thepath:85/api/ProcessQuotes";     // 85 works

        var client = new HttpClient();
        var response = await client.PostAsync(url, data);  // <<< this line never returns when called from the browser
        //var response = await client.GetAsync(url);       // same outcome for GET or POST

        var result = await response.Content.ReadAsStringAsync();
        return result;
    }
    catch (Exception)
    {
        throw; // rethrow so the caller sees the original exception
    }
}
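One observation, offered as a hedged note rather than a diagnosis from the thread: in the calling code above, makeHttpsCall() returns a Task<string> but is never awaited, so the web app continues without ever observing the response or any exception, which could account for the call appearing to end without returning or throwing. A minimal sketch of awaiting it from an MVC action (the CallApi action name is hypothetical; HttpsLibrary.ApiCaller is the class from the question):

// Hypothetical MVC controller action; requires System.Threading.Tasks and System.Web.Mvc.
public async Task<ActionResult> CallApi()
{
    var lib = new HttpsLibrary.ApiCaller();

    // Awaiting the task means the response, or any exception
    // (for example an SSL/TLS certificate validation failure), surfaces here.
    string result = await lib.makeHttpsCall();

    return Content(result);
}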

Autodesk Forge Error trying to access the API online

I have a problem loading a 3D model on an online server; the error is related to accessing the Forge API. Locally it works smoothly, but when it is mounted on the server or published as a website it shows the following error: "Failed to load resource: the server responded with a status of 404 (Not Found)", followed by "onDocumentLoadFailure() - errorCode:7".
As I mentioned, what I find strangest is that it works locally. Attached is the segment of code where the error occurs.
function getAccessToken() {
    var xmlHttp = null;
    xmlHttp = new XMLHttpRequest();
    xmlHttp.open("GET", '/api/forge/toke', false); // Address not found
    xmlHttp.send(null);
    return xmlHttp.responseText;
}
Thank you very much in advance.
Are you sure the code you're running locally and the code you've deployed are really the same?
The getAccessToken function doesn't seem to be correct, for several reasons:
First of all, there seems to be a typo in the URL - shouldn't it be /api/forge/token instead of /api/forge/toke?
More importantly, the HTTP request is asynchronous, meaning that it cannot return the response immediately after calling xmlHttp.send(). You can find more details about the usage of XMLHttpRequest in https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest/Using_XMLHttpRequest.
And finally, assuming that the function is passed to Autodesk.Viewing.Initializer options, it should return the token using a callback parameter passed to it (as shown in https://forge.autodesk.com/en/docs/viewer/v7/developers_guide/viewer_basics/initialization/#example).
With that, your getAccessToken should probably look more like this (using the more modern fetch and async/await):
async function getAccessToken(callback) {
    const resp = await fetch('/api/forge/token');
    const json = await resp.json();
    callback(json.access_token, json.expires_in);
}
I've already found the issue. When I deploy, I have to change the URL the request is made to so that it uses the public address or the domain name. For example: mywebsite.com/aplication-name/api/forge/token.

Is there a way to retrieve the String the way it is actually uploaded to the server (as a whole)?

I am currently working on an OAuth2 implementation. However, I am stuck on a 401 error. It seems like there is something wrong with my POST request that is supposed to retrieve the access token from the company the user logged in to. This is my code:
internal void RequestAccessToken(string code)
{
    string requestBody = "grant_type=" + WebUtility.UrlEncode(GRANTTYPE)
        + "&code=" + WebUtility.UrlEncode(code)
        + "&redirect_uri=" + WebUtility.UrlEncode(REDIRECT_URI);

    WebClient client = new WebClient();
    client.Headers.Add("Authorization", HeaderBase64Encode(CLIENT_ID, SECRETKEY));

    // UploadString performs the POST and returns the response body.
    var response = client.UploadString("https://thewebsiteiamcallingto.com/some/api", requestBody);

    // Note: OpenRead issues a second, separate GET request to the same URL;
    // it does not show what was posted above.
    var responseString = client.OpenRead("https://thewebsiteiamcallingto.com/some/api");
}
My questions are:
Is there anything wrong with the way I try to make the POST request?
Is there a way to retrieve the whole string that is posted to the URI using UploadString?
P.S. I have seen this post regarding the POST creation. However, I find the async part to be too complicated for my case.
Since we don't know the API documentation, I would suggest making a Postman request and viewing the actual request sent and the response received, and secondly making a request using your method, capturing it with a utility like Wireshark, and comparing the difference.
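As a complementary sketch (not part of the original answer), one way to see exactly what will be posted from C# is to build the body with FormUrlEncodedContent and read it back before sending. This assumes an async method and System.Net.Http; GRANTTYPE, REDIRECT_URI, CLIENT_ID, SECRETKEY, and HeaderBase64Encode are the questioner's own members, and the URL is the placeholder from the question:

// Sketch: inspect the exact form-encoded body before it goes over the wire.
var form = new Dictionary<string, string>
{
    ["grant_type"] = GRANTTYPE,
    ["code"] = code,
    ["redirect_uri"] = REDIRECT_URI
};
var content = new FormUrlEncodedContent(form);

// This is byte-for-byte what will be posted.
string outgoingBody = await content.ReadAsStringAsync();
Console.WriteLine(outgoingBody);

using (var client = new HttpClient())
{
    client.DefaultRequestHeaders.TryAddWithoutValidation("Authorization", HeaderBase64Encode(CLIENT_ID, SECRETKEY));
    var response = await client.PostAsync("https://thewebsiteiamcallingto.com/some/api", content);
    Console.WriteLine(await response.Content.ReadAsStringAsync());
}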

HttpClient GET request fails, POST Succeeds

In a Xamarin application (backed by ASP.NET WebApi), I'm having trouble getting [all of] my GET requests to succeed -- they return 404. In fact, when watching network traffic in Fiddler, I don't even see the request happen.
Here is [basically] what I'm doing:
public async Task<bool> ValidateSponsor(string attendeeId, string sponsorId)
{
    string address = String.Format("{0}/Sponsors/?attendeeId={1}&sponsorId={2}", BASE_URI, attendeeId, sponsorId);
    var response = await client.GetAsync(address);
    var content = response.Content;   // note: the property is Content, not content
    if (!response.IsSuccessStatusCode)
        throw new HttpRequestException("Check your network connection and try again.");
    string result = await content.ReadAsStringAsync();
    return Convert.ToBoolean(result);
}
If I copy out the address variable and paste it into a browser, it succeeds. POST requests (to different methods, of course) succeed. I've also tried using the PCL version of RestSharp but get the same results -- POST succeeds and GET fails.
Edit:
This looks like it may only be a problem when deployed to Azure; it works fine locally.
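No answer was recorded for this one in the thread; as a small hedged sketch, it can help to escape the query-string values and log the exact URI and status code, so the request the client actually sends can be compared with the URL that works when pasted into a browser (BASE_URI and client come from the question):

// Sketch: build the same request with escaped query values and log what is sent and received.
string address = String.Format(
    "{0}/Sponsors/?attendeeId={1}&sponsorId={2}",
    BASE_URI,
    Uri.EscapeDataString(attendeeId),
    Uri.EscapeDataString(sponsorId));

System.Diagnostics.Debug.WriteLine("GET " + address); // compare with the URL that works in the browser

var response = await client.GetAsync(address);
System.Diagnostics.Debug.WriteLine((int)response.StatusCode + " " + response.ReasonPhrase);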

Web Security in IE VS Chrome & Firefox (bug)

Why is WebSecurity working differently in different browsers?
Details:
I have two applications.
One is a simple HTML application and the other is an ASP.NET MVC4 Web API application. The projects are in the same solution, and I have set multiple start-up projects so both applications run at the same time.
Working version:
I have used WebSecurity in the Web API project. I did a full implementation of WebSecurity...
Login action code:
// POST api/company
[System.Web.Http.AcceptVerbs("Post")]
[System.Web.Http.HttpPost]
public HttpResponseMessage Login(LoginRequest loginRequest)
{
    try
    {
        if (WebSecurity.Login(loginRequest.EmailAddress, loginRequest.Password, true))
        {
            var userDetails = new string[2];
            userDetails[0] = loginRequest.EmailAddress;

            var currentUserRoles = Roles.GetRolesForUser(loginRequest.EmailAddress);
            userDetails[1] = currentUserRoles[0];

            HttpResponseMessage response =
                Request.CreateResponse(HttpStatusCode.Accepted, userDetails);
            return response;
        }
        else
        {
            HttpResponseMessage response =
                Request.CreateResponse(HttpStatusCode.Unauthorized);
            return response;
        }
    }
    catch (Exception)
    {
        HttpResponseMessage response =
            Request.CreateResponse(HttpStatusCode.Unauthorized);
        return response;
    }
}
*WebSecurity.Login* works in all browsers when I call the login method using Ajax.
But I have another method, in another controller, named CurrentDateAndUser.
Code:
[AllowAnonymous]
[System.Web.Http.AcceptVerbs("Get")]
[System.Web.Http.HttpGet]
public HttpResponseMessage CurrentDateAndUser()
{
    if (WebSecurity.IsAuthenticated)
    {
        int userId = WebSecurity.CurrentUserId;

        string[] currentDateAndUserId = new string[2];
        currentDateAndUserId[0] = userId.ToString();
        currentDateAndUserId[1] = DateTime.UtcNow.ToString();

        HttpResponseMessage response =
            Request.CreateResponse(HttpStatusCode.Accepted, currentDateAndUserId);
        return response;
    }

    HttpResponseMessage responseNew =
        Request.CreateResponse(HttpStatusCode.NotAcceptable);
    return responseNew;
}
Issue:
If I call the CurrentDateAndUser method from Internet Explorer using an Ajax call, everything works: WebSecurity.IsAuthenticated returns true.
However,
if I call the CurrentDateAndUser method from Google Chrome or Mozilla Firefox using an Ajax call, nothing works: WebSecurity.IsAuthenticated always returns false.
I don't know why. If you have any idea, please let me know.
I also found a similar problem (not sure if it is a real issue):
when I run my application with Fiddler, I see a different result.
When I call the CurrentDateAndUser method from IE, I can see the cookie/login values in the request (see the image above).
But when I call the CurrentDateAndUser method from Chrome and Firefox, I can't see the cookie values, meaning that the WebSecurity.IsAuthenticated property is returning false.
Is it a bug in WebSecurity?
Edit
My Ajax request code is
function GetCurrentUserId() {
    return $.ajax({
        method: 'GET',
        url: rootUrl + '/api/Common/CurrentDateAndUser',
        async: false
    }).success(function (response) {
        return response[0];
    }).error(function () {
        toastr.error('Something is wrong', 'Error');
    });
}
This request does not send the auth cookie values to the Web API method when I run the application in Chrome or Firefox; however, it does send the cookie values to the API method when run in IE.
I have posted the image; please take a look at it above.
The issue is not with WebSecurity at all; it's with the way you implement your security. You should never be storing a user ID, email, or anything important in the cookies.
I would suggest using the FormsAuthentication class to encrypt and decrypt your cookies, and even then only storing something such as the session ID plus a custom hash of that session ID to verify it when you decrypt the cookie.
Here is a site that gives a pretty good example: http://www.c-sharpcorner.com/uploadfile/nipuntomar/update-formsauthenticationticket/
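A minimal sketch of that suggestion, assuming an ASP.NET System.Web context; the ComputeHash helper and the cookie layout are hypothetical illustrations, not code from the thread:

// Sketch: store only a session ID plus a hash of it inside an encrypted forms-auth ticket.
using System;
using System.Web;
using System.Web.Security;

public static class AuthCookieHelper
{
    public static void SetAuthCookie(HttpResponseBase response, string userName, string sessionId)
    {
        // ComputeHash is a hypothetical helper; a real one would use a keyed hash (HMAC) with a server-side secret.
        string userData = sessionId + "|" + ComputeHash(sessionId);

        var ticket = new FormsAuthenticationTicket(
            1, userName, DateTime.Now, DateTime.Now.AddMinutes(30),
            false /* not persistent */, userData);

        string encrypted = FormsAuthentication.Encrypt(ticket);
        response.Cookies.Add(new HttpCookie(FormsAuthentication.FormsCookieName, encrypted)
        {
            HttpOnly = true
        });
    }

    public static string GetSessionId(HttpRequestBase request)
    {
        HttpCookie cookie = request.Cookies[FormsAuthentication.FormsCookieName];
        if (cookie == null) return null;

        FormsAuthenticationTicket ticket = FormsAuthentication.Decrypt(cookie.Value);
        string[] parts = ticket.UserData.Split('|');

        // Reject the cookie if the stored hash does not match the session ID.
        return parts.Length == 2 && parts[1] == ComputeHash(parts[0]) ? parts[0] : null;
    }

    private static string ComputeHash(string value)
    {
        using (var sha = System.Security.Cryptography.SHA256.Create())
        {
            byte[] bytes = sha.ComputeHash(System.Text.Encoding.UTF8.GetBytes(value));
            return Convert.ToBase64String(bytes);
        }
    }
}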
There are three things to consider here:
WebSecurity.IsAuthenticated actually returns the value of HttpRequest.IsAuthenticated, which is true if the Forms Authentication cookie has been set and is current. It is not available until the user makes the next request after successfully logging in, which is why you are seeing the behaviour you describe.
I remember reading on MSDN or somewhere that WebSecurity.IsAuthenticated does not work until the page is fully loaded. Meaning, if you log a user in on a page and check IsAuthenticated in the same flow of code, it will NOT return true. For IsAuthenticated to be true the page has to be reloaded, or, better practice, redirect the user to another secured page as soon as the login succeeds and check IsAuthenticated there.
We had the same issue with Chrome (version 21.0.1180). Even though we could see the expiration date in the header, some Chrome installations on Windows XP ignored it. When we removed the expiration date, Chrome kept the session cookie without problems.
So what to do is:
after login, try to check this on a new page, not on the same page;
also try to set the cookie explicitly:
System.Web.Security.FormsAuthentication.SetAuthCookie(user.Username, false);
I don't know if this will help or not, but I remember when I was learning jQuery Ajax.
I set up a simple project on my laptop. When I tested it, it worked fine in IE but failed in Chrome. After searching for hours, I found that Chrome would not allow AJAX requests from the local machine. When I tested it using an actual web server, it worked fine in both IE and Chrome.
So my question and advice is: are you testing on the same machine?
Try deploying it to a machine running a web server with a unique domain name and test your application!
