I am looking for a way to call the appropriate method (GET, POST, etc.) on an ApiController class based on the URL and request type, without making an HTTP request.
Background: We have an API application with numerous controllers that also needs to accept requests from a remote server. Due to restrictions I cannot control, there is no way to open ports between the two servers to let the remote server make the request directly, so we decided to forward the data over WebSockets (SignalR). I can send (within reason) whatever information is required.
I have tried the below:
HttpRequestMessage request = new HttpRequestMessage();
var bld = new UriBuilder
{
    Port = 123,
    Path = "api/v1/search",
    Query = "query=search_string"
};
request.RequestUri = bld.Uri;

// This is the same config that UseWebApi was called with and contains the routes.
var httpCfg = AppConfiguration.Get().HttpConfig;
var route = httpCfg.Routes.GetRouteData(request);

var controllerSelector = new DefaultHttpControllerSelector(httpCfg);
var descriptor = controllerSelector.SelectController(request);
route contains the controller name (search), but the call to SelectController throws an exception with a 404 response in it (I presume this means I am missing something from the fake request). The same URI works when sent as a direct HTTP request, so as far as I can tell the routes do work.
Is there a better way to do this, or if not, what am I missing from the request that is causing the 404?
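For reference, rather than re-implementing controller selection by hand, one way that may be worth trying is to exercise the whole Web API pipeline (routing, controller selection, action invocation) without a network request, by wrapping the configuration in an HttpServer and invoking it in memory. A minimal, untested sketch, assuming the same httpCfg as above and that the port/path/query come from the data forwarded over SignalR:
// Minimal sketch (untested): dispatch a request through the Web API pipeline in memory.
// httpCfg is assumed to be the same HttpConfiguration that UseWebApi was called with.
var request = new HttpRequestMessage(HttpMethod.Get,
    "http://localhost:123/api/v1/search?query=search_string");

var server = new HttpServer(httpCfg);                                 // System.Web.Http
var invoker = new HttpMessageInvoker(server, disposeHandler: false);  // System.Net.Http

HttpResponseMessage response = await invoker.SendAsync(request, CancellationToken.None);
string body = await response.Content.ReadAsStringAsync();
This keeps controller selection, filters and model binding in Web API's hands instead of your own.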
I'm trying to connect to a finicky API using RestSharp. The API uses OAuth 1.0 and, on the initial Request Token call, requires the oauth_callback parameter ONLY in the query string and not in the Authorization header (I have confirmed this with Postman).
When I construct the request this way:
var Authenticator = OAuth1Authenticator.ForRequestToken(mc_apiKey, mc_appsecret);
Authenticator.ParameterHandling = OAuthParameterHandling.HttpAuthorizationHeader;
Authenticator.SignatureMethod = OAuthSignatureMethod.PlainText;
client.Authenticator = Authenticator;
var request = new RestRequest(RequestToken, Method.POST);
string AuthorizationCallBackURL = string.Format(LoopbackCallback);
request.AddParameter(_oauthCallback, AuthorizationCallBackURL, ParameterType.QueryStringWithoutEncode);
and then look at the logs on the server, I see the query string in the HTTP call,
http://192.168.0.187:8080/xxxx/ws/oauth/initiate?oauth_callback=http://192.168.0.187:8080/provider_emailer/callback.jsp
but it is also in the Authorization header:
Headers:
{Accept=[application/json, text/json, text/x-json, text/javascript, application/xml, text/xml],
accept-encoding=[gzip, deflate],
Authorization=[OAuth oauth_callback="http://192.168.0.187:8080/provider_emailer/callback.jsp",
oauth_consumer_key="XXXXXXXXXXXX",
oauth_nonce="cei09xm04qetk2ce",
oauth_signature="XXXXXXXXXXXXXXXX",
oauth_signature_method="PLAINTEXT",
oauth_timestamp="1591197088",
oauth_version="1.0"],
Content-Length=[0], Content-Type=[null],
cookie=[JSESSIONID=C8C8DB501382F7D1E52FE436600094C0],
host=[192.168.0.187:8080], user-agent=[RestSharp/106.11.4.0]}
This causes a "NotAcceptable" response. The same request done with Postman without the callback parameter in the Authorization header works.
Am I doing something wrong? Is there a way to only get the callback in the query string?
That's tricky. I looked at the code, and we don't set the callback URL on the workflow object when you use the overload without that parameter. So, what you're doing is conceptually correct.
However, we must collect all the parameters (default and request parameters) plus OAuth parameters to build the OAuth signature. We then take the parameter collection and move everything whose name starts with oauth_ or xauth_ into the request headers (when you use HttpAuthorizationHeader), and by doing so we put your query parameter into the header as well.
Apparently that's not ideal, and it looks like a bug, so I suggest opening an issue in the RestSharp repository. It should not be hard to fix.
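Until that is addressed, one possible workaround (untested, and only a sketch) is to embed oauth_callback directly in the resource URL instead of registering it with AddParameter, so the authenticator never sees it in the parameter collection and therefore never copies it into the Authorization header:
// Untested workaround sketch: keep oauth_callback out of request.Parameters so the
// OAuth1 authenticator does not lift it into the Authorization header.
// Whether the callback value needs URL-encoding here depends on your server
// (the original code used ParameterType.QueryStringWithoutEncode).
var request = new RestRequest(
    RequestToken + "?oauth_callback=" + AuthorizationCallBackURL,
    Method.POST);
// No AddParameter call for oauth_callback; everything else stays as in the original code.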
I have a list of HTTP URLs and I need to find out whether an HTTPS URL is available for each of them. Example: for http://www.apra.gov.au/Insight/Pages/insight-issue2-2017.html I need to check whether HTTPS is available on the same domain, via C# code. Since I have a list of about 5k HTTP URLs, I need to verify whether all of these URLs are available over HTTPS.
You can probably do a simple string replace (http: → https:) and then loop through them all, issuing an HTTP GET to check:
Pseudo code:
var httpClient = new HttpClient();
foreach (var url in urls)
{
    var httpsUrl = url.Replace("http:", "https:");
    var response = await httpClient.GetAsync(httpsUrl); // check response.IsSuccessStatusCode
}
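With roughly 5,000 URLs you will probably also want to limit each check to headers and run the checks concurrently. A sketch along those lines (the HasHttpsAsync/CheckAllAsync names, the HEAD request, the timeout and the concurrency limit are all assumptions, not anything the answer above prescribes):
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

class HttpsChecker
{
    static readonly HttpClient Client = new HttpClient { Timeout = TimeSpan.FromSeconds(10) };

    // Returns true if the https:// counterpart of an http:// URL answers successfully.
    static async Task<bool> HasHttpsAsync(string httpUrl)
    {
        var httpsUrl = httpUrl.Replace("http://", "https://");
        try
        {
            // HEAD avoids downloading the body; some servers reject HEAD, in which
            // case falling back to GET may be necessary.
            using (var request = new HttpRequestMessage(HttpMethod.Head, httpsUrl))
            using (var response = await Client.SendAsync(request))
                return response.IsSuccessStatusCode;
        }
        catch (HttpRequestException) { return false; }  // TLS/DNS/connection failure
        catch (TaskCanceledException) { return false; } // timeout
    }

    // Checks all URLs with at most 20 requests in flight at a time.
    static async Task<IDictionary<string, bool>> CheckAllAsync(IEnumerable<string> urls)
    {
        var throttle = new SemaphoreSlim(20);
        var tasks = urls.Select(async url =>
        {
            await throttle.WaitAsync();
            try { return (url: url, ok: await HasHttpsAsync(url)); }
            finally { throttle.Release(); }
        });
        return (await Task.WhenAll(tasks)).ToDictionary(r => r.url, r => r.ok);
    }
}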
I need to effect a type of reverse proxy from C# code. (Yes, I know that IIS has a reverse proxy, but for several reasons, I need to do this from code.)
So, my controller action will "relay" the inbound request to another URL, then return the response. Kind of like this:
public string Proxy()
{
    // This would be an extension method; it's currently hypothetical
    var newRequest = Request.GetRequestToNewUrl("http://newurl.com");

    // Make the request and send back whatever we get
    var response = newRequest.GetResponse();
    using (var reader = new StreamReader(response.GetResponseStream(), Encoding.Something))
    {
        return reader.ReadToEnd();
    }
}
The proxied request (to newurl.com) should be identical to the inbound request (headers, body, cookies, etc.), just to a different URL.
I've been playing around with it, and it's more complex than I thought. The inbound Request is an HttpRequestBase, and the proxy request will be an HttpWebRequest. They are fundamentally different types, and there's no direct translation between the two. So far, it's been a tedious process of copying and translating properties.
Before I spend a ton of time debugging all this, is there an easier way? There are a fair number of different types to represent an HTTP request:
HttpRequestBase
HttpWebRequest
HttpRequest
HttpRequestWrapper
Is there a way I'm not aware of to simply "reuse" the inbound request, while changing the URL? Or should I continue with my translation from HttpRequestBase?
Yes, it is possible. You can reuse the request content from the incoming request and then forward it by creating a new request of your own. Create a new client whose base address is where the request is supposed to be forwarded, do a GET or POST with the new HttpClient, and just return the result.
var client = new HttpClient
{
    BaseAddress = new Uri(destinationBaseAddress)
};

return await client.PostAsync(requestUrl, Request.Content);
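If you also need to preserve the inbound method, headers and cookies (as the question describes), here is a slightly fuller sketch. It assumes a Web API controller, where Request is an HttpRequestMessage; the destination URL is illustrative:
// Sketch: forward the inbound Web API request to another host, preserving method,
// body and most headers. "http://newurl.com" stands in for the real destination.
private static readonly HttpClient Client = new HttpClient();

public async Task<HttpResponseMessage> Proxy()
{
    var destination = new Uri(new Uri("http://newurl.com"), Request.RequestUri.PathAndQuery);

    var forwarded = new HttpRequestMessage(Request.Method, destination)
    {
        Content = Request.Content // reuses the inbound body as-is
    };

    // Copy request headers; cookies travel in the Cookie header.
    foreach (var header in Request.Headers)
        forwarded.Headers.TryAddWithoutValidation(header.Key, header.Value);

    // Clear the copied Host header so HttpClient sets one that matches the new destination.
    forwarded.Headers.Host = null;

    // Returning the HttpResponseMessage lets Web API relay status, headers and body.
    return await Client.SendAsync(forwarded, HttpCompletionOption.ResponseHeadersRead);
}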
I have an HTTP handler which is registered and working fine. Now I want to process a request and send a custom HTML response which is then shown on the client.
So my function is written as follows:
public void ProcessRequest(HttpContext _context)
{
    HttpResponse response = _context.Response;
    response.Clear();

    var requestedUrl = _context.Request.Url;
    PhantomModuleController pmc = new PhantomModuleController();

    response.BufferOutput = true;
    var snapshot = pmc.DoThings(requestedUrl); //this returns a string
    response.Write(snapshot); //i put it in the response
    response.ContentType = "text/html";
    response.End(); //it should send it to the client now
}
But according to Fiddler, the response never arrives at the client. In fact, the HttpResponse is never even sent.
Did I forget something?
Because the response is not showing up in Fiddler as being sent back to the client (nor is an error sent back), the routing engine may be getting in the way of the request. The scenario is described by Phil Haack:
However, there are other cases where you might have requests for files
that don’t exist on disk. For example, if you register an HTTP Handler
directly to a type that implements IHttpHandler. Not to mention
requests for favicon.ico that the browser makes automatically. ASP.NET
Routing attempts to route these requests to a controller. One solution
to this is to add an appropriate ignore route to indicate that routing
should ignore these requests. Unfortunately, we can’t do something
like this: {*path}.aspx/{*pathinfo}
You need to set up the routing engine to ignore the routes that have the file extension, e.g.:
routes.IgnoreRoute("{*allaspx}", new {allaspx=#".*\.aspx(/.*)?"});
routes.IgnoreRoute("{*favicon}", new {favicon=#"(.*/)?favicon.ico(/.*)?"});
I have an application. It sends a request to my proxy class. The proxy must parse the HTTP header string (I have done this) and resend the request to the server to get a video.
First, the media component connects to the proxy:
var uri = new Uri("http://127.0.0.1:2233/files/1.mp4");
videoPlayer.Source = uri;
Play();
The proxy gets the HTTP header string:
"GET /files/1.mp4 HTTP/1.1\r\nCache-Control: no-cache\r\nConnection: Keep-Alive\r\nPragma: getIfoFileURI.dlna.org\r\nAccept: */*\r\nUser-Agent: NSPlayer/12.00.7601.17514 WMFSDK/12.00.7601.17514\r\nGetContentFeatures.DLNA.ORG: 1\r\nHost: 127.0.0.1:2233\r\n\r\n"
I replace the host:
"GET /files/1.mp4 HTTP/1.1\r\nCache-Control: no-cache\r\nConnection: Keep-Alive\r\nPragma: getIfoFileURI.dlna.org\r\nAccept: */*\r\nUser-Agent: NSPlayer/12.00.7601.17514 WMFSDK/12.00.7601.17514\r\nGetContentFeatures.DLNA.ORG: 1\r\nHost: myserver.ru\r\n\r\n"
Now the proxy must get the video from the server. What must I do?
When using .NET, you don't have to manually create the HTTP message itself. Instead, use the classes in the System.Net.Http namespace to form and send an HTTP message and process the response.
For example, sending an HTTP GET message to a URL can be as simple as:
var uri = new Uri("http://www.foobar.com/");
var client = new HttpClient();
string body = await client.GetStringAsync(uri);
Note that this general approach will download the entire contents of the resource at the given URI. In your case, you may not want to wait for the whole video to download before you start playing/processing/storing it. In that case, you might want to use HttpClient.GetStreamAsync (or HttpContent.ReadAsStreamAsync on a response), which returns a stream you can read from until the stream closes.
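A sketch of that streaming approach, where "clientStream" is only a placeholder for whatever stream your proxy writes its response to (for example the NetworkStream of the accepted socket), and the upstream URL mirrors the rewritten Host header:
// Sketch: fetch the upstream video as a stream and relay the bytes to the connected client.
// Writing the HTTP status line and response headers back to the player is omitted here.
var client = new HttpClient();
using (Stream upstream = await client.GetStreamAsync("http://myserver.ru/files/1.mp4"))
{
    await upstream.CopyToAsync(clientStream); // clientStream is hypothetical, see above
}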