Opening a WebSocket channel inside an MVC controller - C#

Has anyone had any good experience with opening a WebSocket connection inside an MVC controller?
Technology stack: ASPNET Core 1.0 (RC1) MVC, dnx46, System.Net.WebSockets
Why MVC instead of middleware: for overall consistency, routing, already-injected repositories, and the option to call private methods in the same controller.
[HttpGet("v1/resources/{id}")]
public async Task<IActionResult> GetAsync(string id)
{
var resource = await this.repository.GetAsync(id);
if (resource == null)
{
return new HttpStatusCodeResult(404);
}
if (this.HttpContext.WebSockets.IsWebSocketRequest)
{
var webSocket = await this.HttpContext.WebSockets.AcceptWebSocketAsync();
if (webSocket != null && webSocket.State == WebSocketState.Open)
{
while (true)
{
var response = string.Format("Hello! Time {0}", System.DateTime.Now.ToString());
var bytes = System.Text.Encoding.UTF8.GetBytes(response);
await webSocket.SendAsync(new System.ArraySegment<byte>(bytes),
WebSocketMessageType.Text, true, CancellationToken.None);
await Task.Delay(2000);
}
}
}
return new HttpStatusCodeResult(101);
}
Question: are there any known downsides to going this way instead of handling WebSocket connections in middleware? And how about the handshake: do we need to do anything else besides returning the HTTP 101 status code?
Update 1: why not SignalR? There is no need for fallback techniques here, so while it's a good product, I see no benefit in adding an extra dependency in this situation.
Update 2: one downside I've already noticed - when the while(true) exits (not shown in the example above for simplicity; let's say the channel needs to be closed), the method needs to return something (a Task). What should it be? An HTTP 200 response? I guess not, because the WebSockets documentation says nothing should be sent after the "close" frame.
Update 3: one thing I learned the hard way: if you want WebSockets to work while debugging in Visual Studio 2015 with IIS Express 10.0 on Windows 10, you still have to use https://github.com/aspnet/WebSockets and configure app.UseWebSockets() in your Startup.cs file. Otherwise IsWebSocketRequest will be false. Does anyone know why? The handshake?
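For reference, the wiring that update refers to is roughly this (a minimal sketch; the package was named Microsoft.AspNet.WebSockets.Server around RC1 and was later renamed to Microsoft.AspNetCore.WebSockets):
// Startup.cs - minimal sketch of the middleware registration mentioned above.
public void Configure(IApplicationBuilder app)
{
    // Performs the WebSocket handshake for upgrade requests; without it,
    // HttpContext.WebSockets.IsWebSocketRequest stays false in this setup.
    app.UseWebSockets();

    app.UseMvc();
}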

Seems fine.
You probably want to change the while(true) to while (!HttpContext.RequestAborted.IsCancellationRequested) so you detect client disconnects and end the request.
You don't need to check for null or the state of the websocket after you call accept.
I'm assuming all of that code is temporary and you'll actually be reading something from the websocket.
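A minimal sketch of what the adjusted loop could look like (the payload and delay are just the placeholders from the question, not a recommendation):
var webSocket = await this.HttpContext.WebSockets.AcceptWebSocketAsync();

// Loop until the client disconnects instead of spinning forever; no null or
// state check is needed after a successful accept.
while (!this.HttpContext.RequestAborted.IsCancellationRequested)
{
    var bytes = System.Text.Encoding.UTF8.GetBytes(
        string.Format("Hello! Time {0}", System.DateTime.Now));
    await webSocket.SendAsync(new System.ArraySegment<byte>(bytes),
        WebSocketMessageType.Text, true, CancellationToken.None);
    await Task.Delay(2000);
}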
All the usual websocket rules apply:
Use SSL (when you're hosting it for real)
It won't work on multiple servers (it's a point to point socket connection)
You need to support handling partial frames (see the sketch below). You can punt on this if you know the client won't send any.
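For completeness, a hedged sketch of a receive loop that reassembles partial frames; the buffer size and the way the message is consumed are illustrative only:
var buffer = new byte[4 * 1024];
var message = new System.IO.MemoryStream();

while (!HttpContext.RequestAborted.IsCancellationRequested)
{
    WebSocketReceiveResult result;
    do
    {
        // A single logical message can arrive split across several frames;
        // keep reading until EndOfMessage is set.
        result = await webSocket.ReceiveAsync(
            new System.ArraySegment<byte>(buffer), HttpContext.RequestAborted);
        message.Write(buffer, 0, result.Count);
    }
    while (!result.EndOfMessage);

    if (result.MessageType == WebSocketMessageType.Close)
    {
        break;
    }

    // ... handle message.ToArray() here ...
    message.SetLength(0);
}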

Related

Possible to get the response of each API call inside a ForEach without waiting for the end of the loop?

I need to return the response of each API call inside a ForEach loop without waiting for the loop to end.
The first API gets data from SharePoint and calls another API to update the website database:
[HttpGet]
[Route("GetAllFaqs")]
public async Task<ApiResult<List<FaqModel>>> GetAllFaqs()
{
    ApiSuccessResult<List<FaqModel>> faqs = await sharepointConnector.GetAllFaqsAsync();
    foreach (var faq in faqs.Response)
    {
        // Here I want to return response of this Api call
        // so I could show on client side one by one
        await UpdateFaq(faq);
    }
    return faqs;
}
The other API updates the website database:
[HttpPost]
[Route("UpdateFaq")]
public async Task<ApiResult<FaqSyncResult>> UpdateFaq(FaqModel model)
{
    ApiSuccessResult<FaqSyncResult> result = await websiteConnector.SyncFaq(model);
    return result;
}
Is it possible to return a response inside the ForEach loop without stopping it and without waiting for the end?
This is usually not how it works.
There are two things that prevent you from taking this approach:
most JavaScript libraries used to fetch the data await the full response from the API
most JavaScript libraries used to process the data work with a full set of data
An alternative approach to get this behaviour would be to push the data from the server to the client using HTTP/2, WebSockets, or SignalR.
I've implemented the SignalR variant quite a few times and it works really well. Here is an example of a chat application.
The way it works is basically this: a hub sends data to the client (which is connected through the SignalR library) and calls a piece of JavaScript with the appropriate data on the client side.
I must warn you, however, that this will break the "general-purpose" idea of the API, because it relies heavily on a client accepting the data this way.
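A rough sketch of how that push could look with ASP.NET Core SignalR; FaqHub, the "FaqSynced" method name, and the injected IHubContext are illustrative assumptions layered on top of the question's types, not part of the original code:
// Requires Microsoft.AspNetCore.SignalR; the hub itself can stay empty because
// the server only pushes to clients here.
public class FaqHub : Hub { }

[HttpGet]
[Route("GetAllFaqs")]
public async Task<ApiResult<List<FaqModel>>> GetAllFaqs([FromServices] IHubContext<FaqHub> hub)
{
    ApiSuccessResult<List<FaqModel>> faqs = await sharepointConnector.GetAllFaqsAsync();
    foreach (var faq in faqs.Response)
    {
        var result = await UpdateFaq(faq);
        // Push each result to connected clients as soon as it is available,
        // instead of making them wait for the whole loop to finish.
        await hub.Clients.All.SendAsync("FaqSynced", result);
    }
    return faqs;
}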

RestRequestAsyncHandle.Abort() won't trigger CancellationToken.IsCancellationRequested

I have two C# apps running simultaneously. One is a console app that sends REST calls to the other, which runs a Nancy self-hosted REST server.
Here's a really basic view of some of the code of both parts.
RestsharpClient:
public async void PostAsyncWhile()
{
    IRestClient RestsharpClient = new RestClient("http://localhost:50001");
    var request = new RestRequest($"/nancyserver/whileasync", Method.POST);
    var asyncHandle = RestsharpClient.ExecuteAsync(request, response =>
    {
        Console.WriteLine(response.Content.ToString());
    });
    await Task.Delay(1000);
    asyncHandle.Abort();//PROBLEM
}
NancyRestServer:
public class RestModule : NancyModule
{
    public RestModule() : base("/nancyserver")
    {
        Post("/whileasync", async (args, ctx) => await WhileAsync(args, ctx));
    }

    private async Task<bool> WhileAsync(dynamic args, CancellationToken ctx)
    {
        do
        {
            await Task.Delay(100);
            if (ctx.IsCancellationRequested)//PROBLEM
            {
                break;
            }
        }
        while (true);
        return true;
    }
}
The client sends a command to start a while loop on the server, waits for one second, and then aborts the async call.
The problem I'm having is that asyncHandle.Abort() doesn't seem to trigger ctx.IsCancellationRequested on the server's side.
How am I supposed to cancel/abort an async call on a nancy host server using a restsharp client?
There are a couple of parts that go into cancelling a web request.
First, the client must support cancellation and use that cancellation to clamp the connection closed. So the first thing to do is to monitor the network packets (e.g., using Wireshark) and make sure RestSharp is sending an RST (reset) and not just closing its side of the connection. An RST is a signal to the other side that you want to cancel the request; closing the connection is a signal that you're done sending data but are still willing to receive a response.
Next, the server must support detecting cancellation. Not all servers do. For quite a while, ASP.NET (pre-Core) did not properly support client-initiated request cancellation. There was some kind of race condition where it wouldn't cancel properly (don't remember the details), so they disabled that cancellation path. This has all been sorted out in ASP.NET Core, but AFAIK, it was never fixed in ASP.NET Legacy Edition.
If RestSharp is not sending an RST, then open an issue with that team. If Nancy is not responding to the RST, then open an issue with that team. If it's an ASP.NET issue, the Nancy team should know that (and respond informing you they can't fix it), in which case you're out of luck. :/
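For comparison only (this is HttpClient, not the asker's RestSharp code), client-side cancellation typically looks like the sketch below; whether the server's token fires still depends on the server-side support described above:
using (var client = new HttpClient())
using (var cts = new CancellationTokenSource(TimeSpan.FromSeconds(1)))
{
    try
    {
        // The token tears the request down after one second; the server only
        // notices if it actually observes client disconnects.
        var response = await client.PostAsync(
            "http://localhost:50001/nancyserver/whileasync",
            new StringContent(string.Empty), cts.Token);
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
    catch (OperationCanceledException)
    {
        Console.WriteLine("Request cancelled client-side.");
    }
}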

Implement sending Server Sent Events in C# (no ASP.NET / MVC / ...)

For a project, I need to implement SSE (Server-Sent Events) in a C# application. Although this may sound easy, I've got no clue how to solve it.
As I'm new to C# (though not new to programming in general), I took an excursion to Google and looked for some sample code. From what I've seen so far, I could learn how to build an HTTP server with C# or how to consume server-sent events, but I found nothing about sending SSEs.
What I'm trying to get my head around: how can I keep sending updated data over the incoming request? Normally you get a request, do your thing, and reply. Done, connection closed. In this case, though, I want to "stick" to the response stream and send new data through it each time an event in my application fires.
The problem, for me, lies in this event-based approach: it's not some interval-based polling and updating. It's rather that the app goes, "Hey, something happened. I really should tell you about it!"
TL;DR: how can I hold on to that response stream and send updates - not based on loops or timers, but each time certain events fire?
Also, before I forget: I know there are libraries out there doing just that. But from what I've seen so far (and from what I've understood; correct me if I'm wrong), those solutions depend on ASP.NET / MVC / you name it. And as I'm just writing a "plain" C# application, I don't think I meet those requirements.
As for a lightweight server, I would go with an OWIN self-hosted Web API (https://learn.microsoft.com/en-us/aspnet/web-api/overview/hosting-aspnet-web-api/use-owin-to-self-host-web-api).
A simple server-sent events action would basically look like this:
public class EventController : ApiController
{
    public HttpResponseMessage GetEvents(CancellationToken clientDisconnectToken)
    {
        var response = Request.CreateResponse();
        response.Content = new PushStreamContent(async (stream, httpContent, transportContext) =>
        {
            using (var writer = new StreamWriter(stream))
            {
                using (var consumer = new BlockingCollection<string>())
                {
                    var eventGeneratorTask = EventGeneratorAsync(consumer, clientDisconnectToken);
                    foreach (var @event in consumer.GetConsumingEnumerable(clientDisconnectToken))
                    {
                        await writer.WriteLineAsync("data: " + @event);
                        await writer.WriteLineAsync();
                        await writer.FlushAsync();
                    }
                    await eventGeneratorTask;
                }
            }
        }, "text/event-stream");
        return response;
    }

    private async Task EventGeneratorAsync(BlockingCollection<string> producer, CancellationToken cancellationToken)
    {
        try
        {
            while (!cancellationToken.IsCancellationRequested)
            {
                producer.Add(DateTime.Now.ToString(), cancellationToken);
                await Task.Delay(1000, cancellationToken).ConfigureAwait(false);
            }
        }
        finally
        {
            producer.CompleteAdding();
        }
    }
}
The important part here is the PushStreamContent, which basically just sends the HTTP headers and then leaves the connection open to write the data when it is available.
In my example the events are generated in an extra task which is handed a producer-consumer collection; it adds events (here the current time, every second) to the collection as they become available. Whenever a new event arrives, GetConsumingEnumerable is notified automatically. The new event is then written to the stream in the proper server-sent event format and flushed. In practice you would need to send some pseudo-ping event every minute or so, as streams left open for a long time without any data being sent over them get closed by the OS/framework.
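For the pseudo-ping, the SSE format allows comment lines that start with a colon, which clients ignore; a small illustrative addition inside the writer loop could be:
// Illustrative keep-alive: lines starting with ':' are SSE comments and are
// ignored by clients, but they keep an otherwise idle connection from being closed.
await writer.WriteLineAsync(": keep-alive");
await writer.WriteLineAsync();
await writer.FlushAsync();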
The sample client code to test this would look like the following (run it inside an async method):
using (var client = new HttpClient())
{
    using (var stream = await client.GetStreamAsync("http://localhost:9000/api/event"))
    {
        using (var reader = new StreamReader(stream))
        {
            while (true)
            {
                Console.WriteLine(reader.ReadLine());
            }
        }
    }
}
This sounds like a good fit for SignalR. Note that SignalR is part of the ASP.NET family; however, it does NOT require the ASP.NET framework (System.Web) or IIS, as mentioned in the comments.
To clarify, SignalR is part of ASP.NET. According to their site:
ASP.NET is an open source web framework for building modern web apps and services with .NET. ASP.NET creates websites based on HTML5, CSS, and JavaScript that are simple, fast, and can scale to millions of users.
SignalR has no hard dependency on System.Web or on IIS.
You can self-host your ASP.NET application (see https://learn.microsoft.com/en-us/aspnet/signalr/overview/deployment/tutorial-signalr-self-host). If you use .NET Core, it is actually self-hosted by default and runs as a normal console application.

Web API: allow only a single async task

What is the proper way of handling only a single async action at a time? For example, I need to import a large file, and while it is being imported I need to disable that option so that a second import cannot be triggered.
What comes to mind:
[HttpPost]
public async Task<HttpResponseMessage> ImportConfigurationData()
{
    if (HttpContext.Current.Application["ImportConfigurationDataInProcess"] as bool? ?? false)
        return Request.CreateErrorResponse(HttpStatusCode.InternalServerError, "Task still running");
    HttpContext.Current.Application["ImportConfigurationDataInProcess"] = true;

    string root = HttpContext.Current.Server.MapPath("~/App_Data");
    var provider = new MultipartFormDataStreamProvider(root);
    await Request.Content.ReadAsMultipartAsync(provider);
    //actual import
    HttpContext.Current.Application["ImportConfigurationDataInProcess"] = false;

    return Request.CreateResponse(HttpStatusCode.OK, true);
}
But this seems like a very hard-coded solution. What is the proper way of handling it?
Another issue is that it doesn't work well on the client side, since the client still waits for the response. Is it possible for the user to just send the file to the server and not wait until the import finishes - that is, reload the page once the file has been sent, without waiting for the awaited work to complete?
async does not change the HTTP protocol (as I explain on my blog). So you still just get one response per request.
The proper solution is to save a "token" (and import data) for the work in some reliable storage (e.g., Azure table/queue), and have a separate processing backend that does the actual import.
The ImportConfigurationData action would then check whether a token already exists for that data, and fault the request if found.
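A hedged sketch of that shape; tokenStore and importQueue are hypothetical abstractions standing in for whatever reliable storage and backend you pick, not a specific library:
[HttpPost]
public async Task<HttpResponseMessage> ImportConfigurationData()
{
    // Reject the request if an import token already exists.
    if (await tokenStore.ExistsAsync("configuration-import"))
        return Request.CreateErrorResponse(HttpStatusCode.Conflict, "Import already in progress");

    string root = HttpContext.Current.Server.MapPath("~/App_Data");
    var provider = new MultipartFormDataStreamProvider(root);
    await Request.Content.ReadAsMultipartAsync(provider);

    // Persist the token and hand the uploaded data to a separate backend
    // (queue, worker role, ...); the request returns immediately instead of
    // waiting for the import itself to finish.
    await tokenStore.CreateAsync("configuration-import");
    await importQueue.EnqueueAsync(provider.FileData);

    return Request.CreateResponse(HttpStatusCode.Accepted);
}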

Web Api + HttpClient: An asynchronous module or handler completed while an asynchronous operation was still pending

I'm writing an application that proxies some HTTP requests using the ASP.NET Web API and I am struggling to identify the source of an intermittent error.
It seems like a race condition... but I'm not entirely sure.
Before I go into detail here is the general communication flow of the application:
Client makes an HTTP request to Proxy 1.
Proxy 1 relays the contents of the HTTP request to Proxy 2
Proxy 2 relays the contents of the HTTP request to the Target Web Application
Target Web App responds to the HTTP request and the response is streamed (chunked transfer) to Proxy 2
Proxy 2 returns the response to Proxy 1 which in turn responds to the original calling Client.
The Proxy applications are written in ASP.NET Web API RTM using .NET 4.5.
The code to perform the relay looks like so:
//Controller entry point.
public HttpResponseMessage Post()
{
    using (var client = new HttpClient())
    {
        var request = BuildRelayHttpRequest(this.Request);
        //HttpCompletionOption.ResponseHeadersRead - so that I can start streaming the response as soon
        //as it begins to filter in.
        var relayResult = client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead).Result;
        var returnMessage = BuildResponse(relayResult);
        return returnMessage;
    }
}

private static HttpRequestMessage BuildRelayHttpRequest(HttpRequestMessage incomingRequest)
{
    var requestUri = BuildRequestUri();
    var relayRequest = new HttpRequestMessage(incomingRequest.Method, requestUri);
    if (incomingRequest.Method != HttpMethod.Get && incomingRequest.Content != null)
    {
        relayRequest.Content = incomingRequest.Content;
    }
    //Copies all safe HTTP headers (mainly content) to the relay request
    CopyHeaders(relayRequest, incomingRequest);
    return relayRequest;
}
private HttpResponseMessage BuildResponse(HttpResponseMessage responseMessage)
{
    var returnMessage = Request.CreateResponse(responseMessage.StatusCode);
    returnMessage.ReasonPhrase = responseMessage.ReasonPhrase;
    returnMessage.Content = CopyContentStream(responseMessage);
    //Copies all safe HTTP headers (mainly content) to the response
    CopyHeaders(returnMessage, responseMessage);
    return returnMessage;
}
private static PushStreamContent CopyContentStream(HttpResponseMessage sourceContent)
{
    var content = new PushStreamContent(async (stream, context, transport) =>
        await sourceContent.Content.ReadAsStreamAsync()
            .ContinueWith(t1 => t1.Result.CopyToAsync(stream)
                .ContinueWith(t2 => stream.Dispose())));
    return content;
}
The error that occurs intermittently is:
An asynchronous module or handler completed while an asynchronous operation was still pending.
This error usually occurs on the first few requests to the proxy applications after which the error is not seen again.
Visual Studio never catches the Exception when thrown.
But the error can be caught in the Global.asax Application_Error event.
Unfortunately the Exception has no Stack Trace.
The proxy applications are hosted in Azure Web Roles.
Any help identifying the culprit would be appreciated.
Your problem is a subtle one: the async lambda you're passing to PushStreamContent is being interpreted as an async void (because the PushStreamContent constructor only takes Actions as parameters). So there's a race condition between your module/handler completing and the completion of that async void lambda.
PushStreamContent detects the stream closing and treats that as the end of its Task (completing the module/handler), so you just need to be sure there are no async void methods that could still run after the stream is closed. async Task methods are OK, so this should fix it:
private static PushStreamContent CopyContentStream(HttpResponseMessage sourceContent)
{
    Func<Stream, Task> copyStreamAsync = async stream =>
    {
        using (stream)
        using (var sourceStream = await sourceContent.Content.ReadAsStreamAsync())
        {
            await sourceStream.CopyToAsync(stream);
        }
    };
    var content = new PushStreamContent(stream => { var _ = copyStreamAsync(stream); });
    return content;
}
If you want your proxies to scale a bit better, I also recommend getting rid of all the Result calls:
//Controller entry point.
public async Task<HttpResponseMessage> PostAsync()
{
    using (var client = new HttpClient())
    {
        var request = BuildRelayHttpRequest(this.Request);
        //HttpCompletionOption.ResponseHeadersRead - so that I can start streaming the response as soon
        //as it begins to filter in.
        var relayResult = await client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead);
        var returnMessage = BuildResponse(relayResult);
        return returnMessage;
    }
}
Your former code would block one thread for each request (until the headers are received); by using async all the way up to your controller level, you won't block a thread during that time.
I would like to add some wisdom for anyone else who landed here with the same error but whose code all seems fine: look for any lambda expressions passed into functions across the call tree from where the error occurs.
I was getting this error on a JavaScript JSON call to an MVC 5.x controller action. Everything I was doing up and down the stack was defined async Task and called using await.
However, using Visual Studio's "Set next statement" feature I systematically skipped over lines to determine which one caused it. I kept drilling down into local methods until I got to a call into an external NuGet package. The called method took an Action as a parameter and the lambda expression passed in for this Action was preceded by the async keyword. As Stephen Cleary points out above in his answer, this is treated as an async void, which MVC does not like. Luckily said package had *Async versions of the same methods. Switching to using those, along with some downstream calls to the same package fixed the problem.
I realize this is not a novel solution to the problem, but I passed over this thread a few times in my searches trying to resolve the issue because I thought I didn't have any async void or async <Action> calls, and I wanted to help someone else avoid that.
A slightly simpler model is that you can actually just use the HttpContents directly and pass them around inside the relay. I just uploaded a sample illustrating how you can relay both requests and responses asynchronously, without buffering the content, in a relatively simple manner:
http://aspnet.codeplex.com/SourceControl/changeset/view/7ce67a547fd0#Samples/WebApi/RelaySample/ReadMe.txt
It is also beneficial to reuse the same HttpClient instance as this allows you to reuse connections where appropriate.
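For instance, a single shared instance rather than one per request (a common pattern, sketched here rather than taken from the sample):
// Sketch: one HttpClient for the lifetime of the proxy, so connections to the
// downstream server can be reused instead of being opened per request.
private static readonly HttpClient RelayClient = new HttpClient();

public async Task<HttpResponseMessage> PostAsync()
{
    var request = BuildRelayHttpRequest(this.Request);
    var relayResult = await RelayClient.SendAsync(request, HttpCompletionOption.ResponseHeadersRead);
    return BuildResponse(relayResult);
}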
