How to queue multiple requests in ASP.NET Core 6 Web API - c#

Description: queue multiple requests for an ASP.NET Core 6 Web API.
Multiple terminals/client PCs call the Web API, which triggers the code responsible for printing: it sends a print command, along with the object to be printed, to a printer.
Several printers are available, but if one printer is in use (one request is printing on it), other requests should not proceed; they should wait until the first request finishes printing, and only then should the next request be taken up and processed, and so on.
Is there a suggested approach for building this?
I have tried the approach from https://michaelscodingspot.com/c-job-queues/ but it does not work: when the request is handed off to another thread, the this.DBContext object is lost.
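The usual ASP.NET Core 6 pattern for this is a singleton `Channel<T>`-based queue drained by a single `BackgroundService`: because only one consumer loop reads the channel, jobs print strictly one at a time. The `DbContext` is "lost" in the linked approach because the request-scoped context is disposed when the HTTP request ends, so the worker must resolve a fresh one from a new service scope instead of capturing the controller's. A minimal sketch, assuming `PrintJob`, `PrintQueue`, and the commented `AppDbContext`/`PrintAsync` names are illustrative, not from the question:

```csharp
using System.Collections.Generic;
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

public record PrintJob(int DocumentId, string PrinterName);

public class PrintQueue
{
    private readonly Channel<PrintJob> _channel = Channel.CreateUnbounded<PrintJob>();

    public ValueTask EnqueueAsync(PrintJob job) => _channel.Writer.WriteAsync(job);

    public IAsyncEnumerable<PrintJob> DequeueAllAsync(CancellationToken ct) =>
        _channel.Reader.ReadAllAsync(ct);
}

public class PrintWorker : BackgroundService
{
    private readonly PrintQueue _queue;
    private readonly IServiceScopeFactory _scopeFactory;

    public PrintWorker(PrintQueue queue, IServiceScopeFactory scopeFactory)
    {
        _queue = queue;
        _scopeFactory = scopeFactory;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        // Single consumer loop = one job at a time, so requests print sequentially.
        await foreach (var job in _queue.DequeueAllAsync(stoppingToken))
        {
            using var scope = _scopeFactory.CreateScope();
            // Resolve a fresh DbContext for this job; never reuse the
            // request-scoped one, which is disposed when the request ends.
            // var db = scope.ServiceProvider.GetRequiredService<AppDbContext>();
            // await PrintAsync(job, db);
        }
    }
}
```

Register with `builder.Services.AddSingleton<PrintQueue>();` and `builder.Services.AddHostedService<PrintWorker>();`, then inject `PrintQueue` into the controller and `await EnqueueAsync(...)`. If each printer should have its own independent queue, keep one `Channel` (and one worker loop) per printer name.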

Related

puppeteer-sharp for server side HTML to PDF conversions

I found that PuppeteerSharp is the best option for server-side HTML-to-PDF conversion: it downloads the latest Chromium and runs it headless in the background, so the conversion quality is top class; tables and everything else render perfectly.
However, when running on a server, how should concurrency be managed? I think each website user's conversion request will launch another Chromium instance. How does PuppeteerSharp manage concurrency?
Is it better to split this out as a separate web service, enqueue conversion requests, and serve all PDF-related requests one by one, instead of running into concurrency or resource issues on the web server from multiple Chromium instances?
I found the most scalable way to implement this is with a background process. This is one real-life example:
A web client requests a PDF by sending a SignalR message.
The SignalR hub creates some kind of ID, puts the request in an Azure Queue, and adds the SignalR client to a SignalR group with that ID.
A console app reads that queue, processes the HTML, and sends the result back to the server via a SignalR message.
The web server gets the message and broadcasts it to all the clients in that group.
It might sound quite complicated, but it's not that much. You can make it simpler.
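If the conversion stays in-process instead, a common way to cap the number of concurrent Chromium instances is a `SemaphoreSlim`. A rough sketch, assuming the PuppeteerSharp calls shown match the version you use (verify against its docs) and that Chromium has already been downloaded (e.g. via `new BrowserFetcher().DownloadAsync()`):

```csharp
using System.Threading;
using System.Threading.Tasks;
using PuppeteerSharp;

public class PdfConverter
{
    // Allow at most 2 Chromium instances at a time; further requests queue here.
    private static readonly SemaphoreSlim Gate = new SemaphoreSlim(2, 2);

    public async Task<byte[]> ConvertAsync(string html)
    {
        await Gate.WaitAsync();
        try
        {
            await using var browser = await Puppeteer.LaunchAsync(
                new LaunchOptions { Headless = true });
            await using var page = await browser.NewPageAsync();
            await page.SetContentAsync(html);
            return await page.PdfDataAsync();
        }
        finally
        {
            Gate.Release();
        }
    }
}
```

Reusing one long-lived browser and limiting concurrent *pages* is cheaper still, since launching Chromium per request is the expensive part; the semaphore pattern is the same either way.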

Execution terminated in the middle when the API is called concurrently using HttpClient

I am having a problem calling a web service using HttpClient from another web service hosted on the same server.
The scenario: there is a REST web service, say "WebService1", that receives requests from multiple clients to create PDFs.
WebService1 in turn calls another web service, "BackendWebService", to create the PDF document. This backend web service references a third-party library to create the PDF.
If a single user calls WebService1 with 10 requests, there is no problem; all the requests are processed successfully.
But when multiple users send requests to WebService1 simultaneously, only a few records are processed successfully; some of the records are
terminated mid-process. It seems the code does not execute completely and the application terminates without finishing the whole process.
Why is this happening? Please help. I have already created only one static instance of HttpClient, but still no luck.
For each PDF request from a client, WebService1 calls BackendWebService once.
Finally I found the reason for this issue.
There was a concurrency issue in a library (a custom library called processhandler.dll) referenced by BackendWebService: multiple processes were writing log entries to the same text file. When an exception occurred in this referenced library, the process terminated without any information in IIS or in the application-level logs.
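The general fix for this class of bug is to serialize access to the shared log file. A minimal sketch (the `SafeLogger` name and log path are illustrative, not from processhandler.dll):

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

public static class SafeLogger
{
    private static readonly object Gate = new object();

    // Serialize writes so concurrent callers can't corrupt the file
    // or hit a sharing-violation IOException.
    public static void Log(string path, string message)
    {
        lock (Gate)
        {
            File.AppendAllText(path, $"{DateTime.UtcNow:O} {message}{Environment.NewLine}");
        }
    }
}

public class Program
{
    public static void Main()
    {
        var path = Path.Combine(Path.GetTempPath(), "safe-log-demo.txt");
        File.Delete(path);

        // 100 concurrent writers; without the lock this can throw IOException.
        Parallel.For(0, 100, i => SafeLogger.Log(path, $"request {i}"));

        Console.WriteLine(File.ReadAllLines(path).Length); // 100
    }
}
```

Note that a `lock` only protects writers inside one process. Since the answer mentions *multiple processes* writing the file, a cross-process synchronization primitive such as a named `Mutex` (e.g. `new Mutex(false, "Global\\mylog")`) or a proper logging library would be needed instead.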

Handling Long-Running Processes in an ASP.net Web API 2 Service

I created a REST service using ASP.net Web API 2. The service allows a client to POST records which are stored in a SQL Server database. At this point some other process needs to pull the records from the database, process them, and send the results to another, separate REST service.
Since the processing of the records will be time-consuming, I don't want to perform this step as part of the POST action, but I'm trying to figure out the best approach to designing this other process.
Is there a way to integrate this functionality into the ASP.net REST API project, or is something like a Windows Service the best solution?
You could write your web service action asynchronously, for example:
public async Task<HttpResponseMessage> PostRecords(Records records)
{
    await SaveRecordsAsync(records); // POST data to the database
    // Kick off processing without awaiting it (fire-and-forget).
    // Note: work started this way is lost if the app pool recycles.
    _ = Task.Run(() => ProcessRecordsAsync(records));
    return Request.CreateResponse(HttpStatusCode.Accepted);
}
(SaveRecordsAsync and ProcessRecordsAsync stand in for your own data-access and processing code.)
You can also perform the post operation synchronously while tracking its completion in your database, and, as you mentioned, have a job or Windows service process the completed posts separately.
You can use the Quartz.NET scheduling library within ASP.NET MVC or Web API. Define a job which executes your query, and fire the job from your POST action.
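A rough sketch of the Quartz.NET suggestion, assuming Quartz 3.x; the job name is illustrative and the record-processing body is left as a comment (check the Quartz.NET docs for the exact setup in your version):

```csharp
using System.Threading.Tasks;
using Quartz;
using Quartz.Impl;

public class ProcessRecordsJob : IJob
{
    public async Task Execute(IJobExecutionContext context)
    {
        // Query unprocessed records, process them, and forward the
        // results to the downstream REST service here.
        await Task.CompletedTask;
    }
}

public static class SchedulerSetup
{
    // Call this from your POST action (or at startup for a recurring schedule).
    public static async Task FireJobAsync()
    {
        var scheduler = await StdSchedulerFactory.GetDefaultScheduler();
        await scheduler.Start();

        var job = JobBuilder.Create<ProcessRecordsJob>().Build();
        var trigger = TriggerBuilder.Create().StartNow().Build();
        await scheduler.ScheduleJob(job, trigger);
    }
}
```

Quartz runs the job on its own thread pool, so the POST action returns immediately; a cron-style trigger (`WithCronSchedule`) would turn this into the separate recurring process the question asks about.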

Asp.Net - Offload processing to external application

I have an ASP.NET website which processes requests using a third-party exe. Currently my workflow is:
User accesses website using any browser and fills out a form with job details
Website calls a WCF self hosted windows service which is listening on a port
Windows service launches 3rd party exe to process the job and returns the result to website
Website displays the returned result to the user
The above website was a prototype which now needs to be turned into a production ready deployment. I realize that the above architecture has many points that could break. For example, if the machine is powered off or if the windows service crashes and is no longer listening on the port, all current requests will stop processing. To make the architecture more robust, I am considering the following
User accesses website using any browser and fills out a form with job details
Website writes out the job details to a database
Windows service which is polling the database every 10 seconds for a new job picks up the job and executes it using the 3rd party application. The results are written back to the database.
Website which has now started polling the database, picks up the results and displays them to the user.
The second architecture provides me with more logging capabilities and jobs can start again if they are in a queue. However it involves large amounts of polling which may not be scalable. Can anyone recommend a better architecture?
Instead of polling, I would go with MSMQ or RabbitMQ.
That way you can offload your processing to multiple consumers of the queue (possibly on separate servers from the web server) and process more requests in parallel.
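A rough sketch of a RabbitMQ consumer for this, using the pre-7.x RabbitMQ.Client API (the queue name, host, and the commented processing call are illustrative):

```csharp
using System;
using System.Text;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

class Worker
{
    static void Main()
    {
        var factory = new ConnectionFactory { HostName = "localhost" };
        using var connection = factory.CreateConnection();
        using var channel = connection.CreateModel();

        // A durable queue survives broker restarts, so queued jobs aren't lost.
        channel.QueueDeclare("jobs", durable: true, exclusive: false, autoDelete: false);
        // Give each consumer one unacked message at a time (fair dispatch).
        channel.BasicQos(prefetchSize: 0, prefetchCount: 1, global: false);

        var consumer = new EventingBasicConsumer(channel);
        consumer.Received += (_, ea) =>
        {
            var jobDetails = Encoding.UTF8.GetString(ea.Body.ToArray());
            // Launch the third-party exe for this job here, then acknowledge.
            channel.BasicAck(ea.DeliveryTag, multiple: false);
        };
        channel.BasicConsume("jobs", autoAck: false, consumer);

        Console.ReadLine(); // keep the worker alive
    }
}
```

Because the message is only acknowledged after processing, a crashed worker leaves the job in the queue for another consumer, which addresses the "machine powered off mid-request" concern from the question.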
I have implemented the same architecture in one of my applications, where users make multiple processing requests. So I have:
Users go to the website, select parameters, etc., and submit the request
The request is stored in a database table with all the details, user name, etc.
A service watches the database table and picks up requests in FIFO order
After a request is processed, its status is updated to Failed or Completed in the database table against that requestId, which users can see on the website
The service picks up the next request if there is one, otherwise it stops.
Service runs every 30 mins
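The polling service described above can be sketched as follows. This is only an outline under assumed names: the `Requests` table, its `Status`/`RequestId` columns, the connection string, and the commented `ProcessRequest` call are all placeholders, and the T-SQL hints assume SQL Server:

```csharp
using System;
using System.Data.SqlClient;
using System.Threading;

class PollingService
{
    static void Main()
    {
        while (true)
        {
            int? requestId = TryClaimPendingRequest();
            if (requestId is int id)
            {
                // Run the third-party processing, then update the row
                // to Completed or Failed.
                // ProcessRequest(id);
            }
            else
            {
                Thread.Sleep(TimeSpan.FromSeconds(10)); // nothing to do; poll later
            }
        }
    }

    // Claim a Pending row atomically so two service instances can't
    // pick up the same request. (TOP without ORDER BY does not
    // guarantee FIFO; use an ordered CTE if strict FIFO matters.)
    static int? TryClaimPendingRequest()
    {
        using var conn = new SqlConnection("<connection string>");
        conn.Open();
        using var cmd = new SqlCommand(@"
            UPDATE TOP (1) Requests WITH (UPDLOCK, READPAST)
            SET Status = 'Processing'
            OUTPUT inserted.RequestId
            WHERE Status = 'Pending';", conn);
        var result = cmd.ExecuteScalar();
        return result == null || result is DBNull ? (int?)null : (int)result;
    }
}
```

The claim-by-update step is what makes this safe to scale to several worker instances, which is the main weakness of naive SELECT-then-UPDATE polling.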

Async Web Service Calls

I'm looking to create a web service and an accompanying web app that uses an asynchronous web service call. I've seen plenty of suggestions on how to do async calls, but none seem to fit exactly what I'm trying to do, or they use really outdated tech. I'm trying to do this in ASP.NET 3.5 (VS2008).
What I need to do is:
the webpage needs to submit a request to the service
the page then needs to poll the service every 5 seconds or so to see if the task has completed
once complete, the result needs to be retrieved from the service.
Could someone give me some suggestions or point me in the right direction?
The way I have typically handled asynchronous server-side processing is by:
Have the webpage initiate a request against a webservice and have the service return an ID to the long-running transaction. In my case, I have used Ajax with jQuery on the client webpage and a webservice that returns data in JSON format. ASP.NET MVC is particularly well suited for this, but you can use ASP.NET to return JSON string in response to a GET, or not use JSON at all.
Have the server create a record in a database that also stores the associated data to be processed. The ID of this transaction is returned to the client webpage. The service then sends a message to a third service via a message queue. In my case, the service was a WCF service hosted in a Windows Service with MSMQ as the intermediary. It should be noted that it is better not to do the actual task processing in ASP.NET, as it is not meant for requests that are long-running. In a high demand system you could exhaust available threads.
A third service receives and responds to the queued message by reading and processing necessary data from the database. It eventually marks the database record "complete".
The client webpage polls the webservice passing the transaction record ID. The webservice queries the database based on this ID to determine if the record is marked complete or not. If it is complete, it queries for the result dataset and returns it. Otherwise it returns an empty set.
The client webpage processes the webservice response, which will either contain the resulting data or an empty set, in which case it should continue polling.
This just serves as an example; you may find that you can take shortcuts, avoid doing the processing in a third service, and just use ASP.NET threads. But that presents its own problems, namely how another request (the polling request) would know whether the original request is complete. The hackish solution is to use a thread-safe collection in a static variable that holds transaction ID/result pairs. But for that effort, it really is better to use a database.
EDIT: I see now that it appears to be a demonstration rather than a production system. I still stand by my above outline for "real-world" situations, but for a demo the "hackish" solution would suffice.
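For the demo case, the "thread-safe collection in a static variable" mentioned above can be as simple as a `ConcurrentDictionary` keyed by transaction ID. A minimal sketch (all names are illustrative):

```csharp
using System;
using System.Collections.Concurrent;

public static class TransactionStore
{
    // transaction ID -> result (null while the work is still running)
    private static readonly ConcurrentDictionary<Guid, string> Results =
        new ConcurrentDictionary<Guid, string>();

    public static Guid Begin()
    {
        var id = Guid.NewGuid();
        Results[id] = null;
        return id;
    }

    public static void Complete(Guid id, string result) => Results[id] = result;

    // The polling endpoint calls this with the client's transaction ID;
    // it returns true (and the result) only once the work is done.
    public static bool TryGetResult(Guid id, out string result)
    {
        return Results.TryGetValue(id, out result) && result != null;
    }
}

public class Program
{
    public static void Main()
    {
        var id = TransactionStore.Begin();
        Console.WriteLine(TransactionStore.TryGetResult(id, out _)); // False
        TransactionStore.Complete(id, "done");
        TransactionStore.TryGetResult(id, out var r);
        Console.WriteLine(r); // done
    }
}
```

As the answer notes, this state lives in the worker process and vanishes on an app-pool recycle, which is exactly why a database-backed record is preferred for production.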
Which part is going to need to be async? As far as I can tell, your actions are synchronous:
1) -> 2) -> 3)
A simple web service would do; IIS (like any web server) supports handling multiple requests concurrently, so you have no problem there.
One thing to be aware of: the JavaScript engine executes code in a single thread.
Step 0: Create the web service.
Step 1: Create the web app project (assuming it's ASP.NET).
Step 2: Add a web reference to the web service in your web app project.
Step 3: The reference would create a proxy for you, using which you can invoke both synchronous and asynchronous calls.
