ASP.NET - Offload processing to external application - C#

I have an ASP.NET website which processes requests using a 3rd party exe. Currently my workflow is:
User accesses website using any browser and fills out a form with job details
Website calls a WCF self hosted windows service which is listening on a port
Windows service launches 3rd party exe to process the job and returns the result to website
Website displays the returned result to the user
The above website was a prototype which now needs to be turned into a production-ready deployment. I realize that the above architecture has many points that could break. For example, if the machine is powered off or if the Windows service crashes and is no longer listening on the port, all current requests will stop processing. To make the architecture more robust, I am considering the following:
User accesses website using any browser and fills out a form with job details
Website writes out the job details to a database
Windows service, which is polling the database every 10 seconds for a new job, picks up the job and executes it using the 3rd party application. The results are written back to the database.
Website, which has now started polling the database, picks up the results and displays them to the user.
The second architecture gives me more logging capabilities, and jobs that are still in the queue can be picked up again. However, it involves large amounts of polling, which may not be scalable. Can anyone recommend a better architecture?

Instead of polling I would go with MSMQ or RabbitMQ.
That way you can offload your processing to multiple consumers of the queue (possibly on servers separate from the web server) and process more requests in parallel.
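A minimal sketch of that flow with the RabbitMQ .NET client is below; the queue name, the JSON payload, and the runThirdPartyExe delegate are assumptions for illustration, not part of the original setup.

    using System;
    using System.Text;
    using RabbitMQ.Client;
    using RabbitMQ.Client.Events;

    static class JobQueue
    {
        // Website side: enqueue the job details instead of calling the service directly.
        public static void Publish(string jobDetailsJson)
        {
            var factory = new ConnectionFactory { HostName = "localhost" };
            using (var connection = factory.CreateConnection())
            using (var channel = connection.CreateModel())
            {
                channel.QueueDeclare(queue: "jobs", durable: true, exclusive: false,
                                     autoDelete: false, arguments: null);
                var props = channel.CreateBasicProperties();
                props.Persistent = true;                      // survive a broker restart
                channel.BasicPublish(exchange: "", routingKey: "jobs",
                                     basicProperties: props,
                                     body: Encoding.UTF8.GetBytes(jobDetailsJson));
            }
        }

        // Worker side (the Windows service, or any number of extra servers):
        // messages are pushed to the consumer, so there is no polling loop.
        public static void StartConsuming(IModel channel, Action<string> runThirdPartyExe)
        {
            channel.QueueDeclare(queue: "jobs", durable: true, exclusive: false,
                                 autoDelete: false, arguments: null);
            var consumer = new EventingBasicConsumer(channel);
            consumer.Received += (sender, ea) =>
            {
                var jobJson = Encoding.UTF8.GetString(ea.Body.ToArray());
                runThirdPartyExe(jobJson);                    // launch the 3rd party exe, store result
                channel.BasicAck(ea.DeliveryTag, multiple: false);
            };
            channel.BasicConsume(queue: "jobs", autoAck: false, consumer: consumer);
        }
    }

Because the message is only acknowledged after processing succeeds, a crashed worker leaves the job on the queue for another consumer to pick up.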

I have implemented the same architecture in one of my applications, where users make multiple requests that need processing. So I have:
Users go to the website, select parameters etc. and submit the request
The request is stored in a database table with all the details + user name etc.
A service looks at the database table and picks up requests in FIFO manner
After the request is processed, the status is updated to Failed or Completed in the database table against that requestId, which users can see on the website
The service picks up the next request if there is one, otherwise it stops
The service runs every 30 minutes
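A rough sketch of that polling loop, assuming a hypothetical Requests table with RequestId, Details, Status, and CreatedOn columns (the actual schema isn't shown above):

    using System;
    using System.Data.SqlClient;

    static class RequestWorker
    {
        // Called from the service's 30-minute timer.
        public static void ProcessPendingRequests(string connectionString)
        {
            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();
                while (true)
                {
                    // Oldest pending request first (FIFO).
                    var pick = new SqlCommand(
                        @"SELECT TOP 1 RequestId, Details FROM Requests
                          WHERE Status = 'Pending' ORDER BY CreatedOn", conn);

                    int requestId;
                    string details;
                    using (var reader = pick.ExecuteReader())
                    {
                        if (!reader.Read()) return;   // nothing left: stop until the next run
                        requestId = reader.GetInt32(0);
                        details = reader.GetString(1);
                    }

                    string newStatus;
                    try
                    {
                        // ... execute the actual work for this request (details) here ...
                        newStatus = "Completed";
                    }
                    catch (Exception)
                    {
                        newStatus = "Failed";
                    }

                    var update = new SqlCommand(
                        "UPDATE Requests SET Status = @status WHERE RequestId = @id", conn);
                    update.Parameters.AddWithValue("@status", newStatus);
                    update.Parameters.AddWithValue("@id", requestId);
                    update.ExecuteNonQuery();
                }
            }
        }
    }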

Related

Web API or WCF to run scheduled long-running tasks

I'm developing a Client/Server application (C#, WinForms for the GUI).
We have a module that performs tasks to import/export data from the database to other external sources. Activities are managed by users from any client station. The next step will be to allow tasks to be scheduled for automatic execution (e.g. a start time X and repetition every hour, day, week, or month, and so on).
Each task imports or exports a large amount of data to or from any data source (Excel, Access, or a DBMS), so they are long-running activities.
Right now the DLL that implements this logic is distributed to each client station. This is not a good solution, because we have to install all the potential requirements on each client (for example ADO/OLE DB/ODBC drivers for every supported DBMS).
I have to move this logic to the server station. From each client I want to see task progress, stop or start any task, or change the schedule table and restart the process.
I'm considering what the best solution is: a Web API or WCF. Probably WCF because it is service-oriented, but I've seen projects and articles combining Web API with libraries like Quartz or Hangfire.
I'm also considering whether it is better to use a Windows service and to host WCF inside it.
What is the best solution? or are there any other solutions I'm not considering?
Thank you
EDIT:
From any client workstation the user can schedule all tasks to be executed depending on the applied settings (frequency, repeat each day/week/month). Probably I should use a Windows service, because when the server machine is switched on this service must start automatically and check whether there are tasks to run. At the same time the user can decide to run any task manually without scheduling it; in that case it will be queued and processed when its turn comes.
Now I'm thinking of hosting a WCF service inside a Windows service on the server machine. It will automatically start a background worker to check for scheduled tasks to run. In addition, all clients can invoke a method to start one or more tasks. To notify all clients of progress I'll use a duplex contract.
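A rough outline of that hosting arrangement is below; TaskService stands in for the actual WCF implementation, and its endpoints (a net.tcp duplex endpoint, for example) are assumed to come from app.config.

    using System.ServiceModel;
    using System.ServiceProcess;

    public class TaskHostService : ServiceBase
    {
        private ServiceHost _host;

        protected override void OnStart(string[] args)
        {
            _host = new ServiceHost(typeof(TaskService));   // endpoints defined in app.config
            _host.Open();
            // kick off the background worker that checks the schedule table here
        }

        protected override void OnStop()
        {
            if (_host != null) _host.Close();
        }
    }

    static class Program
    {
        static void Main()
        {
            ServiceBase.Run(new TaskHostService());
        }
    }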
You will need to compare WCF and Web API and choose which technology to use according to your requirements.
If you only need HTTP as the transport protocol and lightweight web-hosted services, go with Web API.
And I would recommend Hangfire, as it has many features a plain Windows service lacks: it is distributed and persistent, and it ships with an out-of-the-box dashboard that shows you all your scheduled, processing, succeeded and failed jobs.
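A minimal sketch of what that could look like with Hangfire hosted in the Web API project via OWIN; ExportTask, the connection string name, and the controller actions are illustrative assumptions, not something from the question.

    using Hangfire;
    using Owin;

    public class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            // Jobs are persisted to storage, so they survive restarts and recycles.
            GlobalConfiguration.Configuration.UseSqlServerStorage("HangfireConnection");
            app.UseHangfireDashboard();   // /hangfire UI: scheduled, processing, succeeded, failed
            app.UseHangfireServer();      // background workers hosted in the same process
        }
    }

    public class TasksController : System.Web.Http.ApiController
    {
        // Run a task once, right now (fire-and-forget from the client's point of view).
        public void Post(int taskId)
        {
            BackgroundJob.Enqueue<ExportTask>(t => t.Run(taskId));
        }

        // Or register it on a schedule, e.g. hourly.
        public void Put(int taskId)
        {
            RecurringJob.AddOrUpdate<ExportTask>(
                "task-" + taskId, t => t.Run(taskId), Cron.Hourly());
        }
    }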
Check also this article about Running Background Tasks in ASP.NET.
If this is an internal application and clients are using WinForms, behind the scenes you can make GETs/POSTs to Web API endpoints; this allows users to retrieve/export data without having to install database drivers.
Web API driven, in my opinion. I'm not very familiar with Windows services, but one of the benefits I'm seeing is that a service can still be running after a reboot.

Discrete .NET middleware processor vs spawning a new process from IIS

I have a 4-tier .NET application which consists of a
Silverlight 5 Client
MVC4 Web API Controller (Supplying data to the SL5 Client)
Windows Service - responsible for majority of data processing.
Oracle DB storage.
The workflow is simple: the SL5 client sends a request to the REST service, and the REST service simply stores it in the DB.
The Windows service, while periodically polling the DB for new records, detects the new records and attempts to process them accordingly. Once finished, it updates the records and their status in the DB.
In the meantime the SL5 Client also periodically polls the DB to see if the records have been processed. When they are, the result is retrieved and rendered on the screen.
So the question here is the following:
Is there a difference between spawning the same processing code (currently in the windows service) in a new discrete process (right out of the Web API Controller), vs keeping it as is in the windows service?
Aside from removing the constant DB polling that happens in the Windows service, it simplifies processing greatly because it can be done on a per-request basis as the requests arrive from the client. But are there any other drawbacks? Perhaps server resource or other issues with IIS?
Yes there is a difference.
Windows services are the right tool for asynchronous processing. Operations can take a long time without producing strange effects. After all, it is a continuously running service.
IIS, on the other hand, processes requests using a thread pool. Long-running tasks have the potential to exhaust that thread pool, so this may cause problems depending on the number of background tasks you start. Also, IIS makes no guarantees to keep long-running tasks alive. If the website is recycled, which happens regularly in an IIS default installation, your background task may die.
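If you do decide to kick work off from the Web API controller anyway, a minimal sketch of the built-in mitigation in .NET 4.5.2+ is below; RecordRequest and ProcessRecord are placeholders, and this only softens the recycling problem rather than solving it.

    using System.Threading;
    using System.Web.Hosting;
    using System.Web.Http;

    public class RecordsController : ApiController
    {
        public void Post(RecordRequest request)
        {
            // Registers the work with ASP.NET so a graceful shutdown waits briefly and
            // signals the token. It still does not survive a hard recycle, which is why
            // the Windows service remains the safer home for long-running processing.
            HostingEnvironment.QueueBackgroundWorkItem(token => ProcessRecord(request, token));
        }

        private static void ProcessRecord(RecordRequest request, CancellationToken token)
        {
            // ... the processing code that currently lives in the Windows service ...
        }
    }

    public class RecordRequest { /* job fields */ }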

ASP.NET + Entity Framework - handling intermittent traffic spikes

I have an MVC and Web API application that needs to log activities performed by the users back to my database. This is almost always a single insert into a table that has fewer than 5 columns (i.e. very little data is crossing the wire). The data interface I am currently using is Entity Framework 6.
Every once in a while, I'll get a large number of users needing to log that they performed a single activity. In this case, "Large Number" could be a couple hundred requests every second. This typically will only last for a few minutes at most. The rest of the time, I see very manageable traffic to the site.
When the traffic spikes, some of my clients get timeout errors because the page doesn't finish loading until the server has inserted the data into the database. Now, actually inserting the data into the database isn't necessary for the user to continue using the application, so I could cache these requests somewhere locally and then batch-insert them later.
Are there any good solutions for ASP.NET MVC to buffer incoming request data and then batch-insert it into the database every few seconds?
As for my environment, I have several servers running Server 2012 R2 in a load balanced Web Farm. I would prefer to stay stateless if at all possible, because users might hit different servers per request.
When the traffic spikes, some of my clients get timeout errors because the page doesn't finish loading until the server has inserted the data into the database.
I would suggest using a message queue. Have the website rendering code simply post an object to the queue representing the action, and have a separate process (e.g. Windows Service) read off the queue and write to the database using Entity Framework.
UPDATE
Alternatively you could log access to a file (fast), and have a separate process read the file and write the information into your database.
I prefer the message queue option, but it does add another piece of architecture.
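A rough sketch of the queue option using MSMQ is below; ActivityLog, ActivityContext, and the queue path are placeholder names standing in for the poster's entity, EF 6 context, and actual queue.

    using System;
    using System.Collections.Generic;
    using System.Messaging;

    static class ActivityLogging
    {
        // Web side: the request returns as soon as the message is queued.
        public static void Enqueue(ActivityLog entry)
        {
            using (var queue = new MessageQueue(@".\Private$\activity-log"))
            {
                queue.Formatter = new XmlMessageFormatter(new[] { typeof(ActivityLog) });
                queue.Send(entry);
            }
        }

        // Separate process (e.g. a Windows service): drain the queue every few seconds
        // and write one batch per database round trip with EF 6.
        public static void FlushBatch(MessageQueue queue, int maxBatchSize)
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(ActivityLog) });

            var batch = new List<ActivityLog>();
            try
            {
                while (batch.Count < maxBatchSize)
                {
                    var message = queue.Receive(TimeSpan.Zero);   // throws when the queue is empty
                    batch.Add((ActivityLog)message.Body);
                }
            }
            catch (MessageQueueException ex)
                when (ex.MessageQueueErrorCode == MessageQueueErrorCode.IOTimeout)
            {
                // queue drained for now
            }

            if (batch.Count == 0) return;

            using (var db = new ActivityContext())   // the existing EF 6 DbContext
            {
                db.ActivityLogs.AddRange(batch);
                db.SaveChanges();                    // single round trip for the whole batch
            }
        }
    }

Because each web server only posts to a queue and the consumer owns all database writes, the web tier stays stateless and the load balancer can send requests anywhere.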

WCF/ASP.NET Web API service architecture suggestions

I have some experience with WCF services development but with this requirement I want to get some help/suggestion from some of the experienced developers here. Here is my scenario,
I will have a REST service (let's call it Service 1) which will receive requests with some parameters from a different service (let's call it Service Main). I am planning to save these parameters in a database so that I can track the status of the progress in future steps. Then, from Service 1, I have to start a process on the server which will run for an indeterminate time (based on the parameters); let's call this Process A. When Process A is done with its task and comes back with good results, I have to start a different process, called Process B, which will use files generated by Process A. When Process B is done with its business and sends an acknowledgement to Service 1, I have to send the information back to Service Main.
For the database I am planning to use a NoSQL database, since there are no relationships involved and it is more like a cache. I am having a hard time working out how to architect this entire process so that all of these steps/tasks run asynchronously and are able to scale to handle a lot of requests.
Approach 1: My initial idea was to have a WCF or ASP.NET Web API (REST) service use the TPL to launch Process A, wait for it to complete via an async callback, and then launch Process B on a new Task. But I am not sure whether that is a good solution, or even possible.
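Roughly what Approach 1 could look like, assuming hypothetical executable names and arguments; note that a task like this running inside the web host can be lost if the application pool recycles mid-pipeline.

    using System;
    using System.Diagnostics;
    using System.Threading.Tasks;

    static class Pipeline
    {
        public static Task RunAsync(string requestId, string parameters)
        {
            return Task.Run(() =>
            {
                // Process A: runs for an indeterminate time depending on the parameters.
                using (var a = Process.Start("processA.exe", parameters))
                {
                    a.WaitForExit();
                    if (a.ExitCode != 0) throw new InvalidOperationException("Process A failed");
                }

                // Process B: consumes the files produced by Process A.
                using (var b = Process.Start("processB.exe", requestId))
                {
                    b.WaitForExit();
                    if (b.ExitCode != 0) throw new InvalidOperationException("Process B failed");
                }

                // ... update the tracking record and notify Service Main here ...
            });
        }
    }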
Approach 2: After a lot of reading I thought of having a Windows service on the host server launch Process A and Process B. The WCF service would talk to the Windows service to start the process.
Hopefully I have explained the problem clearly; I am waiting to hear some advice.

Persistent connection with queries from server to client

I have a company network under my control and a couple of closed customer networks. I want to communicate from a web application in my network to a database inside a customer network. My first idea was:
Web application stores query in a database in the company network and waits for answer.
Windows service inside the client network polls our database a couple of times every second through a (WCF) web service, also in our company network.
If a query is available, the Windows service executes it in its local database and stores the answer in the company database.
I've been thinking about removing the polling idea and instead using a persistent connection between a client in the customer network and a server in our company network. The client initiates the connection and then waits for queries from the server. What would be better or worse compared to polling through a web service? Would WCF be the right thing to use here?
You have a few approaches:
WCF Duplex: once the web application stores a query in the database, you initiate a call to the client (in this case the Windows service) instead of making the Windows service poll every few seconds. net.tcp would be a good choice, but you can still use HTTP.
Long polling: instead of having your Windows service client send a request every few seconds, let it send one request and keep the channel open. Set the timeout on both the client and the WCF service to a longer value, and let the server method loop and check the database for new notifications. Once new notifications are found, the method returns with them. On the client side, once you get a response, send another request to the server and then process the data. If a timeout error occurs, just send another request. Google "long polling" and you will find a lot of material.
Regarding querying the database every few seconds, a better approach would be a dedicated notifications table. Instead of querying a large table with a complex SQL string every few seconds, let the writer add rows to a separate notifications table (after it has finished adding to the main table), so your query is much simpler and uses fewer resources. You can add direct pointers (like IDs) in the notifications table to save time, and clean up the notifications table later.
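A sketch of what the duplex contracts for option 1 might look like; the interface and method names are illustrative, and the Windows service in the customer network is the WCF client that dials out and keeps the channel open, so nothing inbound needs to be exposed on the customer side.

    using System.ServiceModel;

    // Hosted on the server in the company network.
    [ServiceContract(CallbackContract = typeof(IQueryCallback))]
    public interface IQueryService
    {
        [OperationContract]
        void Register(string customerId);              // client calls this once and keeps the channel open

        [OperationContract(IsOneWay = true)]
        void SubmitResult(int queryId, string resultJson);
    }

    public interface IQueryCallback
    {
        [OperationContract(IsOneWay = true)]
        void ExecuteQuery(int queryId, string sql);    // server pushes a query down the open channel
    }

    // Implemented by the Windows service in the customer network.
    public class QueryCallback : IQueryCallback
    {
        public void ExecuteQuery(int queryId, string sql)
        {
            // run the query against the local database, then call SubmitResult on the service
        }
    }

With a NetTcpBinding duplex endpoint, a single outbound connection from the customer side carries both the pushed queries and the results.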
