After so much research, I thought I should ask the experts.
I am working on a project for my corporate employer. We have Android and iPhone mobile apps that make requests to a web service; each request is logged in a pending state for processing.
A Windows service retrieves the pending requests and spins up a new thread for every request. This is because a request could be directed to any of several providers, each of which processes requests differently. One might process the request immediately and return feedback; another might accept the request and take up to 30 seconds to produce a result, whose status you have to poll for.
The mobile app will also be polling for the status of the request.
Now my challenge is:
I am thinking of creating a list of, say, 100 threads and assigning each thread a request to execute; once a thread finishes, it would be recreated and assigned a new request. Because the platform is highly response-sensitive, I am inclined not to use the ThreadPool.
Is it advisable to spin up new threads in a fire-and-forget fashion, or to manage this list of threads? If they should be managed, what is the best approach in C# to manage the list and ensure high performance, given that the mobile apps will be polling for a response?
Regards
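For reference, a minimal sketch of the managed approach described above - a fixed set of long-lived threads draining a shared queue, so threads are reused rather than recreated per request. The ProviderRequest type and the processing delegate are hypothetical stand-ins for the request log and provider logic:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// Hypothetical stand-in for the logged pending request.
public sealed class ProviderRequest
{
    public int Id { get; set; }
}

public sealed class RequestWorkerPool : IDisposable
{
    private readonly BlockingCollection<ProviderRequest> _queue = new BlockingCollection<ProviderRequest>();
    private readonly Thread[] _workers;

    public RequestWorkerPool(int workerCount, Action<ProviderRequest> process)
    {
        _workers = new Thread[workerCount];
        for (int i = 0; i < workerCount; i++)
        {
            _workers[i] = new Thread(() =>
            {
                // Each worker blocks here until a request is available,
                // so threads are reused instead of recreated per request.
                foreach (ProviderRequest request in _queue.GetConsumingEnumerable())
                {
                    try { process(request); }
                    catch (Exception) { /* log and mark the request failed */ }
                }
            }) { IsBackground = true };
            _workers[i].Start();
        }
    }

    public void Enqueue(ProviderRequest request) => _queue.Add(request);

    public void Dispose()
    {
        _queue.CompleteAdding();                 // let workers drain and exit
        foreach (Thread worker in _workers) worker.Join();
        _queue.Dispose();
    }
}
```

With this shape, the polling side simply calls Enqueue for each pending request, and concurrency is capped at workerCount instead of one thread per request; whether 100 is the right number is something to measure rather than assume.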
I'm maintaining an ASP.NET Core web application that needs to repeatedly run some background threads. I know it's not a good design, but currently I have to fix its major issues with minimum effort. Now I am wondering whether I should worry about these background threads interfering with the web server's handling of users' HTTP requests.
Question is simple but I can't find any clear answer for it:
What is the difference between threads created in the application like this:
Task.Run(() => { /* some parallel job */ });
and the IIS worker threads that handle HTTP requests?
Do they come from the same thread pool, or do they reside in separate pools?
According to this, it's all one pool: "ASP.NET Core already runs app code on normal Thread Pool threads." In other words, there is no separate maximum for threads serving requests versus background threads.
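A quick, self-contained way to see this for yourself (a console sketch): work started with Task.Run reports itself as a pool thread, and the same process-wide ThreadPool limits govern it.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class PoolCheck
{
    static async Task Main()
    {
        // Task.Run schedules its delegate on the shared ThreadPool.
        await Task.Run(() =>
            Console.WriteLine($"On a pool thread? {Thread.CurrentThread.IsThreadPoolThread}")); // True

        // These process-wide limits apply to request handling and Task.Run alike.
        ThreadPool.GetMaxThreads(out int workerMax, out int completionPortMax);
        Console.WriteLine($"Max worker threads: {workerMax}, max completion-port threads: {completionPortMax}");
    }
}
```

Running the same IsThreadPoolThread check inside an ASP.NET Core action also prints true, consistent with the quote above.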
The biggest difference is that IIS is aware of the threads it creates itself for an incoming request. IIS is not aware of any threads you create yourself.
When an app pool is recycled, or IIS is shut down, it waits until all requests have finished processing - that is, until the threads it created for each request have finished - and then it kills the process. If you create any threads that outlive the request (for example, if you start a background thread and then send the response back to the client), IIS has no idea that thread is still running and could kill the whole process at any time.
If you don't return a response until all the threads are complete, then you won't have that specific problem.
The other issue is that you may hit the maximum number of allowable threads, at which point all kinds of weird performance issues can happen. But that depends on how many threads you are creating and how many HTTP requests are coming in.
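If the recurring background work has to stay in the ASP.NET Core app, one mitigation is to register it as a hosted service so the host at least knows about it and hands it a graceful-shutdown token. A minimal sketch - the RecurringWorker name and the 30-second interval are placeholders:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

// Registered at startup with services.AddHostedService<RecurringWorker>();
// the host starts it and cancels stoppingToken on shutdown.
public sealed class RecurringWorker : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            // ... one unit of background work here ...
            await Task.Delay(TimeSpan.FromSeconds(30), stoppingToken);
        }
    }
}
```

This still runs on the same shared thread pool; it just stops the work from being invisible to the host.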
So here is the problem statement. I have a service that serves mobile devices. On the trigger of an event, each user sends a message to the service to register itself, after which the service performs a particular set of tasks at regular intervals (say, 5 minutes) from the time of registration. So the execution times will differ for each user based on registration time.
I implemented this using threads and timers. It worked to an extent, but as the number of users increased, the threads were killed and the tasks were not completed. This service is hosted on Azure; it is a WCF service with WebHttp binding that accepts and returns JSON data.
WebJobs were suggested to me, but since the execution times vary I am unable to use those either. Is it even possible to perform such a task using C# and ASP.NET, or am I going in the wrong direction entirely?
You need to identify the bottleneck that is stopping your threads before they complete.
I would solve this using another approach: for every new user, put a message in a queue, and create an Azure Function to dequeue the message and perform the logic of your service. This way your application will scale better, and you'll save money with the serverless approach.
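A sketch of that shape using a storage-queue trigger; the function and queue names ("ProcessRegistration", "user-registrations") are illustrative, and the message is assumed to carry a user id:

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ProcessRegistration
{
    // Fires once per queued message; the platform scales workers with
    // queue depth, so there are no threads or timers to manage yourself.
    [FunctionName("ProcessRegistration")]
    public static void Run(
        [QueueTrigger("user-registrations")] string userId,
        ILogger log)
    {
        log.LogInformation("Running scheduled tasks for user {UserId}", userId);
        // ... perform the per-user work here ...
    }
}
```

To get the per-user five-minute cadence, the function could re-enqueue the same message with a visibility delay, or you could reach for a Durable Functions timer instead.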
I have a 4-tier .NET application which consists of:
a Silverlight 5 client
an MVC4 Web API controller (supplying data to the SL5 client)
a Windows service, responsible for the majority of the data processing
Oracle DB storage
The workflow is simple: the SL5 client sends a request to the REST service, and the REST service simply stores it in the DB.
The Windows service periodically polls the DB, detects the new records, and attempts to process them accordingly. Once finished, it updates the records and their status in the DB.
In the meantime the SL5 Client also periodically polls the DB to see if the records have been processed. When they are, the result is retrieved and rendered on the screen.
So the question here is the following:
Is there a difference between spawning the same processing code (currently in the Windows service) as a new discrete process, right out of the Web API controller, versus keeping it as-is in the Windows service?
Aside from removing the constant DB polling that happens in the Windows service, this simplifies processing greatly because it can be done on a per-request basis as requests arrive from the client. But are there any other drawbacks? Perhaps server-side or other issues with IIS?
Yes there is a difference.
Windows services are the right tool for asynchronous processing. Operations can take a long time without producing strange effects. After all, it is a continuously running service.
IIS, on the other hand, processes requests using a thread pool. Long-running tasks have the potential to exhaust that thread pool, so this may cause problems depending on the number of background tasks you start. Also, IIS makes no guarantees to keep long-running tasks alive. If the website is recycled, which happens regularly in a default IIS installation, your background task may die.
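That said, if some short post-request work absolutely must run inside the web process, classic ASP.NET (4.5.2 and later) has HostingEnvironment.QueueBackgroundWorkItem, which registers the work with the runtime so an app-domain shutdown is delayed briefly (on the order of 30 seconds, so still no place for truly long jobs). A minimal sketch:

```csharp
using System.Threading;
using System.Threading.Tasks;
using System.Web.Hosting;

public static class PostRequestWork
{
    public static void Kick()
    {
        // ASP.NET tracks this item and tries to let it finish before
        // recycling; it is still not a substitute for a Windows service.
        HostingEnvironment.QueueBackgroundWorkItem(async cancellationToken =>
        {
            // ... short follow-up processing here ...
            await Task.Delay(1000, cancellationToken);
        });
    }
}
```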
I have an application that allows a user to upload a file. After the upload is complete there are a number of processing steps that must be done on the server (decompression, storage, validation, etc.), so the user will be informed some time later by email when everything is complete.
I have seen a number of examples where the BackgroundWorker from System.ComponentModel is used to do asynchronous work on another thread. However, those examples seem to lead to the user eventually getting a response. In our case no web response is necessary; the code can take up to 30 minutes to complete.
Is there another way to start a completely separate thread/process that will keep running even after the user completely closes their session?
If there is no need to respond immediately, you want to offload the heavy lifting to some other process. I would dump the work into a DB, a folder, or a message queue. The worker processes (Windows services?) would process the files, reading from the DB, file system, or queue. When the work is done, your worker process can call out to your ASP.NET app (webhook style) if the web app needs to know when it's done. Just a suggestion.
Write a Windows service that will run on the ASP.NET server. Architect it in such a way that it can accept and queue job requests. The queue will allow you to create the optimal number of threads in a ThreadPool for executing a subset of the queued jobs concurrently. Submit jobs to the Windows service using either .NET Remoting or WCF.
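One way to realize the "queue plus a capped ThreadPool" part inside such a service, assuming jobs arrive through the WCF (or Remoting) entry point as delegates; the limit of 4 concurrent jobs is a placeholder to tune:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

public sealed class JobQueue
{
    private readonly SemaphoreSlim _slots = new SemaphoreSlim(4); // max concurrent jobs

    // Called once per job submitted over WCF/Remoting.
    public Task SubmitAsync(Action job)
    {
        return Task.Run(async () =>
        {
            await _slots.WaitAsync();      // excess jobs wait here, forming the queue
            try { job(); }                 // executes on a ThreadPool thread
            finally { _slots.Release(); }
        });
    }
}
```

The semaphore's wait queue plays the role of the job queue: submissions beyond the cap simply wait for a slot rather than spawning more threads.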
If processing can take up to 30 minutes, I'd recommend skipping a background thread in the web worker process and using something like a Windows service instead, or running a console application on a schedule using the Windows scheduler.
Once the file is uploaded, you would add it to a queue of some sort (either in a database, or using a message-queuing system like RabbitMQ if you're feeling adventurous). Your web request could then return immediately and let the user know that the file is being processed, and the background service would pick the item up off the queue, continue the processing, and email the user when it is complete.
I am developing an application that submits POST requests to a .NET web service.
The current implementation processes each request instantly and returns a response. In the actual deployment there will be a huge amount of data to process, so requests must be processed offline.
What strategies can accomplish this?
Should I implement a Windows service, or a scheduled task that invokes an application to perform the desired work?
This might be a good case for MSMQ. Your web service can fill the queue with incoming data, and another process can read those messages and perform the necessary processing.
Here's a good overview of MSMQ:
http://www.primaryobjects.com/CMS/Article77.aspx
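A minimal send/receive sketch with System.Messaging (add a reference to System.Messaging.dll); the private queue path is a placeholder, and the receive side would live in the separate worker process:

```csharp
using System.Messaging;

static class UploadQueue
{
    const string Path = @".\Private$\uploads"; // hypothetical queue path

    // Called by the web service for each incoming item.
    public static void Send(string payload)
    {
        if (!MessageQueue.Exists(Path))
            MessageQueue.Create(Path);
        using (var queue = new MessageQueue(Path))
            queue.Send(payload, "incoming data");
    }

    // Called by the worker process; blocks until a message arrives.
    public static string ReceiveOne()
    {
        using (var queue = new MessageQueue(Path))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
            using (Message message = queue.Receive())
                return (string)message.Body;
        }
    }
}
```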
If you have so much data that it cannot be processed in real time, I would probably set up the service to do the following:
ProcessRecordViaPost:
Create a new record in a "Queue" database with a UniqueID and all the other info to be processed.
Return the UniqueID to the client immediately.
ReadRecordViaGet:
Check the queue; if the record has been processed, return the data; if not, return a status code (the number of items in the queue ahead of it?).
I would also have a Windows service that continually grabs the oldest item from the queue, processes it, and moves on to the next oldest.
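A sketch of that loop, with hypothetical helpers (GetOldestPending, Process, MarkProcessed) standing in for the real "Queue" database access and processing:

```csharp
using System;
using System.Threading;

public sealed class QueueRecord
{
    public Guid UniqueId { get; set; }
    // ... other info to be processed ...
}

public sealed class QueueWorker
{
    public void Run(CancellationToken token)
    {
        while (!token.IsCancellationRequested)
        {
            QueueRecord record = GetOldestPending();       // hypothetical DB read
            if (record == null)
            {
                Thread.Sleep(TimeSpan.FromSeconds(5));     // queue empty; back off
                continue;
            }
            string result = Process(record);               // the actual heavy work
            MarkProcessed(record.UniqueId, result);        // ReadRecordViaGet sees this
        }
    }

    // Placeholders for the real data layer and processing logic.
    private QueueRecord GetOldestPending() => null;
    private string Process(QueueRecord record) => "done";
    private void MarkProcessed(Guid id, string result) { }
}
```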