I am building a simple ASP.NET website that will receive SMSes via a 3rd-party SMS API (through URL push), and then acknowledge the senders by sending SMSes back to them. I save the incoming SMSes in a database. I have implemented the logic as below:
public partial class _Default : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Declaring variables to be used elsewhere, but in this same class. I have just this one class.
        // Grabbing the URL parameters here (contents of the SMS).
        // Saving SMSes in the database. Database interaction happens only inside Page_Load.

        // Call these functions
        SomeLogicFunction();
        SendSMSFunction();
    }

    private void SomeLogicFunction()
    {
    }

    private void SendSMSFunction()
    {
    }
}
Now, I have read somewhere that ASP.NET takes care of multi-threading and similar aspects. So, for a simple website like this, do I need to take care of multi-threading myself, or does Page_Load/ASP.NET pretty much handle it automatically?
If the answer is that I don't need to do anything, then it's all awesome!
But if I do have to take care of multi-threading, can you please share some tips on how I should approach it? I expect a few thousand SMSes.
Thanks.
By default, for each request that comes in, ASP.NET grabs a thread-pool thread, makes a new instance of your Page class, and calls the appropriate Page_Load and event handlers on that thread. It will use multiple threads for multiple simultaneous requests. Just be careful that any state you keep in static members and fields is properly shared and synchronized. Of course, avoid shared state if possible.
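For example, a counter kept in a static field must be synchronized, because each request mutates it from a different pool thread. A minimal sketch of what that looks like (SmsStats and its counter are hypothetical, purely for illustration):

using System.Threading;

public static class SmsStats
{
    private static readonly object _lock = new object();
    private static int _received;

    // Called from Page_Load on whatever pool thread handles the request.
    public static void RecordReceived()
    {
        lock (_lock) { _received++; }  // or: Interlocked.Increment(ref _received)
    }

    public static int Received
    {
        get { lock (_lock) { return _received; } }
    }
}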
ASP.NET and IIS will begin to reject requests if there are enough requests already being processed, so make sure your processing times are sufficiently fast. If you run into bottlenecks, you can increase the number of requests happening in parallel. If you have really high load (say hundreds of requests a second), there are asynchrony APIs you can use to increase the number of in-flight requests even further. But start out simple, of course, and you'll probably be fine.
In this particular case, I think you would benefit from using multi-threading in your ASP.NET application.
Let me first explain how multi-threading works in ASP.NET:
ASP.NET provides a fixed number of threads to handle requests; when the maximum number of threads is in use, incoming requests are placed in a queue, potentially blocking the web application. If requests keep arriving faster than they can be handled, you will end up with a Service Unavailable error.
Therefore, one simple way to increase scalability in your application is to release the threads ASP.NET uses to handle your request as soon as possible.
A way to do this is to create an async page (you could also create an async HttpHandler). When you create an async page, the long-running work is moved to another thread, releasing the ASP.NET thread almost immediately. When your work has finished, a new thread is used to complete the page (the whole page life cycle isn't re-run, so this step is cheaper), and finally the response is sent to your client.
As you can see, because you release your ASP.NET threads almost immediately, new incoming requests can be handled by ASP.NET.
As an example, consider the following answer:
https://stackoverflow.com/a/11525692/1268570
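In the same spirit, here is a minimal sketch of an async Web Forms page (this assumes .NET 4.5+ and Async="true" in the @ Page directive; the SmsReceiver class name and gateway URL are made up for illustration):

using System;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.UI;

public partial class SmsReceiver : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Register the long-running work; the request thread is released
        // while the task awaits I/O instead of blocking on it.
        RegisterAsyncTask(new PageAsyncTask(SendSmsAsync));
    }

    private async Task SendSmsAsync()
    {
        using (var client = new HttpClient())
        {
            // Hypothetical SMS-gateway endpoint.
            await client.GetAsync("https://sms.example.com/send?to=...&text=ack");
        }
    }
}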
The real question to ask in your case is: are the functions SomeLogicFunction() and SendSMSFunction() blocking or non-blocking? (I.e., do they block further code until the SMSes are sent, or does your Page_Load() processing resume while the messages continue to be sent asynchronously?)
If even one function blocks, then you will "have to" implement multi-threading, because you must create a separate thread for those functions to run in parallel while your Page_Load() processing continues. On the other hand, if they are non-blocking, then ASP.NET will take care of them, running them on separate threads if required or mandated by the framework.
Related
I have two web pages. On one page I upload a file and process its data, which takes a lot of time to complete. On the other page I simply render data from the database.
I have implemented this application in C# MVC.
My requirement is that once the user uploads the file, the file processing starts in the background and the user can navigate to other pages.
Can we achieve this through an asynchronous controller?
You are saying that processing the data takes a lot of time. Using an asynchronous controller, you will free up the web server to serve other requests; however, the request will complete in the same time as it would when invoked synchronously. (source: https://msdn.microsoft.com/en-us/library/ee728598%28v=vs.100%29.aspx)
If you do not want your user to wait, add a job queue to your stack, tell the user that you've accepted the file and are processing it, and notify him when the operation completes.
There are many job queue implementations available in .NET, a concrete suggestion would depend on whether you're running on "full" .NET or .NET Core.
Using async controllers won't do what you want here, although you should still use them as a first step. Async controllers just free up server threads so that more requests can be processed; without async, any long-running operation blocks the thread it's using and stops other requests from being processed. If there are enough long-running requests, other client requests will get rejected.
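As that first step, a minimal sketch of a task-based async action (ASP.NET MVC 4+; ProcessFileAsync is a hypothetical placeholder for the real work):

using System;
using System.IO;
using System.Threading.Tasks;
using System.Web;
using System.Web.Mvc;

public class UploadController : Controller
{
    [HttpPost]
    public async Task<ActionResult> Upload(HttpPostedFileBase file)
    {
        // The await frees the request thread for other requests,
        // but this client still waits until processing completes.
        await ProcessFileAsync(file.InputStream);
        return Json(new { status = "done" });
    }

    private Task ProcessFileAsync(Stream input)
    {
        // Placeholder for the real long-running processing.
        return Task.Delay(TimeSpan.FromSeconds(1));
    }
}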
You'll also (or instead) need a different mechanism to process the file. For example, the API action could just put the file in a folder, and another (non-web) process could monitor that folder and pick up new files to process. Alternatively, you could look at queuing or message-bus technology; this adds complexity but also gives you safety around queue processing.
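A minimal sketch of that drop-folder idea as a separate console process (the folder path is an assumption; real code would also handle failures and avoid double-processing):

using System;
using System.IO;

class FolderProcessor
{
    static void Main()
    {
        var watcher = new FileSystemWatcher(@"C:\uploads");
        watcher.Created += (s, e) => ProcessFile(e.FullPath);
        watcher.EnableRaisingEvents = true;

        Console.WriteLine("Watching for files. Press Enter to exit.");
        Console.ReadLine();
    }

    static void ProcessFile(string path)
    {
        // Real processing goes here; wrap it in try/catch and move the file
        // to a 'done' or 'failed' folder so nothing is processed twice.
        Console.WriteLine("Processing " + path);
    }
}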
The other thing to consider is how you report validation issues or errors back to the uploading client. You can do some checks in the API action, but you'll probably still need a way to notify clients when an error occurs while processing a file. How best to do this will depend on your system.
I have searched high and low for a solution to this. Any insights will be highly appreciated.
The situation: when there are multiple PageMethod calls on a single page, each method call holds a lock on the Session object, blocking the others. The PageMethod calls can be made asynchronous only when EnableSessionState in the @ Page directive is set to False or ReadOnly.
Findings: when the @ Page directive is at its default (read/write) but the session is not used anywhere on the page, the calls are not blocked. Any read of or write to the session at the page level blocks the PageMethod calls.
The problem: setting EnableSessionState=ReadOnly in the @ Page directive is very restrictive, and I don't want to take that route.
Can the PageMethod calls not block, yet still access the session (maybe not write, but at least read)?
I don't believe you can do this without implementing your own session provider. There's some info on this MSDN page.
ASP.NET applications are inherently multithreaded. Because requests that arrive in parallel are processed on concurrent threads drawn from a thread pool, it's possible that two or more requests targeting the same session will execute at the same time. (The classic example is when a page contains two frames, each targeting a different ASPX in the same application, causing the browser to submit overlapping requests for the two pages.) To avoid data collisions and erratic behavior, the provider "locks" the session when it begins processing the first request, causing other requests targeting the same session to wait for the lock to come free.
Because there's no harm in allowing concurrent requests to perform overlapping reads, the lock is typically implemented as a reader/writer lock; that is, one that allows any number of threads to read a session but that prevents overlapping reads and writes as well as overlapping writes.
Answer detailed in Sessions in Asynchronous design
Hi,
I have an ASP.NET application to which I have added a web service that contains a "fire and forget" method. When this method is executed, it starts a loop (0-99999), and on every iteration it reads an XML file and saves it to the database.
The problem is that this action takes a couple of hours and usually ends with a ThreadAbortException.
I have seen that you can increase the executionTimeout, and this is how:
<httpRuntime executionTimeout="604800"/>
<compilation debug="true">
But this does not help.
I have also tried adding a Thread.Sleep within the loop. If I set it to 500, it gets about halfway, and if I set it below 100, it only gets through a couple of thousand iterations before the ThreadAbortException.
How can I solve this?
Don't run the loop inside the web service. Instead, put it in a console app, a WinForms app, or possibly even a Windows service, and use the web service to start up the other program.
A web service is basically a web page, and ASP.NET web pages are not meant to host long-running processes.
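The hand-off itself can be tiny. A minimal sketch, assuming the import loop lives in a hypothetical XmlImporter.exe console app:

using System.Diagnostics;
using System.Web.Services;

public class ImportLauncher : WebService
{
    [WebMethod]
    public void StartImport()
    {
        // Launch the console app and return immediately; the web request
        // no longer owns the long-running work.
        Process.Start(@"C:\tools\XmlImporter.exe");
    }
}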
This article does not directly answer your question, but contains a snippet of info relevant to my answer:
http://msdn.microsoft.com/en-us/magazine/dd296718.aspx
However, when the duration of the operation grows longer than the typical ASP.NET session duration (20 minutes) or requires multiple actors (as in my hiring example), ASP.NET does not offer sufficient support. You may recall that the ASP.NET worker processes automatically shut down on idle and periodically recycle themselves. This will cause big problems for long-running operations, as state held within those processes will be lost.
and the article is a good read, at any rate. It may offer ideas for you.
Not sure if this is 'the answer', but when you receive the web service call you could consider dispatching the action onto another thread. That could then run until completion. You would want to consider how you ensure that someone doesn't kick off two of these processes simultaneously though.
I have a ASP.NET application where I have added a Webservice that contains a "fire and forget" method. When this method is executed it will start a loop (0-99999) and for every loop it will read a xml file and save it to the database.
Let's not go into the fact that I find this approach quite... hm... bad, for many reasons (like a recycle in the middle of the run). I would queue the request, then return, and have a queue listener do the processing with transactional integrity.
Anyhow, what you CAN do is:
Queue a work item for a thread-pool thread to pick up.
Return immediately.
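A minimal sketch of those two steps (DoImport is a hypothetical method containing the loop):

using System.Threading;
using System.Web.Services;

public class ImportService : WebService
{
    [WebMethod]
    public void StartImport()
    {
        // Queue the work for a thread-pool thread...
        ThreadPool.QueueUserWorkItem(_ => DoImport());
        // ...and return immediately. The caveats below still apply.
    }

    private static void DoImport()
    {
        // The 0-99999 read-and-save loop would go here.
    }
}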
Besides that, web services and the like are not a good place for hours-long running processes. Kick off a workflow and handle it separately.
I’m looking for the best way of using threads considering scalability and performance.
In my site I have two scenarios that need threading:
UI trigger: for example, the user clicks a button and the server should read data from the DB and send some emails. Those actions take time and I don't want the user's request to be delayed. This scenario happens very frequently.
Background service: when the app starts, it triggers a thread that runs every 10 minutes, reads from the DB, and sends emails.
The solutions I found:
A. Use thread pool - BeginInvoke:
This is what I use today for both scenarios.
It works fine, but it uses the same threads that serve the pages, so I think I may run into scalability issues. Can this become a problem?
B. No use of the pool – ThreadStart:
I know starting a new thread takes more resources than using a thread pool.
Can this approach work better for my scenarios?
What is the best way to reuse the threads once they're started?
C. Custom thread pool:
Because my scenarios occur frequently, maybe the best way is to start a new thread pool?
Thanks.
I would personally put this into a different service. Make your UI action write to the database, and have a separate service which either polls the database or reacts to a trigger, and sends the emails at that point.
By separating it into a different service, you don't need to worry about AppDomain recycling etc., and you can put it on an entirely different server if and when you want to. I think it'll give you a more flexible solution.
I do this kind of thing by calling a webservice, which then calls a method using a delegate asynchronously. The original webservice call returns a Guid to allow tracking of the processing.
For the first scenario, use ASP.NET asynchronous pages. Async pages are a very good choice when it comes to scalability, because during async execution the HTTP request thread is released and can be re-used.
I agree with Jon Skeet that for the second scenario you should use a separate service; a Windows service is a good choice here.
Out of your three solutions, don't use BeginInvoke. As you said, it will have a negative impact on scalability.
Between the other two, if the tasks are truly background work and the user isn't waiting for a response, then a single, permanent thread should do the job. A thread pool makes more sense when you have multiple tasks that should execute in parallel.
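A minimal sketch of such a permanent worker thread fed by an in-memory queue (requires .NET 4 for BlockingCollection; EmailJob is a hypothetical type):

using System.Collections.Concurrent;
using System.Threading;

public class EmailJob
{
    public void Send() { /* build and send the email here */ }
}

public static class EmailWorker
{
    private static readonly BlockingCollection<EmailJob> _queue =
        new BlockingCollection<EmailJob>();

    static EmailWorker()
    {
        var worker = new Thread(() =>
        {
            // Blocks while the queue is empty; processes jobs one by one.
            foreach (var job in _queue.GetConsumingEnumerable())
                job.Send();
        });
        worker.IsBackground = true; // don't keep the process alive on shutdown
        worker.Start();
    }

    // Called from page code; returns immediately.
    public static void Enqueue(EmailJob job)
    {
        _queue.Add(job);
    }
}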
However, keep in mind that web servers sometimes crash, AppPools recycle, etc. So if any of the queued work needs to be executed reliably, then moving it out of process is probably a better idea (such as into a Windows service). One way of doing that, which preserves the order of requests and maintains persistence, is to use Service Broker: you write the request to a Service Broker queue from your web tier (with an async request), and then read those messages from a service running on the same machine or a different one. You can also scale nicely that way by simply adding more instances of the service (or more threads in it).
In case it helps, I walk through using both a background thread and Service Broker in detail in my book, including code examples: Ultra-Fast ASP.NET.
Using C# ASP.NET, I want to program a queue. I want to have X instances of a process; when one finishes, it should take the next item on the list and process it. I figure the simplest way is to insert into and delete from an SQL database. My problem is how to start this when I add the first item. Do I launch a separate thread? AFAIK every connection to my development environment and server is its own thread. I would need to lock something, launch a thread to process the list, then unlock and let the thread keep going until it's done. So:
1) Should I be launching threads? If so, what kind? (I haven't done any multithreading in C# yet.)
2) Should I have a static mutex in my ASP.NET project and lock it when launching threads? (Static variables are still shared across ASP.NET connections/threads, correct?) Or should I not be doing this and launch them a different way?
NOTE: I may want to launch two processes instead of one, and I may want to launch other processes for other things (for example, 2 FFmpeg + 5 ImageMagick).
A typical ASP.NET application will actually share its pool of threads across many requests rather than dedicating a thread to each (although it is possible to configure one thread per request, I wouldn't recommend it).
Also, any work being done during an ASP.NET request should be completed by the time you finish returning your response to the client, or it risks being terminated; this includes any child threads you spawn, which can be killed when the application pool recycles.
Your best bet here is to set up MSMQ (or perhaps even Amazon's SQS) and have a Windows service that pulls messages off the queue and processes them. The process would look like this: the web request enqueues a message describing the job and returns immediately; the service dequeues messages and runs the FFmpeg/ImageMagick work; to run X jobs at once, give the service X worker threads.
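A minimal sketch of both ends of that MSMQ hand-off (the queue path is an assumption; create the queue up front):

using System.Messaging;

static class JobQueue
{
    private const string Path = @".\private$\jobs";

    // Web tier: enqueue a job id and return immediately.
    public static void Send(string jobId)
    {
        using (var queue = new MessageQueue(Path))
        {
            queue.Send(jobId);
        }
    }

    // Windows service: block until a message arrives, then process it.
    public static void ReceiveOne()
    {
        using (var queue = new MessageQueue(Path))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
            var message = queue.Receive();
            var jobId = (string)message.Body;
            // Run the FFmpeg/ImageMagick work for jobId here.
        }
    }
}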
First off, it sounds like what you really want is a Windows Service.
That said, if you're committed to using ASP.NET for this, the following might work:
Make a single .aspx page whose sole purpose is to process one unit of work.
Have another page (HTML will do) that uses JavaScript to asynchronously load your first page (using something like jQuery's .load() method).
When the .load() callback fires, you'll know the job is complete and you can make another request to process the next job.
Optionally, you can put something into the returned page to indicate whether or not there are any remaining units of work left in the queue. This would allow you to throttle back on the client-side when there's not any work to be done.
Client-Side Example
<script type="text/javascript">
    $(document).ready(function() {
        processJob();
    });

    function processJob() {
        $('#result').load("ProcessOneJob.aspx", function() {
            // called when ProcessOneJob.aspx comes back
            processJob();
        });
    }
</script>
Pros to this approach:
Relatively simple to implement
No need to deal with threads/locking
No service to install
Cons
Relies entirely on having a machine somewhere with a browser open, pointing at this page.
A brainstorm answer; I don't know if it will work.
The issue is that threads get terminated when a request finishes, so create a thread outside of a request, in Application_Start, to avoid the problem.
To keep everything organized and simple, have a master class that acts as a utility, controlling the number of each process you would like to launch and doing the actual launching. Call the class in void Application_Start(object sender, EventArgs e) to create the initial master thread, which will launch any processes (or threads) you'll need. Allow it to put itself to sleep, and use the utility methods to wake it up and pass messages. Then handle the rest as needed :)
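A minimal sketch of that master-thread idea in Global.asax (Signal and LaunchPending are hypothetical names; the thread sleeps on an event and is woken explicitly):

using System;
using System.Threading;

public class Global : System.Web.HttpApplication
{
    private static readonly AutoResetEvent _wakeUp = new AutoResetEvent(false);
    private static Thread _master;

    protected void Application_Start(object sender, EventArgs e)
    {
        _master = new Thread(() =>
        {
            while (true)
            {
                _wakeUp.WaitOne();  // sleep until signaled
                LaunchPending();    // launch or check worker processes
            }
        });
        _master.IsBackground = true;
        _master.Start();
    }

    // Call this from page code to wake the master thread.
    public static void Signal()
    {
        _wakeUp.Set();
    }

    private static void LaunchPending()
    {
        // e.g. start up to 2 FFmpeg and 5 ImageMagick processes here.
    }
}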