PageMethods and Session - C#

I have searched high and low for this solution. Any insights will be highly appreciated.
The Situation: When there are multiple PageMethod calls on a single page, each call holds a lock on the Session object, blocking the others. The PageMethod calls can only run concurrently when EnableSessionState in the @Page directive is set to False or ReadOnly.
Findings: When the @Page directive is left at its default (read/write) but the session is not used anywhere on the page, the calls are not blocked. Any read from or write to the session at the page level blocks the PageMethod calls.
The Problem: Setting EnableSessionState="ReadOnly" in the @Page directive is very restrictive, and I don't want to take that route.
Can the PageMethod calls avoid blocking and still access the session (maybe not write, just read)?
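For reference, the session behavior in question is controlled per page through the EnableSessionState attribute of the @Page directive. A minimal sketch (the file and class names here are hypothetical):

```aspx
<%@ Page Language="C#" CodeBehind="Default.aspx.cs"
    Inherits="MyApp._Default"
    EnableSessionState="ReadOnly" %>
<%-- "ReadOnly" takes a shared (read) lock, so requests can overlap;
     "False" skips session entirely; "True" (the default) takes an
     exclusive lock and serializes requests in the same session. --%>
```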

I don't believe you can do this without implementing your own session provider. There's some info on this MSDN page.
ASP.NET applications are inherently multithreaded. Because requests that arrive in parallel are processed on concurrent threads drawn from a thread pool, it's possible that two or more requests targeting the same session will execute at the same time. (The classic example is when a page contains two frames, each targeting a different ASPX in the same application, causing the browser to submit overlapping requests for the two pages.) To avoid data collisions and erratic behavior, the provider "locks" the session when it begins processing the first request, causing other requests targeting the same session to wait for the lock to come free.
Because there's no harm in allowing concurrent requests to perform overlapping reads, the lock is typically implemented as a reader/writer lock - that is, one that allows any number of threads to read a session but that prevents overlapping reads and writes as well as overlapping writes.
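The reader/writer pattern the quote describes can be illustrated with ReaderWriterLockSlim. This is just a sketch of the locking semantics, not the actual session provider code:

```csharp
using System.Threading;

// Many readers may hold the lock simultaneously; a writer requires
// exclusive access, which is why read/write requests serialize.
var sessionLock = new ReaderWriterLockSlim();

// A read-only request (EnableSessionState="ReadOnly") takes a shared lock:
sessionLock.EnterReadLock();
try { /* read session values */ }
finally { sessionLock.ExitReadLock(); }

// A read/write request (the default) takes an exclusive lock and blocks
// every other request for the same session until it finishes:
sessionLock.EnterWriteLock();
try { /* read and write session values */ }
finally { sessionLock.ExitWriteLock(); }
```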

Answer detailed in Sessions in Asynchronous design

Related

Thread.Sleep in ASP.NET

I'm simulating the comet live feed protocol for my site, so in my controller I'm adding:
while (nothing_new && before_timeout)
{
    Thread.Sleep(1000);
}
But I noticed the whole website got slow after I added this feature. After debugging I concluded that when I call Thread.Sleep all the threads, even in other requests, are being blocked.
Why does Thread.Sleep block all threads, not just the current thread, and how should I deal with an issue like this?
What #Servy said is correct. In addition to his answer I would like to throw in my 2 cents. I bet you are using ASP.NET sessions and you are sending parallel requests from the same session (for example, multiple AJAX requests). The catch is that the ASP.NET Session is not thread-safe, so you cannot have parallel requests from the same session: ASP.NET will simply serialize the calls and execute them sequentially.
That's why you are observing this blocking. It will block only requests from the same ASP.NET session. If you send an HTTP request from a different session it won't block. This behavior is by design, and you can read more about it here.
ASP.NET sessions are like a cancer, and I recommend disabling them as soon as you find out that they are being used in a web application:
<sessionState mode="Off" />
No more queuing. Now you've got a scalable application.
I concluded that when I call thread.sleep all the threads even in other requests are being blocked
That conclusion is incorrect. Thread.Sleep does not block any other thread, it only blocks the current thread. If multiple threads are all being blocked by this line of code then it is because all of those threads are hitting this line of code.
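As an aside, for a comet-style long poll the blocking wait can be avoided entirely by awaiting instead of sleeping. A hedged sketch for an async MVC action, assuming ASP.NET MVC 4+ on .NET 4.5; NothingNew and GetLatest are hypothetical helpers standing in for the feed check:

```csharp
// While awaiting Task.Delay the request thread is returned to the
// pool instead of sleeping, so other requests are not starved.
public async Task<ActionResult> Poll()
{
    var deadline = DateTime.UtcNow.AddSeconds(30);
    while (NothingNew() && DateTime.UtcNow < deadline)
    {
        await Task.Delay(TimeSpan.FromSeconds(1)); // no thread is held here
    }
    return Json(GetLatest(), JsonRequestBehavior.AllowGet);
}
```

Note that the session-serialization issue above still applies unless the controller opts out of session state (or marks it read-only).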

Multi-threading required in ASP.NET websites?

I am building a simple ASP.NET website, which will receive SMSes via a 3rd party SMS API (through URL push), and then the website will acknowledge the senders by sending SMSes back to them. I am saving the incoming SMSes in a database. I have implemented the logic as below:
public partial class _Default : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Declaring variables to be used elsewhere, but in this same class.
        // I have just this one class.
        // Grabbing the URL parameters here (contents of the SMS).
        // Saving SMSes in the database. Database interaction happens
        // only inside Page_Load.

        // Call these functions
        SomeLogicFunction();
        SendSMSFunction();
    }

    void SomeLogicFunction()
    {
    }

    void SendSMSFunction()
    {
    }
}
Now, I have read somewhere that ASP.NET takes care of multi-threading and similar aspects. So, if I have a simple website like this, do I need to take care of multi-threading? Or Page_Load function/ASP.NET pretty much handles it automatically?
If the answer is I don't need to do anything, then its all awesome!
But if I have to take care of multi-threading, can you please help with some tips on how should I approach? I expect few thousand SMSes.
Thanks.
By default, for each request that comes in, ASP.NET grabs a thread pool thread, makes a new instance of your Page class, and calls the appropriate Page_Load and event functions on that thread. It will use multiple threads for multiple simultaneous requests. Just be careful that any state you keep in static members and fields is properly shared and synchronized. Of course, avoid shared state if possible.
ASP.NET and IIS will begin to reject requests if there are enough requests already being processed, so make sure your processing times are sufficiently fast. If you run into bottlenecks, you can increase the number of requests happening in parallel. If you have really high load (say hundreds of requests a second), there are asynchrony APIs you can use to increase the number of in-flight requests even further. But start out simple, of course, and you'll probably be fine.
In this particular case I think you would benefit from using multi-threading in your ASP.NET application.
Let me first explain how multi-threading works in ASP.NET.
ASP.NET provides a fixed number of threads to handle requests. When the maximum number of threads is in use, incoming requests are placed in a queue, potentially blocking the web application. If incoming requests keep arriving at the server, you end up with a Service Unavailable error.
Therefore one simple way to increase scalability in your application is to release the threads used by ASP.NET to handle your request as soon as possible.
A way to do this is to create an async page (you could also create an async HttpHandler). When you create an async page, the time-consuming process is placed on another thread, releasing the ASP.NET thread almost immediately. When your process has finished, a new thread is used to instantiate a new instance of your page (the whole page life cycle won't be run, though, so this process is cheaper) and finally the response is sent to your client.
As you can see, since you release your ASP.NET threads almost immediately, new incoming requests can be handled by ASP.NET.
As an example, consider the following answer:
https://stackoverflow.com/a/11525692/1268570
The real question to ask in your case is: are the functions SomeLogicFunction() and SendSMSFunction() blocking or non-blocking? (i.e. do they block further code until the SMSes are sent, or does Page_Load() processing resume while the messages continue to be sent asynchronously?)
If even one function blocks, then you will "have to" implement multithreading, because you must create a separate thread for those functions to run in parallel while your Page_Load() processing goes on. On the other hand, if they are non-blocking, then ASP.NET will take care of them, running them on separate threads if required or mandated by the framework.
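For completeness, the asynchronous-page approach mentioned in the answers looks roughly like this. The sketch assumes Async="true" in the @Page directive and .NET 4.5; SendSmsAsync is a hypothetical helper wrapping the SMS gateway call:

```csharp
protected void Page_Load(object sender, EventArgs e)
{
    // RegisterAsyncTask defers the work until after the synchronous
    // page events; the request thread is released while awaiting.
    RegisterAsyncTask(new PageAsyncTask(async () =>
    {
        await SendSmsAsync(); // hypothetical async send
    }));
}
```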

Thread Aborted?

Hi,
I have an ASP.NET application where I have added a Webservice that contains a "fire and forget" method. When this method is executed it will start a loop (0-99999) and for every iteration it will read an XML file and save it to the database.
The problem is that this action takes a couple of hours and it usually ends with a ThreadAborted exception.
I have seen that you can increase the executionTimeout, like this:
<httpRuntime executionTimeout="604800"/>
<compilation debug="true">
But this does not help.
I have also tried adding a Thread.Sleep within the loop. If I set it to 500 the loop gets about halfway, and if I set it below 100 it only gets through a couple of thousand iterations before the ThreadAborted exception.
How can I solve this?
Don't run the loop inside the web service. Instead, have it in a console app, a winforms app, or possibly even a windows service. Use the web service to start up the other program.
A web service is basically a web page, and asp.net web pages are not meant to host long running processes.
This article does not directly answer your question, but contains a snippet of info relevant to my answer:
http://msdn.microsoft.com/en-us/magazine/dd296718.aspx
However, when the duration of the operation grows longer than the typical ASP.NET session duration (20 minutes) or requires multiple actors (as in my hiring example), ASP.NET does not offer sufficient support. You may recall that the ASP.NET worker processes automatically shut down on idle and periodically recycle themselves. This will cause big problems for long-running operations, as state held within those processes will be lost.
and the article is a good read, at any rate. It may offer ideas for you.
Not sure if this is 'the answer', but when you receive the web service call you could consider dispatching the action onto another thread. That could then run until completion. You would want to consider how you ensure that someone doesn't kick off two of these processes simultaneously though.
I have an ASP.NET application where I have added a Webservice that contains a "fire and forget" method. When this method is executed it will start a loop (0-99999) and for every iteration it will read an XML file and save it to the database.
Let's not go into the fact that I find this approach quite... hm... bad, for many reasons (like a reset in the middle of the run). I would queue the request, then return, and have a queue listener do the processing with transactional integrity.
Anyhow, what you CAN do is:
Queue a work item for a thread-pool thread to pick things up.
Return immediately.
Besides that, web services and the like are not a good place for hours-long running processes. Kick off a workflow and handle it separately.
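The queue-a-work-item idea above, in its simplest form (ImportAllFiles is a hypothetical method holding the long loop; note this in-process variant is still vulnerable to worker-process recycling, which is why the durable-queue suggestion is the safer design):

```csharp
[WebMethod]
public void StartImport()
{
    // Hand the long-running loop to a thread-pool thread and
    // return to the caller immediately ("fire and forget").
    ThreadPool.QueueUserWorkItem(_ => ImportAllFiles());
}
```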

ASP.NET Threading: should I use the pool for DB and Emails actions?

I’m looking for the best way of using threads considering scalability and performance.
In my site I have two scenarios that need threading:
UI trigger: for example the user clicks a button, the server should read data from the DB and send some emails. Those actions take time and I don’t want the user request getting delayed. This scenario happens very frequently.
Background service: when the app starts, it triggers a thread that runs every 10 min, reads from the DB and sends emails.
The solutions I found:
A. Use thread pool - BeginInvoke:
This is what I use today for both scenarios.
It works fine, but it uses the same threads that serve the pages, so I think I may run into scalability issues, can this become a problem?
B. No use of the pool – ThreadStart:
I know starting a new thread takes more resources than using a thread pool.
Can this approach work better for my scenarios?
What is the best way to reuse the opened threads?
C. Custom thread pool:
Because my scenarios occur frequently, maybe the best way is to start a new thread pool?
Thanks.
I would personally put this into a different service. Make your UI action write to the database, and have a separate service which either polls the database or reacts to a trigger, and sends the emails at that point.
By separating it into a different service, you don't need to worry about AppDomain recycling etc - and you can put it on an entire different server if and when you want to. I think it'll give you a more flexible solution.
I do this kind of thing by calling a webservice, which then calls a method using a delegate asynchronously. The original webservice call returns a Guid to allow tracking of the processing.
For the first scenario use ASP.NET asynchronous pages. Async pages are a very good choice when it comes to scalability, because during async execution the HTTP request thread is released and can be re-used.
I agree with Jon Skeet, that for second scenario you should use separate service - windows service is a good choice here.
Out of your three solutions, don't use BeginInvoke. As you said, it will have a negative impact on scalability.
Between the other two, if the tasks are truly background and the user isn't waiting for a response, then a single, permanent thread should do the job. A thread pool makes more sense when you have multiple tasks that should be executing in parallel.
However, keep in mind that web servers sometimes crash, AppPools recycle, etc. So if any of the queued work needs to be reliably executed, then moving it out of process is probably a better idea (such as into a Windows Service). One way of doing that, which preserves the order of requests and maintains persistence, is to use Service Broker. You write the request to a Service Broker queue from your web tier (with an async request), and then read those messages from a service running on the same machine or a different one. You can also scale nicely that way by simply adding more instances of the service (or more threads in it).
In case it helps, I walk through using both a background thread and Service Broker in detail in my book, including code examples: Ultra-Fast ASP.NET.
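For the second (10-minute) scenario, if the job does stay in-process despite the recycling caveats above, one common sketch is a timer started from Application_Start in Global.asax (SendPendingEmails is a hypothetical method standing in for the DB-read-and-email job):

```csharp
public class Global : System.Web.HttpApplication
{
    // Keep a static reference so the timer isn't garbage-collected.
    private static System.Threading.Timer _emailTimer;

    protected void Application_Start(object sender, EventArgs e)
    {
        _emailTimer = new System.Threading.Timer(
            _ => SendPendingEmails(),   // hypothetical job
            null,
            TimeSpan.Zero,              // first run immediately
            TimeSpan.FromMinutes(10));  // then every 10 minutes
    }

    private static void SendPendingEmails() { /* read DB, send emails */ }
}
```

This runs on a thread-pool thread rather than a request thread, so it does not delay page requests, but it dies silently when the AppDomain recycles; a Windows Service avoids that.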

multi threading a web application

I know there are many cases which are good candidates for multi-threading in an application, but when is it best to multi-thread a .NET web application?
A web application is almost certainly already multi-threaded by the hosting environment (IIS etc.). If your page is CPU-bound (and you want to use multiple cores), then arguably multiple threads is a bad idea, since when your system is under load you are already using them all.
The time it might help is when you are IO bound; for example, you have a web-page that needs to talk to 3 external web-services, talk to a database, and write a file (all unrelated). You can do those in parallel on different threads (ideally using the inbuilt async operations, to maximise completion-port usage) to reduce the overall processing time - all without impacting local CPU overly much (here the real delay is on the network).
Of course, in such cases you might also do better by simply queuing the work in the web application, and having a separate service dequeue and process them - but then you can't provide an immediate response to the caller (they'd need to check back later to verify completion etc).
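The I/O-bound case above can be sketched with task-based async calls. The client objects and method names here are hypothetical placeholders for the unrelated web services and the database:

```csharp
// Start all the unrelated I/O operations, then await them together.
// No thread is burned while the calls are in flight, and total
// latency approaches the slowest single call instead of the sum.
var a  = serviceA.GetAsync();       // hypothetical web-service call
var b  = serviceB.GetAsync();
var c  = serviceC.GetAsync();
var db = LoadFromDatabaseAsync();   // hypothetical DB call
await Task.WhenAll(a, b, c, db);
```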
IMHO you should avoid the use of multithreading in a web-based application.
Maybe a multithreaded application could increase performance in a standard app (with the right design), but in a web application you may want to keep high throughput instead of raw speed.
But if you only have a few concurrent connections, you may be able to use parallel threads without a global performance degradation.
Multithreading is a technique to provide a single process with more processing time to allow it to run faster. It has more threads, thus it eats more CPU cycles (from multiple CPUs, if you have any). For a desktop application this makes a lot of sense. But granting more CPU cycles to one web user would take away those same cycles from the 99 other users making requests at the same time! So technically, it's a bad thing.
However, a web application might use other services and processes that are using multiple threads. Databases, for example, won't create a separate thread for every user that connects to them. They limit the number of threads to just a few, adding connections to a connection pool for faster usage. As long as there are connections available or pooled, the user will have database access. When the database runs out of connections, the user will have to wait.
So, basically, the use of multiple threads could be used for web applications to reduce the number of active users at a specific moment! It allows the system to share resources with multiple users without overloading the resource. Instead, users will just have to stand in line before it's their turn.
This would not be multi-threading in the web application itself, but multi-threading in a service that is consumed by the web application. In this case, it's used as a limitation by only allowing a small amount of threads to be active.
In order to benefit from multithreading, your application has to do a significant amount of work that can be run in parallel. If this is not the case, the overhead of multithreading may very well outweigh the benefits.
In my experience most web applications consist of a number of short running methods, so apart from the parallelism already offered by the hosting environment, I would say that it is rare to benefit from multithreading within the individual parts of a web application. There are probably examples where it will offer a benefit, but my guess is that it isn't very common.
ASP.NET is already capable of spawning several threads for processing several requests in parallel, so for simple request processing there is rarely a case when you would need to manually spawn another thread. However, there are a few uncommon scenarios that I have come across which warranted the creation of another thread:
If there is some operation that might take a while and can run in parallel with the rest of the page processing, you might spawn a secondary thread there. For example, if there was a webservice that you had to poll as a result of the request, you might spawn another thread in Page_Init, and check for results in Page_PreRender (waiting if necessary). Though it's still a question whether this would be a performance benefit or not - spawning a thread isn't cheap, and the time between a typical Page_Init and Page_PreRender is measured in milliseconds anyway. Keeping a thread pool for this might be a little more efficient, and ASP.NET also has something called "asynchronous pages" that might be even better suited for this need.
If there is a pool of resources that you wish to clean up periodically. For example, imagine that you are using some weird DBMS that comes with limited .NET bindings, but there is no pooling support (this was my case). In that case you might want to implement the DB connection pool yourself, and this would necessitate a "cleaner thread" which would wake up, say, once a minute and check if there are connections that have not been used for a long while (and thus can be closed off).
Another thing to keep in mind when implementing your own threads in ASP.NET - ASP.NET likes to kill off its processes if they have been inactive for a while. Thus you should not rely on your thread staying alive forever. It might get terminated at any moment and you better be ready for it.
