I'm simulating the Comet live-feed protocol for my site, so in my controller I'm adding:
while(nothing_new && before_timeout){
Thread.Sleep(1000);
}
But I noticed the whole website got slow after I added this feature. After debugging I concluded that when I call Thread.Sleep all the threads, even in other requests, are being blocked.
Why does Thread.Sleep block all threads, not just the current thread, and how should I deal with an issue like this?
What @Servy said is correct. In addition to his answer I would like to throw in my 2 cents. I bet you are using ASP.NET Sessions and you are sending parallel requests from the same session (for example, multiple AJAX requests). The problem is that the ASP.NET Session is not thread safe, so you cannot have parallel requests from the same session: ASP.NET will simply serialize the calls and execute them sequentially.
That's why you are observing this blocking. It will block only requests from the same ASP.NET Session; if you send an HTTP request from a different session, it won't block. This behavior is by design, and you can read more about it here.
ASP.NET Sessions are like a cancer, and I recommend disabling them as soon as you find out that they are being used in a web application:
<sessionState mode="Off" />
No more queuing. Now you've got a scalable application.
I concluded that when I call Thread.Sleep all the threads, even in other requests, are being blocked
That conclusion is incorrect. Thread.Sleep does not block any other thread; it only blocks the current thread. If multiple threads are all being blocked by this line of code, then it is because all of those threads are hitting this line of code.
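As an aside (my sketch, not part of the original answers), the non-blocking way to write this kind of long-poll wait on .NET 4.5+ is to await instead of sleeping, so the request thread returns to the pool while waiting. NothingNew and BeforeTimeout are hypothetical stand-ins for the original nothing_new / before_timeout checks:

using System;
using System.Threading.Tasks;
using System.Web.Mvc;

public class FeedController : Controller
{
    public async Task<ActionResult> Poll()
    {
        DateTime start = DateTime.UtcNow;
        // Same loop shape as the question, but awaiting releases the
        // thread instead of holding it for the whole wait.
        while (NothingNew() && BeforeTimeout(start))
        {
            await Task.Delay(1000);
        }
        return new EmptyResult(); // placeholder: return the new data here
    }

    // Hypothetical helpers standing in for nothing_new / before_timeout.
    private bool NothingNew() { return false; }
    private bool BeforeTimeout(DateTime start)
    {
        return DateTime.UtcNow - start < TimeSpan.FromSeconds(30);
    }
}

Note that with sessions enabled, parallel requests from the same session will still be serialized, as described above.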
Related
Is async/await useful in a backend / webservice scenario?
Consider the case where there is only one thread for all requests / work. If this thread awaits a task, it is not blocked, but it also has no other work to do, so it just idles. (It can't accept another request because the current execution is waiting for the task to resolve.)
Now consider the case where there is one thread per request / "work item". The thread still idles, because the other request is handled by another thread.
The only case I can imagine where two async operations run at the same time is something like reading a file while sending an HTTP request. But that sounds like a rare case: I should read the file first and then post the content, not post something I didn't even read.
Now consider the case where there is one thread per request / "work item". The thread still idles, because the other request is handled by another thread.
That's closer to reality, but the server doesn't just keep adding threads ad infinitum; at some point it will let requests queue if there's no thread free to handle them. And that's where freeing up a thread that has no useful work to do at the moment starts winning.
It's hard to read your question without feeling that you misunderstand how web servers work and how async/await and threads work. To make it simple, think of it like this: async/await is almost always good to use when you query an external resource (e.g. a database, a web service/API, a file on disk). If you follow this simple rule, you don't need to think too deeply about each situation.
However, as you read and learn more on these subjects and gain experience, deeper thinking becomes essential in each case, because there are always exceptions to any rule; there are scenarios where the overhead of using async/await and threads outweighs their benefits. For example, Microsoft decided not to use it for the logger in ASP.NET Core, and there is even a comment about it in the source code.
In your case, the web server uses many more threads than you seem to think, and for many more reasons than you seem to think. Also, when a thread is idling waiting for something, it cannot do anything else. What async/await does is untie the thread from the current awaited task so the thread can go back to the pool and do something else; when the awaited task finishes, a thread (possibly a different one) is pulled out of the pool to continue the job. You seem to understand this to some degree, but perhaps you just don't know what else a thread in a web server can do. Believe me, there is a lot to do.
Finally, remember that threads are generic workers; they can do anything. Web servers may have specialized threads for different tasks, but those fall into two or three categories, and threads can still do anything within their category. Web servers can even move threads to different categories when required. All of that is done for you, so in most cases you don't need to think about it; just focus on freeing the threads so the web server can do its job.
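To make that concrete, here is a minimal sketch (my illustration, not from the answer) of querying an external resource; at the await, the thread is untied from the task and goes back to the pool. The URL is of course hypothetical:

using System.Net.Http;
using System.Threading.Tasks;

public class QuoteService
{
    private static readonly HttpClient Client = new HttpClient();

    public async Task<string> GetQuoteAsync()
    {
        // The calling thread is released here; when the response arrives,
        // a pool thread (possibly a different one) resumes the method.
        string body = await Client.GetStringAsync("https://example.com/api/quote");
        return body.Trim(); // the continuation runs on the resumed thread
    }
}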
Consider the case where there is only one thread for all requests / work.
I would challenge that premise; it is a very contrived case. Even before multi-core servers became standard, ASP.NET used 50+ threads per core.
If this thread awaits a task, it is not blocked, but it also has no other work to do, so it just idles.
No, it goes back into the pool to handle other requests. Most web services want to handle as many requests as possible with as few resources as possible. Servers handling only one client are an extremely rare edge case; most web services will handle as many requests as the plethora of clients throw at them.
I have searched high and low for this solution. Any insights will be highly appreciated.
The situation: when there are multiple PageMethod calls on a single page, each method call holds a lock on the Session object, blocking the others. The PageMethod calls can be made asynchronous only when EnableSessionState in the @ Page directive is set to False or ReadOnly.
Findings: when the @ Page directive is left at the default (read/write) but the session is not used anywhere on the page, the calls are not blocked. Any read of or write to the session at the page level blocks the PageMethod calls.
The problem: setting EnableSessionState=ReadOnly in the @ Page directive is very restrictive, and I don't want to take that route.
Can the PageMethod calls avoid blocking while still accessing the session (maybe not writing, just reading)?
I don't believe you can do this without implementing your own session provider. There's some info on this MSDN page.
ASP.NET applications are inherently multithreaded. Because requests that arrive in parallel are processed on concurrent threads drawn from a thread pool, it's possible that two or more requests targeting the same session will execute at the same time. (The classic example is when a page contains two frames, each targeting a different ASPX in the same application, causing the browser to submit overlapping requests for the two pages.) To avoid data collisions and erratic behavior, the provider "locks" the session when it begins processing the first request, causing other requests targeting the same session to wait for the lock to come free.
Because there's no harm in allowing concurrent requests to perform overlapping reads, the lock is typically implemented as a reader/writer lock, that is, one that allows any number of threads to read a session but that prevents overlapping reads and writes as well as overlapping writes.
The answer is detailed in Sessions in Asynchronous design.
Hi,
I have an ASP.NET application where I have added a web service that contains a "fire and forget" method. When this method is executed it starts a loop (0-99999), and on every iteration it reads an XML file and saves it to the database.
The problem is that this action takes a couple of hours, and it usually ends with a ThreadAbortException.
I have seen that you can increase the executionTimeout, and this is how:
<httpRuntime executionTimeout="604800"/>
<compilation debug="true">
But this does not help.
I have also tried adding a Thread.Sleep within the loop. If I set it to 500 the loop gets about halfway, and if I set it below 100 it only gets through a couple of thousand iterations before the ThreadAbortException.
How can I solve this?
Don't run the loop inside the web service. Instead, have it in a console app, a WinForms app, or possibly even a Windows Service. Use the web service to start up the other program.
A web service is basically a web page, and ASP.NET web pages are not meant to host long-running processes.
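A rough sketch of that split (the executable path and class names are illustrative, not from the original post): the web method does nothing but launch the console app and return:

using System.Diagnostics;
using System.Web.Services;

public class ImportService : WebService
{
    [WebMethod]
    public void StartImport()
    {
        // The console app owns the hours-long loop, so the ASP.NET
        // executionTimeout and worker-process recycling no longer apply.
        Process.Start(@"C:\Tools\XmlImporter.exe");
    }
}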
This article does not directly answer your question, but contains a snippet of info relevant to my answer:
http://msdn.microsoft.com/en-us/magazine/dd296718.aspx
However, when the duration of the operation grows longer than the typical ASP.NET session duration (20 minutes) or requires multiple actors (as in my hiring example), ASP.NET does not offer sufficient support. You may recall that the ASP.NET worker processes automatically shut down on idle and periodically recycle themselves. This will cause big problems for long-running operations, as state held within those processes will be lost.
and the article is a good read, at any rate. It may offer ideas for you.
Not sure if this is 'the answer', but when you receive the web service call you could consider dispatching the action onto another thread, which could then run until completion. You would want to consider how to ensure that someone doesn't kick off two of these processes simultaneously, though.
I have an ASP.NET application where I have added a web service that contains a "fire and forget" method. When this method is executed it starts a loop (0-99999), and on every iteration it reads an XML file and saves it to the database.
Let's not go into why I find this approach quite... hm... bad for many reasons (like a worker-process reset in the middle of the run). I would queue the request, then return, and have a queue listener do the processing with transactional integrity.
Anyhow, what you CAN do is:
Queue a work item for a thread-pool thread to pick things up.
Return immediately.
Besides that, web services and the like are not a good place for hours-long processes. Kick off a workflow and handle it separately.
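A minimal sketch of that queue-and-return shape (ImportAllFiles is a hypothetical stand-in for the XML-reading loop; note that a pool thread still dies if the worker process recycles, which is why a proper queue listener is the more robust option):

using System.Threading;
using System.Web.Services;

public class ImportQueueService : WebService
{
    [WebMethod]
    public void StartImport()
    {
        // Hand the work to a pool thread and return to the caller at once.
        ThreadPool.QueueUserWorkItem(delegate { ImportAllFiles(); });
    }

    private static void ImportAllFiles()
    {
        for (int i = 0; i < 100000; i++)
        {
            // read XML file i and save it to the database
        }
    }
}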
I’m looking for the best way of using threads considering scalability and performance.
In my site I have two scenarios that need threading:
UI trigger: for example, the user clicks a button and the server should read data from the DB and send some emails. Those actions take time and I don't want the user's request to be delayed. This scenario happens very frequently.
Background service: when the app starts, it triggers a thread that runs every 10 minutes, reads from the DB, and sends emails.
The solutions I found:
A. Use thread pool - BeginInvoke:
This is what I use today for both scenarios.
It works fine, but it uses the same threads that serve the pages, so I think I may run into scalability issues. Can this become a problem?
B. No use of the pool – ThreadStart:
I know starting a new thread takes more resources than using a thread pool.
Can this approach work better for my scenarios?
What is the best way to reuse the opened threads?
C. Custom thread pool:
Because my scenarios occur frequently, maybe the best way is to start a new thread pool?
Thanks.
I would personally put this into a different service. Make your UI action write to the database, and have a separate service which either polls the database or reacts to a trigger, and sends the emails at that point.
By separating it into a different service, you don't need to worry about AppDomain recycling etc., and you can put it on an entirely different server if and when you want to. I think it'll give you a more flexible solution.
I do this kind of thing by calling a web service, which then calls a method asynchronously using a delegate. The original web service call returns a Guid to allow tracking of the processing.
For the first scenario, use ASP.NET Asynchronous Pages. Async pages are a very good choice when it comes to scalability, because during async execution the HTTP request thread is released and can be re-used.
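A minimal sketch of an asynchronous page (my illustration; it assumes Async="true" in the @ Page directive, and a hypothetical web request stands in for the DB/email work):

using System;
using System.Net;
using System.Web.UI;

public partial class SendMailPage : Page
{
    private WebRequest _request;

    protected void Page_Load(object sender, EventArgs e)
    {
        _request = WebRequest.Create("https://example.com/api/send-emails");
        // Register begin/end handlers; the request thread is released
        // between the two, which is what makes async pages scalable.
        AddOnPreRenderCompleteAsync(BeginWork, EndWork);
    }

    private IAsyncResult BeginWork(object sender, EventArgs e, AsyncCallback cb, object state)
    {
        return _request.BeginGetResponse(cb, state);
    }

    private void EndWork(IAsyncResult ar)
    {
        using (WebResponse response = _request.EndGetResponse(ar))
        {
            // render a confirmation to the page
        }
    }
}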
I agree with Jon Skeet that for the second scenario you should use a separate service; a Windows Service is a good choice here.
Out of your three solutions, don't use BeginInvoke. As you said, it will have a negative impact on scalability.
Between the other two, if the tasks are truly background work and the user isn't waiting for a response, then a single, permanent thread should do the job. A thread pool makes more sense when you have multiple tasks that should execute in parallel.
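A sketch of such a permanent worker (SendPendingEmails is a hypothetical helper for the DB-read-and-email step; Start() would be called once, e.g. from Application_Start):

using System;
using System.Threading;

public static class EmailWorker
{
    public static void Start()
    {
        // IsBackground ensures the thread won't keep the process alive.
        var worker = new Thread(Run) { IsBackground = true };
        worker.Start();
    }

    private static void Run()
    {
        while (true)
        {
            try
            {
                SendPendingEmails(); // hypothetical: read the DB, send emails
            }
            catch (Exception)
            {
                // log and keep the worker alive
            }
            Thread.Sleep(TimeSpan.FromMinutes(10));
        }
    }

    private static void SendPendingEmails() { /* ... */ }
}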
However, keep in mind that web servers sometimes crash, AppPools recycle, etc. So if any of the queued work needs to be reliably executed, then moving it out of process is probably a better idea (such as into a Windows Service). One way of doing that, which preserves the order of requests and maintains persistence, is to use Service Broker: you write the request to a Service Broker queue from your web tier (with an async request), and then read those messages from a service running on the same machine or a different one. You can also scale nicely that way by simply adding more instances of the service (or more threads in it).
In case it helps, I walk through using both a background thread and Service Broker in detail in my book, including code examples: Ultra-Fast ASP.NET.
I have a given list of URLs. For each one I make an HTTP web request object and try to connect with it; the objective is to see which ones are down.
It already works, but each request only starts as soon as the last one ends, so it's quite slow: about two requests per second.
I was wondering if I should have about 5 threads working alongside each other in the background; that would make it 5 times faster, which is the desired speed (without overloading the shared internet connection). But I have two problems:
1 - I don't even know IF this is the best solution for my problem.
2 - I've tried a few times, but I'm new to the .NET Framework and have never used multithreading, so I don't know how to do it easily.
I have a function Start(), and it has a for loop that goes through all the URLs checking whether each one exists.
Setup: VS 2008, .NET 3.5, C#.
--[edit]--
Can anyone tell me (with a code sample if possible) how to use five threads (not as many as possible) with BackgroundWorker? And how do I start the next request right after the previous one finishes?
Research the BackgroundWorker class. It will allow you to spawn multiple worker threads that issue asynchronous web requests. Then just use the ReportProgress method to report back on the status of each request.
http://msdn.microsoft.com/en-us/library/system.componentmodel.backgroundworker.aspx
http://www.dotneat.net/2009/02/10/BackgroundworkerExample.aspx
http://dotnetperls.com/backgroundworker
This is an ideal case for the BackgroundWorker class in .NET. It makes use of the thread pool to execute potentially long-running operations in the background so the caller does not have to deal with individual thread creation code.
I prefer to create a worker class that makes the HttpWebRequest, and to start it in its own thread for each connection. Have your worker class use a callback method to signal that it's finished. I use a queue of pending threads and a dictionary of active threads; threads that finish prematurely due to restartable problems like connection failures and timeouts can be put back into the queue. The thread's ManagedThreadId is handy for keeping track of threads.
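A simplified sketch of that arrangement, condensed to a shared queue drained by five threads (the details are illustrative, not the exact queue/dictionary code described above):

using System;
using System.Collections.Generic;
using System.Net;
using System.Threading;

public class UrlChecker
{
    private readonly Queue<string> _pending = new Queue<string>();

    public void Start(IEnumerable<string> urls)
    {
        foreach (string url in urls)
            _pending.Enqueue(url);

        for (int i = 0; i < 5; i++)
        {
            var worker = new Thread(Work) { IsBackground = true };
            worker.Start(); // worker.ManagedThreadId can be logged for tracking
        }
    }

    private void Work()
    {
        while (true)
        {
            string url;
            lock (_pending)
            {
                if (_pending.Count == 0) return; // queue drained, thread exits
                url = _pending.Dequeue();
            }
            try
            {
                var request = (HttpWebRequest)WebRequest.Create(url);
                request.Timeout = 10000; // fail fast on dead hosts
                using (var response = (HttpWebResponse)request.GetResponse())
                {
                    Console.WriteLine("{0}: {1}", url, response.StatusCode);
                }
            }
            catch (WebException)
            {
                Console.WriteLine("{0}: down", url);
            }
        }
    }
}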
You probably also want to increase your app's maximum number of connections by adding this to your app.config:
<system.net>
  <connectionManagement>
    <remove address="*"/>
    <add address="*" maxconnection="10" />
  </connectionManagement>
</system.net>
I chose 10 as an example - you'll have to experiment to see the effect on throughput, CPU utilization, and memory usage.
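The same limit can also be set in code if you prefer; run this once at startup, before any requests are made (10 is again just an example):

// Programmatic equivalent of the maxconnection setting above.
System.Net.ServicePointManager.DefaultConnectionLimit = 10;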
Rather than explicitly starting new threads yourself, use the HttpWebRequest.BeginGetResponse() method to execute each request asynchronously, specifying a callback method:
http://www.developerfusion.com/code/4654/asynchronous-httpwebrequest/
Remember to call response.Close() in the callback method so that the maximum number of concurrent connections is not exceeded. This is preferable to increasing the maxconnection value in app.config, which goes against the RFC guideline (maximum of two connections per host).
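A minimal sketch of that callback pattern (the console output is illustrative; BeginGetResponse/EndGetResponse are the actual HttpWebRequest methods):

using System;
using System.Net;

public static class AsyncChecker
{
    public static void Check(string url)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.BeginGetResponse(OnResponse, request);
    }

    private static void OnResponse(IAsyncResult result)
    {
        var request = (HttpWebRequest)result.AsyncState;
        try
        {
            // 'using' guarantees response.Close(), freeing the connection.
            using (var response = (HttpWebResponse)request.EndGetResponse(result))
            {
                Console.WriteLine("{0}: {1}", request.RequestUri, response.StatusCode);
            }
        }
        catch (WebException ex)
        {
            Console.WriteLine("{0}: {1}", request.RequestUri, ex.Status);
        }
    }
}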
As a rough guide, I can execute around 8 requests per second over a 5 second period using this method.