I use Quartz.NET in my ASP.NET website. I initialize the scheduler in the Application_Start method and shut it down in Application_End. My trigger should fire every day, but I found that the scheduler shuts down automatically if there are no requests for a while, so my background jobs never get triggered. Is there a better way to keep the scheduler alive for the lifetime of the application and only shut it down when the server stops?
For better knowledge sharing, here are two suggestions:
http://www.codeproject.com/Articles/12117/Simulate-a-Windows-Service-using-ASP-NET-to-run-sc
http://weblog.west-wind.com/posts/2007/May/10/Forcing-an-ASPNET-Application-to-stay-alive
In general, if you need reliable scheduling, you should not do it within a web site.
As you've found, the worker process will be shut down after a period of time. Even if you force the worker process to run all the time, there are conditions that may cause it to terminate as well. It's just not a good idea.
Instead, you should write a Windows Service and run quartz.net in that.
If you cannot install services (say you're in a shared hosting environment), then your options are more limited.
There is an IIS configuration setting that allows worker processes to stay running all the time. I found this setting through another SO answer (link).
Edit C:\Windows\System32\inetsrv\config\applicationHost.config to include:
<applicationPools>
<add name="MyAppWorkerProcess" managedRuntimeVersion="v4.0" startMode="AlwaysRunning" />
</applicationPools>
Scott Guthrie (Microsoft Product Manager for .NET) has answered a question directly related to the OP's question (link).
@Dominic Pettifer,
If I set startMode="AlwaysRunning" does this mean the web app will
'never' shut down and will always be kept running, even with no
traffic hitting the site for a long period (unless of course it's
manually shut down, or server is switched off/crashes etc.)? The
reason I ask is because I like to run background threads/services on
the IIS ASPNET worker process instead of using Windows Services (we
deal with clients with lots of security restrictions on their servers
which makes running a Windows Service difficult or impossible).
Normally I have to devise something that hits the website periodically
to keep the ASPNET worker process alive and stop it from shutting
down.
This should mean that the application and worker process is always
running - so I think that does indeed handle your scenario well for
you.
Hope this helps,
Scott
I wondered the same thing. Ultimately, whilst I agree with the general consensus, I wanted to see how it could be done, because I've been in a similar situation myself, where Windows Services were not available to me.
All I did was create a new job which, when executed, sends an HTTP request to the application itself. For me, I pointed it at a page which simply contained @DateTime.Now.ToString().
The act of sending an HTTP request to itself should be enough to keep the scheduler (and the parent worker process) alive.
It does not, however, stop the application from being stopped or recycled without warning. If you wanted a way to handle that, you'd likely need more than one site running, each pinging both itself and the other site. That way, if one site goes down, the other can hit it (assuming it's started) to bring it back.
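For reference, a minimal sketch of such a keep-alive job, assuming the Quartz.NET 2.x synchronous IJob API; the ping URL and the five-minute interval are placeholders:

using System.Net;
using Quartz;
using Quartz.Impl;

public class KeepAliveJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // Hit the site itself so IIS sees traffic and keeps the worker process alive.
        using (var client = new WebClient())
        {
            client.DownloadString("http://localhost/keepalive.aspx"); // placeholder URL
        }
    }
}

public static class KeepAliveScheduler
{
    // Call this from Application_Start, after the main scheduler setup.
    public static void Start()
    {
        IScheduler scheduler = StdSchedulerFactory.GetDefaultScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<KeepAliveJob>().WithIdentity("keepAlive").Build();
        ITrigger trigger = TriggerBuilder.Create()
            .StartNow()
            .WithSimpleSchedule(s => s.WithIntervalInMinutes(5).RepeatForever())
            .Build();

        scheduler.ScheduleJob(job, trigger);
    }
}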
A much simpler way is to use a quality assurance checker. Using the tool Zapix, I was able to schedule my website to be quality-checked every 20 minutes. Zapix simply visited the site and received an HTTP response. By using Zapix, I mimicked the functionality of manually visiting the website to trigger the emails. That way, the application pool threads are constantly kept awake.
Related
I want to design an application that serves a REST API and also has a continuous process running that connects to websockets and processes the incoming data.
I have two approaches in mind:
Create a Windows Service with Kestrel running on one thread and the websocket listener on another. The API would be made accessible via an IIS reverse proxy.
Create the REST API with ASP.NET directly hosted in IIS and utilize the BackgroundService Class for the websocket listener as described here.
As I am new to the Windows Ecosystem I'd like to know if one of the approaches is more suitable or if I'm going about it the wrong way.
My understanding is that the Windows service approach should just work, but it seems more elaborate.
I'm unsure about the BackgroundService approach. The background process should really run 24/7. Are BackgroundServices designed for this? The docs always talk about long-running tasks, but do they also work for ones that run indefinitely, with restart on failure and so on?
I'd recommend hosting the continuous process in a Windows service, as you have much more control over the lifecycle.
With a BackgroundService hosted in IIS, the process is controlled by IIS. In that case, it might be recycled from time to time or terminated if idle for a while. You can control this behavior with some configuration settings, but especially in combination with ASP.NET Core, the IIS process might be running while the underlying Kestrel service is only started when a request hits the website.
If the two components do not rely on each other, you could also split them and have the best of both worlds: the web application hosted in IIS and the websocket listener running in a Windows service.
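For the Windows service route, a rough sketch using the .NET Generic Host (assuming the Microsoft.Extensions.Hosting.WindowsServices package is referenced); the websocket URL, buffer handling, and retry delay are placeholders:

using System;
using System.Net.WebSockets;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

public class WebSocketWorker : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        // Reconnect loop: keeps the listener running 24/7 and recovers from failures.
        while (!stoppingToken.IsCancellationRequested)
        {
            try
            {
                using var socket = new ClientWebSocket();
                await socket.ConnectAsync(new Uri("wss://example.com/feed"), stoppingToken); // placeholder URL

                var buffer = new byte[8192];
                while (socket.State == WebSocketState.Open && !stoppingToken.IsCancellationRequested)
                {
                    var result = await socket.ReceiveAsync(new ArraySegment<byte>(buffer), stoppingToken);
                    // TODO: process the received frame here.
                }
            }
            catch (Exception) when (!stoppingToken.IsCancellationRequested)
            {
                // Back off briefly, then reconnect.
                await Task.Delay(TimeSpan.FromSeconds(5), stoppingToken);
            }
        }
    }
}

public static class Program
{
    public static void Main(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .UseWindowsService() // runs as a Windows service when installed, as a console app otherwise
            .ConfigureServices(services => services.AddHostedService<WebSocketWorker>())
            .Build()
            .Run();
}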
To be precise: I have a .NET Web Forms system. I need a way to check some values and perform tasks, depending on those values, in a periodic manner. Let's say every month I have to check whether my customers' credit cards are still valid. There are some other tasks/checks that run at shorter intervals.
What is the best approach to this? I thought about a Windows Service, but I also read about WCF. Please advise what a modern, sound way to solve this task is. I'm targeting .NET 4.0.
WCF is just an interface that can be hosted in either a Windows Service or IIS. You use this WCF interface to trigger some synchronous or asynchronous action.
Your case sounds like you want a Windows Service on a timer that performs validation on data stored in a database or file.
If you want to start a process on demand, then adding a WCF endpoint might be useful; if the timer approach is good enough, then you need not bother with WCF.
References for hosting WCF in a Windows process:
microsoft.com
codeproject.com
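If the timer approach is sufficient, a minimal service skeleton might look something like this (a sketch for .NET Framework, with the daily interval and the validation call as placeholders):

using System.ServiceProcess;
using System.Timers;

public class ValidationService : ServiceBase
{
    private Timer _timer;

    protected override void OnStart(string[] args)
    {
        // Run the checks once a day; adjust the interval to suit (placeholder value).
        _timer = new Timer(24 * 60 * 60 * 1000);
        _timer.Elapsed += (sender, e) => RunChecks();
        _timer.Start();
    }

    protected override void OnStop()
    {
        _timer.Stop();
        _timer.Dispose();
    }

    private void RunChecks()
    {
        // TODO: load customers from the database and validate their card details here.
    }

    public static void Main()
    {
        ServiceBase.Run(new ValidationService());
    }
}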
As you've surmised, a Windows Service is a good approach to this problem.
Similarly, you could write a Console application and have it run via a scheduled task in Windows.
It depends on how your backend works and what you're most familiar with really.
Writing a console application is very simple to do, but it's perhaps not the best approach, as you need to ensure that a user is logged on so that the scheduled task can run.
A service is slightly more complicated to implement, but it has the benefits of being integrated into the OS properly.
MSDN has a good guide to writing a service in C#, and you don't necessarily need WCF:
http://msdn.microsoft.com/en-us/library/aa984464(v=vs.71).aspx
You could use something like Quartz.NET. See http://quartznet.sourceforge.net/
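For example, scheduling the monthly check with Quartz.NET might look roughly like this (a sketch assuming the 2.x fluent API; the job name and the cron expression, 3 AM on the first of each month, are placeholders):

using Quartz;
using Quartz.Impl;

public class CardValidationJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // TODO: check whether customers' credit cards are still valid.
    }
}

public static class SchedulerSetup
{
    public static void Start()
    {
        IScheduler scheduler = StdSchedulerFactory.GetDefaultScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<CardValidationJob>()
            .WithIdentity("cardValidation")
            .Build();

        // Fire at 03:00 on the first day of every month.
        ITrigger trigger = TriggerBuilder.Create()
            .WithCronSchedule("0 0 3 1 * ?")
            .Build();

        scheduler.ScheduleJob(job, trigger);
    }
}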
If you have limited control over server (i.e. only regular HTTP pages allowed):
You can also use a web page to trigger the task; this way you don't need any additional components installed on the server. Then have some other machine configured to make periodic requests to the page(s) that trigger the tasks. Make sure the tasks are restartable and short enough that each one can finish within a regular page request. The page can respond with "next task to run" data so your client can keep pinging the server until the whole operation is finished.
Note: trying to run long-running tasks inside the web service process is unreliable due to app pool/app domain recycles.
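A rough sketch of such a trigger page as an ASP.NET HTTP handler; TaskRunner and its method are hypothetical placeholders for your own restartable task logic:

using System.Web;

public class RunTaskHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Run one short, restartable unit of work per request.
        string nextTask = TaskRunner.RunNextPendingTask();

        // Tell the calling machine whether there is more work to do.
        context.Response.ContentType = "text/plain";
        context.Response.Write(nextTask ?? "done");
    }

    public bool IsReusable
    {
        get { return true; }
    }
}

// Hypothetical placeholder: replace with your own restartable task logic.
public static class TaskRunner
{
    public static string RunNextPendingTask()
    {
        // Run one pending task; return the name of the next task, or null when finished.
        return null;
    }
}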
I'm programming a monitoring application that needs to display the state of several windows services. In the current version, I can know whether a service is Running, Stopped, Suspended or in one of the pending states. That's good, but I'm wondering if there is a way to test if a service is actually responding? I guess it can be in a running state but not responding at all!
I am using the ServiceController class from System.ServiceProcess. Do you think that if a service is not responding, the ServiceController.Status would return an exception?
How would you approach the problem?
Thanks
EDIT
It seems that ServiceController.Status can throw two types of exceptions:
System.ComponentModel.Win32Exception: An error occurred when accessing a system API.
System.InvalidOperationException: The service does not exist as an installed service.
Nothing about responsiveness, though.
This might be obvious, but have you tried talking to the service?
There's no common way to talk to a service, so there is no way Windows can interrogate whether the service is still responding as normal. It is perfectly normal for a service to go into a complete sleep waiting for external I/O to happen, and thus Windows would not get a response while the service is actually alive and functioning exactly as designed.
The only way is to actually send a request to it, and wait for the response, and for that you need some inter-process communication channel, like:
Network
Named pipes (a client-side sketch follows this list)
Messages
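As an illustration of the named-pipe option, here is a sketch of a client-side "ping" check; the pipe name and the one-byte request/reply protocol are assumptions you would replace with whatever the service actually implements:

using System;
using System.IO.Pipes;

public static class ServicePing
{
    // Returns true if the service answers a ping over its named pipe within the timeout.
    public static bool IsResponding(string pipeName, int timeoutMs = 2000)
    {
        try
        {
            using (var pipe = new NamedPipeClientStream(".", pipeName, PipeDirection.InOut))
            {
                pipe.Connect(timeoutMs);
                pipe.WriteByte(1);            // hypothetical "ping" byte
                return pipe.ReadByte() != -1; // any reply counts as "alive"
            }
        }
        catch (Exception)
        {
            // TimeoutException, IOException, etc. all mean "not responding" here.
            return false;
        }
    }
}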
Basically, if you need to determine if a service is able to respond, you need to check if it is responding.
The service controller types and APIs can only provide information on the basis of the service's response to those APIs.
E.g. you can create a service which responds to those APIs correctly, but provides no functionality on even numbered hours.
In the end you need to define "responsive" in terms of the service's functionality (e.g. a batch processor is processing batches) and provide a mechanism (A2A API, WMI, performance counters) to surface this.
I have an application that hits a third-party company's web service in order to create an email account after a customer clicks a button. However, sometimes the web service takes longer than 1 minute to respond, which is way too long for my customers to be sitting there waiting for a response.
I need to devise a way to set up some sort of queuing service external to the web site. This way I can add the web service action to the queue and advise the customer it may take up to 2 minutes to create the account.
I'm curious of the best way to achieve this. My initial thought is to request the actions via a database table which will be checked on a regular basis by a Console app which is run via Windows Scheduled tasks.
Any issues with that method?
Is there a better method you can think of?
I would use MSMQ; it may be an older technology, but it is perfect for the scenario you describe.
Create a WCF service to manage the queue and its actions. On the service, expose a method to add an action to the queue.
This way the queue is completely independent of your website.
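For illustration, a bare-bones sketch of sending and receiving such a request with System.Messaging directly (the queue path and message type are placeholders; in practice you would wrap this behind the WCF service described above):

using System.Messaging;

public class CreateEmailAccountRequest
{
    public string CustomerId { get; set; }
    public string EmailAddress { get; set; }
}

public static class EmailQueue
{
    private const string Path = @".\Private$\EmailAccountRequests"; // placeholder queue name

    // Website side: drop the request on the queue and return to the customer immediately.
    public static void Enqueue(CreateEmailAccountRequest request)
    {
        if (!MessageQueue.Exists(Path))
            MessageQueue.Create(Path);

        using (var queue = new MessageQueue(Path))
            queue.Send(request, "CreateEmailAccount");
    }

    // Worker side (Windows service): block until a request arrives, then process it.
    public static CreateEmailAccountRequest Dequeue()
    {
        using (var queue = new MessageQueue(Path))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(CreateEmailAccountRequest) });
            Message message = queue.Receive();
            return (CreateEmailAccountRequest)message.Body;
        }
    }
}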
What if you use a combination of AJAX and a Windows Service?
On the website side: When the person chooses to create an e-mail account, you add the request to a database table. If they want to wait, provide a web page that uses AJAX to check every so often (10 seconds?) whether their account has been created or not. If it's an application-style website, you could let them continue working and pop up a message once the account is created. If they don't want to wait, they close the page or browse to another and maybe get an e-mail once it's done.
On the processing side: Create a Windows service that checks the table for new requests. Once it's done with a request it has to somehow communicate back to the user, maybe by setting a status flag on the request. This is what the AJAX call would look for. You could send an e-mail at this point too.
If you use a scheduled task with a console app instead of a Windows service, you risk having multiple instances running at the same time. You would have to implement some sort of locking mechanism (at the app or request level) to prevent processing the same thing twice.
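As a sketch of that processing side, the polling loop below claims one pending row at a time with a single UPDATE so two instances cannot pick up the same request; the table and column names are hypothetical:

using System;
using System.Data.SqlClient;
using System.Threading;

public static class RequestProcessor
{
    private const string ConnectionString = "..."; // your connection string

    public static void PollForever(CancellationToken token)
    {
        while (!token.IsCancellationRequested)
        {
            using (var connection = new SqlConnection(ConnectionString))
            using (var command = new SqlCommand(
                // Atomically claim one pending request; OUTPUT returns the claimed row.
                @"UPDATE TOP (1) EmailRequests
                  SET Status = 'Processing'
                  OUTPUT inserted.Id, inserted.EmailAddress
                  WHERE Status = 'Pending'", connection))
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    if (reader.Read())
                    {
                        int id = reader.GetInt32(0);
                        string email = reader.GetString(1);
                        // TODO: call the third-party web service here,
                        // then set Status = 'Done' (or 'Failed') for this id.
                    }
                }
            }

            Thread.Sleep(TimeSpan.FromSeconds(10)); // poll interval placeholder
        }
    }
}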
What about the Queue class or the generic Queue&lt;T&gt; class?
Unfortunately, your question is too vague to answer in any real detail. If this is something you want managed outside the primary application, then a Windows Service would be a little more appropriate than a console app; from an integration and lifecycle-management perspective it provides a nice foundation for adding other features (e.g. performance counters, hosted management services in WCF, Remoting, etc.). MSMQ is great, although there is a bit more involved in deployment. If you are willing to invest the time, there are a lot of advantages to using MSMQ. If you really want to create your own point-to-point queue, there are a ton of examples online that can serve as a starting point. Here is one: http://www.smelser.net/blog/page/SmellyQueue-(Durable-Queue).aspx.
So it's easy to load balance an ASP.NET web application: you set up a load balancer between two servers, and if a web server isn't responding on port 80, it won't receive requests.
Are there any proven techniques for doing this for a C# console application or Windows service that takes actions of its own volition? Are there any frameworks for knowing if peer processes are alive or dead, doing heartbeats, etc?
I've been experimenting a bit with NServiceBus, and it seems like, for certain kinds of applications, it would help to have most of the work done in response to an event, which makes it more like a web application and therefore easier to scale and load balance across multiple processes. But that feels like a half-baked solution, since in most cases there usually needs to be some concept of a "master" process that's responsible for getting work started.
NServiceBus does indeed handle this for you with its Distributor process (described here: http://docs.particular.net/nservicebus/scalability-and-ha/distributor/). The generic host that comes with NServiceBus allows you to have the exact same code and configuration run both as a console app and as a windows service (described here: http://docs.particular.net/nservicebus/hosting/nservicebus-host/).
You can have this for events as well as for regular command messages.
If you want a "master" process to decide what to do when all the load-balanced work completes, that is provided to you in the form of the saga infrastructure (described here: http://docs.particular.net/nservicebus/sagas/ and demonstrated in the Manufacturing sample that comes with NServiceBus).
In short, you should pretty much be covered.