To be precise: I have a .NET Web Forms system. I need a way to check some values and perform tasks depending on those values, on a periodic basis. For example: every month I have to check whether my customers' credit cards are still valid. There are some other tasks/checks that run at shorter intervals.
What is the best approach to this? I thought about a Windows Service, but I have also read about WCF. Please advise on a modern, sound way to solve this task. I'm targeting .NET 4.0.
WCF is just a service interface that can be hosted in either a Windows Service or IIS. You use this WCF interface to trigger synchronous or asynchronous actions.
Your case sounds like you want a Windows Service on a timer that performs validation on data stored in a database or file.
If you want to start a process on demand, then adding a WCF endpoint might be useful; if the timer approach is good enough, you need not bother with WCF.
References for hosting WCF in a Windows Service: microsoft.com, codeproject.com
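A minimal sketch of that timer approach, assuming a ServiceBase host; the CardValidator class is a hypothetical stand-in for your own validation logic:

using System;
using System.ServiceProcess;
using System.Timers;

public static class CardValidator
{
    public static void ValidateAll() { /* your validation logic here */ }
}

public class ValidationService : ServiceBase
{
    private Timer _timer;

    protected override void OnStart(string[] args)
    {
        // Fire once a day; the handler decides whether the monthly work is due.
        _timer = new Timer(24 * 60 * 60 * 1000);
        _timer.Elapsed += (s, e) =>
        {
            if (DateTime.Today.Day == 1)
                CardValidator.ValidateAll(); // hypothetical: your own check
        };
        _timer.Start();
    }

    protected override void OnStop()
    {
        _timer.Stop();
        _timer.Dispose();
    }

    public static void Main()
    {
        ServiceBase.Run(new ValidationService());
    }
}

The shorter-period checks would just be additional timers (or one timer with several due-date checks) in the same service.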
As you've surmised, a Windows Service is a good approach to this problem.
Similarly, you could write a Console application and have it run via a scheduled task in Windows.
It depends on how your backend works and what you're most familiar with really.
Writing a console application is very simple to do, but it's perhaps not the best approach, as you need to make sure the scheduled task is configured to run whether or not a user is logged on.
A service is slightly more complicated to implement, but it has the benefits of being integrated into the OS properly.
MSDN has a good guide to writing a service in C#, and you don't necessarily need WCF:
http://msdn.microsoft.com/en-us/library/aa984464(v=vs.71).aspx
You could use something like Quartz.NET. See http://quartznet.sourceforge.net/
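For example, a monthly check wired up with Quartz.NET might look roughly like this (a sketch against the Quartz.NET 2.x fluent API; the job class is a placeholder for your own logic):

using Quartz;
using Quartz.Impl;

public class CheckCreditCardsJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // validate customer credit cards here
    }
}

public static class SchedulerBootstrap
{
    public static void Start()
    {
        IScheduler scheduler = StdSchedulerFactory.GetDefaultScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<CheckCreditCardsJob>().Build();
        ITrigger trigger = TriggerBuilder.Create()
            .WithCronSchedule("0 0 3 1 * ?") // 03:00 on the 1st of every month
            .Build();

        scheduler.ScheduleJob(job, trigger);
    }
}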
If you have limited control over the server (i.e. only regular HTTP pages are allowed):
You can also use a web page to trigger the task; this way you don't need any additional components installed on the server. Then have some other machine configured to make periodic requests to the page(s) that trigger the tasks. Make sure the tasks are restartable and short enough that each one can finish within a regular page request. The page can respond with "next task to run" data so your client can keep pinging the server until the whole operation is finished (see the handler sketch below).
Note: trying to run long-running tasks inside the web server process is unreliable due to app pool/app domain recycles.
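A sketch of that pattern as a plain HTTP handler; TaskQueue and its RunNextPendingTask method are hypothetical placeholders for your own short, restartable work items:

using System.Web;

// tasks.ashx: each request runs one short, restartable unit of work
public class TaskRunnerHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Hypothetical helper: pop one small task from your own queue
        // table, run it, and report what (if anything) is left to do.
        string nextTask = TaskQueue.RunNextPendingTask();

        context.Response.ContentType = "text/plain";
        context.Response.Write(nextTask ?? "done");
    }

    public bool IsReusable { get { return true; } }
}

The pinging client simply keeps requesting tasks.ashx until the response is "done".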
Related
I'm developing a client/server application (C#, WinForms for the GUI).
We have a module that performs tasks to import/export data between the database and other external sources. These activities are managed by users from any client station. The next step is to allow tasks to be scheduled for automatic execution (e.g. start at time X, repeating every hour, day, week, or month, and so on).
Each task imports or exports a large amount of data against arbitrary data sources (Excel, Access, or a DBMS), so these are long-running activities.
Currently the DLL that implements this logic is distributed to each client station. This is not a good solution, because we have to install all the potential requirements on each client (for example, the ADO/OLE DB/ODBC drivers for every supported DBMS).
I have to move this logic to the server. From each client I want to see task progress, stop or start any task, or change the schedule table and restart the process.
I'm considering what the best solution is: a Web API or WCF. Probably WCF, because it is service-oriented, but I've seen projects and articles that combine Web APIs with libraries like Quartz or Hangfire.
I'm also considering whether it is better to use a Windows service and to host WCF inside it.
What is the best solution? Or are there other solutions I'm not considering?
Thank you
EDIT:
From any client workstation the user can schedule all tasks to be executed according to the applied settings (frequency, repeating each day/week/month). I should probably use a Windows service, because when the server machine is switched on this service must start automatically and check whether there are tasks to run. At the same time, the user can decide to run any task manually without scheduling it; in that case it will be queued and processed when its turn comes.
Now I'm thinking of hosting a WCF service inside a Windows service on the server machine. On startup I will launch a background worker that checks for scheduled tasks to run. In addition, all clients can invoke a method to start one or more tasks. To notify progress to all clients I'll use a duplex contract.
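A sketch of what that duplex contract could look like (all names here are illustrative, not an existing API):

using System.ServiceModel;

public interface ITaskProgressCallback
{
    [OperationContract(IsOneWay = true)]
    void OnProgress(int taskId, int percentComplete);
}

[ServiceContract(CallbackContract = typeof(ITaskProgressCallback))]
public interface ITaskService
{
    [OperationContract]
    int StartTask(string taskName); // queue the task and return its id

    [OperationContract]
    void StopTask(int taskId);
}

The service implementation would capture each caller's channel via OperationContext.Current.GetCallbackChannel<ITaskProgressCallback>() when StartTask is invoked, and the background worker would call OnProgress on those channels as tasks advance.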
You will need to compare WCF and Web API and choose the technology according to your requirements.
If you just need HTTP as the transport protocol and lightweight web-hosted services, go with Web API.
I would also recommend Hangfire, as it offers more than a plain Windows service: it is distributed and persistent, and it ships with an out-of-the-box dashboard that shows all your scheduled, processing, succeeded, and failed jobs.
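For instance, wiring up Hangfire for both the recurring schedule and on-demand runs could look roughly like this (a sketch; the storage connection string name and the ImportExportTask class are assumptions):

using Hangfire;

public static class ImportExportTask
{
    public static void Run(string name) { /* your long-running import/export */ }
}

public static class JobBootstrap
{
    public static void Start()
    {
        // "HangfireDb" is an assumed connection string name
        GlobalConfiguration.Configuration.UseSqlServerStorage("HangfireDb");
        var server = new BackgroundJobServer();

        // recurring: run the import every night at 02:00
        RecurringJob.AddOrUpdate("nightly-import",
            () => ImportExportTask.Run("nightly"), "0 2 * * *");
    }

    public static void EnqueueManualRun(string name)
    {
        // on demand: fire-and-forget when a user clicks "run now"
        BackgroundJob.Enqueue(() => ImportExportTask.Run(name));
    }
}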
Check also this article about Running Background Tasks in ASP.NET.
If this is an internal application and the clients are using WinForms, behind the scenes you can make GETs/POSTs to Web API endpoints. This lets users retrieve/export data without having to install database drivers.
I'd go Web API driven. I'm not very familiar with Windows services, but one benefit I do see is that a service can still be running after a reboot.
Feel free to reach out to me directly.
I use Quartz in my ASP.NET website. I initialize the scheduler in the Application_Start method and shut it down in Application_End. My trigger fires every day, but I found that the scheduler shuts down automatically if there are no requests for a while, so my background work is not triggered. Is there a better way to keep the scheduler alive for the life of the application and only shut it down when the server stops?
For better knowledge sharing, here are two suggestions:
http://www.codeproject.com/Articles/12117/Simulate-a-Windows-Service-using-ASP-NET-to-run-sc
http://weblog.west-wind.com/posts/2007/May/10/Forcing-an-ASPNET-Application-to-stay-alive
In general, if you need reliable scheduling, you should not do it within a web site.
As you've found, the worker process will be shut down after a period of inactivity. Even if you force the worker process to run all the time, there are other conditions that may cause it to terminate. It's just not a good idea.
Instead, you should write a Windows Service and run quartz.net in that.
If you cannot install services (say you're in a shared hosting environment), then your options are more limited.
There is an IIS configuration setting that allows worker processes to stay on all the time. I found it through another SO answer.
Edit C:\Windows\System32\inetsrv\config\applicationHost.config to include:
<applicationPools>
<add name="MyAppWorkerProcess" managedRuntimeVersion="v4.0" startMode="AlwaysRunning" />
</applicationPools>
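Note that startMode="AlwaysRunning" only governs startup; the pool can still be shut down when idle. On IIS 7.5 and later you may also want to disable the idle timeout on the same pool, for example:

<applicationPools>
    <add name="MyAppWorkerProcess" managedRuntimeVersion="v4.0" startMode="AlwaysRunning">
        <processModel idleTimeout="00:00:00" />
    </add>
</applicationPools>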
Scott Guthrie (Microsoft Product Manager for .NET) has answered a question directly related to the OP's question (link).
@Dominic Pettifer,
If I set startMode="AlwaysRunning" does this mean the web app will
'never' shut down and will always be kept running, even with no
traffic hitting the site for a long period (unless of course it's
manually shut down, or server is switched off/crashes etc.)? The
reason I ask is because I like to run background threads/services on
the IIS ASPNET worker process instead of using Windows Services (we
deal with clients with lots of security restrictions on their servers
which makes running a Windows Service difficult or impossible).
Normally I have to devise something that hits the website periodically
to keep the ASPNET worker process alive and stop it from shutting
down.
This should mean that the application and worker process is always
running - so I think that does indeed handle your scenario well for
you.
Hope this helps,
Scott
I wondered the same thing. Ultimately, whilst I agree with the general consensus, I wanted to see how it could be done, because I've been in a similar situation myself, where Windows Services were not available to me.
All I did was create a new job which, when executed, sends an HTTP request to the application itself. For me, I pointed it at a page which simply contained @DateTime.Now.ToString().
The act of sending an HTTP request to itself should be enough to keep the scheduler (and the parent worker process) alive.
It does not, however, stop the application from being stopped or recycled without warning. If you wanted a way to handle that, you'd likely need more than one site running, with each pinging both itself and the other site. That way, if one site goes down, the other can hit it (assuming it's started) to bring it back.
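A sketch of such a keep-alive job, assuming Quartz.NET is the scheduler in question and using a made-up local URL:

using System.Net;
using Quartz;

public class KeepAliveJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        using (var client = new WebClient())
        {
            // the URL is an assumption: point it at any cheap page in the same app
            client.DownloadString("http://localhost/keepalive.aspx");
        }
    }
}

Schedule it with a simple repeating trigger (every few minutes is plenty) alongside your real jobs.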
A much simpler way is to use a quality assurance checker. Using the tool Zapix, I was able to schedule my website to be quality-checked every 20 minutes. Zapix simply visits the site and receives an HTTP response. It mimics the act of manually visiting the website to trigger the emails, so the application pool threads are constantly kept awake.
I have an application that hits a third-party company's web service to create an email account after a customer clicks a button. However, sometimes the web service takes longer than a minute to respond, which is way too long for my customers to sit there waiting.
I need to devise some sort of queuing service external to the web site. That way I can add the web service action to the queue and advise the customer it may take up to 2 minutes to create the account.
I'm curious about the best way to achieve this. My initial thought is to record the requested actions in a database table that is checked on a regular basis by a console app run via Windows Scheduled Tasks.
Any issues with that method?
Is there a better method you can think of?
I would use MSMQ; it may be an older technology, but it is perfect for the scenario you describe.
Create a WCF service to manage the queue and its actions. On the service, expose a method to add an action to the queue.
This way the queue is completely independent of your website.
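A minimal System.Messaging sketch of the queue itself (the queue path and string message body are illustrative; in practice you would likely send a typed request object):

using System;
using System.Messaging;

public static class EmailRequestQueue
{
    private const string Path = @".\Private$\EmailRequests";

    public static void Enqueue(string customerEmail)
    {
        if (!MessageQueue.Exists(Path))
            MessageQueue.Create(Path, transactional: true);

        using (var queue = new MessageQueue(Path))
            queue.Send(customerEmail, MessageQueueTransactionType.Single);
    }

    public static string Dequeue()
    {
        using (var queue = new MessageQueue(Path))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
            Message message = queue.Receive(TimeSpan.FromSeconds(30),
                                            MessageQueueTransactionType.Single);
            return (string)message.Body;
        }
    }
}

The WCF service would call Enqueue, and the background processor would loop on Dequeue.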
What if you use a combination of AJAX and a Windows Service?
On the website side: When the person chooses to create an e-mail account, you add the request to a database table. If they want to wait, provide a web page that uses AJAX to check every so often (10 seconds?) whether their account has been created or not. If it's an application-style website, you could let them continue working and pop up a message once the account is created. If they don't want to wait, they close the page or browse to another and maybe get an e-mail once it's done.
On the processing side: Create a Windows service that checks the table for new requests. Once it's done with a request it has to somehow communicate back to the user, maybe by setting a status flag on the request. This is what the AJAX call would look for. You could send an e-mail at this point too.
If you use a scheduled task with a console app instead of a Windows service, you risk having multiple instances running at the same time. You would have to implement some sort of locking mechanism (at the app or request level) to prevent processing the same thing twice.
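For that locking, one simple pattern is to claim a row atomically before processing it, so two instances can never pick up the same request (table and column names here are assumptions):

using System.Data.SqlClient;

public static class RequestClaimer
{
    // Atomically flip one 'New' row to 'Processing' and return its id;
    // a second worker running the same statement gets a different row or none.
    public static int? ClaimNextRequest(SqlConnection openConnection)
    {
        const string sql = @"
            UPDATE TOP (1) EmailRequests
            SET Status = 'Processing'
            OUTPUT inserted.Id
            WHERE Status = 'New';";

        using (var command = new SqlCommand(sql, openConnection))
        {
            object id = command.ExecuteScalar();
            return id == null ? (int?)null : (int)id;
        }
    }
}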
What about the Queue class or the generic Queue&lt;T&gt; class?
Unfortunately, your question is too vague to answer in any real detail. If this is something you want managed outside the primary application, then a Windows Service would be a little more appropriate than creating a console app. From an integration and lifecycle-management perspective, it provides a nice foundation for adding other features (e.g. performance counters, hosted management services in WCF, Remoting, etc.). MSMQ is great, although there is a bit more involved in deployment; if you are willing to invest the time, there are a lot of advantages to using MSMQ. If you really want to create your own point-to-point queue, there are a ton of examples online that can serve as a starting point. Here is one: http://www.smelser.net/blog/page/SmellyQueue-(Durable-Queue).aspx.
So it's easy to load balance an ASP.NET web application. You set up a load balancer between two servers, and if the web server isn't responding on Port 80, it won't receive requests.
Are there any proven techniques for doing this for a C# console application or Windows service that takes actions of its own volition? Are there any frameworks for knowing if peer processes are alive or dead, doing heartbeats, etc?
I've been experimenting a bit with NServiceBus, and it seems that for certain kinds of applications it helps to have most of the work done in response to an event. That makes the system more like a web application, actually, and therefore easier to scale and load balance across multiple processes. But it feels like a half-baked solution, since in most cases there still needs to be some concept of a "master" process that's responsible for getting work started.
NServiceBus does indeed handle this for you with its Distributor process (described here: http://docs.particular.net/nservicebus/scalability-and-ha/distributor/). The generic host that comes with NServiceBus allows you to have the exact same code and configuration run both as a console app and as a windows service (described here: http://docs.particular.net/nservicebus/hosting/nservicebus-host/).
You can have this for events as well as for regular command messages.
If you want a "master" process to decide what to do when all the load-balanced work completes, that is provided to you in the form of the saga infrastructure (described here: http://docs.particular.net/nservicebus/sagas/ and demonstrated in the Manufacturing sample that comes with NServiceBus).
In short, you should pretty much be covered.
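For reference, a load-balanced worker in that setup is just a message handler; the distributor hands each message to whichever worker node is free. A sketch against the NServiceBus 3.x/4.x-era API, with made-up message and handler names:

using NServiceBus;

public class WorkItem : IMessage
{
    public int JobId { get; set; }
}

public class WorkItemHandler : IHandleMessages<WorkItem>
{
    public void Handle(WorkItem message)
    {
        // do the actual work; scaling out is just running more
        // instances of this endpoint behind the distributor
    }
}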
Edit (again): Let me simplify my problem. I have a Windows Service that exposes some WCF endpoints with methods like:
int ExecuteQuery(string query) {
// asynchronously execute query that may take 1 second to 20 minutes
return queryId;
}
string GetStatus(int queryId) {
// return the status of the query (# of results so far, etc)
}
What is the best way to implement the ExecuteQuery method? Should I just call ThreadPool.QueueUserWorkItem to get my query going?
Note that the actual work behind executing a query is done by load-balanced black box. I want to be able to have several queries going at the same time.
The analogy is a web browser that is downloading multiple files simultaneously and you have a download manager that can track the status of each file.
Take a look at Microsoft Message Queuing (MSMQ):
Microsoft Message Queuing (MSMQ) technology enables applications running at different times to communicate across heterogeneous networks and systems that may be temporarily offline. MSMQ provides guaranteed message delivery, efficient routing, security, and priority-based messaging. It can be used to implement solutions for both asynchronous and synchronous messaging scenarios.
It's good to know that Windows Communication Foundation (WCF) can leverage queuing services offered by MSMQ.
Either this is a trick question or a no-brainer: ThreadPool.QueueUserWorkItem is about the easiest way to go when you want to execute a piece of code concurrently. I'm sure you already knew that, so technically you have already answered your own question.
So if this is not a trick question, are you asking exactly how to pass the query to ThreadPool.QueueUserWorkItem?
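Assuming that is the question, here is a sketch that closes over the query id and tracks status in a simple in-memory dictionary (.NET 4.0's ConcurrentDictionary; a real service might persist status instead):

using System.Collections.Concurrent;
using System.Threading;

public class QueryRunner
{
    private int _lastId;
    private readonly ConcurrentDictionary<int, string> _status =
        new ConcurrentDictionary<int, string>();

    public int ExecuteQuery(string query)
    {
        int queryId = Interlocked.Increment(ref _lastId);
        _status[queryId] = "Running";

        ThreadPool.QueueUserWorkItem(_ =>
        {
            // hand the query to the load-balanced black box here,
            // updating _status[queryId] as results come back
            _status[queryId] = "Complete";
        });

        return queryId;
    }

    public string GetStatus(int queryId)
    {
        string status;
        return _status.TryGetValue(queryId, out status) ? status : "Unknown";
    }
}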
I use a Windows service for a very similar task and it works very well. I use database tables to queue requests and responses, as that gives me a persistent queue that can be accessed over the network from remote ASP.NET applications, plus concurrency control through transactions.
A supervisor thread on a timer spawns workers whenever incoming requests need servicing. I use separate database tables for configuration and control so that I can administer the service and pause the supervisor from an application while leaving the service core running. Logging to a separate table is a convenient way to see what's happening from web apps and a local admin app.
I wouldn't use the ThreadPool for long-running threads, but instead create a worker class that runs in its own thread and uses callback methods to update the supervisor with progress and completion status.
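A sketch of that worker shape, with the progress and completion callbacks passed in as plain delegates by the supervisor:

using System;
using System.Threading;

public class Worker
{
    private readonly Action<int> _reportProgress;
    private readonly Action _reportDone;

    public Worker(Action<int> reportProgress, Action reportDone)
    {
        _reportProgress = reportProgress;
        _reportDone = reportDone;
    }

    public void Start()
    {
        // A dedicated thread rather than the ThreadPool, since the
        // request may run for a long time.
        var thread = new Thread(() =>
        {
            for (int pct = 0; pct <= 100; pct += 10)
            {
                // ... do one slice of the real work here ...
                _reportProgress(pct);
            }
            _reportDone();
        }) { IsBackground = true };

        thread.Start();
    }
}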
Adding to the MSMQ answer, you could think about looking at using an Enterprise Service Bus (ESB) to handle these sorts of things, if future scalability is a concern. Check out NServiceBus for one .NET example.
I would use WF (Windows Workflow Foundation) 4.0:
You can start long-running transactions that can be handled across several machines, execute tasks in parallel, get failure support and a friendly coding model, and manage it all with AppFabric; and it is free.
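As a taste of the programming model, a trivial WF 4.0 invocation (real long-running work would use WorkflowApplication with a persistence store instead of WorkflowInvoker):

using System.Activities;
using System.Activities.Statements;

public static class WorkflowDemo
{
    public static void Main()
    {
        // Parallel schedules its branches together (they interleave at idle points)
        var workflow = new Parallel
        {
            Branches =
            {
                new WriteLine { Text = "task A" },
                new WriteLine { Text = "task B" }
            }
        };

        WorkflowInvoker.Invoke(workflow);
    }
}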