Best practice with a schedule/queue service? - c#

I am currently working on a project that is basically a web site whose main function is to run a long calculation task (made up of 1-10 sub-tasks, taking about 30-40 seconds to complete on average) and return a result to the user. As it stands, the code spins up threads inside the site itself (which I don't really like the idea of), so the page returns after a button click while a background thread carries on with the calculation.
I'd ideally like to move this calculation into a separate service to avoid the issues of running threads inside a web app. My idea is a separate service that polls every X seconds to see whether any jobs are present in the database and, if there are, runs them. However, I have little experience writing a recurring Windows service. Is this the right way to go about such a task? Has anyone done something similar, and can anybody recommend how I should proceed?

I am working on a similar project. Like you, I was thinking about creating a service, but I didn't see the benefit in my case. I have to run some long calculations and put the result in the cache. I don't know whether this is best practice, but what I did was create a timer in Global.asax, perform my calculations, and put the result into the cache. When a request comes in, I serve the last cached value.
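For what it's worth, a minimal sketch of that timer-plus-cache approach might look like the following (RecalculateExpensiveData is a hypothetical stand-in for the long calculation, and the five-minute interval is arbitrary):

// Global.asax.cs
using System;
using System.Threading;
using System.Web;

public class Global : HttpApplication
{
    // keep a static reference so the timer is not garbage collected
    private static Timer _timer;

    protected void Application_Start(object sender, EventArgs e)
    {
        _timer = new Timer(_ =>
        {
            object result = RecalculateExpensiveData();         // the long-running work
            HttpRuntime.Cache.Insert("calcResult", result);     // requests read this cached value
        }, null, TimeSpan.Zero, TimeSpan.FromMinutes(5));
    }

    // hypothetical placeholder for the actual calculation
    private static object RecalculateExpensiveData() { return new object(); }
}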
I hope this helps you.

In a situation like this I always prefer to create a TCP listener that accepts requests from the clients. If I am using .NET, my first choice would be a WCF service with a TCP or named-pipe binding. When I receive a request, I start a thread to process it. When the long-running process finally ends, I call back to the client at a URL that was specified in the original request to the scheduler. IMHO this is the best way to do it.
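As a rough sketch of that pattern (not a drop-in implementation), the contract and callback plumbing might look something like this; the service names, the callback POST, and RunCalculation are all hypothetical:

using System;
using System.Net;
using System.ServiceModel;
using System.Threading.Tasks;

[ServiceContract]
public interface ICalculationService
{
    [OperationContract(IsOneWay = true)]
    void Submit(string jobId, string callbackUrl);   // fire-and-forget submit
}

public class CalculationService : ICalculationService
{
    public void Submit(string jobId, string callbackUrl)
    {
        Task.Factory.StartNew(() =>
        {
            string result = RunCalculation(jobId);               // the 30-40 second job
            using (var client = new WebClient())
                client.UploadString(callbackUrl, result);        // call back to the URL the client supplied
        });
    }

    private static string RunCalculation(string jobId) { return "done"; }   // placeholder
}

// hosted over TCP, for example:
// var host = new ServiceHost(typeof(CalculationService), new Uri("net.tcp://localhost:8523/calc"));
// host.AddServiceEndpoint(typeof(ICalculationService), new NetTcpBinding(), "");
// host.Open();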
However, you may want to build your application on a proven framework like NServiceBus, Rhino Service Bus, etc. You can implement the same logic on top of them as well.

Related

Request timeout error while processing long tasks

I have a C# ASP.NET management system with a button that runs a SQL Server query to fetch 90,000 strings of text in multiple languages, categorised into sections. These are then sorted, 150 binary files are generated, saved as a .ZIP, and the results emailed to the user. The whole process, including the email, takes about 6 minutes, during which the web page just sits waiting for it to complete. I would like to press the start button and have the work carry on in the background while I continue using the web management system, but I am unsure of the most efficient way to do this. I initially created an ASMX file thinking that would work, but the result was the same, so I am now looking at async and await. Can anyone give me any pointers and let me know whether I am on the right track? I currently don't get anything back to confirm that the process completed successfully, but I can handle that by emailing the user if something goes wrong, since the user could be on any number of pages by then.
There are probably a few ways to go about tackling this problem. Your options will vary based on what version of .NET you are using, so I'll not post code directly; however, you can implement the concept I describe using ASMX web services, WCF, MVC, and so on.
Start-and-poll Approach
The classic response for this kind of problem is to implement a StartSomething() method and a GetProgress() method. A very simple example of this approach using ASMX-based web services is presented here.
In the example, one service method is used to start a process on a background thread. Myself, I would change the example by having the start method return a value to the client to identify which background process was started, as you could feasibly have several going on at a time.
The client then can call a separate method to get progress updates, and continue polling until the process is complete.
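As a very rough sketch of that shape (here in ASMX terms, with the progress store and the actual work as hypothetical placeholders):

using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Web.Services;

public class JobService : WebService
{
    private static readonly ConcurrentDictionary<Guid, int> Progress =
        new ConcurrentDictionary<Guid, int>();

    [WebMethod]
    public Guid StartSomething()
    {
        Guid jobId = Guid.NewGuid();
        Progress[jobId] = 0;
        ThreadPool.QueueUserWorkItem(_ => DoLongRunningWork(jobId));   // work continues in the background
        return jobId;                                                  // client polls with this id
    }

    [WebMethod]
    public int GetProgress(Guid jobId)
    {
        int percent;
        return Progress.TryGetValue(jobId, out percent) ? percent : -1;
    }

    // hypothetical placeholder for the real 6-minute export
    private static void DoLongRunningWork(Guid jobId)
    {
        for (int i = 1; i <= 100; i++) { Thread.Sleep(100); Progress[jobId] = i; }
    }
}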
There are a number of reasons why you should prefer to do lengthy background processing in a non-IIS service. I recommend using a Windows service to protect yourself from IIS somewhat-randomly restarting your application pool in the middle of a big job.
WebSockets
Another option worth some exploration on your part is to use WebSockets, which allow the server to contact a modern browser when the process is complete. The main advantage of this approach is that the client does not need to busily poll the service for updates. Its primary disadvantage is that WebSockets are new enough that there are still plenty of browsers that could not be clients for such a service.
Good luck!

running timer from global.asax vs quartz.net

I am developing an ASP.NET site that needs to hit a few social media sites daily for blanket friend/follower data. I have chosen Arvixe business class as my hosting. In the future, if we grow, I'd love to get onto a dedicated server and run a Windows service, but since that is not in the cards at this point I need another reliable way of running scheduled tasks. I am familiar with running a thread timer from App_Code/Global.asax, but app pool recycling will cause problems with the timer. I have never used a task scheduler like Quartz, but I have read a lot about it on Stack Overflow. I am looking for some advice on how to approach my goal. One big problem with either method is that the crawler threads will regularly need to sleep for up to an hour because of API call limits. My first thought was to use the database to record when each job starts and ends; when the app pool recycles I would clear out any parts not completed and only start parts that have no record of running that day. What do the experts here think? Any good links to sample architecture for this type of scheduling?
It doesn't really matter what method you use, whether you roll your own or use Quartz. You are at the mercy of ASP.NET/IIS because that's where you want to host it.
Do you have a spare computer laying around that can just run a scheduled task and upload data to a hosted database? To be honest, it's possibly safer (depending on your use case) to just do it that way than to try to run a scheduler in ASP.NET.
Somewhat along the lines of Bryan's post:
Find a spare computer.
Instead of allowing DB access, have it call a web service on your site. That service call should be the initiator of the process you are trying to run. Don't try to put parameters into it; something like "StartProcess()" should work fine.
As far as going to sleep and resuming later, take a look at Workflow Foundation. It has some nice built-in features for persisting state.
Don't expose your DB to the outside world; instead expose that page or web service and wrap some security around it. WCF has some nice built-in security features for that.
The best part is when you decide to move off, you can keep your web service and have it called from a Windows Service in the same manner.
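A minimal sketch of that StartProcess() endpoint (the names are hypothetical; the point is just to kick the job off and return immediately so the caller isn't held open):

using System.ServiceModel;
using System.Threading;

[ServiceContract]
public interface ICrawlerControl
{
    [OperationContract]
    void StartProcess();   // no parameters, as suggested above
}

public class CrawlerControl : ICrawlerControl
{
    public void StartProcess()
    {
        // kick off the crawl on a background thread and return straight away
        ThreadPool.QueueUserWorkItem(_ => CrawlerJob.RunPendingParts());
    }
}

public static class CrawlerJob
{
    // hypothetical: checks the DB for parts that have not run today and processes them
    public static void RunPendingParts() { }
}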
As long as you use a persistent job store (like a database) and you write and schedule your jobs so that they can handle things like being killed half way through, having IIS recycle your process is not that big a deal.
The bigger issue is that IIS shuts your site down if it doesn't have traffic. If you can keep your site up, make sure the misfire policy is set appropriately and that your jobs store whatever state they need to pick up where they left off, and you should be able to pull it off.
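A sketch of what that might look like with Quartz.NET (2.x API assumed; the connection string, the 03:00 cron expression, and CrawlJob are placeholders):

using System.Collections.Specialized;
using Quartz;
using Quartz.Impl;

public class CrawlJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // hypothetical: record the start in the DB, crawl parts not yet run today, record completion
    }
}

public static class CrawlScheduler
{
    public static IScheduler Create()
    {
        // persistent ADO.NET job store so scheduled jobs survive an app pool recycle
        var props = new NameValueCollection
        {
            { "quartz.jobStore.type", "Quartz.Impl.AdoJobStore.JobStoreTX, Quartz" },
            { "quartz.jobStore.dataSource", "default" },
            { "quartz.dataSource.default.provider", "SqlServer-20" },
            { "quartz.dataSource.default.connectionString",
              "Server=.;Database=Quartz;Trusted_Connection=True;" }
        };

        IScheduler scheduler = new StdSchedulerFactory(props).GetScheduler();

        ITrigger trigger = TriggerBuilder.Create()
            .WithCronSchedule("0 0 3 * * ?",                                 // 03:00 daily
                x => x.WithMisfireHandlingInstructionFireAndProceed())       // run once ASAP if a fire was missed
            .Build();

        scheduler.ScheduleJob(JobBuilder.Create<CrawlJob>().Build(), trigger);
        scheduler.Start();
        return scheduler;
    }
}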
If you are language-agnostic and don't mind writing your "job-activation-script" in your favourite, Linux-supported language...
One solution that has worked very well for me is:
Getting relatively cheap, stable Linux hosting (from reputable companies),
Creating a WCF service on your .Net hosted platform that will contain the logic you want to run regularly (RESTfully or SOAP or XMLRPC... whichever suits you),
Handling the calls through your Linux-hosted cron jobs, written in your language of choice (I use PHP).
This works very well, like I said. No VPS expense, configurable, and externally activated. I have one central place where my jobs are activated, with 99 to 100% uptime (never had any failures).

Is it wrong to run my mail queue in the app pool?

We have to send automated emails. They need to be reliably dispatched, so we write them into the database. Meanwhile, a System.Threading.Timer that was started at Application_Start invokes a method every 30s to read from the database, send, and then delete the entries that have been sent. None of this occurs as a long-running task. Care has been taken to ensure that the process of clearing the db-queue uses async methods, so no phase of the sending/queuing ever blocks, with the whole process being performed by short-lived methods in the ThreadPool. The cost of an app recycle is also minimal (possibly resulting in the resending of a single email... not a problem).
Conventional wisdom says that running this in the web app is not a good idea and that I should spin it out into a service instead.
Writing services is a PITA. I'd rather avoid it if possible. So why shouldn't I run an efficient async mail queue in my app pool? Can anyone enlighten me?
If your site is not being used, your app pool will not be started and no mail is sent.
Writing services is a PITA
I guess that is subjective. However, don't you think it would be beneficial to put it in a service? In case you want to change your implementation, it's a lot easier to maintain smaller, individual components in my experience. It usually becomes more of a PITA when you have everything in one place.
You are already writing the emails to a database. It is straightforward to write a small Windows service that simply scans the database and sends the emails. I know this might not be ideal, but there are lots of examples floating around on SO and elsewhere. You don't have to get all fancy and use an ESB (unless you want to).
So in the end, just because you can doesn't mean you should. You have to weigh the costs and benefits.
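If it helps, a bare-bones sketch of that service (the repository calls are hypothetical stand-ins for your existing queue-table access, and the SMTP details are whatever you already use):

using System.Collections.Generic;
using System.Net.Mail;
using System.ServiceProcess;
using System.Timers;

public class MailQueueService : ServiceBase
{
    private Timer _timer;

    protected override void OnStart(string[] args)
    {
        _timer = new Timer(30000);                    // same 30 s cadence as the in-app timer
        _timer.Elapsed += (s, e) => SendPendingMail();
        _timer.Start();
    }

    protected override void OnStop()
    {
        _timer.Stop();
    }

    private static void SendPendingMail()
    {
        // same logic as the current in-app queue: read unsent rows, send, then mark/delete
        foreach (MailMessage message in MailQueueRepository.GetUnsent())
        {
            new SmtpClient().Send(message);           // SMTP host taken from config
            MailQueueRepository.MarkSent(message);
        }
    }
}

// Hypothetical data access over the existing queue table.
public static class MailQueueRepository
{
    public static IEnumerable<MailMessage> GetUnsent() { return new List<MailMessage>(); }
    public static void MarkSent(MailMessage message) { }
}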

How to do something periodically in ASP.NET/WCF?

First of all, sorry for asking such a dumb question; I am quite a newbie in ASP.NET.
I am supposed to do something periodically. Say I am the owner of the site heartpatients.com (hypothetically) and I want every user who visits the site to be shown the message "Take your pills" after 2 hours. That is basically my whole question: how am I supposed to show this message every 2 hours (or 4, 6, or whatever interval), and how can I make the interval configurable?
One more thing: if the method that shows this message lives in a WCF service, how can I call that service at a particular time, with the interval configured by the user (say someone takes their pills every 10 hours)? In other words, how do I call that particular service method periodically, each time the user-specified interval passes?
I hope I made my question quite clear.
Any help is appreciated.
ASP.NET generally isn't suited as a task scheduler. The nature of the web is as a request/response system. So a web application should just sit and wait for a request, generate a response, and be done.
For any kind of back-end scheduled task, I'd recommend either:
A Windows Service
A console application scheduled to run (I think Windows comes with a task scheduler)
There are pros and cons either way. For example, a Windows Service will run from boot time and has no console UI, and is generally very manageable from a server perspective. While a console application is traditionally simpler to write and debug.
These can still share code from your web application. If your business logic and data access and all that good stuff are in their own projects/assemblies then these other applications can use those assemblies just as well. (Of course, if everything in your web application is UI-bound, that's another question entirely.)
What concerns me the most is... How do you plan to show this message to a user? Is the user just sitting on your website for hours at a time and you need to remind them to take their meds? Or do you plan to send an email or something? Maybe the example you gave doesn't really explain what you're trying to do? I'm not sure.
Running tasks in the background is one thing, but it seems to me that an entire half of your overall equation (displaying a message to the user) is sort of glazed over and not really thought through.
Check out Quartz.Net: http://quartznet.sourceforge.net/
It enables you to schedule tasks to run using cron expressions:
http://en.wikipedia.org/wiki/Cron
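For example, a small sketch (assuming the Quartz.NET 2.x API; PillReminderJob and the two-hour cron expression are just illustrative):

using Quartz;
using Quartz.Impl;

public class PillReminderJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // hypothetical: look up users whose reminder interval has elapsed and notify them
    }
}

public static class ReminderScheduler
{
    public static void Start()
    {
        IScheduler scheduler = new StdSchedulerFactory().GetScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<PillReminderJob>().Build();
        ITrigger trigger = TriggerBuilder.Create()
            .WithCronSchedule("0 0 0/2 * * ?")   // every two hours, on the hour
            .Build();

        scheduler.ScheduleJob(job, trigger);
    }
}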

Proper way to handle thousands of calls to external service from asp.net (mvc)

I'm tasked with creating a web application. I'm currently using C# and ASP.NET (MVC, but I doubt it's relevant to the question). I am a rookie developer and somewhat new to .NET.
Part of the logic in the application I'm building is to make requests to an external SMS gateway by hitting a particular URL, either as part of a user-initiated action in the web app (a couple of messages sent) or as part of a scheduled task run daily (several thousand messages sent).
For the daily task, I am afraid that looping, say, 10,000 times in one thread (especially if I also have to act on the response of each request, like writing to a database) is not the best strategy, and that I could gain some performance/time savings from parallelization.
Ultimately, I'm more afraid that thousands of users will perform the action that triggers a request at the same time (very likely). With a naive implementation that spawns some kind of background thread (whatever it's called) for each request, I fear a scenario with hundreds or thousands of requests in flight at once.
So, if my assumptions are correct, how do I deal with this? Do I have to manually spawn an appropriate number of new Thread()s and coordinate their work through a producer/consumer-like queue, or is there an easier way?
Cheers
If you have to make 10,000 requests to a service then it means that the service's API is anemic - probably CRUD-based, designed as a thin wrapper over a database instead of an actual service.
A single "request" to a well-designed service should convey all of the information required to perform a single "unit of work" - in other words, those 10,000 requests could very likely be consolidated into one request, or at least a small handful of requests. This is especially important if requests are going to a remote server or may take a long time to complete (and 2-3 seconds is an extremely long time in computing).
If you do not have control over the service, if you do not have the ability to change the specification or the API - then I think you're going to find this very difficult. A single machine simply can't handle 10,000 outgoing connections at once; it will struggle with even a few hundred. You can try to parallelize this, but even if you achieve a tenfold increase in throughput, it's still going to take half an hour to complete, which is the kind of task you probably don't want running on a public-facing web site (but then, maybe you do, I don't know the specifics).
Perhaps you could be more specific about the environment, the architecture, and what it is you're trying to do?
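For the "coordinate the work from a producer/consumer-like queue" part of the question, here is a rough sketch of a bounded worker pool (SendSms is a hypothetical wrapper around the gateway call):

using System.Collections.Concurrent;
using System.Threading.Tasks;

public static class SmsSendQueue
{
    private static readonly BlockingCollection<string> Pending = new BlockingCollection<string>();

    // start a fixed number of workers; thousands of queued sends never mean thousands of threads
    public static void Start(int workerCount)
    {
        for (int i = 0; i < workerCount; i++)
        {
            Task.Factory.StartNew(() =>
            {
                foreach (string message in Pending.GetConsumingEnumerable())
                    SendSms(message);                 // one outgoing call at a time per worker
            }, TaskCreationOptions.LongRunning);
        }
    }

    // producers (web requests or the daily task) just enqueue and move on
    public static void Enqueue(string message)
    {
        Pending.Add(message);
    }

    // hypothetical: hit the gateway URL and record the response in the database
    private static void SendSms(string message) { }
}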
In response to your update (possibly having thousands of users all performing an action at the same time that requires you to send one or two SMS messages for each):
This sounds like exactly the kind of scenario where you should be using Message Queuing. It's actually not too difficult to set up a solution using WCF. Some of the main reasons why one uses a message queue are:
There are a large number of messages to send;
The sending application cannot afford to send them synchronously or wait for any kind of response;
The messages must eventually be delivered.
And your requirements fit this like a glove. Since you're already on the Microsoft stack, I'd definitely recommend an asynchronous WCF service backed by MSMQ.
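As a rough outline of that shape (the queue path, the contract, and the phone-number parameter are all hypothetical; MSMQ must be installed and the queue created separately):

using System.ServiceModel;

[ServiceContract]
public interface ISmsDispatch
{
    // queued WCF operations must be one-way: the caller writes to the queue and returns immediately
    [OperationContract(IsOneWay = true)]
    void Send(string phoneNumber, string message);
}

public class SmsDispatchService : ISmsDispatch
{
    public void Send(string phoneNumber, string message)
    {
        // picked up from the queue by the service host; call the SMS gateway here
    }
}

public static class SmsDispatchClient
{
    public static void Queue(string phoneNumber, string message)
    {
        var binding = new NetMsmqBinding(NetMsmqSecurityMode.None);
        var address = new EndpointAddress("net.msmq://localhost/private/smsdispatch");
        var factory = new ChannelFactory<ISmsDispatch>(binding, address);
        ISmsDispatch proxy = factory.CreateChannel();
        proxy.Send(phoneNumber, message);             // returns as soon as the message is durably queued
        ((IClientChannel)proxy).Close();
        factory.Close();
    }
}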
If you are working with SOAP, or some other type of XML request, you may not have an issue dealing with that level of requests in a loop.
I set up something similar using a SOAP server with 4-5K requests with no problem...
A SOAP request to a web service (assuming .NET 2.0 or later) looks something like this:
WebServiceProxyClient myclient = new WebServiceProxyClient();
myclient.SomeOperation(parameter1, parameter2);
myclient.Close();
I'm assuming that this code will be embedded into the business logic that you trigger as part of the user-initiated action, or as part of the scheduled task.
You don't need to do anything special in your code to cope with a high volume of users. That is really a matter of scaling your platform.
When you say 10,000 requests, what do you mean? 10,000 requests per second/minute/hour? Or is that your page hits per day?
I'd also look into using an AsyncController, so that your site doesn't quickly become completely unusable.
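A minimal AsyncController sketch (MVC 3/4-style Async/Completed pair; SendMessagesAsync is a hypothetical wrapper that returns a Task with the number of messages sent):

using System.Threading.Tasks;
using System.Web.Mvc;

public class SmsController : AsyncController
{
    public void SendAsync(int campaignId)
    {
        AsyncManager.OutstandingOperations.Increment();
        SendMessagesAsync(campaignId).ContinueWith(t =>
        {
            AsyncManager.Parameters["sent"] = t.Result;         // handed to SendCompleted
            AsyncManager.OutstandingOperations.Decrement();     // tells MVC the action is finished
        });
    }

    public ActionResult SendCompleted(int sent)
    {
        return Json(new { sent }, JsonRequestBehavior.AllowGet);
    }

    // hypothetical placeholder for the real gateway calls
    private static Task<int> SendMessagesAsync(int campaignId)
    {
        return Task.Factory.StartNew(() => 0);
    }
}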
