How to create a "Keep Alive" for a chat app? - C#

At the moment I'm building a chat web app where multiple users can chat and it can handle multiple rooms. It's working and getting the job done.
Right now it uses AJAX (via jQuery) and simply sends GET requests to server.aspx with different query parameters, which then returns some content. (It is meant to be built into a larger project later on.)
But there is one thing I cannot figure out how to build for it, and I'm hoping someone has a splendid idea :)
A "Keep Alive" (or time-to-live) service on the users. The service should ensure that when a user disconnects (machine crash, browser/window close) the user times out of the chat room.
My idea was that every request from the user TO the server should update a TTL list (a list with a user id and a "timestamp"), and this part is easy.
Now comes my challenge:
There should then be some service running on the server that continuously checks this TTL list to see if any stamps have expired and, if so, removes the user from the room.
But how and where can I run this server-side service in .NET? Or do you have another approach? :)

I would just have a table called something like "LastPing" with a user id and a date.
Put a piece of JavaScript on the page which calls a page on your site at regular intervals (window.setInterval(...)) - that page just updates the table with the current datetime, or does an insert if no rows are updated.
Finally, create a SQL Server job/task that selects user ids from LastPing where the date is older than the current date minus 30 minutes (or whatever). Those user ids get deleted from any chat rooms etc. and finally removed from the LastPing table.
I think that's about it :)
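To make the ping page concrete, here is a minimal sketch of the server-side handler the setInterval call would hit. The LastPing table layout (UserId, LastSeen), the "ChatDb" connection string and the handler name are all assumptions for illustration, not part of the original suggestion:
public class KeepAliveHandler : System.Web.IHttpHandler
{
    public void ProcessRequest(System.Web.HttpContext context)
    {
        var userId = context.Request.QueryString["userId"];
        var connStr = System.Configuration.ConfigurationManager
            .ConnectionStrings["ChatDb"].ConnectionString;

        using (var conn = new System.Data.SqlClient.SqlConnection(connStr))
        using (var cmd = new System.Data.SqlClient.SqlCommand(
            @"UPDATE LastPing SET LastSeen = GETUTCDATE() WHERE UserId = @userId;
              IF @@ROWCOUNT = 0
                  INSERT INTO LastPing (UserId, LastSeen) VALUES (@userId, GETUTCDATE());", conn))
        {
            cmd.Parameters.AddWithValue("@userId", userId);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }

    public bool IsReusable { get { return true; } }
}
The client side then just calls this handler from window.setInterval every 30 seconds or so.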

You could run a console application (or run it as a Windows service) that scans your TTL list using a timer that ticks on a set interval and processes entries as you wish. It can all be done in .NET, saving you from having to store your business logic in an SSIS package within SQL Server.
If you're going down this path I would recommend writing a Windows service that can also be run as a console app. Query the Environment.UserInteractive property to work out which version is being run - this helps during development, because a console application can be a little more verbose than a Windows service.
Here is a code sample:
public partial class Service1 : ServiceBase
{
//Need to keep a reference to this object, else the Garbage Collector will clean it up and prevent further events from firing.
private System.Threading.Timer _timer;
static void Main(string[] args)
{
if (Environment.UserInteractive)
{
var service = new Service1();
Log.Debug("Starting Console Application"); // Log here stands for whatever logging framework you use
service.OnStart(args);
// The service can now be accessed.
Console.WriteLine("Service ready.");
Console.WriteLine("Press <ENTER> to terminate the application.");
Console.ReadLine();
service.OnStop();
return;
}
var servicesToRun = new ServiceBase[]
{
new Service1()
};
Run(servicesToRun);
}
public Service1()
{
InitializeComponent();
}
protected override void OnStart(string[] args)
{
// For a single instance this is a bit heavy-handed, but if you're creating a number of these,
// the NT service needs OnStart to return quickly, hence the use of QueueUserWorkItem
ThreadPool.QueueUserWorkItem(SetupTimer, args);
}
protected override void OnStop()
{
}
private void SetupTimer(object obj)
{
//Set the interval to 1 minute by default
const int interval = 1;
//Initialize the timer: wait 5 seconds before the first tick, then fire every 'interval' minutes
_timer = new Timer(TimerDelegate, null, 5000, 1000 * 60 * interval);
}
private static void TimerDelegate(object stateInfo)
{
//Perform your DB TTL Check here
}
}

For that type of solution, you would need to set up a separate thread that periodically checks for users to expire, or use a scheduled-task library and set up a scheduled task in a similar way.
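As a very rough sketch of the thread-based variant (RemoveExpiredUsers() and the 30-second poll interval are hypothetical placeholders for your own cleanup logic):
// Minimal sketch of a background polling thread.
// RemoveExpiredUsers() is a hypothetical method that deletes TTL entries older than your timeout.
var worker = new System.Threading.Thread(() =>
{
    while (true)
    {
        RemoveExpiredUsers();
        System.Threading.Thread.Sleep(TimeSpan.FromSeconds(30)); // poll interval
    }
});
worker.IsBackground = true; // a background thread won't keep the process alive on shutdown
worker.Start();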

Related

How to create a scheduled long running process using windows service in c#

I want to create a Windows service that performs some really long and heavy work. The code is inside the OnStart method, like this:
protected override void OnStart(string[] args)
{
System.IO.File.WriteAllText(
#"C:\MMS\Logs\WinServiceLogs.txt",
DateTime.Now + "\t MMS Service started."
);
this.RequestAdditionalTime(5*60*1000);
this.RunService();
}
this.RunService() sends a request to a WCF service library hosted on IIS. It does some really long processing, ranging from 1 to 20 minutes depending on the data it has to process. The service I'm writing is supposed to be scheduled to run every day in the morning. So far it runs and works fine, but when the time goes over a few seconds or minutes, it generates a timeout exception. This leaves the Windows service in an unstable state, and I can't stop or uninstall it without restarting the computer. Since I'm trying to create an automated system, this is an issue.
I did call this.RequestAdditionalTime(), but I'm not sure whether it's doing what it's supposed to. I no longer get the timeout error message, but now I don't know how to schedule the work so it runs every day. If the exception occurs, it won't run the next time. There were several articles and SO questions I found, but there's something I'm missing and I can't understand it.
Should I create a thread? Some articles say I shouldn't put heavy work in OnStart - where should I put the heavy code then? Right now, when the service starts, it does this huge data processing, which leaves the Windows service status at "Starting", and it stays there for a long time until the program either crashes due to a timeout or completes successfully. How can I start the service so that the status is set to Running while the code does the data processing?
Your situation might be better suited to a scheduled task, as Lloyd said in the comments above. But if you really want to use a Windows service, this is what you would need to add/update in your service code. It will allow your service to be listed as started and not time out on you. You can adjust the timer length to suit your needs.
private Timer processingTimer;
public YourService()
{
InitializeComponent();
//Initialize timer
processingTimer = new Timer(60000); //Set to run every 60 seconds
processingTimer.Elapsed += processingTimer_Elapsed;
processingTimer.AutoReset = true;
processingTimer.Enabled = true;
}
private void processingTimer_Elapsed(object sender, ElapsedEventArgs e)
{
//Check the time (timeCheck and haventRunToday are placeholders for your own checks)
if (timeCheck && haventRunToday)
{
//Run your code - you should probably still run this on a separate thread
this.RunService();
}
}
protected override void OnStart(string[] args)
{
//Start the timer
processingTimer.Start();
}
protected override void OnStop()
{
//Check to make sure that your code isn't still running... (if separate thread)
//Stop the timer
processingTimer.Stop();
}
protected override void OnPause()
{
//Stop the timer
processingTimer.Stop();
}
protected override void OnContinue()
{
//Start the timer
processingTimer.Start();
}

how to make scheduled process in asp.net/C# with mvc5

I have an MVC5 application that has a function to send an e-mail to a user after they subscribe to the newsletter. Now I have a requirement to send an e-mail to all users who are one month ahead of the expiry date of their subscription. In this case I need to implement a background process that will run every day at a specific time on the web server. How can I do that?
Thanks
You can create a Windows service to do that work for you.
You should follow this tutorial.
You can store the date/time at which you want your service to execute in the project's web.config/app.config. When the service's timer fires, you validate the time and call a generic function that does what you want. Follow this example:
public partial class Service1 : ServiceBase
{
YourServiceClass service;
private Timer serviceTimer;
public Service1()
{
InitializeComponent();
service = new YourServiceClass();
}
protected override void OnStart(string[] args)
{
TimerCallback timerDelegate = new TimerCallback(service.GetData); // You should add this function to your class. You have an example below
string time = System.Configuration.ConfigurationSettings.AppSettings["SceduleTime"]; // Gets time from app.config like 12:50
string[] timeS = time.Split(':');
DateTime DateIni = new DateTime(DateTime.Now.Year, DateTime.Now.Month, DateTime.Now.Day, Convert.ToInt32(timeS[0]), Convert.ToInt32(timeS[1]), 0);
TimeSpan diff = DateTime.Now - DateIni;
if (diff.TotalSeconds < 0)
diff = DateIni - DateTime.Now;
else
diff = DateIni.AddDays(1) - DateTime.Now;
// create timer and attach our method delegate to it
serviceTimer = new Timer(timerDelegate, service, diff, new TimeSpan(1, 0, 0, 0));
}
protected override void OnStop()
{
serviceTimer.Dispose();
}
}
And in your YourServiceClass class, add this function:
public void GetData(object state){
// Do something...
}
Hope it helps!
Personally, I'd create a separate dedicated Windows Service to periodically check for expiry, and then call the code to send the emails.
An alternative is to create a simple console application to run the task, and call it using the Windows Task Scheduler.
A less secure method is to use a ping service, to periodically hit a page with an obfuscated URL which processes the emails.
A slightly old, but relevant blog post, detailing the issues with recurrent background tasks in .net sites can be read here.
I recommend you use the Quartz library. Your process should be independent of your web application, since the web app will recycle from time to time based on user traffic (which triggers the Application_End event) and will only start again once a new request arrives (which triggers Application_Start), so you won't be able to manage your schedule's start and end correctly.
It really depends on your requirements, and whether you want this update to be truly independent on the server, in a web application or bundled with your app.
I recently found a great third-party library for ASP.NET called Hangfire.
It looks like it can be as simple as:
RecurringJob.AddOrUpdate(
() => Console.WriteLine("Transparent!"),
Cron.Daily);
It supports:
Delayed Jobs
Recurring Tasks
Automatic Retries
And it has a nicely themed Management/Reporting interface
I'm planning to use it for some scheduled batch tasks I need to run on a CRM system. I have no experience with it yet, but it does seem to have some great features, and I think I've seen it recommended a few times around Stack Overflow!
There are a ton of different ways to approach what you want (a scheduled task on the server is another, off the top of my head). This is a nice little package that lets me bundle the work with my web applications.

C# console app to send email at scheduled times

I've got a C# console app running on Windows Server 2003 whose purpose is to read a table called Notifications and a field called "NotifyDateTime" and send an email when that time is reached. I have it scheduled via Task Scheduler to run hourly, check to see if the NotifyDateTime falls within that hour, and then send the notifications.
It seems like because I have the notification date/times in the database that there should be a better way than re-running this thing every hour.
Is there a lightweight process/console app I could leave running on the server that reads in the day's notifications from the table and issues them exactly when they're due?
I thought of a service, but that seems like overkill.
My suggestion is to write a simple application which uses Quartz.NET.
Create 2 jobs:
The first fires once a day, reads all awaiting notification times from the database planned for that day, and creates triggers based on them.
The second, registered for those triggers (prepared by the first job), sends your notifications.
What's more,
I strongly advise you to create a Windows service for this purpose, just so you don't have a lonely console application constantly running. It could be accidentally terminated by someone who has access to the server under the same account. What's more, if the server is restarted, you have to remember to start such an application again manually, whereas a service can be configured to start automatically.
If you're using a web application you could always host this logic e.g. within the IIS application pool process, although that is a bad idea overall. Such a process is periodically restarted by default, so you would have to change its default configuration to be sure it is still working in the middle of the night when the application is not in use. Otherwise your scheduled tasks will be terminated.
UPDATE (code samples):
Manager class, internal logic for scheduling and unscheduling jobs. For safety reasons implemented as a singleton:
internal class ScheduleManager
{
private static readonly ScheduleManager _instance = new ScheduleManager();
private readonly IScheduler _scheduler;
private ScheduleManager()
{
var properties = new NameValueCollection();
properties["quartz.scheduler.instanceName"] = "notifier";
properties["quartz.threadPool.type"] = "Quartz.Simpl.SimpleThreadPool, Quartz";
properties["quartz.threadPool.threadCount"] = "5";
properties["quartz.threadPool.threadPriority"] = "Normal";
var sf = new StdSchedulerFactory(properties);
_scheduler = sf.GetScheduler();
_scheduler.Start();
}
public static ScheduleManager Instance
{
get { return _instance; }
}
public void Schedule(IJobDetail job, ITrigger trigger)
{
_scheduler.ScheduleJob(job, trigger);
}
public void Unschedule(TriggerKey key)
{
_scheduler.UnscheduleJob(key);
}
}
First job, for gathering required information from the database and scheduling notifications (second job):
internal class Setup : IJob
{
public void Execute(IJobExecutionContext context)
{
try
{
foreach (var kvp in DbMock.ScheduleMap)
{
var email = kvp.Value;
var notify = new JobDetailImpl(email, "emailgroup", typeof(Notify))
{
JobDataMap = new JobDataMap {{"email", email}}
};
var time = new DateTimeOffset(DateTime.Parse(kvp.Key).ToUniversalTime());
var trigger = new SimpleTriggerImpl(email, "emailtriggergroup", time);
ScheduleManager.Instance.Schedule(notify, trigger);
}
Console.WriteLine("{0}: all jobs scheduled for today", DateTime.Now);
}
catch (Exception e) { /* log error */ }
}
}
Second job, for sending emails:
internal class Notify: IJob
{
public void Execute(IJobExecutionContext context)
{
try
{
var email = context.MergedJobDataMap.GetString("email");
SendEmail(email);
ScheduleManager.Instance.Unschedule(new TriggerKey(email));
}
catch (Exception e) { /* log error */ }
}
private void SendEmail(string email)
{
Console.WriteLine("{0}: sending email to {1}...", DateTime.Now, email);
}
}
Database mock, just for purposes of this particular example:
internal class DbMock
{
public static IDictionary<string, string> ScheduleMap =
new Dictionary<string, string>
{
{"00:01", "foo#gmail.com"},
{"00:02", "bar#yahoo.com"}
};
}
Main entry of the application:
public class Program
{
public static void Main()
{
FireStarter.Execute();
}
}
public class FireStarter
{
public static void Execute()
{
var setup = new JobDetailImpl("setup", "setupgroup", typeof(Setup));
var midnight = new CronTriggerImpl("setuptrigger", "setuptriggergroup",
"setup", "setupgroup",
DateTime.UtcNow, null, "0 0 0 * * ?");
ScheduleManager.Instance.Schedule(setup, midnight);
}
}
If you're going to use a service, just put this main logic into the OnStart method (I advise starting the actual logic in a separate thread so that you don't wait for the service to start, and likewise avoid possible timeouts - not in this particular example obviously, but in general):
protected override void OnStart(string[] args)
{
try
{
var thread = new Thread(x => WatchThread(new ThreadStart(FireStarter.Execute)));
thread.Start();
}
catch (Exception e) { /* log error */ }
}
If so, encapsulate the logic in some wrapper e.g. WatchThread which will catch any errors from the thread:
private void WatchThread(object pointer)
{
try
{
((Delegate) pointer).DynamicInvoke();
}
catch (Exception e) { /* log error and stop service */ }
}
You are trying to implement a polling approach, where a job monitors a record in the DB for changes.
In this case we are hitting the DB periodically, so if the one-hour delay is later reduced to 1 minute, this solution becomes a performance bottleneck.
Approach 1
For this scenario, use a queue-based approach to avoid any issues; you can also scale up the number of instances if you are sending a lot of emails.
I understand there is a program that updates NotifyDateTime in a table; the same program can push a message onto a queue announcing that there is a notification to handle.
A Windows service watches this queue for incoming messages; when a message arrives it performs the required operation (i.e. sending the email).
Approach 2
http://msdn.microsoft.com/en-us/library/vstudio/zxsa8hkf(v=vs.100).aspx
You can also invoke C# code from a SQL Server stored procedure if you are using MS SQL Server, but in that case you are using your SQL Server process to send mail, which is not good practice.
However, you can invoke a web service or WCF service which can send the emails.
Still, Approach 1 is error-free, scalable, trackable and asynchronous, and it doesn't trouble your database or app; you have a separate process to send email.
Queues
Use MSMQ, which is part of Windows Server (a minimal sketch follows this list).
You can also try https://www.rabbitmq.com/dotnet.html
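A minimal MSMQ sketch, assuming a private queue and a string message carrying a notification id; the queue path and method names are illustrative, not from the answer above:
using System.Messaging; // reference System.Messaging.dll

// Producer side: the program that updates NotifyDateTime also drops a message on the queue.
public static void EnqueueNotification(string notificationId)
{
    const string queuePath = @".\Private$\EmailNotifications"; // hypothetical queue name
    if (!MessageQueue.Exists(queuePath))
        MessageQueue.Create(queuePath);
    using (var queue = new MessageQueue(queuePath))
    {
        queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
        queue.Send(notificationId);
    }
}

// Consumer side: the Windows service blocks on Receive() until a message arrives.
public static void ProcessNextNotification()
{
    const string queuePath = @".\Private$\EmailNotifications";
    using (var queue = new MessageQueue(queuePath))
    {
        queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
        Message message = queue.Receive();
        string notificationId = (string)message.Body;
        // look up the notification by id and send the email here
    }
}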
Pre-scheduled tasks (at undefined times) are generally a pain to handle, as opposed to regularly scheduled tasks, for which Quartz.NET seems well suited.
Furthermore, a distinction should be made between fire-and-forget tasks that shouldn't be interrupted or changed (e.g. retries, notifications) and tasks that need to be actively managed (e.g. campaigns or communications).
For fire-and-forget tasks a message queue is well suited. If the destination is unreliable, you will have to opt for retry levels (e.g. try send (max twice), retry after 5 minutes, try send (max twice), retry after 15 minutes), which at minimum requires message-specific TTLs with a send queue and a retry queue. Here's an explanation with a link to code to set up a retry-level queue.
Managed pre-scheduled tasks will require a database queue approach (click here for a CodeProject article on designing a database queue for scheduled tasks). This will allow you to update, remove or reschedule notifications, provided you keep track of ownership identifiers (e.g. specify a user id and you can delete all pending notifications when the user should no longer receive them, such as being deceased or unsubscribed).
Scheduled e-mail tasks (like any communication tasks) require finer-grained control (expiration, retry and time-out mechanisms). The best approach here is to build a state machine that can move the e-mail task through its steps (expiration, pre-validation, pre-mailing steps such as templating, inlining CSS, making links absolute, adding tracking objects for open tracking, shortening links for click tracking, post-validation, sending and retrying).
Hopefully you are aware that the .NET SmtpClient isn't fully compliant with the MIME specifications and that you should be using a SaaS e-mail provider such as Amazon SES, Mandrill, Mailgun, Customer.io or Sendgrid. I'd suggest you look at Mandrill or Mailgun. Also, if you have some time, take a look at MimeKit, which you can use to construct MIME messages for providers that accept raw e-mail and don't necessarily support things like attachments, custom headers or DKIM signing.
I hope this sets you on the right path.
Edit
You will have to use a service to poll at specific intervals (e.g. 15 seconds or 1 minute). The database load can be somewhat reduced by checking out a batch of due tasks at a time and keeping an internal pool of messages due for sending (with a time-out mechanism in place). When no messages are returned, just 'sleep' the polling for a while. I would advise against building such a system against a single table in a database - instead, design an independent e-mail scheduling system that you can integrate with.
I would turn it into a service instead.
You can use a System.Threading.Timer with a callback for each of the scheduled times.
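A minimal sketch of that idea; the notification id, the 9:00 time and the sendNotification delegate are made-up placeholders:
int notificationId = 42;                        // assumed: read from the Notifications table
DateTime notifyAt = DateTime.Today.AddHours(9); // assumed: the NotifyDateTime value

Action<int> sendNotification = id =>
    Console.WriteLine("Sending notification {0} at {1}", id, DateTime.Now);

TimeSpan due = notifyAt - DateTime.Now;
if (due < TimeSpan.Zero)
    due = TimeSpan.Zero;

// Fires once at the scheduled time; a period of -1 ms means "don't repeat".
var timer = new System.Threading.Timer(
    state => sendNotification((int)state),
    notificationId,
    due,
    TimeSpan.FromMilliseconds(-1));
// Keep a reference to the timer (e.g. in a list) so it isn't garbage collected before it fires.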
Scheduled tasks can be scheduled to run just once at a specific time (as opposed to hourly, daily, etc.), so one option would be to create the scheduled task when the specific field in your database changes.
You don't mention which database you use, but some databases support the notion of a trigger, e.g. in SQL: http://technet.microsoft.com/en-us/library/ms189799.aspx
If you know ahead of time when the emails need to be sent, then I suggest you wait on an event handle with an appropriate timeout. At midnight, look at the table, then wait on an event handle with the timeout set to expire when the next email needs to be sent. After sending the email, wait again with the timeout set based on the next mail that should be sent.
Also, based on your description, this should probably be implemented as a service, but it is not strictly required.
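A rough sketch of the wait-handle loop, where GetNextNotificationTime() and SendDueNotifications() are hypothetical helpers standing in for your own data access and mailing code:
// Hypothetical helpers: GetNextNotificationTime() returns the earliest pending
// NotifyDateTime for today (or null), SendDueNotifications() sends whatever is due.
var wakeUp = new System.Threading.ManualResetEvent(false);

while (true)
{
    DateTime? next = GetNextNotificationTime();
    if (next == null)
        break; // nothing left for today; repeat the whole thing at midnight

    TimeSpan wait = next.Value - DateTime.Now;
    if (wait < TimeSpan.Zero)
        wait = TimeSpan.Zero;

    // WaitOne returns false when the timeout expires (time to send), and true if
    // another thread signals the handle early, e.g. because a new row was inserted.
    if (!wakeUp.WaitOne(wait))
        SendDueNotifications();
}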
I dealt with the same problem about three years ago. I changed the process several times before it was good enough; let me tell you why:
The first implementation used a special daemon from the web host which called the IIS website. The website checked the caller IP, then checked the database and sent the emails. This worked until one day I got a lot of very angry emails from users whose mailboxes I had totally spammed. The drawback of keeping emails in a database and sending them via SMTP is that there is NOTHING which ensures a DB-to-SMTP transaction. You are never sure whether the email has actually been sent. Sending an email can succeed, can fail, or it can be a false positive or a false negative (the SMTP client tells you the email was not sent, but it was). There was some problem with the SMTP server and it reported failure (email not sent), but the email was actually sent. The daemon kept resending the email every hour all day before the angry emails appeared.
Second implementation: to prevent spamming, I changed the algorithm so that the email is considered sent even if sending failed (my email notifications were not too important). My first piece of advice is: "Don't launch the daemon too often, because these false-negative SMTP errors make users upset."
After several months there were some changes on the server and the daemon stopped working well. I got an idea from Stack Overflow: bind a .NET timer to the web application domain. It wasn't a good idea, because it seems IIS can restart the application from time to time because of memory leaks, and the timer never fires if the restarts are more frequent than the timer ticks.
The last implementation: the Windows scheduler fires a Python batch every hour which reads the local website. This fires the ASP.NET code. The advantage is that the Windows scheduler calls the local batch and website reliably. IIS doesn't hang; it has restart ability. The timer site is part of my website; it is still one project. (You can use a console app instead.) Simple is better. It just works!
Your first choice is the correct option, in my opinion. Task Scheduler is the MS-recommended way to perform periodic jobs. Moreover, it's flexible, can report failures to ops, and is optimized and amortized amongst all tasks in the system.
Creating any console-style app that runs all the time is fragile. It can be shut down by anyone, needs an open session, and doesn't restart automatically.
The other option is creating some kind of service. It's guaranteed to be running all the time, so that would at least work. But what was your motivation?
"It seems like because I have the notification date/times in the database that there should be a better way than re-running this thing every hour."
Oh yeah, optimization... So you want to add a new permanently running service to your computer just to avoid one potentially unnecessary SQL query every hour? The cure looks worse than the disease to me.
And I haven't even mentioned all the drawbacks of a service. On one hand, your scheduled task uses no resources when it isn't running. It's very simple and lightweight, and the query is efficient (provided you have the right index).
On the other hand, if your service crashes, it's probably gone for good. It needs a way to be notified of new e-mails that may need to be sent earlier than what's currently scheduled. It permanently uses computer resources, such as memory. Worse, it may contain memory leaks.
I think that the cost/benefit ratio is very low for any solution other than the trivial periodic task.

Background task in an ASP.NET web app

I'm fairly new to C#, and recently built a small web app using .NET 4.0. This app has two parts: one is designed to run permanently and will continuously fetch data from given resources on the web. The other accesses that data upon request to analyze it. I'm struggling with the first part.
My initial approach was to set up a Timer object that would execute a fetch operation (whatever that operation is doesn't really matter here) every, say, 5 minutes. I would define that timer in Application_Start and let it live after that.
However, I recently realized that applications are created/destroyed based on user requests (from my observation they seem to be destroyed after some period of inactivity). As a consequence, my background activity will stop/resume out of my control, whereas I would like it to run continuously, with absolutely no interruption.
So here comes my question: is that achievable in a web app? Or do I absolutely need a separate Windows service for that kind of thing?
Thanks in advance for your precious help!
Guillaume
While doing this in a web app is not ideal, it is achievable, given that the site is always up.
Here's a sample: I'm creating a Cache item in global.asax with an expiration. When it expires, an event is fired. You can fetch your data or whatever in the OnRemove() event.
Then you set up a call to a page (preferably a very small one) that will trigger code in Application_BeginRequest, which adds the Cache item back with an expiration.
global.asax:
private const string VendorNotificationCacheKey = "VendorNotification";
private const int IntervalInMinutes = 60; //Expires after X minutes & runs tasks
protected void Application_Start(object sender, EventArgs e)
{
//Set value in cache with expiration time
CacheItemRemovedCallback callback = OnRemove;
Context.Cache.Add(VendorNotificationCacheKey, DateTime.Now, null, DateTime.Now.AddMinutes(IntervalInMinutes), TimeSpan.Zero,
CacheItemPriority.Normal, callback);
}
private void OnRemove(string key, object value, CacheItemRemovedReason reason)
{
SendVendorNotification();
//Need Access to HTTPContext so cache can be re-added, so let's call a page. Application_BeginRequest will re-add the cache.
var siteUrl = ConfigurationManager.AppSettings.Get("SiteUrl");
var client = new WebClient();
client.DownloadData(siteUrl + "default.aspx");
client.Dispose();
}
private void SendVendorNotification()
{
//Do Tasks here
}
protected void Application_BeginRequest(object sender, EventArgs e)
{
//Re-add if it doesn't exist
if (HttpContext.Current.Request.Url.ToString().ToLower().Contains("default.aspx") &&
HttpContext.Current.Cache[VendorNotificationCacheKey] == null)
{
//ReAdd
CacheItemRemovedCallback callback = OnRemove;
Context.Cache.Add(VendorNotificationCacheKey, DateTime.Now, null, DateTime.Now.AddMinutes(IntervalInMinutes), TimeSpan.Zero,
CacheItemPriority.Normal, callback);
}
}
This works well if your scheduled task is quick.
If it's a long-running process, you definitely need to keep it out of your web app.
As long as the first request has started the application, this will keep firing every 60 minutes even if the site has no visitors.
I suggest putting it in a Windows service. You avoid all the hoops mentioned above, the big one being IIS restarts. A Windows service also has the following benefits:
It can automatically start when the server starts. If you are running in IIS and your server reboots, you have to wait until a request is made to start your process.
You can place the data-fetching process on another machine if needed.
If you end up load-balancing your website across multiple servers, you could accidentally end up with multiple data-fetching processes causing you problems.
It's easier to maintain the code separately (single responsibility principle) - the code just does what it needs to do, without also trying to fool IIS.
Create a static class whose static constructor creates a timer event.
However, as Steve Sloka mentioned, IIS has a timeout that you will have to manipulate to keep the site going.
using System.Runtime.Remoting.Messaging;
public static class Variables
{
static Variables()
{
m_wClass = new WorkerClass();
// creates and registers an event timer
m_flushTimer = new System.Timers.Timer(1000);
m_flushTimer.Elapsed += new System.Timers.ElapsedEventHandler(OnFlushTimer);
m_flushTimer.Start();
}
private static void OnFlushTimer(object o, System.Timers.ElapsedEventArgs args)
{
// determine the frequency of your update
if (System.DateTime.Now - m_timer1LastUpdateTime > new System.TimeSpan(0,1,0))
{
// call your class to do the update
m_wClass.DoMyThing();
m_timer1LastUpdateTime = System.DateTime.Now;
}
}
private static readonly System.Timers.Timer m_flushTimer;
private static System.DateTime m_timer1LastUpdateTime = System.DateTime.MinValue;
private static readonly WorkerClass m_wClass;
}
public class WorkerClass
{
public delegate WorkerClass MyDelegate();
public void DoMyThing()
{
m_test = "Hi";
m_test2 = "Bye";
//create async call to do the work
MyDelegate myDel = new MyDelegate(Execute);
AsyncCallback cb = new AsyncCallback(CommandCallBack);
IAsyncResult ar = myDel.BeginInvoke(cb, null);
}
private WorkerClass Execute()
{
//do my stuff in an async call
m_test2 = "Later";
return this;
}
public void CommandCallBack(IAsyncResult ar)
{
// this is called when your task is complete
AsyncResult asyncResult = (AsyncResult)ar;
MyDelegate myDel = (MyDelegate)asyncResult.AsyncDelegate;
WorkerClass command = myDel.EndInvoke(ar);
// command is a reference to the original class that invoked the async call
// m_test will equal "Hi"
// m_test2 will equal "Later";
}
private string m_test;
private string m_test2;
}
I think you can achieve it by using a BackgroundWorker, but I would rather suggest you go for a service.
Your application context lives as long as your worker process in IIS is running. In IIS there are some default timeouts that control when the worker process recycles (e.g. number of idle minutes (20), or a regular interval (1740 minutes)).
That said, if you adjust those settings in IIS, you should be able to keep the application alive; however, the other answers suggesting a service would work as well - it's just a matter of how you want to implement it.
I recently built file-upload functionality for uploading Access files to the database (not the best way, but just a temporary fix to a long-term issue).
I solved it by creating a background thread that ran through the ProcessAccess function and was deleted when it completed.
Unless IIS has a setting by which it kills a thread after a set amount of time regardless of activity, you should be able to create a thread that calls a function that never ends. Don't use recursion, because the number of open stack frames will eventually blow up in your face; just use a for(;;) loop so it keeps busy :)
The Application Initialization Module for IIS 7.5 does precisely this type of init work. More details on the module are available here: Application Initialization Module.

Needed: A Windows Service That Executes Jobs from a Job Queue in a DB; Wanted: Example Code

Needed:
A Windows Service That Executes Jobs from a Job Queue in a DB
Wanted:
Example Code, Guidance, or Best Practices for this type of Application
Background:
A user will click on an ashx link that will insert a row into the DB.
I need my windows service to periodically poll for rows in this table, and it should execute a unit of work for each row.
Emphasis:
This isn't completely new terrain for me.
EDIT: You can assume that I know how to create a Windows Service and basic data access.
But I need to write this service from scratch.
And I'd just like to know upfront what I need to consider.
EDIT: I'm most worried about jobs that fail, contention for jobs, and keeping the service running.
Given that you are dealing with a database queue, a fair chunk of the job is already done for you thanks to the transactional nature of databases. A typical queue-driven application has a loop that does:
while(1) {
Start transaction;
Dequeue item from queue;
process item;
save new state of item;
commit;
}
If processing crashes midway, the transaction rolls back and the item is processed on the next service start-up.
But writing queues in a database is actually a lot trickier than you might believe. If you deploy a naive approach, you'll find that your enqueue and dequeue block each other and the ashx page becomes unresponsive. Next you'll discover that concurrent dequeues deadlock and your loop constantly hits error 1205. I strongly urge you to read the article Using Tables as Queues.
Your next challenge is going to be getting the polling rate 'just right'. Too aggressive and your database will be burning hot from the polling requests. Too lax and your queue will grow at rush hour and drain too slowly. You should consider an entirely different approach: use a SQL Server built-in QUEUE object and rely on the magic of the WAITFOR(RECEIVE) semantics. This allows for a completely poll-free, self-load-tuning service. Actually, there is more: you may not need a service at all. See Asynchronous Procedures Execution for an explanation of what I'm talking about: launching processing asynchronously in SQL Server from a web service call, in a completely reliable manner. And finally, if the logic must live in a C# process, you can leverage the External Activator, which allows the processing to be hosted in standalone processes as opposed to T-SQL procedures.
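To make the table-as-queue idea concrete, here is a minimal sketch of a transactional dequeue along the lines of that article; the table and column names (JobQueue, JobId, Payload) are assumptions for illustration:
using System.Data.SqlClient;

public static class JobQueue
{
    // TryDequeue pops one row inside a transaction; a crash before Commit puts the row back.
    public static bool TryDequeue(string connectionString, out int jobId, out string payload)
    {
        jobId = 0;
        payload = null;

        const string sql =
            @"DELETE TOP (1) FROM dbo.JobQueue WITH (ROWLOCK, READPAST)
              OUTPUT deleted.JobId, deleted.Payload;";

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var tx = conn.BeginTransaction())
            using (var cmd = new SqlCommand(sql, conn, tx))
            {
                using (var reader = cmd.ExecuteReader())
                {
                    if (!reader.Read())
                        return false; // queue is empty
                    jobId = reader.GetInt32(0);
                    payload = reader.GetString(1);
                }
                // Do the unit of work here, inside the transaction, so a crash
                // rolls the row back onto the queue.
                tx.Commit();
                return true;
            }
        }
    }
}
The READPAST hint lets concurrent workers skip rows that are already locked instead of blocking on them, which is the main point the linked article makes.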
First you'll need to consider
How often to poll.
Whether your service just stops and starts, or also supports pause and continue.
Concurrency. Services can increase the likelihood of encountering a problem.
Implementation
Use a System.Timers.Timer, not a Threading.Timer.
Make sure you set Timer.AutoReset to false. This stops the re-entrancy problem.
Make sure to account for execution time when scheduling the next tick.
Here's the basic framework of all those ideas. It also includes a way to debug it, which is a pain.
public partial class Service : ServiceBase{
System.Timers.Timer timer;
public Service()
{
timer = new System.Timers.Timer();
//When AutoReset is true there are re-entrancy problems
timer.AutoReset = false;
timer.Elapsed += new System.Timers.ElapsedEventHandler(DoStuff);
}
private DateTime LastChecked = DateTime.MinValue;
private void DoStuff(object sender, System.Timers.ElapsedEventArgs e)
{
//GetData() and DoSomething() are placeholders for your own queue query and unit of work
Collection stuff = GetData();
LastChecked = DateTime.Now;
foreach (Object item in stuff)
{
try
{
item.DoSomething();
}
catch (System.Exception ex)
{
this.EventLog.Source = "SomeService";
this.EventLog.WriteEntry(ex.ToString());
this.Stop();
}
}
//Account for the time spent processing when scheduling the next tick
TimeSpan ts = DateTime.Now.Subtract(LastChecked);
TimeSpan MaxWaitTime = TimeSpan.FromMinutes(5);
if (MaxWaitTime.Subtract(ts).CompareTo(TimeSpan.Zero) > -1)
timer.Interval = MaxWaitTime.Subtract(ts).TotalMilliseconds;
else
timer.Interval = 1;
timer.Start();
}
protected override void OnPause()
{
base.OnPause();
this.timer.Stop();
}
protected override void OnContinue()
{
base.OnContinue();
this.timer.Interval = 1;
this.timer.Start();
}
protected override void OnStop()
{
base.OnStop();
this.timer.Stop();
}
protected override void OnStart(string[] args)
{
foreach (string arg in args)
{
if (arg == "DEBUG_SERVICE")
DebugMode();
}
#if DEBUG
DebugMode();
#endif
timer.Interval = 1;
timer.Start();
}
private static void DebugMode()
{
Debugger.Break();
}
}
EDIT Fixed loop in Start()
EDIT Turns out Milliseconds is not the same as TotalMilliseconds
You may want to have a look at Quartz.Net to manage scheduling the jobs. Not sure if it will fit your particular situation, but it's worth a look.
Some things I can think of, based on your edit:
Re: job failure:
Determine whether a job can be retried and do one of the following:
Move the row to an "error" table for logging / reporting later OR
Leave the row in the queue so that it will be reprocessed by the job service
You could add a column like WaitUntil or something similar to delay retrying the job after a failure
Re: contention:
Add a timestamp column such as "JobStarted" or "Locked" to track when the job was started. This prevents other threads (assuming your service is multithreaded) from trying to execute the job simultaneously (a small claim-query sketch follows below).
You'll also need some cleanup process that goes through and clears stale jobs for re-processing (in case the job service fails and your lock is never released).
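For example, a sketch of such a claim; the Jobs table, column names and connection-string value are assumptions for illustration:
// A claim is just a conditional UPDATE: only the worker whose UPDATE affects a row owns the job.
int jobId = 1;                                                                // the job this worker wants
string connectionString = "Server=.;Database=Jobs;Integrated Security=true"; // assumed
string claimSql =
    @"UPDATE dbo.Jobs
      SET Locked = GETUTCDATE()
      WHERE JobId = @jobId AND Locked IS NULL;";

using (var conn = new System.Data.SqlClient.SqlConnection(connectionString))
using (var cmd = new System.Data.SqlClient.SqlCommand(claimSql, conn))
{
    cmd.Parameters.AddWithValue("@jobId", jobId);
    conn.Open();
    bool claimed = cmd.ExecuteNonQuery() == 1; // 0 rows means another worker got there first
    if (claimed)
    {
        // execute the unit of work, then mark the job done or move it to the error table
    }
}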
Re: keeping the service running
You can tell Windows to restart a service if it fails.
You can detect a previous failure on startup by keeping some kind of file open while the service is running and deleting it upon successful shutdown. If your service starts up and that file already exists, you know the previous run failed, and you can alert an operator or perform the necessary cleanup operations (a small sketch follows below).
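A small sketch of that marker-file idea (the file path is an arbitrary assumption):
// Keep a marker file open while the service runs; delete it on clean shutdown.
// Finding it at startup means the previous run died unexpectedly.
private const string MarkerPath = @"C:\ProgramData\MyJobService\service.running"; // assumed path
private System.IO.FileStream _marker;

protected override void OnStart(string[] args)
{
    if (System.IO.File.Exists(MarkerPath))
    {
        // previous run did not shut down cleanly: alert an operator / clean up stale locked jobs
    }
    System.IO.Directory.CreateDirectory(System.IO.Path.GetDirectoryName(MarkerPath));
    _marker = new System.IO.FileStream(MarkerPath, System.IO.FileMode.Create,
        System.IO.FileAccess.Write, System.IO.FileShare.None);
}

protected override void OnStop()
{
    _marker.Close();
    System.IO.File.Delete(MarkerPath);
}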
I'm really just poking around in the dark here. I'd strongly suggest prototyping the service and returning with any specific questions about the way it functions.
