I have a Quartz.NET client/server setup that fires text messages on a schedule using a third-party library.
Quartz.Server.exe runs as a Windows service on my staging and development environments, pointing to jobs stored in SQL Server; my website uses Quartz.Simpl.ZeroSizeThreadPool and just schedules the jobs. Everything was working fine until it wasn't.
Apparently something caused the exe to stop running, and even though it was running as a Windows service with recovery options set to email me when it went down, I never got the email.
So when I restarted the server, 15 days' worth of old text messages went out with misleading phrases like "Appointment tomorrow # 9:00 AM" for appointments from 5 days ago.
So now I have updated my IJob code so that the job is discarded if the fire time is after the appointment time. Since I was now using DateTime values instead of plain strings, I set "quartz.jobStore.useProperties = false". When I was ready to deploy, I realized that this change, along with some other changes, could break the scheduler so that it wouldn't fire historical triggers created before the changes.
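Roughly, the guard looks like this (a sketch only; the job class name, the "appointmentTime" data-map key, and the SMS call are illustrative placeholders, not my real code):

    using System;
    using Quartz;

    // Sketch of the "discard if stale" guard described above.
    public class AppointmentReminderJob : IJob
    {
        public void Execute(IJobExecutionContext context)
        {
            DateTimeOffset appointmentTime =
                context.MergedJobDataMap.GetDateTimeOffset("appointmentTime");

            // If the trigger is firing after the appointment has already passed
            // (e.g. because the scheduler was down), skip the reminder entirely.
            if (context.ScheduledFireTimeUtc.HasValue &&
                context.ScheduledFireTimeUtc.Value > appointmentTime)
            {
                return;
            }

            // ... send the text message here ...
        }
    }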
I spiraled down a 12+ hour black hole wrestling with settings to get my new jobs to fire alongside my old jobs, both locally and on my staging environment. Tried a million things, including every combination of Quartz running as a service or a console app, pointing to SQL locally or on staging. Said potty-words. Then said some prayers.
Started over. Tested whether the code already running on the live server would fire both new and old jobs. It wouldn't. I set "quartz.jobStore.useProperties = false" in quartz.config. It worked! So I did a new deployment Monday evening with the changes and everything seemed to be working fine. I did a "keeping my job" dance, revisited my error logging/recovery setup, and created a nightly SQL job to count the number of triggers for yesterday, today, and tomorrow each morning.
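For reference, the same yesterday/today/tomorrow count can also be done through the scheduler API instead of querying the QRTZ_ tables directly. A rough sketch of the idea, with an invented helper name:

    using System;
    using System.Linq;
    using Quartz;
    using Quartz.Impl.Matchers;

    // Counts triggers whose next fire time falls inside the given window.
    public static class TriggerCounter
    {
        public static int CountTriggersDueBetween(IScheduler scheduler, DateTimeOffset from, DateTimeOffset to)
        {
            return scheduler.GetTriggerKeys(GroupMatcher<TriggerKey>.AnyGroup())
                .Select(key => scheduler.GetTrigger(key))
                .Count(trigger =>
                {
                    DateTimeOffset? next = trigger.GetNextFireTimeUtc();
                    return next.HasValue && next.Value >= from && next.Value < to;
                });
        }
    }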
Here it is Wednesday morning, and I check that SQL job to find 150+ triggers from yesterday have not been handled (everything past 9:00 AM EST). I go to my live server and the Quartz service is still running. I stop the service, go to the folder where it lives, and Right-click > Run as administrator. Here is the error I get in my log.txt file. And of course now the Windows service fails to start.
I need major help fast! I need a montage, and at the end of the montage I need to be a Quartz.NET ninja. Marko Lahma (or anyone), will you "hold my hand" and tell me I am going to be able to keep my job? I need my setup to be bulletproof (more robust, and failing more noisily when it does fail).
EDIT - 201404241938Z
Here is my code that checks for the new values and broke the old jobs:
    // I think this next line is causing an error
    DateTimeOffset oDateReminder = data.GetDateTimeOffset("reminderTime");
    if (oDateReminder.DateTime > DateTime.MinValue)
    {
        // Do stuff with other "new job" datamap keys
        …
    }

Changing it to:

    if (data["reminderTime"] != null)
    {
        // Do stuff with other "new job" datamap keys
        …
    }
EDIT - 201404242205Z
This seems to have worked. I'll wait and award some bounty.
As you probably know, the data in the DB is now corrupted: it contains both binary-serialized JobDataMaps and binary-serialized NameValueCollections.
Would you like to give the latest head of the repository's master branch a whirl? I've ninja-committed support for recovering from this mixed-data situation. Just take the latest source, build it with a custom .snk file, and drop the resulting DLL in and run.
This is for version 2.2.x, so take note if you are using something like 2.1 or 2.0.
The issue is tracked here https://github.com/quartznet/quartznet/issues/172
As for your original issue, one easy fix would be to ignore misfires: if an SMS notification doesn't fire within the expected time window, it should probably be skipped rather than fired immediately once the scheduler catches up. Logging might also help you find out what caused the original downtime (Windows updates, SQL Server not yet started, etc.).
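For a one-shot reminder trigger, one way to express "skip it rather than replay it" is a misfire instruction on the trigger's schedule. A sketch only; the identity names are made up, and you should confirm the exact misfire semantics for your trigger type and Quartz.NET version:

    using System;
    using Quartz;

    public static class ReminderScheduling
    {
        public static ITrigger BuildReminderTrigger(DateTimeOffset reminderTimeUtc)
        {
            return TriggerBuilder.Create()
                .WithIdentity("smsReminder-" + Guid.NewGuid(), "smsReminders")
                .StartAt(reminderTimeUtc)
                .WithSimpleSchedule(x => x
                    // On a misfire, reschedule past the missed time instead of
                    // firing immediately; a one-shot trigger simply completes.
                    .WithMisfireHandlingInstructionNextWithRemainingCount())
                .Build();
        }
    }

Note that whether a late trigger even counts as a misfire depends on quartz.jobStore.misfireThreshold, so check that setting as well.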
Related
I'm in the process of deploying an application that uses Hangfire (version 1.7.24) with a MySQL database.
I am struggling with one problem. The application automatically schedules actions and adds these calls as BackgroundJobs in Hangfire. The problem is that these jobs must not run on user-defined holidays or weekends; a job that would be processed on a weekend or holiday should be moved to the next possible date.
Is it possible to move an already scheduled job, or is the only solution (which I don't much like) to delete the currently scheduled job and add a new one with the updated date?
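For reference, a minimal sketch of the delete-and-reschedule route mentioned above, using Hangfire's BackgroundJob.Delete and BackgroundJob.Schedule (the notification method and the next-allowed-date calculation are assumptions):

    using System;
    using Hangfire;

    public static class JobMover
    {
        public static string MoveScheduledJob(string existingJobId, DateTimeOffset newRunAt)
        {
            // Remove the job that would run on a weekend or holiday...
            BackgroundJob.Delete(existingJobId);

            // ...and schedule a replacement for the next allowed date.
            return BackgroundJob.Schedule(() => NotificationSender.SendNotification(), newRunAt);
        }
    }

    public static class NotificationSender
    {
        public static void SendNotification()
        {
            // ... the action the application normally schedules ...
        }
    }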
I tried deserializing the data in the "hangfire.hangfirestate" table for the corresponding scheduled job and changed the "EnqueueAt" value inside the "Data" column.
The Hangfire dashboard correctly shows that the job is now due at the date and time I changed manually, but unfortunately the Hangfire server does not pick up these changes and proceeds to execute the job at the initial date.
Can you help with this topic? Thank you in advance.
I've created a new Windows service in Visual Studio using C#. The service needs to run once per day, and I have a timer that checks every hour whether it is time to run the service code.
However, the server it runs on can sometimes restart, and the service could miss that time. So I want to store the date and time the code was last run, so that every hour, when I check the time, I can also check whether the code has already run today or still needs to run.
So is there a way to store a date and time within a Windows service without creating files or storing it in a database?
"The service needs to run once per day, and I have a timer that checks every hour whether it is time to run the service code."
And why the heck is this a Windows service? Make it a command-line program run by the Task Scheduler, instead of wasting memory for the rest of the day.
"So is there a way to store a date and time within a Windows service without creating files or storing it in a database?"
In a variable. If the service stops, it's gone, since it only lives in memory. But once you rule out files (and a database), that is all that is left.
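To make the limitation concrete, here is a minimal sketch of the in-memory approach (class name, check hour, and timer interval are made up):

    using System;
    using System.Timers;

    public class DailyWorker
    {
        private readonly Timer _timer = new Timer(TimeSpan.FromHours(1).TotalMilliseconds);

        // Lives only in memory: lost whenever the service stops or the server restarts.
        private DateTime _lastRunDate = DateTime.MinValue;

        public void Start()
        {
            _timer.Elapsed += OnElapsed;
            _timer.Start();
        }

        private void OnElapsed(object sender, ElapsedEventArgs e)
        {
            // Run once per day, e.g. at or after 02:00, if we haven't run today yet.
            if (DateTime.Now.Hour >= 2 && _lastRunDate.Date < DateTime.Today)
            {
                _lastRunDate = DateTime.Now;
                // ... do the daily work here ...
            }
        }
    }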
That said, the whole approach is bad. Forget the service: make it a task and use the Windows Task Scheduler.
I am basically creating a site for recruiters. One piece of functionality in my application requires posting to Facebook periodically. The posting frequency can range from 0 (never) to 4 (high).
For example, if a recruiter has 4 open jobs and the posting frequency set to 4, each job should be posted in its turn: the 1st job on day 1, the 2nd job on day 2, the 3rd job on day 3, and so on; on day 5 the 1st job is posted again (round-robin fashion).
Had he set the posting frequency to 2, two jobs would be posted daily (so each job would be posted every 2 days).
My only question is what kind of threading I should create for this, since it is all dynamic. Also, any guidelines on what information I should store in the database?
I just need a general strategy to solve this problem, no code.
I think you need to separate it from your website; I mean it's better to run the job-posting logic in a service hosted on IIS (I am not sure whether such a thing exists, but I guess it does).
You also need a job-queue table to remember which jobs need to be posted; your service would then pick them up and post them one by one.
To decide whether it is time to post a job, you can define a timer with a configurable interval that checks whether there is any job to post.
Make sure you keep verbose log details when posting fails. This is important because Facebook might change its API, your API key might become invalid, or something else might go wrong, and you will need to know what happened.
I also strongly suggest having a web page that reports the status of the jobs-to-post queue and, for failed posts, the cause of the problem.
If your program runs non-stop, you can just use one of the Timer classes available in the .NET Framework, without the need to go for full-blown concurrency (e.g. via the Task Parallel Library).
I suspect, though, that you'll need more than that: some kind of mechanism to detect which jobs were successfully posted and which were "missed" because the program wasn't running (or because of network problems, etc.), so they can be posted the next time the program starts (or the network becomes available). A small local database (such as SQLite or SQL Server Compact) should serve this purpose nicely.
If the requirements are as simple as you describe, I wouldn't use threading at all. It wouldn't even need to be a long-running app. I'd create a simple app that just tries to post a job and then exits immediately, and I would schedule it to run once every given period (via the Windows Task Scheduler).
This app would first check whether it has already posted a job for the given posting frequency. Maybe put a "Last-Successful-Post-Time" setting in your data store. If it is allowed to post, the app would query the highest-priority job and post it to Facebook. Once the post succeeds, that job would be downgraded to the lowest priority.
The job priority could just be a simple integer column in your data store; lower values mean higher priority.
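A minimal sketch of that post-and-demote loop, assuming a small console app run by the Task Scheduler (the JobPost type, the frequency check, and the persistence are all illustrative):

    using System;
    using System.Collections.Generic;
    using System.Linq;

    public class JobPost
    {
        public int Id { get; set; }
        public string Title { get; set; }
        public int Priority { get; set; }   // lower value = posted sooner
    }

    public static class JobPoster
    {
        public static void PostNextJob(IList<JobPost> openJobs, DateTime lastSuccessfulPost, TimeSpan postingInterval)
        {
            // Respect the recruiter's posting frequency (e.g. one post per day or per two days).
            if (DateTime.UtcNow - lastSuccessfulPost < postingInterval)
            {
                return;
            }

            // Pick the highest-priority job (lowest Priority value).
            JobPost next = openJobs.OrderBy(j => j.Priority).FirstOrDefault();
            if (next == null)
            {
                return;
            }

            // ... call the Facebook API here; only on success: ...

            // Demote the posted job so the others get their turn (round robin).
            next.Priority = openJobs.Max(j => j.Priority) + 1;
            // ... persist the new Priority and the Last-Successful-Post-Time ...
        }
    }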
Edit:
I guess what I'm suggesting is that if you have clear boundaries in your requirements, break the project into multiple applications. That way there is a separation of concerns, and you don't have to worry about how to spawn the Facebook-posting process from inside your web site code.
Let me give everybody some background before I get to my problem. My company hosts websites for many clients, and it also contracts some of the work out to another company.
So when we first set up a client's website, we pass that information to the contracted company, and all three of us have the same data. The problem is that once a site is up and running, clients will change some data, and whenever they do we need to update the contracted company.
The way we transfer data to the contracted company is a web service (HTTP POST, XML data). My question is: what is the best way to write a program that sends updated data to the contracted company every time our clients change something?
1) Write a Windows service with a timer in the code that connects to the database every 30 minutes or so, finds all the changes, and sends them to the contracted company.
2) Write the same code as #1 (without the timer), but make it a simple program and let the Windows Task Scheduler wake it every 30 minutes.
3) Any other suggestion you may have
Technologies available to me are VS 2008 and SQL Server 2005.
A scheduled task is the way to go. Jon wrote up a good summary of why services are not well suited for this sort of thing: http://weblogs.asp.net/jgalloway/archive/2005/10/24/428303.aspx
A service is easy to create and install and feels more "professional", so why not go that way? A non-service EXE would also work, of course, and would be slightly easier to get running (permissions, etc.), but I think the difference in setup between the two is nearly negligible.
One possible solution would be to add a timestamp column to your data tables.
Once that is done, you can keep one entry per table recording the last time the contracted company collected data. They can then pull all records changed since that time and update their records accordingly.
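A sketch of that incremental pull, assuming a last-modified datetime column (the table, column, and helper names are made up):

    using System;
    using System.Data.SqlClient;

    public static class ChangeReader
    {
        // Assumes an already-open connection; returns only rows changed since the last transfer.
        public static SqlDataReader GetChangesSince(SqlConnection connection, DateTime lastTransferTime)
        {
            SqlCommand command = new SqlCommand(
                "SELECT * FROM ClientData WHERE LastModified > @lastTransferTime", connection);
            command.Parameters.AddWithValue("@lastTransferTime", lastTransferTime);
            return command.ExecuteReader();
        }
    }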
A Windows service is more self-contained, and you can easily configure it to start automatically when the OS boots. You might also need to create additional configuration options, as well as some way to trigger the synchronization immediately.
It also gives you more room to grow the service's functionality in the future.
A standalone app should be easier to develop, though; however, you are relying on the Windows scheduler to always execute the task. My experience has been that it is easier for things to go wrong with the Windows scheduler and for the task not to run, for example when you reboot the OS but no user has logged in.
If you want a more professional approach, go with the service, even though it might mean a little more work.
A Windows service makes more sense in this case. Think about what happens after your server is restarted:
With a Windows application, you need someone to restart the application, or you have to copy a shortcut into the Startup folder to make sure the application gets launched,
OR,
With a Windows service, you set it to start automatically and forget about it. When the machine reboots, your service starts up and continues processing.
One more consideration: what happens when there is an error? A Windows application would likely show an error dialog and wait for input before continuing, whereas a service would log the error in the event log and carry on.
I have to create an app that reads some info from a database, processes the data, writes changes back to the database, and then emails those changes to some users or groups. I will be writing this in C#, the process must run once a week at a particular time, and it will run on a Windows Server 2008 machine.
In the past, I would always go the route of creating a Windows service with a timer, with the run day/time set in the app.config file so it can be changed and picked up with just a service restart.
Recently, though, I have seen blog posts recommending writing a console application instead and using a scheduled task to execute it.
I have read many posts on this very issue, but have not seen a definitive answer about which approach is better.
What do any of you think?
Thanks for any thoughts.
If it is a once-per-week application, why waste the resources to keep it running in the background for the rest of the week?
A console application seems much more appropriate.
The typical rule of thumb that I use goes something like this. First I ask a few questions:
1. Frequency of execution
2. Frequency of changes to #1
3. Triggering mechanism
Basically, from there: if the frequency of execution is daily or less frequent, I'll almost always lean towards a scheduled task. Then, looking at the frequency of changes, if there is a high demand for schedule changes I'll also lean towards a scheduled task, to allow schedule changes without code changes. Lastly, if there is ever a thought of a trigger other than time, I'll lean towards a Windows service to help "future-proof" the application; say, for example, the requirement changes to run every time a user drops a file in folder X.
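For that last case, a tiny sketch of a non-time trigger that really wants a resident process (the path, filter, and class names are made up):

    using System.IO;

    public class DropFolderWatcher
    {
        private readonly FileSystemWatcher _watcher = new FileSystemWatcher(@"C:\DropFolder", "*.csv");

        public void Start()
        {
            // React the moment a user drops a file, instead of polling on a schedule.
            _watcher.Created += (sender, e) => ProcessFile(e.FullPath);
            _watcher.EnableRaisingEvents = true;
        }

        private void ProcessFile(string path)
        {
            // ... handle the newly dropped file ...
        }
    }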
The basic rule I follow is: if you need to be running continuously because events of interest can happen at any time, use a service (or a daemon on UNIX).
If you just want to periodically do something, use a scheduled task (or cron).
The clincher here is your phrase "must be run once a week at a particular time" - go for a scheduled task.
If you have only one application and it needs to run once a week, a scheduled task is probably the better fit, since there is no need to have a separate service and process running on the system that will sit idle most of the time.