Is it right to set the Interval property as follows to ensure the timer fires at an interval of 24 hours every day?
this.NotificationTimer = new System.Timers.Timer();
this.NotificationTimer.Interval = 86400000D;
I am converting 24 hours to 86,400,000 ms.
Please confirm whether this is the right approach/value; if not, please explain why.
The Interval property gets or sets the interval, expressed in milliseconds, at which to raise the Elapsed event, so what you are doing is correct. See MSDN.
A slightly more compact and arguably easier-to-read option is to pass the interval to the Timer constructor:
this.NotificationTimer = new System.Timers.Timer(86400000D);
To calculate the correct number of milliseconds you can use a TimeSpan method. This improves readability; the disadvantage is a (negligibly) slower execution:
this.NotificationTimer = new System.Timers.Timer(TimeSpan.FromHours(24).TotalMilliseconds);
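As a quick sanity check (a minimal, self-contained sketch; the timer setup itself is unchanged), the TimeSpan conversion confirms the hand-computed constant:

```csharp
using System;

class IntervalCheck
{
    static void Main()
    {
        // 24 h * 60 min * 60 s * 1000 ms = 86,400,000 ms
        double ms = TimeSpan.FromHours(24).TotalMilliseconds;
        Console.WriteLine(ms); // 86400000
    }
}
```

One caveat worth knowing: a fixed 24-hour interval measures elapsed time, not wall-clock time, so the firing moment can drift across DST changes or if the machine sleeps.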
I have tried DispatcherTimer, but it doesn't seem to be working correctly.
I have a Tick event handler that increments a tick counter on every tick, and it's just not keeping up. I also have a Stopwatch to measure how long it's been, and the numbers don't match up. Please let me know what kind of solution would give me 192 ticks each second.
Stopwatch sw = new Stopwatch();
public DispatcherTimer dt = new DispatcherTimer();
dt.Tick += dt_Tick;
dt.Interval = TimeSpan.FromMilliseconds(1000/192);
dt.Start();
sw.Start();
void dt_Tick(object sender, EventArgs e)
{
    tick_textbox.Text = tick_counter.ToString();
    seconds_textbox.Text = sw.Elapsed.ToString();
    tick_counter++;
}
Now, I've lowered it to 8 per second, which should avoid the resolution problem, but I'm getting wildly different outcomes from an interval built with TimeSpan.FromSeconds versus TimeSpan.FromMilliseconds:
dt.Tick += dt_Tick;
dt.Interval = TimeSpan.FromSeconds(2 / 16);
dt.Start();
vs.
dt.Tick += dt_Tick;
dt.Interval = TimeSpan.FromMilliseconds(2000 / 16);
dt.Start();
What is the reason for that?
You're asking for an event every ~5 ms (1000/192 truncates to 5 in integer arithmetic), and the .NET timers are simply not reliable at that resolution. As for the difference between your two snippets: 2 / 16 is integer division and evaluates to 0, so TimeSpan.FromSeconds(2 / 16) yields a zero interval, whereas 2000 / 16 is 125, giving the 125 ms you intended.
http://www.informit.com/guides/content.aspx?g=dotnet&seqNum=815
From the article:
The conclusion I drew from all my testing is that, at best, you can count on a timer to tick within 15 milliseconds of its target time. It's rare for the timer to tick before its target time, and I never saw it be early by more than one millisecond. The worst case appears to be that the timer will tick within 30 milliseconds of the target time. Figure the worst case by taking your desired target frequency (i.e. once every 100 milliseconds), rounding up to the next multiple of 15, and then adding 15. So, absent very heavy CPU load that prevents normal processing, a 100 ms timer will tick once every 99 to 120 ms.
You definitely can't get better resolution than 15 milliseconds using these timers. If you want something to happen more frequently than that, you have to find a different notification mechanism. No .NET timer object will do it.
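The worst-case rule of thumb from the quoted article can be written out directly (a sketch; the 15 ms figure is the typical Windows timer resolution the article assumes):

```csharp
using System;

class TimerWorstCase
{
    // Round the target interval up to the next multiple of 15 ms,
    // then add 15 ms, per the article's rule of thumb.
    static int WorstCaseMs(int targetMs)
    {
        int roundedUp = ((targetMs + 14) / 15) * 15;
        return roundedUp + 15;
    }

    static void Main()
    {
        Console.WriteLine(WorstCaseMs(100)); // 120, matching the article's 99-120 ms example
    }
}
```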
There are ways to get this resolution but they typically involve specific hardware designed for high-frequency timing applications and driver interop for their events. I've done this before using an acousto-optic modulator, laser source and CCD.
Scenario:
In a WinForms application (C#), I have a DataGridView in which I have to display 4 countdown timers in the format "mm:ss".
The timer interval must be 1000 ms.
I was counting down in the Elapsed event of System.Timers.Timer.
All 4 timers count down from 2 minutes (02:00).
Issue:
It takes more time (125 seconds) than 2 minutes to reach 00:00.
Similarly, for 4 minutes it takes 7-10 extra seconds (247-250) to reach 00:00.
Timers on general-purpose operating systems are somewhat inaccurate beasts. Better systems generally provide better timers, but even the best system has slippage.
You also need to remember that your process isn't in full control 100% of the time: it will at times be pushed into the background and have to share the processor with other applications, so it can only do its best to keep track of the time.
What you probably want is a High Precision Timer (aka stopwatch) in C#. Have a look at This thread and This article on selecting timer mechanisms for some more information.
If you need time resolution of that type (i.e. actual clock or countdown clock), you should use the real-time clock.
You still use a timer with sub-second resolution to fire frequently enough for display purpose, but you don't add up those times, you use the real-time clock to get the real elapsed time (DateTime.Now - startTime).
First off, you should run this little test and see the drift in near real time. I lose a second after about 72 Elapsed events (and this is with very little actual work):
// Timer here is System.Timers.Timer (using System.Timers;)
using (var timer = new Timer(1000))
{
    var start = DateTime.Now;
    var i = 0;
    timer.Elapsed += (sender, args) =>
        Console.WriteLine("{0} -> {1}", ++i, (DateTime.Now - start).TotalMilliseconds);
    timer.Start();
    Thread.Sleep(130 * 1000);
}
I'm not sure how precise your app needs to be, but you can get "good enough" by taking the delta between the start time and now, subtracting that from your initial value, and stopping the timer at zero. You can still lose a second with this approach, and there's a reasonable chance the lost second could happen at 0:00, causing a -0:01 tick, which you will need to handle.
var countdownSeconds = 120;
var startedAt = DateTime.Now;
var timer = new Timer(1000);
timer.Elapsed += (sender, args) => Display(countdownSeconds - (int)((DateTime.Now - startedAt).TotalSeconds));
timer.Start();
//- be sure to dispose the timer
I am trying to set up a timer tick based on the time set by a DateTimePicker; however, I don't get the tick at all.
if (dateTimePicker1.Value >= DateTime.Now)
{
sendOrder.Interval =(int) (dateTimePicker1.Value.Ticks-DateTime.Now.Ticks);
sendOrder.Enabled = true;
}
In the above code, I set the interval to the difference between the DateTimePicker time and the current time. What am I doing wrong?
Assuming you are using System.Timers.Timer, Interval is in milliseconds, whereas you are specifying ticks. There are 10,000 ticks in 1 millisecond.
Try this instead (divide before casting, so the tick delta doesn't overflow int for intervals longer than a few minutes):
sendOrder.Interval = (int)((dateTimePicker1.Value.Ticks - DateTime.Now.Ticks) / 10000);
DateTime.Ticks Property:
A single tick represents one hundred nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond.
Timer.Interval Property:
The time, in milliseconds
So your timer is firing; the interval is just off by a factor of 10,000.
Instead of twiddling with ticks, you can use the TimeSpan.TotalMilliseconds Property to get the difference between the two DateTime values in milliseconds:
sendOrder.Interval = (dateTimePicker1.Value - DateTime.Now).TotalMilliseconds;
Ticks are in units of 100 nanoseconds; there are 10,000 ticks in a millisecond, while Interval is in milliseconds. So that is the error.
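A small sketch of the conversion, using the framework's own constant rather than the magic number 10000 (the 30-second delta is just an illustration):

```csharp
using System;

class TickConversion
{
    static void Main()
    {
        // TimeSpan.TicksPerMillisecond == 10000, so dividing a tick delta
        // by it yields milliseconds — which is what Timer.Interval expects.
        long deltaTicks = TimeSpan.FromSeconds(30).Ticks;
        double intervalMs = (double)deltaTicks / TimeSpan.TicksPerMillisecond;
        Console.WriteLine(intervalMs); // 30000
    }
}
```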
I want to have a running timer in my app which displays the seconds elapsed up to 2 decimal places, e.g. 2.31 seconds. I'm using System.Threading.Timer for this, and I'm setting the Timer object to call refreshTimeBox() every 10 milliseconds (because I want the seconds to 2 decimal places). But the timer lags behind, and as the time passed increases, it lags more and more. I'm guessing it's because refreshTimeBox() is taking too long to complete. But I also tried calling refreshTimeBox() every 100 milliseconds instead; the timer still lags. The only difference is that the lag becomes noticeable after a longer time than in the 10-millisecond case. I have the following code to initialize the timer:
timer = new Timer(refreshTimeBox, null, 0, 10);
The following is the code for refreshTimeBox:
public void refreshTimeBox(object param)
{
    // Assumes each callback fires exactly 10 ms apart — this assumption is the source of the lag.
    time += 0.01f;
    Dispatcher.BeginInvoke(WriteTimeBox);
}
The following is the code for WriteTimeBox:
public void WriteTimeBox()
{
    TimeBar.Text = time.ToString("0.00");
}
time is the variable which stores the time elapsed and TimeBar is the text box which is being updated.
I want as accurate a timer as possible. Please help me with this. Thanks.
If you want to display the amount of time that has elapsed since a particular event in the past, then the best way to do that is to store the time at "time zero", e.g. startTime, and then, in your timer event, compute the time that has elapsed by subtracting startTime from the current time, and display the difference.
You can't rely on timer events being delivered at exact intervals. This isn't a realtime system. Added to which, you should be aware that BeginInvoke is likely requesting that the method call occur on a different thread to the current one, and that different thread may not be able to dispatch that method call at the current time.
You could also use a Stopwatch:
System.Diagnostics.Stopwatch sw=new System.Diagnostics.Stopwatch();
sw.Start();
And in the update function
TimeBar.Text = (sw.ElapsedMilliseconds / 1000.0).ToString("0.00"); // Text needs a string; this gives "2.31"-style seconds
Then you don't have to worry about UTC issues
Timers always have the possibility of lagging. What you need to do is store when you started, then compare it to the current time.
private DateTime startTime;

public void startTimer()
{
    startTime = DateTime.Now;
}

public void refreshTimeBox(object param)
{
    Dispatcher.BeginInvoke(WriteTimeBox);
}

public void WriteTimeBox()
{
    TimeSpan ts = DateTime.Now - startTime;
    // TimeSpan has no .time member; TotalSeconds (a double) formats directly.
    // See http://msdn.microsoft.com/en-us/library/1ecy8h51.aspx for TimeSpan formatting.
    TimeBar.Text = ts.TotalSeconds.ToString("0.00");
}
I have a dll consumed by a service. Its basic job is to run every X minutes and perform some system checks.
In my DLL I have a top-level class that declares a System.Threading.Timer and a TimerCallback.
The constructor for the class initialises the TimerCallback with my thread function.
In my "OnStart" handler I initialise the timer with the TimerCallback and set the first due time and the interval; in my case it's every 10 minutes.
Usually in these 10-minute checks there is nothing to do, but the service is forced to do something at least once every day at a set time.
My problem: I am finding during testing that the time the daily check is carried out is slowly drifting away from the desired start time of 08:30. Over about 20 days my time has drifted from 08:30 to 08:31:35; it drifts about 4-6 seconds every day.
My question: does anyone know why the time is drifting like this, and how can I make it stick to its allotted time?
thanks
The time "drifts" because the timer is simply not that precise. If you need to run your code as closely as possible to a certain interval, you can do something like this:
public void MyTimerCallback(object something)
{
    var now = DateTime.UtcNow;
    var shouldProbablyHaveRun = new DateTime(
        now.Year, now.Month, now.Day,
        now.Hour, now.Minute - (now.Minute % 10), 0);
    var nextRun = shouldProbablyHaveRun.AddMinutes(10.0);

    // Do stuff here!

    var diff = nextRun - DateTime.UtcNow;
    // Timeout.InfiniteTimeSpan (-1 ms) means "don't repeat";
    // new TimeSpan(-1) is -1 tick and would throw.
    timer.Change(diff, Timeout.InfiniteTimeSpan);
}
...assuming you are using a System.Threading.Timer instance. Modify the example if you are using any other type of timer (there are several!).
Why not check every minute if the action needs to be performed?
ie:
if ((DateTime.Now.Minute % 10) == 0)
It takes a finite amount of time to do the operations in your timer handler, so it makes sense that it's not going to happen every 10 minutes to the second, especially if you schedule the next wakeup after doing your checks. Since you are already checking whether it is time to do the work, you should make your timer fire more frequently than the resolution you need and trust that check to decide when to execute. You probably also need some sort of persistence to make sure the work doesn't execute twice (if that is important) in case of a shutdown/restart, when the in-memory state of whether it has already run is lost.
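A minimal sketch of that pattern (the names and the 08:30 daily job are illustrative assumptions, not the poster's code): fire every minute, and let a wall-clock check decide whether the daily work is due.

```csharp
using System;
using System.Timers;

class DailyCheckSketch
{
    static DateTime lastRun = DateTime.MinValue; // persist this across restarts in real code
    static readonly TimeSpan DueTime = new TimeSpan(8, 30, 0); // hypothetical 08:30 daily job

    static void Main()
    {
        var timer = new System.Timers.Timer(60_000); // fire every minute
        timer.Elapsed += (s, e) =>
        {
            var now = DateTime.Now;
            // Run once per day, the first time we wake up at or after 08:30.
            if (now.TimeOfDay >= DueTime && lastRun.Date < now.Date)
            {
                lastRun = now;
                Console.WriteLine("Running daily check at {0:HH:mm:ss}", now);
            }
        };
        timer.Start();
        Console.ReadLine();
    }
}
```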
Here is my take:
while ((DateTime.Now - lastRunTime).TotalSeconds < 600)
    Thread.Sleep(1000); // Thread.Sleep is static; there is no CurrentThread.Sleep
or just register a windows timer and execute in response to the event/callback
public static void Main()
{
    System.Timers.Timer aTimer = new System.Timers.Timer();
    aTimer.Elapsed += new ElapsedEventHandler(OnTimedEvent);
    // Set the Interval to 600 seconds (10 minutes).
    aTimer.Interval = 600000;
    aTimer.Enabled = true;
    Console.WriteLine("Press 'q' to quit the sample.");
    while (Console.Read() != 'q') ;
}

// Specify what you want to happen when the Elapsed event is raised.
private static void OnTimedEvent(object source, ElapsedEventArgs e)
{
    Console.WriteLine("10 minutes passed!");
}
Timers aren't exact, just approximate. Don't use the logic of "just add 10 minutes": each time your timer fires, you need to check for time skew and adjust.
E.g. if you say "wake me up in 10 min" and it wakes you up in 10 min 1 sec, then the next timer needs to be 9 min 59 sec, not 10 min.
Also, you want to assign your next timer at the end of your logic.
E.g. say you want to start taskA every 10 minutes and it takes 2 seconds to run. Your timer fires 10 minutes later, taskA runs and finishes, and then you add 10 minutes. Because the task took 2 seconds to run, each cycle is now skewed by 2 seconds.
What you need to do is predict the next time you need to run and find the difference between now and then and set the timer to that difference.
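That advice can be sketched as follows (a sketch under the stated assumptions, not the poster's code): keep an absolute nextDue time, advance it by the interval each callback, and set the timer to the remaining difference, so per-cycle skew and work duration don't accumulate.

```csharp
using System;
using System.Threading;

class DriftFreeScheduler
{
    static readonly TimeSpan Interval = TimeSpan.FromMinutes(10);
    static Timer timer;      // System.Threading.Timer
    static DateTime nextDue; // absolute next-run time, not "now + 10 min"

    static void Main()
    {
        nextDue = DateTime.UtcNow + Interval;
        // One-shot timer; each callback schedules the next one.
        timer = new Timer(Callback, null, Interval, Timeout.InfiniteTimeSpan);
        Thread.Sleep(Timeout.Infinite);
    }

    static void Callback(object state)
    {
        // ... do the periodic work here ...

        // Advance the absolute due time; the next delay is measured from the
        // fixed schedule, not from "now", so errors don't accumulate.
        nextDue += Interval;
        var delay = nextDue - DateTime.UtcNow;
        if (delay < TimeSpan.Zero) delay = TimeSpan.Zero; // we're late; fire ASAP
        timer.Change(delay, Timeout.InfiniteTimeSpan);
    }
}
```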