Please be kind; I'm just learning C#, and inheriting this application from a former employee is my first C# project.
I am observing inconsistent and slow periods with System.Windows.Forms.Timer. The application is written in C# with MS Visual Studio.
The timer is set for an interval of 100 msec yet I am observing periods ranging from 110 msec to 180 msec.
I am using several tools to observe this including:
- a SW oscilloscope (the Iocomp.Instrumentation.Plotting.Plot package),
- a real oscilloscope,
- letting the timer run for some time and comparing the number of ticks * 100 msec to both the system time and to a stopwatch.
In all cases I am observing a 10% lag that becomes evident within the first few seconds.
The methods that are executed with each tick take fewer than 4 msec to run. There is no time-consuming asynchronous processing happening, either. This shouldn't matter, though, as the timer tick is an interrupt, not an event added to an event handler queue (as far as I know).
Has anyone experienced a problem like this before? What were the root causes?
Thanks.
Timers are only as accurate as the operating system clock interrupt, which by default ticks 64 times per second, i.e. every 15.625 msec. You cannot get a clean 100 msec interval from that; 100 isn't divisible by 15.625. You get the next integer multiple, 7 x 15.625 = 109.375 msec, very close to the 110 msec you observed.
On top of this theoretical minimum you need to add the latency in the handling of the timer notification. Timers have to compete with everything else that's going on in your UI thread, and they are treated as the least important notification to be delivered: sent messages go first, user input next, painting next, timer messages last. So if you have an elaborate user interface that takes a while to repaint, the Tick event is going to be delayed until that's done. The same goes for any event handler you write that does something non-trivial, like reading a file or querying a database.
To get a more responsive timer that doesn't suffer from this kind of latency, you need an asynchronous timer: System.Threading.Timer or System.Timers.Timer. Avoid the latter; System.Timers.Timer quietly swallows exceptions thrown in its Elapsed handler, which hides bugs. Their callback runs on a threadpool thread, so it can get running pretty quickly. Be very careful what you do in that callback; a lot of things are off-limits because they are not thread-safe, in particular anything that touches the UI.
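For illustration, here is a minimal sketch of the System.Threading.Timer approach in a WinForms context. The TickSource class and the way it updates the form's title are made up for this example, not taken from the question; the point is the threadpool callback plus BeginInvoke for the UI work.

using System;
using System.Windows.Forms;

class TickSource : IDisposable
{
    private readonly System.Threading.Timer _timer;
    private readonly Form _form;

    public TickSource(Form form)
    {
        _form = form;
        // first callback after 100 ms, then every 100 ms, on a threadpool thread
        _timer = new System.Threading.Timer(OnTick, null, 100, 100);
    }

    private void OnTick(object state)
    {
        // Keep this short and thread-safe; marshal any UI work back to the UI thread.
        DateTime now = DateTime.UtcNow;
        _form.BeginInvoke(new Action(() => _form.Text = now.ToString("HH:mm:ss.fff")));
    }

    public void Dispose()
    {
        _timer.Dispose();
    }
}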
You can make these timers more accurate by changing the clock interrupt rate. That requires pinvoke: call timeBeginPeriod() at the start and timeEndPeriod() with the same period when you're done.
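As a reference, a small sketch of what those pinvoke declarations can look like. The HighResClock wrapper name is made up; only timeBeginPeriod/timeEndPeriod themselves are the real winmm.dll functions.

using System;
using System.Runtime.InteropServices;

static class HighResClock
{
    [DllImport("winmm.dll")]
    private static extern uint timeBeginPeriod(uint uPeriod);

    [DllImport("winmm.dll")]
    private static extern uint timeEndPeriod(uint uPeriod);

    // Run a timing-critical piece of work with the clock interrupt at 1 ms.
    public static void Run(Action timingCriticalWork)
    {
        timeBeginPeriod(1);                  // global setting, affects the whole machine
        try { timingCriticalWork(); }
        finally { timeEndPeriod(1); }        // always restore the default rate
    }
}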
Yes, I have always run into this issue with System.Windows.Forms.Timer; it does not tick accurately most of the time.
You can try System.Timers.Timer instead; it raises its Elapsed event much more precisely (at least at 100 ms precision).
System.Windows.Forms.Timer is really just a wrapper for the native WM_TIMER message. This means the timer message is placed in the message queue at a time roughly close to the interval you requested (plus or minus; there's no guarantee here). When that message is processed is entirely dependent on the other messages in the queue and how long each takes to process. For example, if you block the UI thread (and thus block the queue from processing new messages), you won't get the timer event until after you unblock.
Windows is not a real-time operating system, so you can't expect fine-grained accuracy from timers. If you want something more fine-grained, a multimedia timer is the best choice.
This is old, but in case anyone comes here looking for an actually correct answer:
From https://msdn.microsoft.com/en-us/library/system.windows.forms.timer(v=vs.110).aspx (emphasis mine):
The Windows Forms Timer component is single-threaded, and is limited to an accuracy of 55 milliseconds. If you require a multithreaded timer with greater accuracy, use the Timer class in the System.Timers namespace.
So with Windows.Forms.Timer you can get 55 ms, 110 ms, 165 ms, and so on, which is consistent with what you were seeing. If you need higher precision, try System.Timers.Timer or System.Threading.Timer.
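A rough sketch of the System.Timers.Timer route at a 100 ms interval; the console logging is just a placeholder, and if the handler has to touch the UI you would set SynchronizingObject to a control instead.

using System;

class Program
{
    static void Main()
    {
        var timer = new System.Timers.Timer(100);   // interval in milliseconds
        timer.AutoReset = true;                     // keep firing, not just once
        timer.Elapsed += (sender, e) =>
        {
            // raised on a threadpool thread unless SynchronizingObject is set
            Console.WriteLine(e.SignalTime.ToString("HH:mm:ss.fff"));
        };
        timer.Start();

        Console.ReadLine();                         // keep the process alive for the demo
    }
}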
Related
I am using Multimedia timers in my application (C# .NET) to increase timer accuracy and achieve a 1 ms timer frequency. My application had been working fine until recently, when it started behaving strangely. I am trying to understand what is wrong with it. Below is the sequence of events:
- The timer frequency is set to 1 ms, and the callback should be called every 1 ms.
- There are 4 threads, each creating its own timer object. They are all set to call the callback every 1 ms. These are individual instances, not shared.
- The old code's callback execution time was about 0.3 ms. This worked fine until the next step.
- The application code was changed, and the timer callback now takes about 1.2 ms to execute. This is clearly a problem. (I am going to optimize the code later; for now I just want to understand the multimedia timer behavior.)
- Only the 1st thread's timer callback keeps being called, whereas for the other threads the callback is called only two or three times and after that it is never called again.
It looks like the timer event is missed for the other threads (?) and they cannot catch up (it is missed on every interrupt).
Could you please explain the behavior of the timer objects to me? Are all the threads actually pointing to the same timer object, since it is a single process?
Why are the other threads not calling the timer callback?
The maximum resolution for the Multimedia timer is 1 ms. This causes the programmable interrupt controller (on the hardware) to fire every 1 ms. Firing up 4 threads that each create a timer with a 1 ms period does not mean you will get events more often than once per millisecond.
I encourage you to read the "Why are the Multimedia Timer APIs (timeSetEvent) not as accurate as I would expect?" blog post on MSDN.
Some quotes that are applicable here (emphasis mine):
The MM Timer APIs allow the developer to reprogram the Programmable Interrupt Controller (PIC) on the machine. You can specify the new timer resolution. Typically, we will set this to 1 millisecond. This is the maximum resolution of the timer. We can't get sub-millisecond accuracy. The effect of this reprogramming of the PIC is to cause the OS to wake up more often. This increases the chances that our application will be notified by the operating system at the time we specified. I say, “increases the chances” because we still can't guarantee that we will actually receive the notification even though the OS woke up when we told it to.
And:
Remember that the PIC is used to wake up the OS so that it can decide what thread should be run next. The OS uses some very complex rules to determine what thread gets to occupy the processor next. Two of the things that the OS looks at to determine if it should run a thread or not are thread priority and thread quantum.
So, even if you put the resolution down to the maximum of 1ms, you are not guaranteed that your thread will be the one chosen to do its work.
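For completeness, here is a hedged sketch of what a periodic multimedia timer via pinvoke can look like. The MmTimer wrapper and the empty callback body are placeholders; only timeSetEvent/timeKillEvent are the real winmm.dll functions.

using System;
using System.Runtime.InteropServices;

static class MmTimer
{
    private delegate void TimerCallback(uint id, uint msg, UIntPtr user, UIntPtr dw1, UIntPtr dw2);

    [DllImport("winmm.dll")]
    private static extern uint timeSetEvent(uint delayMs, uint resolutionMs,
        TimerCallback callback, UIntPtr user, uint eventType);

    [DllImport("winmm.dll")]
    private static extern uint timeKillEvent(uint timerId);

    private const uint TIME_PERIODIC = 1;

    private static TimerCallback _callback;   // keep a reference so the GC doesn't collect the delegate
    private static uint _timerId;

    public static void Start()
    {
        _callback = (id, msg, user, dw1, dw2) =>
        {
            // 1 ms period at 1 ms resolution: this must return well before the next tick
        };
        _timerId = timeSetEvent(1, 1, _callback, UIntPtr.Zero, TIME_PERIODIC);
    }

    public static void Stop()
    {
        timeKillEvent(_timerId);
    }
}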
I suppose that you are using a system timer that runs its callbacks on a single dedicated thread.
You set the timer interval to 1 ms. Before your change, each callback took 0.3 ms to complete, so the callbacks of the 4 threads took 4 * 0.3 = 1.2 ms in total. They managed to complete within 1-2 timer intervals and could all start again after that.
After your change, each callback takes 1.2 ms by itself. So there are pending requests to run the callbacks of threads 2-4, plus another request from thread 1 (because its interval has elapsed again). Which request gets served next depends on the timer implementation; it turns out to be the one from the first thread.
In my application I have a "heartbeat" functionality that is currently implemented as a long-running thread, in the following way (pseudocode):
while (shouldBeRunning)
{
    Thread.Sleep(smallInterval);
    if (DateTime.UtcNow - lastHeartbeat > heartbeatInterval)
    {
        sendHeartbeat();
        lastHeartbeat = DateTime.UtcNow;
    }
}
Now, when my application goes through a CPU-intensive period (several minutes of heavy calculations during which the CPU is more than 90% busy), the heartbeats get delayed, even though smallInterval << heartbeatInterval.
To crunch some numbers: heartbeatInterval is 60 seconds, smallInterval is 0.1 seconds, and the reported delay can be up to 15 s. So, in my understanding, that means a Sleep(10) can end up lasting like a Sleep(15000) when the CPU is very busy.
I have already tried setting the thread priority to AboveNormal. How can I improve my design to avoid this problem?
Is there any reason you can't use a Timer for this? There are three sorts you can use, and I usually go for System.Timers.Timer. The following article discusses the differences between them:
http://msdn.microsoft.com/en-us/magazine/cc164015.aspx
Essentially timers will allow you to set up a timer with a periodic interval and fire an event whenever that period ticks past. You can then subscribe to the event with a delegate that calls sendHeartbeat().
Timers should serve you better since they won't be affected by the CPU load in the same way as your sleeping thread. It has the advantage of being a bit neater in terms of code (the timer set up is very simple and readable) and you won't have a spare thread lying around.
You seem to be trying to reinvent one of the timer classes.
How about using System.Timers.Timer for example?
// Timer interval is in milliseconds; fire once per heartbeat period instead of polling.
var timer = new System.Timers.Timer(heartbeatInterval.TotalMilliseconds);   // assuming heartbeatInterval is a TimeSpan
timer.Elapsed += (s, a) => sendHeartbeat();   // note the (): the handler must actually call the method
timer.Enabled = true;
One of the issues here may be, at a guess, how often your thread gets scheduled when the CPU is under load. Your timer implementation is inherently single-threaded and blocking. Moving to one of the framework timers should alleviate this, as (taking the above timer as an example) the Elapsed event is raised on a threadpool thread, of which there are many.
Unfortunately, Windows is not a real-time OS, so there are few guarantees about when threads are executed. Thread.Sleep() only schedules the earliest time at which the thread may be woken up again; it is up to the OS to wake the thread when there is a free time slice. The exact criteria for waking a sleeping thread are probably not documented, so that the Windows kernel team can change the implementation as they see fit.
I'm not sure that Timer objects will solve this as the heartbeat thread still needs to be activated after the timer has expired.
One solution is to elevate the priority of the heartbeat thread so that it gets a chance of executing more often.
However, heartbeats are usually used to determine whether a sub-system has got stuck, in an infinite loop for example, so they are generally low priority. When you have a CPU-intensive section, call a short Thread.Sleep(1) at key points to give lower-priority threads a chance to execute (Thread.Sleep(0) only yields to threads of equal priority).
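A tiny sketch of the two alternatives above, with made-up names; HeartbeatLoop stands in for the polling loop from the question.

using System;
using System.Threading;

class HeartbeatHost
{
    static void Main()
    {
        // One option: give the heartbeat thread a better chance of being scheduled under load.
        var heartbeatThread = new Thread(HeartbeatLoop)
        {
            IsBackground = true,
            Priority = ThreadPriority.AboveNormal
        };
        heartbeatThread.Start();

        // The other option: inside the CPU-heavy calculation, yield at key points so
        // lower-priority threads get a slice; Sleep(0) only yields to equal priority.
        for (int i = 0; i < 1000; i++)
        {
            // ... a chunk of the heavy calculation ...
            Thread.Sleep(1);
        }
    }

    static void HeartbeatLoop()
    {
        // stand-in for the loop from the question
        while (true)
        {
            Console.WriteLine("heartbeat");
            Thread.Sleep(TimeSpan.FromSeconds(60));
        }
    }
}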
Is there a reliable alternative to the Timer classes in .NET?
We are having issues with System.Timers.Timer and System.Threading.Timer, e.g. they start immediately, or sometimes only fire after a long period of inactivity (after 49 days).
I've seen that there seem to be a lot of issues with them, for example here: http://social.msdn.microsoft.com/Forums/en/netfxbcl/thread/dbc398a9-74c0-422d-89ba-4e9f2499a6a3
We cannot use the Forms timer.
We are thinking of pausing the thread for a certain period of time instead of using a timer...
This article describes the difference between the timer classes in the .Net framework.
But maybe another approach can help:
If you have to wait for such a long time, it would be better to calculate the DateTime at which you want something to start. Your task then wakes up every second (or at whatever accuracy is needed) and compares the current time with the target time. If the current time is equal to or greater than the target, start your job; otherwise go back to sleep (until the next second, hour, millisecond, whatever). By the way, that is roughly how the Microsoft Task Scheduler works.
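A rough sketch of that approach, with placeholder names (dueTime, DoWork); the polling interval determines the accuracy.

using System;
using System.Threading;

class DelayedJob
{
    static void Main()
    {
        DateTime dueTime = DateTime.UtcNow.AddDays(49);   // when the job should run

        while (DateTime.UtcNow < dueTime)
        {
            Thread.Sleep(TimeSpan.FromSeconds(1));        // 1 s accuracy is enough here
        }

        DoWork();                                         // placeholder for the real job
    }

    static void DoWork() { /* ... */ }
}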
I think you need System.Diagnostics.Stopwatch.
The Stopwatch measures elapsed time by counting timer ticks in the underlying timer mechanism. If the installed hardware and operating system support a high-resolution performance counter, then the Stopwatch class uses that counter to measure elapsed time. Otherwise, the Stopwatch class uses the system timer to measure elapsed time. Use the Frequency and IsHighResolution fields to determine the precision and resolution of the Stopwatch timing implementation.
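A minimal usage sketch; the work being timed is elided.

using System;
using System.Diagnostics;

class StopwatchDemo
{
    static void Main()
    {
        Console.WriteLine("High resolution: {0}, frequency: {1} ticks/s",
            Stopwatch.IsHighResolution, Stopwatch.Frequency);

        var sw = Stopwatch.StartNew();
        // ... the work being timed ...
        sw.Stop();

        Console.WriteLine("Elapsed: {0:F3} ms", sw.Elapsed.TotalMilliseconds);
    }
}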
You could replace the timer with a new instance every 48 days in the Tick/Elapsed event handler.
The threading will most likely work, but is it going to be worth the extra overhead?
I'm also aware of those limitations and have implemented my own timer. It sucked to have to take that decision, but I have not regretted it so far. As a bonus, my timer implements an interface, so classes using the timer can be unit-tested easily.
Some .NET timers also have a problem where they may invoke the callback before the previous callback has finished.
I would recommend putting the timer logic in its own class rather than pausing a thread.
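One possible shape for that (illustrative names only, not from any particular library): a small interface the rest of the code depends on, plus a production implementation wrapping System.Timers.Timer.

using System;

public interface IHeartbeatTimer
{
    event EventHandler Tick;
    void Start(TimeSpan interval);
    void Stop();
}

// Production implementation wrapping System.Timers.Timer.
public sealed class TimersHeartbeatTimer : IHeartbeatTimer, IDisposable
{
    private readonly System.Timers.Timer _inner = new System.Timers.Timer();

    public TimersHeartbeatTimer()
    {
        _inner.Elapsed += (s, e) => Tick?.Invoke(this, EventArgs.Empty);
    }

    public event EventHandler Tick;

    public void Start(TimeSpan interval)
    {
        _inner.Interval = interval.TotalMilliseconds;
        _inner.Start();
    }

    public void Stop()
    {
        _inner.Stop();
    }

    public void Dispose()
    {
        _inner.Dispose();
    }
}

A test can then substitute a fake IHeartbeatTimer that raises Tick on demand, which takes real timing out of the unit tests entirely.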
It's actually a noticeable difference that I've seen but cannot explain. These timers have their intervals set to 1 ms (the lowest available), but while the window is minimized they seem to tick faster. Could anyone explain this phenomenon to me? And if possible, explain how to reproduce the effect while the window is maximized?
Is this a Forms.Timer?
I doubt it is actually running faster; more likely the Timer event is being handled in a more timely manner. While minimised there will presumably be fewer messages handled by the form window's message pump, which could leave a larger share of time for handling the Timer messages. There is also the issue of the minimum Timer resolution.
If applicable, try using one of the other Timer types, such as System.Timers.Timer.
The Windows Forms Timer component is single-threaded, and is limited to an accuracy of 55 milliseconds. If you require a multithreaded timer with greater accuracy, use the Timer class in the System.Timers namespace.
Ref.
If I remember correctly, the minimum resolution you can get out of a System.Windows.Forms.Timer (which I assume is what you're using here) is 55 ms. Setting it to 1 ms essentially means that it ticks continuously.
Of course, a timer doesn't guarantee that ticks will arrive at exactly the interval specified. If your app is busy doing other things (like redrawing the screen) then it may take a few more ms, or significantly more under heavy load. If the timer is set to an interval of 1 second, you won't really notice this, but at the minimum window (55 ms), you might.
When the application is minimized, there are fewer other events that can interrupt the timer events before they fire.
Does anyone know how a System.Windows.Forms.Timer affects the host application and the system in general?
A threaded background loop, on the one hand, shows very high CPU usage, whilst a Timer with a very high tick rate shows no visible effect in Windows Task Manager.
Does a high tick-rate timer clutter up the Windows message loop, or what is going on?
Define "high tick rate timer" :).
The problem with timer components relying on WM_TIMER (such as the Windows.Forms one) is manifold (see the small measurement sketch after this list):
- You will not be able to get a resolution better than 50 msec out of it, ever.
- If your system is under load (e.g. heavy redrawing, running over RDP links, etc.) you might get WM_TIMER messages only once every 500 msec or more, no matter how low you've set the interval.
- WM_TIMER messages are synthetic messages and might not get delivered to your application at all for prolonged periods of time if your message queue gets flooded with other messages.
- If your timer method takes longer than one timer interval to execute, the timer will "skip" the message, i.e. you will not get another WM_TIMER message until you've returned from the first one. In other words, you will never get two WM_TIMER messages delivered back to back.
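To see this in practice, here is a throwaway sketch that logs the real spacing between Tick events of a System.Windows.Forms.Timer set to 100 msec; the form and its title-bar logging are made up for the demonstration. Under load (dragging the window, heavy repainting) the gaps grow well past the requested interval.

using System;
using System.Diagnostics;
using System.Windows.Forms;

class TickSpacingForm : Form
{
    public TickSpacingForm()
    {
        var sw = Stopwatch.StartNew();
        long last = 0;

        var timer = new System.Windows.Forms.Timer { Interval = 100 };
        timer.Tick += (s, e) =>
        {
            long now = sw.ElapsedMilliseconds;
            Text = string.Format("Tick spacing: {0} ms", now - last);   // often noticeably more than 100 ms
            last = now;
        };
        timer.Start();
    }

    [STAThread]
    static void Main()
    {
        Application.Run(new TickSpacingForm());
    }
}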
Overall I have not noticed many downsides to using a timer component inside my application; they are much more efficient and easier on resources than some other methods out there.
I find this Timer Comparison article from Microsoft also helpful when comparing these kinds of things.
But the long and short of it is that they don't appear to clutter things up much.