What is the best way to calculate a time difference accurate to the level of microseconds? Currently I am doing it as follows:
((TimeSpan)(DateTime.Now - _perfClock)).TotalMilliseconds
Note: _perfClock is a DateTime (set prior to the task).
This is supposed to give accuracy up to milliseconds, but in my case it shows values ending with "000", like 8000, 9000, etc.
This makes me think that something is just converting seconds to milliseconds somewhere, instead of calculating the difference in milliseconds. (Possibly I am wrong somewhere in the code above.)
But what is the recommended mechanism for an accurate time-difference calculation?
The issue is not with TimeSpan; that is accurate down to ticks, and a tick is 100 nanoseconds.
The issue is that you are using DateTime.Now for your timer.
DateTime.Now is accurate to about 16 ms, I believe. As mentioned by V4Vendetta, you want to use Stopwatch if you need "high resolution" results. Stopwatch can give you ticks (a long) or a TimeSpan. Use the TimeSpan for easy manipulation (in your case, add/subtract).
Note also that Stopwatch exposes a static IsHighResolution field, so you can check whether you have better accuracy than DateTime.Now (it is always true on a PC, IIRC).
I don't know the context in which you are measuring time, but it would be good to start off with Stopwatch and check your results.
Also worth a read: Precise Measurement
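For example, a minimal sketch of reading a Stopwatch result in microseconds (the class name and the Thread.Sleep stand-in for the measured work are illustrative only):
using System;
using System.Diagnostics;
using System.Threading;

class TimingDemo
{
    static void Main()
    {
        Stopwatch sw = Stopwatch.StartNew();

        // Stand-in for the task being measured.
        Thread.Sleep(8);

        sw.Stop();

        // Stopwatch ticks are counter-specific; convert via Stopwatch.Frequency,
        // not TimeSpan.TicksPerSecond.
        double microseconds = sw.ElapsedTicks * 1e6 / Stopwatch.Frequency;
        Console.WriteLine("Elapsed: {0:F1} us ({1} ms)", microseconds, sw.ElapsedMilliseconds);
    }
}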
Have you tried:
TimeSpan diff = DateTime.Now.Subtract(_perfClock);
I currently use a solution for getting a higher resolution timestamp in C# by taking a start time using DateTime.UtcNow and then using a Stopwatch to add ticks to it as time goes by. I came across Stopwatch.GetTimestamp() as a potential alternative or even better solution, but I cannot find reliable information on exactly what this function returns.
Best source of info seems to be this.
GetTimestamp() returns machine-dependent ticks which can be converted into seconds by dividing by the stopwatch frequency. If I do this, I get a value that appears to be a UTC UNIX timestamp which is exactly what I'm after - but I haven't seen anything that states that this is what I should expect from it.
One clue from MSDN states that:
If the Stopwatch class uses a high-resolution performance counter,
GetTimestamp returns the current value of that counter. If the
Stopwatch class uses the system timer, GetTimestamp returns the
current DateTime.Ticks property of the DateTime.Now instance.
Looking then at DateTime.Ticks, we see:
The value of this property represents the number of 100-nanosecond
intervals that have elapsed since 12:00:00 midnight, January 1, 0001
(0:00:00 UTC on January 1, 0001, in the Gregorian calendar), which
represents DateTime.MinValue.
I'm therefore not clear how simply dividing a machine-dependent tick count by the frequency can get me a Unix 1970-based timestamp. Is it possible that, if a high-performance timer is not available on the target platform, I might get a year 0001-based timestamp instead? Or maybe something else entirely, again depending on the available high-res timer?
Can you describe your use case? If you're interested in extra precision, I don't see how you could possibly get it by starting out with DateTime.UtcNow, and then, separately, calling Stopwatch.Start() -- if you add Stopwatch.Elapsed to DateTime.UtcNow, the value is going to be inaccurate, because you have no way of knowing how long after the DateTime.UtcNow call that the stopwatch actually started. If you start the stopwatch first, you have the same problem in reverse.
Generally speaking, in .NET 4.6, there is a ToUnixTimeMilliseconds call on DateTimeOffset that may be helpful (e.g. DateTimeOffset.UtcNow.ToUnixTimeMilliseconds())
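To illustrate the two different value sources discussed here, a small sketch (class name is illustrative): a Stopwatch.GetTimestamp() delta only becomes seconds when divided by Frequency, while a Unix timestamp comes from DateTimeOffset; the two are not interchangeable unless IsHighResolution is false.
using System;
using System.Diagnostics;
using System.Threading;

class TimestampDemo
{
    static void Main()
    {
        // Machine-dependent counter value: only deltas divided by Frequency are meaningful.
        long t0 = Stopwatch.GetTimestamp();
        Thread.Sleep(100);
        long t1 = Stopwatch.GetTimestamp();
        double elapsedSeconds = (t1 - t0) / (double)Stopwatch.Frequency;
        Console.WriteLine("Elapsed: {0:F6} s", elapsedSeconds);

        // Wall-clock Unix timestamp (milliseconds since 1970-01-01 UTC), .NET 4.6+.
        long unixMs = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds();
        Console.WriteLine("Unix timestamp: {0} ms", unixMs);
    }
}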
I am trying to calculate a video framerate in a program. To do this I take
DateTime.Now
at the beginning of a stream, and then again after every frame, whilst also incrementing a frame counter.
Then I calculate the FPS like so:
int fps = (int)(frames / (TimeSpan.FromTicks(CurrentTime.Ticks).Seconds - TimeSpan.FromTicks(StartTime.Ticks).Seconds));
The problem is that I occasionally get a negative number out, meaning the start time must be later than the current time. How can this be the case? Does anyone know enough about these functions to explain?
Seconds gives you the seconds component of the TimeSpan, not the total duration of the TimeSpan expressed in seconds. This means that Seconds will never be greater than 59.
Use TotalSeconds instead.
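For example, a sketch reusing the question's frames, StartTime and CurrentTime variables:
// TotalSeconds is the whole duration as a double, so it never wraps at 60
// the way the Seconds component does.
double elapsedSeconds = (CurrentTime - StartTime).TotalSeconds;
int fps = (int)(frames / elapsedSeconds);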
You should consider using Stopwatch for such needs; it has much better precision.
The DateTime functions are probably not precise enough for your needs; you may want to look into performance counters instead. I think the Stopwatch class is what you're looking for: System.Diagnostics.Stopwatch. It uses the QueryPerformanceFrequency and QueryPerformanceCounter functions for the timing.
In C++, to get the current time when my application starts, I can use
time_t appStartTime = time(NULL);
Then, to find the difference in seconds from when it started, I can just do the same thing again and take the difference. It looks like I should be using System.DateTime in C#/.NET, but MSDN is confusing in its explanation.
How can I use System.DateTime to find the difference in time (in seconds) between when my application starts, and the current time?
Use the Now property:
DateTime startTime = DateTime.Now;
//work
DateTime currentTime = DateTime.Now;
and then simply calculate the difference:
TimeSpan difference = currentTime - startTime;
If you would like to measure performance, consider using Stopwatch:
Stopwatch stopWatch = new Stopwatch();
stopWatch.Start();
//work
stopWatch.Stop();
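To read the result in seconds from either approach above, a minimal sketch reusing the variables from the two snippets:
// Wall-clock difference from the DateTime pair:
double wallClockSeconds = (currentTime - startTime).TotalSeconds;

// Measured interval from the Stopwatch:
double measuredSeconds = stopWatch.Elapsed.TotalSeconds;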
As everyone suggested... But they were a little wrong :-) Use DateTime.UtcNow, because
It's faster (DateTime.Now calls DateTime.UtcNow)
It isn't affected by DST transitions (the clock changing on/off).
OR
As @Shekhar_Pro suggested (yes, he was right!), use the Stopwatch:
var sw = Stopwatch.StartNew();
// ... your code ...
sw.Stop();
var ms = sw.ElapsedMilliseconds;
or
var ticks = sw.ElapsedTicks;
Oh... and I was forgetting... What you are doing is probably worthless in certain situations... You know, 2011 processors are multicore (and even 2010 ones :-) )... If your app is at all multithreaded, you are probably better off measuring:
Process.GetCurrentProcess().TotalProcessorTime
This includes the time used on all the cores used by your app... So on a dual core, using both cores, it will "gain" 2 seconds of processor time for every "real time" second.
If you are using this to check performance and the time taken to execute code, then your best bet is to use Stopwatch.
Otherwise, System.DateTime has a Subtract method which can be used to get a TimeSpan object, or even the simple - (subtraction) operator will do it.
That TimeSpan object then has a TotalSeconds property which you can use.
Several ways to do this:
Use DateTime.Now. Subtracting produces a TimeSpan. Takes 8 bytes of storage, times up to 8000 years, resolution of 1 millisecond but accurate to 1/64 second on most machines.
Use Environment.TickCount. Similar to time_t but relative to machine boot time. Takes 4 bytes of storage, times up to 24 days (49 with a cast), resolution and accuracy the same as DateTime.
Use Stopwatch. Stored on the heap, resolution is machine dependent but almost always well below a microsecond. Accuracy isn't usually good but repeats decently, assume +/- 5%. Best used to measure small intervals for comparison.
Use timeGetTime. This multimedia timer requires P/Invoke. Similar to Environment.TickCount, but you can get 1 ms accuracy by using timeBeginPeriod. That is not cheap, since it has system-wide effects. Best avoided.
Keep in mind that process execution is subject to the vagaries of overall operating system load, your program is sharing resources with the other 70-odd processes that are running. Either DateTime or TickCount has plenty of accuracy for that.
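As a small illustration of the Environment.TickCount option from the list above (a sketch; the class name and the Thread.Sleep stand-in are illustrative, and plain int subtraction keeps the delta correct across the wraparound):
using System;
using System.Threading;

class TickCountDemo
{
    static void Main()
    {
        int start = Environment.TickCount;

        // Stand-in for the work being timed.
        Thread.Sleep(250);

        // int subtraction stays correct even if TickCount wraps in between.
        int elapsedMs = unchecked(Environment.TickCount - start);
        Console.WriteLine("Elapsed: {0} ms", elapsedMs);
    }
}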
DateTime startTime = DateTime.Now;
//some code
TimeSpan difference = DateTime.Now - startTime;
int seconds = (int)difference.TotalSeconds; // truncates toward zero
I want to measure the performance of my code, if I consider time as a criterion.
I have this code
DateTime oldDate = new DateTime(2002,7,15);
DateTime newDate = DateTime.Now;
// Difference in days, hours, and minutes.
TimeSpan ts = newDate - oldDate;
// Difference in days.
int differenceInDays = ts.Milliseconds;
Question 1: Is this the only way that I can test the performance of my algorithm?
Question 2: What other criteria does C# provide for testing performance?
It's always better to use System.Diagnostics.Stopwatch.
Check this link for more details: Performance Tests: Precise Run Time Measurements with System.Diagnostics.Stopwatch
Use the Stopwatch class:
//Start a stopwatch:
var watch = Stopwatch.StartNew();
//Execute the code
watch.Stop(); //This stops the watch
The elapsed time can be read using the Elapsed, ElapsedMilliseconds and ElapsedTicks properties.
Try using the Stopwatch class. It has significantly higher resolution than the DateTime and TimeSpan classes.
Additionally, you can look at the Windows Performance Counters as a way of measuring performance while your application is running so that you can monitor the health of your application.
You can use a profiler (tool-based, for example SlimTune) or measure the time with System.Diagnostics.Stopwatch, which has better precision than the DateTime hack.
If you truly want to use DateTime (because it's easier to use), use UtcNow instead of Now. It's a little faster (because current date and time are stored in UTC format in Windows) and as an added bonus, you can test your program around the DST change time :-).
But yeah, use Stopwatch.
Stopwatch watch = Stopwatch.StartNew();
// ... code to measure ...
watch.Stop();
Ah... very important... your code is wrong
ts.TotalMilliseconds
I made the same error yesterday, but I was measuring times of around a second, so it was more difficult to notice :-)
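For reference, a corrected version of the question's snippet (TotalMilliseconds instead of Milliseconds):
DateTime oldDate = new DateTime(2002, 7, 15);
DateTime newDate = DateTime.Now;

TimeSpan ts = newDate - oldDate;

// Whole duration in milliseconds (a double), not just the milliseconds component.
double differenceInMs = ts.TotalMilliseconds;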
Up until now I used DateTime.Now for getting timestamps, but I noticed that if you print DateTime.Now in a loop you will see that it increments in discrete jumps of approx. 15 ms. But for certain scenarios in my application I need to get the most accurate timestamp possible, preferably with tick (= 100 ns) precision. Any ideas?
Update:
Apparently, Stopwatch / QueryPerformanceCounter is the way to go, but it can only measure elapsed time, so I was thinking about calling DateTime.Now when the application starts up, letting a Stopwatch run from then on, and then adding the Stopwatch's elapsed time to the initial value returned from DateTime.Now. At least that should give me accurate relative timestamps, right? What do you think about that (hack)?
NOTE:
Stopwatch.ElapsedTicks is different from Stopwatch.Elapsed.Ticks! I used the former assuming 1 tick = 100 ns, but there 1 tick = 1 / Stopwatch.Frequency seconds. So to get ticks equivalent to DateTime ticks, use Stopwatch.Elapsed.Ticks. I just learned this the hard way.
NOTE 2:
Using the StopWatch approach, I noticed it gets out of sync with the real time. After about 10 hours, it was ahead by 5 seconds. So I guess one would have to resync it every X or so where X could be 1 hour, 30 min, 15 min, etc. I am not sure what the optimal timespan for resyncing would be since every resync will change the offset which can be up to 20 ms.
The value of the system clock that DateTime.Now reads is only updated every 15 ms or so (or 10 ms on some systems), which is why the times are quantized around those intervals. There is an additional quantization effect that results from the fact that your code is running in a multithreaded OS, and thus there are stretches where your application is not "alive" and is thus not measuring the real current time.
Since you're looking for an ultra-accurate time stamp value (as opposed to just timing an arbitrary duration), the Stopwatch class by itself will not do what you need. I think you would have to do this yourself with a sort of DateTime/Stopwatch hybrid. When your application starts, you would store the current DateTime.UtcNow value (i.e. the crude-resolution time when your application starts) and then also start a Stopwatch object, like this:
DateTime _starttime = DateTime.UtcNow;
Stopwatch _stopwatch = Stopwatch.StartNew();
Then, whenever you need a high-resolution DateTime value, you would get it like this:
DateTime highresDT = _starttime.AddTicks(_stopwatch.Elapsed.Ticks);
You also may want to periodically reset _starttime and _stopwatch, to keep the resulting time from getting too far out of sync with the system time (although I'm not sure this would actually happen, and it would take a long time to happen anyway).
Update: since it appears that Stopwatch does get out of sync with the system time (by as much as half a second per hour), I think it makes sense to reset the hybrid DateTime class based on the amount of time that passes between calls to check the time:
public class HiResDateTime
{
    private static DateTime _startTime;
    private static Stopwatch _stopWatch = null;
    private static TimeSpan _maxIdle = TimeSpan.FromSeconds(10);

    public static DateTime UtcNow
    {
        get
        {
            if ((_stopWatch == null) ||
                (_startTime.Add(_maxIdle) < DateTime.UtcNow))
            {
                Reset();
            }
            return _startTime.AddTicks(_stopWatch.Elapsed.Ticks);
        }
    }

    private static void Reset()
    {
        _startTime = DateTime.UtcNow;
        _stopWatch = Stopwatch.StartNew();
    }
}
If you reset the hybrid timer at some regular interval (say every hour or something), you run the risk of setting the time back before the last read time, kind of like a miniature Daylight Savings Time problem.
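A minimal usage sketch of the hybrid class above:
// Each call returns a DateTime with Stopwatch-level resolution,
// re-anchored to the system clock at most every 10 seconds.
DateTime t1 = HiResDateTime.UtcNow;
DateTime t2 = HiResDateTime.UtcNow;
Console.WriteLine((t2 - t1).Ticks);  // typically a small, non-zero number of 100 ns ticks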
To get a high-resolution tick count, use the static Stopwatch.GetTimestamp() method:
long tickCount = System.Diagnostics.Stopwatch.GetTimestamp();
// Note: this only equals DateTime ticks when IsHighResolution is false (see the source below).
DateTime highResDateTime = new DateTime(tickCount);
Just take a look at the .NET source code:
public static long GetTimestamp() {
    if(IsHighResolution) {
        long timestamp = 0;
        SafeNativeMethods.QueryPerformanceCounter(out timestamp);
        return timestamp;
    }
    else {
        return DateTime.UtcNow.Ticks;
    }
}
Source Code here: http://referencesource.microsoft.com/#System/services/monitoring/system/diagnosticts/Stopwatch.cs,69c6c3137e12dab4
[The accepted answer does not appear to be thread safe, and by its own admission can go backwards in time causing duplicate timestamps, hence this alternative answer]
If what you really care about (per your comment) is in fact, a unique timestamp that is allocated in strict ascending order and which corresponds as closely as possible to the system time, you could try this alternative approach:
public class HiResDateTime
{
    private static long lastTimeStamp = DateTime.UtcNow.Ticks;

    public static long UtcNowTicks
    {
        get
        {
            long orig, newval;
            do
            {
                orig = lastTimeStamp;
                long now = DateTime.UtcNow.Ticks;
                newval = Math.Max(now, orig + 1);
            } while (Interlocked.CompareExchange(
                         ref lastTimeStamp, newval, orig) != orig);
            return newval;
        }
    }
}
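Usage sketch: the returned value is in DateTime ticks, so it can be converted back for display or storage:
// Strictly increasing across threads, so it can double as a unique key.
long ticks = HiResDateTime.UtcNowTicks;
DateTime stamp = new DateTime(ticks, DateTimeKind.Utc);
Console.WriteLine(stamp.ToString("o"));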
These suggestions all look too hard! If you're on Windows 8 or Server 2012 or higher, use GetSystemTimePreciseAsFileTime as follows:
[DllImport("Kernel32.dll", CallingConvention = CallingConvention.Winapi)]
static extern void GetSystemTimePreciseAsFileTime(out long filetime);
public DateTimeOffset GetNow()
{
long fileTime;
GetSystemTimePreciseAsFileTime(out fileTime);
return DateTimeOffset.FromFileTime(fileTime);
}
This has much, much better accuracy than DateTime.Now without any effort.
See MSDN for more info: http://msdn.microsoft.com/en-us/library/windows/desktop/hh706895(v=vs.85).aspx
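A quick usage sketch of the GetNow() helper above; the "o" round-trip format exposes the sub-millisecond digits that DateTime.Now's ~15 ms clock cannot provide:
DateTimeOffset preciseNow = GetNow();
Console.WriteLine(preciseNow.ToString("o"));   // round-trip format: yyyy-MM-ddTHH:mm:ss.fffffff+hh:mm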
DateTime.Now does return the most accurate date and time known to the operating system.
The operating system also provides higher resolution timing through QueryPerformanceCounter and QueryPerformanceFrequency (.NET Stopwatch class). These let you time an interval but do not give you date and time of day. You might argue that these would be able to give you a very accurate time and day, but I am not sure how badly they skew over a long interval.
1) If you need high-resolution absolute accuracy: you can't use DateTime.Now when it is based on a clock with a 15 ms interval (unless it is possible to "slide" the phase). Instead, an external source of better-resolution absolute time (e.g. NTP), t1 below, could be combined with the high-resolution timer (Stopwatch / QueryPerformanceCounter).
2) If you just need high resolution: sample DateTime.Now (t1) once, together with a value from the high-resolution timer (Stopwatch / QueryPerformanceCounter), tt0. If the current value of the high-resolution timer is tt, then the current time, t, is:
t = t1 + (tt - tt0)
3) An alternative could be to disentangle absolute time and the order of the financial events: one value for absolute time (15 ms resolution, possibly off by several minutes) and one value for the order (for example, incrementing a value by one each time and storing that; see the sketch after this list). The start value for the order can be based on some system-global value or be derived from the absolute time when the application starts.
This solution would be more robust, as it is not dependent on the underlying hardware implementation of the clocks/timers (which may vary between systems).
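A hedged sketch of option 3), pairing a coarse absolute time with a strictly increasing sequence number (all names here are illustrative, not from the original answer):
using System;
using System.Threading;

// Illustrative only: each event gets a coarse wall-clock time plus a
// monotonically increasing order value, so ordering never depends on clock resolution.
public struct OrderedEvent
{
    public DateTime UtcTime;   // ~15 ms resolution is fine here
    public long Order;         // strict ascending order across threads
}

public static class EventSequencer
{
    private static long _order;   // could be seeded from a persisted/global value at startup

    public static OrderedEvent Next()
    {
        return new OrderedEvent
        {
            UtcTime = DateTime.UtcNow,
            Order = Interlocked.Increment(ref _order)
        };
    }
}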
This is too much work for something so simple. Just insert a DateTime in your database with the trade. Then, to obtain the trade order, use your identity column, which should be an incrementing value.
If you are inserting into multiple databases and trying to reconcile after the fact, then you will be miscalculating the trade order due to any slight variance in your database times (even with ns increments, as you said).
To solve the multiple-database issue, you could expose a single service that each trade hits to get an order. The service could return a DateTime.UtcNow.Ticks value and an incrementing trade number.
Even using one of the solutions above, anyone conducting trades from a location on the network with more latency could place a trade first (in the real world) but have it entered in the wrong order because of that latency. Because of this, the trade must be considered placed at the database, not at the users' consoles.
The 15 ms (actually it can be 15-25 ms) accuracy is based on the Windows 55 Hz/65 Hz timer resolution. This is also the basic TimeSlice period. Affected are Windows.Forms.Timer, Threading.Thread.Sleep, Threading.Timer, etc.
To get accurate intervals you should use the Stopwatch class. It will use high-resolution if available. Try the following statements to find out:
Console.WriteLine("H = {0}", System.Diagnostics.Stopwatch.IsHighResolution);
Console.WriteLine("F = {0}", System.Diagnostics.Stopwatch.Frequency);
Console.WriteLine("R = {0}", 1.0 /System.Diagnostics.Stopwatch.Frequency);
I get R=6E-08 sec, or 60 ns. It should suffice for your purpose.
I'd add the following regarding MusiGenesis's answer, about the re-sync timing.
Meaning: what interval should I use to re-sync (the _maxIdle in MusiGenesis's answer)?
You know that with this solution you are not perfectly accurate; that's why you re-sync.
But what you also implicitly want is the same thing as Ian Mercer's solution:
a unique timestamp that is allocated in strict ascending order
Therefore, the amount of time between two re-syncs (_maxIdle; let's call it SyncTime) should be a function of four things:
the DateTime.UtcNow resolution
the accuracy ratio you want
the precision level you want
an estimate of the out-of-sync ratio of your machine
Obviously, the first constraint on this variable would be:
out-of-sync ratio <= accuracy ratio
For example: I don't want my accuracy to be worse than 0.5 s/hour, or 1 ms/day, etc. (in other words: I don't want to be more wrong than 0.5 s/hour = 12 s/day).
So you cannot achieve better accuracy than what the Stopwatch offers you on your PC. It depends on your out-of-sync ratio, which might not be constant.
Another constraint is the minimum time between two re-syncs:
SyncTime >= DateTime.UtcNow resolution
Here accuracy and precision are linked, because if you are using high precision (for example, to store in a DB) but lower accuracy, you might break Ian Mercer's requirement of strict ascending order.
Note: it seems DateTime.UtcNow may have a finer default resolution than 15 ms (1 ms on my machine). Follow the link:
High accuracy DateTime.UtcNow
Let's take an example:
Imagine the out-of-sync ratio quoted above:
After about 10 hours, it was ahead by 5 seconds.
Say I want microsecond precision. My timer resolution is 1 ms (see the note above).
So point by point:
the DateTime.UtcNow resolution: 1 ms
accuracy ratio >= out-of-sync ratio; let's take the most accurate possible, so: accuracy ratio = out-of-sync ratio
the precision level you want: 1 microsecond
an estimate of the out-of-sync ratio of your machine: 0.5 s/hour (this is also my accuracy)
If you reset every 10 s, imagine you are at 9.999 s, 1 ms before the reset.
Here you make a call during this interval. The time your function reports is ahead by 0.5/3600 x 9.999 s, which is about 1.39 ms.
You would display a time of 10.000390 s. After the UtcNow tick, if you make a call within those 390 microseconds, you will get a number lower than the previous one. It is worse if this out-of-sync ratio is random, depending on CPU load or other things.
Now let's say I set SyncTime to its minimum value: I re-sync every 1 ms.
Doing the same reasoning puts me ahead of real time by 0.139 microseconds, which is below the precision I want. Therefore, if I call the function at 9.999 ms, i.e. 1 microsecond before the reset, I will report 9.999, and just after the reset I will report 10.000. The order is preserved.
So here the other constraint is: accuracy ratio x SyncTime < precision level. To be safe, because numbers can be rounded up, let's say accuracy ratio x SyncTime < precision level / 2.
The issue is resolved.
So a quick recap would be:
Retrieve your timer resolution.
Compute an estimate of the out-of-sync ratio.
accuracy ratio >= out-of-sync ratio estimate; best accuracy = out-of-sync ratio
Choose your precision level considering the following:
timer resolution <= SyncTime <= precision level / (2 x accuracy ratio)
The best precision you can achieve is timer resolution x 2 x out-of-sync ratio.
For the above ratio (0.5 s/hour) the correct SyncTime would be 3.6 ms, so rounded down to 3 ms.
With the above timer resolution of 1 ms, if you want a one-tick precision level (0.1 microseconds), you need an out-of-sync ratio of no more than 180 ms/hour.
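As a small numeric check of the recap formula, using the figures from this answer (a sketch; variable names are illustrative):
double driftRatio = 0.5 / 3600.0;        // out-of-sync ratio: 0.5 s per hour, dimensionless
double precisionSeconds = 1e-6;          // target precision level: 1 microsecond
double timerResolutionSeconds = 1e-3;    // DateTime.UtcNow resolution: 1 ms

// timer resolution <= SyncTime <= precision level / (2 x drift ratio)
double syncTimeUpperBound = precisionSeconds / (2 * driftRatio);
Console.WriteLine("Valid SyncTime range: {0} s to {1:F4} s",
    timerResolutionSeconds, syncTimeUpperBound);   // 0.001 s to 0.0036 s, i.e. the 3.6 ms quoted above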
In the last comment on his own answer, MusiGenesis states:
#Hermann: I've been running a similar test for the last two hours (without the reset correction), and the Stopwatch-based timer is only running about 400 ms ahead after 2 hours, so the skew itself appears to be variable (but still pretty severe). I'm pretty surprised that the skew is this bad; I guess this is why Stopwatch is in System.Diagnostics. – MusiGenesis
So the Stopwatch drift is close to 200 ms/hour, almost our 180 ms/hour. Is there any link between why our number and that number are so close? I don't know. But this accuracy is enough for us to achieve tick precision.
The best precision level: for the example above, it is 0.27 microseconds.
However, what happens if I call it multiple times between 9.999 ms and the re-sync?
Two calls to the function could end up with the same timestamp being returned; the time would be 9.999 for both (as I don't see more precision). To circumvent this, you cannot touch the precision level, because it is linked to SyncTime by the relation above. So you should implement Ian Mercer's solution for those cases.
Please don't hesitate to comment on my answer.
If you need the timestamp to perform benchmarks, use Stopwatch, which has much better precision than DateTime.Now.
I think this is the best way to solve this issue:
long timestamp = DateTimeOffset.Now.ToUnixTimeMilliseconds();