Why are the milliseconds in a TimeSpan not increasing between two DateTime.Now values - C#

I have this simple bit of code:
//LastUpdate = DateTime.Now
//These two lines occur every frame
TimeSpan timeSpan = DateTime.Now - data.LastUpdate;
Debug.Log(timeSpan.Milliseconds);
But the result of this shows the milliseconds not really increasing; the value fluctuates between 100 and 900 ms. It should be ever increasing, since the time passed is increasing.
I have checked that LastUpdate does not change, so that isn't the cause.
I guess I am misunderstanding how TimeSpan works. I am trying to get the milliseconds that have passed between LastUpdate and the current frame's now.
Am I using it wrong? I don't understand the issue.

From TimeSpan docs:
.Milliseconds (emphasis mine):
Gets the milliseconds component of the time interval represented by the current TimeSpan structure.
You want to use .TotalMilliseconds:
Gets the value of the current TimeSpan structure expressed in whole and fractional milliseconds.
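To see the difference, here is a minimal sketch (the variable names are illustrative) that prints both properties for the same interval:
// Contrast Milliseconds (a component in the range 0-999) with TotalMilliseconds (the whole interval).
DateTime lastUpdate = DateTime.Now.AddSeconds(-2.5); // pretend the last update was 2.5 s ago
TimeSpan elapsed = DateTime.Now - lastUpdate;
Console.WriteLine(elapsed.Milliseconds);      // ~500  (just the millisecond component)
Console.WriteLine(elapsed.TotalMilliseconds); // ~2500 (the full elapsed time in milliseconds)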

Related

Why is the conversion from ulong to DateTime returning 0?

I'm trying to convert a ulong to DateTime. Since the DateTime constructor accepts ticks as a parameter, which is a long, here's how I do it:
ulong time = 12354;
new DateTime((long)time).ToString("HH:mm:ss");
The result of this is 00:00:00.
I don't understand the result, am I doing something wrong?
P.S. i.Time is not 0, I checked multiple times.
Citing the documentation:
Initializes a new instance of the DateTime structure to a specified number of ticks.
ticks
Type: System.Int64
A date and time expressed in the number of 100-nanosecond intervals that have elapsed since January 1, 0001 at 00:00:00.000 in the Gregorian calendar.
This is 100 nanoseconds which is a super small time unit. So unless your number is larger than 10000000, you don’t even get a single second:
Console.WriteLine(new DateTime((long)10000000).ToString());
// 01.01.0001 00:00:01
So you should really think about what your “time left” (i.Time) value is supposed to mean. Is it really a time in units of 100 nanoseconds? Very likely not. It's probably seconds, or something completely different.
By the way, if the number you have does not actually represent a moment in time, you should not use DateTime; use TimeSpan instead. Its long constructor has the same tick-based behavior, but you can use one of the handy static factory methods to create a time span with the correct unit:
var ts = TimeSpan.FromSeconds(1000);
Console.WriteLine(ts.ToString());
// 00:16:40
Because a tick is 100 nanoseconds, 12354 ticks is only 1,235,400 nanoseconds, which is only 0.0012354 seconds. So your DateTime is 0.0012354 seconds after midnight on 1 January in the year one.
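If the value is actually expressed in a coarser unit, scale it into ticks (or use one of the TimeSpan factory methods) before building a DateTime. A small sketch, assuming purely for illustration that the ulong holds seconds:
// Hypothetical: treat the ulong as a count of seconds rather than ticks.
ulong time = 12354;                                          // e.g. 12354 seconds
long ticks = (long)time * TimeSpan.TicksPerSecond;           // scale into 100 ns ticks
Console.WriteLine(new DateTime(ticks).ToString("HH:mm:ss")); // 03:25:54
// Or, if it is a duration rather than a point in time:
Console.WriteLine(TimeSpan.FromSeconds(time));               // 03:25:54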

Why is DateTime based on Ticks rather than Milliseconds?

Why is the minimum resolution of a DateTime based on Ticks (100-nanosecond units) rather than on Milliseconds?
TimeSpan and DateTime use the same Ticks, making operations like adding a TimeSpan to a DateTime trivial.
More precision is good. It's mainly useful for TimeSpan, but the above reason transfers it to DateTime.
For example, Stopwatch measures short time intervals, often shorter than a millisecond, and it can return a TimeSpan.
In one of my projects I used TimeSpan to address audio samples. 100 ns is short enough for that; milliseconds wouldn't be.
Even using millisecond ticks you need an Int64 to represent DateTime. But then you're wasting most of the range, since years outside 0 to 9999 aren't really useful. So they chose ticks as small as possible while still allowing DateTime to represent the year 9999.
There are about 2^61.5 ticks of 100 ns in that range. Since DateTime needs two bits for time-zone-related tagging, 100 ns ticks are the smallest power-of-ten interval that fits in an Int64.
So using longer ticks would decrease precision without gaining anything, and shorter ticks wouldn't fit in 64 bits. => 100 ns is the optimal value given the constraints.
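A quick sketch of the arithmetic behind that range argument (just illustrative console checks):
// How much of an Int64 do 100 ns ticks actually use?
Console.WriteLine(DateTime.MaxValue.Ticks); // 3155378975999999999, roughly 2^61.5
Console.WriteLine(long.MaxValue);           // 9223372036854775807, roughly 2^63
// One power of ten finer (10 ns ticks) would need roughly 2^64.8 distinct values,
// which no longer fits in an Int64 at all.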
From MSDN;
A single tick represents one hundred nanoseconds or one ten-millionth
of a second. There are 10,000 ticks in a millisecond.
The Ticks property represents the number of ticks that have elapsed since midnight on January 1st in the year 0001. The tick is also the smallest unit for TimeSpan. Since ticks are stored in an Int64, if milliseconds were used instead of ticks there could be a loss of information.
It could also simply be the default CLS implementation.
Just for the information:
1 millisecond = 10 000 ticks
1 second = 10 000 000 ticks
Using the difference (delta) of two tick values you can get more granular precision (and later convert it to milliseconds or seconds).
In a C# DateTime context, ticks start at 0 (DateTime.MinValue.Ticks) and go up to DateTime.MaxValue.Ticks:
new DateTime(0) // any number between 0 and (864*10^9 - 1) produces the same date, 01/01/0001
new DateTime(DateTime.MaxValue.Ticks) // the maximum tick value generates 12/31/9999
System time ticks are incremented by 864 billion ticks per day.
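The framework exposes these conversion factors as constants on TimeSpan, so a small sketch can confirm the numbers above:
// The conversion factors quoted above, taken from TimeSpan's constants.
Console.WriteLine(TimeSpan.TicksPerMillisecond); // 10000
Console.WriteLine(TimeSpan.TicksPerSecond);      // 10000000
Console.WriteLine(TimeSpan.TicksPerDay);         // 864000000000 (864 billion ticks per day)
Console.WriteLine(DateTime.MinValue.Ticks);      // 0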
Ticks exist for higher time resolution, even though you don't need it most of the time.
The tick is also what the system clock works with.

Recommended method to calculate the difference between two timespans

What is the best way to calculate a time difference that is accurate to the level of microseconds? Currently I am doing it as follows:
((TimeSpan)(DateTime.Now - _perfClock)).TotalMilliseconds
Note: _perfClock is a DateTime (set prior to the task).
This is supposed to give accuracy down to milliseconds, but in my case it shows values ending in "000", like 8000, 9000, etc.
This forces me to think that it is just converting seconds to milliseconds somewhere, instead of calculating the difference in milliseconds. (Possibly I am wrong somewhere in the code above.)
But what is the recommended mechanism for an accurate time-difference calculation?
-Sumeet
The issue is not with TimeSpan; that is accurate down to ticks, which are 100 nanoseconds.
The issue is that you are using DateTime.Now for your timer.
DateTime.Now is accurate to about 16 ms, I believe. As mentioned by V4Vendetta, you want to use Stopwatch if you need "high resolution" results. Stopwatch can provide you with ticks (a long) or a TimeSpan; use the TimeSpan for easy manipulation (in your case, add/subtract).
Note also that Stopwatch exposes IsHighResolution, so you can check whether you actually get better accuracy than DateTime.Now (it's always true on a PC, IIRC).
I don't know the context in which you are measuring time, but it would be good to start off with Stopwatch and check your results.
Also worth a read Precise Measurement
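A minimal sketch of the Stopwatch approach (the work being timed is just a placeholder):
// Time a piece of work with Stopwatch instead of DateTime.Now.
var sw = System.Diagnostics.Stopwatch.StartNew();
// ... the task being measured goes here ...
sw.Stop();
Console.WriteLine(sw.Elapsed.TotalMilliseconds); // fractional milliseconds, e.g. 8123.4567
Console.WriteLine(sw.ElapsedTicks);              // raw high-resolution timer ticks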
Have you tried:
TimeSpan diff = DateTime.Now.Subtract(_perfClock);

C# - Time calculation not working with TimeSpan.FromTicks DateTime

I am trying to calculate a video framerate in a program. To do this I take
DateTime.Now
at the beginning of a stream, and then again after every frame whilst also incrementing a framecounter.
Then I calculate the FPS like so:
int fps = (int)(frames / (TimeSpan.FromTicks(CurrentTime.Ticks).Seconds - TimeSpan.FromTicks(StartTime.Ticks).Seconds));
The problem is that I occasionally get a negative number out, meaning the start time must be later than the current time. How can this be the case? Does anyone know enough about these functions to explain?
Seconds gives you the seconds component of the TimeSpan, not the total duration of the TimeSpan expressed in seconds. This means that Seconds will never be greater than 59.
Use TotalSeconds instead.
You should also consider using Stopwatch for such needs; it has much better precision.
The DateTime functions are probably not precise enough for your needs; you may want to look into performance counters instead. I think the Stopwatch class is what you're looking for: System.Diagnostics.Stopwatch. It uses the QueryPerformanceFrequency and QueryPerformanceCounter functions for the timing.
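Putting those suggestions together, a rough sketch of the frame-rate calculation using Stopwatch and TotalSeconds (the frame counting is simplified):
// Measure FPS with Stopwatch and TotalSeconds instead of the Seconds component.
var clock = System.Diagnostics.Stopwatch.StartNew();
int frames = 0;
// ... inside the per-frame callback:
frames++;
double elapsedSeconds = clock.Elapsed.TotalSeconds; // whole and fractional seconds
if (elapsedSeconds > 0)
{
    int fps = (int)(frames / elapsedSeconds);
    Console.WriteLine(fps);
}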

DateTime, minutes since update?

I have this DateTime object (Updated) which is set to DateTime.Now when I update my application.
I also have this timer Tick event, called every 5 seconds, which should check how many minutes ago the update was.
I've tried with:
if ((DateTime.Now - Updated).Minutes > 0)
{
updateTextBlock.Text = "updated " + ((DateTime.Now - Updated).Minutes).ToString() + " minutes ago";
}
But it does not seem to work correctly. Isn't there a better way to do this?
/R
I suspect you want TotalMinutes instead of Minutes. Otherwise you'll only ever get a value in the range -59 to 59.
You may also want to consider using UtcNow instead of Now - otherwise you could get odd effects due to time zone changes (either the user changing time zone, or the time zone changing its UTC offset, usually for daylight saving time.)
You may find it easier to use an instance of System.Diagnostics.StopWatch to keep track of how much time has elapsed since any particular starting point.
It can be more reliable and accurate than doing math on DateTime objects because it'll use the hardware's High Resolution Timer if one is available.
You're probably looking for TotalMinutes, not just Minutes. TotalMinutes will give you the total number of minutes in the interval, whereas Minutes only gives 0-59 (since you also have Hours, etc.)
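A small sketch of that suggestion, also using UtcNow as recommended above (Updated and updateTextBlock are the names from the question, and Updated is assumed to have been stored as DateTime.UtcNow):
// Report whole minutes elapsed since Updated, using TotalMinutes.
int minutesAgo = (int)(DateTime.UtcNow - Updated).TotalMinutes;
if (minutesAgo > 0)
{
    updateTextBlock.Text = "updated " + minutesAgo + " minutes ago";
}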
One way I did something similar was using TimeSpan.
I had two variables, timePassed and tickTime, where I set tickTime to be a 5-second TimeSpan and timePassed to be 0:
TimeSpan tickTime = new TimeSpan(0,0,0,5); // 5 seconds
TimeSpan timePassed = new TimeSpan(0,0,0,0); // 0 seconds
then in the handler for the tick event I would add tickTime to timePassed
timePassed = timePassed.Add(tickTime); // adds 5 seconds to the timePassed TimeSpan
Then you can use timePassed to get the time since update.
Hope this helps
