Is .NET DateTime truncating my seconds? - c#

I am trying to format some precise dates, converting them from a Unix timestamp to a DateTime object. I noticed that the AddSeconds method has an overload that accepts a floating point number.
My expectation is that I may pass in a number such as 1413459415.93417 and it will give me a DateTime object with tick-level precision. Is this a decent assumption, or does the AddSeconds method still provide no better than millisecond precision? In the conversion, do I have to add the ticks myself?
My conversion code is below:
public static DateTime CalendarDateFromUnix(double unixTime)
{
    DateTime calendarTime = UnixEpoch.AddSeconds(unixTime);
    return calendarTime;
}
I expect to format the ToString of this date like 16 Oct 2014 11:36:55.93417 using the format string below:
dd MMM yyyy HH:mm:ss.fffff
Instead of giving me 16 Oct 2014 11:36:55.93417, it is giving me 16 Oct 2014 11:36:55.93400
Am I doing something wrong or is .NET truncating my floating-point seconds representation? I am new to .NET, so the former is quite possible.
Thanks

From the documentation of DateTime.AddSeconds:
The value parameter is rounded to the nearest millisecond.
An alternative would be to multiply by TimeSpan.TicksPerSecond and then add that to the ticks of UnixEpoch:
return new DateTime(
    UnixEpoch.Ticks + (long)(unixTime * TimeSpan.TicksPerSecond),
    DateTimeKind.Utc);

Get Milliseconds from Ticks in DateTime c# [duplicate]

I have a simple DateTime object, equal to the date: 11/1/2020 8:11:14 AM.
I want to convert it to milliseconds so I do:
myTimestamp?.Ticks / TimeSpan.TicksPerMillisecond.
I get 63739786274788, which seems correct from the pure calculation perspective.
However, when I input it into one of the online converters to validate, I get the date Wed Nov 01 3989 01:11:14, which is of course way off.
Questions:
What is this number 63739786274788, if not the time in ms?
How do I get a "normal" timestamp in ms?
In .NET, DateTime ticks are based on an epoch of 0001-01-01T00:00:00.0000000. The .Kind property is used to decide whether that is UTC, local time, or "unspecified".
Most online converters, such as the one you linked to, are expecting a Unix Timestamp, where the value is based on an epoch of 1970-01-01T00:00:00.000Z. It is always UTC based. (The precision varies, both seconds and milliseconds are commonly used.)
If you want to get a milliseconds-based Unix timestamp from .NET, then instead of dividing you should use the built-in methods DateTimeOffset.FromUnixTimeMilliseconds and DateTimeOffset.ToUnixTimeMilliseconds. (There are also seconds-based versions of these methods.)
Assuming your input values are UTC-based:
DateTime dt = new DateTime(2020, 11, 1, 8, 11, 14, DateTimeKind.Utc);
DateTimeOffset dto = new DateTimeOffset(dt);
long timestamp = dto.ToUnixTimeMilliseconds();
// output: 1604218274000
DateTimeKind.Local will also work with this, assuming your values are indeed based on the computer's local time zone. DateTimeKind.Unspecified is a bit trickier, as you'll need to convert to a DateTimeOffset with a specific time zone using TimeZoneInfo first.
You could also construct the DateTimeOffset value directly, rather than go through DateTime at all.
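Here is a sketch of both points. The time zone id is an assumption chosen purely for illustration ("Eastern Standard Time" is the Windows id; use "America/New_York" on Linux/macOS):

DateTime unspecified = new DateTime(2020, 11, 1, 8, 11, 14, DateTimeKind.Unspecified);
TimeZoneInfo tz = TimeZoneInfo.FindSystemTimeZoneById("Eastern Standard Time");
// Apply the zone's offset for that wall-clock time, then convert.
DateTimeOffset viaZone = new DateTimeOffset(unspecified, tz.GetUtcOffset(unspecified));
long msFromZone = viaZone.ToUnixTimeMilliseconds();

// Or construct the DateTimeOffset directly, skipping DateTime entirely:
DateTimeOffset direct = new DateTimeOffset(2020, 11, 1, 8, 11, 14, TimeSpan.Zero);
long msDirect = direct.ToUnixTimeMilliseconds(); // 1604218274000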
Okay, so you start off by dividing Ticks by TicksPerMillisecond (10,000).
As you can see, the number you generated is much larger than the current milliseconds:
63739786274788
1607363529803
The short answer is that Ticks are counted from 12:00:00 midnight on January 1, 0001, while your online calculator is based on Unix time, which starts at January 1, 1970. That explains why you're about 2,000 years off. If you subtract the ticks of new DateTime(1970, 1, 1) from your value's ticks before dividing, you get roughly the right number for the online calculator.
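A minimal sketch of that subtraction (though the DateTimeOffset methods in the other answer are the cleaner route):

DateTime dt = new DateTime(2020, 11, 1, 8, 11, 14, DateTimeKind.Utc);
DateTime unixEpoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
// Subtract the epoch's ticks before dividing down to milliseconds.
long unixMs = (dt.Ticks - unixEpoch.Ticks) / TimeSpan.TicksPerMillisecond;
// unixMs == 1604218274000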
For more info, I would suggest reading through MS's docs on DateTime

Does anyone know of a library to convert my (ISO8601???) timestamp string to UNIX timestamp format?

I have a timestamp being extracted from an XML document in the format
2019-02-13T09:01:53.557+00:00
Does anyone know of a library to convert this to unix time or will I have to fiddle with the string?
Since I'm not generating the timestamp myself, the answers I have been finding aren't helping me much.
I think the format is ISO 8601, but I'm not certain, as it seems to have a few extra digits, which I'm guessing are millisecond resolution (since it relates to a machine tool). Is this a problem?
Using DateTimeOffset.ToUnixTimeSeconds from .NET 4.6 and above, this should be fairly simple:
if (DateTime.TryParse("2019-02-13T09:01:53.557+00:00", out var dateTime))
{
    var unixTimestamp = ((DateTimeOffset)dateTime).ToUnixTimeSeconds();
}
If for whatever reason your string format changes, you can also do an exact parse using DateTime.TryParseExact:
if (DateTime.TryParseExact("2019-02-13T09:01:53.557+00:00", "yyyy-MM-dd'T'HH:mm:ss.fffzzz",
        CultureInfo.InvariantCulture, DateTimeStyles.None, out var dateTime))
{
    var unixTimestamp = ((DateTimeOffset)dateTime).ToUnixTimeSeconds();
}
Basically, you should consider what a Unix timestamp is (from Wikipedia):
It is the number of seconds that have elapsed since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds
So a solution would probably be parsing the string to a DateTime, converting it to UTC, and then calculating the seconds.
var date = DateTime.Parse("2019-02-13T09:01:53.557+00:00", null, System.Globalization.DateTimeStyles.RoundtripKind);
// A long avoids overflowing int for dates past January 2038.
var unixTimestamp = (long)date.ToUniversalTime().Subtract(new DateTime(1970, 1, 1)).TotalSeconds;

Getting Milliseconds elapsed between arbitrary date and Epoch time

If I write a simple method to return the milliseconds between epoch time and DateTime.UtcNow, I get a proper answer. However, if I write a method to return the milliseconds between some arbitrary date and epoch time, the last three digits are always zero. 'Some arbitrary date' means that I pass the method the output of DateTime.Parse("arbitrary date string"). As near as I can make out, the DateTime object returned by .Parse is not retaining all the significant digits.
Test method:
static void GetMillis()
{
    DateTime dUtc = DateTime.UtcNow;
    DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
    double utcmillis = (dUtc - epoch).TotalMilliseconds;
    String timestamp = dUtc.ToString();
    DateTime arbitrary = DateTime.Parse(timestamp);
    Console.WriteLine("Milliseconds between DateTime.UtcNow {0} \nand epoch time {1} are {2}", dUtc, epoch, utcmillis);
    Console.WriteLine("Milliseconds between arbitrary date {0} \nand epoch time {1} are {2}", arbitrary, epoch, (arbitrary - epoch).TotalMilliseconds);
}
Output:
C:\src\vs\epochConverter\epochConverter\bin\Debug
{powem} [54] --> .\epochConverter.exe -debug
Milliseconds between DateTime.UtcNow 8/26/2012 11:12:31 PM
and epoch time 1/1/1970 12:00:00 AM are 1346022751385.8
Milliseconds between arbitrary date 8/26/2012 11:12:31 PM
and epoch time 1/1/1970 12:00:00 AM are 1346022751000
I don't know if I'm doing something grotesquely wrong or not understanding the math here. I've researched in MSDN and can't find anything relating to this difference. I really would like to be able to compute the millis as described -- is it possible?
Thanks.
mp
You want to examine the intermediate values of:
String timestamp = dUtc.ToString();
Just what it returns will depend on your local settings, but it'll be something like 8/26/2012 11:12:31, which is only accurate to the nearest second.
Parsing that of course gives a date-time with 0 milliseconds.
It is therefore correct that your milliseconds-since-epoch method has zeros at that point.
If however you did something like:
arbitrary = new DateTime(2012, 8, 26, 11, 12, 31, 123);
You'd get those 123 milliseconds influencing the result. You can also use a ToString and a ParseExact that includes fractions of a second, or a whole slew of other ways of obtaining a DateTime.
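For example, a quick sketch of such a ToString/ParseExact round trip (the format string here is just an illustration; it needs using System.Globalization):

string s = dUtc.ToString("yyyy-MM-dd HH:mm:ss.fff", CultureInfo.InvariantCulture);
DateTime arbitrary = DateTime.ParseExact(s, "yyyy-MM-dd HH:mm:ss.fff", CultureInfo.InvariantCulture);
// arbitrary now retains the milliseconds, so (arbitrary - epoch).TotalMilliseconds
// no longer ends in 000.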
In all, your milliseconds-since-epoch worked perfectly, but your way of getting a date to test it was flawed.
The default DateTime.ToString() format does not include the milliseconds and this is where the data is being lost; it happens before the Parse. To obtain the milliseconds in the string representation, use a custom format:
DateTime.UtcNow.ToString()
// -> 8/26/2012 11:37:24 PM
DateTime.Parse("8/26/2012 11:37:24 PM").Millisecond
// -> 0
DateTime.UtcNow.ToString("yyyy-MM-ddTHH:mm:ss.fffffffK")
// -> 2012-08-26T23:41:17.3085938Z
DateTime.Parse("2012-08-26T23:41:17.3085938Z").Millisecond
// -> 308
See The Round-trip ("O", "o") Format Specifier for a shorter way to write that format. Or, in this case, consider avoiding the string conversion entirely :-)
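A short sketch of that round-trip specifier:

string s = DateTime.UtcNow.ToString("O");
// e.g. "2012-08-26T23:41:17.3085938Z"
DateTime roundTripped = DateTime.Parse(s, null, System.Globalization.DateTimeStyles.RoundtripKind);
// roundTripped preserves all seven fractional digits and the UTC kind.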
The math is sound.
The math looks reasonable here. Don't forget there are 1000 milliseconds in a second, so comparing a date computed from an arbitrary time that does not include milliseconds against an almost identical time that does include them will show an error of up to 1000 milliseconds.

C# strange with DateTime

I got a strange result for:
Console.WriteLine(new DateTime(1296346155).ToString());
Result is:
01.01.0001 0:02:09
But that is not right!
I parsed the value 1296346155 from a file, which says it is in UTC.
Please explain. ;)
Thank you for the help!
DateTime expects "A date and time expressed in the number of 100-nanosecond intervals that have elapsed since January 1, 0001 at 00:00:00.000 in the Gregorian calendar." (from MSDN)
This question shows how you can convert a unix timestamp to a DateTime.
The DateTime constructor that accepts a long expects a ticks value, not seconds or even milliseconds, and it does not count from 1/1/1970 as in other languages.
So 1296346155 ticks is only about 129 seconds.
If it's Unix time, then the following should yield the expected result:
DateTime baseTime = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
Console.WriteLine(baseTime.AddSeconds(1296346155));
See Unix Time for more information.
That constructor is not what you want, as your value is not measured in ticks.
DateTime start = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
start = start.AddSeconds(1296346155).ToLocalTime();
Console.WriteLine(start);
// You don't need to use ToString() in a Console.WriteLine call
I've found the following question, which shows how to convert between a Unix timestamp (the one you have) and a .NET DateTime:
How to convert a Unix timestamp to DateTime and vice versa?
That is correct - what were you expecting it to be and why?
The constructor System.DateTime(Int64) takes the number of 100-nanosecond intervals (known as Ticks) since January 1st 0001 (in the Gregorian calendar).
Dividing 1296346155 by 10,000,000 therefore gives the number of seconds, which is about 129.6.
So this should display 2 minutes and 9 seconds after midnight on 1st January 0001, which is exactly what you saw.
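A quick check of that arithmetic:

Console.WriteLine(new DateTime(1296346155).ToString("HH:mm:ss.fffffff"));
// -> 00:02:09.6346155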

How can I convert a DateTime value to a double?

How do I convert a DateTime value to a double?
If, by double you mean an OLE Automation date, then you can use DateTime.ToOADate(). From the linked MSDN topic:
An OLE Automation date is implemented as a floating-point number whose value is the number of days from midnight, 30 December 1899. For example, midnight, 31 December 1899 is represented by 1.0; 6 A.M., 1 January 1900 is represented by 2.25; midnight, 29 December 1899 is represented by -1.0; and 6 A.M., 29 December 1899 is represented by -1.25.
The base OLE Automation Date is midnight, 30 December 1899. The maximum OLE Automation Date is the same as MaxValue, the last moment of 31 December 9999.
If you're talking about some other date representation that can also be stored in a double, please specify...
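For a quick sketch of ToOADate and its inverse, matching the example values in the documentation quoted above:

double oa = new DateTime(1900, 1, 1, 6, 0, 0).ToOADate(); // 2.25
DateTime back = DateTime.FromOADate(2.25); // 1 January 1900, 06:00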
DateTime.ToOADate() converts to a double OLE Automation Date in C#
Yes, an OLE Automation date lets you convert a DateTime to a decimal/double type. However, the resulting value is not the raw system datetime.
For example,
decimal dateInDec = Convert.ToDecimal(DateTime.Today.ToOADate());
is not equal to the MS SQL
SELECT CONVERT(DECIMAL(10, 9), GETDATE())
Conclusion: an OLE Automation date is not a true system datetime representation, so I cannot use it.
I've searched everywhere for a way to convert a DateTime value to a decimal value in C# without any luck.
You can calculate the difference between your DateTime and DateTime.MinValue, and then take any of the Total* values.
var difference = DateTime.Now - DateTime.MinValue;
Console.WriteLine(difference.TotalMinutes);
Console.WriteLine(difference.TotalMilliseconds);