Timestamp like 6.3527482515083E+17 - C#

I'm getting timestamp numbers like the following via WSDL from a C# application:
6.3527482515083E+17
6.3527482515047E+17
6.352748251638E+17
6.3527482514463E+17
All of them are times in the past (this year, probably).
I think this is a datetime counted from year zero. I tried counting up the seconds from year zero and got something around 63537810544, but that is not exact because it ignores leap years.
Is there any function in PHP to get a UNIX timestamp from this, or to convert it to a string datetime?
I get the values via WSDL, so I can't reformat them at the source...

They are 100-nanosecond ticks (1/10,000,000 of a second) counted from 12:00:00 midnight, January 1, 0001. This information can be found in the MSDN documentation for DateTime.Ticks:
A single tick represents one hundred nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond, or 10 million ticks in a second.
The value of this property represents the number of 100-nanosecond intervals that have elapsed since 12:00:00 midnight, January 1, 0001, which represents DateTime.MinValue. It does not include the number of ticks that are attributable to leap seconds.
The magic constant that represents the number of 100ns ticks between 12:00:00 midnight, January 1, 0001 and 12:00:00 midnight, January 1, 1970 (the Unix epoch) is 621355968000000000. So if you subtract that from your numbers, you get 100ns ticks since the beginning of Unix epoch time. Divide that by 10,000,000 and you get seconds, which is usable in PHP. Sample code below demonstrates this ($unixepoch is in seconds):
<?php
$msdatetime = 6.3527482515083E+17;
$unixepoch = ($msdatetime - 621355968000000000)/10000000 ;
echo date("Y-m-d H:i:s", $unixepoch);
?>
Output:
2014-02-08 11:55:15
I found this constant listed on this helpful site dealing with the problem of getting unix epoch time from other formats.
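For comparison, the same arithmetic can be checked on the C# side. A minimal sketch of my own (not part of the original answer), using one of the tick values from the question written out as a long:
long ticks = 635274825150830000;  // 6.3527482515083E+17 written out in full
var utc = new DateTime(ticks, DateTimeKind.Utc);
long unixSeconds = (ticks - 621355968000000000L) / 10000000L;
Console.WriteLine(utc);          // 2014-02-08 18:55:15 UTC (display format depends on culture)
Console.WriteLine(unixSeconds);  // 1391885715
The PHP output above shows a different hour because date() formats the result in the server's local time zone.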

Related

Why is the conversion from ulong to DateTime returning 0?

I'm trying to convert a ulong to a DateTime. Since the DateTime constructor accepts ticks as a long parameter, here's how I do it:
ulong time = 12354;
new DateTime((long)time).ToString("HH:mm:ss");
The result of this is 00:00:00.
I don't understand the result, am I doing something wrong?
P.S. i.Time is not 0, I checked multiple times.
Citing the documentation:
Initializes a new instance of the DateTime structure to a specified number of ticks.
ticks
Type: System.Int64
A date and time expressed in the number of 100-nanosecond intervals that have elapsed since January 1, 0001 at 00:00:00.000 in the Gregorian calendar.
This is 100 nanoseconds which is a super small time unit. So unless your number is larger than 10000000, you don’t even get a single second:
Console.WriteLine(new DateTime((long)10000000).ToString());
// 01.01.0001 00:00:01
So you should really think about what your "time" value (i.Time) is supposed to mean. Is it really a time in units of 100 nanoseconds? Very likely not. It's probably seconds, or something completely different.
By the way, if the number you have does not actually represent a moment in time, you should not use DateTime; you should use TimeSpan instead. Its long constructor has the same ticks behavior, but you can use one of the handy static factory methods to create a time span with the correct unit:
var ts = TimeSpan.FromSeconds(1000);
Console.WriteLine(ts.ToString());
// 00:16:40
Because a tick is 100 nanoseconds, 12354 ticks is only 1,235,400 nanoseconds, which is only 0.0012354 seconds. So your DateTime is 0.0012354 seconds after midnight on 1 January in the year 1.
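To make the scale concrete, here is a small sketch (mine, assuming the value really is 12354 ticks) showing the same number through TimeSpan:
var ts = TimeSpan.FromTicks(12354);
Console.WriteLine(ts.TotalMilliseconds);                              // 1.2354
Console.WriteLine(new DateTime(12354).ToString("HH:mm:ss.fffffff"));  // 00:00:00.0012354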

Why is DateTime based on Ticks rather than Milliseconds?

Why is the minimum resolution of a DateTime based on Ticks (100-nanosecond units) rather than on Milliseconds?
TimeSpan and DateTime use the same Ticks making operations like adding a TimeSpan to a DateTime trivial.
More precision is good. Mainly useful for TimeSpan, but above reason transfers that to DateTime.
For example StopWatch measures short time intervals often shorter than a millisecond. It can return a TimeSpan.
In one of my projects I used TimeSpan to address audio samples. 100ns is short enough for that, milliseconds wouldn't be.
Even with millisecond ticks you would need an Int64 to represent DateTime. But then you would be wasting most of the range, since years outside 1 to 9999 aren't really useful. So they chose ticks as small as possible while still allowing DateTime to represent the year 9999.
There are about 2^61.5 100ns ticks in that range. Since DateTime needs two bits for time-zone-related tagging (the Kind flags), 100ns ticks are the smallest power-of-ten interval that fits in an Int64.
So using longer ticks would decrease precision without gaining anything, and using shorter ticks wouldn't fit in 64 bits. 100ns is therefore the optimal value given the constraints.
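You can check that range arithmetic yourself; a quick sketch of my own (not from the original answers):
long maxTicks = DateTime.MaxValue.Ticks;                        // 3155378975999999999
Console.WriteLine(Math.Log(maxTicks, 2));                       // ~61.45, so the ticks fit in 62 bits
Console.WriteLine(maxTicks / TimeSpan.TicksPerDay / 365.2425);  // ~9999 years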
From MSDN:
A single tick represents one hundred nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond.
The Ticks property represents the number of ticks that have elapsed since midnight on January 1st in the year 0001. A tick is also the smallest unit of TimeSpan. Since ticks are stored in an Int64, using milliseconds instead of ticks would mean losing information.
It could also simply be the default CLS implementation.
Just for the information:
1 millisecond = 10 000 ticks
1 second = 10 000 000 ticks
Using the difference (delta) of two tick values you can get more granular precision (converting it to milliseconds or seconds later).
In a C# DateTime context, ticks start from 0 (DateTime.MinValue.Ticks) and go up to DateTime.MaxValue.Ticks:
new DateTime(0) // any value between 0 and (864*10^9 - 1) produces the same date, 01/01/0001
new DateTime(DateTime.MaxValue.Ticks) // the maximum tick value gives 12/31/9999
System time ticks are incremented by 864 billion ticks per day.
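Those conversion factors are also exposed as constants on TimeSpan; a minimal sketch (my addition) that just prints them:
Console.WriteLine(TimeSpan.TicksPerMillisecond); // 10000
Console.WriteLine(TimeSpan.TicksPerSecond);      // 10000000
Console.WriteLine(TimeSpan.TicksPerDay);         // 864000000000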
It's for higher time resolution, even though you don't need it most of the time. The tick is what the system clock works with.

DateTime.Ticks, DateTime.Equals and timezones

How come the following code (in C#) returns false:
DateTime d = DateTime.Now;
d.Ticks == d.ToUniversalTime().Ticks; // false
I'd expect the ticks of a DateTime to be based on UTC time.
The MSDN page on DateTime.Ticks says:
The value of this property represents the number of 100-nanosecond intervals that have elapsed since 12:00:00 midnight, January 1, 0001, which represents DateTime.MinValue. It does not include the number of ticks that are attributable to leap seconds.
Midnight on January first, 0001 .. in which timezone ?
Why would DateTime.Ticks be timezone dependant ?
I guess that the fact that the Ticks are different is why the following code also returns false
DateTime d = DateTime.Now;
d == d.ToUniversalTime(); // false
The MSDN doc on DateTime.Equals mentions
t1 and t2 are equal if their Ticks property values are equal. Their Kind property values are not considered in the test for equality.
My expectation was that DateTime.Ticks would be equal, no matter the timezone.
I'd expect two moments in time to be equal no matter on what timezone they happened. Are my expectations wrong ?
source: http://social.msdn.microsoft.com/Forums/en/netfxbcl/thread/fde7e5b0-e2b9-4d3b-8a63-c2ae75e316d8
DateTime.Ticks is documented as the "number of 100-nanosecond intervals that have elapsed since 12:00:00 midnight, January 1, 0001". That is 1-Jan-0001 local time. If you convert your DateTime to UTC, Ticks will then be the number of 100-nanosecond intervals that have elapsed since 12:00:00 midnight, January 1, 0001 UTC. That is potentially different from 1-Jan-0001 local time, ergo the two Ticks values will be different.
Your current DateTime (unless you live in the GMT/UTC time zone) is offset from UTC by some number of hours, so DateTime.Now might put you at 4 AM while DateTime.Now.ToUniversalTime() could be at 11 PM, depending on your current time zone.
The Ticks are calculated after the conversion from your time zone to universal time, so the only time they should be equal is if you live in the GMT time zone.
Put more simply, the number of ticks up to 1/1/2011 8:00 AM is not the same as the number of ticks up to 1/1/2011 11:00 PM. In your code, the date is converted to universal time and the ticks are calculated from that on the right side of the equation, but the left side just uses your local date, hence they're != each other.
DateTime.Now is determined based on your time zone offset which means it won't be the same as universal time unless your offset is zero. It wouldn't make sense to convert DateTime.Now to ticks in two different time zones and get the same result - they are the same absolute time (UTC), but not the same relative time (using the time zone offset).
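If you want to compare the two values as the same instant rather than as wall-clock readings, a sketch of my own (assuming d has Kind Local, as DateTime.Now does):
DateTime d = DateTime.Now;
DateTime u = d.ToUniversalTime();
Console.WriteLine(d.Ticks == u.Ticks);                              // false: Ticks reflect the wall-clock value
Console.WriteLine(d.ToUniversalTime().Ticks == u.Ticks);            // true: normalize both sides to UTC first
Console.WriteLine(new DateTimeOffset(d) == new DateTimeOffset(u));  // true: DateTimeOffset equality compares the instant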

Convert Date to Milliseconds

I am working with Visual Studio 2010, MVC 3 and C#. I am creating some Highcharts charts and need the x-axis to be a date. I am pulling the dates from a database and adding them to an array that will then be passed to Highcharts. I think Highcharts requires the dates to be in millisecond format. How do I go about converting a DateTime of '12/20/2011 5:10:13 PM', for example, to milliseconds?
Once you figure out what you want to calculate milliseconds from, you can just take one DateTime object from another to get a TimeSpan object. From TimeSpan you can get TotalMilliseconds.
In other words, if start and end are DateTime objects, you can do this:
double milliseconds = (end - start).TotalMilliseconds;
You can use the DateTime.Ticks property and convert the value to milliseconds.
The value of this property represents the number of 100-nanosecond intervals that have elapsed since 12:00:00 midnight, January 1, 0001, which represents DateTime.MinValue. It does not include the number of ticks that are attributable to leap seconds.
A single tick represents one hundred nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond.
The .Ticks property of a C# DateTime gives you the value of any time in ticks. You can then convert to milliseconds as shown below:
long dateticks = DateTime.Now.Ticks;
long datemilliseconds = dateticks / TimeSpan.TicksPerMillisecond;
DateTime[] dates = ...; // your dates loaded from the database
var minDate = dates.Min();
var msDates = dates.Select(date => (date - minDate).TotalMilliseconds).ToArray();
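Note that Ticks / TicksPerMillisecond gives milliseconds counted from the year 0001, not from the Unix epoch. If Highcharts expects milliseconds since 1970-01-01 UTC (as its datetime axis typically does), a sketch along these lines (mine, treating the sample value from the question as UTC) subtracts that baseline:
DateTime value = new DateTime(2011, 12, 20, 17, 10, 13, DateTimeKind.Utc);  // "12/20/2011 5:10:13 PM", assumed UTC
DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
double epochMs = (value - epoch).TotalMilliseconds;
Console.WriteLine(epochMs);  // 1324401013000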

C# strange with DateTime

I got some strange result for:
Console.WriteLine(new DateTime(1296346155).ToString());
Result is:
01.01.0001 0:02:09
But that is not right!
I parsed the value 1296346155 from a file, which said the value is in UTC.
Please explain ;)
Thank you for the help!
DateTime expects "A date and time expressed in the number of 100-nanosecond intervals that have elapsed since January 1, 0001 at 00:00:00.000 in the Gregorian calendar." (from msdn)
This question shows how you can convert a unix timestamp to a DateTime.
The DateTime constructor that accepts a long is expecting a ticks value, not seconds or even milliseconds, and it does not count from 1/1/1970 as in other languages.
So 1296346155 ticks is only about 129 seconds.
If it's Unix time, then the following should yield the expected result;
DateTime baseTime = new DateTime(1970, 1, 1, 0, 0, 0); // Unix epoch
Console.WriteLine(baseTime.AddSeconds(1296346155));
See Unix Time for more information.
That constructor is not what you want as the time is not measured in ticks.
DateTime start = new DateTime(1970,1,1,0,0,0,0);
start = start.AddSeconds(1296346155).ToLocalTime();
Console.WriteLine(start);
// You don't need to use ToString() in a Console.WriteLine call
I've found the following question, which shows how to convert between a Unix timestamp (which is what you have) and a .NET DateTime:
How to convert a Unix timestamp to DateTime and vice versa?
That is correct - what were you expecting it to be and why?
The constructor System.DateTime(Int64) takes the number of 100-nanosecond intervals (known as Ticks) since January 1st 0001 (in the Gregorian calendar).
Therefore, 1296346155 / 10,000,000 gives you the number of seconds, which is about 129.6.
So this displays 2 minutes and 9 seconds after midnight on 1st January 0001.
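On .NET Framework 4.6+ and .NET Core, DateTimeOffset also has built-in Unix-time helpers, so the manual epoch arithmetic can be avoided; a short sketch:
DateTimeOffset dto = DateTimeOffset.FromUnixTimeSeconds(1296346155);
Console.WriteLine(dto.UtcDateTime);  // 2011-01-30 00:09:15 UTC (display format depends on culture)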
