System.DateTime Kind Bits - C#

Due to difficulties I experienced trying to call the DotNetOAuth CryptoKey constructor I started to investigate the .NET System.DateTime structure. According to what I've read, this object is actually represented by a 64 bit signed integer, with the "Ticks" encoded in the lower 62 bits and the Kind encoded in the upper 2 bits (IOW, it's a concatenation of the 2 bit Kind and 62 bit ticks).
Now I wanted to actually "see" this so I constructed a small C# program that created three System.DateTime objects as so:
DateTime dtUtc = new System.DateTime(2014, 4, 29, 9, 10, 30, System.DateTimeKind.Utc);
DateTime dtLocal = new System.DateTime(2014, 4, 29, 9, 10, 30, System.DateTimeKind.Local);
DateTime dtU = new System.DateTime(2014, 4, 29, 9, 10, 30, System.DateTimeKind.Unspecified);
I then dumped the ticks property for each and, as expected, they were all equal. Finally, I applied .ToBinary()
long bitUtc = dtUtc.ToBinary();
long bitLocal = dtLocal.ToBinary();
long bitU = dtU.ToBinary();
These longs were all different, again as expected. HOWEVER, I then tried to "inspect" the upper two bits to see which state corresponded to what settings, and found that the upper two bits were set the same in all three. I used the following routine to return the bit status:
public static bool IsBitSet<T>(this T t, int pos) where T : struct, IConvertible
{
var value = t.ToInt64(CultureInfo.CurrentCulture);
return (value & (1 << pos)) != 0;
}
(I got this from another post on SO), and called it like this:
Boolean firstUtc = Class1.IsBitSet<long>(bitUtc, 63);
Boolean secondUtc = Class1.IsBitSet<long>(bitUtc, 62);
Boolean firstLocal = Class1.IsBitSet<long>(bitLocal, 63);
Boolean secondLocal = Class1.IsBitSet<long>(bitLocal, 62);
Boolean firstU = Class1.IsBitSet<long>(bitU, 63);
Boolean secondU = Class1.IsBitSet<long>(bitU, 62);
Again, the first and second bits were set the same in all three (first was true, second false). I don't understand this, as I THOUGHT these would all be different, corresponding to the different DateTimeKind values.
Finally, I did some more reading and found (or at least it was said in one source) that MS doesn't serialize the Kind information in .ToBinary(). OK, but then why are the outputs of the .ToBinary() method all different?
I would appreciate info from anyone who could point me in the direction of a resource that would help me understand where I've gone wrong.

These longs were all different, again as expected. HOWEVER, I then tried to "inspect" the upper two bits to see which state corresponded to what settings, and found that the upper two bits were set the same in all three.
I really don't think that's the case - not with the results of ToBinary. Here's a short but complete program demonstrating the difference, using your source data, showing the results as hex (as if unsigned):
using System;
class Test
{
static void Main()
{
DateTime dtUtc = new System.DateTime(2014, 4, 29, 9, 10, 30, System.DateTimeKind.Utc);
DateTime dtLocal = new System.DateTime(2014, 4, 29, 9, 10, 30, System.DateTimeKind.Local);
DateTime dtU = new System.DateTime(2014, 4, 29, 9, 10, 30, System.DateTimeKind.Unspecified);
Console.WriteLine(dtUtc.ToBinary().ToString("X16"));
Console.WriteLine(dtLocal.ToBinary().ToString("X16"));
Console.WriteLine(dtU.ToBinary().ToString("X16"));
}
}
Output:
48D131A200924700
88D131999ECDDF00
08D131A200924700
The top two bits are respectively 01, 10 and 00. The other bits change for the local case too, as per Marcin's post - but the top two bits really do indicate the kind.
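If you just want to read the kind straight off the ToBinary() result, one way is an unsigned shift, which brings the top two bits down without sign extension (a sketch; KindBits is a made-up helper name):

```csharp
using System;

class KindBitsDemo
{
    // Made-up helper: cast to ulong so the right shift isn't sign-extended,
    // leaving only the top two bits (0 = Unspecified, 1 = Utc, 2 = Local).
    static int KindBits(long binary) => (int)((ulong)binary >> 62);

    static void Main()
    {
        DateTime dt = new DateTime(2014, 4, 29, 9, 10, 30, DateTimeKind.Utc);
        Console.WriteLine(KindBits(dt.ToBinary())); // 1
    }
}
```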
The IsBitSet method is broken because it's left-shifting an int literal rather than a long literal. That means the shift will be mod 32, rather than mod 64 as intended. Try this instead:
public static bool IsBitSet<T>(this T t, int pos) where T : struct, IConvertible
{
var value = t.ToInt64(CultureInfo.CurrentCulture);
return (value & (1L << pos)) != 0;
}
Finally, I did some more reading and found (or at least it was said in one source) that MS doesn't serialize the Kind information in .ToBinary().
It's easy to demonstrate that's not true:
using System;
class Test
{
static void Main()
{
DateTime start = DateTime.UtcNow;
Show(DateTime.SpecifyKind(start, DateTimeKind.Utc));
Show(DateTime.SpecifyKind(start, DateTimeKind.Local));
Show(DateTime.SpecifyKind(start, DateTimeKind.Unspecified));
}
static void Show(DateTime dt)
{
Console.WriteLine(dt.Kind);
DateTime dt2 = DateTime.FromBinary(dt.ToBinary());
Console.WriteLine(dt2.Kind);
Console.WriteLine("===");
}
}

ToBinary() works differently for different DateTimeKind values. You can see it in the .NET source code:
public Int64 ToBinary() {
if (Kind == DateTimeKind.Local) {
// Local times need to be adjusted as you move from one time zone to another,
// just as they are when serializing in text. As such the format for local times
// changes to store the ticks of the UTC time, but with flags that look like a
// local date.
// To match serialization in text we need to be able to handle cases where
// the UTC value would be out of range. Unused parts of the ticks range are
// used for this, so that values just past max value are stored just past the
// end of the maximum range, and values just below minimum value are stored
// at the end of the ticks area, just below 2^62.
TimeSpan offset = TimeZoneInfo.GetLocalUtcOffset(this, TimeZoneInfoOptions.NoThrowOnInvalidTime);
Int64 ticks = Ticks;
Int64 storedTicks = ticks - offset.Ticks;
if (storedTicks < 0) {
storedTicks = TicksCeiling + storedTicks;
}
return storedTicks | (unchecked((Int64) LocalMask));
}
else {
return (Int64)dateData;
}
}
That's why you get different bits - local time is adjusted before being transformed into bits, so it no longer matches the UTC time.


How do I get the current point in time in a recurring time interval?

I have a specific problem I've been trying to solve, and I think I have the right pieces - I'm just putting them together incorrectly. It might be more of a math question than a coding one.
So basically what I want to be able to do is find where "now" is within an arbitrary recurring time period (say, 43 minutes), given a known DateTime at which this period started. So you have an anchor point, and from that anchor point you know that every 43 minutes the period starts over; where is "now" in the current period?
I'm sure it involves division and/or modulo, and likely a subtraction using the anchor...so I've been toying with this code, but it isn't giving me the results I'm looking for:
using System;
public class Program
{
public static void Main()
{
TimeSpan interval = new TimeSpan(0, 43, 0);
DateTime anchor = new DateTime(2018, 1, 5, 7, 0, 49);
DateTime now = DateTime.Now;
TimeSpan left = new TimeSpan((now - anchor).Ticks % interval.Ticks);
Console.WriteLine(left);
}
}
Can someone tell me the piece I'm missing here? I'm not entirely sure what mathematical operations DateTime supports, or which ones I should be using in this instance.
Thanks.
Assuming your anchor is always in the past, you can use this code. It will print the number of completed periods and how much time has passed since the beginning of the last period:
var anchor = new DateTime(2018, 1, 5, 13, 20, 17);
var interval = new TimeSpan(0, 43, 0);
var now = DateTime.Now;
var seconds = (ulong) (now - anchor).TotalSeconds;
var intervalSeconds = (ulong) interval.TotalSeconds;
var cycles = seconds / intervalSeconds;
var momentInInterval = TimeSpan.FromSeconds(seconds % intervalSeconds);
Console.WriteLine($"{cycles} cycles passed. {momentInInterval} passed from last period");
I'm using TotalSeconds; you can change it to TotalMilliseconds to be more accurate.
If you want to know how much time remains until the next iteration, you could do something like this:
TimeSpan interval = new TimeSpan(0, 43, 0);
DateTime anchor = new DateTime(2018, 1, 5, 7, 0, 49);
var iterations = DateTime.Now.Subtract(anchor).Ticks / interval.Ticks;
DateTime current = anchor;
for(var i = 0; i < iterations; i++)
current = current.AddTicks(interval.Ticks);
// current is now the start of the current period; the next one begins one interval later
var untilNext = current.AddTicks(interval.Ticks).Subtract(DateTime.Now);
Console.Write(untilNext);
You can output the number of past iterations, the time elapsed since the last one, and how much remains until the next.
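The loop above can also be replaced by modulo arithmetic on ticks, which gives both values directly (a sketch, assuming as above that the anchor is in the past):

```csharp
using System;

class NextIterationDemo
{
    static void Main()
    {
        TimeSpan interval = new TimeSpan(0, 43, 0);
        DateTime anchor = new DateTime(2018, 1, 5, 7, 0, 49);

        long elapsed = (DateTime.Now - anchor).Ticks;
        // Time since the current period began...
        TimeSpan sinceLast = TimeSpan.FromTicks(elapsed % interval.Ticks);
        // ...and time remaining until the next period begins.
        TimeSpan untilNext = interval - sinceLast;

        Console.WriteLine($"{sinceLast} since last, {untilNext} until next");
    }
}
```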

DateTime precision difference .NET vs Java

I'm porting some calculation routines from .NET to Java, but there seem to be some precision problems in the date classes. Maybe I have stared myself blind on this, but I can't figure out why the results differ.
How should I handle the dates to get the same numbers (milliseconds) across the platforms?
.Net
[Test] public void foo() {
DateTime dt1 = new DateTime(2011, 2, 26, 19, 25, 24);
DateTime dt2 = new DateTime(2011, 2, 28, 18, 40, 25);
double millis = (dt2 - dt1).TotalMilliseconds;
Assert.AreEqual(170101000, millis);
}
Java
@Test public void foo() throws Exception {
Date d1 = createDate(2011, 2, 26, 19, 25, 24);
Date d2 = createDate(2011, 2, 28, 18, 40, 25);
long millis = d2.getTime() - d1.getTime();
Assert.assertEquals(166501000, millis, 0.01);
}
private static Date createDate(int year, int month, int day,
int hour, int minute, int second) {
Calendar instance = Calendar.getInstance();
instance.clear();
instance.set(year, month, day, hour, minute, second);
return instance.getTime();
}
Problem:
Java Calendar months start at 0 and .NET DateTime months start at 1.
So you are comparing .NET February to Java March. And, as Arjan suggested, March 27th is when the change to daylight saving time happens in many timezones.
This is a very common problem when using Java Calendar. To avoid it, you should use the named constants for the months like this:
Date d1 = createDate(2011, Calendar.MARCH, 26, 19, 25, 24); // Calendar.MARCH == 2
The difference in milliseconds is 3,600,000, so exactly 1 hour. Could it be that one language is using a different timezone setting than the other, where there is a change from or to DST?
I have done some more testing. The PHP result matches the C# result (only I get seconds instead of milliseconds) and my Java result matches your Java result. The problem seems to be a bug in the timezone handling in Java. According to Java the second time is in DST, which is not correct in my timezone (Europe/Amsterdam).
Are you sure the problem is in the DateTime structure and not in the Double structure? From the documentation on the .NET Double data type:
Floating-Point Values and Loss of Precision
Remember that a floating-point number can only approximate a decimal number, and that the precision of a floating-point number determines how accurately that number approximates a decimal number. By default, a Double value contains 15 decimal digits of precision, although a maximum of 17 digits is maintained internally. The precision of a floating-point number has several consequences:
Since you're not returning fractional milliseconds, you can use the System.Int64 (long in C#) data type. Your value is within the allowed range for that data type. The max value for an Int64 (long) is 9,223,372,036,854,775,807.
I'm confused. Are you saying that the second (Java) test passes? Because I actually get the same number as the first (C#) test: 170101000.
Here is my test (which passes). I threw in JodaTime objects as an alternative to Date and Calendar:
@Test public void foo() throws Exception {
DateTime dt1 = new DateTime(2011, 2, 26, 19, 25, 24, 0);
DateTime dt2 = new DateTime(2011, 2, 28, 18, 40, 25, 0);
Duration d = new Duration(dt1, dt2);
Assert.assertEquals(170101000, d.getMillis(), 0.01);
Date d1 = createDate(2011, 2, 26, 19, 25, 24);
Date d2 = createDate(2011, 2, 28, 18, 40, 25);
long millis = d2.getTime() - d1.getTime();
Assert.assertEquals(170101000, millis, 0.01);
}
private static Date createDate(int year, int month, int day,
int hour, int minute, int second) {
Calendar instance = Calendar.getInstance();
instance.clear();
instance.set(year, month, day, hour, minute, second);
return instance.getTime();
}
It comes from the way you are subtracting. In Java you subtract two longs that represent times. If you convert the long back to a date, it may not be what you set it to originally, as there is some loss of precision. In .NET you are subtracting actual dates and then converting to milliseconds, so this result will actually be more precise. If you converted the times in .NET to total milliseconds and then subtracted, I would bet you would find much closer results.

When are two Datetime variables equal?

I try to compare DateTime.Now with a DateTime variable I set, using the DateTime.CompareTo() method. I use a timer to compare these every second and display the result, but as the current time approaches the time I set, the result changes from 1 to -1, but never 0, which means these two are never equal. I'm suspecting the DateTime structure contains milliseconds?
You're suspecting correctly. It goes further than milliseconds though. The maximum resolution is the "tick", which is equal to 100 nanoseconds.
As other have mentioned here, the resolution is 100ns.
The easiest approach would be to take your DateTime and subtract DateTime.Now. You then end up with a TimeSpan. If the TimeSpan's TotalSeconds property is less than 1, the difference between them is less than a second.
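A sketch of that approach; taking the absolute value handles the target time being on either side of now:

```csharp
using System;

class WithinOneSecond
{
    static void Main()
    {
        DateTime target = new DateTime(2011, 12, 27, 4, 37, 17);
        TimeSpan diff = target - DateTime.Now;
        // Less than one second apart in either direction: treat as equal.
        Console.WriteLine(Math.Abs(diff.TotalSeconds) < 1);
    }
}
```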
You are correct in your suspicion. The DateTime struct's smallest unit is the "Tick", which is measured in units of 100 ns (one tick is 100 ns).
What you more likely want to do is check whether everything down to the seconds is equal. You can do that by first comparing the Date property and then comparing the Hour, Minute and Second properties individually.
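That per-component check might look like this (a sketch; EqualToSecond is a made-up helper name):

```csharp
using System;

class SecondEquality
{
    // Made-up helper: true when two DateTimes match down to the second,
    // ignoring milliseconds and ticks.
    static bool EqualToSecond(DateTime a, DateTime b) =>
        a.Date == b.Date &&
        a.Hour == b.Hour &&
        a.Minute == b.Minute &&
        a.Second == b.Second;

    static void Main()
    {
        DateTime t1 = new DateTime(2014, 4, 29, 9, 10, 30, 100);
        DateTime t2 = new DateTime(2014, 4, 29, 9, 10, 30, 900);
        Console.WriteLine(EqualToSecond(t1, t2)); // True
    }
}
```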
DateTime comparison is more exact than comparing with seconds. In your scenario, you can define an "error range": e.g. if the gap between two DateTimes is less than 1 second, they are considered to be the same (in your program).
Try this... (but change the test date, of course)
DateTime d1 = new DateTime(2011, 12, 27, 4, 37, 17);
DateTime d2 = DateTime.Now;
if (Math.Abs(d1.Subtract(d2).TotalSeconds) <= 1)
{
//consider these DateTimes equal... continue
}
I prefer to compare DateTime values (as well as doubles) not with exact values but with value ranges, because it is quite unlikely that you have the exact value.
DateTime d1 = new DateTime(2011, 12, 27, 4, 37, 17);
DateTime d2 = DateTime.Now;
if ((d2 >= d1) && (d2 <= d1.AddMinutes(1)))
....
'simulate comparison of two datetimes
d1 = DateTime.Now
Threading.Thread.Sleep(250)
d2 = DateTime.Now
'see if two dates are within a second of each other
Dim ts As Double = ((d2 - d1).TotalSeconds)
If ts < 1 Then
'equal
Debug.WriteLine("EQ " & ts.ToString("n4"))
Else
Debug.WriteLine("neq " & ts.ToString("n4"))
End If

C# DateTime time zone subtract issues

I have this line of code:
double seconds = new DateTime(2006,7,6,12,1,0,DateTimeKind.Local).Subtract(new DateTime(1970,1,1,0,0,0,DateTimeKind.Local)).TotalSeconds;
This was not the right number I wanted, so I tried the following:
double seconds = new DateTime(2006,7,6,12,1,0,DateTimeKind.Local).Subtract(new DateTime(1970,1,1,0,0,0,DateTimeKind.Utc)).TotalSeconds;
(The difference is that in one case, I use local time for the epoch, and in the other, I use UTC.) Interestingly though, they're both giving me the same value, and I don't know why. I live at GMT-06:00, so DateTimeKind.Local should actually affect things.
Thanks in advance!
In the DateTimeKind page on MSDN (http://msdn.microsoft.com/en-us/library/shx7s921.aspx), it states:
The members of the DateTimeKind enumeration are used in conversion operations between local time and Coordinated Universal Time (UTC), but not in comparison or arithmetic operations. For more information about time conversions, see Converting Times Between Time Zones.
The advice there says to use TimeZoneInfo.ConvertTimeToUtc
So, based on that, the code should probably be modified to:
double seconds = new DateTime(2006,7,6,12,1,0,DateTimeKind.Local).Subtract(TimeZoneInfo.ConvertTimeToUtc(new DateTime(1970,1,1,0,0,0,DateTimeKind.Local))).TotalSeconds;
Try this:
namespace ConsoleApplication1
{
using System;
class Program
{
static void Main( string[] args )
{
var laterDate = new DateTime( 2006, 7, 6, 12, 1, 0 );
var earlyDate = new DateTime( 1970, 1, 1, 0, 0, 0 );
var diff = laterDate.ToUniversalTime().Subtract( earlyDate.ToUniversalTime() );
var seconds = diff.TotalSeconds;
}
}
}

Convert DateTime to String value?

How do I convert DateTime to an Integer value?
Edit: How do I convert DateTime to a String value?
Example
String return value of 20100626144707 (for 26th of June 2010, at 14:47:07)
That could not be represented as an integer; it would overflow. It can be a long, however.
DateTime dateTime = new DateTime(2010, 6, 26, 14, 44, 07);
long time = long.Parse(dateTime.ToString("yyyyMMddHHmmss"));
However, it would be more intuitive to simply express it as a string, but I don't know what you intend to do with the information.
Edit:
Since you've updated the question, the answer is simpler.
string time = dateTime.ToString("yyyyMMddHHmmss");
A 32-bit integer is not large enough to hold a DateTime value to a very precise resolution. The Ticks property is a long (Int64). If you don't need precision down to the tick level, you could get something like seconds since the epoch:
TimeSpan t = (DateTime.UtcNow - new DateTime(1970, 1, 1));
int dateAsInteger = (int)t.TotalSeconds;
It's generally a bad idea to use a numeric datatype to store a number you can't do arithmetic on. E.g. the number in your example has no numeric meaning; adding or subtracting it is pointless. However, the number of seconds since a certain date is useful as a numeric data type.
Note: this was posted before the question was changed to "string value" rather than "integer value".
Essentially, you want to shift the number along by the number of places needed for each part and then add it:
var x = new DateTime(2010,06,26,14,47,07);
long i = x.Year;
i = i * 100 + x.Month;
i = i * 100 + x.Day;
i = i * 100 + x.Hour;
i = i * 100 + x.Minute;
i = i * 100 + x.Second;
Console.WriteLine(i); // 20100626144707
