DateTime precision difference .NET vs Java - c#

I'm porting some calculation routines from .NET to Java, but there seem to be some precision problems in the date classes. Maybe I have stared myself blind at this, but I can't figure out why the results differ.
How should I handle the dates to get the same numbers (milliseconds) across the platforms?
.NET
[Test]
public void foo() {
    DateTime dt1 = new DateTime(2011, 2, 26, 19, 25, 24);
    DateTime dt2 = new DateTime(2011, 2, 28, 18, 40, 25);
    double millis = (dt2 - dt1).TotalMilliseconds;
    Assert.AreEqual(170101000, millis);
}
Java
@Test
public void foo() throws Exception {
    Date d1 = createDate(2011, 2, 26, 19, 25, 24);
    Date d2 = createDate(2011, 2, 28, 18, 40, 25);
    long millis = d2.getTime() - d1.getTime();
    Assert.assertEquals(166501000, millis, 0.01);
}

private static Date createDate(int year, int month, int day,
                               int hour, int minute, int second) {
    Calendar instance = Calendar.getInstance();
    instance.clear();
    instance.set(year, month, day, hour, minute, second);
    return instance.getTime();
}

Wonderful!
Problem:
Java Calendar months start at 0 and .NET DateTime months start at 1.
So you are comparing .NET February to Java March. And, as Arjan suggested, March 27th is when the change to daylight saving time happens in many time zones.
This is a very common problem when using Java Calendar. To avoid it, you should use the named constants for the months, like this:
Date d1 = createDate(2011, Calendar.MARCH, 26, 19, 25, 24); // Calendar.MARCH == 2

The difference in milliseconds is 3,600,000, so exactly 1 hour. Could it be that one language is using a different timezone setting than the other, where there is a change from or to DST?
I have done some more testing. The PHP result matches the C# result (only I get seconds instead of milliseconds) and my Java result matches your Java result. The problem seems to be a bug in the timezone handling in Java. According to Java the second time is in DST, which is not correct in my timezone (Europe/Amsterdam).

Are you sure the problem is in the DateTime structure and not in the Double structure? From the documentation on the .NET Double data type:
Floating-Point Values and Loss of Precision

Remember that a floating-point number can only approximate a decimal number, and that the precision of a floating-point number determines how accurately that number approximates a decimal number. By default, a Double value contains 15 decimal digits of precision, although a maximum of 17 digits is maintained internally. The precision of a floating-point number has several consequences:
Since you're not returning fractional milliseconds, you can use the System.Int64 (long in C#) data type. Your value is well within the allowed range for that data type: the max value of an Int64 (long) is 9,223,372,036,854,775,807.
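As a minimal sketch of that suggestion, reusing the values from the question, the whole computation can stay in integer ticks:

DateTime dt1 = new DateTime(2011, 2, 26, 19, 25, 24);
DateTime dt2 = new DateTime(2011, 2, 28, 18, 40, 25);
long millis = (dt2.Ticks - dt1.Ticks) / TimeSpan.TicksPerMillisecond; // exact integer math, no floating point
// millis == 170101000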

I'm confused. Are you saying that the second (Java) test passes? Because I actually get the same number as the first (C#) test: 170101000.
Here is my test (which passes). I threw in JodaTime objects as an alternative to Date and Calendar:
@Test
public void foo() throws Exception {
    DateTime dt1 = new DateTime(2011, 2, 26, 19, 25, 24, 0);
    DateTime dt2 = new DateTime(2011, 2, 28, 18, 40, 25, 0);
    Duration d = new Duration(dt1, dt2);
    Assert.assertEquals(170101000, d.getMillis(), 0.01);

    Date d1 = createDate(2011, 2, 26, 19, 25, 24);
    Date d2 = createDate(2011, 2, 28, 18, 40, 25);
    long millis = d2.getTime() - d1.getTime();
    Assert.assertEquals(170101000, millis, 0.01);
}

private static Date createDate(int year, int month, int day,
                               int hour, int minute, int second) {
    Calendar instance = Calendar.getInstance();
    instance.clear();
    instance.set(year, month, day, hour, minute, second);
    return instance.getTime();
}

It comes from the way you are subtracting. In Java you subtract two longs that represent times. If you convert the long back to a date, it may not be what you set it to originally, as there is some loss toward the decimal places. In .NET you are subtracting actual dates and then converting to milliseconds, so that result will actually be more precise. If you convert the times in .NET to total milliseconds and then subtract, I would bet you would find much closer results.

Related

Create 75-Mins timeblock for multiple DateTimes of a day

I am working on stock market software where I have a candle every 5 minutes. So whenever a time-frame of, say, 30 minutes is selected, what we do is:
long val = (long)(D * 24 * 60) / 30; // D is the datetime of the candle converted to an OA date (a double).
// The above never causes a problem, because (24*60) % 30 == 0.
The above line returns the same value for every half-hour chunk, i.e. for the candles of 10:35, 10:40, ..., 11:00. With that we can easily find the half-hour chunks: one ends whenever val changes.
Now we have a challenge to implement 75-minute chunks in the same way. Our market starts at 9:15 and ends at 3:30. Suppose the date for which the 75-minute chunks need to be calculated is 22-9-2018. For that I will need exactly 5 candles, at the times below:
22-9-2018 10:30 (9:15 to 10:30 = 75 mins)
22-9-2018 11:45
22-9-2018 1:00
22-9-2018 2:15
22-9-2018 3:30
I need the same kind of code as mentioned above, which will calculate the same value for each of these five chunks.
The problem I found: if we start counting 75-minute chunks from 12:00 midnight, the chunk in market time runs from 8:45 to 10:00, while we need the first chunk to run from 9:15 to 10:30.
Also, (24*60) % 75 = 15, so a 15-minute leftover each day disturbs the next day's calculation too.
UPDATE -
To clarify the question: for the chunk from 10:35 to 11:45 I will have candles at 10:35, 10:40, 10:45, ..., 11:45. For all these datetimes I need the same numeric return value. As soon as the 11:50 candle comes, the returned numeric value changes and my new 75-minute chunk starts; it then keeps the same value until 1:00.
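One way to read that requirement, as a hedged sketch (ChunkIndex and the 9:15 anchor are assumptions, not code from the post): compute the bucket index relative to the market open instead of midnight, so neither the 8:45 misalignment nor the 15-minute daily leftover matters.

static long ChunkIndex(DateTime candle)
{
    DateTime open = candle.Date.AddHours(9).AddMinutes(15); // assumed 9:15 market open
    long minutes = (long)(candle - open).TotalMinutes;
    // Candles stamp the end of a 5-minute bar, so a candle landing exactly on a
    // boundary (10:30, 11:45, ...) belongs to the chunk it closes; the +74 makes
    // the division act as a ceiling for that case.
    return (minutes + 74) / 75;
}

With this, the candles 10:35 through 11:45 all return the same index, and 11:50 starts a new one, matching the behaviour described in the update.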
You can use a loop or a LINQ query like this:
var startTime = new DateTime(2018, 09, 22, 9, 15, 0);
var times = Enumerable.Range(1, 5).Select(x => startTime.AddMinutes(x * 75)).ToList();
Example
Here is another example of how to split a date range. In the following example, I included the start time as part of the result:
IEnumerable<DateTime> Split(DateTime start, DateTime end, int minutes)
{
    if (minutes <= 0)
        throw new ArgumentException(
            $"'{nameof(minutes)}' should be greater than 0.",
            nameof(minutes));

    var result = start;
    while (result <= end)
    {
        yield return result;
        result = result.AddMinutes(minutes);
    }
}
And here is the usage:
var startTime = new DateTime(2018, 09, 22, 9, 15, 0);
var endTime = startTime.AddHours(7);
var times = Split(startTime, endTime, 75).ToList();

System.DateTime Kind Bits

Due to difficulties I experienced trying to call the DotNetOAuth CryptoKey constructor, I started to investigate the .NET System.DateTime structure. According to what I've read, this object is actually represented by a 64-bit signed integer, with the "Ticks" encoded in the lower 62 bits and the Kind encoded in the upper 2 bits (IOW, it's a concatenation of the 2-bit Kind and the 62-bit ticks).
Now I wanted to actually "see" this, so I constructed a small C# program that created three System.DateTime objects, like so:
DateTime dtUtc = new System.DateTime(2014, 4, 29, 9, 10, 30, System.DateTimeKind.Utc);
DateTime dtLocal = new System.DateTime(2014, 4, 29, 9, 10, 30, System.DateTimeKind.Local);
DateTime dtU = new System.DateTime(2014, 4, 29, 9, 10, 30, System.DateTimeKind.Unspecified);
I then dumped the ticks property for each and, as expected, they were all equal. Finally, I applied .ToBinary()
long bitUtc = dtUtc.ToBinary();
long bitLocal = dtLocal.ToBinary();
long bitU = dtU.ToBinary();
These longs were all different, again as expected. HOWEVER, I then tried to "inspect" the upper two bits to see which state corresponded to what settings, and found that the upper two bits were set the same in all three. I used the following routine to return the bit status:
public static bool IsBitSet<T>(this T t, int pos) where T : struct, IConvertible
{
    var value = t.ToInt64(CultureInfo.CurrentCulture);
    return (value & (1 << pos)) != 0;
}
(I got this from another post on SO), and called it like this:
Boolean firstUtc = Class1.IsBitSet<long>(bitUtc, 63);
Boolean secondUtc = Class1.IsBitSet<long>(bitUtc, 62);
Boolean firstLocal = Class1.IsBitSet<long>(bitLocal, 63);
Boolean secondLocal = Class1.IsBitSet<long>(bitLocal, 62);
Boolean firstU = Class1.IsBitSet<long>(bitU, 63);
Boolean secondU = Class1.IsBitSet<long>(bitU, 62);
Again, the first and second bits were set the same in all three (the first was true, the second false). I don't understand this, as I THOUGHT these would all be different, corresponding to the different DateTimeKind values.
Finally, I did some more reading and found (or at least it was said in one source) that MS doesn't serialize the Kind information in .ToBinary(). OK, but then why are the outputs of the .ToBinary() method all different?
I would appreciate info from anyone who could point me in the direction of a resource that would help me understand where I've gone wrong.
These longs were all different, again as expected. HOWEVER, I then tried to "inspect" the upper two bits to see which state corresponded to what settings, and found that the upper two bits were set the same in all three.
I really don't think that's the case - not with the results of ToBinary. Here's a short but complete program demonstrating the difference, using your source data, showing the results as hex (as if unsigned):
using System;

class Test
{
    static void Main()
    {
        DateTime dtUtc = new System.DateTime(2014, 4, 29, 9, 10, 30, System.DateTimeKind.Utc);
        DateTime dtLocal = new System.DateTime(2014, 4, 29, 9, 10, 30, System.DateTimeKind.Local);
        DateTime dtU = new System.DateTime(2014, 4, 29, 9, 10, 30, System.DateTimeKind.Unspecified);

        Console.WriteLine(dtUtc.ToBinary().ToString("X16"));
        Console.WriteLine(dtLocal.ToBinary().ToString("X16"));
        Console.WriteLine(dtU.ToBinary().ToString("X16"));
    }
}
Output:
48D131A200924700
88D131999ECDDF00
08D131A200924700
The top two bits are respectively 01, 10 and 00. The other bits change for the local case too, as per Marcin's post, but the top two bits really do indicate the kind.
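For reference, both bits can be read at once with an unsigned shift (a quick sketch reusing dtUtc from the program above; the cast to ulong avoids sign extension):

long bits = dtUtc.ToBinary();
int kindBits = (int)((ulong)bits >> 62); // 0 = Unspecified, 1 = Utc, 2 = Local, matching the DateTimeKind values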
The IsBitSet method is broken because it's left-shifting an int literal rather than a long literal. That means the shift will be mod 32, rather than mod 64 as intended. Try this instead:
public static bool IsBitSet<T>(this T t, int pos) where T : struct, IConvertible
{
    var value = t.ToInt64(CultureInfo.CurrentCulture);
    return (value & (1L << pos)) != 0;
}
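With the 1L fix in place, the six booleans line up with the hex dump shown earlier:

// bitUtc:   bit 63 = false, bit 62 = true   -> 01 (Utc)
// bitLocal: bit 63 = true,  bit 62 = false  -> 10 (Local)
// bitU:     bit 63 = false, bit 62 = false  -> 00 (Unspecified)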
Finally, I did some more reading and found (or at least it was said in one source) that MS doesn't serialize the Kind information in .ToBinary().
It's easy to demonstrate that's not true:
using System;

class Test
{
    static void Main()
    {
        DateTime start = DateTime.UtcNow;
        Show(DateTime.SpecifyKind(start, DateTimeKind.Utc));
        Show(DateTime.SpecifyKind(start, DateTimeKind.Local));
        Show(DateTime.SpecifyKind(start, DateTimeKind.Unspecified));
    }

    static void Show(DateTime dt)
    {
        Console.WriteLine(dt.Kind);
        DateTime dt2 = DateTime.FromBinary(dt.ToBinary());
        Console.WriteLine(dt2.Kind);
        Console.WriteLine("===");
    }
}
ToBinary() works differently for different DateTimeKind values. You can see it in the .NET reference source:
public Int64 ToBinary() {
    if (Kind == DateTimeKind.Local) {
        // Local times need to be adjusted as you move from one time zone to another,
        // just as they are when serializing in text. As such the format for local times
        // changes to store the ticks of the UTC time, but with flags that look like a
        // local date.
        // To match serialization in text we need to be able to handle cases where
        // the UTC value would be out of range. Unused parts of the ticks range are
        // used for this, so that values just past max value are stored just past the
        // end of the maximum range, and values just below minimum value are stored
        // at the end of the ticks area, just below 2^62.
        TimeSpan offset = TimeZoneInfo.GetLocalUtcOffset(this, TimeZoneInfoOptions.NoThrowOnInvalidTime);
        Int64 ticks = Ticks;
        Int64 storedTicks = ticks - offset.Ticks;
        if (storedTicks < 0) {
            storedTicks = TicksCeiling + storedTicks;
        }
        return storedTicks | (unchecked((Int64) LocalMask));
    }
    else {
        return (Int64)dateData;
    }
}
That's why you get different bits: a local time is adjusted to UTC before being turned into bits, so it no longer matches the UTC value.
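A minimal round-trip sketch of that behaviour (assuming serialization and deserialization happen in the same time zone, away from a DST transition):

DateTime local = new DateTime(2014, 4, 29, 9, 10, 30, DateTimeKind.Local);
DateTime roundTrip = DateTime.FromBinary(local.ToBinary());
Console.WriteLine(roundTrip.Kind);                 // Local
Console.WriteLine(roundTrip.Ticks == local.Ticks); // True: FromBinary re-applies the local offset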

Subtracting TimeSpan from date

I want to subtract a time-span from a date-time object using a 30-day month and ignoring leap years etc.
Date is 1983/5/1 13:0:0 (y/m/d-h:m:s)
Time span is 2/4/28-2:51:0 (y/m/d-h:m:s)
I can use DateTime and TimeSpan objects to do this, after converting years and months of the time-span to days (assuming a 30 day month and a ~364 day year).
new DateTime(1981,5,1,13,0,0).Subtract(new TimeSpan(878,13,51,0));
With this I get the result:
{12/4/1978 11:09:00 PM}
The result above obviously doesn't ignore the factors I want ignored and gives me an accurate answer. But in this case that's not what I want, so I wrote the code below.
public static CustomDateTime operator -(CustomDateTime DT1, CustomDateTime DT2)
{
    CustomDateTime retVal = new CustomDateTime();
    try
    {
        const double daysPerYear = 364.25; // unused below; declared double so it compiles
        const int monthsPerYear = 12;
        const int daysPerMonth = 30;
        const int hoursPerDay = 24;
        const int minutesPerHour = 60;

        retVal.Minute = DT1.Minute - DT2.Minute;
        if (retVal.Minute < 0)
        {
            retVal.Minute += minutesPerHour;
            DT1.Hour -= 1;
        }

        retVal.Hour = DT1.Hour - DT2.Hour;
        if (retVal.Hour < 0)
        {
            retVal.Hour += hoursPerDay;
            DT1.Day -= 1;
        }

        retVal.Day = DT1.Day - DT2.Day;
        if (retVal.Day < 0)
        {
            retVal.Day += daysPerMonth;
            DT1.Month -= 1;
        }

        retVal.Month = DT1.Month - DT2.Month;
        if (retVal.Month < 0)
        {
            retVal.Month += monthsPerYear;
            DT1.Year -= 1;
        }

        retVal.Year = DT1.Year - DT2.Year;
    }
    catch (Exception ex) { }
    return retVal;
}
Then I get:
1981/0/3-10:9:0
This is pretty close to what I'm after, except I shouldn't get 0 for the month, and the year should be 1980. Any kind of help is appreciated.
Just to make things clear again: in this context I have to use a 30-day month and ignore leap years, differing month lengths, etc. It's a weird thing to do, I know. So I'm pretty much after a 'wrong answer' as opposed to the exact answer given by the managed classes.
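For what it's worth, here is a hedged sketch of that deliberate "wrong answer" (the helper and its signature are hypothetical, not from the post): flatten the date and the span to total minutes under the idealized calendar (12 months of 30 days), subtract once, and decompose again. Working with a zero-based month and day internally is what prevents the month-0 result.

// Sketch only: assumes every month has 30 days and every year 12 such months.
static (int Year, int Month, int Day, int Hour, int Minute) SubtractIdeal(
    int y, int mo, int d, int h, int mi,      // the date (1-based month and day)
    int sy, int smo, int sd, int sh, int smi) // the time-span
{
    long date = ((((long)y * 12 + (mo - 1)) * 30 + (d - 1)) * 24 + h) * 60 + mi;
    long span = ((((long)sy * 12 + smo) * 30 + sd) * 24 + sh) * 60 + smi;
    long t = date - span;

    int minute = (int)(t % 60); t /= 60;
    int hour   = (int)(t % 24); t /= 24;
    int day    = (int)(t % 30) + 1; t /= 30; // back to 1-based
    int month  = (int)(t % 12) + 1; t /= 12;
    return ((int)t, month, day, hour, minute);
}

// SubtractIdeal(1983, 5, 1, 13, 0,  2, 4, 28, 2, 51) -> (1980, 12, 3, 10, 9)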
If you're estimating a month at 30 days, of course your math will be off. When you subtract 878 days from 5/1/1981, .NET is giving you the exact difference, not an estimate, and this difference accounts for leap years, if there are any. The error is not in the Subtract(...) method; it is in your own "manual" calculation.
DateTime dt = new DateTime(1981, 5, 1, 13, 0, 0);
TimeSpan t = new TimeSpan(878, 13, 51, 0);

dt.Ticks            // 624931668000000000
t.Ticks             // 759090600000000
dt.Ticks - t.Ticks  // 624172577400000000

new DateTime(dt.Ticks - t.Ticks)
// {12/4/1978 11:09:00 PM}
//     Date: {12/4/1978 12:00:00 AM}
//     Day: 4
//     DayOfWeek: Monday
//     DayOfYear: 338
//     Hour: 23
//     Kind: Unspecified
//     Millisecond: 0
//     Minute: 9
//     Month: 12
//     Second: 0
//     Ticks: 624172577400000000
//     TimeOfDay: {23:09:00}
//     Year: 1978
These are the total ticks since the epoch. Do this math, then convert back into a datetime.
Also: correct your math. 878 days is 2 years and 148 days. 5/1/1981 is the 121st day of the year, so subtract 120 to get Jan 1, 1979. This leaves 28 days. Start counting backwards from the end of 1978, and you get very close to the .Net answer. Your own answer isn't anywhere close.
EDIT based on feedback
// zh-Hans is a Chinese culture
CultureInfo ci = CultureInfo.GetCultureInfo("zh-Hans");
DateTime dt = new DateTime(1981, 5, 1, 13, 0, 0, ci.Calendar);
TimeSpan t = new TimeSpan(878, 13, 51, 0);
Please note that you are still subtracting 878 days. The length of a month would be irrelevant in that case based on the Julian calendar. You will probably need to find the correct culture code for your particular calendar, then try this. However, with this calendar, I still arrive at the same answer above.
Beyond doing this, I am unsure how else to do the math. If you can provide a link to how you are doing it by hand, I can help code it for you.
EDIT 2
I understand now. Try this:
DateTime dt = new DateTime(1981, 5, 1, 13, 0, 0, ci.Calendar);
int years = 878 / 365;
int remainingDays = 878 % 365;
int months = remainingDays / 30;
remainingDays = remainingDays % 30;
TimeSpan t = TimeSpan.FromDays(years * 365 + months * 30 + remainingDays); // the single-argument TimeSpan constructor takes ticks, not days
DateTime newdate = dt.Subtract(t);
You cannot assume a 30-day month. You are specifying that you want to subtract 878 days. The managed classes (I'm assuming you mean managed when you say native) are designed to factor in leap years, differing month lengths, etc.
Using the managed classes will not give you a 0 for a month.

When are two Datetime variables equal?

I try to compare DateTime.Now with a DateTime variable I set, using the DateTime.CompareTo() method. I use a timer to compare these every second and display the result, but as the current time approaches the time I set, the result changes from 1 to -1, yet is never 0, which means the two are never equal. I suspect the DateTime structure contains milliseconds?
You're suspecting correctly. It goes further than milliseconds though. The maximum resolution is the "tick", which is equal to 100 nanoseconds.
As others have mentioned here, the resolution is 100 ns.
The easiest approach would be to take your DateTime and subtract DateTime.Now. You then end up with a TimeSpan. If the TimeSpan's TotalSeconds property, truncated to an integer, is 0, the difference between them is less than a second.
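That check, as a small sketch (target is an assumed stand-in for the time you set):

DateTime target = DateTime.Now.AddSeconds(30); // stand-in for your configured time
TimeSpan diff = target - DateTime.Now;
bool withinASecond = Math.Abs(diff.TotalSeconds) < 1;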
You are correct in your suspicion. The DateTime struct's smallest unit is the "tick", which is measured in units of 100 ns (one tick is 100 ns).
What you more likely want to do is check whether everything down to the seconds is equal. You can do that by first comparing the Date property and then comparing the Hour, Minute and Second properties individually, as in the sketch below.
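A sketch of that approach (the helper name is assumed):

static bool EqualToTheSecond(DateTime a, DateTime b) =>
    a.Date == b.Date && a.Hour == b.Hour &&
    a.Minute == b.Minute && a.Second == b.Second;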
DateTime comparison is more precise than comparing down to whole seconds. In your scenario you can define an "error range": e.g. if the gap between two DateTimes is less than 1 second, they are considered equal (in your program).
Try this... (but change the test date, of course)

DateTime d1 = new DateTime(2011, 12, 27, 4, 37, 17);
DateTime d2 = DateTime.Now;
if (Math.Abs(d1.Subtract(d2).TotalSeconds) <= 1)
{
    // consider these DateTimes equal... continue
}

(Note: TotalSeconds rather than Seconds; Seconds is only the seconds component of the interval, and the difference can be negative.)
I prefer to compare DateTimes (as well as doubles) not against exact values but against value ranges, because it is quite unlikely that you ever hit the exact value.
DateTime d1 = new DateTime(2011, 12, 27, 4, 37, 17);
DateTime d2 = DateTime.Now;
if ((d2 >= d1) && (d2 <= d1.AddMinutes(1)))
....
'simulate comparison of two datetimes
d1 = DateTime.Now
Threading.Thread.Sleep(250)
d2 = DateTime.Now

'see if two dates are within a second of each other
Dim ts As Double = (d2 - d1).TotalSeconds
If ts < 1 Then
    'equal
    Debug.WriteLine("EQ " & ts.ToString("n4"))
Else
    Debug.WriteLine("neq " & ts.ToString("n4"))
End If

C# DateTime.Ticks equivalent in Java

What is the Java equivalent of DateTime.Ticks in C#?
DateTime dt = new DateTime(2010, 9, 14, 0, 0, 0);
Console.WriteLine("Ticks: {0}", dt.Ticks);
What will be the equivalent of above mentioned code in Java?
Well, java.util.Date/Calendar only have precision down to the millisecond:
Calendar calendar = Calendar.getInstance();
calendar.set(Calendar.MILLISECOND, 0); // Clear the millis part. Silly API.
calendar.set(2010, 8, 14, 0, 0, 0); // Note that months are 0-based
Date date = calendar.getTime();
long millis = date.getTime(); // Millis since Unix epoch
That's the nearest effective equivalent. If you need to convert between a .NET ticks value and a Date/Calendar you basically need to perform scaling (ticks to millis) and offsetting (1st Jan 1AD to 1st Jan 1970).
Java's built-in date and time APIs are fairly unpleasant. I'd personally recommend that you use Joda Time instead. If you could say what you're really trying to do, we can help more.
EDIT: Okay, here's some sample code:
import java.util.*;

public class Test {
    private static final long TICKS_AT_EPOCH = 621355968000000000L;
    private static final long TICKS_PER_MILLISECOND = 10000;

    public static void main(String[] args) {
        long ticks = 634200192000000000L;

        Date date = new Date((ticks - TICKS_AT_EPOCH) / TICKS_PER_MILLISECOND);
        System.out.println(date);

        TimeZone utc = TimeZone.getTimeZone("UTC");
        Calendar calendar = Calendar.getInstance(utc);
        calendar.setTime(date);
        System.out.println(calendar);
    }
}
Note that this constructs a Date/Calendar representing the UTC instant of 2010/9/14. The .NET representation is somewhat fuzzy: you can create two DateTime values which are the same except for their "kind" (and which therefore represent different instants), and they'll claim to be equal. It's a bit of a mess :(
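The C# side of that fuzziness is easy to demonstrate: DateTime equality compares only the ticks and ignores Kind. A small sketch (the tick value is arbitrary):

DateTime a = new DateTime(634200192000000000L, DateTimeKind.Utc);
DateTime b = new DateTime(634200192000000000L, DateTimeKind.Local);
Console.WriteLine(a == b);           // True, despite denoting different instants
Console.WriteLine(a.Kind == b.Kind); // False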
In Java it is:

long TICKS_AT_EPOCH = 621355968000000000L;
long tick = System.currentTimeMillis() * 10000 + TICKS_AT_EPOCH;
System.nanoTime() gives you nanoseconds in Java (since 1.5). You'll still need to shift/rescale, but no precision will be lost. Note, though, that nanoTime() has an arbitrary origin, so it is only suitable for measuring elapsed time, not absolute timestamps.
Based on Jon Skeet's answer, I developed this class:
import java.util.Calendar;
import java.util.Date;

public class DateHelper {
    private static final long TICKS_AT_EPOCH = 621355968000000000L;
    private static final long TICKS_PER_MILLISECOND = 10000;

    public static long getUTCTicks(Date date) {
        Calendar calendar = Calendar.getInstance();
        calendar.setTime(date);
        return (calendar.getTimeInMillis() * TICKS_PER_MILLISECOND) + TICKS_AT_EPOCH;
    }

    public static Date getDate(long UTCTicks) {
        return new Date((UTCTicks - TICKS_AT_EPOCH) / TICKS_PER_MILLISECOND);
    }
}
It works for me
And for those of us showing up trying to get the current number of ticks as defined by the UUID specification:
/**
 * Returns the current tick count.
 * Ticks are the number of 100 ns intervals since October 15, 1582.
 *
 * @return the current tick count
 */
private static long getUtcNowTicks() {
    final long UNIX_EPOCH_TICKS = 122192928000000000L; // number of ticks from 10/15/1582 to 1/1/1970

    Instant i = Clock.systemUTC().instant(); // get the current time
    long ticks = UNIX_EPOCH_TICKS;           // number of ticks as of 1/1/1970
    ticks += i.getEpochSecond() * 10000000L; // number of whole seconds (converted to ticks) since 1/1/1970
    ticks += i.getNano() / 100;              // number of ticks since the start of the second
    return ticks;

    /*
    Some interesting tick values:

    Date       Ticks
    ========== ==================
    10/15/1582 0                  Start of the UUID epoch (the date we switched to the Gregorian calendar)
    1/01/1601  5748192000000000   Start of the Windows epoch (start of the 1st Gregorian 400-year cycle)
    12/30/1899 100101312000000000 Start of the Lotus 1-2-3, Excel, VB, COM, Delphi epoch
    1/01/1900  100103040000000000 Start of the SQL Server epoch
    1/01/1970  122192928000000000 Start of the UNIX epoch
    1/01/2000  131659776000000000
    1/01/2010  134815968000000000
    1/01/2020  137971296000000000
    1/19/2038  143714420469999999 The UNIX Y2K38 problem (January 19, 2038 3:14:07 am)
    */
}
To convert .NET ticks to milliseconds in Java, use this:

static final long TICKS_PER_MILLISECOND = 10000;
long ticks = 450000000000L; // sample tick value
long millis = ticks / TICKS_PER_MILLISECOND;
There are 10,000 ticks in a millisecond, and C# considers the beginning of time January 1, 0001 at midnight. Here's a one-liner which converts an Instant to ticks.
public static long toTicks(Instant i)
{
    // Note: toMillis() truncates to whole milliseconds, so any
    // sub-millisecond part of the Instant is lost.
    return Duration.between(Instant.parse("0001-01-01T00:00:00.00Z"), i).toMillis() * 10000;
}
