I need to store dates as the number of seconds since 1970.
In Swift I get the number of seconds since 1970 using Foundation's NSDate:
NSDate().timeIntervalSince1970
And maybe a dumb question, but why is this a double? Shouldn't it be an int?
What is the equivalent of this method in C#? I am not sure what to use to get the same value.
TimeSpan t = (DateTime.UtcNow - new DateTime(1970, 1, 1));
long timestamp = (long) t.TotalSeconds;
I used the UtcNow property to ensure that the timestamp is the same regardless of what timezone this code is being run in.
Also, use the largest integer type you can: the current epoch time is already close to the maximum of a 32-bit signed integer, and you want the code to be future proof.
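To put a number on that, int.MaxValue seconds past the epoch lands in January 2038 (the well-known "Year 2038 problem"), which a quick sketch with the .NET 4.6+ API mentioned below can confirm:
// int.MaxValue = 2,147,483,647 seconds after 1970-01-01T00:00:00Z
var limit = DateTimeOffset.FromUnixTimeSeconds(int.MaxValue);
// limit == 2038-01-19T03:14:07Z; a signed 32-bit count overflows past this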
If you have .NET 4.6 or above, try this:
DateTimeOffset.UtcNow.ToUnixTimeSeconds()
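As a quick sanity check, the manual subtraction and the built-in call should agree (up to the sub-second window between the two calls):
// Both count whole UTC seconds since 1970-01-01; they can differ by one
// if the second ticks over between the two calls.
long manual = (long)(DateTime.UtcNow - new DateTime(1970, 1, 1)).TotalSeconds;
long builtIn = DateTimeOffset.UtcNow.ToUnixTimeSeconds();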
This might also work:
(DateTime.UtcNow - new DateTime(1970, 1, 1)).TotalSeconds
This takes the current date and subtracts the Unix epoch (1970-01-01), rather than C#'s own zero point of 0001-01-01T00:00:00, giving the number of seconds since 1970. (Note UtcNow rather than Now, so the result doesn't depend on the local time zone.) This is probably the easiest way to get the same value.
I have a simple DateTime object, equal to the date: 11/1/2020 8:11:14 AM.
I want to convert it to milliseconds, so I do:
myTimestamp?.Ticks / TimeSpan.TicksPerMillisecond
I get 63739786274788, which seems correct from the pure calculation perspective.
However, when I input it into one of the online converters to validate, I get the date Wed Nov 01 3989 01:11:14, which is of course way off.
Questions:
What is this number 63739786274788 if not time in ms?
How do I get a "normal" timestamp in ms?
In .NET, DateTime ticks are based on an epoch of 0001-01-01T00:00:00.0000000. The .Kind property is used to decide whether that is UTC, local time, or "unspecified".
Most online converters, such as the one you linked to, are expecting a Unix Timestamp, where the value is based on an epoch of 1970-01-01T00:00:00.000Z. It is always UTC based. (The precision varies, both seconds and milliseconds are commonly used.)
If you want to get a milliseconds-based Unix timestamp from .NET, instead of dividing you should use the built-in functions DateTimeOffset.FromUnixTimeMilliseconds and DateTimeOffset.ToUnixTimeMilliseconds. (There are also seconds-based versions of these functions.)
Assuming your input values are UTC-based:
DateTime dt = new DateTime(2020, 11, 1, 8, 11, 14, DateTimeKind.Utc);
DateTimeOffset dto = new DateTimeOffset(dt);
long timestamp = dto.ToUnixTimeMilliseconds();
// output: 1604218274000
DateTimeKind.Local will also work with this, assuming your values are indeed based on the computer's local time zone. DateTimeKind.Unspecified is a bit trickier, as you'll need to convert to a DateTimeOffset with a specific time zone using TimeZoneInfo first.
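For example, a minimal sketch of the Unspecified case, using an arbitrary zone purely for illustration:
// Hypothetical zone; substitute whatever zone your data is actually in.
TimeZoneInfo tz = TimeZoneInfo.FindSystemTimeZoneById("Pacific Standard Time");
DateTime unspecified = new DateTime(2020, 11, 1, 8, 11, 14, DateTimeKind.Unspecified);
// GetUtcOffset treats an Unspecified value as a time in tz, so this attaches
// the zone's offset for that moment (DST ambiguities aside).
DateTimeOffset dto = new DateTimeOffset(unspecified, tz.GetUtcOffset(unspecified));
long timestamp = dto.ToUnixTimeMilliseconds();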
You could also construct the DateTimeOffset value directly, rather than go through DateTime at all.
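For instance, the same value built directly with a zero offset:
var dto = new DateTimeOffset(2020, 11, 1, 8, 11, 14, TimeSpan.Zero);
long timestamp = dto.ToUnixTimeMilliseconds(); // 1604218274000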
Okay, so you start off by dividing Ticks by TicksPerMillisecond (10,000).
As you can see, the number you generated is much larger than the current milliseconds:
63739786274788 (your value)
1607363529803 (current Unix milliseconds)
The short answer is that Ticks are based on 12:00:00 midnight, January 1, 0001, while your online calculator is based on Unix time, January 1, 1970. That explains why you're about 2,000 years off. If you subtract the ticks of new DateTime(1970,1,1) from your value's ticks before dividing, you get roughly the right number for the online calculator.
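A minimal sketch of that subtraction (assuming the DateTime is UTC-based):
DateTime dt = new DateTime(2020, 11, 1, 8, 11, 14, DateTimeKind.Utc);
long epochTicks = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc).Ticks;
// Shift the tick count to the Unix epoch before scaling to milliseconds.
long unixMs = (dt.Ticks - epochTicks) / TimeSpan.TicksPerMillisecond;
// unixMs == 1604218274000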
For more info, I would suggest reading through MS's docs on DateTime.
I have a DateTime represented as a long (8 bytes) that came from DateTime.ToBinary(); let's call it dateTimeBin. Is there an optimal way of dropping the time information (I only care about the date) so I can compare it to a start of day? Let's say we have this sample value as a start of day:
DateTime startOfDay = new DateTime(2020,3,4,0,0,0);
long startOfDayBin = startOfDay.ToBinary();
I obviously know I can always convert to a DateTime object then get the date component. However, this operation is going to happen billions of times and every little performance tweak helps.
Is there an efficient way of extracting the Date info of dateTimeBin without converting it to DateTime? Or any arithmetic operation on the long that will return the date only?
Is there a way to match startOfDay (or startOfDayBin) and dateTimeBin if they have the same date components?
Is there a way to test (dateTimeBin >= startOfDayBin)? I don't think the plain long comparison is valid.
N.B. all the dates are UTC
Since you are working only with UTC dates, it makes sense to use DateTime.Ticks instead of DateTime.ToBinary, because the former has a relatively clear meaning: the number of ticks since the epoch, just like Unix time, except that the unit is a tick (1/10,000,000 of a second) rather than a second and the epoch is midnight, January 1st of year 0001 rather than 1970. ToBinary only promises that you can restore the original DateTime value back, and that's it.
With ticks it's easy to extract the time and date. To extract the time, take the remainder of dividing the ticks by the number of ticks in a full day:
long binTicks = myDateTime.Ticks;
long ticksInDay = 24L * 60 * 60 * 10_000_000;
long time = binTicks % ticksInDay;
You can then convert that to a TimeSpan:
var ts = TimeSpan.FromTicks(time);
for convenience, or use it as is. The same goes for extracting only the date: just subtract the time:
long date = binTicks - (binTicks % ticksInDay);
Regular comparison (dateTimeBin >= startOfDayBin) is also valid for tick values.
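Putting it together, a minimal sketch (assuming both values are UTC tick counts rather than ToBinary output):
long ticksInDay = TimeSpan.TicksPerDay; // same as 24L * 60 * 60 * 10_000_000
long startOfDayTicks = new DateTime(2020, 3, 4, 0, 0, 0, DateTimeKind.Utc).Ticks;
long dateTimeTicks = new DateTime(2020, 3, 4, 13, 45, 0, DateTimeKind.Utc).Ticks;
// Same calendar date if the values agree once the time-of-day is stripped.
bool sameDate = (dateTimeTicks - dateTimeTicks % ticksInDay) == startOfDayTicks;
// A plain long comparison works, since ticks grow monotonically with time.
bool isOnOrAfter = dateTimeTicks >= startOfDayTicks;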
I currently use a solution for getting a higher resolution timestamp in C# by taking a start time using DateTime.UtcNow and then using a Stopwatch to add ticks to it as time goes by. I came across Stopwatch.GetTimestamp() as a potential alternative or even better solution, but I cannot find reliable information on exactly what this function returns.
The best source of info seems to be this.
GetTimestamp() returns machine-dependent ticks, which can be converted into seconds by dividing by the stopwatch frequency. If I do this, I get a value that appears to be a UTC Unix timestamp, which is exactly what I'm after - but I haven't seen anything that states that this is what I should expect from it.
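For reference, the conversion being described is roughly this (and whether the result is really a Unix timestamp is exactly what's in question):
using System.Diagnostics;
// Machine-dependent ticks divided by ticks-per-second gives elapsed seconds,
// but what instant those seconds are counted from is not documented here.
double seconds = (double)Stopwatch.GetTimestamp() / Stopwatch.Frequency;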
One clue from MSDN states that:
If the Stopwatch class uses a high-resolution performance counter, GetTimestamp returns the current value of that counter. If the Stopwatch class uses the system timer, GetTimestamp returns the current DateTime.Ticks property of the DateTime.Now instance.
Looking then at DateTime.Ticks, we then see:
The value of this property represents the number of 100-nanosecond intervals that have elapsed since 12:00:00 midnight, January 1, 0001 (0:00:00 UTC on January 1, 0001, in the Gregorian calendar), which represents DateTime.MinValue.
I'm therefore not clear how simply dividing some machine-dependent tick count by the frequency can get me a UNIX 1970+ timestamp. Is it possible that, if a high-resolution performance counter is not available on the target platform, I might get a year 0001-based timestamp instead? Or maybe something else entirely, again depending on the available hi-res timer?
Can you describe your use case? If you're interested in extra precision, I don't see how you could possibly get it by starting out with DateTime.UtcNow, and then, separately, calling Stopwatch.Start() -- if you add Stopwatch.Elapsed to DateTime.UtcNow, the value is going to be inaccurate, because you have no way of knowing how long after the DateTime.UtcNow call that the stopwatch actually started. If you start the stopwatch first, you have the same problem in reverse.
Generally speaking, in .NET 4.6, there is a ToUnixTimeMilliseconds call on DateTimeOffset that may be helpful (e.g. DateTimeOffset.UtcNow.ToUnixTimeMilliseconds())
I am currently using the following date filter in my WebAPI application:
json.SerializerSettings.Converters.Add(
new IsoDateTimeConverter { DateTimeFormat = "dd-MM-yyyy hh:mm" });
I started to use this as my front end could not understand the dates. If I remember correctly, it was due to the way milliseconds were formatted, with too many digits.
What I need is to get the date into a format like this:
1288323623006
Can someone suggest how I can do this using the serializer? Is this different from the default?
You don't want to use IsoDateTimeConverter at all - you possibly want to use JavaScriptDateTimeConverter. That will convert it into new Date(...) with the right value - but I believe it really will include the new Date(...) part. If you don't want that, you'll probably need to write your own converter.
It shouldn't be too hard to write a converter - although you need to decide how to handle the different kinds of DateTime. For example, if you're asked to convert a DateTime with a Kind of Unspecified, do you want to assume it's actually already in UTC, or already in the system local time zone, or something else?
Once you've got an appropriate "instant" in time, you just need to find the number of milliseconds between that and the Unix epoch (1st January 1970 00:00:00, UTC) and convert that number of milliseconds into a string.
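A minimal sketch of such a converter, assuming Json.NET, treating Unspecified as UTC purely as an illustrative choice, and only implementing the write side:
using System;
using Newtonsoft.Json;

public class UnixMillisecondsConverter : JsonConverter
{
    private static readonly DateTime UnixEpoch =
        new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

    public override bool CanConvert(Type objectType) =>
        objectType == typeof(DateTime) || objectType == typeof(DateTime?);

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        var dt = (DateTime)value;
        // Illustrative policy: assume Unspecified values are already UTC.
        if (dt.Kind == DateTimeKind.Unspecified)
            dt = DateTime.SpecifyKind(dt, DateTimeKind.Utc);
        long ms = (long)(dt.ToUniversalTime() - UnixEpoch).TotalMilliseconds;
        writer.WriteValue(ms);
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        // Deserialization is left out of this sketch.
        throw new NotImplementedException();
    }
}
Registering it would then replace the IsoDateTimeConverter registration:
json.SerializerSettings.Converters.Add(new UnixMillisecondsConverter());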
I think this is what you want:
private static readonly long DatetimeMinTimeTicks = (new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc)).Ticks;
// Ticks are 100 ns, so dividing by 10,000 yields milliseconds since the Unix epoch.
long b = (Calendar1.SelectedDate.ToUniversalTime().Ticks - DatetimeMinTimeTicks) / 10000;
Delphi:
SecondsBetween(StrToDateTime('16/02/2009 11:25:34 p.m.'), StrToDateTime('1/01/2005 12:00:00 a.m.'));
130289133
C#:
TimeSpan span = DateTime.Parse("16/02/2009 11:25:34 p.m.").Subtract(DateTime.Parse("1/01/2005 12:00:00 a.m."));
130289134
It's not consistent either. Some dates will add up the same, e.g.:
TimeSpan span = DateTime.Parse("16/11/2011 11:25:43 p.m.").Subtract(DateTime.Parse("1/01/2005 12:00:00 a.m."));
SecondsBetween(StrToDateTime('16/11/2011 11:25:43 p.m.'), StrToDateTime('1/01/2005 12:00:00 a.m.'));
both give
216905143
The total number of seconds is actually being used to encode data, and I'm trying to port the application to C#, so even a one-second difference throws everything off.
Can anybody explain the disparity? And is there a way to get C# to match Delphi?
Edit: In response to suggestions that it might be leap-second related: both date ranges contain the same number of leap seconds (2), so you would expect a mismatch for both. But instead we're seeing inconsistency:
16/02/2009 - 1/01/2005 = Delphi and C# calculate different total seconds
16/11/2011 - 1/01/2005 = They calculate the same total seconds
The issue seems related to QC 59310; the bug was fixed in Delphi XE.
One possible explanation would be leap seconds; however, .NET does not account for them, as far as I'm aware.
You don't mention how you convert the C# TimeSpan into a number. The TotalSeconds property is a floating-point value; perhaps it's a rounding problem in the double-to-int conversion?
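For instance, truncation and rounding can differ by a second (a guess at the cause, not a confirmed diagnosis; ISO-style date strings are used here to sidestep culture-specific parsing):
TimeSpan span = DateTime.Parse("2009-02-16 23:25:34").Subtract(DateTime.Parse("2005-01-01 00:00:00"));
long truncated = (long)span.TotalSeconds;           // drops any fractional part
long rounded = (long)Math.Round(span.TotalSeconds); // rounds to the nearest second
// If TotalSeconds ever came out as e.g. 130289133.999..., these would differ by one.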