I have a very simple DateTime object that is set to the date 01-01-0001. I am supplied a value, in days, to add to this DateTime. However, I am seeing an unexpected two-day offset in my results. Suppose I print out the result of the AddDays() call, like so:
DateTime myDateTime = DateTime.Parse("01-01-0001 00:00:00");
Console.WriteLine(myDateTime.AddDays(735768.0));
With the value seen above (735768.0) I expect an output of "6/18/2015 12:00:00 AM"; instead I get "6/20/2015 12:00:00 AM". When I go to the following website and calculate the duration in days between 01/01/0001 and 06/18/2015, I get a value of 735,768 days, as expected:
http://www.timeanddate.com/date/durationresult.html?m1=01&d1=01&y1=0001&m2=06&d2=18&y2=2015
Am I doing something wrong, or is there something going on under the hood that I am not aware of?
In case you are wondering, the 735,768 represents the first time value of the data that I am working with. The data is expected to start at 06/18/2015 00:00:00.
Edit: I should note that I merely provided that particular website as an example of a conflicting source. Other websites, including the government weather agency I get the data from, all give me 06/18/2015. This doesn't mean C# is wrong; I am simply curious where this offset came from, and why.
Timeanddate.com is taking into account the calendar change from Julian to Gregorian, which explains the discrepancy.
You can actually see this change occur if you look at the duration between 01/01/1752 and 01/01/1753 on timeanddate.com: the year comes out only 355 days long.
The leap-year rule in the Julian calendar is simply "every year divisible by 4", whereas the Gregorian rule skips century years that are not divisible by 400. This means there are more leap years in timeanddate.com's calculation between year 1 and year 1752, when the calendar changes: 13 extra leap days that .NET's count does not include.
Until 1752, England and the east coast of the United States used the Julian calendar. A consequence of the switch is that 11 days were dropped, making 1752 only 355 days long. .NET takes neither effect into account, and the net result (13 extra Julian leap days minus 11 dropped days) is that .NET's answer lands two days later.
According to this answer by Jon Skeet, DateTime essentially uses the Gregorian calendar exclusively. This is why the above subtleties aren't reflected in .NET's calculations.
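To see the size of that leap-year gap from within .NET itself, here is a minimal sketch (not part of the original answer) that counts leap years under both calendars up to the 1752 changeover:

using System;
using System.Globalization;

class LeapYearGap
{
    static void Main()
    {
        var julian = new JulianCalendar();
        var gregorian = new GregorianCalendar();

        int julianLeaps = 0, gregorianLeaps = 0;
        for (int year = 1; year <= 1752; year++)
        {
            if (julian.IsLeapYear(year)) julianLeaps++;
            if (gregorian.IsLeapYear(year)) gregorianLeaps++;
        }

        // The Julian rule yields 13 extra leap days by 1752; the 1752 switch then
        // dropped 11 days, which leaves the 2-day discrepancy seen in the question.
        Console.WriteLine($"Julian: {julianLeaps}, Gregorian: {gregorianLeaps}, " +
                          $"extra Julian leap days: {julianLeaps - gregorianLeaps}");
    }
}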
.NET is giving you the "correct" answer, noting that "correct" assumes a purely Gregorian calendar, as Andrew Whitaker points out in the comments below and in the answer above. Andrew's answer is more correct.
A leap year can be defined as a year divisible by 4, but not by 100 unless it is also divisible by 400. Following those rules, there have been 488 leap days since 1/1/0001.
Accounting for these leap days, there are 735,598 days from 1/1/0001 through the end of 2014 (2,014 × 365 + 488). That leaves 170 more days to add starting from 1/1/2015, which lands on day 171 of the year: 6/20/2015 (31 + 28 + 31 + 30 + 31 + 20).
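A quick way to sanity-check that arithmetic against .NET itself (a small sketch, not part of the original answer):

int leapDays = 0;
for (int year = 1; year <= 2014; year++)
    if (DateTime.IsLeapYear(year)) leapDays++;

Console.WriteLine(leapDays);                              // 488
Console.WriteLine(2014 * 365 + leapDays);                 // 735598 days through the end of 2014
Console.WriteLine(new DateTime(1, 1, 1).AddDays(735768)); // 6/20/2015 12:00:00 AM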
Also, this is not a rounding issue in .NET, as some have suggested. DateTime.AddDays works in ticks, which are stored in a 64-bit signed integer (long), so no overflow or rounding is occurring:
Ticks per day = 864 billion (8.64 × 10^11)
Ticks in ~2,015 years ≈ 735,768 days × 8.64 × 10^11 ≈ 6.4 × 10^17
Max long = 9,223,372,036,854,775,807 (≈ 9.22 × 10^18)
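Those magnitudes are easy to confirm directly (a quick sketch):

Console.WriteLine(TimeSpan.TicksPerDay);            // 864,000,000,000 ticks per day
Console.WriteLine(new DateTime(2015, 6, 20).Ticks); // roughly 6.4 x 10^17 ticks since 01/01/0001
Console.WriteLine(long.MaxValue);                   // 9,223,372,036,854,775,807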
From the website "It is 735,768 days from the start date to the end date, but not including the end date". This leads me to think, that you actually need to add 735,767 days since the website counts the start date. But this will only explain the one extra day. Perhaps the website is wrong by one day ? They do have a warning.
If you run the code below you will find that there are only 28 days in February for the years
100, 200, ..., 1900.
If you go through the link you will find 29 days in February for the years 100, 200, ...:
http://www.timeanddate.com/calendar/?year=100&country=1
It is better not to compare .NET dates day-for-day with timeanddate.com.
Code:
static void Main(string[] args)
{
    // Under .NET's proleptic Gregorian calendar, century years that are not
    // divisible by 400 are not leap years, so their February has only 28 days.
    for (int i = 1; i <= 2015; i++)
    {
        if (i % 4 == 0)
        {
            if (DateTime.DaysInMonth(i, 2) != 29)
            {
                Console.WriteLine("Days {0} in a month feb Year {1}..", DateTime.DaysInMonth(i, 2), i);
            }
        }
    }
    Console.ReadLine();
}
Output:
Days 28 in a month feb Year 100..
Days 28 in a month feb Year 200..
Days 28 in a month feb Year 300..
Days 28 in a month feb Year 500..
Days 28 in a month feb Year 600..
Days 28 in a month feb Year 700..
Days 28 in a month feb Year 900..
Days 28 in a month feb Year 1000..
Days 28 in a month feb Year 1100..
Days 28 in a month feb Year 1300..
Days 28 in a month feb Year 1400..
Days 28 in a month feb Year 1500..
Days 28 in a month feb Year 1700..
Days 28 in a month feb Year 1800..
Days 28 in a month feb Year 1900..
If it were leap years, it would be off by a lot more than 2 days; it would be 503 days off if that were the case. There are two options in my mind: either A, the online calculator you are using is a little bit off, or B, the math C# uses is inaccurate at that scale. If you do the math yourself you'll see that 735,768 / 365 doesn't come out to a whole number. So my thinking is that the math that goes on under the hood can't stay accurate for that many days. This happens a lot with decimal points; I assume C# is probably truncating the decimals (rounding down), and so you're two days off. My guess anyway.
Related
I have a unit test where the expected result is a DateTime and is set as follows:
var expectedResult = DateTime.Today.AddMonths(3).AddMonths(3);
After which I have a function that adds quarters to a date:
date.AddMonths(3 * numberOfTimes);
numberOfTimes is 2 in this case.
The results differ: today is 31/01/2023, the expected result is 30/07/2023, and the function result is 31/07/2023.
I expected the results to be the same, because adding 6 months should cover the same number of days from the starting date either way. I am curious why this happens. For now I fixed the problem by doing 3 * numberOfTimes in the expectedResult section.
Just out of curiosity, why does this happen?
It is documented:
The AddMonths method calculates the resulting month and year, taking into account leap years and the number of days in a month, then adjusts the day part of the resulting DateTime object. If the resulting day is not a valid day in the resulting month, the last valid day of the resulting month is used. For example, March 31st + 1 month = April 30th, and March 31st - 1 month = February 28 for a non-leap year and February 29 for a leap year.
So you get a different result because when you add 6 months in one step, it only has to be checked whether 31/07/2023 is a valid date, which it is. For DateTime.Today.AddMonths(3).AddMonths(3), it first checks whether 31/04/2023 is valid, which it is not, so 30/04/2023 is returned; then 3 months are added to that.
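A quick illustration of that clamping, using the 31/01/2023 start date from the question:

var start = new DateTime(2023, 1, 31);

Console.WriteLine(start.AddMonths(6));              // 31 July 2023: July has 31 days, nothing is clamped
Console.WriteLine(start.AddMonths(3));              // 30 April 2023: clamped, since April has only 30 days
Console.WriteLine(start.AddMonths(3).AddMonths(3)); // 30 July 2023: the lost day never comes back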
The result can be different because of the way the day component is adjusted by each call. DateTime.AddMonths takes into account the number of days in the resulting month and clamps the day when it does not exist there, so an intermediate result can lose days. For example, adding 3 months to January 31st gives April 30th (April has only 30 days), whereas adding 6 months in a single step gives July 31st.
I want to get the year's week number in Python from a datetime. I have used isocalendar()[1], but it returns a different week than I get in C# using DateTimeFormatInfo.InvariantInfo.Calendar.GetWeekOfYear(Date, CalendarWeekRule.FirstDay, DayOfWeek.Monday).
I have tried:
1) isocalendar()[1]
2) d = datetime.datetime.strptime('2017-09-22 00:00:00.00','%Y-%m-%d %H:%M:%S.%f')
print(datetime.datetime.strftime(d,'%W'))
So for '2017-09-22 00:00:00.00' I get 39 in C# but 38 in Python using the techniques mentioned above.
Any help will be highly appreciated.
According to ISO 8601 (cf. this draft from 2016, section 5.7.7):
A week is defined as a seven-day time interval, starting with a Monday. Week number one of the calendar year is the first week that contains at least four (4) days in that calendar year.
The documentation of Python's date.isocalendar states likewise:
[…] week starts on a Monday and ends on a Sunday. The first week of an ISO year is the first (Gregorian) calendar week of a year containing a Thursday.
According to the documentation of CalendarWeekRule this equals CalendarWeekRule.FirstFourDayWeek:
Indicates that the first week of the year is the first week with four or more days before the designated first day of the week.
Whereas CalendarWeekRule.FirstDay means:
Indicates that the first week of the year starts on the first day of the year and ends before the following designated first day of the week.
In conclusion, the correct way in C# is:
DateTimeFormatInfo.InvariantInfo.Calendar.GetWeekOfYear(new DateTime(2017, 9, 22), CalendarWeekRule.FirstFourDayWeek, DayOfWeek.Monday)
which correctly yields 38.
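To see the two rules side by side in C# (a small sketch adapted from the call in the question):

using System;
using System.Globalization;

Calendar cal = DateTimeFormatInfo.InvariantInfo.Calendar;
var date = new DateTime(2017, 9, 22);

// FirstDay counts the partial week at the start of the year as week 1.
Console.WriteLine(cal.GetWeekOfYear(date, CalendarWeekRule.FirstDay, DayOfWeek.Monday));         // 39
// FirstFourDayWeek agrees with Python's isocalendar() for this date.
Console.WriteLine(cal.GetWeekOfYear(date, CalendarWeekRule.FirstFourDayWeek, DayOfWeek.Monday)); // 38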
I'm in the middle of calculating week numbers for dates, but the System.Globalization.Calendar is returning odd results for (amongst other years) December 31st of year 2007 and 2012.
Calendar calendar = CultureInfo.InvariantCulture.Calendar;
var date = new DateTime(2007, 12, 29);
for (int i = 0; i < 5; i++)
{
    int w = calendar.GetWeekOfYear(date, CalendarWeekRule.FirstFourDayWeek, DayOfWeek.Monday);
    Console.WriteLine("{0}\t{1}", date.ToString("dd.MM.yyyy"), w);
    date = date.AddDays(1);
}
Results
29.12.2007 52
30.12.2007 52
31.12.2007 53 <--
01.01.2008 1
02.01.2008 1
29.12.2012 52
30.12.2012 52
31.12.2012 53 <--
01.01.2013 1
02.01.2013 1
As far as I understand, there shouldn't be a week 53 in year 2007 and 2012, but the days should be included in week 1. Is there a way to change this behaviour in the Calendar?
The documentation for the CalendarWeekRule enumeration specifically states that it "does not map directly to ISO 8601", and links to ISO 8601 Week of Year format in Microsoft .Net, a blog entry that describes the differences.
Have a look at the values of CalendarWeekRule. You are using FirstFourDayWeek, and so you are getting the values you describe. If you want every week to have exactly 7 days, you should use FirstFullWeek.
In your case, that would mean that 31. 12. 2007 will be week 53, but so will 2. 1. 2008.
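A quick check of that claim with the invariant calendar (a sketch, not from the original answer):

Calendar cal = CultureInfo.InvariantCulture.Calendar;

Console.WriteLine(cal.GetWeekOfYear(new DateTime(2007, 12, 31),
                                    CalendarWeekRule.FirstFullWeek, DayOfWeek.Monday)); // 53
Console.WriteLine(cal.GetWeekOfYear(new DateTime(2008, 1, 2),
                                    CalendarWeekRule.FirstFullWeek, DayOfWeek.Monday)); // 53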
Starting with .NET Core 3.0 there is a new ISOWeek class that actually calculates the ISO week of year correctly:
https://learn.microsoft.com/en-us/dotnet/api/system.globalization.isoweek.getweekofyear
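For example (assuming .NET Core 3.0 or later; ISOWeek lives in System.Globalization):

// ISO 8601: 31.12.2007 is a Monday and already belongs to week 1 of ISO year 2008.
Console.WriteLine(ISOWeek.GetWeekOfYear(new DateTime(2007, 12, 31))); // 1
Console.WriteLine(ISOWeek.GetYear(new DateTime(2007, 12, 31)));       // 2008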
There don't have to be 52 weeks in a year for the week identifiers to be unique; you just don't necessarily have 7 days in a particular week.
If this is a problem for you, then add code to handle the edge case.
I need to calculate the date of the first day in a calendar week given a year/week, e.g.
Week 53 in 2009 -> Mon, 28.12.2009
Week 1 in 2010 -> Mon, 04.01.2010
How would you write that code?
PS: In the Gregorian calendar the first week of a year is the first week with 4 days.
If you don't want to do all the calculation yourself, use the System.Globalization.Calendar class. I think you can use the GetDayOfYear() method to get what you need.
Check the MSDN library documentation for an example.
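If you can target .NET Core 3.0 or later, the System.Globalization.ISOWeek class is an alternative to doing the calendar math by hand: ISOWeek.ToDateTime maps a year/week/day combination straight to a date. A minimal sketch reproducing the examples from the question:

Console.WriteLine(ISOWeek.ToDateTime(2009, 53, DayOfWeek.Monday)); // Monday, 28.12.2009
Console.WriteLine(ISOWeek.ToDateTime(2010, 1, DayOfWeek.Monday));  // Monday, 04.01.2010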
Requirements:
Calculate the number of months between two dates: receiveDate and dueDate.
Both optimistic and pessimistic calculations are needed.
Assumptions:
dueDate will always be the last day of the month.
I've already figured out the pessimistic calculation (meaning a single day overdue counts as a whole month):
if (receiveDate > dueDate)
    numberOfMonths = receiveDate.Month - dueDate.Month + (receiveDate.Year - dueDate.Year) * 12;
Doing a search on the internet turned up several similar examples that confirm this.
Now my instinct tells me the optimistic calculation will just be the same minus one month, but for some reason it doesn't feel right. Am I on the right track, or am I missing something?
You're right; if you're looking for the number of complete months between the two dates, subtracting 1 will get you your answer (assuming the receiveDate doesn't fall on the last day of the month, in which case you have a remainder of 0 days either way).
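A hypothetical helper (not from the question) that puts both variants side by side, assuming dueDate is always the last day of its month, as stated in the assumptions:

// Illustrative only: "pessimistic" counts any started month, "optimistic"
// counts only months that have fully elapsed.
static int MonthsOverdue(DateTime receiveDate, DateTime dueDate, bool pessimistic)
{
    if (receiveDate <= dueDate)
        return 0;

    // "First of month to first of month" difference, as in the question.
    int months = receiveDate.Month - dueDate.Month + (receiveDate.Year - dueDate.Year) * 12;

    // Optimistic: since dueDate is a month-end, the last month only counts
    // once receiveDate has reached a month-end as well.
    bool receiveIsMonthEnd = receiveDate.Day == DateTime.DaysInMonth(receiveDate.Year, receiveDate.Month);
    if (!pessimistic && !receiveIsMonthEnd)
        months -= 1;

    return months;
}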
If you don't need to keep track of days of the month in your calculation, I think that's the way to go.
Your formula calculates the number of months between the first of the receiveDate's month and the first of the dueDate's month. As most time elements in TimeSpan are exposed as TotalXXX properties, it seems odd that they left out a TotalMonths and a TotalYears.
I think it's because there isn't a fixed number of days from month to month, so it's hard to know what makes the most sense in terms of how to express the fractional remainder.
My formula is this...
int nMonthDiff_FirstToFirst = DateTime.Now.Month - testDate.Month
                              + ((DateTime.Now.Year - testDate.Year) * 12);
double dMonthDiff = (double)nMonthDiff_FirstToFirst
                    + (DateTime.Now - testDate.AddMonths(nMonthDiff_FirstToFirst)).TotalDays
                      / (double)DateTime.DaysInMonth(DateTime.Now.Year, DateTime.Now.Month);
So what I'm doing is basically getting the month difference like you are (first of the month to first of the month), then projecting testDate into the future by that month difference. From the resulting TimeSpan I take TotalDays and divide it by the number of days in the current month, so the fractional portion is expressed in terms of remaining days of the month.
So if you were going from May 5th 2012 to June 3rd 2012, your formula would return a month difference of 1 even though a full month hasn't passed yet. In my formula, the TotalDays of the projected date yields a small negative fraction of a month, and adding it to the "first to first" month difference pulls the result back below 1, as needed.
I am doing this extra check to get a precise number of months:
numberOfMonths = receiveDate.Month - dueDate.Month + (receiveDate.Year - dueDate.Year) * 12;
if (receiveDate.Day > dueDate.Day) numberOfMonths--;
// TimeSpan has no TotalMonths property, so approximate using the average
// Gregorian month length (365.2425 / 12 ≈ 30.44 days).
int GetMonthsCount(DateTime dtStart, DateTime dtEnd)
{
    return (int)Math.Round((dtEnd - dtStart).TotalDays / 30.436875);
}