Starting from Jan 31, 2000 (a leap year), the following two ways of adding a month give me different results. Am I doing something wrong, or are these two different philosophies of handling leap-year months? And if it is just a philosophy difference, how would you know which method to pick?
Method 1: using LocalDate plusMonths():
LocalDate atestDate = LocalDate.parse("2000-01-31");
System.out.println("One month in future using LocalDate.plusMonths() " + atestDate.plusMonths(1));
Output:
One month in future using LocalDate.plusMonths() 2000-02-29
Method 2: using Calendar:
Calendar zcal = Calendar.getInstance();
zcal.set(Calendar.DAY_OF_MONTH, 31);
zcal.set(Calendar.MONTH, 1);
zcal.set(Calendar.YEAR, 2000);
zcal.add(Calendar.MONTH, 0);
System.out.println("ONE MONTH IN FUTURE using Calendar: "
+ zcal.getTime());
Output:
ONE MONTH IN FUTURE using Calendar: Thu Mar 02 2000
Why don't these two approaches produce the same date?
Thanks.
zcal.set(Calendar.DAY_OF_MONTH, 31);
zcal.set(Calendar.MONTH, 1);
zcal.set(Calendar.YEAR, 2000);
Calendar month values are zero-indexed, so Calendar.MONTH = 1 is February, not January. This code therefore defines February 31st, 2000, which lenient resolution normalizes to March 2nd, 2000.
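Here is a sketch of the Calendar version with the month index fixed (the class and variable names are mine). Using the named constant Calendar.JANUARY avoids the off-by-one, and add() then pins the result to the last valid day of February:

```java
import java.util.Calendar;

public class CalendarMonthAdd {
    public static void main(String[] args) {
        Calendar cal = Calendar.getInstance();
        cal.clear();                          // reset stray time-of-day fields
        cal.set(2000, Calendar.JANUARY, 31);  // year, month (0-based), day
        cal.add(Calendar.MONTH, 1);           // add() clamps Jan 31 -> Feb 29, 2000
        System.out.println(cal.getTime());    // a Date printing Feb 29 2000
    }
}
```

With the month fixed, Calendar and LocalDate agree: both land on 2000-02-29.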
Setting strict:
zcal.setLenient(false);
We get an exception:
Exception in thread "main" java.lang.IllegalArgumentException: MONTH: 1 -> 2
at java.util.GregorianCalendar.computeTime(GregorianCalendar.java:2829)
at java.util.Calendar.updateTime(Calendar.java:3393)
at java.util.Calendar.getTimeInMillis(Calendar.java:1782)
at java.util.Calendar.getTime(Calendar.java:1755)
at Employee.Tester.main(Tester.java:19)
So you can see that the lenient resolution rolled the date over into March. If you are hard-coding dates for testing purposes, I would recommend strict validation to catch these cases early. Setting Calendar fields individually has other documented pitfalls as well, which can make things messy.
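For comparison, the java.time API validates eagerly: an impossible date throws a DateTimeException instead of silently rolling into the next month, while plusMonths() clamps to the last valid day. A small sketch (the class name is mine):

```java
import java.time.DateTimeException;
import java.time.LocalDate;

public class StrictDates {
    public static void main(String[] args) {
        // LocalDate months are 1-based; Feb 31 is rejected outright.
        try {
            LocalDate bad = LocalDate.of(2000, 2, 31);
            System.out.println(bad);
        } catch (DateTimeException e) {
            System.out.println("Rejected: " + e.getMessage());
        }
        // plusMonths() clamps overflow to the month's last day.
        System.out.println(LocalDate.parse("2000-01-31").plusMonths(1)); // 2000-02-29
    }
}
```

This is the same end-of-month policy LocalDate used in Method 1, so if clamping is the behavior you want, java.time gives it to you without any lenient/strict configuration.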