Final answer:
A program to determine whether a year is a leap year in the Gregorian Calendar would check whether the year is divisible by 4, and for century years it would additionally check whether they are divisible by 400. This keeps the calendar year synchronized with the Earth's orbit around the Sun over long periods.
Step-by-step explanation:
The Gregorian Calendar, introduced by Pope Gregory XIII in 1582, corrected inaccuracies in the Julian calendar by adjusting the rules for determining leap years. The key change was to exclude century years from being leap years unless they are divisible by 400, giving a calendar year that averages 365.2425 days, which closely matches the tropical year of approximately 365.2422 days. Leap years are needed because the Earth's orbit around the Sun takes approximately 365.2422 days rather than a neat 365; without correction, the calendar year would drift away from the astronomical events it is meant to track, such as the equinoxes and solstices.
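To see where the 365.2425 figure comes from, consider one full 400-year cycle of the rule: 100 of those years are divisible by 4, but three of the four century years (for example 1700, 1800, and 1900) are excluded, while the one divisible by 400 (for example 2000) is kept, leaving 97 leap years. The average year length is therefore (400 × 365 + 97) / 400 = 365.2425 days.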
To write a program to determine whether a year is a leap year, one would need to check the following: if the year is divisible by 4, it's a leap year unless it is a century year not divisible by 400. Here is a simple pseudo-code example:
if (year % 4 == 0) {
    if (year % 100 == 0) {
        // Century year: a leap year only if also divisible by 400 (e.g. 2000, but not 1900)
        if (year % 400 == 0) {
            isLeapYear = true;
        } else {
            isLeapYear = false;
        }
    } else {
        // Divisible by 4 and not a century year: a leap year
        isLeapYear = true;
    }
} else {
    // Not divisible by 4: not a leap year
    isLeapYear = false;
}
Using this logic, the program can accurately determine whether a given year is a leap year and print the appropriate output, as requested in the question.
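To show how this logic translates into a complete program that prints the result, here is a minimal runnable sketch in Java; the class name LeapYearChecker and the hard-coded sample year are assumptions made for illustration, since the question does not specify how the year is supplied.

public class LeapYearChecker {

    // Returns true if the given year is a leap year in the Gregorian Calendar.
    static boolean isLeapYear(int year) {
        // Divisible by 4, and not a century year unless also divisible by 400.
        return (year % 4 == 0) && (year % 100 != 0 || year % 400 == 0);
    }

    public static void main(String[] args) {
        int year = 1900; // sample input; 1900 is a century year not divisible by 400
        if (isLeapYear(year)) {
            System.out.println(year + " is a leap year.");
        } else {
            System.out.println(year + " is not a leap year.");
        }
    }
}

With this sample input the program prints "1900 is not a leap year."; changing the year to 2000 or 2024 would print the leap-year message instead.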