The other evening we were discussing how likely it is that there will be major problems on February 29, 2000, when some leap-year calculations might fail. I pointed out that simple-minded programs that just assume that every fourth year is a leap year will muddle through, because 2000, being divisible by 400, really is a leap year; to fail requires a program clever enough to know that century years are normally not leap years, but not clever enough to know the divisible-by-400 exception.
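To make the failure mode concrete, here is a minimal sketch in C (the function names are mine; none of the systems discussed were written in C):

    #include <stdbool.h>

    /* The full Gregorian rule: a leap year is any year divisible
       by 4, except that century years must also be divisible by
       400.  Thus 2000 is a leap year but 1900 and 2100 are not. */
    bool is_leap(int year)
    {
        return year % 4 == 0 && (year % 100 != 0 || year % 400 == 0);
    }

    /* The half-clever version: it knows that century years are
       not leap years, but not the divisible-by-400 exception, so
       it wrongly declares 2000 a common year.  The simple-minded
       version, with the second test omitted entirely, gets 2000
       right. */
    bool is_leap_half_clever(int year)
    {
        return year % 4 == 0 && year % 100 != 0;
    }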
That observation later reminded me of one of the (more obscure) design decisions that we made for the Multics Calendar Clock. We specified that clock to measure the number of microseconds since 0000 GMT, January 1, 1901. The reason for choosing that date, rather than January 1, 1900, was that 1900 was a century year, so it did not have an extra day in February.
If we had chosen 1900, then every program that converted calendar clock readings to display dates would have had to include provision for century years, and thus perhaps have been subject to the bug of not correctly handling years that are divisible by 400. With 1901 as the starting point, a simple-minded program that just assumes that every fourth year is a leap year will work correctly until the year 2100. Since the (52-bit) clock itself would wrap around to zero in 2043--2^52 microseconds is about 143 years--that seemed sufficient provision at the time (1967).
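As a sketch of how little such a conversion program needs (assuming a reading held in 64 bits; the code and names here are mine, not Multics's):

    #include <stdint.h>

    static const int mdays[12] =
        {31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31};

    /* Convert microseconds since 0000 GMT, January 1, 1901, to a
       calendar date, assuming every fourth year is a leap year.
       Every fourth year from 1904 on really is one until 2100
       (2000, though a century year, is divisible by 400), so the
       result is exact through February 28, 2100. */
    void naive_date(uint64_t usecs, int *y, int *m, int *d)
    {
        uint64_t days = usecs / 86400000000ULL;  /* usecs per day */
        int year = 1901;
        for (;;) {
            int ydays = (year % 4 == 0) ? 366 : 365;
            if (days < (uint64_t)ydays)
                break;
            days -= ydays;
            year++;
        }
        int leap = (year % 4 == 0);
        for (int i = 0; ; i++) {
            int len = mdays[i] + (leap && i == 1);
            if (days < (uint64_t)len) {
                *y = year;
                *m = i + 1;
                *d = (int)days + 1;
                return;
            }
            days -= len;
        }
    }

Pointed at a 1900 epoch, the same code would invent a February 29, 1900, and be a day off from March 1, 1900 onward.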
Art Evans, on hearing of this choice, responded "Yes, but why not do it right?" and proceeded to lay out a specification for a conversion program that handled century years, years divisible by 400, years divisible by 4000, time zones from Alaska to Zanzibar--the whole nine yards. His specification was the subject of a bit of ridicule at the time, and I don't recall what actually got implemented. But in any case the starting date of the Multics clock was intentionally chosen to allow for simple conversion and to minimize the risk of mistakes.
IBM, in contrast, two or three years later chose January 1, 1900, as the starting date for the time-of-day (TOD) clock in the System/370. As a result, date conversion programs for the 370 and its follow-ons must include some century-year adjustment, and are therefore at risk of omitting, or getting wrong, the divisible-by-400 adjustment that applies to the year 2000.
A fair number of people who encountered the IBM clock specification found another way to recover some of that conversion simplicity: they just add one day's worth of seconds to the clock reading and then do a conversion that ignores century years. This stratagem is sometimes described as "starting the clock on January 0, 1900".
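A sketch of the stratagem (working in seconds rather than in the 370 clock's actual format; the function name is my own):

    #include <stdint.h>

    #define SECS_PER_DAY 86400ULL

    /* A converter that assumes every fourth year is a leap year
       invents a phantom February 29, 1900, making its answers one
       day early for any date from March 1, 1900 onward.  Adding
       one day's worth of seconds to the reading cancels the
       phantom day, so the naive conversion is exact from
       March 1, 1900 through February 28, 2100--as if the clock
       had started on "January 0, 1900". */
    uint64_t january_zero_adjust(uint64_t secs_since_1900)
    {
        return secs_since_1900 + SECS_PER_DAY;
    }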
01/05/00