Discussion Main / MillenniumBug

Think-Tank Since: Apr, 2022
Apr 22nd 2022 at 4:46:55 AM •••

Another nit-pick!

Despite being called the Millennium Bug (later called Y2K), the "problem" was expected to arise on 1st Jan 2000.

The Millennium, however, didn't start until Jan 1st 2001!

So it was actually the Millennium-1 Bug, which happened in C20, not C21.

I know - picky huh?

But accurate :)

Edited by Think-Tank
SeptimusHeap MOD (Edited uphill both ways)
Mar 20th 2021 at 11:45:08 AM •••

Previous Trope Repair Shop thread: Needs Help, started by abloke on Sep 9th 2012 at 5:12:01 PM

"For a successful technology, reality must take precedence over public relations, for Nature cannot be fooled." - Richard Feynman
johnnye Since: Jan, 2001
Jul 18th 2013 at 2:59:51 AM •••

"So, typically all dates were stored internally as 6 digits (and punctuation was added at display time), so November 27, 1960 was coded as 112760. Now, a month later you can get by adding 1 to the first two digits. The new date is later than the original one. Now, however, say you have a date of November 15, 1992 (111592) and you add eight years to it, you get 111500,"

I don't know how these things were programmed, but if 111592 was stored as a single integer and you added +8, wouldn't you get 111600?

Edited by 70.33.253.43
johnboy3 Since: Mar, 2011
Oct 17th 2013 at 2:48:14 PM •••

No, most databases stored numbers as text to prevent conversion problems (endianness), not to mention the differing interpretations of floats and their precision.
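
For what it's worth, here's a minimal C sketch of the arrangement the quoted passage describes, assuming (as the reply above says) that the date is held as six characters of text and that the eight years are added to the two-digit year field rather than to the whole six-digit number as one integer. The layout and the helper name are invented for the example, not taken from any real system.

    /* Purely illustrative sketch (no real system implied): the date is
     * kept as six MMDDYY characters, and "add eight years" touches only
     * the two-digit year field, not the whole six-digit number. */
    #include <stdio.h>
    #include <string.h>

    static void add_years(char date[7], int years)
    {
        int yy = (date[4] - '0') * 10 + (date[5] - '0'); /* last two digits */
        yy = (yy + years) % 100;                         /* 92 + 8 wraps to 0 */
        date[4] = (char)('0' + yy / 10);
        date[5] = (char)('0' + yy % 10);
    }

    int main(void)
    {
        char date[7] = "111592";   /* November 15, 1992 */
        add_years(date, 8);
        printf("%s\n", date);      /* prints 111500 */

        /* Compared as text, the "later" date now sorts before the original. */
        printf("%d\n", strcmp(date, "111592") < 0);   /* prints 1 */
        return 0;
    }

Under that assumption, 92 + 8 wraps to 00, the result reads 111500, and a plain text comparison puts the "new" date before the old one, which is exactly the failure the quote describes. Adding 8 to the whole value as a single integer would indeed give 111600 instead.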

Mhoram Since: Nov, 2012
Apr 14th 2013 at 6:21:56 PM •••

This may be too much of a nit-pick to put on the page, but: the reason programmers used two-digit years wasn't really to save space. Yes, space was at a premium, but not that much of a premium. A programmer desperate for space and using a lot of dates could pack a date into fewer than 6 bytes by encoding it in various ways. Unix, for instance, uses 4 bytes (two less than DDMMYY) to represent the date and time as the number of seconds since Jan 1, 1970. Other methods were used as well that saved more than dropping two digits.
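
Purely to illustrate that size comparison, here's a tiny C snippet; the particular values and the DDMMYY layout are made up for the example, not taken from any specific system:

    /* Illustration only: a 32-bit count of seconds since Jan 1, 1970
     * (the Unix convention mentioned above) carries a full date and time
     * in 4 bytes, while a DDMMYY character field needs 6 bytes and still
     * drops the century. */
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        int32_t unix_seconds = 694224000;                 /* Jan 1, 1992, 00:00:00 UTC */
        char    ddmmyy[6]    = {'0','1','0','1','9','2'}; /* 01 01 92 */

        printf("packed timestamp: %d bytes\n", (int)sizeof(unix_seconds)); /* 4 */
        printf("DDMMYY text:      %d bytes\n", (int)sizeof(ddmmyy));       /* 6 */
        return 0;
    }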

The real reason people programmed in two-digit years is that, until the early 1990s or so, they thought in two-digit years. Checkbooks came with the '19' already printed on the checks so all you had to write was the final two digits. People said they were from the class of '85, and didn't pronounce the single quote. Also, things moved so fast in the computer software world that you were lucky if anyone was still using your program after a year, let alone a couple decades later.

We didn't think about it and leave it out to save space; we just didn't think about it.

PrimeEvil Since: Nov, 2010
Apr 20th 2012 at 7:06:51 PM •••

It's funny...this trope is the main reason why I always say that I "missed" the Year 2000 celebrations, even though I was in seventh grade at the time. Truth be told, I can't remember very much of that New Year's Eve, except that it all came and went with something of a yawn. Too bad, really, 'cause that should have felt like a milestone.
