The Millennium Muddle

How to put the years where they belong.

Early in the 1970s, mainframe computers at many banks, securities firms and insurance companies began to generate unexpected errors when they performed calculations involving dates that spanned December 31, 1999. Both the managers and the data-entry clerks were baffled by the errors; not so the computer programmers. They knew what caused the problem, and they also knew it would grow more serious as the year 2000 approached. Because years had been recorded in the computers' memory as two digits instead of four--97, for example, instead of 1997--the computers would have increasing difficulty, as the year 2000 drew closer, determining in which century to place a two-digit year.

But for the most part those who understood the problem kept silent. After all, they figured, why worry management now? They were sure that somehow, someone, somewhere would find a way to solve the problem before the clock tolled midnight on December 31, 1999. They were wrong. As it turned out, the odds of finding a quick, easy solution today are no better than the likelihood that the year 2000 will never come. In fact, many of those who now labor over the problem wish the year 2000 would never come--or at least not quite so soon. As 1997 ends, with only two years before 2000, it has become clear that, while fixes surely are available, in many cases they are not going to be easy, fast or cheap--and, in some cases, they probably won't resolve the issue entirely.

UNTANGLING THE PUZZLE

The problem goes by the shorthand name Y2K (for "year two thousand"); however, an increasing number of those struggling to untangle the puzzle describe it in language that can't be repeated here. As it turns out, Y2K affects not only large computers and their software; it's poised to foul up personal computers (PCs) and any software application programmed with a two-digit year field rather than a four-digit field. That includes accounting software, computer operating systems, programs that run VCRs, time-controlled vaults and hundreds, if not thousands, of other pieces of date-dependent electronic equipment.

What's behind the problem? In the early days of computers, when hard-disk memory storage was expensive, programmers were cautioned to conserve memory space. So, instead of creating a four-space field in an application program where a year was to be inserted, they economized with just two--after all, they figured, 2000 was in the next century. (The short sketch at the end of this section shows how that economy trips up simple date arithmetic.) Two digits may sound like an insignificant savings, but when they are saved in hundreds of millions of data fields, the savings add up to a significant sum. In retrospect, however, the savings may not look so significant when weighed against the expected cost of now inserting those two missing digits into both application software and billions of data fields.

Not everyone agrees, though, that economizing then was wrong. Two professors writing in a recent issue of the Journal of Systems Management calculated that, over the 30-year period when two-digit years were the norm, a typical organization saved over $1 million per gigabyte of total data storage. And, they added, if that savings had been invested wisely during the period, it could have produced a fifteenfold return--more than enough, they speculate, to pay for the remedy today. That reasoning, however, doesn't satisfy many enterprises facing the daunting task of fixing their software.
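To make the mechanics concrete, here is a minimal sketch, written in Python rather than the COBOL of the era; the function names and the loan scenario are illustrative assumptions, not examples drawn from the article. With only two digits stored, an interval that crosses the century comes out negative, and records sorted on the year field place 2000 ahead of 1997.

```python
# Illustrative sketch (not from the article): why two-digit year fields
# misbehave once dates cross from 1999 into 2000.

def years_elapsed_two_digit(start_yy, end_yy):
    """Elapsed years computed the 'economical' way, from two-digit years."""
    return end_yy - start_yy

def years_elapsed_four_digit(start_yyyy, end_yyyy):
    """Elapsed years computed from full four-digit years."""
    return end_yyyy - start_yyyy

# A hypothetical loan issued in 1997 and maturing in 2003:
print(years_elapsed_two_digit(97, 3))        # -94, a nonsense term
print(years_elapsed_four_digit(1997, 2003))  #   6, the correct term

# Sorting records on a two-digit year field puts 2000 before 1997:
print(sorted(["97", "99", "00", "03"]))      # ['00', '03', '97', '99']
```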
THE ULTIMATE PRICE

In reality, no one is really sure how much the fix will cost. The Office of Management and Budget estimates the federal government itself will spend $3.8 billion. Commercial banks forecast their price tag at $9.3 billion. J. P. Morgan, the investment bank, came up with a worldwide estimate of $200 billion, and the Gartner Group, a think tank that does computer consulting, upped that estimate, reporting a worst-case worldwide cost of $600 billion. …