Year 2000 problem

"Y2K" redirects here. For other uses, see Y2K (disambiguation).
An electronic sign displaying the year incorrectly as 1900 on 3 January 2000 in France

The Year 2000 problem, also known as the Y2K problem, the Millennium bug, the Y2K bug, or simply Y2K, arose because many programmers represented four-digit years with only the final two digits, making the year 2000 indistinguishable from 1900. The assumption of a twentieth-century date in such programs caused various errors, such as the incorrect display of dates and the inaccurate ordering of automated dated records or real-time events.

In 1997, the British Standards Institution (BSI) developed a standard, DISC PD2000-1,[1] which defines "Year 2000 Conformity requirements" as four rules:

  1. No valid date will cause any interruption in operations.
  2. Calculation of durations between, or the sequence of, pairs of dates will be correct whether or not the dates are in different centuries.
  3. In all interfaces and in all storage, the century must be unambiguous, either specified or calculable by algorithm.
  4. Year 2000 must be recognised as a leap year.

The standard identifies two problems that may exist in many computer programs.

First, the practice of representing the year with two digits became problematic because of logical errors arising upon "rollover" from x99 to x00. This caused some date-related processing to operate incorrectly for dates and times on and after 1 January 2000, and on other critical dates that were billed "event horizons". Without corrective action, long-working systems would break down when the "... 97, 98, 99, 00 ..." ascending numbering assumption suddenly became invalid.

Secondly, some programmers misunderstood the Gregorian calendar rule for century years and assumed that the year 2000 would not be a leap year. In fact, years divisible by 100 are not leap years unless they are also divisible by 400; thus the year 2000 was a leap year.

Companies and organisations worldwide checked, fixed, and upgraded their computer systems.[2]

The number of computer failures that occurred when the clocks rolled over into 2000, in spite of remedial work, is not known, partly because of the reluctance of organisations to report problems.[3]

Background

Y2K is a numeronym and was the common abbreviation for the year 2000 software problem. The abbreviation combines the letter Y for "year", and k for the SI unit prefix kilo meaning 1000; hence, 2K signifies 2000. It was also named the Millennium Bug because it was associated with the popular (rather than literal) roll-over of the millennium, even though the problem could have occurred at the end of any ordinary century.

The Year 2000 problem was the subject of the early book, Computers in Crisis by Jerome and Marilyn Murray (Petrocelli, 1984; reissued by McGraw-Hill under the title The Year 2000 Computing Crisis in 1996). The first recorded mention of the Year 2000 Problem on a Usenet newsgroup occurred on Friday, 18 January 1985, by Usenet poster Spencer Bolles.[4]

The acronym Y2K has been attributed to David Eddy, a Massachusetts programmer,[5] in an e-mail sent on 12 June 1995. He later said, "People were calling it CDC (Century Date Change), FADL (Faulty Date Logic) and other names."

The problem arose because, on both mainframe computers and later personal computers, storage was expensive, ranging from as little as US$10 per kilobyte to, in many cases, US$100 per kilobyte or more.[6] It was therefore very important for programmers to minimise storage use. Since programs could simply prefix "19" to the year of a date, most programs internally used, or stored on disc or tape, data files in which the date occupied six digits in the form MMDDYY: two digits each for the month, day, and year. As space on disc and tape was also expensive, this saved money by reducing the size of stored data files and databases.

Many computer programs stored years with only two decimal digits; for example, 1980 was stored as 80. Some such programs could not distinguish between the year 2000 and the year 1900. Other programs tried to represent the year 2000 as 19100. This could cause complete failures or incorrect results from date comparisons. Some embedded systems, which used similar date logic, were expected to fail and take down utilities and other crucial infrastructure.
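A minimal sketch (in Python, not drawn from any particular legacy system) of how two-digit year arithmetic and naive "19" prefixing fail at the century boundary; the function and variable names are illustrative:

    # Two-digit year arithmetic: in 2000 (stored as 00) a person born in
    # 1960 (stored as 60) appears to be -60 years old.
    def age_in_years(birth_yy, current_yy):
        return current_yy - birth_yy

    print(age_in_years(60, 99))   # 39  (correct in 1999)
    print(age_in_years(60, 0))    # -60 (wrong in 2000)

    # Programs that stored "years since 1900" and displayed the year by
    # prefixing the literal string "19" produced output such as "19100".
    years_since_1900 = 2000 - 1900          # 100
    print("19" + str(years_since_1900))     # prints 19100 instead of 2000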

Some warnings of what would happen if nothing was done were particularly dire:

"The Y2K problem is the electronic equivalent of the El Niño and there will be nasty surprises around the globe."

John Hamre, United States Deputy Secretary of Defense[7]

Special committees were set up by governments to monitor remedial work and contingency planning, particularly by crucial infrastructures such as telecommunications and utilities, to ensure that the most critical services had fixed their own problems and were prepared for problems with others. While some commentators and experts argued that the coverage of the problem largely amounted to scaremongering,[8] it was only the safe passing of the main "event horizon" itself, 1 January 2000, that fully quelled public fears. Some of the experts who argued that scaremongering was occurring, such as Ross Anderson, Professor of Security Engineering at the University of Cambridge Computer Laboratory, have since claimed that they sent out hundreds of press releases about research results suggesting the problem was not likely to be as serious as had been suggested, but were largely ignored by the media.[8]

Programming problem

The practice of using two-digit dates for convenience predates computers, but was never a problem until stored dates were used in calculations.

The need for bit conservation

"I'm one of the culprits who created this problem. I used to write those programs back in the 1960s and 1970s, and was proud of the fact that I was able to squeeze a few elements of space out of my program by not having to put a 19 before the year. Back then, it was very important. We used to spend a lot of time running through various mathematical exercises before we started to write our programs so that they could be very clearly delimited with respect to space and the use of capacity. It never entered our minds that those programs would have lasted for more than a few years. As a consequence, they are very poorly documented. If I were to go back and look at some of the programs I wrote 30 years ago, I would have one terribly difficult time working my way through step-by-step."

Alan Greenspan, 1998[9]

In the first half of the 20th century, well before the computer era, business data processing was done using unit record equipment and punched cards, most commonly the 80-column variety employed by IBM, which dominated the industry. Many tricks were used to squeeze needed data into fixed-field 80-character records. Saving two digits for every date field was significant in this effort.

In the 1960s, computer memory and mass storage were scarce and expensive. Early core memory cost one dollar per bit. Popular commercial computers, such as the IBM 1401, shipped with as little as 2 kilobytes of memory. Programs often mimicked card processing techniques. Commercial programming languages of the time, such as COBOL and RPG, processed numbers in their character representations. Over time the punched cards were converted to magnetic tape and then disc files, but the structure of the data usually changed very little. Data was still input using punched cards until the mid-1970s. Machine architectures, programming languages and application designs were evolving rapidly. Neither managers nor programmers of that time expected their programs to remain in use for many decades. The realisation that databases were a new type of program with different characteristics had not yet come.

There were exceptions, of course. The first person known to publicly address the issue was Bob Bemer, who had noticed it in 1958 as a result of work on genealogical software. He spent the next twenty years trying to make programmers, IBM, the government of the United States and the ISO aware of the problem, with little result. This included the recommendation that the COBOL PICTURE clause should be used to specify four-digit years for dates.[10] Despite magazine articles on the subject from 1970 onward, the majority of programmers and managers only started recognising Y2K as a looming problem in the mid-1990s, and even then inertia and complacency left it mostly unresolved until the last few years of the decade. In 1989, Erik Naggum was instrumental in ensuring that internet mail used four-digit representations of years, by including a strong recommendation to this effect in the internet host requirements document RFC 1123.[11]

The desire to save space on stored dates persisted into the Unix era, with most systems representing dates in a single 32-bit word, typically as the number of seconds elapsed since some fixed epoch.

Resulting bugs from date programming

Webpage screenshots showing the JavaScript .getYear() method problem, one manifestation of the Year 2000 problem
An Apple Lisa does not accept the date

Storage of a combined date and time within a fixed binary field is often considered a solution, but the possibility for software to misinterpret dates remains because such date and time representations must be relative to some known origin. Rollover of such systems is still a problem but can happen at varying dates and can fail in various ways, as in the examples below.

Date bugs similar to Y2K

4 January 1975

This date overflowed the 12-bit field that had been used in the DECsystem-10 operating systems. There were numerous problems and crashes related to this bug while an alternative format was developed.[16]
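A brief sketch of how such a packed 12-bit date overflows. The encoding ((year - 1964) * 12 + (month - 1)) * 31 + (day - 1) used here is the commonly cited DEC scheme and is assumed for illustration rather than quoted from the reference above:

    # Pack a calendar date into a single small integer (assumed encoding,
    # shown only to illustrate the 12-bit overflow).
    def packed_date(year, month, day):
        return ((year - 1964) * 12 + (month - 1)) * 31 + (day - 1)

    print(packed_date(1975, 1, 4))   # 4095, the largest value a 12-bit field can hold
    print(packed_date(1975, 1, 5))   # 4096, which no longer fits in 12 bits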

9 September 1999

Even before 1 January 2000 arrived, there were also some worries about 9 September 1999 (albeit less than those generated by Y2K). Because this date could also be written in the numeric format 9/9/99, it could have conflicted with the date value 9999, frequently used to specify an unknown date. It was thus possible that database programs might act on the records containing unknown dates on that day. Data entry operators commonly entered 9999 into required fields for an unknown future date, (e.g., a termination date for cable television or telephone service), in order to process computer forms using CICS software.[17] Somewhat similar to this is the end-of-file code 9999, used in older programming languages. While fears arose that some programs might unexpectedly terminate on that date, the bug was more likely to confuse computer operators than machines.
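A hypothetical sketch of the sentinel-value concern: once 9 September 1999 arrives, records that genuinely carry that date become indistinguishable from records flagged with the "unknown date" sentinel. The field name and MMDDYY layout are illustrative assumptions:

    # Sentinel convention assumed for illustration: 9/9/99 stored as MMDDYY.
    UNKNOWN_DATE = "090999"

    def has_real_end_date(record):
        # A record whose service really ends on 9 September 1999 is
        # wrongly treated as having no termination date at all.
        return record["end_date"] != UNKNOWN_DATE

    print(has_real_end_date({"end_date": "090999"}))  # False, even for a genuine 9 Sep 1999 date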

Leap years

Main article: Zeller's congruence

Normally, a year is a leap year if it is evenly divisible by four. A year divisible by 100, however, is not a leap year in the Gregorian calendar unless it is also divisible by 400. For example, 1600 was a leap year, but 1700, 1800 and 1900 were not. Some programs may have relied on the oversimplified rule that a year divisible by four is a leap year. This method works fine for the year 2000 (because it is a leap year), and will not become a problem until 2100, by which time older legacy programs will likely have long since been replaced. Other programs contained incorrect leap year logic, assuming for instance that no year divisible by 100 could be a leap year. An assessment of this leap year problem, including a number of real-life code fragments, appeared in 1998.[18] For information on why century years are treated differently, see Gregorian calendar.
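A short sketch contrasting the full Gregorian rule with the two faulty shortcuts described above (the function names are illustrative):

    def is_leap(year):
        # Full Gregorian rule: divisible by 4, except century years,
        # except century years divisible by 400.
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    def is_leap_oversimplified(year):
        # "Divisible by four" shortcut: happens to be right for 2000, wrong for 2100.
        return year % 4 == 0

    def is_leap_overcorrected(year):
        # Faulty rule that forgets the 400-year exception: wrong for 2000.
        return year % 4 == 0 and year % 100 != 0

    for y in (1900, 2000, 2100):
        print(y, is_leap(y), is_leap_oversimplified(y), is_leap_overcorrected(y))
    # 1900 False True False
    # 2000 True  True False
    # 2100 False True False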

Year 2010 problem

Some systems had problems once the year rolled over to 2010. This was dubbed by some in the media as the "Y2K+10" or "Y2.01K" problem.[19]

The main source of problems was confusion between hexadecimal and binary-coded decimal (BCD) encodings of numbers. Both hexadecimal and BCD encode the numbers 0–9 as 0x0–0x9. But BCD encodes the number 10 as 0x10, whereas hexadecimal encodes the number 10 as 0x0A; 0x10 interpreted as a hexadecimal encoding represents the number 16.
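A minimal sketch of this confusion, assuming two-digit years packed one decimal digit per nibble; the helper names are illustrative:

    def to_bcd(n):
        # Pack a two-digit number into one byte, one decimal digit per nibble.
        return ((n // 10) << 4) | (n % 10)

    def from_bcd(b):
        # Correct decoding: read each nibble as a decimal digit.
        return (b >> 4) * 10 + (b & 0x0F)

    year_byte = to_bcd(10)        # the year 2010 stored as the BCD byte 0x10
    print(hex(year_byte))         # 0x10
    print(from_bcd(year_byte))    # 10 -> 2010 when decoded as BCD
    print(year_byte)              # 16 -> "2016" when misread as a plain binary value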

For example, because the SMS protocol uses BCD for dates, some mobile phone software incorrectly reported the dates of SMS messages as 2016 instead of 2010. Windows Mobile was the first software reported to have been affected by this glitch; in some cases WM6 changed the date of any incoming SMS message sent after 1 January 2010 from the year 2010 to 2016.[20][21]

Other systems affected include EFTPOS terminals,[22] and the PlayStation 3 (except the Slim model).[23]

The most important occurrences of such a glitch were in Germany, where upwards of 20 million bank cards became unusable, and with Citibank Belgium, whose digipass customer identification chips failed.[24]

Year 2038 problem

Main article: Year 2038 problem

The original Unix time datatype (time_t) stores a date and time as a signed long integer (on 32-bit systems, a 32-bit integer) representing the number of seconds since 1 January 1970. During and after 2038, this number will exceed 2^31 - 1, the largest number representable by a signed long integer on 32-bit systems, causing the Year 2038 problem (also known as the Unix Millennium bug or Y2K38). As a long integer on 64-bit systems uses 64 bits, the problem does not realistically exist on 64-bit systems that use the LP64 model.
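A short sketch of where that limit falls, using Python's datetime purely for illustration:

    from datetime import datetime, timedelta, timezone

    EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
    MAX_INT32 = 2**31 - 1

    # The last second a signed 32-bit time_t can represent ...
    print(EPOCH + timedelta(seconds=MAX_INT32))    # 2038-01-19 03:14:07+00:00
    # ... after which the counter wraps to -2**31, i.e. a date in 1901.
    print(EPOCH + timedelta(seconds=-(2**31)))     # 1901-12-13 20:45:52+00:00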

Programming solutions

Several very different approaches were used to solve the Year 2000 problem in legacy systems. Three of them follow:

Date expansion
Two-digit years were expanded to include the century (becoming four-digit years) in programs, files, and databases. This was considered the "purest" solution, resulting in unambiguous dates that are permanent and easy to maintain. However, this method was costly, requiring massive testing and conversion efforts, and usually affecting entire systems.
Date re-partitioning
In legacy databases whose size could not be economically changed, six-digit year/month/day codes were converted to three-digit years (with 1999 represented as 099 and 2001 represented as 101, etc.) and three-digit days (ordinal date in year). Only input and output instructions for the date fields had to be modified, but most other date operations and whole record operations required no change. This delays the eventual roll-over problem to the end of the year 2899.
Windowing
Two-digit years were retained, and programs determined the century value only when needed for particular functions, such as date comparisons and calculations. (The century "window" refers to the 100-year period to which a date belongs). This technique, which required installing small patches of code into programs, was simpler to test and implement than date expansion, thus much less costly. While not a permanent solution, windowing fixes were usually designed to work for several decades. This was thought acceptable, as older legacy systems tend to eventually get replaced by newer technology.[25]
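A minimal sketch of the windowing technique, assuming an illustrative pivot year of 50 so that stored two-digit years map onto the window 1950–2049:

    PIVOT = 50  # assumed pivot; real systems chose a value suited to their data

    def expand_year(yy):
        # Map a stored two-digit year onto a fixed 100-year window.
        return 2000 + yy if yy < PIVOT else 1900 + yy

    print(expand_year(99))   # 1999
    print(expand_year(0))    # 2000
    print(expand_year(49))   # 2049
    print(expand_year(50))   # 1950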

Documented errors

Before 2000

On 1 January 2000

When 1 January 2000 arrived, the problems that occurred were generally regarded as minor. Consequences did not always appear precisely at midnight; some programs were not active at that moment, and their problems only showed up when they were invoked. Not all recorded problems were directly caused by Y2K programming errors, since minor technological glitches occur on a regular basis. Some glitches caused erroneous results, some caused machines to stop working, some caused date errors, and two caused malfunctions.

Reported problems include:

On 1 March 2000

Problems were reported but these were mostly minor.[32]

On 31 December 2000 or 1 January 2001

Some software did not correctly recognise 2000 as a leap year, and so worked on the basis of the year having 365 days. On the last day of 2000 (day 366) these systems exhibited various errors. These were generally minor, apart from reports of some Norwegian trains that were delayed until their clocks were put back by a month.[34]

Government responses

Bulgaria

Although only two digits are allocated for the birth year in the Bulgarian national identification number, the year 1900 problem and subsequently the Y2K problem were addressed by the use of unused values above 12 in the month range. For all persons born before 1900, the month is stored as the calendar month plus 20, and for all persons born after 1999, the month is stored as the calendar month plus 40.[35]
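A sketch of decoding that month-offset scheme; the function name and the assumption that the offsets cover the ranges 1800–1899, 1900–1999 and 2000–2099 are illustrative:

    def decode_birth_year(yy, encoded_month):
        # Encoded month 01-12: born 1900-1999; 21-32: born 1800-1899;
        # 41-52: born 2000-2099 (month offsets of +20 and +40).
        if 1 <= encoded_month <= 12:
            return 1900 + yy, encoded_month
        if 21 <= encoded_month <= 32:
            return 1800 + yy, encoded_month - 20
        if 41 <= encoded_month <= 52:
            return 2000 + yy, encoded_month - 40
        raise ValueError("invalid encoded month")

    print(decode_birth_year(5, 41))    # (2005, 1): born January 2005
    print(decode_birth_year(99, 12))   # (1999, 12): born December 1999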

Netherlands

The Dutch Government promoted Y2K Information Sharing and Analysis Centers (ISACs) to share information on Y2K readiness between industries, without the threat of antitrust violations or liability based on the information shared.

Norway and Finland

Norway and Finland changed their national identification numbers to indicate the century in which a person was born. In both countries, the birth year had historically been indicated by two digits only. This numbering system had already given rise to a similar difficulty, the "Year 1900 problem", which arose from problems distinguishing between people born in the 20th and 19th centuries. Y2K fears drew attention to this older issue while prompting a solution to a new problem. In Finland, the problem was solved by replacing the hyphen ("-") in the number with the letter "A" for people born in the 21st century. In Norway, the range of the individual numbers following the birth date was altered from 0–499 to 500–999.

Uganda

The Ugandan government responded to the Y2K threat by setting up a Y2K Task Force.[36] In August 1999 an independent international assessment by the World Bank International Y2K Cooperation Centre found that Uganda's website was in the top category, "highly informative". This put Uganda in the top 20 of 107 national governments, on a par with the United States, United Kingdom, Canada, Australia and Japan, and ahead of Germany, Italy, Austria and Switzerland, which were rated as only "somewhat informative". The report said that "Countries which disclose more Y2k information will be more likely to maintain public confidence in their own countries and in the international markets."[37]

United States

In 1998, the United States government responded to the Y2K threat by passing the Year 2000 Information and Readiness Disclosure Act, by working with private sector counterparts in order to ensure readiness, and by creating internal continuity of operations plans in the event of problems. The effort was coordinated out of the White House by the President's Council on Year 2000 Conversion, headed by John Koskinen.[38] The White House effort was conducted in co-ordination with the then-independent Federal Emergency Management Agency (FEMA), and an interim Critical Infrastructure Protection Group, then in the Department of Justice, now in Homeland Security.

The US Government followed a three-part approach to the problem: (1) Outreach and Advocacy, (2) Monitoring and Assessment, and (3) Contingency Planning and Regulation.[39]

The logo created by The President's Council on the Year 2000 Conversion, for use on Y2K.gov

A feature of US Government outreach was Y2K websites including Y2K.GOV. Presently, many US Government agencies have taken down their Y2K websites. Some of these documents may be available through National Archives and Records Administration[40] or the Wayback Machine.

Each federal agency had its own Y2K task force which worked with its private sector counterparts. The FCC had the FCC Year 2000 Task Force.[39][41]

Most industries had contingency plans that relied upon the internet for backup communications. However, as no federal agency had clear authority with regard to the internet at this time (it had passed from the US Department of Defense to the US National Science Foundation and then to the US Department of Commerce), no agency was assessing the readiness of the internet itself. Therefore, on 30 July 1999, the White House held the White House Internet Y2K Roundtable.[42]

United Kingdom

The British government made regular assessments of the progress made by different sectors of business towards becoming Y2K-compliant and there was wide reporting of sectors which were laggards. Companies and institutions were classified according to a traffic light scheme ranging from green "no problems" to red "grave doubts whether the work can be finished in time". Many organisations finished far ahead of the deadline.

International co-operation

The International Y2K Cooperation Center (IY2KCC) was established at the behest of national Y2K coordinators from over 120 countries when they met at the First Global Meeting of National Y2K Coordinators at the United Nations in December 1998. IY2KCC established an office in Washington, D.C. in March 1999. Funding was provided by the World Bank, and Bruce W. McConnell was appointed as director.

IY2KCC's mission was to "promote increased strategic cooperation and action among governments, peoples, and the private sector to minimize adverse Y2K effects on the global society and economy." Activities of IY2KCC were conducted in six areas:

IY2KCC closed down in March 2000.[43]

Private sector response

The Y2K issue was a major topic of discussion in the late 1990s and as such showed up in most popular media. A number of "Y2K disaster" books were published such as Deadline Y2K by Mark Joseph. Movies such as Y2K: Year to Kill capitalised on the currency of Y2K, as did numerous TV shows, comic strips, and computer games.

Fringe group responses

A variety of fringe groups and individuals such as those within some fundamentalist religious organizations, survivalists, cults, anti-social movements, self-sufficiency enthusiasts, communes and others attracted to conspiracy theories, embraced Y2K as a tool to engender fear and provide a form of evidence for their respective theories. End-of-the-world scenarios and apocalyptic themes were common in their communication.

Interest in the survivalist movement peaked in 1999 in its second wave for that decade, triggered by Y2K fears. In the time before extensive efforts were made to rewrite computer programming codes to mitigate the possible impacts, some writers such as Gary North, Ed Yourdon, James Howard Kunstler,[46] and Ed Yardeni anticipated widespread power outages, food and gasoline shortages, and other emergencies. North and others raised the alarm because they thought Y2K code fixes were not being made quickly enough. While a range of authors responded to this wave of concern, two of the most survival-focused texts to emerge were Boston on Y2K (1998) by Kenneth W. Royce, and Mike Oehler's The Hippy Survival Guide to Y2K.

Y2K was also exploited by some prominent and other lesser-known fundamentalist and Pentecostal Christian leaders throughout the Western world, particularly in North America and Australia.[47] Their promotion of the perceived risks of Y2K was combined with end times thinking and apocalyptic prophecies in an attempt to influence followers.[47] The New York Times reported in late 1999, "The Rev. Jerry Falwell suggested that Y2K would be the confirmation of Christian prophecy - God's instrument to shake this nation, to humble this nation. The Y2K crisis might incite a worldwide revival that would lead to the rapture of the church. Along with many survivalists, Mr. Falwell advised stocking up on food and guns".[48] Adherents in these movements were encouraged to engage in food hoarding and take lessons in self-sufficiency, and the more extreme elements planned for a total collapse of modern society. The Chicago Tribune reported that some large fundamentalist churches, motivated by Y2K, were the sites for flea market-like sales of paraphernalia designed to help people survive a social order crisis, ranging from gold coins to wood-burning stoves.[49]

Betsy Hart, writing for the Deseret News, reported that many of the more extreme evangelicals used Y2K to promote a political agenda in which the downfall of the government was a desired outcome in order to usher in Christ's reign. She also noted that "the cold truth is that preaching chaos is profitable and calm doesn't sell many tapes or books".[50] These types of fears and conspiracies were described dramatically by New Zealand-based Christian prophetic author and preacher Barry Smith in his publication "I Spy with my Little Eye", where he dedicated a whole chapter to Y2K.[51] Some expected, at times through so-called prophecies, that Y2K would be the beginning of a worldwide Christian revival.[52]

It became clear in the aftermath that leaders of these fringe groups had used fears of apocalyptic outcomes to manipulate followers into dramatic scenes of mass repentance or renewed commitment to their groups, additional giving of funds, and more overt commitment to their respective organizations or churches. The Baltimore Sun noted this in its article "Apocalypse Now - Y2K spurs fears", which reported the increased calls for repentance in the populace in order to avoid God's wrath.[53] Christian leader Col Stringer wrote in his commentary, "Fear-creating writers sold over 45 million books citing every conceivable catastrophe from civil war, planes dropping from the sky to the end of the civilised world as we know it. Reputable preachers were advocating food storage and a 'head for the caves' mentality. No banks failed, no planes crashed, no wars or civil war started. And yet not one of these prophets of doom has ever apologised for their scare-mongering tactics."[52] Some prominent North American Christian ministries and leaders generated huge personal and corporate profits through sales of Y2K preparation kits, generators, survival guides, published prophecies and a wide range of other associated merchandise. Christian journalist Rob Boston documented this[47] in his article "False Prophets, Real Profits: Religious Right Leaders' Wild Predictions of Y2K Disaster Didn't Come True, But They Made Money Anyway".

Cost

The total cost of the work done in preparation for Y2K is estimated at over US$300 billion ($413 billion today, once inflation is taken into account).[54][55] IDC calculated that the US spent an estimated $134 billion ($184 billion) preparing for Y2K, and another $13 billion ($18 billion) fixing problems in 2000 and 2001. Worldwide, $308 billion ($424 billion) was estimated to have been spent on Y2K remediation.[56] There are two ways to view the events of 2000 from the perspective of its aftermath:

Supporting view

This view holds that the vast majority of problems had been fixed correctly, and the money was well spent. The situation was essentially one of preemptive alarm. Those who hold this view claim that the lack of problems at the date change reflects the completeness of the project, and that many computer applications would not have continued to function into the 21st century without correction or remediation.

Opposing view

Others have asserted that there were no, or very few, critical problems to begin with. They also asserted that there would have been only a few minor mistakes, and that a "fix on failure" approach would have been the most efficient and cost-effective way to solve these problems as they occurred.

See also

References

  1. BSI Standard, on year 2000.
  2. Wired (25 February 2000). "Leap Day Tuesday Last Y2K Worry". Retrieved 16 October 2016.
  3. Carrington, Damian (4 January 2000). "Was Y2K bug a boost?". BBC News. Archived from the original on 22 April 2004. Retrieved 19 September 2009.
  4. Spencer Bolles. "Computer bugs in the year 2000". Newsgroup: net.bugs. Usenet: 820@reed.UUCP.
  5. American RadioWorks Y2K Notebook, Problems: The Surprising Legacy of Y2K. Retrieved on 22 April 2007.
  6. A web search on images for "computer memory ads 1975" returns advertisements showing pricing for 8K of memory at $990 and 64K of memory at $1495.
  7. Looking at the Y2K bug, portal on CNN.com Archived 7 February 2006 at the Wayback Machine.
  8. Presenter: Stephen Fry (3 October 2009). "In the beginning was the nerd". Archive on 4. BBC Radio 4.
  9. Testimony by Alan Greenspan, ex-Chairman of the Federal Reserve before the Senate Banking Committee, 25 February 1998, ISBN 978-0-16-057997-4
  10. "Key computer coding creator dies". The Washington Post. 25 June 2004. Retrieved 25 September 2011.
  11. Braden, Robert (ed.) (October 1989). "Requirements for Internet Hosts -- Application and Support". Internet Engineering Task Force. Retrieved 16 October 2016.
  12. Microsoft Support (17 December 2015). "Microsoft Knowledge Base article 214326". Retrieved 16 October 2016.
  13. "JavaScript Reference Javascript 1.2". Sun Microsystems. Retrieved 7 June 2009.
  14. "JavaScript Reference Javascript 1.3". Sun. Retrieved 7 June 2009.
  15. TVTropes. "Millennium Bug - Television Tropes & Idioms". Retrieved 16 October 2016.
  16. "The Risks Digest Volume 4: Issue 45". The Risks Digest.
  17. Stockton, J.R., "Critical and Significant Dates" Merlyn.
  18. A. van Deursen, "The Leap Year Problem" The Year/2000 Journal 2(4):65–70, July/August 1998.
  19. CRN (4 January 2010). "Bank of Queensland hit by "Y2.01k" glitch". Retrieved 16 October 2016.
  20. "Windows Mobile glitch dates 2010 texts 2016". 5 January 2010.
  21. "Windows Mobile phones suffer Y2K+10 bug". 4 January 2010.
  22. "Bank of Queensland vs Y2K – an update". 4 January 2010.
  23. "Error: 8001050F Takes Down PlayStation Network".
  24. RTL (5 January 2010). "2010 Bug in Germany" (in French). Retrieved 16 October 2016.
  25. "The Case for Windowing: Techniques That Buy 60 Years", article by Raymond B. Howard, Year/2000 Journal, Mar/Apr 1998.
  26. Millennium bug hits retailers, from BBC News, 29 December 1999.
  27. Martin Wainwright (13 September 2001). "NHS faces huge damages bill after millennium bug error". The Guardian. UK. Retrieved 25 September 2011. The health service is facing big compensation claims after admitting yesterday that failure to spot a millennium bug computer error led to incorrect Down's syndrome test results being sent to 154 pregnant women. ...
  28. Y2K bug fails to bite, from BBC News, 1 January 2000.
  29. Computer problems hit three nuclear plants in Japan, report by Martyn Williams of CNN, 3 January 2000. Archived 7 December 2004 at the Wayback Machine.
  30. "Minor bug problems arise". BBC News. British Broadcasting Corporation. Retrieved 4 December 2015.
  31. Preparation pays off; world reports only tiny Y2K glitches at the Wayback Machine (archive index), report by Marsha Walton and Miles O'Brien of CNN, 1 January 2000.
  32. Wired (29 February 2000). "HK Leap Year Free of Y2K Glitches". Retrieved 16 October 2016.
  33. Wired (1 March 2000). "Leap Day Had Its Glitches". Retrieved 16 October 2016.
  34. The last bite of the bug, report from BBC News, 5 January 2001.
  35. Iliana V. Kohler; Jordan Kaltchev; Mariana Dimova. "Integrated Information System for Demographic Statistics 'ESGRAON-TDS' in Bulgaria" (PDF). 6 Article 12. Demographic Research: 325–354.
  36. "Uganda National Y2k Task Force End-June 1999 Public Position Statement". 30 June 1999. Retrieved 11 January 2012.
  37. "Y2K Center urges more information on Y2K readiness". 3 August 1999. Retrieved 11 January 2012.
  38. DeBruce, Orlando; Jones, Jennifer (23 February 1999). "White House shifts Y2K focus to states". CNN. Retrieved 16 October 2016.
  39. 1 2 "FCC Y2K Communications Sector Report (March 1999) copy available at WUTC" (PDF). (1.66 MB)
  40. See President Clinton: Addressing the Y2K Problem, White House, 19 October 1998.
  41. "Federal Communications Commission Spearheads Oversight of the U.S. Communications Industries' Y2K Preparedness, Robert J Butler and Anne E Hoge, Wiley, Rein & Fielding September/October 1999". Opengroup. Archived from the original on 9 October 2008. Retrieved 16 October 2016.
  42. "Basic Internet Structures Expected to be Y2K Ready, Telecom News, NCS (1999 Issue 2)" (PDF). (799 KB)
  43. "Finding Aids at The University of Minnesota".
  44. "quetek.com". quetek.com. Retrieved 25 September 2011.
  45. Internet Year 2000 Campaign archived at Cybertelecom.
  46. Kunstler, Jim (1999). "My Y2K—A Personal Statement". Kunstler, Jim. Retrieved 12 December 2006.
  47. 1 2 3 "False Prophets, Real Profits - Americans United". Retrieved 9 November 2016.
  48. Dutton, D., 31 December 2009 New York Times, "Its Always the End of the World as we Know it"
  49. Coen, J., 1 March 1999, "Some Christians Fear End, It's just a day to others" Chicago Tribune
  50. Hart, B., 12 February 1999 Deseret News, "Christian Y2K Alarmists Irresponsible" Scripps Howard News Service
  51. Smith, B., 1999, I Spy with my Little Eye, MS Life Media, chapter 24 - Y2K Bug, http://www.barrysmith.org.nz/site/books/
  52. 1 2 "Col Stringer Ministries - Newsletter Vol.1 : No.4". Retrieved 9 November 2016.
  53. Rivera, J., 17 February 1999, "Apocalypse Now – Y2K spurs fears" , Baltimore sun
  54. Federal Reserve Bank of Minneapolis Community Development Project. "Consumer Price Index (estimate) 1800–". Federal Reserve Bank of Minneapolis. Retrieved October 21, 2016.
  55. Y2K: Overhyped and oversold?, report from BBC News, 6 January 2000.
  56. Robert L. Mitchell (28 December 2009). "Y2K: The good, the bad and the crazy". ComputerWorld.
  57. James Christie (12 January 2015), Y2K – why I know it was a real problem, 'Claro Testing Blog' (accessed 12 January 2015)
  58. Y2K readiness helped New York after 9/11, article by Lois Slavin of MIT News, 20 November 2002.
  59. "Finance & Development, March 2002 - September 11 and the U.S. Payment System". Finance and Development - F&D.
  60. Y2K readiness helped NYC on 9/11, article by Rae Zimmerman of MIT News, 19 November 2002.
  61. Dutton, Denis (31 December 2009), "It's Always the End of the World as We Know It", The New York Times.
  62. Smith, R. Jeffrey (4 January 2000), "Italy Swatted the Y2K Bug", The Washington Post.
  63. White House: Schools lag in Y2K readiness: President's Council sounds alarm over K-12 districts' preparations so far, article by Jonathan Levine of eSchool News, 1 September 1999.
  64. Hoover, Kent (9 January 2000), "Most small businesses win their Y2K gamble", Puget Sound Business Journal.
  65. Lights out? Y2K appears safe, article by Elizabeth Weise of USA Today, 14 February 1999.
  66. John Quiggin, (2 September 1999), Y2K bug may never bite, 'Australian Financial Review' (from The Internet Archive accessed 29 December 2009).

External links

This article is issued from Wikipedia - version of the 12/4/2016. The text is available under the Creative Commons Attribution/Share Alike but additional terms may apply for the media files.