Learn something new every day.
And have fun doing it.
In computer lingo, a “bug” is a mistake in the code causing unexpected behavior. You know, like the famous Windows blue screen of death. Somewhere in the millions of lines of nitty-gritty computer instructions are some flaws, which, under a convoluted combination of circumstances, can cause a crash. Other computer bugs can just cause unexpected results, but even those can have major consequences.
For example, the “biggest” computer bug was Y2K. For those of you not yet born, during the run-up to January 1, 2000, people lost their minds over the potential for catastrophe. Many, perhaps most, software coders had used a two-digit field to store the year: they didn’t foresee their programs being used far into the future, and the practice saved memory, which was far less plentiful at the time. So, for example, “73” indicated the year “1973.” The problem would arrive when the year became 2000, since many computers would store just “00” and calculate the date as “1900.” While the fears of banks crashing, planes falling out of the sky, and nuclear missiles launching themselves never materialized, the scare did cost billions of dollars in software rewrites worldwide.
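The two-digit trap is easy to sketch. Here is a minimal, hypothetical example (the function name is mine, not from any real Y2K-era system) of what happens when code assumes every two-digit year belongs to the 1900s:

```python
def parse_two_digit_year(yy: int) -> int:
    """Y2K-era assumption: a two-digit year always means 19xx."""
    return 1900 + yy

# Works fine for 20th-century dates...
assert parse_two_digit_year(73) == 1973

# ...but January 1, 2000, stored as "00", rolls back a full century.
print(parse_two_digit_year(0))  # 1900, not 2000
```

Anything downstream of that date — interest calculations, expiration checks, schedules — would silently be off by a hundred years.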
Computers are technically stupid in that they do whatever you tell them, even if the inputs defy logic or represent an obvious (to us) error. While the Hubble Space Telescope debacle was the result of a manufacturing flaw, the Mars Climate Orbiter’s cause of death was a computer bug. In 1999, the orbiter incinerated itself in the thin Martian atmosphere because it approached the planet on the wrong trajectory. Actually, the onboard computer did precisely what it was told, but somewhere along the way, programmers mixed up imperial and metric units of measurement, skewing the positioning calculations by about 100 km.
One famous bug created history’s wealthiest person—for at least a hot second. In 2013, a programming error in PayPal’s software delivered 92 quadrillion dollars to one user. That’s roughly 1,000 times the GDP of planet Earth. Alas, Pennsylvania resident Chris Reynolds didn’t spend it quickly enough, and his account was reset to … zero. Easy come, easy go.
We think of bugs as idiosyncrasies of computers, but the term was used for flaws and glitches long before the first vacuum tube or transistor crunched any digits. The word “bug,” meaning something is awry in the system, actually predates the electric light bulb.
Thomas Edison used the term frequently to describe mechanical and electrical problems while puttering around his Menlo Park lab. In an 1878 letter to Western Union President William Orton, Edison wrote, “I did find a ‘bug’ in my apparatus, but it was not in the telephone proper.” In this case, Edison put the word in quotation marks because he really had found insects in the wiring of one of his new telephone designs.
Computer programming pioneer Rear Admiral Grace Hopper, USN, was working with an early computer at Harvard University all the way back in 1947. The Mark II machine kept spitting out errors, so the team investigated the hardware. Like Edison before them, someone found an actual bug fouling up the electrical connections. A team member taped the dead moth into the logbook with a note reading, “First actual case of bug being found.”
Grace Hopper, her place in history cemented through groundbreaking work on the development of computer languages, wasn’t the person who found this bug, but her presence at the incident likely helped make the story of the first known computer “bug” famous.
A bug in the system, indeed.