This week’s blog is supposed to be about the command line and useful utilities. If you’ve been following along at all, you’ll know by now that I am not a power user (although I’m getting there!). Anything I could tell you about the command line, or any review I might have of a utility I just googled, would be essentially useless to everyone on the planet. I’m not shirking work: I’m diverging in a creative way that adds value to the world while being tangentially topical.

Blog Post Week Four: From Fork Bombs to Totally Forked: A Cautionary Tale of Horrible Things Accidentally Done Using Computers

While learning about the command line was honestly fascinating, I found a new layer of engagement when I came upon this computer-crashing group of squiggles:

:(){ :|:& };:

Now, this little snippet is probably common knowledge to every modern fifteen-year-old, but I’d never heard of a fork bomb. One thing led to another, I went down the rabbit hole, and I think I can contribute something more useful and interesting by dragging you along with me.
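For anyone else seeing it for the first time, the squiggles read a lot less cryptically once the function gets a real name. Here is the same logic spelled out (a sketch for a bash-style shell; please don’t actually run it):

bomb() {          # same as ":()" — define a function, here called "bomb" instead of ":"
  bomb | bomb &   # the body ":|:&" — call itself twice, piped together, in the background
}
bomb              # the final ":" — one call, and the copies start multiplying

Each call launches two more copies of itself, so the number of processes doubles and doubles until the machine runs out of process slots or memory. The usual defence is a per-user process limit (see ulimit -u).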

There are many tales of overwriting good code with bad and new code with old, highlighting both the importance of thinking twice before saving over anything and of maintaining good backups. There are also plenty of tales of deleting entire hard drives with key slips while using “sudo” (ctrl-c might save you… but probably not); I’ve sketched the classic slip just after the quote below. “Phifty” on quora.com had a good anecdote about working on a live database (one feeding a call centre) and accidentally changing over 1 million banking customers’ first names to “Javier”. That anecdote, and this quote, struck me as comical:

“To build a delete function is human, to run it on your own code, divine.”
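For the uninitiated, the kind of “sudo” key slip that takes a whole drive with it usually looks something like this (a hypothetical example with a made-up path; the single stray space is the entire disaster):

sudo rm -rf /tmp/build-output     # what you meant: delete one scratch directory
sudo rm -rf / tmp/build-output    # what a slipped space types instead — do NOT run this
# the shell now hands rm two targets, "/" and "tmp/build-output",
# so it happily starts erasing from the root of the filesystem

Modern versions of rm will refuse a bare “/” unless you add --no-preserve-root, but a slip that lands anywhere else in the path gets no such safety net, and ctrl-c only saves whatever hasn’t been deleted yet.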

Real-life disasters also happen, and they are sobering. I worked for Knight Securities, which (after I left) suffered one of the most spectacular programming failures on record. On the morning of August 1, 2012, shortly after the market opened, a software rollout repurposed an old flag, but the new code failed to deploy on one of the servers. The repurposed flag activated obsolete code on that server, roughly $440 million was lost through bad trades in about forty-five minutes, and the firm ceased to exist as an independent company not long after. It was a billion-dollar company when I was there.

On a lesser scale, a tale from my days there (one that, to my knowledge, has never been told publicly) happened one morning over an hour before the market opened. We had a firm-wide morning meeting, and one of the junior traders had thrown his duffle bag on his desk before heading in. The meeting was interrupted a short while later by frantic selling of our company’s stock in the thin premarket trading environment. News services were starting to report on the unusual activity and speculate about corporate insolvency, etc. It turns out the bag had landed on, and held down, the “sell” key on his trading terminal, firing off a string of sell orders that battered our own stock down 90%. Luckily, the trades were cancelled as “clearly erroneous”.

Deaths have been caused by programming errors. Between 1985 and 1987, at least six accidents occurred on the Therac-25 radiation therapy machine due to “concurrent programming errors” (race conditions), delivering roughly 100 times the intended radiation dose to patients. Three people died. In 2003, 8,500 people in Grand Rapids, Michigan “died” on paper when a “mapping error” notified Social Security, insurance companies, and the patients themselves that they were deceased.

In 2011, the State of California released 450 violent felons due to a programming error. Many are still unaccounted for. Michigan did something similar in 2005, but the 23 inmates it released early were non-violent. The same error had also kept some prisoners locked up past their release dates.

Two more financial hits in the $300 million range: an IRS malfunction in 2006 that left its fraud detection software unknowingly inoperative (this wasn’t discovered until it was “too late”), and a $327 million hit when the Mars Climate Orbiter was lost in 1999. The ground-based software produced thruster data in the wrong units (pound-force seconds instead of the newton-seconds the navigation software expected); the resulting miscalculation brought the Orbiter far too low over Mars, where it is believed to have disintegrated in the atmosphere.

I’m taking away three lessons:

  1. Think. Take the time to imagine all the possible stresses that your critical systems could come under, including careless duffle bags.
  2. I have a lot to learn, whether it’s not putting passwords in source code, managing backups properly, or not being glib with sudo.
  3. Test, have proper rollout systems in place for critical components, and hope that the software that holds my life in its hands multiple times a day is carefully and thoughtfully constructed.
