June 2016

Volume 31 Number 6

[Editor's Note]

Cognitive Bias

By Michael Desmond | June 2016

In my last column (msdn.com/magazine/mt703429), I described how efforts to control the coolant loss event at the Three Mile Island (TMI) nuclear plant in 1979 were shaped by received wisdom from another domain—in this case, training that plant operators had received in the U.S. Navy’s nuclear fleet. Faced with conflicting information from malfunctioning systems and controls, operators chose to act on water level readings in the cooling system’s pressurizer tank rather than those in the reactor core itself. The incident resulted in a partial fuel meltdown and the worst nuclear reactor accident in U.S. history.

The lessons of TMI extend beyond the biases professionals bring with them as they transition among organizations, projects and job roles. In fact, TMI and nuclear incidents such as the 2011 Fukushima Daiichi disaster in Japan reveal an important aspect of human nature in the face of crisis, and present cautionary lessons for software developers who must be responsive to deadlines, budget constraints, code flaws, security threats and a host of other stresses.

Arnie Gundersen is a nuclear industry veteran and chief engineer at Fairewinds Energy Education. During an April 2016 presentation in Japan, he noted that plant operators at TMI and Fukushima each relied on instruments that falsely indicated “that there was a lot of water in the nuclear reactor, when in fact there was none.”

He went on to say: “Every reading that was true and really bad, they thought of as erroneous. Every reading that was erroneous but really good, they relied upon. That’s a trend that I always see in emergency response. Operators want to believe the instruments that lead them to the conclusion they want to get to.”

Normalcy bias explains some of this. Humans are hardwired to underestimate the likelihood and impacts of a disaster and tend to, as Wikipedia notes, “interpret warnings in the most optimistic way possible, seizing on any ambiguities to infer a less serious situation.” This cognitive quirk occurs all over the place—in aircraft cockpits, financial institutions, government bodies and, yes, software development shops.

Banks and financial firms, for instance, continued to engage in risky behavior ahead of the global financial collapse of 2008, despite clear indications of the impending downturn. In the minutes and hours before the Deepwater Horizon oil spill in 2010, operators failed to act on abnormal pressure and fluid readings in the well, which portended the calamitous blowout. After the explosion, British Petroleum downplayed the impact, estimating the flow rate of oil into the Gulf of Mexico at just 1,000 to 5,000 barrels per day, while the U.S. government’s Flow Rate Technical Group (FRTG) placed that figure at 62,000 barrels per day.

Ignoring troubling indicators, downplaying damage, and choosing to believe information that supports positive outcomes—these are flawed responses that can make bad situations terrible. But the motivation to engage in them is strong. When I interviewed Gundersen, he drove the point home by citing author Upton Sinclair, who famously wrote: “It is difficult to get a man to understand something, when his salary depends on his not understanding it.”

For developers pressed by unrealistic ship schedules, inadequate budgets and ambitious software requirements, the ability to make clear-eyed judgments spells the difference between making tough decisions today and facing much more difficult ones down the road.


Michael Desmond is the Editor-in-Chief of MSDN Magazine.
