Jack Covert Selects

Black Box Thinking: Why Most People Never Learn from Their Mistakes—But Some Do

November 19, 2015


In his new book, Matthew Syed explains how mistakes and failures can be used to drive progress and innovation.

Black Box Thinking: Why Most People Never Learn from Their Mistakes—But Some Do by Matthew Syed, Portfolio, 320 pages, $27.95, Hardcover, November 2015, ISBN 9781591848226

The business world has turned the definitions of words on their head lately. Disruption is highly sought after. Failure has come to be celebrated. These proclamations can be a little tiresome in their insistence. What about the jobs being lost to disruption, or the real lives, careers, and companies being affected by failure?

Black Box Thinking is different. It is not a part of that unquestioned celebration, or an exaltation of failure telling us to fail fast and fail often. It is, rather, an exhortation to get the right systems and culture in place to allow for the inevitability of failure, and a methodical breakdown of how to learn from its occurrence. It is a hard book at times, especially at the start, because it begins with the story of death—particularly the death of Elaine Bromiley, an otherwise healthy, thirty-seven-year-old mother, during a routine medical procedure to alleviate her sinus problems. Chapter Two lightens things up a little with the story of a plane crash.

But it is these two events that Matthew Syed uses to brilliantly form the frame of a discussion on failure. Elaine’s husband, Martin Bromiley, was determined to have an investigation done, not out of malice toward the doctors or hospital, but in the hope that what happened to his wife could be prevented in the future. And his determination was largely the result of his own profession:

He is a pilot. He had flown for commercial airlines for more than twenty years. He had even lectured on system safety. He didn’t want the lessons from a botched operation to die along with his wife.


So he asked questions. He wrote letters. And as he discovered more about the circumstances surrounding his wife’s death, he began to suspect it wasn’t a one-off. He realized that the mistake may have had a “signature,” a subtle pattern that, if acted upon, could save future lives.


The doctors in charge couldn’t have known this for a simple but devastating reason: historically, health-care institutions have not routinely collected data on how accidents happen, and so cannot detect meaningful patterns, let alone learn from them.


In aviation, on the other hand, pilots are generally open and honest about their own mistakes (crash landings, near misses). The industry has powerful independent bodies designed to investigate crashes. Failure is not regarded as an indictment of the specific pilot who messes up, but a precious learning opportunity for all pilots, all airlines, and all regulators.


One obvious difference between the two industries is the cost of malpractice lawsuits in healthcare, an issue we seem to hear a lot about. But studies show that when healthcare institutions are more transparent and honest about their mistakes, lawsuits actually go down. And most of the time, as Martin Bromiley was himself told, the only way an investigation is done is if a lawsuit is filed. But the biggest difference is within the culture of the two industries, and in their attitude toward failure. “In aviation,” Syed asserts, “learning from failure is hardwired into the system.” The title of the book comes from the black boxes placed on airplanes (incidentally, now colored bright orange to make them easier to find should a plane crash).

All airplanes must carry two black boxes, one of which records instructions sent to all on-board electronic systems. The other is a cockpit voice recorder, enabling investigators to get into the minds of the pilots in the moments leading up to the accident. Instead of concealing failure, or skirting around it, aviation has a system where failure is data rich.


There are also systems in place for pilots to self-report errors with immunity and anonymity, and assertiveness training for other officers in the cockpit so they can alert pilots to information they may be missing because of overfocusing on one problem at the expense of another. This all came about because of past failures. Syed highlights the case of Captain Chesley Sullenberger, who famously landed a plane in the Hudson River, to show how systems built out of previous failures improve the likelihood of future success. In Sullenberger’s own words:

Everything we know in aviation, every rule in the book, every procedure we have, we know because someone died … We have purchased at a great cost, lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to succeeding generations. We cannot have the moral failure of forgetting these lessons and have to relearn them.


Syed points to the example of science, in which progress is made through falsification: old theories are tested and proved wrong, and we must come up with new models and theories to explain why. In this way, the work is never done. The model is never perfect. It is also why a free market economy that allows business failures functions better than a centrally planned economy that does not.

As he says, “error is indispensable to the process of discovery.” So he advocates for “open loops,” where feedback is sought out and rationally acted upon, rather than “closed loops,” where mistakes are denied, misinterpreted, or ignored, so that real data or information on the failure is impossible to access and assess to make improvements.

He also notes a famous study by Philip Tetlock showing that so-called “experts” are often worse at analyzing information and predicting outcomes than their reputations would suggest.

As Tetlock put it, “Ironically, the more famous the expert, the less accurate his or her predictions tended to be.”


Why is this? Cognitive dissonance gives us the answer. It is those who are the most publicly associated with their predictions, whose livelihoods and egos are bound up with their expertise, who are most likely to reframe their mistakes—and who are thus the least likely to learn from them.

Syed dives deep into the human psychology of this phenomenon, showing the extent of the cognitive dissonance and intellectual distortions we’ll go through, the self-justification we’ll come up with, to avoid admitting a mistake. He shows the difference between internal and external justifications, taking us through examples of wrongful convictions in the criminal justice system, and the often insane reactions and contortions of prosecutors who insist they have the right guy in jail even when new evidence emerges to the contrary. He examines how the justification for the Iraq war shifted from WMD to getting rid of Saddam Hussein, fighting terrorists, and promoting peace in the Middle East. And, of course, he gives us many business examples. Referencing Sydney Finkelstein’s book Why Smart Executives Fail, he explains how error denial happens in business management and why it "increases as you go up the pecking order":

The reason should be obvious. It is those at the top of business who are responsible for strategy and therefore have the most to lose if things go wrong. They are far more likely to cling to the idea that the strategy is wise, even as it is falling apart, and to reframe any evidence that says otherwise.


That is why it is so crucial to have systems in place that flatten hierarchy, as cockpits do, where other officers "within earshot" of leaders have the agency and responsibility to bring additional information and perspective to bear.

Black Box Thinking is dense with the research, references, and resources needed to build a culture that uses mistakes to build a better business. It will teach you why the all-too-common “We did the best we could” response to failure is not good enough, and why “It is about creating systems and cultures that enable organizations to learn from errors, rather than being threatened by them.” Matthew Syed, with a wealth of examples, science, and stories, will show you how to harness failure while minimizing its costs and negative effects, and how to overcome the mental gymnastics that compel you to twist and turn around a mistake. In the end, he will teach you how to use mistakes and errors to drive innovation and progress, as they have in aviation safety and science.

What if healthcare took that same approach? What if your company did?
