Black Box Thinking
Matthew Syed
Nobody wants to fail. But in highly complex organizations, success can happen only when we confront our mistakes, learn from our own version of a black box, and create a climate where it’s safe to fail.
We all have to endure failure from time to time, whether it’s underperforming at a job interview, flunking an exam, or losing a pickup basketball game. But for people working in safety-critical industries, getting it wrong can have deadly consequences. Consider the shocking fact that preventable medical error is the third-biggest killer in the United States, causing more than 400,000 deaths every year. More people die from mistakes made by doctors and hospitals than from traffic accidents. And most of those mistakes are never made public, because of malpractice settlements with nondisclosure clauses.
For a dramatically different approach to failure, look at aviation. Every passenger aircraft in the world is equipped with an almost indestructible black box. Whenever there’s any sort of mishap, major or minor, the box is opened, the data is analyzed, and experts figure out exactly what went wrong. Then the facts are published and procedures are changed, so that the same mistakes won’t happen again. By applying this method over recent decades, the industry has built an astonishingly strong safety record.