The Question That Should Guide All Government Leaders

Don’t ignore those voices in your head.

What keeps you up at night? If you are a public executive, there must be something—undoubtedly many things. Some of these sleep-depriving problems are minor annoyances, the daily irritations that can make life unpleasant.

Executive insomnia, however, is a more debilitating disease. Its cause can be traced to something that could go wrong. Something that could go terribly wrong. Some disaster that, if the executive’s leadership team had been prescient enough, it might have been able to prevent.

Of course, your leadership team is not omniscient. It can’t be. It doesn’t have the time to be aware of everything that is going on inside your organization—everything that has the potential to produce a big disaster. And, of course, there are lots of things going on outside your organization that could also produce a disaster for which you might be at least partially culpable.

No one can possibly predict everything that can go wrong.

Late at night, however, you do. The possibilities swirl through your head. You might try to get to sleep by counting them. Sheep No. 1 is a terrorist attack on your headquarters. Sheep No. 2 is a client assault on a public employee. Sheep No. 3 is a public employee assault on a client. Sheep No. 4 is . . .

Actually, you might never get to Sheep No. 2. Today, Sheep No. 1 is a big potential disaster. It keeps lots of public executives awake at night. It is one really big thing that could go really wrong. Of course, you know about Sheep No. 1 and have (I hope) taken multiple steps to prevent it.

Unfortunately, you don’t know where, when, or how other potential disasters might happen. You might not even be able to identify the specific circumstances that would cause such a big thing to go wrong. You might not be able to specify the exact nature of the disaster. All you have consuming your late-night head is vague, tortured speculation about the many things that could go terribly wrong.

Moreover, you don’t know how to go about predicting where, when, or how this big disaster might happen. Late at night, you can easily invent a disparate miscellany of examples. Yet, you know that when the big disaster does happen, it won’t happen in any of the ways that your tormented, sleep-deprived brain invented.

For the U.S. Federal Aviation Administration, the big disaster is a passenger-airplane crash. Lots of people will die. And the FAA could easily (if incorrectly) be accused of incompetence for having failed to anticipate this specific disaster and be blamed for not preventing it.

The FAA doesn’t know when the next passenger-airplane crash might occur. It doesn’t know where it might occur. It doesn’t know how it might occur. It only knows that it will, eventually, occur.

The FAA also knows that, when it does occur, it will be a really big disaster.

So, besides lying awake at night, what can FAA leaders do about this potential, but unknown disaster? How might the FAA predict the possible disasters that could happen? After all, if it could predict them, it could try to devise ways to prevent each of them from ever happening.

The FAA’s approach is to identify near misses. Every time it identifies a near miss, it identifies a potential disaster. Once the FAA learns of a near miss, it can dissect everything that led up to this specific near disaster, distinguish the multiple contributing factors, analyze each of them, and develop a remedy to prevent each one from ever happening again.

To employ its near-miss strategy for preventing airplane crashes, the FAA needs near-miss data. It collects such data by making it possible for anyone who witnesses or experiences a near miss to report it to the National Aeronautics and Space Administration (NASA). Then NASA gives the FAA these near-miss data (with no names attached).

The FAA is not the only organization that—in an effort to prevent future mistakes—has created a process for reporting and analyzing near misses.

For example, all hospitals have the potential for a disaster—for a fatal mistake. One analysis estimated that more than 400,000 patients die in U.S. hospitals every year as a result of preventable harm.

All hospitals have near misses. Thus, hospital executives need to worry about near misses. Concluded the Institute of Medicine: “Near-miss systems should be an integral part of clinical care and quality management information systems.”

Does your organization have near misses? Certainly it does. But what are they? How can you collect your own near-miss data? What can you learn from such data? What disaster can such data help you prevent?

So while you are lying awake at night, worrying about all of those potential disasters, maybe you ought to focus your troubled brain on creating ways to identify your organization’s near misses.