Toronto Pearson International Airport's tower in 2015. Flickr user Robert Linsdell

The Ultimate Case Against Using Shame as a Management Tactic

To prevent catastrophes, air traffic controllers are trained in a culture of "psychological safety."

Earlier this month, what could have been the worst aviation accident in history was narrowly averted. An Air Canada pilot mistakenly lined up to land on a taxiway, instead of a parallel runway, at San Francisco International Airport. Four planes, fully fueled and loaded with passengers, were parked on the taxiway, queued for take-off, facing the incoming aircraft.

In the heart-stopping audio recording of air traffic control conversations from that evening, an unidentified voice can be heard alerting the controllers to the plane’s location. “Where’s that guy going?” he asks. Then: “He’s on the taxiway.”

A visualization by the San Jose Mercury News shows just how close the incident came to disaster.

A controller who had previously reassured the Air Canada pilot that no planes were on its assigned runway calmly speaks again, this time telling the plane to “go around.” That’s aviation-speak for “abort the landing,” which the plane did, narrowly avoiding a collision.

Exactly what went awry in the moments leading up to the incident is still under investigation by the Federal Aviation Administration (FAA). But once it’s sorted out, it will almost certainly become a lesson in air traffic control training programs around the world, says Neil May, head of Human Factors at NATS, the public-private partnership company that provides air traffic control services for most UK airports. His trainers are constantly pulling insights and lesson plans from real-world crises, not just from within the company’s own staff, but globally.

Air traffic controllers are vetted before they even start training, and selected for an above-average capacity to handle high-stakes pressure. Then they go through an intensive education, involving simulations, in which they’re taught to stay calm and decisive, have “clarity of thought,” and keep assimilating information rapidly.

But this innate talent and intense training would be less effective were it not for what the European industry calls “just culture,” May says. That’s an ethos that dictates the way mistakes are reported in the profession, and what the consequences are for human error. The system, which is similarly practiced in US air traffic control but doesn’t go by the same name, involves responding to errors with training and support, not punishment or job loss. More than a buzzword, just culture involves a literal contract that codifies the ways that employees will be made to feel psychologically safe at work, and it’s signed by management and the controllers’ trade unions.

Importantly, May explains, just culture doesn’t let willful, deliberate (sometimes criminal) actions off the hook. “However,” he says, “we are all human, and humans make mistakes… So if the controller makes an honest mistake and owns up to it, then that’s absolutely fine. They will get training, they will get help to overcome the psychological aspects of having an incident. If somebody puts a hand up and says these two aircraft got too close, for example, that’s only positive; there’s no blame attached.”

Other professions call this “psychological safety”

In most offices, employees want to hide or minimize gaffes, lest they affect a performance review, get them fired, or—and this is a big one—tarnish their reputation among peers. As Amy Edmondson, a Harvard University professor of management who coined the term “psychological safety,” found in her research, “It turns out that no one wakes up in the morning and jumps out of bed because they can’t wait to get to work today to look ignorant, intrusive, incompetent or negative.” So we protect ourselves by not asking questions, and by not admitting to slip-ups. Employees spend a lot of time and energy on “impression management” within the workplace.

Now companies like Google are studying ways to create psychological safety within teams, to free up resources wasted on self-protection and allow people to collaborate with ease and think creatively. In aviation, psychological safety became important for a less abstract reason: the literal safety of employees and customers.

“‘Just culture’ as a term emerged from air traffic control in the late 1990s, as concern was mounting that air traffic controllers were unfairly cited or prosecuted for incidents that happened to them while they were on the job,” Sidney Dekker, a professor, writer, and director of the Safety Science Innovation Lab at Griffith University in Australia, explains to Quartz in an email. Eurocontrol, the intergovernmental organization that focuses on the safety of airspace across Europe, has “adopted a harmonized ‘just culture’ that it encourages all member countries and others to apply to their air traffic control organizations.”

The principle has also begun to take hold in the healthcare industry, says Dekker. And in the wake of the Grenfell Tower fire in London that claimed at least 80 lives last month, the editor of the Architects’ Journal, a trade publication, recently suggested that her profession could learn from just culture, too.

Reporting your misstep is the right thing to do

One tragic example of what can happen when companies don’t create a culture where employees feel empowered to raise questions or admit mistakes came to light in 2014, when an investigation into a faulty General Motors ignition switch that caused more than 100 deaths revealed a toxic culture of denying errors and deflecting blame within the firm. The problem was later attributed to one engineer who had not disclosed an obvious issue with the flawed switch, but many employees spoke of extreme pressure to put costs and delivery times before all other considerations, and to hide concerns large and small.

Under NATS protocol, employees who come forward about mistakes they’ve made aren’t only calling attention to human error, but also to the ways the system might be failing them. It might be that someone’s blunder is actually inevitable, or highly likely to happen again, because of the way the airspace is designed, says May, or how technology is being used. Under just culture, when even the smallest deviations from regulation are reported repeatedly, an organization can spot a faulty pattern.

But the ethos of just culture is probably most effective because it encourages individuals to proactively prevent mishaps and carelessness. For example, air traffic controllers are taught how to recognize both the common signs that a person is overwhelmed—speaking too fast, for instance, or leaning in toward one’s radar screen—and the idiosyncratic things only they do, as individuals, when the pressure is mounting.

The safety-first environment makes it totally normal and acceptable to call in help when it’s required, or to accept personal assistance when a colleague raises a flag about your demeanor or performance. What’s more, when a serious incident happens, employees know that their managers have a system in place that will allow them to not only stay employed, but also feel supported as they rebuild their confidence—through peer and professional interventions.

This enables employees to grow and improve. Mistakes become tools, not problems to downplay or ignore, and it’s not only the employee who was involved in the snafu who benefits from its instructional side effects. “We always ask controllers for permission to discuss incidents,” says May. “The controllers always say, ‘Yes, yes, yes. I want others to learn from what I went through.’”

(Image via Flickr user Robert Linsdell)