
How Much Should Air Traffic Controllers Trust New Flight Management Systems?

The FAA's NextGen system should bring safety and efficiency to American air travel, but its users need to understand it clearly.

With airfares at their lowest point in seven years and airlines adding capacity, this year’s Thanksgiving air travel is slated to be 2.5 percent busier than last year. Between Nov. 18 and 29, 27.3 million Americans are expected to take to the skies.

The system we use to coordinate all those flights carrying all those Thanksgiving travelers through the air is decades old, and mostly depends on highly trained air traffic controllers, who keep track of where all the planes are, where they’re heading, how fast they’re going and at what altitude.

As the national airspace gets more crowded, and as technology improves, the Federal Aviation Administration has begun upgrading the air traffic control systems. The new system is called NextGen, and some of its capabilities are already being rolled out across the country. It is intended to make air traffic faster, more efficient, more cost-effective and even, through fuel savings, less damaging to the environment. It will also help air traffic controllers and pilots alike handle potential hazards, whether they involve weather, other aircraft or equipment problems.

But we, the traveling public, will be able to realize all these benefits only if the air traffic controllers of the future make the most of the technology. As a human factors researcher seeking to understand how people interact within complex systems, I have found that controllers face real challenges in learning to properly trust the computer systems keeping America in the air.

Use as directed

The NextGen system is designed for humans and computers to work in tandem. For example, one element involves air traffic controllers and pilots exchanging digital text messages between the tower and airplane computer systems, as opposed to talking over the radio. This arrangement has several benefits, including eliminating the possibility someone might mishear a garbled radio transmission.
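To see why a structured digital message is harder to garble than a voice call, consider the toy sketch below, written in Python. The message fields and class are invented for illustration; the actual controller-pilot datalink message set is standardized and far more detailed.

# Illustrative only: a toy structured clearance message. Field names are
# invented; they do not reflect the real Data Comm message format.
from dataclasses import dataclass

@dataclass(frozen=True)
class ClearanceMessage:
    flight_id: str     # e.g., "UAL123"
    instruction: str   # e.g., "CLIMB TO AND MAINTAIN"
    value: str         # e.g., "FL350"

    def render(self) -> str:
        return f"{self.flight_id}: {self.instruction} {self.value}"

msg = ClearanceMessage("UAL123", "CLIMB TO AND MAINTAIN", "FL350")
print(msg.render())  # Every field arrives intact; nothing to mishear.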

Human controllers will still give routing instructions to human pilots, but computers monitoring the airspace can keep an eye on where planes are, and automatically compare that to where they are supposed to be, as well as how close they get to each other. The automated conflict detection tools can alert controllers to possible trouble and offer safer alternatives.
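As a rough illustration of what such a tool checks, here is a minimal sketch in Python that flags pairs of aircraft violating simplified separation minima. The flat coordinate scheme and the exact thresholds are assumptions for demonstration; operational systems also project flight paths forward in time.

import math

# Aircraft positions: (x_nm, y_nm, altitude_ft) in a flat local frame (assumed).
def lateral_nm(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def conflicts(aircraft, lateral_min=5.0, vertical_min=1000.0):
    """Return pairs of aircraft IDs violating both separation minima."""
    ids = list(aircraft)
    alerts = []
    for i, p in enumerate(ids):
        for q in ids[i + 1:]:
            too_close_laterally = lateral_nm(aircraft[p], aircraft[q]) < lateral_min
            too_close_vertically = abs(aircraft[p][2] - aircraft[q][2]) < vertical_min
            if too_close_laterally and too_close_vertically:
                alerts.append((p, q))
    return alerts

traffic = {"UAL123": (0.0, 0.0, 35000), "DAL456": (3.0, 2.0, 35500)}
print(conflicts(traffic))  # [('UAL123', 'DAL456')]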

In addition, air crews will be able to follow routing instructions more quickly, accepting the digital command from the ground directly into the plane’s navigation system. This, too, requires human trust in automated systems. That is not as simple as it might sound.

Trust in automation

When the people who operate automated tools aren’t properly informed about their equipment – including what exactly it can and cannot do – problems arise. When humans expect computerized systems to be more reliable than they are, tragedy can result. For example, the driver killed in the fatal Tesla crash while in Autopilot mode may have become overreliant on the technology, or used it in ways beyond what it was designed for. Making sure human expectations match technical abilities is called “calibration.”
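One way to picture calibration is as the gap between how reliable an operator believes a system to be and how reliable it actually proves in operation. The numbers in this small sketch are invented purely to illustrate the idea.

# A toy calibration check with made-up numbers.
believed_reliability = 0.99        # "it almost never misses a conflict"
observed_reliability = 140 / 150   # correct alerts / total events, about 0.93

gap = believed_reliability - observed_reliability
print(f"calibration gap: {gap:+.2f}")  # a positive gap signals overtrust risk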

When the people and the machinery are properly calibrated to each other, trust can develop. That’s what happened during a 16-week course that trained air traffic controller students on a desktop air traffic control simulator.

Researchers typically measure trust in automated systems by asking questions about the operator’s evaluation of the system’s integrity, the operator’s confidence in using the system and how dependable the operator thinks the system is. Several questionnaires ask these sorts of questions; one of them, a trust scale aimed at the air traffic management system as a whole, was particularly sensitive to changes in trust within the student group I studied.
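As a rough illustration of how such a questionnaire might be scored, the sketch below averages Likert-style ratings and reverse-codes negatively worded items. The items and the 1-to-7 scale are hypothetical, not the actual instrument from the study.

def trust_score(responses, reverse_coded=()):
    """Average 1-7 Likert responses, reverse-coding distrust-worded items."""
    scored = [8 - r if i in reverse_coded else r
              for i, r in enumerate(responses)]
    return sum(scored) / len(scored)

# Items might probe integrity ("The system is dependable"), confidence, and so on.
answers = [6, 5, 7, 2]  # last item ("I am wary of the system") is reverse-coded
print(trust_score(answers, reverse_coded={3}))  # 6.0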

I asked the air traffic controller students about their trust in automated tools such as those NextGen provides on the first day of their course, at the midterm exam in week nine, and at the final exam at the end of the training. Overall, the students’ trust in the system increased, though some trusted it more than others.

Too much trust, or too little?

There is such a thing as trusting technology too much. In this study, the students who trusted the system more were actually less aware than their less trusting classmates of what was going on in the airspace during heavy-traffic simulated scenarios at the final exam. One possible explanation is that those with more trust in the system became complacent and did not bother expending the effort to keep their own independent view (or “maintain the picture,” as air traffic controllers say).

These more trusting students might have been more vulnerable to errors in situations where the automation failed and required them to intervene manually. Correlation analyses suggested that students with more trust were less likely to engage in what might be called “nontrusting” behaviors, like overriding the automation. For example, they were less likely to step in and move aircraft that the automated conflict detection tools determined were far enough apart, even if they personally thought the planes were too close together. In that respect, they were relying on the automation appropriately.
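For readers curious about the shape of such an analysis, this sketch computes the correlation between trust scores and override counts using made-up numbers; a strongly negative value would mean that students reporting more trust overrode the automation less often.

from statistics import correlation  # Python 3.10+

# Invented data for demonstration; these are not the study's results.
trust_scores = [6.0, 5.5, 4.0, 3.5, 6.5, 4.5]
override_counts = [1, 2, 5, 6, 0, 3]

r = correlation(trust_scores, override_counts)
print(f"r = {r:.2f}")  # strongly negative here: more trust, fewer overrides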

These trust disparities and their effects became clear only at the final exam. This suggests that as the students became familiar with the technology, their trust in the system, and their behavior when using it, changed.

Previous research has shown that providing specific training in trusting the automation may reduce students’ likelihood of engaging in nontrusting behaviors. Training should also aim to make trainees more aware of their potential to over-trust the system, to ensure they stay alert to critical information. Only when users properly trust the system – neither too much nor too little – will the public benefits of NextGen truly be available to us all.


This post originally appeared at The Conversation. Follow @ConversationUS on Twitter.