
From Air Travel to Food Safety, More Data Can Be Misleading

The FAA unpacks the "black box" of incident reporting and performance measures.

Government agencies regularly report “incident” data, such as the number of burglaries, house fires, food poisoning cases, bankruptcies and workplace injuries. While these data can be used externally for accountability, they can also be used internally to predict and prevent these kinds of incidents.

These days, more detailed, near real-time data can be collected because of improvements in technology and new reporting systems. But these more detailed data -- if not well explained and put in context -- can alarm the public and cause political problems, even while improving performance. A prominent recent example is the reported increase in air traffic “operational errors” at the Federal Aviation Administration, described below.

Incident reporting systems are an integral part of many agencies’ operations. But reporting raw totals of incidents does not necessarily help prevent future ones.

Agency managers need to analyze operational data at a much finer level to understand why incidents occur and what can be done to prevent them. Understanding the precursors of an incident becomes an essential element in improving performance.

This is often called the “black box” of performance management -- understanding the relationships that connect danger signals to potential changes in operations to improve program outputs and outcomes.

Managing Air Traffic Incidents

In a new report for the IBM Center for the Business of Government, Russell Mills offers a case study of the Federal Aviation Administration’s Air Traffic Organization incident reporting systems that have evolved since the late 1990s. He describes the introduction of voluntary self-reporting of errors by air traffic controllers and the use of increasingly sophisticated electronic tracking equipment. Both of these new measurement systems dramatically improved the timeliness and quality of data about “operational errors” -- when aircraft fly too close to each other. For the most part, air traffic controllers are required to keep aircraft separated by three miles (horizontally) and 1,000 feet (vertically). Deviations from these standards are one measure of the overall safety of the air traffic control system.
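
To make the separation standard concrete, here is a minimal sketch -- not FAA software -- of how a loss-of-separation check could be expressed. The three-mile and 1,000-foot minimums come from the paragraph above; the function name and the assumption that lateral and vertical distances are already computed are illustrative only.

```python
# Illustrative sketch only, not FAA code. Distances are assumed to be
# pre-computed. Separation is lost only when a pair of aircraft is inside
# BOTH the lateral and the vertical minimum at the same time.
def loses_separation(lateral_miles: float, vertical_feet: float,
                     min_lateral_miles: float = 3.0,
                     min_vertical_feet: float = 1000.0) -> bool:
    return lateral_miles < min_lateral_miles and vertical_feet < min_vertical_feet

print(loses_separation(2.4, 600))    # True: inside both minimums
print(loses_separation(2.4, 1500))   # False: vertical separation maintained
```

Automated tracking equipment effectively runs a check like this continuously, which is one reason detected errors rose once the technology improved.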

He writes that, ironically, this improved data collection initially alarmed external stakeholders -- the traveling public and Congress. To them, it seemed that there was a dramatic increase in the number of operational errors. In fact, the increased reporting of incidents that had previously been undetected or unreported led to a greater understanding of trends and causal factors, thereby allowing the FAA to put in place corrective actions. While this led to a safer air traffic system, it created political concerns for the agency.

Mills reports that the FAA overcame these political concerns by creating a new risk-based reporting system for the traveling public and Congress, demonstrating that the new elements of its incident reporting systems were contributing to greater safety. The FAA shifted from reporting raw numbers of operational errors to reporting on the significance of those numbers -- focusing on the risk created by a loss of separation rather than on mere compliance with the separation standards.
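
The difference between the two reporting approaches can be sketched in a few lines. The severity thresholds and weights below are invented for illustration and are not the FAA’s actual risk methodology; the point is only that a risk-weighted index and a raw count can tell very different stories about the same set of incidents.

```python
# Hypothetical example: weight each loss-of-separation event by how much of
# the required separation was retained, rather than counting every event
# equally. Thresholds and weights are made up for illustration, not FAA policy.
def severity_weight(fraction_of_separation_retained: float) -> float:
    if fraction_of_separation_retained >= 0.66:
        return 0.1   # minor deviation, little added risk
    if fraction_of_separation_retained >= 0.33:
        return 0.5   # moderate deviation
    return 1.0       # serious loss of separation

events = [0.90, 0.80, 0.70, 0.25]                     # four reported incidents
raw_count = len(events)                               # the old headline number: 4
risk_index = sum(severity_weight(e) for e in events)  # a risk-based view: 1.3

print(raw_count, round(risk_index, 1))
```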

What the FAA Learned

Based on the experience of the FAA’s evolving incident reporting systems, Mills offers a set of strategic, management, and analytical lessons that could be applied by other agencies that may be increasing the sophistication of their own incident reporting systems:

Strategic Lessons: As agencies report more performance information -- including incident reporting -- there will be increased scrutiny of that performance by external stakeholders. As a result, agencies need to be prepared to proactively educate key stakeholders on the new measures and how to interpret them. Mills notes that “FAA leaders were often forced to act from a reactive rather than a proactive position in explaining the increased number of [operational errors] due to increased detection of incidents.”

Management Lessons: To be useful to agency leaders, analyses of incident data have to be available in a timely manner. At the FAA, panels of experts in each service area assess risk analyses at least three to four times a week, creating a continuous feedback loop for detecting patterns that need to be addressed. Beyond the frequency of data reporting and analysis, the success of self-reporting by front-line air traffic controllers depends on collaboration between managers and employees. In this case, the working relationship between the FAA and the union representing the air traffic controllers was critical to obtaining controllers’ buy-in to report honestly without being subject to some form of retaliation.

Analytical Lessons: Agencies have to balance the need for externally reported performance indicators with the need for assurance that the indicators provide actionable information. In some agencies, there is external pressure to develop and report indicators without the scientific rigor to determine whether the measures are meaningful. In the case of the FAA, the risk metrics were developed in response to political concerns that the raw number of operational errors was climbing sharply. The agency, however, is still developing baselines and targets for measures under its long-term strategic plan. In addition, the FAA found that having more data did not necessarily mean that it had more performance information at hand. Having analytical techniques to interpret the data was also an important element in its overall performance management strategy.

(Image via Philip Pilosian / Shutterstock.com)
