The Big Picture

When government managers attack problems head-on, they often create as many difficulties as they solve. Systems thinkers propose a different approach.

For most of the 20th century, when a wildfire broke out somewhere in the U.S. wilderness, government firefighters hiked into the woods and dug a trench around the blaze to keep it from spreading. The fire would burn to the trench, then peter out. Put simply, when firefighters found a fire, they put it out. The firefighters saved lives, structures and many acres of vegetation. But their effectiveness in reducing destruction allowed brush to accumulate throughout the nation's forests and grasslands, building up fuel that made subsequent fires burn faster and more intensely. What the firefighters didn't realize was that by putting out fires, they were making future fires worse.

These days, forest and other wildland firefighters actually start fires (prescribed burns, they're called) to clear out brush and reduce the likelihood of catastrophic firestorms. To fight fires, they let fires burn.

Like wildland firefighting, many of government's solutions to national problems have turned out to have perverse results and unintended consequences. Welfare, it could be argued, has kept people poor. Efforts to hold health care workers accountable for medical errors have hurt patient safety. Ongoing and expensive federal drug interdiction activities and drug-related arrests have had relatively minor effects on overall drug use.

Are our policy failures rooted in a common cause? A growing number of management analysts say the way policy-makers, and most of us for that matter, think about problem solving is all wrong. We tend to think about problems in a linear way: Take action A to address problem B to get to solution C. Pour water on the fire to put it out. But the management analysts, who call themselves systems thinkers, argue that most of the nation's and most organizations' problems cannot be solved in a linear way. Doing A may address B and get you to C, but it also causes D, which makes B worse, and causes E, which introduces new problem F, which also compounds problem B. The world is not full of straight lines, the systems thinkers say. It's full of loops. And loops require a fundamentally different way of thinking about how to solve problems.
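
That loop logic is easy to make concrete. The sketch below is purely illustrative (the variables, rates and 20-year horizon are invented, not anything the systems thinkers quoted here prescribe): a quick fix knocks the problem down on contact, but every application feeds a side effect that helps the problem grow back.

```python
# Illustrative only: a minimal feedback-loop model, not a validated simulation.
# "problem" stands for any symptom (fire, errors, drug use); "side_effect" is
# the accumulating byproduct of attacking it head-on (e.g., unburned fuel).

def simulate(years=20, fix_strength=0.8, side_effect_rate=0.15):
    problem, side_effect = 10.0, 0.0
    history = []
    for _ in range(years):
        problem *= (1 - fix_strength)        # the quick fix works at first
        side_effect += side_effect_rate      # ...but a byproduct accumulates
        problem += problem * side_effect + side_effect  # and feeds the problem back
        history.append(round(problem, 1))
    return history

print(simulate())
# The symptom collapses in year one, then climbs back past its starting level.
```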

A WORLD OF LOOPS

Systems thinking was born in the field of natural science, where biologists, earth scientists, physicists and others in the late 19th and early 20th centuries learned to view the natural world as complex systems of relationships, or loops. Rather than seeing simple patterns of cause and effect, scientists discovered that effects often become causes and causes usually are the effects of other causes. Scientists see loops everywhere, from the simple groundwater-evaporation-rain cycle to the complex interactions of plant life, animal life and climate to produce balance or change in the ecosystem.

From science, systems thinking spread to engineering, logistics and, eventually, to social and organizational dynamics. Massachusetts Institute of Technology management scholar Peter Senge popularized systems thinking in management consulting circles through his 1990 book The Fifth Discipline (Currency/Doubleday). In the book, Senge diagrammed several loops that he saw repeating themselves in many organizations. For example, in what Senge calls the "shifting the burden" loop, people attack a problem head-on with short-term fixes. But the short-term fixes create side effects that, over time, actually worsen the problem. The government's old way of fighting wildfires fits the "shifting the burden" model, with the immediate attack on fires actually worsening wildfires in the long run. By starting prescribed burns, firefighters are introducing a long-term fix that will help them escape the vicious cycle.
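
In code form, the wildfire version of that archetype is just a fuel stock and two policies. Everything below is invented for illustration (the growth and clearing rates are arbitrary); the point is the shape of the outcome, not the numbers.

```python
# Illustrative sketch of "shifting the burden" in the wildfire case.
# All rates are invented; only the relative outcomes matter.

def fuel_over_time(years, suppress_all, prescribed_burn_fraction=0.0):
    fuel = 1.0  # relative brush load
    for _ in range(years):
        fuel += 0.2                              # brush grows every year
        if not suppress_all:
            fuel *= 0.7                          # natural fires clear some fuel
        fuel *= (1 - prescribed_burn_fraction)   # deliberate burns clear more
    return round(fuel, 2)

# Decades of total suppression leave a huge fuel load behind...
print(fuel_over_time(50, suppress_all=True))                                # 11.0
# ...while prescribed burns hold it near the level natural fires once kept.
print(fuel_over_time(50, suppress_all=True, prescribed_burn_fraction=0.3))  # 0.47
print(fuel_over_time(50, suppress_all=False))                               # 0.47
```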

Firefighting isn't the only realm in which government managers have applied systems thinking. NASA's earth sciences group has applied systems thinking to its operations. The Army uses a systems-thinking-style approach to training. The Veterans Health Administration is applying systems thinking to patient safety. The Defense Department has used systems thinking to analyze the repercussions of weapons production and systems integration decisions. Managers at the Federal Aviation Administration, the National Cancer Institute and the Food and Drug Administration also are experimenting with systems thinking.

Systems thinkers in government say federal executives and managers tend to respond to events without looking at the larger patterns of which the events are a part. Because leaders don't deal with the patterns, the patterns persist. "Systems thinking is the bigger picture look at how interdependent parts function and how people analyze and prevent problems in situations that are beyond a simple one-aspect quality," says Roberta Sappington, a systems thinking coach at the Federal Aviation Administration's Center for Management Development in Palm Coast, Fla. "You see people tamper with one piece of a system, tugging on it and not realizing that it's attached to a lot of other parts."

VICIOUS CYCLES

Everyone in the earth sciences program at NASA knew they could do a better job producing science that contributes to understanding how the planet works. Out of frustration with the budget process, managers in the field were lobbying legislators to win special protection for specific projects, thereby undermining the program's overall performance. Mike Luther, deputy associate administrator for earth science, called in systems thinkers to help solve the problem.

Christine Williams, an organizational development specialist at NASA headquarters, headed up an effort to diagram the system in which earth science managers operated. Williams and a contractor interviewed 50 earth science managers from across the country. Then the managers gathered for a three-day retreat. On the first day, they told stories about earth sciences operations, what worked and what frustrated them. Then, Williams and the contractor drew a systems diagram. What they drew, in part, was a map of a system in which individual managers acting in the best interest of their programs were unintentionally hurting the mission of the group.

Each year, the earth sciences group would receive a pot of money from Congress. To divvy up the money, headquarters officials would define requirements for programs and then use a selection process to decide who got how much money. But people in the field didn't understand how the process worked. Because they didn't understand the process, they thought it was unfair and became frustrated. Field managers would vent their frustrations to members of Congress; NASA's field sites are spread across the country and are important employers in their congressional districts. Legislators would earmark money for the field projects. The earmarks took away money that the earth sciences program could apportion each year, causing more frustration among field managers.

The group's problem is one that occurs frequently in organizations and is dubbed the "tragedy of the commons." Individuals looking out for their own interests wind up hurting the common interest, which eventually hurts the individuals' interests, too. Environmentalists often cite the tragedy of the commons model in describing how individual actions such as overfishing and overpopulation can harm the common good.
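
The dynamic is easy to demonstrate with a toy model. The sketch below is a generic tragedy-of-the-commons simulation with invented parameters, not a model of NASA's budget process: a shared stock regrows each season, and a small increase in what each actor takes is the difference between a commons that lasts and one that collapses for everyone.

```python
# Illustrative tragedy-of-the-commons model (all parameters invented).
# A shared stock regrows each season; each of several actors harvests from it.

def seasons_until_collapse(actors=10, take_per_actor=0.9, regrowth=0.08):
    stock = 100.0
    for season in range(1, 201):
        stock += stock * regrowth            # the commons regenerates
        stock = min(stock, 200.0)            # up to its carrying capacity
        stock -= actors * take_per_actor     # everyone harvests "their share"
        if stock <= 0:
            return season                    # the commons is gone for everyone
    return None                              # stable for 200+ seasons

print(seasons_until_collapse(take_per_actor=0.7))  # None: the harvest is sustainable
print(seasons_until_collapse(take_per_actor=0.9))  # 29: a slightly larger take
                                                   # wipes out the stock
```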

In the earth sciences group's case, Williams' diagram helped field managers see how their actions were harming the organization. As headquarters officials made the funding selection process clearer, field managers stopped running to their members of Congress. Earth science officials purposely chose to improve their existing system, rather than reorganize their structure. Williams explains, "They saw that barriers would exist in any structure they decided to adopt, so they chose to stay with the one they had been working in and best understood while they focused on removing the barriers."

Systems thinkers contend that understanding the system you're in is more important than changing it. Sometimes awareness itself can get people to act differently. At other times, systems must fail for people to change. The Army, for example, was in a tailspin in the 1970s, caught in a post-Vietnam loop of dysfunction. Officials wanted to combat the service's decline, but they didn't want to change the Army's hierarchical structure. Instead, they introduced the after-action review, explains retired Col. John O'Shea, director of defense education for the Reserve Officers Association in Washington.

After-action reviews immediately follow training exercises and battlefield operations. Units gather with either the unit commander or an outside facilitator to answer three questions: What did we set out to do? What did we do? Why are they different? The commander or facilitator creates a climate in which soldiers feel free to admit mistakes or criticize their commanding officers without fear of repercussions. The goal is to determine what the soldiers, and the Army as a whole, can learn from the outcome of the operation or exercise. The findings of after-action reviews are recorded and passed on to training and operational experts. The experts produce recommendations, which are entered into a database at the Center for Army Lessons Learned in Fort Leavenworth, Kan. The center then shares lessons learned with units engaged in similar training and operational exercises.

For example, O'Shea observed an after-action review involving a tank unit that did not get from point A to point B as it was supposed to. Members of the unit offered several reasons for missing the mark. Then, a young soldier asked the commander of the unit if he had an earache. The soldier had noticed that the commander jerked his head every time someone talked into his headset. The commander said yes, he was having an ear problem. As a result, he couldn't concentrate on the information he was being given, so the unit missed its target. The specific lesson was that earaches interfere with communications. The larger lesson was that commanders have to be aware of how their own health can affect unit effectiveness.
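
The pipeline O'Shea describes, from unit discussion to expert review to a shared database, is at bottom a structured record moving through stages. Here is a hypothetical sketch of what such a record might carry; the field names are invented for illustration and are not the Center for Army Lessons Learned's actual schema.

```python
# Hypothetical sketch of an after-action review record and a shared
# lessons-learned store. All names are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class AfterActionReview:
    exercise: str
    intended: str      # What did we set out to do?
    actual: str        # What did we do?
    gap_causes: list[str] = field(default_factory=list)  # Why are they different?

lessons_learned: list[dict] = []  # stand-in for the shared database

def file_review(review: AfterActionReview) -> None:
    """Record every identified cause so other units can search for it later."""
    for cause in review.gap_causes:
        lessons_learned.append({"exercise": review.exercise, "lesson": cause})

file_review(AfterActionReview(
    exercise="tank movement, point A to point B",
    intended="reach point B on schedule",
    actual="missed the objective",
    gap_causes=["commander's earache disrupted headset communications",
                "a commander's health can degrade unit effectiveness"],
))
print(lessons_learned)
```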

O'Shea says the system of after-action reviews, from the unit's discussion to the expert review to the database to the sharing of information with other units, allows the Army to constantly learn from its mistakes. In the old Army, O'Shea says, the commander with the earache simply might have been told that if he missed the target next time his career would suffer, a way of dealing with the problem that would not have dealt with the root cause or helped other commanders learn. Applied throughout the Army, after-action reviews helped the service get out of its dysfunctional loop and instead continuously improve, based on the lessons learned from the reviews, O'Shea says. Now, "the Army is at the forefront of organizational learning," he says.

ROOT CAUSES

Like the Army, the Veterans Health Administration is trying to identify root causes of problems. In both organizations, identifying those causes can save lives. The problem for VHA, as the agency's National Center for Patient Safety director, Dr. James Bagian, sees it, is the persistence of a culture of blame rather than a culture of safety. "Systems thinking is not historically rooted in medicine," Bagian explained to a House Veterans Affairs Committee panel in 2000. "On the contrary, the field of medicine has typically ascribed errors to individuals and embraced the name-blame-shame-and-train approach to error reduction."

In a 1998 survey, VA employees said the shame of making an error inhibited them from reporting errors. Because they failed to report the errors, others were likely to make the same mistakes. Since then, Bagian has tried to change the culture at VHA so that employees feel free to report errors and the organization can learn. Those efforts are paying off. For example, health care workers at VA hospitals sometimes misidentify patients, a problem that can prove fatal if patients are given treatments intended for someone else. Over a three-year period, from January 2000 to March 2003, VHA workers reported 100 cases of misidentification. In one case, a man received prostate surgery that he didn't need while the patient who did need the surgery did not get it until after the error was discovered. To combat such errors, VHA developed redundant procedures for verifying patients' identities, rather than simply reprimanding employees each time they made a mistake or just telling them to be more careful.
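
The logic behind redundancy is multiplicative: if each independent check alone misses one mix-up in a hundred, an error reaches the patient only when every check fails at once. The sketch below is hypothetical; the two-identifier rule and the names in it are illustrative assumptions, not VHA's actual procedure.

```python
# Hypothetical sketch of redundant identity verification before treatment.
# The two-identifier rule shown here is illustrative, not VHA's procedure.

def verify_patient(wristband_id: str, stated_birthdate: str,
                   chart_id: str, chart_birthdate: str) -> bool:
    """Require two independent identifiers to match before proceeding.

    If each check alone catches 99 percent of mix-ups, and the checks are
    independent, requiring both to pass catches roughly 99.99 percent.
    """
    return wristband_id == chart_id and stated_birthdate == chart_birthdate

# A single transposed digit on the wristband now halts the procedure for
# review instead of sending the treatment to the wrong patient.
assert not verify_patient("A-1423", "1948-03-07", "A-1432", "1948-03-07")
assert verify_patient("A-1432", "1948-03-07", "A-1432", "1948-03-07")
```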

By creating a systems thinking approach, Bagian is dealing with patterns of errors and their root causes, not just events. "Making people perfect is a losing proposition," Bagian says. "The goal is not to eliminate errors. The goal is to prevent harm to the patient." When errors occur, particularly close calls in which patients are not actually harmed, employees are encouraged to report the errors to Bagian's patient safety center, which helps identify root causes. "Near misses occur far more often than the events they're the harbinger of," Bagian says.

In some places, government managers are just beginning to experiment with systems thinking. Dr. Scott Leischow, head of the tobacco control research division at the National Cancer Institute in Bethesda, Md., is hoping to apply systems thinking to federal efforts to reduce tobacco use. Leischow is in charge only of research. He realizes that his unit is part of a larger system of tobacco control, which is itself part of a larger system of health care, but the various government agencies and outside groups with a role in reducing tobacco use tend to operate as if their activities are unconnected with one another. "We operate in a whole bunch of islands or silos of activity," he says. "There has not been a lot of work on hooking them together."

Leischow is starting to bring representatives of each "silo" together, with the aim of better connecting his researchers to people who work directly with Americans to get them to quit smoking or not to start in the first place. He also wants to get auditors who evaluate such efforts to connect more with researchers, so that real-world results have a bigger impact on research into smoking cessation methods. "That really isn't happening in an effective way," Leischow says. He's overseeing a two-year contract that is identifying the parts of the tobacco control system and the issues that inhibit collaboration. "The systems approach is the way of the future," Leischow says.

LIMITS OF LOOPS

Systems thinkers admit there are limits to the value of looking at systems. For one, managers and executives often have no choice but to react quickly to events using cause-and-effect analysis. If there's an oil spill, someone needs to clean it up, fast. If a fire threatens someone's home, it needs to be put out. Of course, systems thinkers would argue that after the spill is cleaned up or the fire extinguished, leaders should review the event to see if it fits in a pattern that needs to be dealt with more systemically.

People's assumptions and lack of information also limit systems thinking. At the USDA Graduate School in July, managers from across government gathered to learn about systems thinking. One group in the class developed a system diagram showing an agency in which employees complained about not getting enough feedback. The typical response of senior leaders in the agency was to gather everyone into an auditorium to encourage more feedback. But nothing changed. Employees' trust in their senior leaders' effectiveness eroded, and the lack of feedback continued. The group at the graduate school concluded that the solution to the failed system would be to provide more effective training to managers on how to give feedback.

Another student in the class, however, challenged the group's assumptions. What if the problem was not really feedback, but instead was frustration with a lack of promotional opportunities? Also, the student wondered, what were her classmates trying to achieve? If the goal was better performance, then perhaps feedback might not be the only, or the best, mechanism for getting it.

So many variables affect organizational and social dynamics that it's easy to miss the important ones. Still, systems thinkers say the exercise of figuring out the loops that operate in society, even if they don't exactly reflect reality, is worthwhile. Indeed, managers might find after mapping a system that it has far more loops and variables than they had considered. "People can gain insight and understanding into why things are the way they are," the FAA's Sappington says.

A group of 10 systems thinkers from across government is trying to apply systems thinking to the operations of government as a whole, a task the group has been working on for two years. The High Performing Federal Agencies Community of Practice, as the group calls itself, has put together a set of systems diagrams that attempt to describe the dilemmas facing government leaders.

One of the diagrams places federal operations in the context of the political environment, examining political candidates' need to differentiate themselves from incumbents. Once elected, the candidates must fulfill their campaign promises, which lead to new initiatives. The initiatives must show short-term, reportable results that elected officials can use to once again differentiate themselves from their opponents. The political loop focuses government employees on short-term results and can hurt federal agencies' ability to produce long-term results. Yet federal managers are expected to carry out ongoing agency operations directed at achieving long-term results. "Re-direction of resources to support new initiatives, and the lack of upper management focus on their efforts reduce management's morale and productivity," explains the group's first paper on the leadership dilemma.

While the High-Performing Agencies group is applying systems thinking to the whole government, federal managers can use it to examine their own operations. At the FAA's management development center, Sappington has worked with managers applying the concepts to aviation safety, air traffic control operations and organizational change. Sometimes, Sappington says, managers discover that the best thing they can do is avoid taking head-on action to solve a problem. "The natural state of a system is really, really powerful," Sappington says. "Sometimes a right solution is to respect the power of the system and look for what's working in [it]."


For More Information
  • The USDA Graduate School's organizational learning program includes a systems thinking component and is open to federal managers. The school offers a two-day course on systems thinking and half-day and two-day courses on the Army's after-action reviews. More information is at www.grad.usda.gov/olcc.
  • Richard Karash, one of the graduate school's systems thinking instructors, has a Web site at www.karash.com.
  • The Veterans Affairs National Center for Patient Safety Web site explains how Dr. James Bagian and his staff are applying systems thinking in the health care profession: www.patientsafety.gov.

Brian Friel is a staff correspondent for National Journal.

