ADVICE+DISSENT: Intelligence File

Reading the Signs

Crises occur when we have the wrong kinds of information.

Expect to be surprised. That's a lesson to draw from the two great American crises of the first decade of the 21st century: the Sept. 11 terrorist attacks and the economic meltdown. Neither event came without warning, but few people saw the signals in advance.

And yet, in hindsight, we can clearly see the discrete events that precipitated these disasters. The 9/11 commission attributed the government's blindness to a "failure of imagination." The idea is that the people responsible for thinking about terrorist attacks, mostly career counterterrorism staff on the National Security Council and analysts in the intelligence community, didn't think hard enough about how terrorists could attack the United States.

But that's not so. These people couldn't name the date or the method of the attack, but throughout the summer of 2001, their early warning systems were "blinking red," as George Tenet, then the director of Central Intelligence, told the 9/11 commission in 2004. So why did the information available to them, which consisted mostly of intelligence reports about increased terrorist phone "chatter," fail to point anyone to the plot itself?

Well, perhaps it was the wrong kind of information. Consider the root causes of the economic crisis.

This disaster was touched off by a collapse in the market for mortgage-backed securities. We know that now; at the time, almost no one acted on it. Financial firms made a killing buying up bundles of discounted mortgages and then selling shares of them to investors. As borrowers made their mortgage payments, the securities paid returns. Many of those securities were deemed safe, quality investments by the agencies that routinely grade securities and bonds.

Why? Not because the rating agencies were incapable of assessing risk. Like counterterrorism officials, they were plenty good at that. The rating agencies failed because they were using the wrong information.

The grade assigned to a mortgage-backed security depended on the likelihood that the thousands of individual borrowers whose loans made up the security would default. Those borrowers were the source of the risk to investors. So the rating agencies examined historic default rates, which they deemed tolerably low, and used them to calculate future risk.

Bad idea. As we now know, those securities included a lot of loans to first-time borrowers, many of whom had neither the money for a down payment nor sufficient income to repay their loans. Historic data didn't account for these people, who had never taken out mortgages for a very good reason: they couldn't afford them. The rating agencies apparently never considered how these riskier borrowers would make the securities riskier, too.
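To make the arithmetic of that mistake concrete, here is a minimal sketch in Python. Every number in it is an illustrative assumption, not actual market data; it simply shows how a model built only on historic borrowers underestimates risk once the pool's composition changes.

    # A minimal sketch of the rating agencies' arithmetic mistake.
    # All numbers below are illustrative assumptions, not market data.

    historic_default_rate = 0.02      # assumed rate seen in past mortgage pools

    # Suppose the new security mixes traditional borrowers with first-time
    # borrowers who never appear in the historic record.
    share_new_borrowers = 0.40        # assumed fraction of loans to new borrowers
    new_borrower_default_rate = 0.20  # assumed default rate for those loans

    # A model built only on history predicts the old rate for the whole pool.
    predicted_rate = historic_default_rate

    # The pool's true expected default rate once its composition changes.
    actual_rate = ((1 - share_new_borrowers) * historic_default_rate
                   + share_new_borrowers * new_borrower_default_rate)

    print(f"Predicted default rate: {predicted_rate:.1%}")  # 2.0%
    print(f"Actual default rate:    {actual_rate:.1%}")     # 9.2%

Under these made-up numbers, the historic data says 2 percent of borrowers will default, while the actual pool defaults at more than four times that rate. The model isn't bad at math; it's fed the wrong information.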

In defense of the terrorist hunters, they at least sensed something was coming, unlike just about everyone implicated in the financial collapse. But the Sept. 11 attacks and the economic crisis have this much in common: the people who should have predicted them failed because they didn't have the right information.

The government responded to the attacks by gathering more information. It's hard to say whether this improves its ability to stop the next terrorist strike. But this much we can say: you don't prevent disaster simply by collecting more data.

As we dig out from the ashes of the financial crisis and hear calls for more "transparency" and "reporting" from banks and other institutions, we might want to remember that we had a lot of information before the meltdown. It was just the wrong kind.

Shane Harris, a staff correspondent for National Journal, wrote about intelligence and technology at Government Executive for five years.