In his new book The Watchers (Penguin Press, 2010), Shane Harris chronicles what he calls "the rise of America's surveillance state," a process he's been following since he was a reporter and technology editor at Government Executive from 2001 to 2005. It's a story with all the elements of a spy thriller: political intrigue, shadowy federal organizations and a compelling cast of characters desperately seeking to prevent the next Sept. 11. At the center is the enigmatic John Poindexter, former national security adviser and architect of the ill-fated Total Information Awareness data collection and analysis effort. Earlier this year, Harris, now a correspondent for National Journal, sat down with Government Executive Editor in Chief Tom Shoop to discuss the book and its driving theme: intelligence agencies are a lot better at collecting dots of information than connecting them to predict future attacks. An edited transcript of the interview follows.

Government Executive: Let's start with the Christmas Day bombing attempt. The Obama administration's report on the incident said, "Information sharing does not appear to have contributed to this intelligence failure; relevant all-sources analysts as well as watch-listing personnel who needed this information were not prevented from accessing it. . . . Information technology within the [counterterrorism] community did not sufficiently enable the correlation of data that would have enabled analysts to highlight the relevant threat information." So it sounds like we still haven't found the holy grail on connecting the dots. Why is that?

Shane Harris: There are a couple of overarching reasons. One is that the initial response to 9/11 was to do two things: First, to tear down legal and bureaucratic barriers between agencies, many of which didn't need to be there in the first place. And the second thing that happened was there was this huge emphasis on collecting information.
So the National Security Agency is dispatched to go out and start sucking up phone calls and e-mails. The FBI is dispatched to follow leads and start getting as much information as it can. And the emphasis initially is on collection, because despite the narrative that all the dots about the 9/11 attacks were within the hands of the FBI and the CIA--let's just say for the sake of argument that's true--that doesn't mean there was a huge amount of information within the intelligence community on al Qaeda and on terrorism generally. So the initial instinct is to go out and collect as much as you can.

The other factor has more to do with the fact that connecting the dots is just harder. And the investments that you have to make in technology and in policy are just greater than saying, "You, NSA, go to the phone companies and get hold of as much customer record data as you possibly can." In the White House's recommendations, there's this one line that caught my eye: The president ordered that NSA analysts who were doing counterterrorism should immediately undergo training in how the watch lists and the no-fly lists are constructed. I was actually kind of baffled that they didn't already know, because their information is precisely what is being used to construct those lists.

GE: So is connecting the dots less a technological problem and more of an organizational one?

Harris: Yeah, it's more of a cultural problem, too. It seems so cliché to be saying this, but culture is harder to change than a system. The technology, the policy and the culture just aren't geared toward this kind of real, collaborative effort. The National Counterterrorism Center was supposed to be the place where that started changing. And, arguably, it just hasn't. And even if it started changing in pockets within that organization, that's a very small organization. It is not the intelligence community. And I don't think bureaucratically it has a lot of clout.
GE: Let's go back to the beginning of the story. People tend to think of deep information gathering and analysis--mining data and trying to connect the dots--as being a post-9/11 thing. But it really stretches all the way back to the Reagan administration, at least.

Harris: Right. This is something I discovered as I was reporting; 9/11 really is a midpoint in the story. It's sort of the moment when the public becomes immediately aware of what global terrorism is and, in a very short span of time, starts to understand what failure to connect the dots means--what an intelligence failure around terrorism is. But it does go back to the early '80s--and specifically 1983. I hit upon this in deciding to make John Poindexter this central narrative protagonist: I said, OK, this is obviously someone who has been around government for some time. Let's go back and research and report that aspect of his life when he was in the White House, which was an interesting period anyway.

What I discovered was that it really was the Beirut attack that shocked the intelligence system in a very similar way to 9/11. You have the Marines in Beirut, ostensibly on this international peacekeeping mission. They're hunkered down at the airport. For various political reasons, they're not allowed to go out very much in public. They are sort of sitting ducks. What happens in the aftermath of the bombing is that the intelligence community finds out there were all these warnings that something bad was about to happen to the Marines at the airport. So you had, in the spring of 1983, more than 100 individual warnings about car bombings fielded by the intelligence community. The Marines were blind, deaf and dumb sitting at the base.
And the golden nugget of it all is that NSA intercepted, in the days before the attack, this phone conversation going from a minister in Iran to presumably one of these organizing terrorist groups--directing this group to go and take this spectacular action against the Marines. You add all these up and it looks a lot like 9/11. There's all this information sitting there and it's like, how come nobody's putting it together? And Poindexter is the guy who looks at this and says, "This shouldn't happen and we can take steps to make sure it doesn't happen. There has to be a way to logically approach this problem, systematize the whole process and connect those dots."

GE: What's your ultimate assessment of Poindexter?

Harris: He's basically the godfather of modern counterterrorism. There are other people who have been more celebrated in that area, but there's nobody who's been thinking about this question of how you deliberately engineer the government to address terrorism. There's no one who's been thinking about it as long as he has and as thoughtfully as he has. I think the reason he hasn't been more effective is the political controversy that he became embroiled in. After Iran-Contra, he was tainted and always will bear that scar. And that's always going to be a piece of him. And the controversy over Total Information Awareness is always going to be a piece of him too. But he really is just a human lightning rod. He courts it.

GE: In the book, you describe Poindexter's focus on anonymizing data. If TIA hadn't blown up, would we actually have more protections than in the systems currently in place?

Harris: I think so. One of the great ironies of this story is that John Poindexter ends up being perhaps one of the great defenders of privacy and civil liberties.
It was an essential component of this TIA system that you would have to have a way to technologically shield privacy--to make it so that people who were working on a computer were not allowed to see the identities of the people underneath it. And every single keystroke was monitored. It's not the case that this was ever just a matter of "let's get all the information we can and we'll just use our own internal guidelines about how we can share it and what we can minimize and what we can't." That's what NSA did. NSA took his idea and ran with it with no privacy controls. And I think they would say, "Well, we have internal procedures for that." But what are they? Of course, we don't know.

GE: Was the way events turned out ultimately a vindication of the views of another central character in the book, former NSA and CIA chief Michael Hayden?

Harris: Exactly. He was at the forefront of pushing for a much more aggressive collection of information and a much more aggressive analysis of that information. And he absolutely took what some in government would call a risky interpretation of the law--and the White House did too. Ultimately, the law did vindicate most of what he was doing.

GE: By now, have people voluntarily given up many of the privacy protections that were the subject of so much controversy?

Harris: The easiest example--and it also goes to the generational aspect of this--is if you look at Facebook or MySpace, or any kind of social networking site. People are willingly dumping in information at a level of detail that 20 years ago would have shocked people--that you would voluntarily give over photos, where you live, where you work, who you know. With a site like Facebook, I think even the intelligence community would look at that and say, "I'm not sure we could even devise something with that rich a source of information."

GE: Something that you touch on almost tangentially in the book is the successive reorganizations of the intelligence bureaucracy. What's your net assessment of that?

Harris: Some shake-up is necessary, but ultimately I'm not sure it is very effective. I think the case study for that is the National Counterterrorism Center, where you have this organization that really was supposed to be the national fusion center. I can remember covering this at the time for Government Executive, and there was this sense that this is going to be the place where it all comes together. We're going to have this director of national intelligence and he's going to be in control. We're going to have this center, and everyone's going to come together. I won't say that NCTC is a failure, but if you look at the reasons why it failed to detect the Christmas attack, then you realize that it has very little to do with rearranging the structure. It has everything to do with the culture and with the fact that there's nobody watching over what those analysts are doing--and nobody trying to keep them from being absolutely overtaken by the flood of information they're getting every day. I don't mean to be glib about it, but my ultimate assessment is we did add another layer of bureaucracy with no accountability.

GE: Ultimately, what will it take for intelligence agencies to get better at piecing together the information?

Harris: We've engineered the system and the law toward acquisition of information. That is not the hard part. There are no effective barriers to collecting anything anymore. The government can pretty much get what it wants. People give away information willingly. The question is, what do you do with it? I think you can internally change the community's mind-set toward being one of better analysis, and you also can change the legal mind-set toward creating a statute that talks about what you do with data once you've got it. If you can do those two things, then you start to get toward the kind of system that we need.
It really is about getting these people, ultimately, some relief, and getting some of that burden lifted from them so they can make better decisions.