Defense official defends idea of data mining

Public misconceptions of privacy and civil liberties issues surrounding the Defense Department's Terrorism Information Awareness (TIA) program led to its demise, a Defense official said on Tuesday.

The end of TIA, which called for "mining" commercial databases for information on potential terrorists, was the result of "lots of distortions and misunderstandings," Robert Popp, a special assistant to the director for strategic matters at the Defense Advanced Research Projects Agency, said at an event sponsored by the Potomac Institute.

Popp said TIA researchers were pursuing the project under two agendas: operational, and research and development. The operational aspect called for DARPA to provide R&D groups with different technologies to "tie many different agencies together," Popp said. And on the research front, DARPA asked whether "there may be other data in the information space that may be useful for the government to exploit in its counterterrorism."

"Terrorist acts must involve people ... and plans and activities ... that will leave an information signature," he said. DARPA was "extremely public" in detailing its TIA work, Popp added, but that allowed the project to be "distorted in the public."

Asked how he might have handled the situation differently, he said, "When the first onslaught of distortions occurred, we would've been much more public ... to clear the record ... in respect to the public and to Congress."

In place of TIA, perhaps there is a "need for a specific intelligence agency to go after terrorists" with a limited charter, said Kim Taipale, executive director of the Center for Advanced Studies in Science and Technology Policy.

"We have a long way to go on this," said Dan Gallington, a senior research fellow at the Potomac Institute. He called for specific congressional oversight committees to handle the situation.

"The goal is security with privacy," Taipale added. "[That] does not mean balancing security and privacy but maximizing the set of results you want within those constraints."

"It's best solved by using guiding principles, not ... rigid structure or rules that pre-determine where you're trying to get to," he said. "Security and privacy are not dichotomous rivals to be traded one for another in a zero-sum game; they are dual objectives, each to be maximized within certain constraints."

Taipale said, "Technology is not the solution" but only a "tool to allocate resources."

"In a society that is increasingly digitized, technology creates privacy problems," Taipale said. The problem, therefore, he said, is not controversial programs like data mining, but how to respond to the digitized society.

"We really face two inevitable futures," Taipale said. "Develop technologies that are built to provide privacy-protecting mechanisms [or] rely solely on legal mechanisms ... to control the use of technologies."

Taipale said specific tech implementations should be subject to congressional oversight, administrative procedures and judicial review. "It's the classic needle-in-the-haystack problem, [but] even worse, the needles themselves appear innocuous in isolation," he said.