No patch can fix the most vulnerable spot in any network: the user.
Everyone knows not to click on the link in the e-mail claiming your bank account is overdrawn, and not to open attachments from strangers, either; it's a dangerous digital world.
What about an e-mail from your superior, instructing you to follow a link to correct a mistake? The return address appears to come from the trusted .edu domain, the sender is a colonel, and you're a lowly cadet at the U.S. Military Academy at West Point. In spring 2004, you'd probably have done as 80 percent of 512 randomly selected cadets did, and clicked on the link. Oops. By doing so, those cadets activated a beacon that reported their Internet Protocol address and specifics about their operating system and Web browser.
"A cadet, when they see an e-mail and the signature block has a rank of that stature, they click on it-and deal with the consequences later," says Aaron J. Ferguson, a National Security Agency systems engineer who just finished a stint as a visiting professor at West Point's computer science department.
Luckily, the bogus e-mail was part of what has since become a biannual cybersecurity exercise called the Carronade, named after an 18th-century cannon limited in range but very destructive at close quarters. Developed by Ferguson, the virtual Carronade takes aim at the most vulnerable part of network security: users.
Inexperienced college-age military officers are hardly the only ones at risk for what the hacking community calls social engineering: manipulating the norms of social interaction to nefarious ends. For example, 35 out of 100 Internal Revenue Service employees and managers provided their logon names to auditors posing as information technology help desk personnel in late 2004. Those 35 civil servants also agreed to change their passwords to new ones suggested by the fake help desk caller. Some test subjects later said they were having network problems, so a call from the IT desk seemed legitimate. Others said they were hesitant to give out that information over the phone, but their managers said it was all right.
IRS auditors say they didn't take advantage of insider knowledge to get test subjects to cooperate. "We tried to make it fairly generic," says Kent Sagara, acting director of systems security at the Treasury Inspector General for Tax Administration. Any outsider with enough gumption could have repeated those results. The strong network perimeter defense at the IRS mitigates the danger of outsiders logging in with stolen identities, according to the IG. But purely technical solutions to cybersecurity likely will fail unless people using the networks are equally vigilant.
The problem is human beings always will fall short. Take the case of the West Point cadets. Since the first exercise in 2004, cadets have grown more sophisticated in their responses to Carronade e-mails-but so have the probes. Only 28 percent of the 1,000 or so cadets tested in fall 2005 clicked an e-mail link taking them to a Web site requesting they type in their personal information, says Army Lt. Col. Ronald C. Dodge, director of West Point's information technology and operations center.
But about half of another group of 1,000 downloaded attachments. One such e-mail, with a return address of "Web.de," claimed that cadets urgently needed to download a supposed zip file in order to change their passwords. That e-mail happened to coincide with a Defense Department push for everyone to alter their passwords, so the instructions seemed plausible.
"You can have the best firewall in the world, but if your users go out to a Web site and download something malicious, there's nothing you're going to be able to do about it," Dodge says. Security administrators tend to focus on technical defenses and they're getting better at it all the time. But there's no software yet that can prevent criminals from getting insiders to unknowingly do their dirty work for them. Users always will be the weakest link in the system, and security architecture must account for that. Training employees can help, Ferguson says. "Go into any factory today; they have safety signs all over the place-wear your safety goggles, wear your boots," he says. Where are the daily safety reminders for network users?
But even under ideal circumstances, complete assurance will remain elusive, Ferguson adds. Dodge estimates that between 10 percent and 15 percent of users will sometimes fall for some form of social engineering. Technology designed to spot suspicious user activity can add another layer of assurance, but security administrators already tend to focus too heavily on technical solutions. Centering attention, as the government does, mostly on technical means increases the importance of social engineering as a network vulnerability. That means it's going to get larger as a problem in the coming years, predicts one agency chief information officer. He favors requiring biometric scans for logging on; it's hard to give away fingerprint or retina information. Well-thought-out access controls, too, can limit the damage done by one careless employee.
Ultimately, there's no way to completely prevent the problem, says security expert Bruce Schneier. "Social engineering preys on human nature, which is not an easy thing to defend," he says. He agrees social engineering is becoming a bigger problem, mainly because criminals increasingly know that penetrating systems is lucrative work. Network users always will be a vulnerable target, and there's nothing that can completely thwart that exploit. "Welcome to our species," Schneier says.