The Unsung Heroes of Information Security

Inspectors general must assess the effectiveness of security controls, programs and practices in each federal department and agency by Sept. 1. Agencies without IGs must contract with independent evaluators to perform the assessments, which are required by the Government Information Security Reform Act provisions of the 2001 Defense Authorization Act. The innovators who created promising practices and procedures to prevent common security mistakes are the unsung heroes of information security, helping agencies answer three key questions.

The IGs face a huge problem in deciding what to measure and determining whether what they find is good, average or unacceptable. Office of Management and Budget guidance on implementing the law offers no specifics or metrics for the assessments. But benchmarks for security assessments exist in the form of promising practices implemented successfully by agency leaders. IGs who want to know where their agencies stand can ask how far ahead or behind the benchmarks they are.

These benchmarks address the three primary goals of information security:

  • Integrity, meaning information will not be accidentally or maliciously altered or destroyed.
  • Availability, meaning information will be ready for use when needed.
  • Confidentiality, meaning information will be kept secret from all except those who have a right to see it.

When auditors find breaches in any of these areas, they invariably find one or more of three mistakes:
  • Manufacturers delivered the systems with unsafe configurations or with unpatched vulnerabilities.
  • System administrators made an error in system settings or did not apply a critical security patch.
  • Users made an error such as giving out a password or downloading an infected picture or screen saver file from the Internet.

How can we be sure that our systems have all the necessary security patches installed correctly?

Every major type of computer system has security vulnerabilities; new ones are discovered nearly every week. Hackers exploit vulnerabilities to get inside federal computers, deface federal Web sites, store pornography and hacker files, steal government information and launch attacks on other government systems.

Most agencies have policies requiring system administrators to install all security patches, but many are not applied. Why not? Because there are too many patches, because they must be installed in a specific order and, most troubling, because some patches don't work and others have caused computers to stop operating correctly.

Our first unsung security hero, Marcey Kelley of the Energy Department's Lawrence Livermore National Laboratory, led a team that solved the patch problem. Their solution, called SafePatch, automates the process of finding and installing patches. Even more important, Kelley's group does the difficult manual task of testing patches thoroughly before allowing them to be installed. System administrators who maintain standard configurations trust the patches, and they don't even have to install them. SafePatch does it automatically. Any government agency that maintains standard configurations for the systems supported by SafePatch (Solaris and Linux) can contract with Lawrence Livermore to take advantage of the centralized patch validation and testing as well as automated installation. You can reach Marcey Kelley at kelley6@llnl.gov.
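The core idea, a central lab publishes an ordered list of validated patches and each host applies only what it is missing, can be sketched in a few lines. This is a hedged illustration, not SafePatch itself; the manifest contents and patch IDs are hypothetical:

```python
# Sketch of centralized patch validation (assumed design, not SafePatch's code):
# a lab tests patches, publishes them as an ORDERED manifest (order matters,
# since patches often must be installed in sequence), and each host computes
# which validated patches it still lacks.

TESTED_MANIFEST = [            # hypothetical IDs of centrally validated patches
    "108528-13",
    "108652-41",
    "108869-09",
]

def missing_patches(installed):
    """Return validated patches this host lacks, preserving install order."""
    have = set(installed)
    return [p for p in TESTED_MANIFEST if p not in have]

host_patches = ["108528-13"]               # patches already on this host
print(missing_patches(host_patches))       # patches to apply next, in order
```

Keeping the manifest ordered reflects the complaint in the article that patches "must be installed in a specific order"; filtering against a centrally tested list reflects the complaint that some untested patches break systems.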

How can we be sure our systems are configured to withstand the most common attacks?

Even if patches are installed correctly, privacy and security can be compromised when system administrators are unaware of needed configuration settings. Trusting the configuration that is automatically provided by the system vendors is a common error. The vendor-provided configuration is the equivalent of a house with all its doors unlocked in a neighborhood full of burglars. Sure, the vendor tells the buyer to close the doors they don't need open, but most system administrators don't know how.

Most agencies have discovered that new computers never should be connected to the Internet with their vendor-supplied configurations. They use industry security guides or system-hardening programs to "lock down" new systems. But until last year, no one had developed a practical method of correcting configuration errors on the millions of deployed computers, where a configuration change could cause a system shutdown.
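A lock-down tool of this kind typically compares a host's settings against a hardening baseline and reports deviations rather than changing them blindly, since an unreviewed change on a deployed system could cause exactly the shutdown administrators fear. A minimal sketch, with a hypothetical baseline:

```python
# Hedged sketch of a configuration audit: report deviations from a hardening
# baseline instead of auto-correcting them. The baseline entries below are
# illustrative assumptions, not any vendor's or agency's actual checklist.

BASELINE = {
    "telnet_enabled": False,      # clear-text remote login should be off
    "guest_account": False,       # anonymous access should be disabled
    "password_min_length": 8,     # minimum acceptable password length
}

def audit(settings):
    """Return {setting: (found, expected)} for every baseline deviation."""
    return {k: (settings.get(k), v)
            for k, v in BASELINE.items()
            if settings.get(k) != v}

print(audit({"telnet_enabled": True,
             "guest_account": False,
             "password_min_length": 6}))
```

Reporting instead of rewriting lets an administrator review each change before applying it, which is the practical constraint the article identifies for already-deployed machines.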

Dave Nelson, NASA's deputy chief information officer, developed such a method. He based his solution on these assumptions:

  • Hackers generally exploit a few common attacks for which programs are widely available.
  • Focusing system administrators on correcting a limited number of problems would lead them to share solutions and make more rapid progress.
  • Reporting to top management on progress in closing the commonly exploited holes would ensure management support.

Targeting the 50 most commonly exploited vulnerabilities has resulted in a 96 percent reduction in those vulnerabilities across NASA sites, and a welcome decrease in the proportion of successful attacks. We can't publish Nelson's Top 50, lest we provide a road map for attackers. But his work is being carried on by the Center for Internet Security, which is releasing global consensus rulers that check the security configuration of systems and target high-priority vulnerabilities. Nelson can be contacted at dnelson@hq.nasa.gov. The Center for Internet Security is at www.cisecurity.org.

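The management report behind a top-N program reduces to one simple metric: the percentage drop in targeted vulnerability instances between a baseline scan and the current scan. A small sketch (the scan counts below are invented for illustration):

```python
# Progress metric for a top-N vulnerability program: compare instances of
# the targeted vulnerabilities in the baseline scan against the latest scan.

def reduction(baseline_count, current_count):
    """Percent reduction in targeted vulnerability instances."""
    if baseline_count == 0:
        return 0.0                 # nothing to reduce; avoid division by zero
    return 100.0 * (baseline_count - current_count) / baseline_count

# Hypothetical counts: 1,200 targeted findings at baseline, 48 remaining.
print(reduction(1200, 48))         # prints 96.0
```

Tracking a single number per reporting period is what makes the result communicable to top management, the third assumption in Nelson's approach.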
How can we ensure security and system administrators are sufficiently up to date on the latest threats, technology and techniques?

System administrators rarely are trained in security; it's just an option in the most common system administrator certification, the Microsoft Certified Systems Engineer (MCSE), and most MCSEs never study the security material. UNIX systems managers have similar deficiencies.

But more than 2,000 federal system administrators and security professionals have enhanced their security skills through five common certifications:

  • Certified Computer Crime Investigator, for law enforcement officers and private investigators: www.htcn.org.
  • Certified Information Systems Auditor, for security auditors: www.isaca.org.
  • Certified Information Systems Security Professional, for higher-level, nontechnical security managers: www.isc2.org.
  • Checkpoint Certified Security Engineer, for those who manage Checkpoint firewalls: www.checkpoint.com.
  • Global Information Assurance Certification Certified Security Administrator, for system and network administrators, security analysts and security officers: www.sans.org/giactc.htm. (The author's employer sponsors this certification program.)

Some system administrators and security staffers have used the certification process to help the community. They have proved their mastery of the material by creating reports, which have been graded, verified, improved and posted at the Information Security Reading Room at www.sans.org/infosecFAQ/index.htm. Five federal security people who have demonstrated this willingness to share their knowledge are:
  • Jeffrey Payne of the Naval Surface Warfare Center (step-by-step guidelines for securing Microsoft Exchange).
  • Jeff Campione of the Federal Reserve Board (step-by-step guidelines, with graphics, on how to secure Windows NT systems).
  • Michael Sneddon of the National Renewable Energy Laboratory (guidelines for securing Microsoft Exchange servers).
  • Lorraine Williams of the Naval Aviation Systems Command (importance of key length in cryptography).
  • Brian Kelly of the Marshall Space Flight Center (safe firewall practices).

IGs can measure the skills of their security staffs against professional standards set by their peers. They can compare vulnerability reduction programs against NASA's and patch updating programs against the one at Lawrence Livermore.

Alan Paller is director of research for the SANS Institute of Bethesda, Md.

NEXT STORY: Amazon.mil