Security officials urge more research into supercomputing

The nation's investment in supercomputing research and development has played a crucial role in national security, but more investment is needed to resolve numerous computational problems, a key National Security Agency (NSA) official said on Wednesday.

George Cotter, chief of NSA's Office of Corporate Assessments, said a congressionally mandated study on high-end computing R&D concluded that faster computing is needed to help the military create better weapons, aircraft and ships, and to improve the nation's ability to monitor its nuclear-weapons stockpile. Faster computers also are needed to analyze intelligence data and to build better mapping capabilities for the military, he said.

"We have to continue to invest in R&D in these systems," Cotter told attendees of an Army High-Performance Computing Research Center luncheon. "The problem is, [right now] our capacity is limited."

The center has received $4 million in research funding annually from the Army over the past two years, as the Pentagon has increased its focus on using supercomputing for military purposes. The program was initiated in 1990 with $2 million in funding.

That amount dropped in the late 1990s and was recently increased, according to Paul Muzio, a spokesman for Network Computing Services, which provides facilities management for the program. The center is among the programs Cotter examined as part of his study.

He cited the need for advances in computer simulation and modeling, adding that no current computer can meet the demands in several areas.

He said, for instance, that better simulation is needed to identify the changes that occur within nuclear weapons as they sit in inventory, and that better modeling is needed in aerospace design for pilot training and in ship design for creating stealth ships.

Cotter also called for advanced atmospheric modeling in national missile defense to help missiles more precisely hit their targets, and he said more precise modeling of the impact of biological or radiological terrorist attacks is needed.

The work under way at the center, he said, is among the reasons to hope those challenges will be met.

The emphasis on supercomputers declined during the 1990s as the high-end computing industry shifted its research toward parallel and distributed computing.

Vincent Scarfino, manager of numerically intensive computing at Ford Motor Co., said at the luncheon that the shift lowered costs and raised productivity for high-end computing, but that the ability of computers to solve increasingly complex problems came to a standstill. As a result, he said, "there have been no new breakthroughs in applications of science to solving problems."

Cray Inc. has returned to making supercomputers over the past several years, and the government has been among its biggest customers. The center now uses some Cray computers.
