Officials press for long-term high-end computing plan

A long-term policy for the nation's high-end computing efforts should not be driven by a perception that the United States is losing its competitive edge in the field, officials said during a Wednesday forum on the issue.

While proponents of high-end computing have repeatedly stressed that the nation's leadership in the area could be slipping, that is not the "main issue" for advancing the industry, said Juan Rogers, an associate professor at the Georgia Institute of Technology.

Rogers outlined a recent study conducted at the institute, which found that U.S. leadership in the field is "understandably a driver of the policy discussion." But to advance, he said, the country needs a "longer-term, pragmatic leadership vision" that is explicitly tied to coordinated national goals.

Government officials agreed with the study's assessment.

"I don't believe the U.S. has lost its leadership" in high-end computing," said Dona Crawford, with the Lawrence Livermore National Laboratory, which uses supercomputing technology to test the country's nuclear-weapons stockpile. Crawford noted that the United States houses seven of the world's top 10 high-end computing machines.

An official with the National Energy Research Scientific Computing Center said 50 percent of the world's top 500 machines are U.S.-based, and more than 90 percent are U.S.-built. "Pragmatic U.S. leadership is well established and unchallenged," the center's Horst Simon said.

To advance high-end computing, the agenda's focus must shift from a "short-term crisis perception" to a long-term approach that balances the various supercomputing efforts contributing to all the relevant national goals, such as national security, in a coordinated and consistent fashion, Rogers said.

John Grosh, deputy undersecretary at the Defense Department, echoed Rogers' assessment, saying that the policy must move from "event-driven investments to one based upon a national planning process."

Rogers also said fragmented research and development across agencies has been a disadvantage for high-end computing because the fragmentation tends to force "stereotypical forms of division of labor" that may lead to unnecessary duplication. High-end computing needs "its own high-level coordinating body, a sustained policy and coordination effort, and the continuing attention of the [White House] Office of Science and Technology Policy," he said.

Rogers recommended that the national initiative include incentives for researchers in government, academia and industry to explore alternatives in hardware and software. But he argued that advancing high-end computing cannot be based on "give us more money," saying instead that a policy must be able to succeed during tight budget years.

Rogers also suggested that the national program have built-in time frames to assess progress in achieving policy goals because policy for high-end computing is a moving target. "Today's high end is tomorrow's mainstream," he said.
