Tech
Updates on federal IT management, in partnership with Nextgov.com

Linking Supercomputers

Officials at NASA's Numerical Aerospace Simulation (NAS) Facility first began using the most powerful computers available (known as high-performance computers, or supercomputers) a decade ago, with the goal of reducing, if not eliminating, wind tunnel testing of new aircraft. Wind tunnels, while extremely effective at evaluating how an aircraft will perform in flight, were expensive, and because tests took years to complete, major U.S. aircraft companies could not get their products to market fast enough to compete effectively in the global economy. NAS officials, who routinely perform long-term research and development for industry, believed supercomputers and high-end applications could recreate an airplane on screen and simulate factors such as turbulence and air flow.
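
The computational style behind such simulations is grid-based: the air around the aircraft is divided into millions of cells, and the flow equations are stepped forward in time at every cell, which is what made supercomputers necessary. As a rough illustration only, and not any actual NAS code, the Python sketch below transports a one-dimensional pulse of fluid across a grid with an explicit upwind finite-difference scheme, the simplest relative of the methods production flow solvers use.

    # Toy sketch, not a production CFD code: 1-D linear convection
    # solved with an explicit upwind finite-difference scheme.
    import numpy as np

    nx = 101                # grid points across a 2-unit domain
    dx = 2.0 / (nx - 1)     # grid spacing
    nt = 50                 # time steps to simulate
    c = 1.0                 # constant advection (wave) speed
    dt = 0.5 * dx / c       # time step; c*dt/dx <= 1 keeps the scheme stable

    # Initial condition: a square pulse of faster fluid between x=0.5 and x=1.0.
    u = np.ones(nx)
    u[int(0.5 / dx):int(1.0 / dx) + 1] = 2.0

    for _ in range(nt):
        un = u.copy()
        # Each point is updated from its upstream (left) neighbor, so the
        # pulse is carried downstream at speed c.
        u[1:] = un[1:] - c * dt / dx * (un[1:] - un[:-1])

    # The pulse should have moved right by c * nt * dt = 0.5 units.
    print(f"peak now near x = {u.argmax() * dx:.2f}")

Real aerodynamic codes solve the three-dimensional flow equations over vastly finer grids, which is why a single simulation could occupy a supercomputer for days.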

Unfortunately, because of the cost and complexity of advanced computing, NAS officials struggled to advance their goals. Then in 1991, the federal High Performance Computing and Communications (HPCC) program was created. Several agencies signed on and began exchanging data and resources, enabling NAS officials to collaborate with federal scientists and engineers working on climate simulation and other related models.

"Previously, we were trying to solve some specific system software problems but we could only do so much considering the cost of the machines that we needed and...

A Vision Too Grand

When the Defense Department first began publicizing its Corporate Information Management (CIM) plan in 1989, top military brass, congressional leaders and even the press were almost unanimous in their praise.

Certainly the goals of CIM were difficult to criticize, for the change management program promised to streamline computer operations and cut red tape out of one of the world's largest and most bureaucratic organizations. The word "corporate" in the project's name suggested a key goal: to bring tightly organized, business-style management to DoD's sprawling operations.

CIM promised savings of more than $70 billion over seven years, enough money to fund the modernization of defense computer systems and help bolster military readiness. The savings would come out of administrative and logistical operations, through such initiatives as consolidating data centers, merging redundant payroll, procurement and other systems, and reducing inventories through better supply chain management.

The savings generated would pay for an information technology infrastructure capable of supporting modern warfare. So CIM would help not only the pencil-pushers in the bureaucracy but also the young soldiers in the field. It would, in the process, spare the military the embarrassment of...
