The Superheroes of Computers

Supercomputers being tapped to simulate nuclear explosions and predict weather also hold potential for use in homeland security.

As Hurricane Isabel barreled toward the mid-Atlantic coast in September, satellites, hurricane chaser airplanes, and weather buoys collected thousands of measurements about the coming storm. The sensors relayed their eyewitness accounts to an enormous electronic brain, a supercomputer housed in Maryland that was crunching data around the clock, trying to predict where and when Isabel would land.

The supercomputer, a mass of more than 1,400 powerful computer processors, performs mathematical computations to predict weather patterns so quickly it would make the prescient science fiction writer H.G. Wells blush. A human being would have to work 24 hours a day for 15,000 years to perform the number of calculations this computer churns out in one second.

It's because of the supercomputer's awesome abilities that forecasters with the National Weather Service, which uses the system, are able to predict a hurricane's path three to five days in advance. Those predictions also tell disaster management officials which towns and cities lie in the storm's most intense swaths.

The federal government has long been at the forefront of supercomputer research and development, and today its agencies are among the biggest users of the machines. Just as desktop computing power has increased exponentially, so too has the ability of supercomputers to process data. The fast machines are being tapped for a number of research projects, including the simulation of nuclear explosions, weather forecasting and astrophysics, and they also hold potential for use in homeland security and counterterrorism measures.

PLAYING WITH MODELS

Predicting the weather may offer the best glimpse into how supercomputers work, and how they've improved over the years.

The Weather Service receives about 100 million weather "observations" every day, says Kevin Cooley, the chief information officer for the National Centers for Environmental Prediction, which, along with the Weather Service, is housed at the National Oceanic and Atmospheric Administration. The data, such as wind speed, air temperature and barometric pressure, is collected from many of the same sources used to track a hurricane's progress, and also from land-based observation points and ships at sea.
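Each observation amounts to a small structured record. Here is a sketch of what one might look like; the field names and units are illustrative, not NOAA's actual schema:

```python
# One of the roughly 100 million daily observations, sketched as a
# record. Field names and units are illustrative, not NOAA's schema.
from dataclasses import dataclass

@dataclass
class Observation:
    source: str            # buoy, ship, aircraft, land station, satellite
    latitude: float
    longitude: float
    wind_speed_ms: float   # wind speed, meters per second
    air_temp_c: float      # air temperature, degrees Celsius
    pressure_hpa: float    # barometric pressure, hectopascals

ob = Observation("buoy", 35.0, -75.4, 22.5, 24.1, 988.2)
```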

The information is fed into the supercomputer, which is housed in a facility owned by its creator, IBM, in the Washington suburb of Gaithersburg, Md. The supercomputer is actually two sets of fast processors, more than 1,400 in all, in what's known as a parallel system. The half known as Frost makes weather-related calculations used in forecasting, while the other half, called Snow, constantly looks for ways to improve the supercomputer's software, thereby making weather predictions more accurate and more detailed.

Frost and Snow take up 7,000 square feet of floor space. They're fed 24 hours a day, digesting data at the rate of 450 billion calculations a second. Their electronic stomachs hold 42 terabytes of data, about 1,000 times more storage than a typical desktop computer.
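Those figures hang together. A quick back-of-the-envelope check, assuming a person manages one calculation per second and a typical desktop of the day holds about 42 gigabytes (both assumptions, not figures from the agencies):

```python
# Sanity check of the throughput and storage comparisons above,
# assuming one calculation per second for a human worker.
machine_rate = 450e9                   # calculations per second
seconds_per_year = 365.25 * 24 * 3600
print(f"{machine_rate / seconds_per_year:,.0f} years")
# ~14,260 years of round-the-clock human work per machine-second,
# close to the 15,000 cited earlier

desktop_bytes = 42e9                   # assumed 42 GB desktop disk
print(42e12 / desktop_bytes)           # 1000.0: the storage ratio
```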

Each model run produces a picture of how the atmosphere is likely to behave at some point in the future, Cooley says. That point could be three hours away, or seven days out. Thousands, or perhaps millions, of weather observations need to be factored in and put in the context of current weather conditions to produce a forecast. But weather watchers want forecasts over an extended period, and whenever the forecast is extended, Frost and Snow have to recompute their data.
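In miniature, that recomputation is a time-stepping loop: the model advances the state of the atmosphere in small increments, so a longer forecast means proportionally more arithmetic. Here is a toy sketch, with a one-dimensional advection equation standing in for the real physics; the grid, time step and wind speed are illustrative inventions, not Weather Service values:

```python
# Toy forecast model: advance a 1-D temperature field step by step.
# Doubling the forecast horizon roughly doubles the arithmetic, which
# is why extending a forecast forces the machines to recompute.
def forecast(temps, wind_speed=10.0, dx=1000.0, dt=60.0, hours=3):
    steps = int(hours * 3600 / dt)
    c = wind_speed * dt / dx          # Courant number; keep below 1
    state = list(temps)
    for _ in range(steps):            # one pass per time step
        prev = state[:]
        for i in range(1, len(state)):
            # upwind finite difference: each point drifts downwind
            state[i] = prev[i] - c * (prev[i] - prev[i - 1])
    return state

initial = [15.0] * 50 + [25.0] * 50        # warm front on a 100-point grid
print(forecast(initial, hours=3)[47:53])   # 3-hour outlook near the front
print(forecast(initial, hours=24)[47:53])  # 24-hour outlook: 8x the work
```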

Ultimately, the Weather Service's forecasts are made available to anyone who wants them. The agency's regular customers are commercial weather forecasters, including television news meteorologists, and also other federal agencies, particularly the Defense Department and the Federal Emergency Management Agency, which, among other things, is responsible for responding to weather disasters.

U.S. businesses also are key customers. "The U.S. economy is very weather-sensitive," Cooley says. About 10 percent of the gross domestic product depends on agriculture, construction, energy, and leisure and entertainment, all weather-sensitive industries, he says. The economy loses as much as $12 billion each year to severe weather, Cooley notes.

Cooley says the Weather Service is unique among government supercomputer users in that it serves such a wide audience. The Weather Service produces more than 5 million weather products a day that forecasters use in their predictions, and forecasters want the data quickly. "If something is five or 10 minutes late, that's a big problem," Cooley says. The Weather Service measures itself in part on how fast it can provide information.

Frost and Snow are descended from a line of supercomputers that began at the Energy Department's national laboratories. And it's there that government technologists now are experimenting with new ways to craft supercomputers on the cheap.

SUPER CLUSTERS

Largely because of the enormous price tag for some supercomputers, which can run into the hundreds of millions of dollars, and because commercially available processors now are so fast, technologists are building supercomputers out of everyday machines.

Known as clusters, these supercomputers are a collection of off-the-shelf processors linked together with high-speed data cables.
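The principle is simple to demonstrate. The sketch below splits one large computation into slices, hands each slice to a worker and merges the partial results; a real cluster does the same across separate machines over a network, typically with a message-passing library such as MPI, rather than with local processes on a single box:

```python
# Divide-and-combine, the heart of cluster computing: split a big
# problem into slices, let each processor work its slice, then merge.
# Local processes stand in here for a cluster's separate machines.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds                    # this worker's slice
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n, workers = 10_000_000, 16
    chunk = n // workers
    slices = [(k * chunk, (k + 1) * chunk) for k in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, slices))
    print(total)                       # same answer, a fraction of the wait
```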

Clusters like those Michael Warren builds are now in vogue. A staff member in the theoretical astrophysics group at the Los Alamos National Laboratory, Warren developed a clustered supercomputer in 1996, which has served as a model for subsequent machines.

Warren's first cluster, nicknamed Loki, after the Norse god, was built from 16 processors that, at the time, were decidedly high-end.

Today, those 200-megahertz processors look painfully slow, but at the time Loki won a prestigious computing award for the best overall performance for the money.

In the mid-1990s, traditional supercomputing giants were stuck in financial doldrums as consumers and businesses bought personal computers in abundance. Government researchers were in a bind. The computer codes they used for their models were so complex it could take a year to get one up and running properly on a traditional supercomputer, Warren says. If the company that made the machine went bankrupt the next year, the researchers were left high and dry.

But Warren, along with colleagues at NASA and the California Institute of Technology, realized that personal computers would keep improving. If the researchers could get their codes running on clustered machines, and be assured that every few months a faster processor would be on the market, they could keep upgrading their machines without betting everything on a single supercomputer maker's survival.

"The cluster technology is amazing," because thousands of processors can be harnessed together, says Tim Keenan, president of High Performance Technologies Inc. of Reston, Va., which has built supercomputers for agencies such as NOAA. But they have their limitations. Beyond about 4,000 processors, physics starts to get in the way of business, Keenan says. The machines generate so much heat they'd need a specially cooled room to keep them from melting down. Since one point of clustering is to keep prices low, there are obvious physical limitations to their potential.

Nevertheless, experts agree that clustering is the wave of the future, mainly because commercial processors are so sophisticated and powerful. And that opens up a new realm of applications for supercomputers.

TECHNOLOGY AND TERROR

Supercomputing could greatly enhance intelligence analysis, technologists say. Keenan, whose company has developed technology for narcotics investigations, says analysts use techniques known as link analysis and pattern recognition to look for connections between data points. This is a job supercomputers were meant for, he thinks.

Link analysis starts with a body of known interactions or relationships, a few thousand of them, among people, places or events, for instance. An analyst then runs algorithms against those data points to find potential connections, far more data than a human brain can process quickly.

Supercomputers can be tapped to perform the immense number of calculations it takes to come up with a probability that, for example, two criminal suspects are connected. Did a man arrested for cocaine possession have any connection to a suspected terrorist recently stopped at a border crossing? It's a wild question, perhaps too broad for one human investigator. But a supercomputer might be able to solve the puzzle almost instantly.
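At bottom, that kind of question is a graph search. A minimal sketch, with invented records standing in for real investigative data; production systems run far richer link-analysis algorithms over millions of points, which is where the supercomputer earns its keep:

```python
# Link analysis in miniature: people, places and events are nodes,
# recorded interactions are edges, and a breadth-first search finds
# the shortest chain connecting two suspects. All records invented.
from collections import deque

links = {
    "suspect_a": ["phone_555", "warehouse_9"],
    "phone_555": ["suspect_a", "suspect_b"],
    "warehouse_9": ["suspect_a"],
    "suspect_b": ["phone_555", "border_stop_17"],
    "border_stop_17": ["suspect_b"],
}

def connection(start, goal):
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path                 # shortest chain of links
        for nxt in links.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None                         # no recorded connection

print(connection("suspect_a", "border_stop_17"))
# ['suspect_a', 'phone_555', 'suspect_b', 'border_stop_17']
```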

The applications for supercomputers seem as limitless as their technological capacity. Researchers could use the machines to predict disease outbreak patterns or the movements of chemical plumes in a terrorist attack. Supercomputers are already mapping out more benign subjects, including the behavior of giant supernovas, the explosive deaths of collapsing stars, and the human genome, which has been sequenced for the first time in recent years largely because human beings have built a brain that works faster than our own.

And those brains continue to grow stronger. IBM, which built Frost and Snow, is now developing two systems for the Energy Department that will have one and a half times the computing power of the previous top 500 supercomputers combined, says Tom Burlin, a partner with IBM Business Consulting Services.

Where do the machines go from there? Burlin says such predictions move into the realm of "space odyssey stuff," alluding to Arthur C. Clarke's 2001: A Space Odyssey, in which astronauts battle with a computer, called HAL, that has learned to think on its own.

According to Burlin, such a vision is not that far-fetched. "These computers will be thinking more like the human mind" as they evolve, he says. Researchers hope supercomputers will eventually be autonomic, or self-healing, so that they can "reason what's happening to them and correct it themselves," minimizing the need for human maintenance. The idea is that the computers would process calculations so fast that they would figure out what was wrong with them before a human being could.

Science fiction author Clarke might have been telegraphing a message to his readers when he named his supercomputer HAL, an acronym whose letters are each one step removed from IBM.
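The shift is easy to check:

```python
# Advance each letter of HAL one step through the alphabet.
print("".join(chr(ord(c) + 1) for c in "HAL"))  # prints IBM
```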

That eerie vision of the future aside, however, Clarke was right in sensing that the power of these machines is restrained much more by physical demands than by imagination. Human beings may not be able to think as fast as their creations, but they've proven plenty capable of conceiving imaginative uses for them.