Gazing into the Internet's Future

The Keck telescopes atop the Mauna Kea volcano in Hawaii are the largest optical and infrared stargazers in the world. Yet, an ocean away and 13,796 feet above sea level, the telescopes are too remote for many government and university researchers to use. Astronomers scattered across the country have experimented with using the Internet to control the telescopes and transport the images they produce. But they need reliable networks with greater capacity for transferring massive amounts of data.

Using the Internet to focus telescopes on distant nebulas and download high-quality images presents numerous problems, the greatest being that the process is unreliable. The Internet excels at transporting small bits of information, but not uninterrupted streams of data. As a result, it can handle only low-quality audio and video broadcasts.

But solutions are in sight. The federal government has funded efforts to develop the Internet's next generation of high-performance networks. These advanced networks dwarf the Internet's current capabilities. Imagine watching a flawless, high-definition television (HDTV) broadcast or listening to better than CD-quality music over the Internet, all at the click of a mouse. Imagine an Internet that can prioritize data or video traveling to your computer, or an Internet that delivers information 100 to 1,000 times faster than it does today. The work is still in progress, but the advances are staggering.

In 1996, the National Science Foundation, the federal agency responsible for promoting advanced scientific and technological research, began searching for ways to create faster networks to help scientists collaborate and use remote resources such as supercomputers. The Internet had admirably supported scientists and researchers for years, but it had become too clogged with commercial and consumer traffic for the researchers to do their jobs.

In 1997, work began on the Next-Generation Internet (NGI), a program that could transcend the limitations of time and space. NSF was the initiative's visionary, but the Clinton administration formalized it by funding and coordinating efforts to develop advanced research networks at six agencies: the Defense Advanced Research Projects Agency, the Energy Department, NASA, the National Institutes of Health, the National Institute of Standards and Technology and NSF.

History Repeating

The federal government is no stranger to Internet development. In 1969, DARPA (then known simply as ARPA) launched the Internet's progenitor, Arpanet, to help researchers in academia and government transfer data to each other. By the early 1970s, Arpanet was a viable network used by hundreds of scientists and engineers, and it led to the first e-mail program.

In 1986, the National Science Foundation took over the Internet mantle when it launched the NSFNET, which provided Internet access to a wide array of universities. Then in 1991, Tim Berners-Lee, a computer scientist at the European Organization for Nuclear Research (CERN), created the World Wide Web for scientists collaborating in the area of high-energy physics. The Web, which connects users to data all over the world through the Internet's networks, opened the Internet to millions of people in all sorts of fields, even private citizens.

When use of the Internet skyrocketed in 1995, NSF transferred responsibility for the huge series of networks to the private sector. Yet this left universities without access to networks that could help them accomplish their research goals. To bridge the gap, NSF contracted with MCI WorldCom in 1997 to build a national network. The vBNS (very high-performance Backbone Network Service) is part of the Internet but is available only to approved users. It allowed the research community to create higher-speed networks.

"Making the Internet 100 to 1,000 times faster was a novel challenge," says George Strawn executive officer of the NSF's Computer and Information Science and Engineering Directorate. The NSF initially intended vBNS to connect five university supercomputing centers. But more and more universities interested in high-speed networking began using vBNS.

Next-Generation Internet projects have yielded advanced networks for a variety of purposes. NASA built its NASA Educational and Research Network (NREN) to connect scientists all over the country. Energy operates the Energy Sciences Network (ESNET), which is the fastest backbone network in wide use for linking national laboratories studying high-energy and nuclear physics. The Navy Marine Corps Intranet will use vBNS to funnel massive amounts of data to installations and ships all over the world. DARPA has pieced together its SuperNet test bed to research high-speed networks for the Defense Department and the intelligence community.

"The Internet we know today came from the same collaborators: academia, government and commercial enterprises," says Greg Wood, director of communications for Internet2, a university-led consortium of companies, schools and government research laboratories working alongside the Next-Generation Internet program to develop high-performance Internet technologies.

Traffic Cops

The Internet, which is the ultimate network of networks, holds the key to collaboration for agencies. The Internet relies on networks built out of fiber-optic cables to channel information to its destinations. But much of that fiber is underutilized: its capacity far exceeds the ability of the hardware used to transmit data. One Next-Generation Internet goal is to maximize the potential of fiber-optic cables.

Yet the Next-Generation Internet program isn't just about widening the lanes on the information superhighway to accommodate more traffic. Researchers also are creating smarter vehicles that can take advantage of the capacity that is already there. Ken Freeman, NASA's NREN project manager, says developers create compact, fuel-efficient software applications when they have limited bandwidth. But with wider lanes, developers can create the tractor-trailers of applications. High-quality videoconferencing is just one example of an application that voraciously eats up network bandwidth.

The Internet is made up of 8,000 networks, which makes consistent operations difficult. Information travels across the Internet in a vast series of overlapping and concentric circles, and as data navigate those circles, network performance diminishes. Yet applications such as videoconferencing require data to travel without interruption. Next-Generation Internet developers have been testing quality-of-service software that would guarantee a slice of bandwidth for certain data to ensure consistent end-to-end performance. Quality-of-service software functions as a traffic cop at the Internet's intersections, waving priority traffic through and meting out bandwidth to less-important data. HDTV broadcasts, high-quality videoconferencing and remote control applications would take precedence because they cannot tolerate interruption. Researchers also are perfecting multicasting, a technique that delivers a single video stream to many recipients at once rather than duplicating it for each viewer. Unlike today's Internet multimedia applications, the combination can guarantee quality and performance.
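
For the technically curious, both ideas survive in today's networking stacks: priority marking lives in the IP header's type-of-service (DSCP) field, and multicast delivery uses special group addresses. The Python sketch below is a minimal illustration built on standard socket options, not the NGI software itself; the group address, port and payload are hypothetical.

    import socket
    import struct

    GROUP, PORT = "239.1.1.1", 5004   # hypothetical multicast group and port

    # Receiver: join the multicast group. The network forwards a single
    # copy of each packet per link, however many listeners subscribe.
    receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    receiver.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    receiver.bind(("", PORT))
    membership = struct.pack("4s4s", socket.inet_aton(GROUP),
                             socket.inet_aton("0.0.0.0"))  # any local interface
    receiver.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)

    # Sender: mark outbound packets as high priority via the IP TOS byte.
    # DSCP 46, "Expedited Forwarding," is the class routers commonly
    # reserve for latency-sensitive audio and video.
    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sender.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, 46 << 2)
    sender.sendto(b"priority video frame", (GROUP, PORT))

    data, addr = receiver.recvfrom(2048)
    print(f"received {len(data)} bytes from {addr}")

Routers that honor the priority mark wave those packets through congested hops first, which is precisely the traffic-cop role described above.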

NASA's NREN, built by Qwest Communications International Inc., will connect at least seven NASA installations from California to Virginia. When completed in 2002, NREN will provide the multicasting and quality-of-service features that allow engineers to collaborate on far-flung projects. NREN will be an OC12 network, capable of moving data at 622 megabits per second, more than 11,000 times faster than the standard modems that connect to the Internet at 56 kilobits per second. Eventually, the network could handle OC192 circuits, which run 16 times faster than OC12.

Freeman says NASA needs all this firepower because it is experimenting with multicast video technology and digital imaging, which would allow multiple people to see the same image at the same time. The agency tested the technology with astrobiologists and doctors at five locations, who were able to view a human heart and collaborate in real time. NASA Administrator Daniel Goldin recently called for technologists to create an environment in which engineers based around the world could gather in a virtual 3-D space to view and share data at the same time.

NASA hopes to use NREN to remotely control electron microscopes, expensive machines based in only a few locations throughout the country. Scientists would benefit from a high-speed network that allows them to operate a lens remotely while viewing cells or microscopic fossils. "There is always a desire to control an instrument that is someplace else," Freeman says. A microscope's value increases when scientists no longer are limited by geography. But remote control depends on a network that can transmit commands in real time. In the case of the electron microscopes, the timing of those commands is so critical that a delayed instruction could send a lens smashing into a specimen.
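
What does that timing constraint look like in software? The sketch below is a hypothetical illustration, not NASA's actual control code: a client sends a lens command and refuses to continue unless the acknowledgment returns within a fixed round-trip budget. The instrument address, port, deadline and command syntax are all invented for the example.

    import socket
    import time

    # Hypothetical endpoint and timing budget -- invented for illustration.
    INSTRUMENT = ("microscope.example.gov", 9000)
    DEADLINE = 0.050   # 50 ms round trip before motion is considered unsafe

    def send_command(sock, command):
        """Send one lens command; return False if the acknowledgment
        misses the deadline, so the caller can halt the stage."""
        start = time.monotonic()
        sock.sendall(command)
        sock.settimeout(DEADLINE)
        try:
            ack = sock.recv(64)
        except socket.timeout:
            return False
        return bool(ack) and (time.monotonic() - start) <= DEADLINE

    with socket.create_connection(INSTRUMENT, timeout=1.0) as sock:
        if not send_command(sock, b"LENS MOVE Z -0.5um\n"):
            sock.sendall(b"LENS HALT\n")   # fail safe: stop before contact

The fail-safe halt is the important design choice: on a congested network, the safest response to a missed deadline is to stop the lens, not to retry the motion blindly.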

DARPA's Next-Generation Internet program focuses on high-capacity networks to transfer high-resolution images fast. Its SuperNet test bed consists of multiple networks designed to guarantee end-to-end performance of a gigabit per second. The data transfer rate of a gigabit is 17,857 times faster than that of a standard modem. SuperNet connects 24 government and commercial research laboratories. One SuperNet backbone runs at 2.5 gigabits per second while another runs at 10 gigabits per second. Still another can scream up to 40 gigabits per second, or 64 times faster than NASA's NREN.
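
The speed comparisons quoted in the last few paragraphs are easy to verify. The short Python check below reproduces each ratio from the stated line rates; it is back-of-the-envelope arithmetic, not a measurement.

    # Back-of-the-envelope check of the speed ratios quoted above.
    modem = 56e3          # standard dial-up modem: 56 kilobits per second
    oc12 = 622e6          # NREN's OC12 circuit: 622 megabits per second
    gigabit = 1e9         # SuperNet's guaranteed end-to-end rate
    supernet_top = 40e9   # SuperNet's fastest backbone

    print(oc12 / modem)          # ~11,107 -- "more than 11,000 times faster"
    print(16 * oc12 / 1e9)       # ~9.95 -- an OC192 circuit, in gigabits
    print(gigabit / modem)       # ~17,857 -- gigabit vs. standard modem
    print(supernet_top / oc12)   # ~64.3 -- "64 times faster than ... NREN"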

Such performance is vital to numerous intelligence applications. For instance, Mari Maeda, program manager of the Next-Generation Internet program at DARPA, says it is possible to take multiple data streams from low-resolution radars and combine them on the Internet at a high resolution. Maeda says such a technique could be used to determine what foreign spy satellites are targeting.

Next-Generation Internet has opened the door to high-speed networks, NSF's Strawn says. The program has allowed MCI WorldCom and Qwest to hone their skills in building, operating and maintaining high-performance networks. And now that such networks exist, agencies seeking high-speed Internet services will encounter less of a wait and lower costs.
