Information Insurance

Agencies make plans for retrieving data if their systems or offices come crashing down.

With Sept. 11 still fresh in their minds, federal information managers are taking a close look at agencies that quickly got their people back to work and their systems up and running after losing their offices and critical electronic data. Employees of the Secret Service and the Securities and Exchange Commission, for instance, who worked in Building 7 of New York's World Trade Center, were back at work at other sites just days after their building collapsed.

With the need for recovery plans proven beyond any doubt, information managers across government are seeking the best ways to back up critical data in off-site locations.

Much as President Bush ordered a cadre of senior executives into secret, fortified bunkers to ensure the continuity of government in the event of a catastrophic attack on Washington, information managers are hoping to create shadow versions of their agencies' databases safely away from their headquarters.

Data storage vendors across the federal market (and there are dozens) report huge interest among their government customers for disaster recovery and backup services. It seems everyone is looking for an information insurance policy, and it's easy to understand why. Any loss of data can be disastrous, says Nigel Turner, senior vice president of storage management at Computer Associates, an Islandia, N.Y., business software manufacturer that has long counted intelligence and Defense agencies among its clients. As an executive from a leading travel reservations firm told a team of analysts from Framingham, Mass.-based technology research group IDC, "We have no business without storage. Everything in the office can be replaced, but if I lose data, I am out of business."

Wall Street financial firms learned firsthand that even a temporary loss of access to data can be devastating. Corporate executives told federal emergency officials helping with disaster recovery in lower Manhattan that their firms could lose millions, even billions, of dollars each hour their employees couldn't access customer information on servers in damaged offices. Federal agencies also felt that pain. The SEC lost two full weeks of data because it hadn't yet mailed its latest backup tapes to Washington, where they're archived. Only recently, preparations for the much-anticipated Y2K computer glitch gave rise to a wide range of data recovery plans at many agencies. But despite this progress, the question remains whether sending out backup tapes at two-week or even two-day intervals is good enough. As a result, agencies are hungry for better solutions.
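The arithmetic behind that question is straightforward, and a rough sketch makes it concrete. The intervals below are illustrative assumptions, not figures from any agency: under a tape-shipping scheme, the data at risk is everything written since the last shipment, so the worst case is roughly the backup interval plus transit time, while continuous off-site replication shrinks that window to minutes.

    # Illustrative only: worst-case data loss for tape shipments
    # versus continuous replication. All intervals are assumptions.
    def worst_case_loss_hours(backup_interval_hours, transit_hours):
        """Data written since the last shipped backup is at risk."""
        return backup_interval_hours + transit_hours

    two_week_tapes = worst_case_loss_hours(14 * 24, 48)  # ~384 hours at risk
    two_day_tapes = worst_case_loss_hours(2 * 24, 24)    # ~72 hours at risk
    continuous = worst_case_loss_hours(0.1, 0)           # minutes, not days

    print(two_week_tapes, two_day_tapes, continuous)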

The federal data storage market was strong well before Sept. 11. Federal Sources Inc., a federal technology market analysis firm in McLean, Va., pegged the market's value at $544 million last fiscal year. FSI says agencies could spend as much as $584 million on data storage this fiscal year, thanks to the new focus on disaster recovery plans and, even more important, continuous data storage.

BACK IT UP

An FSI report says the federal government is "uniquely positioned" to move the entire storage market forward by deploying the latest technologies. Two breeds of storage devices are on the leading edge today. The first are storage area networks (SANs), interconnected storage devices and backup systems that include servers for storing data on disks. If the human body were a SAN, the organs would be storage devices and the web of veins and capillaries connecting them would be the fiber optic network through which data move. Next are network-attached storage (NAS) devices. These apparatuses occupy a position on an organization's network, but don't compete with its main server for the resources needed to process files and programs. NAS devices store information on hard disks and are generally considered stepping stones toward more complex SAN systems.
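For those who want the distinction in more concrete terms, the toy sketch below models it in Python. The names are illustrative, not any vendor's interface: a SAN presents raw blocks over its own dedicated fabric and lets attached servers run their file systems on top, while a NAS presents whole files over the same network everyone else uses.

    # Toy model of the SAN/NAS distinction described above. Illustrative
    # names only; real systems speak Fibre Channel, iSCSI, NFS or CIFS.
    class BlockDevice:
        """A disk array on the SAN: reads and writes fixed-size blocks."""
        def __init__(self, blocks=1024, block_size=4096):
            self.store = {i: bytes(block_size) for i in range(blocks)}

        def write_block(self, number, data):
            self.store[number] = data

        def read_block(self, number):
            return self.store[number]

    class NasAppliance:
        """A file server on the LAN: clients ask for files, not blocks."""
        def __init__(self):
            self.files = {}

        def write_file(self, path, data):
            self.files[path] = data

        def read_file(self, path):
            return self.files[path]

    # A server attached to the SAN sees raw storage and runs its own file
    # system on top of it; a NAS client simply names the file it wants.
    san_disk = BlockDevice()
    san_disk.write_block(0, b"database page")

    nas = NasAppliance()
    nas.write_file("/shared/report.txt", b"quarterly numbers")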

The National Security Agency, the Veterans Affairs Department and the Army are all investing in storage this fiscal year, FSI reports. NSA will deploy a storage area network to consolidate its infrastructure of servers, according to Chief Information Officer Richard Turner. And NASA is using SAN devices to manage its grid computing system, a collection of mainframes, servers and networks in different places that are harnessed together to provide more speed and act as one large supercomputer.

The military services have some of the greatest data storage demands because of the size and scope of their missions. The Air Force Research Laboratory, headquartered at Wright-Patterson Air Force Base, Ohio, uses a SAN to help power jet fighter simulators. F-16 pilots fly in simulators that might be hundreds of miles apart, yet are able to see one another's planes on their screens as if they were all flying in formation. These mock flying runs require, and generate, tremendous amounts of data. In one instance, four F-16 simulators shared one database that stored all the information they needed. The database was housed in a storage subsystem attached to each simulator by a SAN.

Government scientists similarly need massive data storage capabilities to run their experiments. At Sandia National Laboratories in New Mexico, researchers run simulated tests of nuclear weapons to better understand how the weapons would behave under specific conditions. These simulations create huge amounts of data that are useless to scientists unless they can be fashioned into a visual model, says Milton Clauser, a principal member of Sandia's technical staff.

Despite the promise of sophisticated storage designs such as SAN and NAS, no single storage device or scheme, whether for disaster recovery or research, has emerged as the clear leader. Some agencies have opted for simpler models to suit their needs. The Vicksburg, Miss., district office of the Army Corps of Engineers is using individual servers for storage. The agency is evaluating whether new kinds of storage technology would be beneficial, but, says Mary Anne Woods, chief of the Vicksburg District Information Management Office, "The SAN [and] NAS technology is just now evolving into a viable alternative."

"Because of advances in disk technology and relatively inexpensive disk storage, we do not utilize other storage management techniques," Woods says. The Vicksburg District manages the storage needs of 16 Corps of Engineers sites in three states. "The ability to maintain highly available backups for disaster recovery is a priority," Woods adds.

But simply backing up data to an alternate site isn't good enough, as the SEC's experience in New York demonstrates. Organizations have traditionally created tape replicas for archival purposes, says Bob Guilbert, vice president of marketing and business development at NSI Software, a data storage software maker in Hoboken, N.J. But while keeping records is the key to storage, having an archive doesn't always provide immediate access to data. And since homeland security demands that agencies be able to use not only their internal information but also data from other agencies, information retrieval is quickly becoming linked to storage as never before.

In addition to being almost entirely disconnected from one another, agencies' data stores are also badly organized. "There's information all over the place, in all kinds of different repositories," says Dan Agan, senior vice president of marketing and corporate development for Vienna, Va.-based Convera. "There's a very large movement toward making sure we have access to everything." Agan's firm makes information retrieval software that sifts through mounds of data in numerous repositories to find certain words and relationships of ideas to help make sense of seemingly disconnected documents, messages and even video footage. The software is widely used by intelligence agencies and by the Defense and Justice departments.

GET TO IT

But in addition to creating access to data, federal information managers have to contend with how to manage the storage infrastructure itself. The biggest problem they face today, says Computer Associates' Turner, is that there's actually a storage glut.

"In the private sector, we see storage [space] growing . . . at a phenomenal rate," says Turner. He estimates that the amount of space could be ballooning as much as 70 percent a year. And given that more and more information can be squeezed into smaller spaces, it seems unlikely the government or the private sector will run out of storage space anytime soon.

Turner proposes that agencies manage their storage capacity the same way network administrators try to manage their systems, keeping track of every new device that's added to a storage scheme and distributing data evenly over an entire organization, rather than overburdening one set of devices.
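A minimal sketch of that discipline, using hypothetical device names and capacities, might look like the following: every device is registered as it is added, and new data lands on whichever registered device is least utilized rather than piling onto one favorite array.

    # Hypothetical capacity tracker in the spirit of Turner's suggestion:
    # register every storage device and place data on the least-utilized one.
    class StoragePool:
        def __init__(self):
            self.devices = {}  # name -> {"capacity": GB, "used": GB}

        def register(self, name, capacity_gb):
            self.devices[name] = {"capacity": capacity_gb, "used": 0.0}

        def place(self, dataset_gb):
            # Pick the device with the lowest utilization that still has room.
            candidates = [
                (info["used"] / info["capacity"], name)
                for name, info in self.devices.items()
                if info["capacity"] - info["used"] >= dataset_gb
            ]
            if not candidates:
                raise RuntimeError("no device has room; add capacity")
            _, name = min(candidates)
            self.devices[name]["used"] += dataset_gb
            return name

    pool = StoragePool()
    pool.register("hq_san", 10_000)
    pool.register("backup_site_nas", 5_000)
    print(pool.place(800))  # lands on a device with the lowest utilization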

Guilbert of software maker NSI says he sees agencies storing their data at multiple remote locations for no apparent reason. He believes that an overall architecture needs to be developed to help the government make the most of its storage infrastructure.

Turner says the government's needs and challenges aren't that different from those of the private sector. But the government's task is much bigger because agencies must hold onto information longer than most private companies, he says. And in the post-Sept. 11 world, guaranteeing access to that information is more important than ever. That makes storage, once the domain of basement-dwelling technologists and networkphiles, suddenly sexy.
