Client-Server Perks and Pitfalls

Monitoring the accuracy of radar and other navigational aids used to guide aircraft through U.S. airspace is a data-intensive task for the Federal Aviation Administration's Aviation System Standards group. Computer labs on board a fleet of 30 FAA planes compare transmitter information, such as latitude and longitude readings, against databases to verify whether deployed aids are reading within tolerance levels.
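
A check of this kind boils down to comparing each recorded reading against the published reference value for the aid and flagging anything outside tolerance. The Python sketch below is a minimal illustration; the field names, the 0.001-degree tolerance and the sample readings are invented for the example, not FAA values.

    # Minimal sketch of a within-tolerance check for a navigational aid.
    # Field names, the 0.001-degree tolerance and the sample data are
    # illustrative assumptions, not FAA values.

    def within_tolerance(recorded, reference, tolerance=0.001):
        """Return True if a recorded reading is within tolerance of the reference."""
        return abs(recorded - reference) <= tolerance

    # Reference position for a transmitter, as a database might store it.
    reference_aid = {"id": "VOR-042", "lat": 35.3931, "lon": -97.6007}

    # A reading captured by the on-board computer lab during a flight check.
    flight_reading = {"id": "VOR-042", "lat": 35.3935, "lon": -97.6002}

    ok = (within_tolerance(flight_reading["lat"], reference_aid["lat"]) and
          within_tolerance(flight_reading["lon"], reference_aid["lon"]))
    print(f"{reference_aid['id']}: {'within tolerance' if ok else 'OUT OF TOLERANCE'}")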

Computer tapes of data obtained in flight are transported from the planes to FAA field offices and then back to Aviation System Standards headquarters in Oklahoma City. Database information is manually entered onto the office's mainframe via a batch-processing system, meaning accumulated data from several flights is held and processed together instead of being input individually. The workload is so heavy that most of it has to be handled by an outsourcer.
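
In batch processing, readings are simply held until a run is scheduled and the accumulated set is processed in one pass, rather than each reading being handled as it arrives. A minimal Python sketch of the idea, with an invented record format, might look like this:

    # Batch-style processing: hold readings from several flights, then
    # process the accumulated set in one run. The record format is invented.

    flights = {
        "flight_1": [("VOR-042", 0.0004), ("ILS-007", 0.0031)],
        "flight_2": [("NDB-311", 0.0009)],
    }

    batch = []
    for flight, readings in flights.items():
        batch.extend(readings)        # accumulate; nothing is processed yet

    # The batch run happens later, all at once.
    for aid_id, deviation in batch:
        print(f"{aid_id}: deviation {deviation}")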

"Since processing is not done in real time, it's impossible to get precise updates on which transmitters are giving false readings," says Travis Ray, leader of the operational systems development team at the Aviation System Standards office. "Our people are always working from a historic view rather than a current one."

That will soon change, however, when the office completes its migration from a mainframe system to a distributed computing environment in which powerful file-server computers will send data and applications to PCs and workstations known as clients. The shift to client-server computing, to be completed by October 1997, will enable the Aviation System Standards group's 122 field offices to download database information to mobile computer labs while planes are parked in hangars.
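
In the client-server model the office is moving to, a client sends a request over the network and a server answers with data or an application. The Python sketch below, which uses only the standard socket library, is a generic illustration of that request/response exchange under invented names and data; it is not the FAA system.

    # Minimal client-server exchange over TCP using the standard library.
    # A generic illustration of the request/response pattern, not the FAA system.
    import socket
    import threading
    import time

    def serve_once(host="127.0.0.1", port=5050):
        """A toy server: answer a single client request with a canned record."""
        with socket.socket() as srv:
            srv.bind((host, port))
            srv.listen(1)
            conn, _ = srv.accept()
            with conn:
                request = conn.recv(1024).decode()    # e.g. "GET VOR-042"
                conn.sendall(f"{request.split()[-1]},35.3931,-97.6007".encode())

    threading.Thread(target=serve_once, daemon=True).start()
    time.sleep(0.2)                                   # let the toy server start listening

    with socket.socket() as client:                   # the "client" side
        client.connect(("127.0.0.1", 5050))
        client.sendall(b"GET VOR-042")
        print(client.recv(1024).decode())             # data served back to the client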

"It will no longer be necessary to download the databases to tape and transport them out to the field to fly missions," says Ray. "Further, the new system will provide real-time processing capabilities that will provide a clear picture of all data."

FAA's Aviation Standards Information System will be composed of 11 Sun Microsystems workstations running the Solaris operating system and using an Oracle relational database management program. A wide-area network will make it easy to share data among the field offices, while rapid application development (RAD) tools with graphical user interfaces will speed the writing of customized applications.
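
At the relational layer, such a system comes down to SQL queries against tables of flight-check readings. The sketch below uses Python's built-in sqlite3 module as a stand-in for the Oracle database; the table, columns and tolerance figure are invented for illustration.

    # SQL query against a relational table of flight-check readings.
    # sqlite3 stands in for the Oracle RDBMS; the schema is invented.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE readings
                  (aid_id TEXT, lat REAL, lon REAL, deviation REAL)""")
    db.executemany("INSERT INTO readings VALUES (?, ?, ?, ?)", [
        ("VOR-042", 35.3935, -97.6002, 0.0004),
        ("ILS-007", 36.1120, -95.8890, 0.0031),
    ])

    # Which aids are reading outside a hypothetical 0.001-degree tolerance?
    for row in db.execute(
            "SELECT aid_id, deviation FROM readings WHERE deviation > 0.001"):
        print(row)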

"RAD software will enable us to quickly alter computer logic and respond to changes such as new regulations being issued," says Ray. "But by far the biggest advantage is that we will be able to maintain the file servers ourselves and will not have to rely on an outsourcer."

FAA is about to join a host of federal agencies that are singing the praises of client-server technology. Spurred by the National Performance Review's recommendations on empowerment, government organizations are migrating to distributed computing in order to save money, boost productivity and give lower-level workers access to a wide assortment of data and applications. The federal client-server market has grown 7 percent over the last year, to 650,000 computers worth $2.3 billion, according to research firm International Data Corp. in Framingham, Mass.

One of the main appeals of distributed computing is the ability to move decision-making down to lower levels within enterprises. Instead of being dependent on centralized mainframes, which are difficult to learn and use, many government workers now rely on networked PCs and workstations. These "clients" provide easy access to data, printers and modems without all the headaches associated with mainframe computing.

Clients depend on messages sent to them by servers, which are powerful machines that hold applications and manage all computing functions. Servers range from souped-up PCs (usually Pentium or Pentium Pro machines) that store files and move e-mail, to workstations and minicomputers that handle databases and dedicated applications such as records indexing, inventory management and Internet Web pages.

High-end servers such as symmetric multiprocessing systems are used to host groupware, software that enables many users to work on the same computer applications at the same time. Other types of high-end servers, called massively parallel processors, are used to host data warehouses or to do scientific visualization and engineering tasks.

Server prices range from about $2,500 to more than $1 million, depending on the level of sophistication and the configuration of the network. Cost is linked to the amount of hard-drive space and the number of peripherals added to the system. Leading providers include Compaq, Data General, Dell, Digital, Hewlett-Packard, IBM, Sun Microsystems and Zenith Data Systems.

One of the biggest trends in client-server computing is the move to machines incorporating powerful 64-bit chip architectures. Some agencies are using these workstations to consolidate smaller servers purchased several years ago.

Unix continues to battle Microsoft's Windows NT operating system in the server market. Unix is an open operating system that enables dissimilar computers to exchange information and run each other's software. But it can be difficult to configure and use, leading some federal organizations to pick the more user-friendly Windows NT instead. NT supports machines running Intel or RISC (reduced instruction set computing) chips, such as the PowerPC from Apple/IBM/Motorola, Alpha from Digital and MIPS from Silicon Graphics, but is plagued by a dearth of software applications.

Despite all its advantages, client-server computing is not without its problems. The technology can be significantly more expensive than mainframes by the time agencies buy all the machines and pull them together into networks. Ownership costs for client-server systems run three to six times higher than for comparable mainframe systems, according to the Gartner Group, a market researcher in Stamford, Conn.

In addition, distributed architectures are much more difficult to manage than centralized mainframes. Some of the early client-server systems were built willy-nilly, without attention being paid to standards or adequate communications links between departments. This has resulted in problems with reliability and security.

"We want all the advantages of client-server without having to give up central manageability and the bulletproof dependability of our mainframe," says FAA's Ray. "Since our systems are mission critical, corruption of our applications or databases would be catastrophic."

FAA plans to solve the problem by installing BMC Software's application management program on all its Oracle databases. Other agencies are relying on middleware, customized software that patches together disparate systems.
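
Middleware of this sort is glue code that translates between formats or protocols two systems do not share. A hypothetical Python sketch of such an adapter, with both record formats invented for illustration, might look like this:

    # Sketch of a middleware adapter: translate records from a legacy
    # mainframe export format into the shape a newer system expects.
    # Both formats are invented for illustration.

    def legacy_to_new(legacy_record: str) -> dict:
        """Convert a fixed-width mainframe record into a keyed record."""
        return {
            "aid_id": legacy_record[0:8].strip(),
            "lat": float(legacy_record[8:17]),
            "lon": float(legacy_record[17:27]),
        }

    mainframe_export = ["VOR-042  35.39350 -97.60070"]
    for line in mainframe_export:
        print(legacy_to_new(line))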

Still other agencies are moving to what some believe is the next level of client-server computing: intranets. These internal networks use Internet technology to link workers and provide access via Web browsers to databases and various applications. This "network-centric computing" enables users to work on the same applications at the same time. It also makes it easy to integrate agency data with external information.
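
An intranet front end of the kind described here is essentially an HTTP service that returns data to any browser on the internal network. The sketch below uses Python's standard http.server module with an invented payload; it is a generic illustration rather than any agency's actual intranet.

    # Minimal intranet-style HTTP endpoint using only the standard library.
    # Any browser on the internal network could fetch this; the data is invented.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    READINGS = [{"aid_id": "VOR-042", "status": "within tolerance"}]

    class ReadingsHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = json.dumps(READINGS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # Serves http://localhost:8080/ until interrupted.
        HTTPServer(("", 8080), ReadingsHandler).serve_forever()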

Intranets are much cheaper than conventional client-server networks because Internet technology does not depend on standardized computers or operating systems. In addition, many application-development tools can be obtained free over the Internet. But as the CIA recently discovered when its Web page was rearranged by Swedish hackers who changed the title to read "Central Stupidity Agency," Internet technology is still vulnerable to a variety of security threats.

"Intranets are susceptible to 'record locking' in which two people update the same file at the same time," says Ray. "Whoever puts the file back first gets overwritten by the second person-a chance we just can't take when flight safety is at stake. For that reason alone, we'll stick to traditional client-server technology."

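The hazard Ray describes, where the second person to save a record silently wipes out the first person's change, is commonly called a lost update, and record-level locking is the usual guard against it. The Python sketch below illustrates the locking discipline with an in-process lock standing in for a database's record lock; the record and status values are invented.

    # Sketch of the locking discipline that guards against lost updates.
    # threading.Lock stands in for a database's record-level lock; data is invented.
    import threading

    record = {"aid_id": "VOR-042", "status": "in service"}
    record_lock = threading.Lock()

    def update_status(new_status):
        # Without the lock, two concurrent writers could each read the old
        # record and then write back in turn, the later write erasing the earlier.
        with record_lock:
            current = dict(record)            # read
            current["status"] = new_status    # modify
            record.update(current)            # write back while holding the lock

    threads = [threading.Thread(target=update_status, args=(status,))
               for status in ("out of tolerance", "flagged for flight check")]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(record)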