Y2K Work Changed Course of IT

jdean@govexec.com

At the moment the millennium turned, the USS Topeka, a Navy submarine, was submerged, straddling the equator. Its crew members trained their eyes on the ship's information systems, searching for trouble, any evidence of computers failing to understand the millennial rollover.

That night, the Navy's regional commander in Guam was also watching, searching for blips in service, as was another Navy submarine, the USS Bremerton, in port in Singapore.

But all three reported that all systems were fully operational. "They said everything was fine," says Dave Wennergren, the Navy's deputy chief information officer for e-business and security. During the buildup to Y2K he served as the Navy's deputy CIO for Y2K and information assurance. "It was a really boring New Year's Day. But that meant all our preparations had paid off."

The preparations for Y2K were fervent and grabbed many headlines, making the actual rollover anticlimactic when problems failed to arise. But government agencies learned lasting lessons from the effort to guarantee their information systems would recognize the new millennium.

The process of finding every computer, testing it for Y2K compatibility, fixing problem systems, creating contingency plans for unforeseen failures and discovering the complex interconnections and dependencies of information systems presented a unique challenge to every agency. Plus, the sheer management challenge this process presented to IT staffs, project managers and senior leaders changed the way agencies view information technology's role in their organizations.

The Wake-Up Call

"Y2K was a wake-up call," says David McClure, an associate director for information technology management issues at the General Accounting Office. "Y2K showed us how integrated information technology is into everything the government does."

"Think of commonly delivered government services like weather forecasting or air traffic control," McClure says. "Technology is sitting right in the front of those operations. Or think about monitoring traffic or customs inspectors checking items coming into ports: information systems guide much of this activity."

In fact, scholars, government executives, Congress and even GAO have been quick to recognize the silver linings of Y2K. In GAO's final report on Y2K, "Comprehensive Strategy Can Draw on Year 2000 Experiences" (AIMD-00-1), the congressional investigators outlined a key set of lessons they hope government agencies will take from Y2K.

GAO underscores the importance of "high level congressional and executive branch leadership" and providing standard guidance. GAO also acknowledges the benefits of the government's increased ability to identify technical problems and coordinate the appropriate solutions. Agencies also gained experience in performance monitoring and facilitating the progress of important programs.

"IT is not an adjunct issue in an organization's operations," says John Koskinen, former chair of the President's Council on Y2K Conversion. "IT is a central part of how an organization operates. That was the basis for the passage of the Clinger-Cohen Act. That act set out the need to make sure that IT decisions are made by an organization's senior management, not just the systems administrators."

GAO agrees with Koskinen. The same report cites the importance of "implementing fundamental information technology management improvements" and "understanding risks to computer supported operations."

IT Management Basics

"Y2K reiterated the absolute necessity for good IT management, investment control, inventories and architecture," McClure says. "System remediation was hard for those with poor architectures, poor software management and control practices, as well as poor asset management."

Agencies with regulatory responsibility faced this challenge twice over. The Securities and Exchange Commission, the government body that regulates the securities industry, for instance, was responsible not only for ensuring its own operations were in order but also for guaranteeing that investment houses were ready for the year 2000.

"The Y2K problem got to chairman level very quickly," says Frank Dreano, director of application development in the SEC's Office of Information Technology. During the Y2K crisis, Dreano served as the SEC's Y2K program director. "Our chairman had monthly status briefings on the progress of Y2K; he was intensely concerned."

The reason for his attention was that "the administration and the public were concerned about readiness of the securities industry," Dreano says.

As such, the SEC coordinated a massive survey of securities companies. "Several thousand forms and surveys went out that asked broker-dealers their level of compliance and how much money they were spending to fix their Y2K problems," Dreano says.

But the SEC also had internal issues to combat. "When I arrived here the SEC was a typical government shop with mainframe and legacy applications that were written in COBOL. There were some applications written in new environments. Still, we had to remediate all of it," Dreano says.
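The flaw remediation teams hunted for in that legacy COBOL is easy to state: code stored years as two digits and assumed the "19" prefix, so arithmetic broke once "00" followed "99." A minimal sketch of the bug and the common "windowing" fix appears below, written in Python rather than COBOL for brevity; the function names and the pivot value are illustrative assumptions, not the SEC's actual code.

```python
def full_year(yy, pivot=50):
    # "Windowing" fix: two-digit years at or above the pivot are read
    # as 19xx, those below it as 20xx. The pivot is application-specific.
    return 1900 + yy if yy >= pivot else 2000 + yy

def buggy_age(birth_yy, current_yy):
    # The classic Y2K bug: subtracting raw two-digit years goes
    # negative once the century rolls over (e.g., year "00" minus "65").
    return current_yy - birth_yy

def fixed_age(birth_yy, current_yy):
    # Remediated version: expand both years before subtracting.
    return full_year(current_yy) - full_year(birth_yy)

print(buggy_age(65, 0))   # a 1965 birth year computed in 2000: -65
print(fixed_age(65, 0))   # after windowing: 35
```

Remediation meant finding every such calculation, choosing between expanding stored dates to four digits or windowing them at read time, and then retesting, which is why complete inventories mattered so much.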

The SEC began by inventorying all its assets. "We had to stop and take a snapshot of our infrastructure: our local area network's plumbing, our applications, our total environment," Dreano says.

"We found we had a tremendous amount of work to do," Dreano says. "The key to our success was independent verification and validation of compliance and remediation. We retained an outside firm to do that work." This work included the SEC's Electronic Data Gathering, Analysis and Retrieval Database (EDGAR), its "No. 1 most mission critical system," according to Dreano.

Compliance testing of EDGAR was completed by April 1999. "The EDGAR system is a fairly modern system," Dreano says, "so it had no compliance problems, and that was fortunate."

The rest of the SEC was completely tested and Y2K compliant by August 1999, the deadline set for the securities industry.

And like many agencies, the SEC took advantage of the focus on legacy systems to re-evaluate its IT needs. "We told the CIO that some things on our mainframes were not Y2K compliant and that there was enough remediation work to do that a move to a more modern environment would be beneficial to the SEC," Dreano says. "Our CIO agreed and we put a whole application architecture together. So instead of remediating code we migrated [our applications] to a state-of-the-art environment."

Dreano says the SEC is now running Sybase relational databases on Unix-based servers. Its desktop workstations, which run Microsoft Corp.'s Windows NT operating system, now access modern, Web-based applications on the Unix servers. "We created the architecture for five major applications in the span of about 18 months. It was a minor miracle."

During this time, the SEC was also conducting its survey of the securities industry. If brokers were not compliant when the August deadline came, the SEC monitored the situation. "The brokers had an obligation to file their status with us," Dreano says. The SEC had to determine if further action needed to be taken for those brokers on the slow road to compliance.

"It's easy now to say Y2K wasn't a big problem, but there was a real concern because public perception of the securities industry was that they weren't compliant," Dreano says. "But the truth was that most people were taking appropriate actions."

Tying IT to the Mission

"Y2K clearly underscored the fact that IT is a critical part of the SEC and the regulation of the securities industry," Dreano says. It turns out that the SEC isn't so different from the Navy. Both entities came away from Y2K with a renewed sense of how each information system is tied to a mission.

The project managers at the Transportation Department also learned from Y2K. "I think Y2K raised the level of awareness within the ranks of our project managers," says Transportation CIO George Molaski. "It proved how integral IT is to the process and also how they need to get education on how to use IT more effectively."

"Y2K proved that IT issues can't be decided in a vacuum any longer," McClure says. "IT must be looked at in terms of the mission, what needs to be accomplished."

The Navy began applying this lesson before the rollover even occurred. That is not to say the Navy had no obstacles to overcome. With 800,000 personnel spread across every time zone, those working on the Navy's Y2K problems faced a long deployment before the night the USS Topeka reported all systems normal.

The Navy came to a "clear turning point when we stopped viewing IT as a stand-alone entity," says the Navy's Wennergren. "IT permeates every aspect of our enterprise." The Navy identified 2,000 information systems, including mission critical support systems that had to be tested for compliance. As a result, the Navy landed on a strategy of centralized policy-making and decentralized execution.

"The plan allowed system owners and base commanders to go out and use the [omnibus] plan," says Wennergren. Under the Navy's test structure, entire carrier battle groups were sent out to sea, where the clocks were then rolled forward. Sometimes, Marines conducted operations ashore to test the ground mission.

The Navy fixed the Y2K issues it discovered from deployment to deployment. But remediation issues paled beside the data the Navy gathered about its systems' interconnectivities. "For every Y2K problem we found, there were four or five other interoperability problems we wanted to fix," Wennergren says. Now, the Navy knows where every computer is and where its interdependencies lie.

"To be successful with Y2K you had to rationalize the infrastructure," Wennergren says. "You have to take advantage when these core knowledge capacities are developed. We have a much more knowledgeable workforce about the current IT infrastructure, and we also understand what it lacks."

This notion of taking advantage of the Navy's inventories and core knowledge generated as a result of Y2K is at the core of the Navy's proposed Navy Marine Corps Intranet program. "With N/MCI we will be taking advantage of global interconnectivity," Wennergren says. "Parts of this include a corporate intranet enabling e-business and knowledge management." The Navy wants to take advantage of this window of opportunity created by the exhaustive inventories of systems and interconnectivities.

Out of the Back Room

Y2K's legacy is more than inventories and contingency plans. Some agencies are now benefiting from altered management structures as well.

Kathy Adams, former assistant deputy commissioner for systems at the Social Security Administration, is now an executive with SRA International Inc. as well as an acknowledged leader on the Y2K issue. She breaks the effects of Y2K on management into three parts. First, she says, those at the CEO level, departmental deputy secretaries and the like, now realize that IT is critical to the provision of government services.

"I would say Y2K definitely increased the visibility and importance of the CIO within their organization," Molaski says. "Y2K also demonstrated that the CIO can be an integral change agent within the government. A CIO can take on large management tasks and work in a collaborative manner to accomplish an end result effectively."

Commerce Department CIO Roger Baker is not sure whether Y2K increased his stature at his department-but, he says, it has happened anyway. "Now I certainly get around to each of the undersecretaries and flex my IT muscles," Baker says.

Second, now there is the realization that the CIO really needs to have a seat at the executive table, Adams says. "Prior to Y2K, the IT shop was really not reporting directly to the head of the organization. This is not really a choice anymore for executives," Adams says.

Congressman Stephen Horn, R-Calif., chairman of the House subcommittee that gave federal agencies report cards throughout the Y2K crisis, agrees with Adams. "It's a question of whether or not the CIO can get the ear of the secretary or the deputy secretary when things go awry. We need to get the CIO up onto the level of the key team that runs the agency or department."

And in some cases this is happening. "The deputy secretary and the chief financial officer and I interact all the time," Baker says. "Y2K did give CIOs the opportunity to show their stuff."

And finally, "the CIOs learned that there is a need to improve management practices within their own IT department," Adams says. "IT workers are sometimes loath to accept discipline. We need to move systems development and maintenance into being much more of an engineering science rather than a creative art form."

But while management within agencies changed, so too did cross-agency communications. "The first committee of the CIO Council was the Y2K committee," Adams says. "We realized we needed each other in order to figure out something that had not been done before. It was a team. If somebody figured out how to do things, we shared it with everyone else-especially when somebody figured out a best practice."

But Y2K eventually required central leadership. That came in the form of John Koskinen. Adams, Wennergren and even Horn credit Koskinen with helping focus federal, state, local and even international governments on the Y2K problem.

"I'm not sure we would have made it without Koskinen," McClure says. "He has a unique set of skills. He worked enormous hours and was tireless. He was a focal point. He set direction and laid out priorities. He had the ear of national leadership and kept focus and attention on the problem."

In fact, Koskinen was so successful at helping agencies cross cultural boundaries and focus on a common problem that the job description and powers of the Y2K czar are being considered for other issues. With evident support in Congress for a federal CIO, or even an e-government czar, the effectiveness of Koskinen's position is without question.

"The Y2K czar position has turned into a real lesson learned on how to get a major program done governmentwide," Baker says.

Going Forward: Security

The idea that is getting the most support from Congress and even some CIOs is the creation of a security czar to help the federal government focus on its information assurance problems in the Internet Age.

"Y2K was an information security problem," Wennergren says. And many say that the lessons learned from Y2K directly apply to computer security.

"Y2K made us get a grip on how vulnerable government information systems are to disruptions," McClure says. "Y2K focused the debate over what would happen if systems were tampered with or brought down or even destroyed. Cybersecurity came to the forefront as a result of the dialogue about Y2K."

One advantage Y2K had that cybersecurity lacks is a deadline. Agencies knew that come Jan. 1, 2000, systems had to be remediated or replaced and contingency plans had to be ready.

"There is no countdown to a day when information assurance is no longer a problem," Wennergren says. This is why so many support the idea of a security czar. Many feel that cybersecurity poses a great threat to national security and that the focus brought by a Koskinen-type security czar would inject a greater sense of urgency into the effort to close the federal government's numerous security holes.

And while Horn doesn't like the word czar, the focus of a Koskinen-type position seems necessary to him. "We aren't after a czar, but rather somebody to coordinate already existing operations."

Y2K also holds other similarities to security. With both, inventories and asset management are vital. It is impossible to secure what you don't know you have. "But it's not just cataloging," Wennergren says. "When agencies understand system interconnectivity and interdependency it pays great dividends in terms of critical infrastructure protection and information assurance efforts."

Wennergren also points out that agencies must look for system vulnerabilities just as they did for Y2K and bring software into compliance, including taking older versions of software and standardizing them across the enterprise. Then come contingency planning and constant monitoring. This is all starting to sound familiar.