Information Is Power

nferris@govexec.com

Joseph Thompson sits down at the paper-strewn desk in his blue-carpeted, standard-issue federal executive office and grabs the computer mouse. Clicking an icon on his computer screen, the top official at the Veterans Benefits Administration nods toward it and, clicking away, explains, "The balanced scorecard is here. And this is all of our work management information. There are thousands and thousands of pieces of data there."

Clicking again, he continues: "This is activity-based costing, and, here, our Monday Morning Workload is actually our weekly snapshot of business in all our regional offices."

Thompson, undersecretary of Veterans Affairs, is showing off his agency's brand-new Operations Center. It's a virtual center, a set of often-updated computer reports that keeps VBA managers abreast of the work being completed, or not being completed, in the agency's five lines of business, as well as updating VBA's financial situation.

VBA and many other agencies are beginning to enjoy the benefits of assembling disparate data, often from decades-old mainframe systems, in virtual warehouses and making it accessible with a familiar World Wide Web browser. Warehousing involves not only displaying the information in a connected fashion but also finding or creating common denominators so that apples are not compared directly with oranges.
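The article doesn't show VBA's actual systems, but the "common denominator" step it describes can be sketched in a few lines of Python. The office names, field names and units below are invented for illustration: two hypothetical legacy systems report the same measure in different units, so each record is converted into a shared schema before it lands in the warehouse.

```python
# Hypothetical records from two legacy systems that report average
# claim-processing time in different units (hours vs. days).
legacy_a = [{"office": "New York", "avg_hours": 1944}]
legacy_b = [{"office": "Chicago", "avg_days": 74}]

def to_common(record):
    """Normalize either record shape into a shared schema measured in days."""
    if "avg_hours" in record:
        return {"office": record["office"], "avg_days": record["avg_hours"] / 24}
    return {"office": record["office"], "avg_days": record["avg_days"]}

# The warehouse holds only the normalized form, so offices can be
# compared directly -- apples to apples.
warehouse = [to_common(r) for r in legacy_a + legacy_b]
```

Once every record shares the same units and schema, cross-office comparisons like the ones Thompson describes become a simple query.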

In VBA's case, the PC screens give Thompson and his lieutenants fast and easy-to-digest status reports, mostly in the form of spreadsheets. "I can graph all this on the fly," he says. "I can say, I want New York, Chicago, L.A., I want all the big cities, and I want to look at their average days to complete original compensation claims. And I can graph all that out in minutes. That would have taken hours in the past."

Once the information is displayed on the computer screen, areas of deviation from the norm are easy to spot and investigate. An unexpected uptick in performance could signal a process improvement in one office that could be replicated elsewhere. A downturn may indicate a problem that needs to be resolved.
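Spotting "areas of deviation from the norm" is, mechanically, an outlier check. The sketch below uses Python's standard `statistics` module and invented days-to-complete figures; the 1.5-standard-deviation threshold is an arbitrary choice for illustration, not anything the agency specifies.

```python
import statistics

# Hypothetical average days-to-complete figures by regional office.
days_by_office = {"New York": 74, "Chicago": 71, "L.A.": 69,
                  "Denver": 118, "Boston": 72}

mean = statistics.mean(days_by_office.values())
stdev = statistics.stdev(days_by_office.values())

# Flag any office more than 1.5 standard deviations from the norm --
# a candidate for investigation, whether the news is good or bad.
outliers = {office: days for office, days in days_by_office.items()
            if abs(days - mean) > 1.5 * stdev}
```

Here only the hypothetical Denver office would be flagged; a manager would then ask whether that uptick reflects a local problem or a reporting quirk.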

Thompson says VBA got the idea for the online data center after visiting the New York City Police Department's operations center. There, in what's been described as a model set-up, NYPD managers review operations and performance data with each squad or precinct.

The VBA staff decided that having an office or meeting space for this purpose wouldn't be necessary if the information could be made available on their internal computer network. So for about half of 1999 the agency's data management office pulled existing information into a data warehouse, where it could be integrated.

Like other large agencies, VBA has hundreds of information systems. Dissimilarities among the data sets and the structural gaps between systems make it hard to see the big picture or compare data in more than two dimensions. "Without designing new IT systems, we have enormous amounts of data in there already that we don't know about," Thompson says.

VBA's data warehouse arrays information locally and rolls it up to the regional and then the national level. Local offices can compare their performance with their peers. Information still is being refined and added to the system.
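The local-to-regional-to-national roll-up is a straightforward aggregation. This sketch uses invented regions, offices and claims counts to show the pattern:

```python
# Hypothetical claims-completed counts, keyed by (region, office).
local = {
    ("East", "New York"): 1200,
    ("East", "Boston"): 800,
    ("West", "L.A."): 1500,
    ("West", "Seattle"): 700,
}

# Roll local figures up to the regional level...
regional = {}
for (region, _office), count in local.items():
    regional[region] = regional.get(region, 0) + count

# ...and regional figures up to the national level.
national = sum(regional.values())
```

Because each level is derived from the one below it, a local office can compare itself against its regional peers using exactly the same underlying numbers headquarters sees.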

At the national level, Thompson says, the data warehouse is giving executives new insights into some of their thorniest performance issues. For example, it has shown them why their work backlogs have been intractable, even though the number of American military veterans is dwindling.

"The folks looking at our budget have always said, 'Geez, your workload is going down, down, down.' And we're saying, 'It doesn't seem like it's going down. Every time I turn around I've got a mountain of claims files sitting there. It doesn't look like it's getting smaller,' " Thompson says.

With the data warehouse, VBA officials discovered that although there are fewer veterans applying for disability benefits, they are claiming more disabilities, making their cases more complex. Disabled World War II veterans averaged 1.7 disabilities apiece. For Gulf War veterans, the average is 3.2 disabilities.

In addition to handling more complex cases, VBA claims examiners are under pressure to document each step of their work more completely and carefully than in the past, so it can withstand judicial review. The result: An initial claim for disability benefits takes 25 percent longer to process than the same claim did in 1997. "That's a big change," Thompson says, "and if you can't explain that well, it's hard to justify your budget" and persuade anyone you need more employees.

Self-Managing Employees

"The best part is that it's a very democratic system," Thompson says of the online data center. "It's there for everybody to see, and knowledge is power. And in our vision of the future, employees will be largely self-managing. The amount that supervisors need to intervene with you on operational issues should be minimal.

"Our expectation is you'll be able to manage your own operation, and in order to do that, you need to know what everybody else is doing too, and have some idea of whether you're in the ballpark on all of this," he adds.

Thompson is not known as a cheerleader for technology. In fact, he's inclined to downplay its value. But as the top executive of an agency that will dispense more than $22 billion in benefits this year, he's beginning to see that IT can be a more useful management tool than it has been in the past. "Building the ability to explain yourself has been key to us," he says.

Once agencies like VBA put data in warehouses, users can analyze it in ways that weren't possible before, shedding new light on relationships among causes, effects, resources and results. The low cost and flexibility of these systems make them easy to develop and modify. Agencies can start small and gradually build breadth into their systems, or they can start with a large bundle of information and whittle it down to what's most useful.

As Thompson notes, most agencies are not short of data. When VBA's parent department, Veterans Affairs, began looking at ways to monitor performance in accordance with the 1993 Government Performance and Results Act, officials found "we had thousands and thousands of performance indicators throughout the department," says Thom Rockford of the department's Performance Analysis Service. The staff selected about 120 of the most useful measures and pulled them together into a single VA Performance Measurement System.

With continued whittling, the VA now has settled on about two dozen performance indicators that are updated quarterly for use in progress reviews. Rockford says development of the system is continuing. One current emphasis is on obtaining more outcome data.

Meanwhile, an executive information system tracks major indicators of progress toward VA's goals. Despite the system's name, all department employees nationwide have access to it and the other performance data on the VA intranet. This represents a remarkable culture shift. As Don Larsen, another Performance Analysis Service employee, says, "In the Department of Veterans Affairs, for many years it was frowned upon, to put it mildly, to share data" between offices and bureaus.

Larsen built the Web-based executive information system himself, without contractor help. It draws information from the existing mainframe systems, using ordinary software tools from Microsoft Corp. He estimates the cost to the department at less than $10,000, plus his time and, during the development phase, that of another VA employee who no longer works there.

Data Mining

The Coast Guard's executive information system, too, was designed to make sense out of a morass of data. Lt. Cmdr. Terance Keenan of the headquarters staff calls information "the fuel of performance management" and says data mining (the digging, slicing, dicing and analysis that turns up relationships and patterns in the data) is helping Coast Guard managers see what resources are available to them and better understand the organization's needs.
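The "slicing" and "dicing" Keenan mentions are standard operations on a multidimensional data set. The sketch below, with invented districts, asset types and seizure weights, shows both: dicing aggregates along every combination of dimensions, while slicing holds one dimension fixed.

```python
from collections import defaultdict

# Hypothetical contraband-seizure records with two dimensions:
# Coast Guard district and the asset type that made the seizure.
records = [
    {"district": 7, "asset": "aircraft", "kg": 500},
    {"district": 7, "asset": "boat", "kg": 300},
    {"district": 11, "asset": "aircraft", "kg": 200},
    {"district": 11, "asset": "boat", "kg": 400},
]

# "Dicing": aggregate along both dimensions at once.
cube = defaultdict(int)
for r in records:
    cube[(r["district"], r["asset"])] += r["kg"]

# "Slicing": hold one dimension fixed -- all districts, aircraft only.
aircraft_total = sum(kg for (_district, asset), kg in cube.items()
                     if asset == "aircraft")
```

Commercial tools automate these operations over far larger data sets, but the underlying idea is the same: pre-aggregate along each dimension so questions like "how much did aircraft account for?" can be answered instantly.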

The executive information system was difficult to set up, but the hard work paid off when Coast Guard planners could easily obtain the data they needed in developing their fiscal 2000 performance plan, Keenan says. For example, they could look at relationships between Coast Guard initiatives and seizures of contraband cocaine. They can make tradeoffs between the costs of alternative strategies (stopping drug smugglers using more aircraft or more small boats, for example) and the likely effects.

The Coast Guard has numerous separate databases. Although their sources are similar and the contents overlap one another, Keenan says, "they end up being different" data sets, and often only one employee understands each database well enough to extract useful information from it.

To complicate matters further, the Coast Guard's field offices are organized geographically, while budgets are drawn up by programs that operate nationwide. This is common in other agencies too, including the Environmental Protection Agency. There, says Terry Ouverson, associate director of EPA's annual planning and budget division, "our organizational structure and our appropriations structure have nothing in common."

Ouverson says that in his agency, every office had its own information systems, and only an overlay system like the EPA's new Budget Automation System could pull together fundamental management information about the agency's budget and its spending patterns.

Those in agencies such as VA and the Coast Guard, where performance information is being widely distributed among employees, strongly defend the practice as a contemporary management technique. They contrast it with the earlier era, when only a few privileged managers and analysts had such wide access to agency data. With better information, lower-ranking employees too can make better decisions, they say. "That's the only way we're going to get better," Keenan says.

However, other agencies are choosing different approaches. At EPA, for example, the number of Budget Automation System users is limited, and a structured security scheme allows each user to view only the data he or she needs. Some authorized users can merely see the data; others have permission to change it.

Hot Products

The Coast Guard system relies on a commercial software package, PowerPlay from Cognos Inc. NASA also uses Cognos performance reporting software. Among the many other developers that provide this kind of software to federal agencies are Oracle Corp., SPSS Inc., Information Builders Inc., SAS Institute Inc., MicroStrategy Inc. and IBM Corp.

The huge systems integrator Computer Sciences Corp. is creating the Coast Guard's new system. EPA's system was built by a 200-employee Silver Spring, Md., company, ISSI Consulting Group Inc. Hundreds of other technical and professional services companies say they can do similar work.

In fact, this is a hot category of software, under the trendy label of "business intelligence." Some of the packages can generate "executive dashboards" that show the most important performance indicators, or other data of high interest, graphically on a computer screen. Gauges resembling the speedometer on your car might show the current unit cost of your agency's core work or the backlog of work to be processed.

GPRA is the impetus for much of the interest in business intelligence within the executive branch, but these products and services are selling well in the private sector, too. In both the public and private sectors, the most difficult part of the implementation is finding reliable, accurate data that's a valid performance indicator, according to Jeff Babcock, who heads the public sector division of SAS Institute in Cary, N.C.

Babcock, who has worked with several agencies on performance reporting, says agencies commonly make the mistake of starting with their data and constructing measures based on the information available. Instead, he says, they should start with their goals and objectives, decide how to measure their performance and progress, and then find data that meets their measurement needs.

More of Babcock's advice: Start with a small, manageable project and build upon that foundation. Worry more about meeting user needs and achieving a good fit with your agency's operational and management environment than about which software technology you'll use. "Based on our experience," Babcock says, "the technology is almost irrelevant."

Y2-OK

It came as a relief to everyone in the federal IT world, as well as their bosses, when they passed their biggest test on Jan. 1. Most of the government's computer and communications systems continued working correctly when 1999 rolled over to the year 2000.

The conventional wisdom had held that some Y2K failures were inevitable because of the industry's sorry track record in major federal projects. IT professionals as a group have a history of:

  • Failing to define user requirements accurately, which results in systems that don't do what they were supposed to do. In one typical example, the General Accounting Office last year issued a scathing assessment of a grantee performance monitoring system at the Housing and Urban Development Department. The system is supposed to provide HUD with real-time performance data, GAO said, but it doesn't, because of major design flaws.
  • Automating outdated and inefficient processes. Early procurement management systems are an example of this problem. They often were designed to automate the production, distribution, review and filing of documents, but their designers would have achieved more if they had looked for ways of getting the job done with fewer documents, rather than generating more paper.
  • Poor project management, leading to cost overruns and delays. One famous case is the Health Care Financing Administration's program to develop a new Medicare Transaction System. The program was canceled in 1997 after HCFA spent $80 million and more than three years on it. GAO officials called it a "huge learning experience" for the agency. At VA, the inspector general reported in October 1999 that the agency had paid a contractor more than $1 million for a network performance management system, although the contractor never delivered an acceptable system.
  • Insufficient attention to quality and reliability. Even expensive and mission-critical systems can't be counted on to perform properly when first installed. The industry norm is to deliver software that hasn't been thoroughly tested, then issue "bug fixes" or "patches" after problems materialize.

Given this history, the betting was that the IT profession wouldn't meet the immovable Y2K deadline. Even if the work got done on time, it was likely to be riddled with faults that would cripple system operations, many thought. When those expectations proved false, IT professionals collectively experienced an unaccustomed surge of pride.

Measurement Miasma

Although they now have shown that they can get the job done, IT professionals still are at a loss when it comes to demonstrating return on IT investments or establishing the business value of implementing new technology. Some of this inability stems from trying to predict the outcome of complex process changes and infrastructure upgrades. Some of it comes from the intangible and time-dependent nature of information.

Another set of unknowns arises from what Paul Strassmann, former chief information officer at the Defense Department, calls "stealth spending" on computers. Many outlays and indirect costs associated with computing are not captured by conventional accounting systems. Some of these costs include the price of software, printer paper and other items purchased by users rather than the IT office; training expenses and lost productivity associated with introduction of new and upgraded systems; and telephone costs from individuals' modem use.

In his 1997 book, The Squandered Computer (Information Economics Press), Strassmann recommends that organizations minimize unaccounted-for and unauthorized IT spending, avoiding the extra costs that can accompany it. For example, a person who uses a word processing software package that's different from the agency standard may independently demand specialized training and support.

To keep their internal customers happy, most IT managers give users some leeway in adherence to agency standard configurations, as long as the users shoulder most of the extra cost. But there are continuing and unrecorded expenses down the road, once the standards wall is breached.

That's one of the reasons for the general agreement in recent years on the need for an IT architecture for each agency. The architecture is more than a set of standard hardware and software configurations, although they are a piece of it. The architecture should show how the IT systems relate to important business processes and map the flow of important data.

Without an architecture, IT is nearly unmanageable. It's difficult enough to manage even with an architecture in place.

Security and Performance

One increasingly important issue that an architecture can help resolve is information security. Security and privacy worries are the downside of the current push for maximum connectivity and information sharing. Every new agency Web site or electronic mail server is an invitation to malicious (and perhaps even truly sinister) hackers, virus writers, information peddlers, spies and saboteurs.

An information systems architecture makes it easier to identify points of vulnerability and put systematic protections in place. What's more, a thorough IT training program and many of the other elements of a high-performance IT program will contribute to effective information security. But, as Strassmann says, "the problem with security funding is that its payback is not apparent until it is too late."

That's also true of many other facets of information systems. To get a better handle on IT and comply with laws such as the 1996 Clinger-Cohen Act, agencies are adopting a portfolio approach to major IT investments. They are establishing IT investment review boards composed of IT, financial and program managers to select the most promising projects from among those requested.

Once funded, the systems development projects are being tracked with automated systems that are much more sophisticated than earlier models. For example, the Coast Guard in the past monitored systems projects for compliance with rules and with management controls. Now a new IT investment process at the agency will compare actual and promised performance, among other things.

Eleven agencies are using a government-owned software system called the IT Investment Portfolio System (ITIPS), developed by an Energy Department contractor and now available from the Treasury Department. The newest version of ITIPS tracks the linkages between an agency's goals and objectives and its IT initiatives, Treasury's Darren Ash says. The system long has reported on risks, implementation progress and cost and schedule variances, besides generating required budget reports. One advantage of such a tool, Ash says, is that it establishes a baseline based on initial plans.
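The baseline comparison Ash describes reduces to a simple calculation: record the planned cost and schedule when a project is approved, then measure actuals against that baseline. The figures below are invented for illustration; ITIPS itself is a full reporting system, not these few lines.

```python
# Hypothetical project baseline captured from the initial plan,
# and the actuals reported during implementation.
baseline = {"cost": 2_000_000, "months": 18}
actual = {"cost": 2_500_000, "months": 24}

# Cost and schedule variances against the baseline.
cost_variance = actual["cost"] - baseline["cost"]
schedule_variance = actual["months"] - baseline["months"]

# Overrun expressed as a percentage of the original plan.
cost_overrun_pct = 100 * cost_variance / baseline["cost"]
```

Because the baseline is frozen at approval time, a project team can't quietly redefine success mid-stream; the variance is always measured against what was originally promised.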

Over the next few years, use of ITIPS and similar systems is likely to give agencies better information about the performance of new IT initiatives. Meanwhile, Y2K helped agencies get a more complete picture of their installed IT assets, because each had to be checked for readiness.

Nonetheless, says GAO's David L. McClure, "metrics are being used largely for reporting" on IT, rather than managing its performance. Patrick Plunkett of the General Services Administration, whose specialty is IT metrics, agrees. He says "very little progress is being made on IT performance measures."

One often-mentioned reason for this lack of progress in measuring the value of IT is a lack of high-level management and oversight interest in the subject. But that could change this year if Congress, as is expected, holds hearings on agencies' implementation of the Clinger-Cohen Act and their efforts to measure their overall performance. "You're going to see a lot more congressional interest," Plunkett told a group of federal managers earlier this year.

For most agencies, IT is one of the biggest expenses after their payrolls. And the federal government spends more on IT than corporations spend, if a recent survey is correct. The Gartner Group, a respected consultancy in Stamford, Conn., reports that North American enterprises spend an average of less than $8,000 per employee per year on IT, while the federal government spends $15,233 per employee.

When agencies start looking carefully at costs and benefits, the results can be striking. For example, the VA's Insurance Service maintains 2.2 million folders of paper forms and other information on its customers. The agency, a unit of the Veterans Benefits Administration, spends $1.1 million a year to store the paper, retrieve files and return them to the proper shelves.

That money will be spent in the future on operation of a paperless system that will speed up the agency's response to callers on its toll-free telephone system. The agency hopes to increase employee productivity further by adding self-service options to its Web site, reducing the need for VA intervention in records updates, such as changes of address.

"Our insurance program is one of the best programs in government," with high customer satisfaction, speed and accuracy scores, says VBA's Thompson. It's clear he views technology as a means of maintaining or increasing that level of performance despite a declining workforce.
