Performance Anxiety

alaurent@govexec.com

If federal agencies were actors, they'd be standing in the wings dry-mouthed and trembling right now, gazing fearfully out at a tough congressional audience. Agencies should be feeling a little stage fright. For the first time in history, they uniformly have committed to achieving specific levels of real-world results for the money Congress appropriates. Release of President Clinton's fiscal 1999 budget request in February put agencies' Government Performance and Results Act performance plans in the hands of congressional appropriators. The day of Results Act reckoning is closing in. Agencies must report on their performance for the first time by March 31, 2000.

While performance promises have accompanied program funding requests in the past, goals set under the 1993 Government Performance and Results Act regime pack a whole new punch. They must reflect results that legislators and, more importantly, citizens can see. Results Act goals also must include measurements and be tied to specific budget requests. The act required agencies to submit five-year strategic plans by Sept. 30, 1997, and annual performance plans beginning with the fiscal 1999 budget. In addition, the administration had to include a governmentwide performance plan in the budget.

Among the agency performance goals in the fiscal 1999 governmentwide performance plan:

  • IRS district offices will cut the time to resolve taxpayers' problems from 36 days in 1997 to 35 days.
  • The Office of National Drug Control Policy will present societal performance measures for anti-drug programs.
  • The U.S. Marshals Service will apprehend 80 percent of violent criminals within one year of the issuance of warrants for their arrest.
  • Veterans Affairs Department hospitals will increase the number of patients treated by 9 percent.
  • The Environmental Protection Agency will complete 136 Superfund cleanups.
  • The Head Start program will serve 30,000 more children than in 1998.
  • NASA will complete 10 international space station assembly flights.
  • The Transportation Department will reduce transportation fatalities below the 1995 level of 43,549.

Any doubts about the seriousness of the Results Act were erased last year. Agencies squirmed under the spotlight shone by legislators who twice graded agency strategic plans and found them failing. Cabinet secretaries and top agency officials erupted when bad grades on draft plans hit the newspapers in August. Transportation Secretary Rodney Slater reportedly became so concerned that his staff turned Transportation's score around, from 28 of a possible 105 points on the August draft strategic plan to 75 on the final plan in November, winning legislators' "most improved" title. Now, with performance plans in appropriators' hands, it's not just pride at stake; it's money.

Cross-Cutting Controversy

With cash on the table, Republicans in Congress and the Democratic administration have been wrestling over the meaning of performance and results and over methods for achieving them. Republicans are eager to use the Results Act to cut back the size of government by eliminating what House Majority Leader Dick Armey, R-Texas, refers to as "duplication and program overlap." In August 1997, congressional auditors at the General Accounting Office found mission fragmentation, in which multiple agencies are given responsibility for the same national issues, and the resulting overlapping programs "a particular and pressing challenge to implementation of the Results Act" (AIMD-97-146).

In a Dec. 17 letter to Office of Management and Budget Director Franklin Raines, Armey and House and Senate Budget, Appropriations and government oversight committee leaders explicitly identified "coordination of cross-cutting program activities" as an area where strategic plans were deficient and performance plans are expected to excel. Additionally, the Republicans expected the governmentwide plan to "identify areas of overlap, based on OMB's review of agency plans; explain how to correct serious overlap problems and streamline agencies subject to mission creep and duplication; and ensure that similar programs have comparable outcome measures."

It's unlikely, however, that the Clinton administration's approach to cross-cutting issues will satisfy Armey and his colleagues. In its fiscal 1999 governmentwide performance plan, the administration addresses cross-government issues by highlighting 22 priority management issues, about half of which are interagency issues such as the year 2000 information technology problem. The plan focuses on customer service, partnerships and information technology across government, particularly in the 32 high-impact agencies identified by the National Performance Review as interacting the most with the public and businesses.

Finally, the president's governmentwide performance plan incorporates the goals of a variety of interagency working groups, such as the Chief Financial Officers Council, Chief Information Officers Council, National Partnership Council, President's Management Council, President's Council on Integrity and Efficiency, Joint Financial Management Improvement Program, Federal Credit Policy Working Group and others.

Despite congressional pressure on agencies to deal with cross-cutting issues and duplicative programs in their performance plans, OMB directed agencies to get their own plans in order before addressing interagency issues. "The agencies really need to focus very heavily on complying with the statute and getting their own performance measures right," says G. Edward DeSeve, acting OMB deputy director for management.

But Ginni Thomas, Armey's committee liaison, found the administration's approach wanting. "It does not satisfy Dick Armey that there are a handful of organizations that occasionally meet to talk about similar functions. None of those organizations has contacted us," she says. "We're surprised there hasn't been more coordination in government. There's probably a gold mine of savings and efficiencies if people would focus on [cross-cutting programs] now and not wait." Christopher Mihm, GAO assistant manager for federal management issues, also had problems with the administration's plan. "OMB is focusing on management issues vs. GAO's focus on cross-cutting programs," he says. "Interagency groups are doing more heavy lifting, but they're not so good on cross-cutting issues such as job training and the like. The five-year governmentwide financial plan is a good piece of work and the CFO Council had real input, but it is not the vehicle to address cross-cutting programmatic issues."

Grading Grind

It's clear Congress will not go gently into performance planning. Already, House staffers are preparing to grade performance plans and a Results Act reform bill introduced last year by Rep. Dan Burton, R-Ind., seems likely to pass this year. Burton's bill would require agencies to add details about overlapping programs and management problems to their strategic plans and resubmit them by the end of fiscal 1998. He would have inspectors general audit performance reports and would compel OMB to submit governmentwide performance reports on the same schedule as annual agency performance reports. Performance plan grading will be conducted by the 24 cross-committee House task forces created in 1997 to educate staffers about GPRA, teach them how to grill agencies during congressional consultations over strategic plans, and enlist them in grading strategic plans.

While the administration accepted grading of strategic plans as part of the Results Act's requirement for congressional consultation, it opposes the plan to rate performance plans, especially since the law contains no similar consultation requirement for them. "I am opposed to performance plans being graded. Any 'grading' of performance plans is best done by appropriators and authorizers in the regular course of business," says DeSeve.

Grading will be made more difficult by the fact that performance plans aren't presented in a uniform format in agency budget justifications. Some are discrete documents, others are marbled throughout budget documents, rendering them difficult to compare. "What works for a holding company department with a diverse portfolio, the Agriculture Department for example, might not work so well for an integrated department like Housing and Urban Development," DeSeve says. "One size does not fit all. If you want to evaluate the usefulness of the performance plan for an agency, you've got to look at why they constructed it the way they did to meet the needs of the agency and appropriators' needs."

Limiting Legislators

In their zeal to enforce the Results Act's requirements of agencies, legislators should not overlook the challenges the law presents Congress. Hill staffers and legislators who have played active roles so far know that GPRA threatens to disrupt the traditional arrangement of appropriations subcommittees, raise serious internal squabbles over agency roles and programs, and constrain Congress' ability to develop new programs and geographically redistribute federal funds.

For example, it's not only agencies that are plagued by overlap and duplication. As GAO pointed out in an August 1997 report, many programs are "subject to multiple congressional authorization, oversight and appropriations jurisdictions" ("Managing for Results: Using the Results Act to Address Mission Fragmentation and Overlap," AIMD-97-146). Employment training programs, for example, must endure review and funding decisions by seven different appropriations subcommittees. Each of those subcommittees, as well as myriad others, may have had a role in creating a program deemed duplicative or overlapping by other legislators. The Results Act could bring ugly internecine warfare over the missions of, and need for, many programs.

Programs also will be harder to develop as the law takes hold. "In the post-GPRA environment, the administration, and to a lesser extent Congress, ought to begin the development of new programs with the goal (improving students' education, for example) and how we will achieve it (putting money into new schools, for example) and how much education will be improved as a result. We're not doing that now," says Tony McCann, clerk of the House Appropriations Labor, Health and Human Services, and Education subcommittee. DeSeve echoes the sentiment, saying, "It's very important in policy development to be able to articulate the results we expect."

In fact, old GPRA hands predict that agencies' troubles with strategic and performance planning may pale into insignificance when Results Act political squabbles break out in earnest. Mihm says he has advised agencies not to get too hung up striving for perfection in some aspects of their plans. "We tell agencies, 'Don't get wrapped up in trying to find technical fixes to political issues, in other words, the agency mission or appropriate outcome measures for interagency programs. You've got to get the stakeholders involved, especially the Hill, in working out those issues.'"

Data Dearth

GPRA not only requires reasoning out the raison d'être of existing programs and the desired results of new ones, it also means measuring outcomes. But measurement rests on good data, a commodity many say is sadly lacking in most agencies today. Without good baseline performance data, appropriators and authorizers will be unable to see progress, or the lack of it, or to use performance information in making funding and policy decisions.

"I expect there will be baseline data for a very high percentage of the measures in performance plans," DeSeve said confidently as plans were being prepared. Mihm, on the other hand, expected data in first-year performance plans to be poor, owing to the sorry state of government information systems, the inability of statistical systems to focus on program results in small target communities, and the lack of performance measures in agencies until now. "Our survey of federal managers ['The Government Performance and Results Act: 1997 Governmentwide Implementation Will Be Uneven,' GGD-97-109] showed 60 percent did not have measures of the things or services they provide," Mihm says. What the administration considers good data may not pass muster with lawmakers, Thomas predicts. "Numbers of people served is not [the same as] whether we're having an effect on the people we're serving," she says. "We're trying to find out whether federal programs are positively serving taxpayers, not whether they're pulling people through the door at a rapid rate."

The quality of data may depend on the type of agency. McCann seems more willing than Thomas to accept interim output data, such as cases processed or time per case, and more willing to concede that some outcomes, such as the number of elderly people living in poverty, can't be measured in the short term. "We can and should expect very good and accurate performance data from businesslike agencies such as the General Services Administration, Social Security Administration, Veterans Benefits Administration, and others along that line," McCann says. "They have a lot of that information and they do things analogous to the private sector. Regulatory agencies ought to be able to make it available fairly quickly: time to process cases, process on appeal, have been given a lot of thought already.

"For the third category, grants-in-aid agencies, the data is out there [for community health centers, job training programs and the like], but it's going to be very hard to collect," McCann says. Burton's bill would require agencies to discuss in their strategic plans the relative strength and weakness of their performance measurement data. Performance plans already must discuss how data will be verified and validated.

In addition to data problems, agencies also face challenges in evaluating program results. Many agencies haven't done much program evaluation until now, and those that have, have done it sporadically. "With the Hill we need to discuss how we make sure there are periodic program evaluations and that they are available," DeSeve says. "Where there are older program evaluations, we need to make them more current."

As agencies anxiously await the 1999 funding verdicts on their performance plans and reporting on those plans in 2000, they can take comfort in knowing most observers agree that only practice will bring Results Act perfection. "Each subsequent strategic and performance planning cycle can, and likely will, result in revisions," GAO wrote in its report on final strategic plans.
