Defining Dashboards

Agencies are boosting transparency online, but making data meaningful is the real trick.

In the spirit of transparency, federal agencies have begun charting their performance for all to see, using online dashboards replete with pie charts and color-coded diagrams that measure progress in specific missions. Federal Chief Information Officer Vivek Kundra has said dashboards launched during the next year will increase visibility into "every operation of government," from human resources to contracting. But creating the online displays is the easy part, say management and information technology specialists. The hard part is identifying performance indicators that are measurable and meaningful.

"What I've found unfortunately is that sometimes people are racing into technology before they've found the right measure," says Jon Desenberg, senior policy director at the Performance Institute, a research group focused on results-oriented government.

Susie Adams, chief technology officer for Microsoft Federal, adds, "You could have a beautifully designed dashboard that doesn't report data that is useful, if that data is suspect or not valuable in some manner." She points to a March Government Accountability Office report that found agencies were submitting incomplete and inaccurate data to USASpending.gov, a site that was built to track federal contract and grant awards.

To identify the appropriate data, Adams recommends hiring a consultant to drill down to five or 10 metrics. "A lot of times you need to ask people outside the organization" for opinions, according to Desenberg. The White House has taken this approach. Last fall, the Office of Management and Budget formed a task force to develop metrics for gauging the information security postures of each agency. The task force included the U.S. cybersecurity coordinator and government and private sector representatives from the federal Chief Information Officers Council, Council of Inspectors General on Integrity and Efficiency, National Institute of Standards and Technology, Homeland Security Department, and the Information Security and Privacy Advisory Board. "Metrics are a policy statement about what federal entities should concentrate resources on," Kundra told a House subcommittee this spring.

With the help of the task force, the White House decided to assess IT defenses through interviews with agency leaders, a new digital tool that continuously monitors security controls, and agency estimates of spending on personnel, reporting, security management, and certification and accreditation.

One of the reasons OMB likes dashboards is that they are a proven private sector management tool. A big priority for the Obama administration is bridging the gap between technology in the federal sector and in the business world. Tech titans such as Google and Microsoft use such applications internally to optimize performance and externally to increase transparency. In 2009, Google launched a status dashboard to update customers on the day-to-day reliability of its Web services, including Google Talk and Google Video. The site was prompted by a Gmail outage that left customers unable to access e-mail for two and a half hours and in the dark as to why. Microsoft manages all its business lines using dashboards.

Federal Chief Performance Officer Jeffrey Zients, who has 20 years of corporate experience, announced that the government soon will launch a "performance portal" to track its progress on goals by thematic area, agency and program.

Some say agencies should organize their performance markers into four categories:

  • Inputs: expenditures per activity
  • Processes: regulations governing the activity
  • Outputs: what actually has been produced
  • Outcomes: the impact of the outputs on the community

For an education department dashboard, an input could be the amount of money spent on each child in a classroom. A process might be the type of curriculum applied or initiatives undertaken to retain qualified teachers. An output could be the exam results for a particular school. And the outcomes might be whether the school's students are accepted at four-year colleges or obtain jobs.
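To make the taxonomy concrete, here is a minimal sketch of how a dashboard might tag and group metrics by these four categories, assuming a simple Python representation; the metric names and figures are hypothetical illustrations drawn from the education example above, not actual agency data.

```python
# Minimal sketch: one way to structure dashboard metrics by the four
# categories above. All names and figures are hypothetical illustrations.
from dataclasses import dataclass


@dataclass
class Metric:
    category: str   # "input", "process", "output", or "outcome"
    name: str
    value: object   # a dollar amount, label, or rate, depending on the metric


# Hypothetical education-department metrics mirroring the example above.
metrics = [
    Metric("input",   "spending_per_pupil",       9_500),             # dollars
    Metric("process", "curriculum_type",          "standards-based"),
    Metric("output",  "school_exam_pass_rate",    0.82),              # share passing
    Metric("outcome", "college_or_job_placement", 0.74),              # share placed
]

# A dashboard view might group metrics by category before charting them.
by_category = {}
for m in metrics:
    by_category.setdefault(m.category, []).append(m)

for category, items in by_category.items():
    print(category, "->", [m.name for m in items])
```

Grouping the raw measures this way keeps the distinction visible between what an agency spends (inputs), how it operates (processes), what it produces (outputs) and what actually changes for the public (outcomes).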

On a public dashboard, citizens typically prefer to see factors that are relevant to them. "You've got to realize that people are interested in what they're interested in, not what you're interested in telling them," says Greg Parston, director of the Accenture Institute for Health and Public Service Value, a research division at the consulting firm that focuses on public service delivery. Parston says New York City's reporting portal, NYCStat, does a good job of targeting a diverse audience. The dashboard segments performance reviews into citywide administration, social services, economic development and several other thematic areas. Within each theme, users can view citywide performance over time or focus on more detailed measures and shorter time frames.

Agencies must learn to accept that dashboards sometimes shed light on results that federal officials are not happy to reveal. Some probably were embarrassed last fall, when the stimulus-tracking dashboard Recovery.gov displayed first quarter spending results showing that certain department-sponsored projects were creating few jobs, in some cases zero. In addition, one of the Obama administration's first report cards, the IT Dashboard, showed that 18 percent of agency technology investments had triggered schedule concerns.

By listening to citizen feedback on Recovery Act job figures, the White House later realized that its formula for calculating job creation was not the best metric. OMB initially asked funding recipients to report the number of jobs created or saved over the course of the project. Now recipients only have to count jobs funded with stimulus money during a given quarter and do not have to calculate the number of jobs saved.
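The difference between the two counting rules is easy to see in a small sketch; the record layout and figures below are hypothetical illustrations, not the actual OMB reporting format.

```python
# Minimal sketch contrasting the two reporting rules described above.
# All record structures and figures are hypothetical illustrations.

quarterly_reports = [
    # quarter, jobs funded with stimulus money that quarter, estimated jobs "saved"
    {"quarter": "2009Q3", "jobs_funded": 12, "jobs_saved": 5},
    {"quarter": "2009Q4", "jobs_funded": 9,  "jobs_saved": 4},
]

# Original rule: jobs created or saved, accumulated over the life of the project.
old_metric = sum(r["jobs_funded"] + r["jobs_saved"] for r in quarterly_reports)

# Revised rule: only jobs funded with stimulus money in the given quarter,
# with no estimate of jobs "saved".
new_metric = quarterly_reports[-1]["jobs_funded"]

print("cumulative created-or-saved:", old_metric)  # 30
print("funded this quarter:", new_metric)          # 9
```

The revised rule trades a broader but harder-to-verify estimate for a narrower count that recipients can report consistently each quarter.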

"They tweaked their measures. And that's natural," Desenberg says. An agency's gauge of success could change over time and require more manual analysis, such as customer satisfaction surveys, or automated tabulations, such as roadway sensors. "Let's make sure measures are both used by people and are actually useful," he adds.
