Survey finds agencies are not taking full advantage of data analysis
Even as more federal agencies embrace the idea of increased data analysis, many don’t know what to do with this newfound information.

In a survey co-sponsored by the Association of Government Accountants and Accenture, and released Tuesday morning, 67 percent of federal officials said their organization analyzes data and uses the findings to make fact-based decisions, a process known as “data analytics.”

Of the respondents who said their agency incorporates analytics, 46 percent reported a “low” integration of analytics into agency management. Another 46 percent said their agency had “medium” levels of integration, while 8 percent said their agency practiced “high” levels of integration.

Approximately 40 chief financial officers, deputy CFOs and chief information officers were interviewed for the survey between October 2011 and January 2012. The survey did not specify how many agencies they represented, but co-author Helena Sims, director of intergovernmental relations for AGA, said they constituted “a good cross-section.” Additionally, the survey authors interviewed officials from eight offices within six federal agencies, including the Agriculture, Defense and Education departments and the U.S. Postal Service; those interviews focused on specific projects that use data integration.

Half the respondents who said their agency used analytics also said a contractor helped develop their data analytics system. During the panel Tuesday, survey co-author and primary researcher Steve Sossei said he suspected the true number might be higher, as agency officials surveyed may not have been aware of which tasks were performed by contractors.

“Really there’s no right-or-wrong approach to the development of data analytics,” said Sossei, who is a retired director of state audits from the New York State Comptroller’s Office. “It’s what fits in your organization. Everything within life is a balance. Sometimes you have to budget, sometimes the time is right to go big.”

For the purposes of the survey, low levels of analytics integration were defined as “data analytic processes are conducted in silos with little consistency or standardization.” Medium levels meant “data analytic techniques are used inconsistently” with “some linkage to management budget and planning functions.”

The low levels of integration concerned Sossei, who warned about the consequences of analytics that are poorly connected to agency operations.

“You don’t want to have a system that just produces results without identifying what they’re going to do, integrating them into your operations, and feeding that back into your system and moving forward and producing better results on a more timely basis,” he said, adding that increased analytics use and greater integration would require more resources as well as different management styles.

According to the survey, “nearly all” government data analytics systems focus on financial performance, improper payments and identifying targets for high-risk investigations. Two of the panelists demonstrated how their departments were employing analytics for these purposes: William McGee explained that Defense has used near real-time business transaction analysis to prevent more than $4 billion in improper payments through April, and Edward Slevin discussed how Education is using analytics to attempt to combat student financial aid fraud rings.

Data analytics is projected to be a $92 billion industry over the next five years, according to Kevin Greer, executive director of Accenture Finance and Enterprise Performance. “Those who adapt analytics will have a competitive advantage,” Greer said.
