
Policy Informatics is Bridging the Gap Between Researchers and Politicians

Academics are using new networks and methods to convey information in language understood by policy makers.

Academics don’t always speak the same language as practitioners. But they oftentimes have useful ideas to convey. So how do we bridge the gap between research and practice?

I’ve been asked to participate as a “practitioner” on a discussion panel at the upcoming conference of the Association for Public Policy Analysis and Management (APPAM). It is a high-powered conference of top academics from around the U.S., including the likes of Louise Comfort. This particular panel hopes to define the future research agenda for an emerging field that academics call “policy informatics.” To be clear, I was more than a little clueless about what the term even means!

What is “Policy Informatics?”

Fortunately, with Google to the rescue, I learned that “policy informatics” is “the use of information technologies and computational modeling . . . to inform policy analysis, management, and decision making.” It seems to be a relatively new academic field. But interestingly, governments at all levels, and the consulting world, have been pursuing these kinds of efforts for some years. They just have different terms for it, or subsets of it, which may be equally daunting.

Dr. Natalie Helbig, with the State University of New York at Albany’s Center for Technology in Government, is involved in a network of academics focused on policy informatics. She says “We aren't necessarily trying to build a new area of study, but our main goal is to connect the network of networks... there are people who care about the 'evidence,' there are people who care about the 'tools,' there are people who care about the 'information' etc.

“What we are trying to address (and what we believe) is that these are relatively disconnected groups of researchers . . . We think there is value in bridging these various communities together to see how we can make more effective linkages and knowledge exchanges.”

Interestingly, the field is receiving increasing prominence among practitioners. The U.S. Office of Management and Budget has recently promoted “evidence-based decision-making” as part of its efforts to improve government performance. And this has been reflected in recent legislation, such as the healthcare reform bill’s emphasis on evidence-based medicine. The most refreshing part of this new field is how interdisciplinary it is.

What Are Some of the Challenges They Could Address?

Regardless of the terms being used, the problems this field can address are real. For example, in North America, up to 22 percent of total sea port volume is empty containers. In the Port of New Jersey alone, there are 100,000 empty containers sitting in storage – worth nearly $200 million. How can shipping routes be made more efficient?
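As a hedged illustration of the kind of computational modeling behind that question, the sketch below sets up a toy “transportation problem” that repositions surplus empty containers to ports that need them at minimum cost. The surpluses, demands, and per-container costs are invented for illustration; a real model would use actual shipping data.

```python
# A toy "transportation problem": reposition surplus empty containers to ports
# that need them at minimum cost. All numbers are hypothetical.
import numpy as np
from scipy.optimize import linprog

supply = np.array([60_000, 40_000])          # empties sitting at two surplus ports
demand = np.array([30_000, 45_000, 25_000])  # empties needed at three other ports

# Hypothetical cost ($) to move one empty container from surplus port i to port j.
cost = np.array([
    [350, 900, 1_000],
    [300, 850,   950],
])
n_s, n_d = cost.shape

c = cost.ravel()  # decision variables x[i, j], flattened row-major

# Each surplus port ships no more than it holds: sum_j x[i, j] <= supply[i]
A_ub = np.zeros((n_s, n_s * n_d))
for i in range(n_s):
    A_ub[i, i * n_d:(i + 1) * n_d] = 1

# Each receiving port gets exactly what it needs: sum_i x[i, j] == demand[j]
A_eq = np.zeros((n_d, n_s * n_d))
for j in range(n_d):
    A_eq[j, j::n_d] = 1

res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand,
              bounds=(0, None), method="highs")
print(f"Minimum repositioning cost: ${res.fun:,.0f}")
print(res.x.reshape(n_s, n_d))  # containers moved on each route
```

The same structure scales to real port networks; the harder part, as the rest of this piece argues, is getting a decision maker to trust and act on the answer.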

In the U.S., handwritten prescriptions result in 2.2 million dispensing errors, leading to additional healthcare interventions or even death. Can this be done better?

These kinds of challenges extend to the use of telework, better routing of air traffic, faster processing of patents, uncovering tax cheats, determining benefit eligibility for various social programs, and more effective methods for educating students. Information, and its analysis and use, are everywhere!

Academics are looking both for interesting public policy challenges to study and for new models and tools with which to conduct those studies. But can research make a difference in the “real world?”

Can “Informatics” Make a Difference in the “Real World?”

I’ve seen clashes between good academic work and policy actors, and the academics never won. What I learned was that it is not just the data and the models that matter, but also how they are conveyed and how decision makers respond to them.

For example, some time ago I worked on a state legislator’s staff. He was chair of the House Public Education Committee and was given a briefing by academics on extensive, data-driven research on how to reduce the cost of school bus transportation while expanding access. The research was excellent and was based on regression analyses of time series data. But the chair used some colorful expletives and threw them out of the committee room. He said he could never explain regression analysis to his constituents.
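To make “regression analyses of time series data” concrete, here is a minimal sketch of that kind of analysis. The yearly ridership, route-mile, and cost figures are simulated for illustration only; they are not the study’s actual data.

```python
# Illustrative only: regress annual bus transportation cost on ridership and
# route miles, using simulated yearly data (not the actual study data).
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(2000, 2013)
t = years - years[0]

riders = 50_000 + 1_200 * t + rng.normal(0, 800, years.size)   # growing enrollment
route_miles = rng.normal(9_000, 400, years.size)                # fairly stable routes
cost = 2.0e6 + 25 * riders + 180 * route_miles + rng.normal(0, 20_000, years.size)

# Ordinary least squares: cost = b0 + b1*riders + b2*route_miles
X = np.column_stack([np.ones(years.size), riders, route_miles])
beta, *_ = np.linalg.lstsq(X, cost, rcond=None)
b0, b1, b2 = beta

print(f"Estimated cost per additional rider:      ${b1:,.0f}")
print(f"Estimated cost per additional route mile: ${b2:,.0f}")
```

Fitting the model is the easy part; the chair’s reaction above is a reminder that translating coefficients like these into plain language is where such work succeeds or fails.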

But researchers are making more progress these days because of the availability of new ways to display and explain data that make policy makers more comfortable about using the results of sophisticated research. For example, recent reports by the IBM Center and by others show that decision makers do make use of evidence and evaluation when making decisions:

  • Internal Revenue Service (IRS) analyst Shauna Henline established and documented a process for analyzing tax returns to detect patterns of abuse involving frivolous returns. As a result of her analyses, the fines were raised for filing or promoting a frivolous return. In addition, decision-makers in IRS were convinced to take action to charge tax evasion promoters with criminal fraud or pursue civil injunctions.
  • The Food and Drug Administration (FDA) now uses new program performance goals which include the total time-to-decision for approving medical devices. This is new for FDA. In the past, its leaders were reluctant to be held accountable for the length of time when an application was back in the manufacturer’s court for additional information or clarifications. However, an analysis of performance data showed that the time-to-approval was increasing despite FDA meeting most of its internal performance goals (and industry was still complaining about a slow approval process). Analysts conducted a root cause analysis and, as a result, FDA leadership agreed to a number of corrective actions. The new goals now focus on shared responsibility between government and industry for meeting timeliness goals for completing a review to approve applications for new medical devices. (A simplified sketch of this kind of time-to-decision analysis appears after this list.)
  • A senior manager at the National Institutes of Health’s Eye Institute assessed the progress of research in the field and found that it was not progressing fast enough. She shifted the Institute’s emphasis from reviews of individual grants to reviews of a portfolio of grants around a particular research topic. A colleague had been advocating support for a $50,000 grant on a particular project, but when the portfolio analysis showed that another NIH institute was investing $5 million in a similar project, he dropped his support and those funds were directed to another research project.
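Here is the simplified time-to-decision sketch promised in the FDA example above. It splits total elapsed review time into days the application sat with the agency versus days it sat with the manufacturer; the four review records are fabricated purely for illustration.

```python
# Illustrative only: decompose total time-to-decision for device applications
# into agency review days vs. days waiting on the manufacturer (fabricated data).
import pandas as pd

reviews = pd.DataFrame({
    "application":        ["A-101", "A-102", "A-103", "A-104"],
    "total_elapsed_days": [290, 410, 355, 505],
    "agency_review_days": [170, 175, 180, 178],  # roughly flat: internal goals are met
})
reviews["manufacturer_days"] = (
    reviews["total_elapsed_days"] - reviews["agency_review_days"]
)

print(reviews)
print("\nMean agency review days:    ", round(reviews["agency_review_days"].mean(), 1))
print("Mean days with manufacturer:", round(reviews["manufacturer_days"].mean(), 1))
print("Mean total elapsed days:    ", round(reviews["total_elapsed_days"].mean(), 1))
# A shared timeliness goal would track total_elapsed_days, not just agency_review_days.
```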

Taken together, these examples show that greater reliance on informed analysis is making a difference in government agencies. A recent research report noted that the difficult part was not the technology or the analyses, but rather the creation of a cadre of “data detectives” in an agency and the acceptance of the research results by decision makers.

What Are Some of the Tools They Could Use?

The reason this field has evolved so quickly in recent years is that the tools and data have become more sophisticated – yet easier to use. The Obama Administration’s emphasis on “open data” has created new opportunities for researchers to access entire, current data sets rather than relying on small, independently collected statistical samples.
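A minimal sketch of that point, with a simulated “full data set” standing in for an open administrative file: with the complete data, an analyst can compute the population figure directly instead of estimating it from a small sample.

```python
# Illustrative only: a population figure from the full data set vs. an estimate
# from a 200-case sample (the "full data set" here is simulated).
import numpy as np

rng = np.random.default_rng(42)
full_dataset = rng.gamma(shape=2.0, scale=15.0, size=250_000)  # e.g., per-case processing days

population_mean = full_dataset.mean()
sample = rng.choice(full_dataset, size=200, replace=False)
sample_mean = sample.mean()
sample_se = sample.std(ddof=1) / np.sqrt(sample.size)

print(f"Full data set mean:       {population_mean:.1f} days")
print(f"200-case sample estimate: {sample_mean:.1f} ± {1.96 * sample_se:.1f} days")
```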

Likewise, new analytic tools and approaches offer new ways to collect, analyze, and display large amounts of information in easy-to-interpret ways. Hopefully, the value and uses of these and other techniques will be discussed by the APPAM panel.
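One hedged example of what “easy-to-interpret” can mean in practice: the same finding presented as a simple labeled chart rather than a regression table. The figures are invented.

```python
# Illustrative only: show an analytic result as a chart a committee chair could
# read at a glance (the figures are invented).
import matplotlib.pyplot as plt

options = ["Current routes", "Consolidated routes"]
cost_per_student = [612, 548]  # hypothetical annual bus cost per student, dollars

fig, ax = plt.subplots(figsize=(5, 3))
bars = ax.bar(options, cost_per_student, color=["#999999", "#2b6cb0"])
ax.bar_label(bars, fmt="$%d")
ax.set_ylabel("Annual bus cost per student ($)")
ax.set_title("Estimated effect of route consolidation (illustrative)")
plt.tight_layout()
plt.show()
```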

What Are Some of the Research Opportunities on the Horizon?

Recent trends toward increased complexity and the government’s pursuit of large-scale initiatives, coupled with new and pending legislation, will give additional oomph to the whole field in the near future, making it even more relevant. The GPRA Modernization Act requires agencies to measure and manage progress on key priorities on a quarterly basis. It also requires agencies to develop staffs with analytic capabilities. A pending bill, the Digital Accountability and Transparency Act, would extend the financial reporting requirements of the Recovery Act (see recovery.gov) to all government spending. The Obama Administration, in parallel, is taking administrative steps to expand financial reporting and to use evidence-based approaches to making budget decisions.

These efforts will push government to be more data-driven in its approach. Other initiatives – such as the Obama Administration’s commitment to identify poor performing programs and policies through the budget process -- will as well.

Since this is a broad and growing topic area for research, I’m sure there are other issues you think should be put on the APPAM research agenda. I’m certainly looking forward to the panel’s discussion – and finding ways to bridge the gap between research and practice!

(Image via Tommistock/Shutterstock.com)