
Laboratories of Democracy in Action: Investing in What Works

The mere availability of information is not enough to get policy makers to actually use it.

In states across the country, the evidence-based policy and practice movement is taking root. This “do what works” approach is global and is premised “on the assumption that increased use of research evidence will lead to better outcomes in terms of effectiveness, accountability, and sustainability,” according to a recent article by Roman Kislov et al. in Public Administration Review.

At the federal level, this is reflected in the passage earlier this year of the Evidence Act, which requires agencies to develop “evidence agendas” for research on what works in their programs in coming years.  

But, as we learned with the adoption in the 1990s of the Chief Financial Officers Act and the Government Performance and Results Act, the mere availability of information was not enough to get policy makers to use it in their decision making. So what strategies might get officials to actually use data and evidence to inform program and budget decisions?

Top-Down v. Bottom-Up  

Kislov and his colleagues examined how public organizations use evidence in practice by studying how evidence-based nursing practices were implemented in four countries. Their research, however, told a bigger story. Two countries—the United Kingdom and Australia—leveraged the “disciplinary power of standards and audits” to engage practitioners in evidence-based nursing practices, an approach built on top-down, centrally driven, codified standards. In contrast, two other countries—Canada and Sweden—relied on a bottom-up “soft power” approach of “designated facilitator roles with less emphasis on performance standards.” These facilitators developed bottom-up knowledge flows outside the lines of formal supervision and performance management systems.

In the United States, states seem to be further along in some respects than the federal government in developing and using evidence and data. There is a rich variety of examples, as exemplified by the approaches of two national organizations: the National Association of State Budget Officers (NASBO) and Results for America (RFA), a nonprofit that advocates for the use of evidence.

A Facilitating Approach 

In July, NASBO released the results of a 50-state survey it conducted to identify statewide initiatives to use data and evidence in decision making. Based on the responses, it created an inventory of 108 initiatives covering 44 states and the District of Columbia.

The fairly open-ended survey identified a broad range of initiatives, with the goal of inspiring other states and connecting leaders so they can share lessons as they develop:

  • Evidence-based policies
  • Performance budgeting
  • Performance management
  • Process improvement
  • Data and analytics

Based on his observations of the survey data, NASBO executive director John Hicks says the two largest shifts he has seen in his career are an increase in the availability of automation and analytic capacity, and an increased willingness of state-level decision makers to use evidence-based information to sharpen both strategic and policy focus. He observed that this growth has occurred mainly in two policy areas, corrections/criminal justice and social services/health, with less of an increase in education and transportation programs.

Some highlights of the survey include:

  • Use of performance dashboards. For example, Colorado and Minnesota have created public-facing dashboards that track progress toward performance and outcome goals. Results so far suggest that these dashboards inform decisions to shift resources among programs and priorities rather than to increase overall spending.
  • Investments in analytic skills and capacity. One of the insights Hicks had about making an evidence initiative sustainable is the need to create analytic skills and capacity and embed them into the culture of state government. He sees this happening. For example, California reports that it created a permanent Director of Performance Improvement with resources to help create a data-driven and performance management framework that integrates strategic planning, risk management, data management and analysis, and more. Likewise, Georgia is creating a Data Analytic Center to allow for more real-time and cross-agency data collection and reporting. In addition, states are working with universities to support policy labs with tech advisors, such as the Policy Lab at Brown University in Rhode Island.
  • Demonstrated use informs policy changes. The survey also identified several specific examples of how the use of evidence informed policy changes. California enacted a Whole Person Care pilot program to coordinate health and social services for individuals in order to produce better health outcomes. Funding is allocated based in part on how well the various pilots perform. Similarly, in Connecticut, the mix of service interventions and funding for its diabetes prevention program is driven by demonstrated results.

A Standards-Based Approach 

In early October, Results for America released its assessment of the maturity of states’ use of evidence, based on its 15-point State Standard of Excellence framework, a national standard the organization developed to define the data and infrastructure states need in order to invest in what works. Michele Jolin, CEO and co-founder of RFA, said, “These effective and efficient initiatives provide a roadmap for how every state can use evidence of what works when making budget, policy and management decisions.” RFA’s 2019 report identifies “125 leading and promising practices, policies, programs and systems for using evidence and data” across 33 state governments that meet its standards, including:

  • Colorado, which was highlighted as a leading example for Criteria 1: Strategic Goals. In 2019, Colorado launched the Governor’s Dashboard, which highlights four high-priority strategic goals. A cabinet-level working group was formed for each goal; these groups develop strategies and metrics that tie back to individual agency performance plans and link agency budget requests to those activities.
  • Washington State, which was highlighted as a leading example for Criteria 4: Data Policies/Agreements. The state’s Department of Social and Health Services created an Integrated Client Database with data from 10 state agencies, covering 40 different data systems, serving 2.4 million individuals. These data are used for various purposes including measuring progress, program evaluation, and predicting workloads and future service needs. According to state officials, program improvements have led to savings of over $20 million. 
  • Minnesota, which was highlighted as a leading example for Criteria 10: Evidence Definition and Program Inventory. A 2015 state statute led to the creation of a number of inventories of evidence-based programs, including Minnesota Inventory, a statewide clearinghouse of more than 400 state-run programs. As part of the inventory, the state developed a users’ guide with definitions for evidence in order to categorize whether interventions are effective. The state says these resources help guide funding decisions.

RFA has also developed standards of excellence for federal agencies, as well as for local governments through the Bloomberg Philanthropies What Works Cities Certification initiative.

Weighing the Trade-Offs  

Both the bottom-up facilitation-and-learning approach and the top-down standards-based approach pursue a common goal: developing evidence and using it to make decisions. But they rely on different theories of change, one focused on inspiration and the other more on competition.

The more open-ended facilitation approach encourages flexibility and innovation, especially in a field that is still evolving, and it is less prone to becoming a compliance-based exercise. But it may not provide enough structure for those just beginning the journey who are looking for guidance on where to start and what to do.

The standards-based approach, on the other hand, provides a framework and an aspirational vision of what a state might strive to achieve. But unless managed deftly, it could devolve into a check-the-box compliance exercise.

In reality, it is probably not an either-or choice between strategic approaches, but rather a both-and approach. And that may be how the field matures best.