How to Achieve Your Agency’s Objectives One Meeting at a Time

Data-driven meetings made a big splash in government nearly a decade ago. New research shows what it takes to make them work.

The concept of data-driven meetings was popularized in the mid-1990s by the New York City Police Department, which dubbed them “CompStat” meetings. Through systematic analysis of crime trends, the meetings were credited with contributing to a significant drop in crime, and the concept was eagerly replicated by other cities (Citi-Stat) and a number of states (State-Stat). Harvard professor Bob Behn studied this phenomenon and wrote a book describing “Performance-Stat” as a leadership style and a way of thinking and behaving, not just an administrative process innovation.

At the federal level, data-driven meetings became more prevalent in agencies beginning in 2009, according to a 2011 report by Harry Hatry and Elizabeth Davies of the Urban Institute. Their use was reinforced by requirements in the GPRA Modernization Act of 2010, which the Government Accountability Office assessed in a 2015 report. Both the Urban Institute and GAO reported positive effects and identified some best practices.

Management innovations such as data-driven reviews often spread unevenly across different policy, geographic, and cultural settings; sometimes an innovation transfers well, other times it does not. Data-driven meetings, however, quickly found an international following.

International Approaches

The United Kingdom created its own version of data-driven meetings in 2001, orchestrated by a newly created organization reporting directly to Prime Minister Tony Blair. The Prime Minister’s Delivery Unit initially focused on improving citizen-facing services in four major policy areas: education, health, transportation, and crime. Its first head, Michael Barber, subsequently wrote a book in 2008 about how the unit was set up and operated. His advice was similar to Behn’s: structured, data-informed processes are important, but the key to success is for the political leader to adopt those processes as the basis for a results-oriented, problem-solving leadership style.

The Delivery Unit concept spread rapidly in Latin America, beginning about five years ago. A recent study by the Inter-American Development Bank assessed its use in 14 Latin American governments and concluded that, “under certain preconditions,” its use improved management and attained “results that have a direct impact on citizens.” For example, Chile and Peru focused on complex goals at the outcome level, such as reducing childhood malnutrition and increasing student math test scores. Other countries, including Costa Rica and Guatemala, focused on delivering services such as supplying medicines to hospitals or paving roads.

The study’s authors, Mariano Lafuente and Sebastián González, emphasize that the ultimate aim of delivery units “is to ensure that promises are kept.”

But success did not come without institutional and political challenges. The authors’ survey of delivery units across Latin America identified a series of obstacles to implementing the approach.

Key Lessons Learned

Interestingly, the lessons learned in Latin America have a familiar ring, since they parallel many of the lessons from Behn’s research on the state and local Performance-Stat approach:

  • The chief executive likes the idea of a delivery unit but doesn’t devote the time to follow-up meetings or delegate responsibility to someone who has the authority to deliver results. Top leaders must demonstrate ownership, note the authors: “If the highest political authority does not take ownership . . . by devoting time in quarterly or at least biannual follow up meetings . . . the model will simply not work.”
  • The delivery unit is used by the chief executive for firefighting and not to achieve longer-term goals, and priorities are ill-defined or constantly changing.
  • The delivery unit is introduced midway through a term of office. Typically, at that point, central government positions are filled by other stakeholders and decisionmaking processes are already in place and difficult to change.
  • There’s too much emphasis on collecting data to monitor gaps in performance, and not enough on learning and problem-solving. Creating scorecards to monitor progress and meet milestones doesn’t necessarily lead to results. The goal should be to identify strategies for improvement and have mechanisms that allow rapid redeployment of resources to pursue the revised strategies.
  • The head of the delivery unit lacks the necessary political profile and influence. Or, the delivery unit team lacks the technical skills necessary to add value to the efforts being made by agencies.
  • Other institutional actors feel threatened by a perceived intrusion into their roles and responsibilities and withhold support.
  • The line agencies do not see the delivery unit as a value-adding partner. It is critical to the success of delivery units to “be seen by them as a partner rather than a rival.” If they are not, the line agencies won’t share information and will shut the unit out of efforts to develop joint solutions to identified challenges. This includes working with budget teams to ensure resources are available to tackle the challenges.
  • Public reporting of results is always rosy, and credibility is lost in the eyes of the public. Political actors will begin to dismiss the delivery unit’s reports as propaganda, and the veracity of its data will be questioned.

The authors conclude that data-driven meetings using the delivery unit approach “are destined to become part of a wider ecosystem of public management innovations” used by agencies to improve decisionmaking, strengthen coordination between agencies and levels of government, and leverage better performance. They believe this is the case not only because of the widespread adoption of this approach by so many different kinds of governments across the globe, but also because digital transformations in government are allowing data to be easily collected, shared, and analyzed.