Five Tips from Federal Innovation Entrepreneurs

Nearly 200 enthusiastic innovators from more than two dozen agencies gathered to share ideas and inspire each other.

Using “lightning round” presentations, nearly a dozen presenters shared their stories. Andy Feldman from the Department of Education, who coordinated the event, noted that the goal wasn’t innovation for innovation’s sake, but rather to use innovation as a tool to tackle mission-related performance challenges: “We’re here to focus innovation on our agencies’ biggest challenges and opportunities.” Delegated Deputy Secretary of Education John King also welcomed attendees, urging them to put ideas into action.

As if to reinforce these messages, the same morning at a meeting of the President’s Management Council, deputy secretaries discussed ways to expand the use of evidence, evaluation, and innovation in their agencies. Several White House policy councils participated in the meeting, asking agencies to “articulate their strategy to advance the use of evidence in decision making . . . as a part of their budget submissions, due to OMB on September 14, 2015.”

Some of the useful insights from the various presentations at the Innovation Exchange session reflected issues discussed by the deputy secretaries, as well:

Use behavioral insights. Recent studies from behavioral economics and psychology have found that people do not always make rational decisions regarding their well-being. But making it easier for people to make better choices (such as putting healthy foods at eye level in a cafeteria) or reframing the costs and benefits involved can help people make better decisions. To see if these behavioral approaches might help agencies improve policy and practice, the White House created a small social and behavioral sciences team to pilot and champion their use.

Will Tucker, who is on the White House team, described how the Defense Department was facing a change in its payroll system that would require 140,000 service members to re-enroll in their Roth Thrift Savings Plans. Defense was concerned that service members might not re-enroll, so the White House team helped Defense redesign the email encouraging them to do so. By piloting the redesign and comparing early responses against the original language, they were able to increase re-enrollment by 22 percent. Tucker says that by applying behavioral insights and rapid pilot testing, agencies can better leverage their resources to improve outcomes at lower cost.
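For readers curious about the mechanics, the comparison Tucker describes is essentially an A/B test on email variants. Here is a minimal sketch of such a comparison in Python; the enrollment counts are hypothetical (chosen so the redesign shows roughly a 22 percent lift), since the article does not report Defense’s actual sample sizes or test method.

```python
# Minimal sketch of an A/B comparison like the one described above.
# All numbers are hypothetical; the article does not report the
# underlying sample sizes or response rates.
from statistics import NormalDist

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference between two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a, p_b, z, p_value

# Hypothetical pilot: original email (A) vs. redesigned email (B).
p_a, p_b, z, p = two_proportion_z(success_a=900,   n_a=10_000,
                                  success_b=1_098, n_b=10_000)
print(f"original {p_a:.1%}, redesign {p_b:.1%}, "
      f"lift {(p_b / p_a - 1):.0%}, p = {p:.4f}")
```

The point of such a test is simply to rule out the possibility that the redesign’s better numbers are noise before rolling it out to all 140,000 service members.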

Apply data-driven strategies. A number of agencies have adopted the “PerformanceStat” model as a way of getting decisions made and implemented, based on data and evidence of what works and what doesn’t. Carter Hewgley, the director of analytics for FEMA, described how FEMAStat has created a problem-focused “repetitive demand for data from leadership” by “working smarter through data analytics.”

For example, after a flood has receded, where should FEMA disaster insurance assistance teams go within a community? FEMAStat meetings of top leaders from across the agency led to an answer: integrate data from different internal databases that show via GPS where assistance teams are, overlay those positions on a street map, and then layer on flood insurance maps to see which homes were most likely to have flooded.
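As an illustration only: the overlay Hewgley describes amounts to a geospatial join, layering point data (team positions, home locations) onto flood-zone polygons. Below is a minimal sketch using the open-source geopandas library; the file names, column names, and zone codes are hypothetical, since FEMA’s internal systems are not public.

```python
# Illustrative sketch of the kind of geospatial overlay described above.
# File names, columns, and zone codes are hypothetical assumptions.
import geopandas as gpd

# Hypothetical inputs: team GPS positions, home locations, and
# flood-insurance-rate-map zones for the affected community.
teams = gpd.read_file("team_positions.geojson")
homes = gpd.read_file("homes.geojson")
flood_zones = gpd.read_file("firm_zones.geojson")

# Reproject everything to a common coordinate system before joining.
homes = homes.to_crs(flood_zones.crs)
teams = teams.to_crs(flood_zones.crs)

# Spatial join: tag each home with the flood zone that contains it.
homes_in_zones = gpd.sjoin(homes, flood_zones, how="inner", predicate="within")

# Homes in high-risk zones (e.g., a hypothetical "AE" code) are the
# likeliest targets for assistance teams once the water recedes.
priority = homes_in_zones[homes_in_zones["zone_code"] == "AE"]
print(f"{len(priority)} homes in high-risk zones; "
      f"{len(teams)} teams currently in the field")
```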

Hewgley says that FEMAStat sets the tone for the use of data by top decision makers and that the next step is to provide such data and analytic tools directly to frontline employees so they can solve problems themselves.

Innovate from within. Read Holman, with the IDEA Lab at the Department of Health and Human Services, asked, “Why don’t good ideas take off?” He says HHS tries to address this with its “Ignite Accelerator,” which mimics the dynamics of a venture-capital-backed startup program. The goal is to “create a constant churn of energy.”

The Accelerator is a time-bound, safe space for HHS staff to offer and test ideas on how to improve their corner of government. Teams selected into the competitive program give 50 percent of their time to their project over a 90-day period. In return, they receive mentorship, seed funding, and weekly check-ins on progress from IDEA Lab staff. At the end, each team makes a pitch for funding and support to take its project to the next phase. Holman says that about one-third of the projects have received post-Ignite funding and support from their local office or agency, and a few have pitched for and received funding from an HHS Ventures Fund. For example, a team from the Health Resources and Services Administration worked with a team from the Department of Education on a program that helps nurses repay their federal student loans. This pilot effort ultimately automated an existing time-intensive manual process, cutting time and effort by 75 percent.

Strengthen the capacity to use evidence. Naomi Goldstein, the head of research and program evaluation at HHS’s Administration for Children and Families, says that a key strategy for creating a sustained agency commitment to rigorous program evaluation is a written evaluation policy. She described how ACF in 2012 created a formal policy on evaluation, outlining goals and principles related to rigor, relevance, transparency, independence, and ethics. One reason that matters: without a commitment to integrity and transparency, an evaluation’s credibility is suspect. She notes that other agencies across the federal government are creating similar policies.

Joy Lesnick, the acting commissioner at the Institute of Education Sciences in the Department of Education, described how the agency strengthens practitioners’ and decision makers’ capacity to use evidence through its What Works Clearinghouse, which has curated more than 10,000 studies evaluating what works to improve student achievement. She says the institute is turning technical studies into easily consumable infographics and videos, grouped by topic, to encourage non-researchers to use the results in making fact-based decisions about education policy. She also noted that other departments across the government are creating similar clearinghouses in other policy areas, such as labor, health, and criminal justice.

Use outcome-focused grant designs. Kathy Stack, a former executive at the Office of Management and Budget, described multiple approaches federal agencies are taking to focus on program outcomes, guided by evidence and data. For example, she pointed to the increased use of “tiered-evidence” grant programs, in which new intervention approaches can be piloted for small amounts of money, validated with slightly larger amounts, and then scaled up once there is strong evidence that a particular approach works.

She also pointed to recent legislative initiatives with bipartisan support, such as the cross-agency Performance Partnership Pilots initiative for disconnected youth. The 10 pilots will allow selected providers to blend federal funding across Education, Labor, and Health and Human Services programs, and will allow individual program requirements to be waived in exchange for strong accountability for program results. She said the first round of pilots is to be announced this fall.

Stack said she sees a bipartisan hunger for research, evaluation, and experimentation. And judging by the response of the Innovation Exchange participants, there’s a hunger for it among the rank and file of the federal workforce, as well.

(Image via Lightspring/Shutterstock.com)