Decoding the Language of Behavioral Science for Government Officials

Before the techniques can be used to improve program outcomes, it’s helpful if we can agree on what the terminology actually means.

“Applying behavioral insights in the right context can lead to substantial improvements in program outcomes,” writes behavioral scientist Amira Choueiki Boland in a 2016 Public Administration Review article.

But just what are these insights derived from the academic field of behavioral science that can be applied in government? It’s a complex field wrapped in a technical language that takes some decoding.  

Following are some of the underlying concepts and terms, at least as they are beginning to appear in the public administration literature. Because the field is still evolving, different language is sometimes used to describe the same concepts, and the ways concepts are organized vary between authors. As a result, my descriptions should be seen as a beginner's guide.

‘System 1 and System 2’ Thinking 

In a 2018 Public Administration Review article, Nicola Bellé and his colleagues briefly describe the historical evolution of some of the concepts underpinning behavioral science.  They note that before the 1940s, the dominant model used to describe decision making “features a rational decision maker who has clear and comprehensive knowledge of the environment, a well-organized system of preferences, and excellent computational skills to allow for the selection of optimal solutions.”

However, in the late 1940s and 1950s, scholars began to question this approach, noting that “decision makers are endowed with bounded rationality.”  As a result, “people make decisions for themselves and for others by relying on a limited number of heuristic principles [mental short cuts] that reduce the complex tasks of assessing probabilities and predicting values to simpler judgmental operations.”

Based on this new theory, “decision makers are prone to cognitive biases [errors in thinking] that systematically affect their estimates, judgments, and choices in any domain.”

What Is “System 1 and System 2” Thinking?  Pioneering psychologists Daniel Kahneman and Amos Tversky describe the differences between the use of heuristics and rational decision making as System 1 and System 2 thinking, where:

  • System 1 thinking is perceptual, fast, intuitive, automatic, and effortless. An example is judging the potential actions of other drivers while driving home from work using the same route each day. These mental shortcuts reduce complexity and allow fast, effortless, automatic, and associative decision making. 
  • System 2 thinking is reason-based, slow, takes mental effort, and is rule-governed. Judgments are based on intentional and explicit processes. An example is choosing a health plan. Sometimes it involves the use of external decision support models, software, or group decision making. 

Under System 1, the use of heuristics (rules of thumb/mental shortcuts) can be effective in that they reduce complexity. However, they tend to lead to systematic errors, which are called “cognitive biases.”

Cognitive Bias

What exactly is cognitive bias? Dr. Travis Bradberry writes: “Cognitive bias is the tendency to make irrational judgments in consistent patterns . . . Researchers have found that cognitive bias wreaks havoc by forcing people to make poor, irrational judgments . . . Since attention is a limited resource, people have to be selective about what they pay attention to in the world around them. Because of this, subtle biases can creep in and influence the way you see and think about the world.”

How do cognitive biases work? According to Kendra Cherry, “A cognitive bias is a type of error in thinking that occurs when people are processing and interpreting information in the world around them . . . They are rules of thumb that help you make sense of the world and reach decisions with relative speed.”

Cherry elaborates, noting that: “When you are making judgments and decisions about the world around you, you like to think that you are objective, logical, and capable of taking in and evaluating all the information that is available to you. Unfortunately, these biases sometimes trip us up, leading to poor decisions and bad judgments.”

A Wikipedia article catalogs 170 different kinds of cognitive biases. John Manoogian III developed a codex that organizes this inventory of cognitive biases into four categories:

  1. What should we remember? (e.g., discarding specifics in order to create generalities)
  2. Too much information (e.g., focus on details that reinforce pre-existing beliefs)
  3. Need to act fast (e.g., bias towards status quo)
  4. Not enough meaning (e.g., we fill in characteristics with stereotypes)

Intervention Techniques

Following are some examples of behavioral intervention techniques that leverage the basic concepts of System 1/System 2 thinking and cognitive biases. Many are based on a 2019 literature synthesis by Paul Battaglio, Jr., et al., in Public Administration Review:

The Use of “Nudge” or Choice Architecture. Nudging and choice architecture are useful tools for influencing the choices or behaviors of citizens and government workers. According to Battaglio: “nudge theory systematizes the use of behavioral science to influence high-stakes choices through low-powered incentives . . . A nudge is any aspect of the choice architecture that alters people’s behavior in a predictable way without forbidding any option . . . Nudges are not mandates. Putting fruit at eye level counts as a nudge. Banning junk food does not.” 

Example: The Social Security Administration’s Supplemental Security Income (SSI) program provides a monthly cash benefit to the disabled, blind, or elderly poor. Less than 60% of those eligible apply, in part because of perceived administrative barriers. So SSA sent letters to people it judged might be eligible, telling them that the application process was simple and stating the maximum benefit level. It tested several variations of the letter and found that 6% of those receiving a letter applied, versus 1% of those who did not receive a letter. Ultimately, the number of letter recipients who qualified for the program was 340% greater than the number of qualifying non-recipients.

Opting for the Status Quo. In this form of cognitive bias, decision makers “tend to prefer the status quo option as the number of viable alternatives increase.” That is, when more options become available, the decision maker is more likely to prefer sticking with the status quo, such as the same contractor, same doctor, or same appliance. According to Richard Thaler: “the most powerful nudge we have in our arsenal is simply to change the default . . . The default is what happens when you do nothing.”

Example: Public health professionals are being trained to understand the role of status quo bias in decisions made by patients. They are using this greater understanding to increase participation rates in organ donation programs, vaccination campaigns, and HIV screening by asking participants to opt out of participation rather than asking them to opt in.

The Use of Outcome Framing. How a choice among alternatives is framed typically influences the selection made by the decision maker; e.g., describing policies with the same outcome in positive terms (lives saved) vs. in negative terms (lives lost).  As a result, “individuals prefer the policy with the sure outcome when the outcomes are framed positively and prefer the policy with the probabilistic outcome when outcomes are framed negatively . . . decision makers are risk averse in the domain of gains and risk takers in the domain of losses . . . individuals tend to react in a systematically different manner to the same piece of information, depending on how it is presented to them.”

Example: A team of Italian scholars led by Paolo Belardinelli examined the use of outcome framing as an intervention technique. They found that describing something in terms of success rates vs. failure rates affects decisions—a more positive framing leads to more favorable decisions than a negative framing. In an experiment involving user satisfaction data for a sports facility, the same data elicited different responses depending on framing: survey respondents shown a negative framing gave lower ratings to the facility and its director than respondents who were presented the same data with a positive framing. 

The Use of Anchors. Anchoring is the tendency to rely too heavily on an initial estimate, which biases our final answer. Bellé observed that “different starting points yield different estimates, which are biased toward the initial values.” In other words, when decision makers are first exposed to a number—even an arbitrary one—they tend to make subsequent judgments in relation to that “anchor.” This can occur in areas as diverse as pricing, performance, or promotions.

Example: Doctors found they could increase patients’ willingness to receive monthly injections to manage their psoriasis if the patients were first asked whether they would be willing to receive daily injections. A test found that those who were first asked about daily injections agreed to monthly injections more readily than those who were initially told they should receive monthly injections.

There are many more types of insights that can be derived from behavioral science, which is why public administrators are beginning to invest in specialists who help their agencies and programs apply the principles of behavioral science in day-to-day work.