Drop in Federal Employee Viewpoint Survey Participation Raises Concerns

OPM’s engagement metric doesn’t take into account agency response rates, potentially masking declines in morale.

When the Office of Personnel Management announced the results of the 2017 Federal Employee Viewpoint Survey on Oct. 12, officials lauded increases in the benchmarks that gauge employee engagement and satisfaction, but some fear a drop in participation could mask eroding morale in government.

Across government, the FEVS “global satisfaction index” rose 3 points and the employee engagement index climbed 2 points over their 2016 scores. But the response rate fell 0.3 percentage points to 45.5 percent, the lowest level of participation in at least a decade.

Mallory Barg Bulman, vice president for research and evaluation at the nonprofit Partnership for Public Service, said a number of factors could be behind the decline in employee participation in FEVS across the federal government.

“Between 2005 and 2010, [the response rate] was over 50 percent, and it hasn’t been above 50 percent since then, which was in large part because agencies and the administration were putting tremendous focus on response rates and not necessarily improved scores,” she said. “It’s unclear if [the recent decrease] is based off of survey fatigue. Or it could be—since the FEVS doesn’t use a census, they use a sample—that not all employees get it so the messaging is hard.”

Bulman said that, from a topline perspective, OPM could encourage greater participation by simplifying the survey, which ran to more than 80 questions in 2017, and removing questions that are repetitive or already captured in other government datasets.

“I would caution against cutting out too much of the survey, because it’s good to have longitudinal data, but there are questions that are really not that helpful, like the ability to telework and those kinds of things that [the government] already captures elsewhere,” she said. “And then some questions are duplicative, and OPM has data on which of those have more reliability. So you could remove some of those more duplicative questions and still have longitudinal data to track issues over time.”

Perhaps more concerning from an analytical standpoint is the relationship between engagement scores and response rates at individual agencies. OPM’s employee engagement index does not account for shifts in survey participation, and in 2017 several agencies where news organizations have reported internal consternation since the presidential transition posted minimal changes in their scores alongside stark drops in employee response rates.

At the State Department, where a career employee was criticized by name by then-White House chief strategist Steve Bannon in August and which was unsuccessfully targeted for steep cuts in President Trump’s fiscal 2018 budget proposal, satisfaction dropped 2 points and engagement decreased by 1 point in 2017. But the department’s response rate fell by 16.9 percentage points.

And at OPM, which administers the survey, satisfaction and engagement as measured by FEVS were flat, but participation dropped 9.3 percentage points. OPM and the State Department did not respond to requests for comment.
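
To see why flat scores paired with falling participation worry analysts, consider a minimal sketch of the kind of screen an outside observer might run. The State Department and OPM figures are the ones reported above; the thresholds, data structure and output are hypothetical choices made for illustration, not part of OPM’s methodology.

```python
# Illustrative sketch only: flags agencies whose FEVS scores held roughly flat
# while participation fell sharply. The State and OPM figures come from the
# article above; the 2-point and 5-point thresholds are hypothetical choices,
# not OPM methodology.

# Year-over-year changes, in points: satisfaction, engagement, response rate
changes = {
    "State Department": {"satisfaction": -2.0, "engagement": -1.0, "response_rate": -16.9},
    "OPM":              {"satisfaction":  0.0, "engagement":  0.0, "response_rate": -9.3},
}

FLAT_SCORE = 2.0   # score moved no more than 2 points in either direction
STEEP_DROP = -5.0  # response rate fell at least 5 percentage points

for agency, delta in changes.items():
    scores_flat = (abs(delta["satisfaction"]) <= FLAT_SCORE
                   and abs(delta["engagement"]) <= FLAT_SCORE)
    participation_fell = delta["response_rate"] <= STEEP_DROP
    if scores_flat and participation_fell:
        print(f"{agency}: scores roughly flat, but response rate fell "
              f"{abs(delta['response_rate'])} points -- look beyond the index")
```

By this rough screen, both agencies would be flagged for a closer look even though their headline indices barely moved.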

Bulman said the satisfaction and engagement indices can be helpful when participation holds steady, but once statistics like the response rate begin to fall off, observers should start looking at other metrics.

“I would always say the FEVS is a really important tool, and you should make sure there is always a mechanism for employees to have a voice and speak up about their experience, but it’s not a substitute for looking at behavior,” she said. “In cases like these, you have to look at the response rate, the percentage of people who have left in recent months, the number of bid protests and other metrics to really look at how employees behave.”

The Partnership uses FEVS data in its own analysis, which culminates in the annual Best Places to Work in the Federal Government rankings. Bulman said her organization is still waiting for OPM to provide the survey’s raw data, but that she and her colleagues are considering ways to better integrate those additional metrics into their analysis.

“Indices don’t answer all questions, but they aren’t meant to,” Bulman said. “They’re meant to start the conversation, and then you can ask what’s behind them.”