
Better Data Sharing Begins With Dispelling Staff Mistrust

Ensuring that employees understand data-sharing agreements and are comfortable with the terminology will build the trust they need to learn to use data effectively, experts say.

While the idea of sharing data may still be foreign to some agencies, properly training employees to use data can help build the necessary trust far better than any technology solution, leaders on a panel last week said.

Governments collect scads of raw data but struggle to properly understand it. Too often, data analysts must spend their time cleaning and sorting data before they can use it.

But the biggest obstacle to the better use of data is long-standing employee mistrust or reluctance—issues that are hard to shake, said Maryland Chief Data Officer Pat McLoughlin. Employees may be hesitant to use unfamiliar data to make decisions or may feel they do not understand what it is telling them.

“Part of the general challenge in the data space is fear,” he said during a panel discussion at the Amazon Web Services Summit last week in Washington, D.C.

For many agencies, the inability to properly understand the data they had collected meant data became a “sunk cost,” said Patrick McGarry, deputy vice president for federal sales at a data visualization company. “It was never a first-class citizen,” he said, and it was instead put in repositories or databases never to be seen again. In other instances, employees may also see their data as something to be closely guarded, he added.

If multiple departments want to access the same data, agency leaders typically write data-sharing agreements, a bureaucratic process that often creates delays. Thousands of those data-sharing agreements are in place throughout Maryland, McLoughlin said, but the lack of standardized language in those agreements means they are a “challenge to manage.” 

Too often, those onerous sharing agreements mean agencies feel forced to build their own datasets even if what they need may already be available elsewhere in the government. Duplicating the data creates additional issues, said Carlos Rivero, Virginia’s former chief data officer and now an executive government advisor at AWS. 

“When you have a lot of duplicative data across these different agencies, from an operational perspective, it makes sense, they control it, and need to move fast,” Rivero said. “But the reality is that your customers’ experience when they have to enter the same information over and over into different systems is just overbearing.” Plus, it “also creates duplicate data across the entire environment.”

To deal with those issues, Rivero and McLoughlin called for data officers to promote employee training and to develop a common glossary of terms used across government. “The idea is that you have consistent terminology that's being used across the board when we're talking about data,” McLoughlin said. Those efforts meet people where they are, the pair said, helping build skills and encouraging better data use.

McLoughlin said too often, agencies try to “boil the ocean” in their use of data by starting with projects that are too big or trying to solve enormous issues that have proved difficult to get under control. He said starting small is the way to go.

In Virginia, Rivero said efforts to leverage data started small but were quickly scaled up. His office began a pilot project to track opioid use in Winchester, which brought together information on overdoses. It tracked where people were getting their supply and detailed the drugs found in their system during toxicology reports, all in a bid to get the problem under control and identify where potentially deadly fentanyl was being found.

That project, after proving itself successful in bringing down opioid overdoses in that city, expanded statewide and pivoted to help local health departments determine where COVID-19 testing and vaccine sites should go. It used 34 criteria to identify communities’ needs and determine how residents should be formally invited and notified.

Virginia also used the Winchester pilot to help build out a statewide online portal during the pandemic that showed available services, created an account for residents and then tracked how long it took for agencies to reach out. Those initiatives all put what he called “actionable intelligence” into people’s hands and empowered them to make better decisions.

But Rivero said his efforts as the first chief data officer in Virginia history started small, with very few resources and no staff. First, he said he helped set up a data advisory committee with legislators, cabinet representatives and others to identify problems and map out how they could be solved with data.

Then, he went around the commonwealth to meet with local government leaders, hear about their challenges and discuss where to collaborate. “At first, I’m sure there were a lot of folks that thought I was crazy,” Rivero said.

The idea of better data sharing between government agencies is still in its infancy, panelists said, creating what McGarry described as a Wild West situation with little oversight and governance. But that fluidity could be a benefit to data-driven government employees, he said: with few precedents to follow, they have more opportunities to innovate and collaborate.