Sponsor Content: This content is made possible by our sponsor. The editorial staff of Government Executive Media Group was not involved in its preparation.

The Power of Data Analytics to Boost Government Performance

Written by William Jackson, The Tech Writers Bureau, for Government Executive Media Group, Studio 2G

Agencies rely on large collections of data to execute their missions. Advanced analytics can derive insight from data so agencies can anticipate rather than react, improving efficiency and security. Partnering with data science professionals such as Dun & Bradstreet can help agencies move up the data value chain. 

Government agencies rely on data to execute their missions, gathering large volumes to enable daily operations. But there is value in this data beyond day-to-day activities, and agencies that unlock that value through advanced analytics can drive efficiency, improve security, and become more responsive to rapidly changing conditions. 

Analytics allows agencies to leverage data as a strategic asset, making sense of it to derive insight, make informed decisions and drive better outcomes.

“The amount of data that government agencies collect is growing,” said Bill Pastro, Dun & Bradstreet’s senior vice president for North American Government Solutions. And the need for data science professionals who can turn data into actionable information is also growing. They are critical to helping agencies understand not only what has happened but also what will happen and to more efficiently allocate limited resources. 

Using analytics to move up the data value chain requires both reliable data and the tools to derive actionable insights. 

Smart Data

The first step in taking advantage of the ever-increasing amounts of data that agencies are collecting is to manage it so that it can be used as an asset for solving mission critical problems. 

“The foundation is having high quality data,” Pastro said. “If you don’t have reliable, verifiable data to start with, you are going to be challenged in deriving information from it. This starts with the ability to vet the information you have.” 

This is done with smart data – analytics-infused data that provides the ability to vet and update large volumes of structured and unstructured inputs to uncover the truth and meaning from data. When the quality of the data is assured, it can be cross-referenced with other data sets to add context and meaning. The data can be combined with various derived attributes, empirical forecasts and binary indicators to give agencies confidence in the decisions that are based on this information.
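To make the vetting-and-enrichment step concrete, here is a minimal sketch in Python. The reference table, field names, and indicator are invented for illustration; they stand in for the kind of cross-referencing and derived attributes a smart-data pipeline produces.

```python
# Illustrative "smart data" enrichment: vet a raw record against a trusted
# reference set, then attach a binary indicator and hierarchy attributes.
# REFERENCE and all field names are hypothetical, for illustration only.

REFERENCE = {
    "123456789": {"legal_name": "Acme Corp", "parent": "Acme Holdings"},
}

def enrich(record):
    """Return a copy of the record, vetted and enriched with context."""
    ref = REFERENCE.get(record.get("id"))
    enriched = dict(record)
    enriched["verified"] = ref is not None           # binary indicator
    if ref:
        enriched["legal_name"] = ref["legal_name"]   # cross-referenced context
        enriched["parent_entity"] = ref["parent"]    # entity hierarchy attribute
    return enriched

print(enrich({"id": "123456789", "name": "ACME"}))
```

Records that fail the lookup are kept but marked unverified, so downstream consumers can decide how much confidence to place in them.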

"If you don't have reliable, verifiable data to start with, you are going to be challenged in deriving information from it. This starts with the ability to vet the information you have."

Case Study: General Services Administration

For 30 years, the federal government has been using a smart data approach to ensure data integrity and to deliver operational efficiency in the award management system for government contracts and grants. By deploying the D-U-N-S® Number as a requirement for entities conducting business with the U.S. government, the General Services Administration was able to transition from a paper-based global award management process to electronic commerce, ensure business verification at the point of entry, and enrich records with entity hierarchy data for increased transparency. 

This smart data approach enables a strong foundational, trusted master data system for issuing hundreds of billions of dollars in taxpayer-funded awards every year. 

Predictive Analytics

Predictive analysis adds value by leveraging data, statistical algorithms, and machine learning techniques to predict the likelihood of future outcomes. “It’s no secret that budgets are being squeezed, and that doing more with less is here to stay in the federal government. That’s why predictive analysis is so critical in this budget environment,” Pastro said. 

By identifying trends based on past behaviors, agencies can develop rule-based models, scorecards, and watch lists to make more efficient use of limited resources. “For instance, regulatory agencies can prioritize the workload by focusing time-intensive investigations on high-risk entities,” Pastro said. 
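As a toy illustration of the rule-based models, scorecards, and watch lists described above, the sketch below scores entities against a few hypothetical rules. The rule names, weights, threshold, and sample entities are all invented for the example:

```python
# Hypothetical rule-based risk scorecard: each rule inspects an entity
# record and contributes a weight when triggered; entities whose total
# score crosses a threshold land on a watch list for closer investigation.

RULES = [
    ("new_entity",        lambda e: e["years_active"] < 2,       25),
    ("prior_violations",  lambda e: e["violations"] > 0,         40),
    ("high_claim_volume", lambda e: e["claims_per_month"] > 100, 20),
]

def risk_score(entity):
    """Sum the weights of every rule the entity triggers."""
    return sum(weight for _, test, weight in RULES if test(entity))

def watch_list(entities, threshold=50):
    """Return (score, name) pairs at or above the threshold, riskiest first."""
    scored = [(risk_score(e), e["name"]) for e in entities]
    return sorted([(s, n) for s, n in scored if s >= threshold], reverse=True)

entities = [
    {"name": "Acme Corp",        "years_active": 1,  "violations": 2, "claims_per_month": 30},
    {"name": "Old Reliable LLC", "years_active": 12, "violations": 0, "claims_per_month": 10},
]
print(watch_list(entities))  # [(65, 'Acme Corp')]
```

The appeal of this form is transparency: an investigator can see exactly which rules fired, which matters when a flag has to be defended in an audit.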

Algorithms will not replace workers, Pastro said. “You’ll always need the value of human intelligence.” But predictive analysis helps agencies put that intelligence to better use, delivering better results and driving desired outcomes. “This makes the human intelligence so much more efficient.” 

One key benefit of predictive analysis is streamlining due diligence and oversight of regulated businesses. 

"Analytics offers agencies the ability to make sense of the ever-increasing amounts of data being collected."

Case Study: Centers for Medicare and Medicaid Services

The Centers for Medicare and Medicaid Services (CMS) Center for Program Integrity has done a masterful job preventing fraudulent payments by using predictive models that combine historical agency data with third-party data to screen medical providers. Providers and claims are checked against known indicators to flag suspect claims. 

The CMS predictive analysis systems improve over time, using machine learning to create more accurate predictive models as new data is added. The result is greater efficiency with fewer fraudulent payments of taxpayer money. 
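The self-improving behavior described above can be sketched with a simple online learner. The code below uses a bare-bones perceptron over made-up fraud indicators; it illustrates the general idea of a model that updates as labeled outcomes arrive, not the actual CMS system:

```python
# Minimal online-learning sketch: a perceptron whose weights are nudged
# toward each confirmed outcome, so the model sharpens as data accumulates.
# The indicator features and training history are invented for illustration.

class OnlineClaimModel:
    def __init__(self, n_features):
        self.w = [0.0] * n_features
        self.b = 0.0

    def predict(self, x):
        """Flag the claim (1) when the weighted indicators cross zero."""
        score = sum(wi * xi for wi, xi in zip(self.w, x)) + self.b
        return 1 if score > 0 else 0

    def update(self, x, label):
        """Perceptron rule: adjust weights only when the prediction was wrong."""
        error = label - self.predict(x)
        if error != 0:
            self.w = [wi + error * xi for wi, xi in zip(self.w, x)]
            self.b += error

model = OnlineClaimModel(n_features=2)
# Hypothetical indicators: [billed_impossible_hours, provider_on_exclusion_list]
history = [([1, 1], 1), ([0, 0], 0), ([1, 0], 1), ([0, 1], 1), ([0, 0], 0)]
for x, label in history * 3:   # accuracy improves as labeled data accumulates
    model.update(x, label)
print(model.predict([1, 1]))   # 1: both indicators present, claim flagged
```

Production systems use far richer models, but the loop is the same: predict, compare against the confirmed outcome, and adjust.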

Anticipating Scenarios

The public sector is at a turning point in how it manages resources to execute its missions and deliver taxpayer value. The next step in leveraging data analytics is anticipating events and conditions by asking hypothesis-driven questions. This paradigm shift in mission planning strengthens the government’s ability to protect citizens and national assets from known and unknown threats and to promote business innovation and economic growth. 

Hypothesis-driven analytics can be used to accelerate threat assessments and to anticipate future business and market scenarios. By feeding different hypotheses into customizable analytics models, agencies can more quickly vet the feasibility and potential impact of process and policy changes. Once again, human judgment is needed in developing hypotheses that allow agencies to anticipate scenarios, but this process also improves over time as machine learning assists human intelligence. 
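A minimal sketch of that hypothesis-driven vetting: each staffing or policy hypothesis is fed through the same simple capacity model so projected impacts can be compared side by side. The model form and every figure below are invented for illustration:

```python
# Hypothetical what-if analysis: run each hypothesis through one shared
# impact model and compare the results. All numbers are made up.

def projected_backlog(caseload, staff, cases_per_analyst):
    """Months needed to clear a caseload under a staffing/productivity hypothesis."""
    monthly_capacity = staff * cases_per_analyst
    return caseload / monthly_capacity

hypotheses = {
    "status quo":       {"staff": 40, "cases_per_analyst": 5},
    "hire 10 analysts": {"staff": 50, "cases_per_analyst": 5},
    "automate triage":  {"staff": 40, "cases_per_analyst": 8},
}

for name, h in hypotheses.items():
    months = projected_backlog(caseload=2000, **h)
    print(f"{name}: {months:.1f} months to clear backlog")
```

Even a model this crude forces each hypothesis to be stated in testable terms, which is the point: the analytics vet the assumptions before any policy change is made.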

Business Cases

With a comprehensive understanding of risk factors, enhanced with mission-based analytics, data scientists can help agencies detect and investigate complex criminal activity, improve emergency preparedness and response, and secure mission-critical supply chains against future economic disruption. 

Dun & Bradstreet applied these analytic capabilities to predict the local, regional, state, and national economic impact of the 2016 floods in Louisiana. With these insights, agencies and responders were able to understand the business impact of the floods – including the number of local businesses and workers affected and the level of difficulty each business would face in recovering. “With this information, the government was better able to size up the impact on the health of Louisiana’s small business community and anticipate repercussions in areas such as unemployment rate and real gross domestic product (GDP) growth,” said Pastro. In late 2017 and early 2018, Dun & Bradstreet applied similar analytical capabilities to evaluate the impact of Hurricanes Harvey, Irma, and Maria.

Analytics offers agencies the ability to make sense of the ever-increasing amounts of data being collected. The analytics provided by Dun & Bradstreet bring the data in its global commercial database to life, enabling agencies to derive actionable insight to drive better outcomes. Bringing world-class data to the sharpest minds in data science helps agencies to improve the business of government.