Key Standards

The barriers to reliable information sharing are numerous.

A crucial aspect of most quality management programs is standardization: an attempt to ensure that processes produce optimal results every time. One reason standardization is easier said than done is that data systems have yet to allow us to enter data once and then let everyone who needs that information access it on demand. Will technology eventually let us stop wasting our time searching for the information we need?

Major projects throughout government are seeking to produce more perfect information systems. The Army, for example, is developing the Future Combat Systems. FCS would create a way to capture key information about the battlefield and instantly make it available to soldiers on the ground, their front-line leaders and central commanders. Intelligence agencies have spent years working on systems that gather data from a variety of sources so analysts can identify potential national security threats. The Weather Service and the U.S. Geological Survey are constantly working on systems that capture information and disseminate it to the right people in order to predict natural disasters.

These efforts have run into numerous obstacles that have so far prevented them from developing perfect information systems: ones that capture all the right data and share it with all the right people at the right times.

For one, such efforts have tended to be expensive. FCS has been a target for budget hawks since its inception a decade ago as cost estimates have risen. Major systems take a long time to develop and run into hiccups along the way. It's hard to maintain funding support for the duration.

Developing such systems also can be politically controversial, as was the case with the Defense Department's Total Information Awareness program, one version of the intelligence agencies' data collection efforts. Lawmakers sought to kill the program because of concerns that it was developing into a Big Brother-type system that gathered too much personal data about Americans, violating their privacy.

Programs that try to connect people in different agencies with a common set of data face a huge set of challenges. Different agencies use different words to describe the same things, meaning they have to change the way they talk about key information in order to standardize data. The things they measure also can differ slightly; one might use inches and another centimeters, for example, requiring additional work to standardize information. Agencies also use different computer systems with their own idiosyncrasies, making standardization even more difficult.
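The vocabulary and unit mismatches described above can be illustrated with a small sketch. The field names, station identifiers, and units below are invented for illustration; real agency feeds are far messier, but the basic step is the same: map each source's keys and units onto one shared schema before anyone compares the numbers.

```python
# Hypothetical sketch: two agencies report the same measurement under
# different field names and in different units.

# "Agency A" reports rainfall in inches under the key "precip_in".
record_a = {"station": "A-101", "precip_in": 2.0}

# "Agency B" reports the same measurement in centimeters as "rainfall_cm".
record_b = {"station": "B-202", "rainfall_cm": 5.08}

INCHES_TO_CM = 2.54

def normalize(record):
    """Map agency-specific keys and units onto one shared schema (cm)."""
    out = {"station": record["station"]}
    if "precip_in" in record:
        out["rainfall_cm"] = record["precip_in"] * INCHES_TO_CM
    elif "rainfall_cm" in record:
        out["rainfall_cm"] = record["rainfall_cm"]
    return out

standardized = [normalize(r) for r in (record_a, record_b)]
print(standardized)
```

Even this toy version shows why the work adds up: every new source system means another mapping to write and maintain, and any field the mapping misses silently drops out of the shared view.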

One federal website, for example, tries to offer users a single place to look up information about government contractors and grantees. But the two primary systems from which the site draws information do not have standard methods for identifying contractors and grantees. That has forced site administrators to try to standardize the names of the contractors and grantees themselves. "There may exist some errors in these parent company assignments, so that transaction records that really apply to different contractors are grouped together," the site explains.
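The name-standardization problem the site describes can be sketched as follows. The company names and normalization rules here are invented for illustration; they show both why grouping by a cleaned-up name works and why, as the site warns, it can wrongly merge records for different contractors that happen to normalize to the same key.

```python
import re

def normalize_name(name):
    """Lowercase, strip punctuation and common corporate suffixes."""
    name = name.lower()
    name = re.sub(r"[.,]", "", name)
    name = re.sub(r"\b(inc|corp|corporation|llc|co)\b", "", name)
    return " ".join(name.split())

# Invented transaction records: the same contractor appears under two
# spellings in two source systems.
records = [
    {"vendor": "Acme Corp.", "amount": 100},
    {"vendor": "ACME Corporation", "amount": 250},
    {"vendor": "Bolt Industries LLC", "amount": 75},
]

# Group transaction totals by the normalized vendor name.
grouped = {}
for rec in records:
    key = normalize_name(rec["vendor"])
    grouped[key] = grouped.get(key, 0) + rec["amount"]

print(grouped)  # the two Acme spellings collapse into one entry
```

A handful of regex rules like these will collapse obvious variants, but they cannot tell two genuinely different companies with similar names apart, which is exactly the kind of parent-company assignment error the site acknowledges.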

Technology won't solve every problem that prevents information from being shared effectively. But if managers are committed to making better data available to people when they need it, their agencies will be able to at least reduce the errors caused by the lack of the right information.

Brian Friel covered management and human resources at Government Executive for six years and is now a National Journal staff correspondent.
