Tracking Technology

Agencies need to get a handle on measuring IT effectiveness.

Every day hundreds of millions of dollars are spent on information technology in the federal government. To an outsider, these expenditures seem exorbitant. But, if you recognize the shift from a society of capital or tangible assets to an information-intensive one, and you factor in the relative newness of the information systems discipline, then this spending might be quite justifiable. For all we know, it might be too low.

The question is how to measure these investments.

Information systems are a tool to support the strategy of an enterprise. Without strategy, however, the success of any tool is relegated to luck. Recognizing this, Congress passed the Clinger-Cohen Act in 1996 to ensure that agencies develop clearly articulated visions, strategies and enterprise architectures, and perform appropriate due diligence before investing in information systems. Part of this process includes a proven technique for managing such investment: IT portfolio management. Further guidance was provided in the 2002 E-Government Act. And in 2004, the Government Accountability Office published the "Information Technology Investment Management Framework," providing even more explicit direction. The Office of Management and Budget requires IT investments to be vetted through a standardized business case, commonly referred to as Exhibit 300, which feeds the portfolio with standard investment-quality information.

But the jury is still out, and it might be for quite some time, on whether this has made a difference. It's impossible to determine the efficacy of a program, tool or technique without some form of measurement. Measurements also must be examined to ensure that their validity hasn't been distorted by an external factor, such as measuring benefits in real, inflation-adjusted dollars rather than nominal dollars to strip out the false influence of inflation. Prescient change agents identify such criteria before embarking on an effort. Yet no criteria were spelled out when portfolio management was mandated for government.
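The real-versus-nominal adjustment mentioned above is simple arithmetic. A minimal sketch, using hypothetical figures and an assumed flat annual inflation rate:

```python
def real_dollars(nominal, inflation_rate, years):
    """Convert a nominal (actual) dollar amount into real dollars
    by discounting cumulative inflation over the given number of years."""
    return nominal / ((1 + inflation_rate) ** years)

# Hypothetical example: $1.1M of benefits claimed after 5 years
# at 2% annual inflation is worth less in today's dollars, so
# comparing it to the original business case in nominal terms
# would overstate the program's payoff.
benefit_real = real_dollars(1_100_000, 0.02, 5)  # roughly $996,300
```

Without this kind of normalization, a benefit stream can appear to grow even when, in real terms, it is flat or shrinking.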

IT portfolio management, like all the laws, mandates and guidelines governing the investment in and use of IT in government, is in its infancy. There is no good quantitative data to prove its effectiveness. An IT portfolio must account for the entire life cycle of investments, and assets must be tracked against the business cases used to fund them. The process is simply too new to assess expected results against actual life cycle benefits.

Qualitative observations, however, cast doubt on the appropriateness of agencies' approach to IT portfolio management. The data and analytics are complex. The data, usually generated from IT processes that are not standardized, is suspect. While the current techniques probably are better than none at all, reported results appear inflated given the immaturity of the process. Fortunately, as in thermodynamics, systems tend toward equilibrium: the IT process will mature, and during this evolution new guidelines likely will emerge.

Some agencies make IT portfolio management too complicated, applying techniques and algorithms to questionable data to make important decisions. Even with the most elaborate process, bad or inconsistent data will generate bad or inconsistent results. It would be better to temper the quantitative with the qualitative. Agencies must prioritize objectives. All IT investments should be associated with those objectives in a consistently weighted manner. The approach might be imprecise, but it's better than the current practice at many agencies.
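The consistently weighted approach described above can be sketched in a few lines. This is a hypothetical illustration, not any agency's actual method; the objective names, weights and scores are invented for the example:

```python
# Hypothetical agency objectives with priority weights that sum to 1.0.
OBJECTIVES = {"citizen_service": 0.5, "cost_reduction": 0.3, "security": 0.2}

def portfolio_score(scores):
    """Weighted sum of an investment's per-objective scores (0-10 scale),
    so every investment is judged against the same prioritized objectives."""
    return sum(OBJECTIVES[obj] * s for obj, s in scores.items())

# Two illustrative investments scored against each objective.
investments = {
    "case_mgmt_modernization":  {"citizen_service": 8, "cost_reduction": 5, "security": 6},
    "data_center_consolidation": {"citizen_service": 3, "cost_reduction": 9, "security": 7},
}

# Rank the portfolio: highest weighted score first.
ranked = sorted(investments, key=lambda n: portfolio_score(investments[n]), reverse=True)
```

The point is not precision; it is that every investment is measured against the same prioritized objectives with the same weights, so the comparison is at least consistent even when the underlying scores are judgment calls.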
