Build Once, Use Often

May marked a White House deadline for government agencies to begin streaming data directly to outside developers and the public through application programming interfaces, or APIs. Essentially, APIs are sets of instructions that let one computer automatically and continuously pull information from another. 

Some agencies launched a dozen or more APIs in response to the mandate, which is part of President Obama’s open government initiative. 

At the Labor Department, lead information technology specialist Mike Pulsifer took a different tack. Labor published just one API for 175 information stockpiles, ranging from workforce statistics to historical trends for the Consumer Price Index.

Pulsifer has been assembling the API since 2011. The plan, he says, is to build for the long term. This strategy allows the agency to pack new data sets into the existing API rather than developing a separate infrastructure for each one. 
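The "build once, use often" design Pulsifer describes can be sketched as a single endpoint where each data set is just a path segment. This is a minimal illustration of the pattern, not the Labor Department's actual API; the base URL, data set names, and parameters below are hypothetical placeholders.

```python
# Hypothetical sketch of a single API serving many data sets.
# The base URL and data set names below are illustrative, not
# the Labor Department's real endpoints.
BASE_URL = "https://api.example.gov/v1"

def dataset_url(dataset: str, table: str, fmt: str = "json") -> str:
    """Build a request URL for any data set behind the one shared API.

    Adding a new data set requires no new infrastructure --
    only a new path segment under the same base URL.
    """
    return f"{BASE_URL}/{dataset}/{table}?format={fmt}"

# The same endpoint structure serves very different data sets:
print(dataset_url("ConsumerPriceIndex", "HistoricalTrends"))
print(dataset_url("WorkforceStatistics", "Employment"))
```

The payoff is that client code written against one data set works unchanged against the next one the agency adds, which is why new divisions could opt in without any new plumbing.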

Pulsifer borrowed the idea from the photo-sharing site Flickr, which consolidated all of its data into a single stream. No one else at Labor was using APIs, so building once and using often seemed like a good approach.

“We started really small with three data sets that were admittedly of limited usefulness,” he says. The plan worked, and other divisions began agreeing to let Pulsifer’s API grab their data. 

“We’ve got a tremendous amount of data that we’d love for app developers out there to turn into information,” Pulsifer says. “The stories that can be told from this data, that’s what we’re really hoping the public can produce out of this.”
