Panelist notes politics of putting agency information online

Patrice McDermott, executive director of OpenTheGovernment.org, chose to participate in a Tuesday workshop sponsored by the World Wide Web Consortium and the Web Science Research Initiative because she wants to convince techies that the government's underutilization of the Internet has a lot to do with politics.

The workshop, held this week at the National Academy of Sciences, brought together government officials, computer scientists, academics, Web standards leaders and government vendors. W3C, an Internet standards group, organized the event to facilitate the deployment of Web standards across government Web sites; help shape research agendas; and guide officials in crafting Web policy that increases access to government information.

After speaking at the event, which was closed to the press, McDermott told Technology Daily: "What the people in there -- mostly technology people -- don't understand is that it's not just a resource decision, it's a political decision to expose that information. It's really more the politics than the policy."

She added that there is nothing in the policy to prevent the Bush administration from "exposing" public records on the Internet, yet government agencies are Web-averse. "What we get is the information that the government wants us to know about," so "we don't know what we don't know," McDermott said.

While techies want the government to venture into the "semantic Web" -- an evolving, more intelligent Internet that can deeply analyze content -- McDermott said, "we'd like to get to Web 1.0" first.

She said that at the workshop, attendees told her that government agencies just need to make their databases available on the Internet, and others in the online community will reformat the contents so the information is compatible with new technologies. "Others will create the [topical] tags" that allow the content to be integrated into advanced Web technologies, they said.

McDermott's reply: The policy is already there to do that. "It's been there for years. It's just not being enforced. It takes leadership from the White House."

She pointed to the Environmental Protection Agency as an example. "Our experience with the Web is Clinton and Bush. During the Clinton administration, people could call people in the office of pollution control ... and say, 'What else do you have that is not online?'" Agency employees were willing to provide whatever hard data they had in the building.

Today, she said, agencies are not always aware themselves of the information they have collected.

Brand Niemann, an EPA official and co-chair of the CIO Council's Semantic Interoperability Community of Practice, who was in the room with McDermott, told Technology Daily, "I offered to have her give me a list of things that she needed and I would make sure she got them."

He added that McDermott's criticisms were not consistent with his experiences at EPA.
