Science agency seeks place at 'cutting edge' of data mining

The National Science Foundation funds research "right at the cutting edge of discovery," Director Rita Colwell said in a recent interview. So it is only fitting that the foundation announced on Friday that it is funding eight projects that go beyond the technologies currently being developed to mine large amounts of data.

The projects are being supplemented by $4 million over two years as part of the Management of Knowledge Intensive Dynamic Systems (MKIDS) program, which is part of NSF's charter to support science and engineering research related to national security.

"The systems envisioned by the MKIDS program go beyond even today's leading-edge data-mining systems, which attempt to monitor vast streams of data and pinpoint events of interest," the agency said in a release.

The projects are examining ways to use technology to help organizations make better decisions. An MKIDS system would give decision-makers tools to act on information mined from databases when allocating physical resources, technology services and human resources. It also would have controller functions to monitor the organization's response to those decisions and offer ways to fine-tune the process, NSF said.
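
NSF's release describes goals rather than a design, so the following is only a minimal Python sketch of that allocate-monitor-adjust loop; the Allocation class, the recommend and adjust functions, and every number in it are hypothetical illustrations, not anything specified by MKIDS.

```python
# Hypothetical sketch of an allocate-monitor-adjust loop; not an MKIDS design.
from dataclasses import dataclass


@dataclass
class Allocation:
    staff: int     # people assigned to the workload
    servers: int   # technology services backing them


def recommend(indicators: dict) -> Allocation:
    """Turn mined indicators (here just a backlog figure) into a resource allocation."""
    backlog = indicators.get("backlog", 0.0)
    return Allocation(staff=int(10 + backlog // 50), servers=int(4 + backlog // 100))


def adjust(current: Allocation, measured_response: float) -> Allocation:
    """Controller step: scale resources up when the measured response lags a target."""
    if measured_response < 0.8:  # made-up service-level target
        return Allocation(current.staff + 2, current.servers + 1)
    return current


# One pass through the loop with made-up numbers.
plan = recommend({"backlog": 220.0})
plan = adjust(plan, measured_response=0.72)
print(plan)  # Allocation(staff=16, servers=7)
```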

In one project being developed at Carnegie Mellon University, data from sources such as e-mail, phone calls and personnel databases will be fed into computational models. The models will infer an organization's structure and highlight likely "failure points."
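
The Carnegie Mellon models are not described in detail here, so the sketch below only illustrates the general idea, assuming communication logs have already been summarized into message counts: build a contact network and use the networkx library's betweenness centrality to flag people whose removal would most disrupt information flow. Every name and count is invented.

```python
# Illustrative only -- not the Carnegie Mellon project's actual model.
import networkx as nx

# Hypothetical message counts distilled from e-mail and phone logs.
records = [
    ("alice", "bob", 40),
    ("alice", "carol", 35),
    ("bob", "dave", 5),
    ("carol", "dave", 3),
    ("carol", "erin", 30),
]

G = nx.Graph()
for sender, receiver, count in records:
    G.add_edge(sender, receiver, weight=count)

# People sitting on many shortest communication paths are candidate
# "failure points": if they leave or are overloaded, information flow suffers.
centrality = nx.betweenness_centrality(G)
for person, score in sorted(centrality.items(), key=lambda kv: -kv[1])[:3]:
    print(f"{person}: betweenness {score:.2f}")
```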

"We want to develop computational tools to help managers design organizations the way engineers design bridges," said Ray Levitt, who is managing another project at Stanford University. "There is so little predictive ability for organizations in this area. It's all based on managers' experience and intuition."

NSF uses 95 percent of its roughly $5 billion annual budget to fund grants and contracts. It funds research at nearly 2,000 universities and institutions. It receives about 30,000 requests for funding every year and makes about 10,000 funding awards. It has long been involved in Internet-related issues, having brought the Internet to the nation's universities through the .edu domain.

NSF has put its focus in recent years on interdisciplinary research in new areas, Colwell said. "I would say right now, the interface between nano, bio, info and cognotechnology is where the exciting discoveries are occurring, and I would urge you to keep an eye on those developments in the future," she said.

NSF is the lead agency on two interagency initiatives, one on information technology and one on nanotechnology. The foundation is targeting new software development and moving toward providing access to high-end computing through cyberinfrastructure in the next few years, Colwell said. It is working with the Defense Advanced Research Projects Agency on developing technologies to move from data to "wisdom" by mining large databases, she added.

Colwell also offered a glimpse of what the future may hold thanks to nanotechnology. "Some of the bright information technology folks tell me that when we get to molecular computers," she said, "we will have computers a hundred-billion times faster than our current computers."
