Genome project to require Google-like computing power

An ambitious project with the goal of producing a more detailed understanding of the link between genetic variations and susceptibility to disease will require an unprecedented amount of computing power and terabytes of data storage, according to the leaders of the project.

The 1,000 Genomes Project, announced earlier this week by an international consortium that includes the National Human Genome Research Institute, part of the National Institutes of Health, plans to examine the human genome over a three-year period at a level of detail never before achieved.

The project "will greatly expand and further accelerate efforts to find more of the genetic factors involved in human health and disease," said Richard Durbin, deputy director of the Wellcome Trust Sanger Institute in Cambridge, England.

Francis Collins, director of the research institute, said the project will lead to a fivefold increase in the sensitivity of disease discovery efforts across the human genome.

Any two humans are more than 99 percent similar at the genetic level, but the fractional differences can help determine susceptibility to disease and how the body will respond to drugs. The goal of the project is to produce a catalog of variants that are present at 1 percent or greater frequency in the human population across most of the genome. That requires the project to sequence the genomes of at least 1,000 people.
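The link between the 1 percent frequency target and the 1,000-person sample size can be seen with simple binomial reasoning. The sketch below is illustrative only and is not the project's own calculation; it assumes each variant site is independently inherited and that each person contributes two chromosome copies:

```python
# Probability that a variant with population frequency f is seen at least
# once when sampling n diploid individuals (2n chromosome copies).
# Illustrative assumption: sites are sampled independently.
def prob_variant_observed(f: float, n_people: int) -> float:
    return 1.0 - (1.0 - f) ** (2 * n_people)

# A variant carried by 1 percent of chromosomes is all but guaranteed
# to appear somewhere in a sample of 1,000 genomes.
p = prob_variant_observed(0.01, 1000)
```

Under these assumptions, sampling 1,000 people makes the chance of missing a 1 percent variant vanishingly small, which is why rarer variants are the ones that demand larger samples.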

The project plans to sequence 8.2 billion DNA base pairs a day -- or the equivalent of more than two human genomes every 24 hours -- during its two-year production phase, for a total of 6 trillion DNA bases, said Gil McVean, co-chair of the analysis committee and professor of mathematical genetics at the University of Oxford.
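The quoted figures are internally consistent, as a back-of-the-envelope check shows. The genome size used below is an approximation (roughly 3.1 billion base pairs) and is not stated in the article:

```python
# Sanity check of the quoted throughput figures.
BASES_PER_DAY = 8.2e9   # quoted daily sequencing rate
HUMAN_GENOME = 3.1e9    # approximate base pairs per human genome (assumption)
TOTAL_BASES = 6e12      # quoted project total

genomes_per_day = BASES_PER_DAY / HUMAN_GENOME  # "more than two" genomes daily
production_days = TOTAL_BASES / BASES_PER_DAY   # roughly two years of sequencing
```

At 8.2 billion bases a day, the total of 6 trillion bases works out to a little over 730 days, matching the stated two-year production phase.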

Managing this massive amount of data will require novel computational methods. Gonçalo Abecasis, a professor of applied statistics and geneticist at the Center for Statistical Genetics at the University of Michigan, said the data produced by the genome project will be so immense that the only process of similar scope he can think of is Google's search engine, which handles billions of Web searches daily.

If the project had to start crunching all the sequence data today, Abecasis estimated it would take a supercomputer with 10,000 massively parallel processors. But, he said, the project is working to develop algorithms and mathematical and computational models that should reduce the computing requirements.

Because any two human genomes are mostly identical, Abecasis said he is working on models and algorithms designed to process only the fractional differences, much as video compression algorithms devote processing power to objects that move rather than to the static background.

The models are still being developed, but Abecasis said that while the project will still require supercomputers to manipulate the data, it will need far fewer than 10,000 processors.

The Beijing Genomics Institute in Shenzhen, China, is the other key research organization participating in the project and will perform sequencing along with the Wellcome Trust Sanger Institute and its large-scale sequencing network. That network includes the Broad Institute of MIT and Harvard, the Washington University Genome Sequencing Center at the Washington University School of Medicine in St. Louis, and the Human Genome Sequencing Center at the Baylor College of Medicine in Houston.
