Virginia Tech to tackle the 'Big Data' challenges of next-generation sequencing with HokieSpeed

Wu Feng, associate professor of computer science in the College of Engineering at Virginia Tech, will engage in Big Data research with promising advances for genomics. Credit: Virginia Tech

The National Science Foundation (NSF) and the National Institutes of Health (NIH) today announced nearly $15 million in new big data fundamental research projects. These awards aim to develop new tools and methods to extract and use knowledge from collections of large data sets to accelerate progress in science and engineering research.

Among the awards is a $2 million grant to Iowa State, Virginia Tech, and Stanford University to develop high-performance computing techniques on massively parallel heterogeneous computing resources for large-scale data analytics.

Such heterogeneous computing resources include the NSF Major Research Instrumentation (MRI) funded HokieSpeed supercomputing instrument with in-situ visualization. HokieSpeed was the highest-ranked commodity supercomputer in the U.S. on the Green500 when it debuted in November 2011.

Specifically, the three-university team intends to develop techniques that would enable researchers to leverage high-performance computing to analyze the data deluge of high-throughput DNA sequencing, also known as next-generation sequencing (NGS).

The research will be conducted in the context of grand-challenge problems in human genetics and metagenomics, the study of metagenomes: genetic material recovered directly from environmental samples.

Working together on this grant are Srinivas Aluru, a chaired professor of computer engineering at Iowa State University and principal investigator; Patrick S. Schnable, a chaired professor of agronomy, also at Iowa State; Oyekunle A. Olukotun, a professor of electrical engineering and computer science at Stanford University; and Wu Feng (http://www.cs.vt.edu/user/feng), who holds the Turner Fellowship and is an associate professor of computer science at Virginia Tech. Olukotun and Feng are co-principal investigators.

In previous research, Aluru has advanced the assembly of plant genomes, comparative genomics, deep-sequencing data analysis, and parallel bioinformatics methods and tools. Aluru and Schnable previously worked together on generating a reference genome for the complex corn genome, work that will help speed efforts to develop better crop varieties.

Feng's relevant prior work lies at the synergistic intersection of life sciences and high-performance computing, particularly in the context of big data. For example, in 2007, Feng and his colleagues created an ad-hoc environment called ParaMEDIC, short for Parallel Metadata Environment for Distributed I/O and Computing, to conduct a massive sequence search over a distributed ephemeral supercomputer that enabled bioinformaticists to "identify missing genes in genomes."

Feng said, "With apologies to the movie Willy Wonka and the Chocolate Factory, one can view ParaMEDIC as WonkaVision for Scientific Data: a way to intelligently teleport data using semantic-based cues."

