



January 12, 2004 | AS WE GO further into the systems biology century (or what some people curiously refer to as "new" biology), researchers face the task of generating "omic" data sets suitable for modeling entire systems. Unfortunately, in many cases the lack of good data sets to model stems directly from the limitations of available instrumentation.

"The old adage of 'garbage in, garbage out' in the systems biology era is more appropriate than ever!" 

STEPHEN NAYLOR, MIT, COMPUTATIONAL SYSTEMS BIOLOGY INITIATIVE 

"The prevailing wisdom is that the 'omic' instrument development waves that washed through in the 1990s have left us wallowing in a plethora of data," says Stephen Naylor, a faculty member in MIT's Computational Systems Biology Initiative and Boston University School of Medicine's Department of Genetics and Genomics. "Relatively speaking, this is true: 'Omic' data sets — whether genomic, transcriptomic, proteomic, or metabolomic — are certainly getting larger and more complex. However, if one is attempting to characterize the mind-numbing complexity of a system, our ability to acquire 'meaningful' data is quite limited."

In addition to good sample quality/history and well-implemented experimental design, a large, "meaningful" data set must be generated by an instrumentation platform that is "robust, reproducible, routine, specific, sensitive, selective, stoichiometric, and speedy," according to Naylor.

"Analysis of perturbation in a complex system requires an instrument platform that provides accurate and precise quantitative data across a broad (in terms of depth and breadth) coverage of analytes present in the complex mixture," he says. "This has to be carried out in a reproducible manner with relatively high throughput. To date, such platforms across the entire 'omics' and systems biology space do not exist. Don't confuse lots of data with high-quality, meaningful data — the old adage of 'garbage in, garbage out' in the systems biology era is more appropriate than ever!"

The workhorse of proteomic discovery is mass spectrometry (MS), often coupled with chromatography on the front end (see Fully Equipped, March 2003 Bio·IT World, page 94). Despite recent advances, MS-based proteome analysis still faces limitations in throughput, dynamic range, resolution, and quantitation. Among the newest technologies tackling these challenges are ion mobility spectrometry coupled with time-of-flight tandem MS (IMS-TOF-MS/MS), under development at Indiana University, and Fourier-transform ion cyclotron resonance MS (FT-ICR MS), commercial versions of which were launched last year by Thermo Electron and Bruker Daltonics (see Fully Equipped, May 2003 Bio·IT World, page 72).

The validation stage of systems biology involves repetitive analyses of a relatively small number (dozens) of components, but across thousands of samples. Most of the validation studies rely on expression, protein, and metabolite arrays. The latter two are still in their infancy, with challenges in quantitation and coupling chemistries. Protein arrays, which allow rapid quantitation of differences between samples, are likely to experience stable long-term growth.
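To give a feel for the scale of such validation runs, here is a minimal sketch in Python of the kind of repetitive quantitation an array-based validation study performs: a few dozen components measured across thousands of samples and compared against a reference profile. The analyte and sample counts, intensity model, and fold-change cutoff are illustrative assumptions, not any vendor's workflow or API.

```python
# Illustrative sketch only: toy fold-change quantitation of a few dozen
# analytes across thousands of samples, mimicking the scale of an
# array-based validation study. All numbers and names are hypothetical.
import math
import random

N_ANALYTES = 48      # "dozens" of validated components
N_SAMPLES = 2000     # thousands of samples per study

def simulate_intensity():
    """Return a fake array intensity reading (arbitrary units)."""
    return random.lognormvariate(mu=8.0, sigma=1.0)

# Reference (control) profile: one intensity per analyte.
reference = [simulate_intensity() for _ in range(N_ANALYTES)]

flagged = 0
for sample_id in range(N_SAMPLES):
    sample = [simulate_intensity() for _ in range(N_ANALYTES)]
    # Log2 fold change of each analyte relative to the reference profile.
    log2_fc = [math.log2(s / r) for s, r in zip(sample, reference)]
    # Flag samples in which any analyte shifts more than 4-fold (|log2 FC| > 2).
    if any(abs(fc) > 2.0 for fc in log2_fc):
        flagged += 1

print(f"{flagged} of {N_SAMPLES} samples show a >4-fold change in at least one analyte")
```

Even this toy version makes the point of the platform requirements above: the per-sample arithmetic is trivial, so the real burden falls on generating reproducible, quantitative measurements for every analyte in every one of those thousands of samples.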

High-content cellular imaging is rapidly expanding into the functional genomics and systems biology space, in part due to its ability to monitor the cell's dynamic state by fluorescently tagging proteins and tracking their intracellular activity over time. According to Judy P. Masucci, director of marketing at Cellomics, the high-content screening market is projected to grow by more than 30 percent in 2004. The highest growth is expected to be in academia, fueled by demand from newly formed systems biology departments.

"Current 'omic' instrumentation is inadequate for systems biology analyses," Naylor says. "We need new ways of thinking to carry out the analyses of complex biological systems."

Amersham's vice president of business development, Ger Brophy, agrees: "Researchers understand that the complexity of the task must be matched by the appropriate level of connectivity and data transparency in technology solutions. This understanding has significant implications for technology vendors — an intimate understanding of customer workflow is more important now than ever."



Julia Boguslavsky is conference director for Cambridge Healthtech Institute. Reach her at julia@thebiotechwriter.com. 









