
Informatics Cornucopia

By John Russell

Sept. 18, 2006 | Predictive Informatics, the hopeful title of a session at last month’s Drug Discovery Technology & Development World Congress, remains an enticing but mostly elusive goal. The proliferation of informatics software, however, is entirely concrete, as speakers from National Cancer Institute (NCI), Novartis, and Virginia Bioinformatics Institute (VBI) demonstrated in talks about the tools they’ve developed, followed by a panel on systems biology’s prospects.

Indeed, the range of tools developed at NCI and presented by John Weinstein, head of the genomics and bioinformatics group in the Laboratory of Molecular Pharmacology, was astonishing, and they are all freely available through NCI. No wonder the commercial informatics world has struggled.

Even with its wealth of tools, Weinstein characterized the current environment as “Wild West days” for technology to analyze large data sets. William Egan, a computational chemist from Novartis, provided a glimpse into that company’s tools for predicting toxicology, and Darius Dziuda reviewed a suite of tools being developed at VBI. Clearly, these tools can help researchers make sense of unwieldy data sets, though they are often used singly and frequently defy easy integration with results from other tools.

In broad terms, systems biology attempts to integrate various omic data — frequently incorporating input from informatics tools — to produce a holistic view of how a given biological system works. But questioned by moderator Alan Louie, research director for Health Industry Insights, the systems biology panelists demonstrated there is no consensus on how far systems biology has progressed, which part of drug development it’s most likely to impact, or what the right approach is to creating these models of living systems.

“As I think of systems biology, there’s kind of a tension between this notion that (Leroy) Hood (founder of the Institute for Systems Biology) and others sold us in the early 2000s, that you could look at everything and put it all together,” said panelist Stan Letovsky, senior director of computational sciences, Millennium Pharmaceuticals, “versus the picture that pharma has of traditionally knowing a whole lot about your target and the molecule that’s interacting with it, and maybe a little bit about what it does to the system, because you’ve got to drive this thing down the pipeline as quickly as you can. It’s not clear how that tension is being resolved.”

“I don’t think we’re seeing a lot of value in the short term for large quantitative models. They are just too unreliable at the moment. So we’re probably going to have more luck with the more correlative data kicking out the occasional target by hypothesis-driven or data-mining candidate identification approaches,” he said.

Bruce Gomes, head of modeling at Pfizer’s systems biology group, saw things a little differently. He allowed, “Systems biology is useless unless it actually answers practical questions. I understand you said that large models are limited, and I think that is true. I think, however, the ability to answer very specific drug discovery questions, where you can get small amounts of information from very discrete experiments, will have an impact today. The ultimate goal, though, is to give context to targets, to tell us everything there is to know about them someday: safety, the best delivery, the dosing, the best targeting. That’s going to happen some day.”

Pfizer seems to be moving beyond the flirting stage with systems biology. About a year ago, it named a director of systems biology (David de Graaf) with whom Gomes works. After the panel, Gomes noted that internal demand is starting to outstrip his small group’s ability to respond, based largely on the promising results from two projects that have excited Pfizer researchers’ interest.

Undeniably, there is a new buzz around systems biology. Harvard Medical School, for example, now has a department of systems biology, and one of its professors, Walter Fontana, participated on the panel.

“I represent the academic interest in systems biology,” Fontana said, “[and] I would like to take issue with the notion that because we measure everything, we automatically know everything. That is far from being the case. So systems biology to me means two things: first is the biology of systems, and that’s a truism, going from individual proteins to systems behavior. But the second target of systems biology is a systems technology.”

Models should be collaborative learning devices, informed by both theory and experiment, argued Fontana. Too often, he stressed, we think of models developed by theorists as different from models developed by experimentalists. “We need to erase this distinction,” said Fontana. “When an experimentalist reasons about biological systems, he or she has a model in her mind. What we need to do is to write down this model in order to make the assumptions behind it explicit and enable that reasoning process to become assisted by machines. Biological knowledge is fragmented and open ended because new developments and technology give us new discovery. We need to have a system in place for coping with open-ended knowledge, and modeling is one way of coping with that.”

A few companies are trying to make a go of modeling, though virtually all of them use proprietary platforms. Panelist William Ladd, VP of discovery systems at Genstruct, noted that the company has had success generating hypotheses, but agreed there was much room for progress.

In the end, Letovsky offered a sober caution. Yes, he said, we have made huge progress in perturbing systems and generating rich data sets. There are tools to probe a wide variety of systems characteristics. But we’re still in the early days for many of these technologies.

“It’s also important to realize these things don’t work very well yet. RNAi, for example, is fraught with crosstalk issues that people are only beginning to realize,” cautioned Letovsky. “When we throw expression profiling at homogenized tissue, we’re suppressing potential biological effects arising from the tissue’s heterogeneity.”

“If stem cells are as important in cancer as people have started to think they are, then we may be looking under the lights instead of in the bushes where we’ve lost our keys when we homogenize our various cell lines or tissue samples and do expression profiling that characterizes the mass of cells instead of the rarer cells that we really should be looking at,” he said.

Put another way, the Millennium researcher said we’ve learned to put pieces on the chessboard, and learned to assemble many combinations, but we still haven’t learned to play chess.

Sidebar: Fast Forward Five Years
Asked what systems biology would look like in five years and what will constitute success, panelists offered the following:

Fontana (Harvard): I think the biggest thing in the future is going to be the integration of the modeling process in our reasoning. We need to move forward to a Biology 2.0 where modeling is supported by architecture, where it becomes collaborative, where models are there to be changed, where you could use 1,000 models a second because parameters are changing.

Letovsky (Millennium): What success looks like is that [systems biology] becomes just another tool in your Swiss army knife. What’s that poem about returning home after traveling? At the end of all this omics we’re back at the laboratory bench trying to figure out some piece of biology, and microarray and sequencing and systems biology and everything else is just another tool like a gel. So success really means disappearing into the woodwork and not being a buzzword anymore.

Gomes (Pfizer): It’s ultimately used as a management tool. We start hundreds of projects each year in the pharmaceutical industry, and a lot of times they are redundant. They have the same pathway, the same safety problems; all the same things keep coming up. I think what systems biology will do is say, “Don’t do that,” and help get the most out of your research dollars.

Ladd (Genstruct): I’d have to agree [with Fontana]. [It’s] the notion of a modeling framework, that every compound that’s released is assessed against the model, that every experiment you do is measured against [it]. I think what remains to be seen is which modeling frameworks are going to be successful and actually make a real-time impact on the organization. That’s still an unsolved question.  --J.R.
