
‘Cross-Omics’ and Systems Toxicology


By Kurt Zingler

Nov. 13, 2007 | Some of the latest and most intriguing advances in biomarker development are emerging in systems toxicology — the combination of traditional toxicology methods with new strategies and tools for integrating high-throughput transcriptomics, proteomics, and metabolomics data. The goal is to better understand and predict potential toxicities at an early stage of drug development, so that biopharmas can gain deeper insights into the biology underlying toxicity, and make “go/no-go” decisions well before committing to further development and clinical trials.

The great challenges of developing effective systems toxicology approaches are well understood: making productive use of the disparate resources available and winning acceptance for new technologies. While toxicologists have a mass of data that could help them better understand and predict toxicity, they are often saddled with incompatible data types, formats, databases, and analysis tools. They are surrounded by information that could move them forward, yet they lack the means to exploit it and derive informed decisions in compound profiling programs.

The emerging solution is a common system that captures classic toxicological analyses as well as more recent “omics” data, and then provides a framework to easily analyze, relate, compare, and share this information, from bench researchers all the way up to top-level decision-makers.
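To make the idea concrete, here is a minimal sketch (in Python) of what such a common capture layer might look like, assuming a hypothetical unified store; the record type, field names, and values are illustrative, not any vendor’s actual schema.

```python
# A minimal sketch, assuming a hypothetical unified store: one record type,
# keyed to study and subject, that can hold a classic toxicology endpoint
# (e.g., an ALT enzyme assay) next to an "omics" measurement. All names and
# values here are illustrative only.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class ToxObservation:
    study_id: str      # links back to the in-vivo study
    subject_id: str    # animal or sample identifier
    domain: str        # "clinical_chemistry", "histopathology", "transcriptomics", ...
    analyte: str       # e.g., "ALT", a probe-set ID, a metabolite name
    value: float
    units: str
    metadata: Dict[str, str] = field(default_factory=dict)

# A conventional endpoint and a transcriptomic readout, side by side:
observations = [
    ToxObservation("STUDY-001", "RAT-17", "clinical_chemistry", "ALT", 182.0, "U/L"),
    ToxObservation("STUDY-001", "RAT-17", "transcriptomics", "Hmox1", 4.2, "log2 fold-change"),
]

# Because every record shares the same keys, relating and comparing data
# across technologies reduces to simple grouping:
by_subject = {}
for obs in observations:
    by_subject.setdefault((obs.study_id, obs.subject_id), []).append(obs)
```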

In addressing the core technical challenge of this endeavor, namely the integration of toxicogenomic data with conventional toxicological endpoints, researchers face several technological and methodological limitations. Specifically, they frequently lack:

• Efficient workflows that capture, store, and analyze toxicological data, based on integrated systems;

• An ability to make decisions based on integrated information that captures the breadth of the available data;

• Access to solutions or reports that do not require end-users to be informatics experts.

The key question at this stage of systems toxicology research is, “What will it take to create a truly integrated approach?” Evidence from a growing number of collaborations between biopharma groups and bio-IT vendors shows that researchers require support in four main areas:

• Integration with major high-throughput technology platforms, including objective data quality assessment;

• Integration capabilities across varied data types and formats;

• New standardization and automation methods for processing and analysis (sketched below);

• Full-spectrum workflow support, from sample stage to final study and result reporting.
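As one hedged illustration of the quality-assessment and automation points above, the sketch below shows an automated quality gate that any incoming dataset would pass before entering the shared store; the check names and thresholds are assumptions for illustration, not validated acceptance criteria.

```python
# A hedged sketch of an automated quality gate: every incoming dataset,
# whatever the platform, passes objective checks before it enters the
# shared store. Check names and thresholds are illustrative assumptions.
from typing import Callable, Dict, List

QualityCheck = Callable[[Dict], bool]

def min_call_rate(threshold: float) -> QualityCheck:
    """Reject runs where too few probes were reliably detected."""
    return lambda run: run.get("call_rate", 0.0) >= threshold

def max_missing_endpoints(limit: int) -> QualityCheck:
    """Reject studies with too many unrecorded conventional endpoints."""
    return lambda run: run.get("missing_endpoints", 0) <= limit

def quality_gate(run: Dict, checks: List[QualityCheck]) -> bool:
    """Accept a run only if every objective check passes."""
    return all(check(run) for check in checks)

run = {"platform": "microarray", "call_rate": 0.92, "missing_endpoints": 1}
if quality_gate(run, [min_call_rate(0.85), max_missing_endpoints(3)]):
    print("accepted into shared store")  # the load step would run here
```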

Processing Tox Data
In a typical scenario, toxicologists draw on a wide range of data from conventional tox analyses (enzyme assays, histopathology, animal observations, etc.) and, more recently, “omics” data (transcriptomic, proteomic, and metabolomic).

While each of these data types may help characterize and predict certain types of toxicity, the data stores and analysis tools for each information source are fragmented. The problem is compounded when the information is disseminated across separate program groups and decision-making teams. While higher-level committees can view static summaries, they lack the analysis, visualization, and interpretation tools to make further comparisons and draw educated conclusions.

The required solution is a unified system where toxicologists and program committees can review and assess data across a wide range of technologies and models to facilitate knowledge-based compound promotion. An ideal approach provides a common framework and analysis system for the toxicologist and subsequent review committees. Each can look at the breadth of available data, drill down to specific details, and easily import additional data.
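As a concrete illustration of such a unified view, the sketch below joins a conventional clinical-chemistry endpoint with a transcriptomic readout on a shared animal identifier, giving a reviewer one table to assess and drill into; the data and column names are hypothetical.

```python
import pandas as pd

# Hypothetical flat files standing in for the fragmented stores described
# above; column names and values are invented for illustration.
chemistry = pd.DataFrame({
    "subject_id": ["RAT-01", "RAT-02"],
    "ALT_U_per_L": [54.0, 182.0],   # conventional clinical-chemistry endpoint
})
expression = pd.DataFrame({
    "subject_id": ["RAT-01", "RAT-02"],
    "Hmox1_log2fc": [0.1, 4.2],     # transcriptomic readout, same animals
})

# One merged view per animal lets a reviewer see the conventional endpoint
# next to the omics signal and drill down from there.
merged = chemistry.merge(expression, on="subject_id")
print(merged)
```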

As a sign of the importance of new technologies for toxicological assessment, a number of biopharma/bio-IT consortia are looking to improve the process of compound progression. Examples include the InnoMed PredTox Consortium in Europe, the U.S.-based Critical Path Initiative’s Predictive Safety Testing Consortium, and the Japanese Toxicogenomics Project. Collectively these consortia are examining a variety of systems toxicology approaches to address the field’s key challenges.

For example, InnoMed PredTox is examining a range of compounds that failed late in development, to determine whether newer technologies (or combinations of technologies) might have allowed an earlier, less costly decision to halt or redirect development.

The consortium has helped develop sophisticated databases (such as the InnoMed PredTox database) and data analysis systems, while at the same time validating these emerging technologies and analysis strategies and educating scientific experts and regulators.

The ultimate goal of the consortia is to provide researchers with the breadth of available data in a shared, easy-to-use platform so that organizations can better understand the biology underlying toxicity and make scientifically justified decisions about compound progression.

Kurt Zingler is head of U.S. Business at Genedata. He can be contacted at kurt.zingler@genedata.com.