A Bio-IT World special section
Biochips for Gene Expression
Using DNA chips to discover potential toxicity in new drug compounds—a key application of toxicogenomics—can predict adverse effects before they occur, enabling safer clinical trials.
By Donna Mendrick
October 10, 2003
Toxicogenomics involves the study of alterations in gene expression following in vivo or in vitro exposure to a toxicant. Early review articles described the promise of toxicogenomics to predict potential human toxicity earlier and more effectively than traditional methods. In the last few years, numerous authors have presented single-case and multiple-compound experiences with drug and chemical compounds that illustrate how toxicogenomics can be used to address mechanistic issues of organ toxicity observed in animals.
Toxicogenomics offers the promise of predicting human hepatotoxicity in the presence or absence of animal responses (the latter cases involve the so-called human-specific hepatotoxicants). In addition, toxicogenomics can provide a means to identify, rank, and prioritize new molecular entities (NMEs) based on their potential toxicities in a higher-throughput manner than current methods. Using transcriptome-wide gene expression studies to assess a toxicant's impact across the whole genome (for example, the >26,000 genes and expressed sequence tags, or ESTs, on the Affymetrix Rat Genome GeneChip) provides far more information than, and can corroborate, the observations from the roughly 100 endpoints evaluated during preclinical toxicology studies.
The liver remains the primary organ of interest, as hepatotoxicity accounts for the majority of drug failures due to toxicity. Since rats continue to be the primary species used in preclinical safety testing, most of the work in toxicogenomics has focused on using livers of rats exposed in vivo to toxicants, or primary rat hepatocytes treated in vitro with test compounds.
Toxicity Assessment Revolution
A revolution in toxicity assessment and evaluation of human safety issues has emerged through examination of multifunctional gene sets and numerous biological pathways. When used in combination with traditional preclinical toxicology studies, toxicogenomics adds a molecular understanding, through integration of multiple pathways and gene-by-gene analysis, of how toxicity occurs.
Collectively, these capabilities make gene expression analysis a powerful technology that can direct investigative approaches to fully characterize a compound and accelerate biomarker discovery. Toxicity-related biomarkers discovered through toxicogenomics, whether gene, protein, or analyte, offer the promise of predicting adverse effects before they occur, thus enabling safer clinical development.
Developing a toxicogenomics program requires highly skilled scientists from multiple disciplines, ranging from toxicologists to biostatisticians, and careful selection of a core of reference toxicants. Toxicologists link the biology to gene expression changes, while biostatisticians determine whether a particular gene expression change is significant and a reliable indicator of toxicity. Achieving high confidence in gene expression analysis requires a large reference database, because factors as simple as whether an animal ate or drank just prior to sacrifice can alter the expression of many genes.
The construction of a large reference database, comprising many known toxicants and control compounds, multiple doses of those compounds, several time points post-exposure, and biological replicates for each condition, helps establish normal reference ranges across both biological and microarray processing parameters. Many toxicity-related genes exhibit small changes in expression level, such as by a factor of 1.5 to 3, requiring either very large numbers of replicates for each experiment or the use of a reference database for interpretation of gene dysregulation.
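To illustrate why small fold changes demand a reference database, the following minimal sketch (with invented gene names and expression values, not from any actual reference set) flags a gene as dysregulated only when its observed level falls outside that gene's normal reference range. The same 2-fold change is significant for a tightly regulated gene but unremarkable for a naturally noisy one:

```python
import statistics

# Hypothetical reference database: control expression values per gene,
# collected across many animals and microarray runs.
reference_db = {
    "gene_a": [100, 104, 98, 102, 97, 101, 103, 99],   # tight normal range
    "gene_b": [100, 160, 55, 140, 70, 120, 85, 150],   # highly variable gene
}

def is_dysregulated(gene, observed, z_threshold=3.0):
    """Flag a gene if the observed level lies outside its normal
    reference range, measured in standard deviations (z-score)."""
    ref = reference_db[gene]
    mean = statistics.mean(ref)
    sd = statistics.stdev(ref)
    return abs((observed - mean) / sd) >= z_threshold

# A 2-fold change stands out for the tightly regulated gene...
print(is_dysregulated("gene_a", 200))  # True
# ...but the same fold change is within normal variation for the noisy gene.
print(is_dysregulated("gene_b", 200))  # False
```

Without the accumulated control values, distinguishing these two cases would require running many replicates of every new experiment.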
Multiple doses of toxicants allow for the separation of pharmacological effects from toxic responses. Time-course data document gene expression changes linked to time-dependent toxic responses and offer the additional benefit of increasing the chances of observing a true toxic response compared with a single time point.
Model Approach to Prediction
Using statistical inference to develop predictive models allows the behavior of individual marker genes to be thoroughly examined and the natural variation of gene expression over thousands of samples to be documented. Predictive models built on these premises can provide early-development stage compound toxicity ranking using in vitro or in vivo testing, and often point to possible mechanistic-based analyses for toxicity.
A single microarray experiment yields expression information for tens of thousands of parameters (genes and ESTs), posing challenges for data analysis and interpretation. Chief among these challenges is understanding normal biological variation and the variation introduced by the microarray measurements themselves.
Biostatisticians address these challenges by creating normal reference ranges (control and treated) for each marker on the microarray. A large gene expression reference database (>1000 samples per target organ studied) establishes these ranges and allows for statistical validation methods to be employed along with subsequent biological validation as necessary for such an analysis.
Predictive toxicity models based on gene expression data require that marker genes be identified and then classified by the type of toxicity and/or the induced pathology. To accomplish this, general toxicity markers are identified that show gene expression changes of approximately the same degree irrespective of the specific type of toxic effect the compound may induce.
Pathology-specific markers are identified by selecting those that are regulated in the same manner as all pathology-specific (e.g., necrosis) causing agents. Using predictive markers that characterize a general toxic response, specific pathologies, and even specific compounds of known toxicity, enables researchers to effectively characterize a compound's toxicity.
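A sketch of how such marker panels could be used to score and characterize a compound (the panels, gene IDs, expression calls, and threshold below are hypothetical, invented purely for illustration):

```python
# Hypothetical marker panels: genes whose coordinated dysregulation
# characterizes a general toxic response vs. a specific pathology.
GENERAL_TOX_MARKERS = {"g1", "g2", "g3", "g4", "g5"}
NECROSIS_MARKERS = {"n1", "n2", "n3", "n4"}

def panel_score(dysregulated_genes, panel):
    """Fraction of a marker panel dysregulated by the test compound."""
    return len(dysregulated_genes & panel) / len(panel)

def classify_compound(dysregulated_genes, threshold=0.6):
    """Score the compound against each panel and call any panel
    whose score meets the (arbitrary, illustrative) threshold."""
    scores = {
        "general_toxicity": panel_score(dysregulated_genes, GENERAL_TOX_MARKERS),
        "necrosis": panel_score(dysregulated_genes, NECROSIS_MARKERS),
    }
    calls = [name for name, s in scores.items() if s >= threshold]
    return scores, calls

# A compound that dysregulates most general markers and all necrosis markers:
scores, calls = classify_compound({"g1", "g2", "g3", "g4", "n1", "n2", "n3", "n4"})
print(calls)  # ['general_toxicity', 'necrosis']
```

Real predictive models would of course rest on statistical validation against the reference database rather than a fixed cutoff, but the layering (general response, then specific pathology) follows the approach described above.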
Toxicogenomics has created a new dimension to investigate the underlying cause of toxicity. With over 300 metabolic, signal transduction, and other pathways identified (and more added regularly), investigations can now expand to multiple pathways simultaneously with a single experiment.
Public annotation of the rat and human genomes with functional categories, such as cellular components, molecular functions, and biological processes, has increased the level of understanding of complicated co-regulation within and among pathways and genes. Such information provides toxicologists with new tools to fully characterize and rank a compound.
Analysis of the molecular mechanism of toxicity can often point to pathways, and genes within those pathways, that are highly regulated upon exposure to a toxicant. Reference toxicants provide a baseline for comparing the induced pathologies and mechanisms of toxicity (MOT) of novel compounds.
By evaluating each biological sample for dysregulated pathways, toxicogenomic investigations can tease apart the key genes or gene families that are responsible for the development of toxicity, whether or not the animals exhibit classical signs of such response.
This is accomplished by comparing the patterns and ranges of gene expression found in a reference database to the compound under evaluation and intersecting these results with biological pathways. These analyses need to be carefully evaluated for biological context and often require experts, such as toxicologists, to review and interpret results.
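The pathway-intersection step described above can be sketched as ranking pathways by how many of their member genes a compound dysregulates (the pathway gene sets and gene symbols here are invented for illustration; real analyses would draw on curated annotations covering hundreds of pathways):

```python
# Hypothetical pathway gene sets.
pathways = {
    "fatty_acid_oxidation": {"acox1", "cpt1a", "ehhadh", "acaa1"},
    "oxidative_stress": {"hmox1", "gclc", "nqo1", "txnrd1", "gsta2"},
    "cell_cycle": {"ccnd1", "cdk4", "pcna"},
}

def rank_pathways(dysregulated_genes):
    """Rank pathways by the fraction of member genes dysregulated,
    returning (pathway, fraction, hit genes) tuples, best first."""
    ranked = []
    for name, members in pathways.items():
        hits = dysregulated_genes & members
        ranked.append((name, len(hits) / len(members), sorted(hits)))
    return sorted(ranked, key=lambda r: r[1], reverse=True)

# Genes flagged as outside their reference ranges for a test compound:
dysregulated = {"hmox1", "gclc", "nqo1", "acox1", "pcna"}
for name, frac, hits in rank_pathways(dysregulated):
    print(f"{name}: {frac:.2f} {hits}")
```

The top-ranked pathways then go to a toxicologist for the biological-context review the article describes; the ranking itself is only a starting point.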
In addition, toxicogenomics can allow investigators, in a single experiment, to structure searches against key functional gene families, such as the cytochrome P450 family. In combination with identified pathways, MOT analysis can render a possible explanation for toxicity well before accepted methods can, thus accelerating biomarker discovery and compressing the time required to fully characterize a compound.
Toxicogenomics may also lead to the identification of surrogate markers in blood whose dysregulation precedes organ injury, thus improving the safety of clinical trials. In contrast, classical markers of hepatotoxicity, such as alanine aminotransferase (ALT), require cell injury to occur before levels of the serum analyte are significantly altered. By monitoring the RNA in circulating white blood cells and/or exploiting the biology of dysregulated genes in solid organs, better blood-based predictors may become available in the near future.
Helping to achieve these advances are standards initiatives led by government and industry organizations to find methods for correctly interpreting results obtained from multiple genomic platforms. Two examples of these initiatives are the Toxicogenomics Research Consortium, managed by the National Institute of Environmental Health Sciences and involving six academic centers, and the Committee for the Application of Genomics in Mechanism-Based Risk Assessment, which has members from 30 corporations and government agencies and is managed by the International Life Sciences Institute and the Health and Environmental Sciences Institute.
Future Toxicogenomic Applications
As toxicogenomics becomes more widely adopted and applied at higher throughput, it will have an impact on the continuum of drug discovery and development.
Currently, cross-platform comparisons are difficult for multiple reasons, including the fact that probes may be oligonucleotide- or DNA-based, gene annotation is not uniform, and the hybridization kinetics employed for a microarray measuring thousands of genes aren't optimized for each gene, but instead utilize a standard time and condition.
Genomic tests are being employed in preclinical safety studies to rank lead compounds and to stratify patient populations. An example of the latter is the Netherlands Cancer Institute, which used a microarray screen to determine the appropriate treatment for women with breast cancer.
The use of genomics for patient stratification and diagnoses is prompting government agencies to work with industry to determine the best standards to employ, in the hope that such parameters can be exploited in all species. For example, a meeting sponsored by the National Institute of Standards & Technology, Affymetrix, and Agilent last March discussed the possible approaches that can be taken. (Presentations from this meeting are available online at www.cstl.nist.gov/biotech/UniversalRNAStds.)
Regulatory Restraint Promised
Technical issues, coupled with biological inter-animal and inter-species differences, suggest that the field of toxicogenomics is entering adolescence. Although it has been established that toxicogenomics can provide valuable information for evaluating and understanding mechanisms of drug toxicity, the technology has been adopted slowly owing to concern about how regulatory agencies might use such information.
To counteract the hesitation from drug makers, the FDA's Center for Drug Evaluation and Research is working on a safe harbor provision that should dramatically increase the use of genomics across the preclinical and clinical phases of drug development. The proposed provision would let drug companies voluntarily submit toxicogenomic data on their products. These data may have regulatory impact if a species-specific toxicity is demonstrated. The safe harbor proposal should be circulating for public comment by late this year.
Donna Mendrick is vice president of toxicogenomics at Gene Logic, in Gaithersburg, Md. She may be reached at firstname.lastname@example.org.