By Catherine Varmazis
July 14, 2008 | Merck and the H. Lee Moffitt Cancer Center & Research Institute (MCC) in Tampa, FL, have built a biospecimen repository and IT “pipeline” that facilitates the exchange of clinical and biomarker data and could advance the development of personalized drugs, earning the project a Translational and Personalized Medicine Best Practices Award (see Best Practices).
Launched last fall, the Biomarker Information Pipeline automates the flow and integration of patient data from MCC with gene expression and profiling data generated by scientists at Merck. The pipeline is part of Moffitt’s Total Cancer Care (TCC) program, a translational medicine approach that aims to improve cancer patient outcomes by encompassing genetic predisposition, lifestyle, and integrative medicine in patient care.
Cancer patients at member institutions of the Moffitt TCC consortium contribute tissue samples to the biorepository. According to Rick Garrison, COO at Moffitt, eight sites are contributing tissue and data to the program.
Specialists from many disciplines helped plan the Biomarker Information Pipeline. The underlying database had to be flexible enough to incorporate changes and additions after its launch. Initially, when queried about the kind of information they wanted the database to include, “people said: ‘everything,’” says Srivaths Srinivasan, director of TCC systems. “We built it so data we did not plan for today [could] still be collect[ed] and store[d]” as needs arise.
Ensuring patient privacy is paramount. “We abide by HIPAA to the nth degree,” says Shane Huntsman, manager of biorepository operations at Moffitt. As soon as a patient signs consent, a random number is generated for each sample, which is thus “de-identified.” Gary Mallow, director, biomarker programs at Merck Research Labs (MRL) IT, says, “ensuring privacy is the number one priority. Security for the data collected is tightly regulated and supported by a multi-layered security infrastructure.”
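The consent-triggered de-identification step can be illustrated with a minimal sketch. This is an assumption-laden illustration, not Moffitt's actual implementation: the function name, the use of a random hex token, and the idea of a separately secured linkage table are all hypothetical details consistent with the random-number scheme the article describes.

```python
import secrets

def deidentify(sample_id: str, linkage: dict) -> str:
    """Assign a random identifier to a newly consented sample.

    The linkage table mapping random IDs back to the original sample
    would live in a separately secured system; nothing in the random
    ID itself reveals the patient. (Illustrative sketch only.)
    """
    random_id = secrets.token_hex(8)  # 16 hex characters, e.g. '3f9a1c...'
    linkage[random_id] = sample_id    # stored only behind access controls
    return random_id

# Hypothetical usage
linkage_table = {}
anon_id = deidentify("patient-0042-tumor-A", linkage_table)
```

Downstream systems would see only `anon_id`, while re-identification (where permitted) requires access to the protected linkage table.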
Patient data are automatically loaded into the data warehouse without human intervention, and ad-hoc querying of the MCC-provided data is not allowed, says Mallow. To ensure that tissue samples are processed and collected consistently, Huntsman’s group is creating standard operating procedures.
Before installing a robotic sample storage system last November, “it took us 14 days to pull up 3200 samples” from the liquid nitrogen freezer, says Edward Seijo, manager of shared resources at Moffitt. “With the new system—a Thermo Scientific Biobank—we can find that many in seven to eight hours.”
Using new quality-control standards, tissue samples from consenting patients are collected in the operating room and sent to MCC for analysis and storage. Huntsman says each piece of stored tissue is graded on cellularity, necrosis, and gene profile “so we can mine and extract sample composition data and compare it to changes in profile or changes in genes.”
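A graded sample record of the kind Huntsman describes might look like the sketch below. The field names, percentage units, and QC thresholds are assumptions for illustration; the article names the grading criteria (cellularity, necrosis, gene profile) but not the data model.

```python
from dataclasses import dataclass, field

@dataclass
class TissueSample:
    """Illustrative record for a graded, de-identified tissue sample."""
    sample_id: str
    cellularity_pct: float           # assumed: tumor-cell fraction of the section
    necrosis_pct: float              # assumed: necrotic fraction of the section
    gene_profile: dict = field(default_factory=dict)

def passes_qc(s: TissueSample, min_cellularity: float = 60.0,
              max_necrosis: float = 20.0) -> bool:
    """Hypothetical quality gate; thresholds are not from the article."""
    return s.cellularity_pct >= min_cellularity and s.necrosis_pct <= max_necrosis

# Hypothetical usage
sample = TissueSample("3f9a1c00aa11bb22", cellularity_pct=75.0, necrosis_pct=10.0)
```

Storing composition grades alongside the gene profile is what makes the mining Huntsman mentions possible, since composition can then be compared directly against profile changes.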
Each night, the patient data in the MCC database are transmitted over the internet by MCC’s enterprise application integration (EAI) layer, BizTalk, to Merck’s EAI layer, TIBCO, and then loaded into Merck’s CDR for use within its data analysis units. The results of these gene expression analyses are then transmitted back to MCC, where they are added to the existing patient data. As patients undergo further treatment, new data are automatically added to the database, creating a valuable longitudinal history.
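The nightly round trip reduces to two operations: export records that changed since the last run, and merge the returned analysis results back into the existing records. The sketch below shows that logic in isolation; the record fields, function names, and in-memory store are assumptions, and the real exchange runs through BizTalk and TIBCO rather than plain Python.

```python
from datetime import datetime, timezone

def export_new_records(records: list, last_run: datetime) -> list:
    """Select de-identified records updated since the previous nightly run."""
    return [r for r in records if r["updated"] > last_run]

def merge_results(patient_db: dict, results: list) -> dict:
    """Fold returned gene-expression results back into existing records,
    keyed by the de-identified sample ID."""
    for res in results:
        patient_db.setdefault(res["sample_id"], {})["expression"] = res["expression"]
    return patient_db

# Hypothetical usage
last_run = datetime(2008, 7, 13, tzinfo=timezone.utc)
records = [
    {"sample_id": "a1", "updated": datetime(2008, 7, 14, tzinfo=timezone.utc)},
    {"sample_id": "b2", "updated": datetime(2008, 7, 12, tzinfo=timezone.utc)},
]
batch = export_new_records(records, last_run)  # only "a1" is newer than last_run
```

Keying everything on the de-identified sample ID is what lets results accumulate into the longitudinal history the article describes without ever exposing patient identity in transit.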
Both partners believe the work is setting a precedent for breaking down silos between pharma and clinical communities. “External collaborations are a key strategy for Merck,” says Jim Swanson, VP global services IT for MRL. “Working with great partners like the MCC helps Merck advance the science and meet its core mission: to develop novel treatments for disease.”
This article appeared in Bio-IT World Magazine.