


By Brian Reid

November 15, 2002 | The long, winding path from clinic to FDA approval may soon get another overhaul as the push for better information technology standards gains proponents in both industry and government, IT experts from the agency and drug companies told attendees at a recent Data Standards and Clinical Development Conference hosted by the Drug Information Association (DIA) in Bethesda, Md.

The investment in standardized data forms – and the myriad applications that would follow – would pay dividends in both time saved and financial opportunity created, said James Klein, the vice president and research director of Gartner Health IT Research and Advisory Service, who gave the keynote address at the conference.

“How effective do we think these [standards] will be? We believe that through 2005, pharmaceutical companies that standardize … will experience a 35 percent drop in interface costs in the first year,” Klein said, adding that the effort could also shorten the time needed to complete clinical trials by as much as a year. “If you look at what [drug development] costs, the marginal profit can be as much as $1 million a day.”

The FDA has become more supportive of electronic data submissions in lieu of the truckloads of paper that were once standard practice, and companies have been increasingly willing to follow agency guidance that encourages files in Adobe Acrobat-readable PDF format. But as technology moves forward, experts said that a new paradigm for submissions to the FDA would be needed to easily manipulate the data in those filings.

The Clinical Data Interchange Standards Consortium (CDISC) has led that charge. The nonprofit group, which counts 85 member companies, is developing an XML-based standard that would allow raw data to be analyzed without painstakingly copying the information from paper or PDF documents into a data-analysis tool. Though the standard is still in its early stages, the group has offered companies a rough framework for easing the transition of data from the clinical laboratory all the way to the computer screens of the FDA.
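To make the idea concrete, here is a minimal sketch of what such a machine-readable record might look like and how a reviewer's software could pull values out of it directly rather than re-keying them from paper or PDF. The element and attribute names are illustrative assumptions, not the actual CDISC schema.

```python
# Minimal sketch only: element and attribute names are hypothetical, not the
# real CDISC schema. The point is that a shared XML structure lets any tool
# read the raw data directly, with no manual transcription.
import xml.etree.ElementTree as ET

SAMPLE = """
<StudyData studyID="TRIAL-001">
  <Subject subjectKey="SUBJ-042">
    <Dose drug="DrugX" amountMg="50" date="2002-06-13"/>
    <AdverseEvent term="Headache" severity="Mild" startDate="2002-06-14"/>
  </Subject>
</StudyData>
"""

root = ET.fromstring(SAMPLE)
# Because every sponsor would use the same structure, the same few lines could
# extract adverse events from any submission.
for subject in root.findall("Subject"):
    for event in subject.findall("AdverseEvent"):
        print(subject.get("subjectKey"), event.get("term"), event.get("severity"))
```

Only Python's standard library is used here; in practice a validating parser working against the published schema would enforce the structure.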

That effort is focused on four different areas: moving data from the systems of outside firms into a central database, which CDISC calls “operational data modeling”; moving laboratory data to central databases, known as “laboratory data modeling”; moving data to regulatory bodies in a standard form, or “submission domain standards”; and the creation of analytical tools for standardized data, which the group has dubbed “analysis dataset modeling.”

The challenge will be convincing an entire industry – and the army of clinicians and support organizations that underpin the pharmaceutical marketplace – to move to a new way of doing business. The CDISC leaders envision a day when data entered in a standard format by a single investigator at a hospital can be analyzed by a trial sponsor, then reshaped and re-analyzed without human intervention.

“We believe there’s a lot to be saved in terms of money,” said Wayne Kubick, the vice president of Lincoln Technologies Inc. and a member of the CDISC board, “but there’s a lot more to be saved in terms of time.”

FDA Pilot Earns Its Wings
The FDA has already experimented with manipulating standardized data. The agency is in the midst of evaluating a pilot program using a technology called the Patient Profile Viewer – a program that can take nuts-and-bolts data, such as dosing schedules, adverse events, and medication use, and turn them into easy-to-interpret graphical displays.
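The viewer itself is not described in detail here, but as a rough illustration of that kind of display, the sketch below plots a hypothetical subject's dosing and adverse-event records on a shared timeline; the data and field names are invented for the example, and this is not the FDA's actual tool.

```python
# Illustrative sketch only, with made-up data: a patient-profile-style timeline
# that puts dosing and adverse events on one axis so temporal relationships
# stand out. This is not the FDA's Patient Profile Viewer itself.
from datetime import date
import matplotlib.pyplot as plt

doses = [date(2002, 6, d) for d in (10, 11, 12, 13, 14)]   # daily dosing
adverse_events = {date(2002, 6, 13): "Headache",
                  date(2002, 6, 15): "Nausea"}

fig, ax = plt.subplots(figsize=(8, 2.5))
ax.plot(doses, [1] * len(doses), "o", label="Dose administered")
ax.plot(list(adverse_events), [2] * len(adverse_events), "rs", label="Adverse event")
for day, term in adverse_events.items():
    ax.annotate(term, (day, 2), textcoords="offset points", xytext=(0, 8))

ax.set_yticks([1, 2])
ax.set_yticklabels(["Dosing", "Adverse events"])
ax.set_ylim(0.5, 2.8)
ax.set_title("Hypothetical subject: dosing vs. adverse events")
ax.legend(loc="lower right")
plt.tight_layout()
plt.show()
```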

Though the viewer itself appears to be a powerful tool for FDA reviewers, helping to clarify relationships between events such as side effects and concomitant drug use, the speakers at the DIA conference said that the pilot’s most important role was in forcing the nine companies that submitted clinical information to standardize the format of that data.

“It’s really more about the use of standardized data than this particular tool,” said Jeremy Pool, the vice president of product development at PPD Informatics, which participated in the FDA pilot.

That effort was not without roadblocks. Each piece of data from the trials had to be placed in a specific preset category, which raised problems when the study results didn’t fit neatly within existing definitions. Fred Wood, the global data standards manager at Procter & Gamble Co.’s drug unit, lamented the problems with interrupted doses, an important variable that the standard data collection did not initially allow for. And the project – the first large-scale effort to get study data to the FDA in a standard form – was time-consuming, though participants said they expected the process to move more quickly in the future.
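The dosing problem Wood described is easiest to see with a toy example: if the standard's controlled term list for a dosing-status field does not anticipate a value such as "interrupted," otherwise valid data fails the check until the list itself is revised. The field and terms below are assumptions for illustration, not actual CDISC terminology.

```python
# Hypothetical controlled-terminology check. The allowed values are invented;
# the point is that data falling outside a preset category is rejected until
# the standard itself is extended.
ALLOWED_DOSING_STATUS = {"COMPLETED", "ONGOING", "DISCONTINUED"}

def validate_dosing_status(value: str) -> None:
    if value.upper() not in ALLOWED_DOSING_STATUS:
        raise ValueError(f"'{value}' is not in the controlled term list "
                         f"{sorted(ALLOWED_DOSING_STATUS)}")

validate_dosing_status("Completed")    # passes silently
validate_dosing_status("Interrupted")  # raises ValueError: term not yet defined
```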

Regardless, the FDA’s effort to move to new electronic data standards is only a small piece of the puzzle, Klein said. Ensuring that the same set of standards is used not only for transmitting data from drug companies to the FDA but also at every other step in the drug development process could be a larger challenge. Some companies remain committed to their proprietary data-collection systems, and tying together disparate contract research organizations, hospitals, and other players will be a huge undertaking.

The final result, however, could be felt beyond the reaches of industry drug development. The FDA wants to put standardized data to work in a number of areas, enabling it, for example, to track demographic trends of a broad cross-section of clinical trials without having to paw through hundreds of data sets from disparate filings.
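As a hypothetical sketch of the kind of cross-trial question this would make cheap to answer, the example below pools demographic tables from several submissions that are assumed to share one standardized column layout; the file and column names are invented for illustration.

```python
# Hypothetical sketch: once every submission shares the same column names,
# demographic summaries across many trials reduce to one aggregation.
# File names and columns here are illustrative assumptions.
import pandas as pd

trial_files = ["trial_001_demo.csv", "trial_002_demo.csv", "trial_003_demo.csv"]

# Each file is assumed to carry the same standardized columns:
# subject_id, age, sex, race
demographics = pd.concat((pd.read_csv(f) for f in trial_files), ignore_index=True)

summary = demographics.groupby(["sex", "race"]).agg(
    subjects=("subject_id", "count"),
    median_age=("age", "median"),
)
print(summary)
```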

“These standards will not just help us with reviews, but with all of the other things we do in the interest of public health,” said Randy Levin, the associate director for information management at the FDA’s drug branch.
