
FDA Works to Become More Transparent

By John Russell

March 29, 2011 | The Russell Transcript | How many genomic data sets were submitted to FDA under the VXDS (voluntary exploratory data submissions) program last year? Is the agency meeting its hiring goals? How many manufacturing plants were inspected each month? The answers to these and other questions about FDA activities are now readily available as part of the FDA-TRACK program, first announced in 2009 and launched in August 2010.

You may have seen FDA Commissioner Margaret Hamburg’s expansive comment in the August announcement, “FDA-TRACK will bring the operations of this historically opaque Agency into the daylight and help us be even more responsive.” Now, with the first quarterly TRACK assessment briefings completed (January) and a fair amount of data available in the TRACK system, it’s a good time to consider whether the program will fulfill Hamburg’s hope.

It’s often easy to lob criticism at FDA. In 2010, for example, FDA approved just 21 new drugs (15 new molecular entities (NMEs) and six new biologics (BLAs)). That’s slightly down from 2009 (25) and 2008 (24) and about the same as 2005 through 2007. The long-term trend, of course, has been downward at least since 1996, when 58 new drugs and biologics were approved. FDA will get much of the blame for last year’s slide, with far less discussion about the uneven quality of submissions and the proliferation of follow-on products.

So it’s interesting and praiseworthy that FDA should undertake this new exercise to pull back the curtain on many of its activities. TRACK—which stands for transparency, results, accountability, credibility, and knowledge sharing—monitors more than 100 FDA program offices using key performance measures. Data is gathered monthly, analyzed, and presented each quarter to FDA senior leadership. Importantly, the public and industry can easily access the data at the FDA-TRACK website.

It’s not that you couldn’t get much of this data before, but doing so was often cumbersome and time-consuming. FDA has created what it calls dashboards, which are basically tables indicating how many or how much of a given measure for a particular task or project has been accomplished. Projects, for example, may list a task and a milestone date, and then indicate if the task has been ‘completed’, is ‘on track’, ‘on hold’, or ‘delayed.’ Each dashboard also has a dictionary in which the measures are clearly defined.

Statistics on Demand

The program has rightfully drawn both admiration and worry. At a minimum, TRACK makes painfully clear how much work is heaped onto FDA’s plate. For example, FDA received 336 commercial INDs and 1,423 research INDs through September (the latest data in TRACK at this writing). On the other hand, nobody likes to look bad, and there’s always a chance of that. For example, the dashboard for the Center for Drug Evaluation and Research (CDER) Office of Translational Sciences indicates there were just five VXDS submissions in 2010 (through September), hardly a roaring sign of acceptance for the program.

Indeed, in discussing the VXDS results, a TRACK entry declares, “Over the past year, we have observed a stable number of new VXDS in line with the annual average of 5 we have had over the past 7 years, while the regulatory application of the types of biomarkers introduced through VXDS has increased 250% year-to-year. The current level of regulatory reviews with genomic data is a good indicator of the success of VXDS meetings.” There’s probably room for debate over that conclusion. Then again, the number of genomics INDs, NDAs, and BLAs for the same period was a solid 106.

Another disappointing stat, at least to me, turns up in the same dashboard: only seven End-of-Phase 2A (EOP2A)/VXDS/biomarker meetings were led by the Office of Clinical Pharmacology in the same period. These meetings generally use advanced informatics and predictive modeling and are intended to “improve drug development process and reduce attrition rate caused by poor dose selection and study design.” The low number of meetings doesn’t suggest growing industry trust in FDA-sponsor collaboration using advanced tools to make joint decisions.

One challenge in using TRACK is the lack of context and more granular information. In August, for example, FDA took action on 12 NDA/BLAs. Ten were within the scheduled period; two were not, and the “median number of days past the deadline” was 609. (I like the choice of median over average, though with just two data points the median is simply their mean.) Ten out of 12 on time is terrific, but we have no way of knowing whether the delayed actions were FDA-caused or sponsor-caused.
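For readers who share that puzzlement: the standard convention for an even number of observations is to average the two middle values, so the median of exactly two data points is just their mean. A quick sketch in Python, using made-up delay values (TRACK reports only the 609-day median, not the individual figures):

```python
import statistics

# Hypothetical days-past-deadline for the two late actions;
# illustrative values only, not FDA's actual data.
delays = [515, 703]

# With an even count, statistics.median() averages the two middle
# values, so for two points it returns their mean: (515 + 703) / 2.
print(statistics.median(delays))  # 609.0
```

Any two values averaging 609 would produce the same reported median, which is exactly why a single summary statistic over two points tells us so little.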

Clearly TRACK is a young program, driven in part by President Obama’s push for transparency in government. FDA seems committed to expanding and improving it as the lessons pour in. The biggest question is how useful TRACK will really prove. Poke around and see what you find. The agency clearly thinks TRACK is working and has even posted a section called Significant Accomplishments to Date.

This article also appeared in the March-April 2011 issue of Bio-IT World Magazine.
