
Only Moore's Law Can Save Big Pharma


GUEST COMMENTARY

By Bill Frezza

June 29, 2009 | If ever there was an industry at risk of being sunk by not one but three hurricanes, it's the pharmaceutical industry. Whether it's on the political, economic, or scientific front, this major contributor to our nation's financial and physical well-being is headed for wrenching transformations.

Politically, Big Pharma is at the mercy of all three branches of an increasingly hostile government. The executive branch, through its regulatory agencies, has raised the cost of product development to astronomical heights. The judicial branch, through its class action machinery, has made the penalty for delivering anything short of zero defects untenable. And the legislative branch, on its way to becoming the industry's monopsony purchasing agent, is hell-bent on driving prices down to the marginal cost of production.

Economically, Big Pharma continues to deliver less and less for more and more. A new blockbuster cancer drug is almost never a cure. The "good" ones have no effect on most patients besides making their hair fall out while helping some "fortunate" subset die in 15 months instead of 12. For some advanced biologics, this pathetic result comes with a sticker price of $100,000. The only reason there are any customers at all for products this bad is that someone else is paying the bills.

Scientifically, the classic drug discovery paradigm has reached the end of its long road. Penicillin, stumbled on by accident, was a bona fide magic bullet. The industry has since been organized to conduct programs of discovery, not design. The most that can be said for modern pharmaceutical research, with its hundreds of thousands of candidate molecules being shoveled through high-throughput screening, is that it is an organized accident. This approach is perhaps best characterized by the Chief Scientific Officer of a prominent biotech company who recently said, "Drug discovery is all about passion and faith. It has nothing to do with analytics."

Does this sound like science to you?

The problem with faith-based drug discovery is that the low-hanging fruit has already been plucked, driving would-be discoverers further afield. Searching for the next miracle drug in some witch doctor's jungle brew is not science. It's desperation.

The only way to escape this downward spiral is new science. Fortunately, the fuzzy outlines of a revolution are just emerging. For lack of a better word, call it Digital Chemistry.

Drug companies of the future will be built around drug design, not discovery. Scientists cross-trained in engineering will run product development teams with productivity levels comparable to other industries. Compare this to today's chemist, who can spend an entire career at a pharmaceutical company without ever working on a drug that gets to market. This is not just scientifically embarrassing, it's economically indefensible.

Tomorrow's drug companies will build rationally engineered multi-component molecular machines, not small molecule drugs isolated from tree bark or bread mold. These molecular machines will be assembled from discrete interchangeable modules designed using hierarchical simulation tools that resemble the tool chains used to build complex integrated circuits from simple nanoscale components. Guess-and-check wet chemistry can't scale. Hit or miss discovery lacks cross-product synergy. Digital Chemistry will change that.

But modeling protein-protein interaction is computationally intractable, you say? True. But the kinetic behavior of the component molecules that will one day constitute the expanding design library for Digital Chemistry will be synthetically constrained. This will allow engineers to deliver ever more complex functional behavior as the drugs and the tools used to design them co-evolve.

How will drugs of the future function? Intracellular microtherapeutic action will be triggered if and only if precisely targeted DNA or RNA pathologies are detected within individual sick cells. Normal cells will be unaffected. Corrective action shutting down only malfunctioning cells will have the potential of delivering 99% cure rates. Some therapies will be broad based and others will be personalized, programmed using DNA from the patient's own tumor that has been extracted, sequenced, and used to configure "target codes" that can be custom loaded into the detection module of these molecular machines.
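The detect-then-act logic described above can be illustrated with a toy model. Everything in this sketch is hypothetical (the class name, the "target code" strings, and the inspect interface are illustrations, not any real therapeutic platform); it exists only to show the if-and-only-if triggering the paragraph envisions:

```python
# Purely conceptual sketch. MolecularMachine, "target codes", and the
# inspect() interface are hypothetical illustrations of the detect-then-act
# idea in the article, not any real therapeutic technology.

class MolecularMachine:
    """Toy model of a detect-then-act therapeutic module."""

    def __init__(self, target_codes):
        # "Target codes" stand in for pathology-specific DNA/RNA signatures,
        # e.g. configured from a patient's own sequenced tumor DNA.
        self.target_codes = set(target_codes)

    def inspect(self, cell_transcripts):
        # Corrective action fires if and only if a targeted pathology
        # signature is present; normal cells are left untouched.
        if self.target_codes & set(cell_transcripts):
            return "shutdown"
        return "no-op"


machine = MolecularMachine(["ONC-MUT-42"])        # hypothetical signature
print(machine.inspect(["ONC-MUT-42", "HK-1"]))    # sick cell -> shutdown
print(machine.inspect(["HK-1", "HK-2"]))          # normal cell -> no-op
```

The point of the sketch is only the guard clause: action is conditional on detection, so normal cells fall through to the no-op branch.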

When it arrives, the transition to Digital Chemistry will be similar to the revolution set in motion when engineers began using transistors as switches instead of amplifiers. Over the succeeding 40 years, the semiconductor industry used the simplest of components to design increasingly sophisticated integrated circuits whose complexity now rivals that of many of the metabolic disease pathways we hope to control.
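The compounding behind that 40-year run is worth making explicit. As a back-of-envelope sketch (the two-year doubling period is the conventional Moore's-law figure, an assumption rather than a number from the article), forty years of doublings multiply capacity roughly a million-fold:

```python
# Back-of-envelope Moore's-law compounding over the 40-year span above.
# The ~2-year doubling period is an assumed, conventional figure.
DOUBLING_PERIOD_YEARS = 2
SPAN_YEARS = 40

doublings = SPAN_YEARS // DOUBLING_PERIOD_YEARS   # 20 doublings
factor = 2 ** doublings                           # exponential growth
print(f"{doublings} doublings -> ~{factor:,}x capacity")
# -> 20 doublings -> ~1,048,576x capacity
```

That million-fold factor is what makes the analogy tempting: no incremental improvement to wet-chemistry throughput compounds at anything like this rate.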

Only Moore's Law can save Big Pharma. We better hope it arrives soon.


Bill Frezza is a partner at Adams Capital Management. He can be reached at waf@acm.com.

10 Comments

  • Avatar

    The problem of falling productivity in the drug development pipeline is widely accepted, but the diagnosis of the cause and the recommended therapy are not so clear. Bill Frezza's idea that Moore's Law may come to the rescue is not a complete solution. Even if the cost of computing power does fall to close to zero, where is the experimental data going to come from that will underpin the simulation models? We need a new paradigm for in-vitro testing. If HTS delivers as much misinformation as real information, then how do we get more physiologically meaningful data? J Malcolm Wilkinson, Kirkstall Ltd UK

  • Avatar

    Dear Mr. Frezza,

    Excellent comment!

    But Big Pharma cannot be saved by Moore's Law, which is reaching its end soon.
    The von Neumann syndrome enforces a paradigm shift.
    Performance growth will come from non-von-Neumann accelerators like FPGAs.

    Best regards,
    Reiner Hartenstein
    http://hartenstein.de

  • Avatar

    Interesting but a bit one-sided. "Witch-doctors" in fact know quite a lot about local flora and their pharmacological properties, as we know from ethnographic taxonomies that routinely distinguish more species of local plant than Ph.D. botanists. Why wouldn't we make use of their 10,000-year-old knowledge? As for rational drug design, there's a whole lotta faith in that highly-touted approach, given that we cannot yet predict pharmacological actions based on 1-D protein structure. The bigger problems with Big Pharma are outright lying (Vioxx, anyone?), and the basically fraudulent way that new drugs are "proven" in experiments intentionally designed small to show that they are as good as existing drugs--not better, mind you, but "no difference." Finally, there is the fact that Big Pharma spends more on advertising than on R&D. Is there any room for surprise here?

  • Avatar

    I think you had better read Derek Lowe's analysis of your article over at "In the Pipeline" (http://pipeline.corante.com/archives/2009/07/02/jargon_will_save_us_all.php). Your article does rather ... gloss over some fairly basic (and significant) issues. Producing completely novel targeted compounds is hard for a number of reasons, none of which are going to go away in the short term: chemistry is applied condensed matter physics, which in itself is somewhat computationally challenging to model ... The behaviour of nanoscale materials compared to their bulk properties is not well understood ... It is clear that quantum effects have more of an impact on chemistry than has previously been believed (for example, it was discovered in 2001 that the lowest energy configuration of ethane owes more to quantum mechanics than steric hindrance, as has been taught to generations of chemistry students) ... Ab initio prediction of chemical properties is hard ... Determining 3D protein structures is very hard ... Predicting the tertiary structure of large (and therefore biologically relevant) proteins is impossible today ... And accurately computing modes of binding for small molecules to proteins (let alone protein-protein interactions) is best described as an art (yes, even faith) because there doesn't yet exist a modelling tool which works 100% of the time. And yet, people stake their careers (and their R&D budgets) on the predictions made by these imperfect modelling tools. In your words, and with tongue slightly in cheek, this is not just scientifically embarrassing, it's economically indefensible. This is the foundation that "Digital Chemistry" must sit upon if it is to work, and it must be a secure and enduring foundation. These computational and data management issues are occupying some of the finest scientific minds of our generation. Finally, the industry as a whole knows it's beset by problems, but the solutions to those pr

  • Avatar

    A non-expert commentary:
    Clearly, the public sector will play the central role in the future health care industry and it would profit everyone to accept that fact. And just as clearly, costs will have to be contained. Some cost cutting will result from management of competition, since competition between medical service providers has driven costs upward in some cases. Likewise, the pricing of pharmaceuticals in this country will probably be regulated (i.e. drastically reduced) through some public-sector health insurance plan or other. For pharma to prosper, its representatives and those of the public sector will have to negotiate a new form of cooperation. As Frezza points out, regulation and litigation are significant components of cost-driving forces. Ideally, a component of this new public-private partnership will include means of assuring efficacy and safety of candidate products while reducing the cost of clinical trials. Might this take the form of a public institution responsible for clinical trial testing? I can envision two benefits that would accrue to the industry from this system: it would offload the unnecessarily duplicated clinical testing process and its attendant costs to the public sector (Hopefully, this will result in a reduction of total testing costs, so that the taxpayer will not be onerously burdened.) Therewith, it will also offload the accountability for testing onto the public sector. Hopefully, public doubt about the integrity of the overall approval process, which arises from private control and secrecy, should lessen, with attendant reductions in legal costs. No doubt, there will be kinks to be ironed out, and as an outsider, I can't envision all of these. One issue would be assuring trade secrecy within a centralized testing system. I'm sure there will be many more.
    As for the science, this is an area that is already highly subsidized through taxpayer support of education and basic research. It seems to me that r

  • Avatar

    Mr. Frezza’s analysis of the problems confronting the Pharma industry is a curious combination of insight and ignorance. While there is little doubt that there are both internal and external forces responsible for the declining productivity of the industry – some of which are described in his article – “digital chemistry” is hardly a solution to any of them. First of all, few companies search for new medicines in “jungle brew,” or even natural products in general anymore. They are largely already rationally designed against targets chosen on the basis of the best available biology. High-throughput screening of compound libraries is simply a way to jump-start the search for molecules that have at least some of the appropriate physicochemical and pharmacologic properties. Structure-based drug design and computer-aided medicinal chemistry are in widespread use throughout the industry. The era of “digital chemistry” is already here – and has been for some time. It cannot save the industry because the problem is not in the design of drugs – it is in the level of biological knowledge about the targets. Drugs rarely fail because they don’t hit their targets – they fail either because hitting a target didn’t have the anticipated effect on the intended disease, or because they had unanticipated side-effects (toxicities), largely because the full biological role of the target was unappreciated. This is not a problem that will be solved by chemistry (digital or otherwise) but by biological knowledge and insight. As to the political and economic problems which Mr. Frezza correctly identifies, I’m afraid these will require political and economic solutions – not scientific ones. By the way, as a cancer drug developer, I am always bemused by the complaints about the relatively short mean survival benefit conferred by most cancer drugs – complaints echoed again here by Mr. Frezza. What is less widely appreciated is the fact that cancer is largely a disease of the elderly.

  • Avatar

    This is an interesting proposal to save Big Pharma. I would add that they also need Digital Biology as a way to quantitatively model the interactions between complex biological networks in which their targets are active. Very often target validation is limited to target selection, because the human mind - unlike computer models - is incapable of predicting quantitatively the complex interaction between many subsystems. The large failure rate of clinical trials - in the absence of toxicity - is often due to off-target effects of the selected drug, imbalanced comedication that interferes with the drug’s mode of action and functional genotypes that indirectly act on the pathways where the target is active. Developing a virtual human being is similar to the introduction of mathematical models simulating complex electronic circuits in micro-electronics.

  • Avatar

    "Only Moore's Law can save Big Pharma. We better hope it arrives soon."

    This is a very strange statement. Moore's law does not 'arrive', it either holds or it does not hold. It has held for over four decades, so it can be said to have 'arrived' long ago.

    The problem is that biological data more than doubles every 18 months, and Moore's law can therefore not keep up. As a result, clusters have gone from hundreds of nodes to thousands and now to the tens of thousands. CPUs simply cannot keep up. Accelerators based on GPGPUs, FPGAs, and CBEs are the only way to provide the computational power that we need without dedicating large amounts of power, A/C, and FTEs to the infrastructure.

  • Avatar

    I have no truck with the hopes expressed in the article - only suspended disbelief. However, I do take exception with a particular phrase, popular in various contexts among conservatives: "The only reason there are any customers at all for products this bad is that someone else is paying the bills."

    I'd like to remind my Republican "friends" that we live in society - that it is not, that it should not be, each man/woman for himself/herself. Yes, the cost of some medicines is a disgrace, but that a civilized society would defray the cost of such treatment for one of its members should not ever be an issue in this country and the Right should stop framing it as such.

    We live in cities, pay taxes, go to war on politicians' whims, and collectively provide conditions for some to become very, very rich. Why should the common citizenry not demand that society do something in return?

  • Avatar

    Oh come now. Stop blaming the plight of the pharmaceutical industry on factors beyond its own control.

    However, you are right. "The only way to escape this downward spiral is new science." Digital chemistry might help. But even this will not be sufficient to fix more fundamental problems involving drug development, drug regulation, and drug use.

    Remember the promise of personalized medicine based on genomics? This includes great potential to improve the pharmaceutical industry. However, there is no efficient way to advance personalized medicine with science that uses group averages to assess causality. Individuality, which includes genetic differences and differences in patient histories, becomes part of the error term in statistical models. After this, the game is lost.

    An alternative is to exercise randomized experimental control over time for individual patients. This becomes possible by measuring the benefit/harm of treatments for individual patients as an interaction-over-time between (i) drug dose and (ii) health variables and biomarkers that can vary and fluctuate in level over time. Then reliable, valid, detailed, and comprehensive benefit/harm scores from each of many patients can be analyzed statistically to help identify genetic and other predictors of differential response. Personalized medicine that improves individual health will improve group average or public health.

    This "new science" is possible when drugs are developed and used to prevent, manage, or control chronic health problems that account for upwards of 75% of U.S. annual healthcare expenditures of about $2.4 trillion. However, no one is actually measuring the benefit/harm of treatments. Try Googling "benefit/harm score" in quotation marks. (Curtis A. Bagne, bagne underscore curt at msn.com)


For reprints and/or copyright permission, please contact  Jay Mulhern, (781) 972-1359, jmulhern@healthtech.com.