Genstruct: Patience, Persistence, and Payoff

By John Russell

March 19, 2008 | Finding the right business model for systems biology (SB) technology providers has been challenging — that’s hardly a new theme in biotech. Most SB pioneers were founded as platform providers but soon encountered difficulties growing their sales sufficiently to increase valuations. Genstruct, a pioneer in using computational approaches to infer mechanistic hypotheses from large experimental data sets, is no exception.

Many reasonable-sounding explanations are offered for the community’s lack of traction: 1) the technologies weren’t (aren’t) really ready; 2) not enough biology was (is) known for predictive technologies to work very well; 3) desperate pharma jilted platforms for compounds; 4) collaborators wouldn’t share meaningful IP; 5) personnel shuffling inside pharma has made maintaining influence problematic, etc.

Genstruct CEO Keith Elliston is familiar with the list and hardly agrees with all of them (try bringing up #2!). He is, however, a realist. What’s important is what will work. Genstruct’s bread and butter is hypothesis generation. In the past, most engagements lasted two to four months — Genstruct has done 30 or so such projects — and contrary to Elliston’s original expectations, none grew into a deep, long-term collaboration or produced the kind of glittering IP that excites the financial world.

“When you're working in a process that has many different contributions, how do you measure the value of one particular contribution?” says Elliston. “When you have a compound, then you've got a piece of something tangible and you say, well, I developed the compound, so therefore this is my piece. When you define a hypothesis for a marker, or a mechanism, how does somebody value that? It’s not a discrete entity, in many cases it's not a physical entity, and so getting real value for it is really tough. That's the reason we're actually looking at products.”

Keith Elliston

So, like most of its SB peers, Genstruct is adopting a hybrid business model that blends fee-for-service projects and R&D collaborations with a growing bet on internal drug and biomarker development. Recently, the company hired Michelle Gordon-Savenor to grow the consulting and alliance business. She had been alliance manager at Millennium Pharmaceuticals. Genstruct also recruited Christian Reich, an M.D. and Ph.D., to run internal R&D. He’s another former Millennium researcher who ran its pathway initiative.

Genstruct was co-founded in 2002 by Elliston and the late Navin Chandra, a pioneer in artificial intelligence with a Ph.D. from MIT. Their idea was to create a computational platform that could model biology and disease and help streamline drug development. Rather than pursue a traditional mathematical modeling approach, such as using systems of differential equations to describe biological processes, Genstruct’s platform describes systems and their components in terms of state (for example, on, off, increased, decreased, no change).

The notion was that doing so would permit the technology (explained more fully below) to make biology “computable” and they would be able to tease out mechanisms of action for compounds. But success in the lab and on projects for clients hasn’t yet translated into clear business success. The Cambridge, Mass-based company, with a current headcount of around 30, faces the same wall its SB brethren face in scaling up business.

In the struggle to break free of financial doldrums, several SB companies have fastened onto using their platforms to reposition jettisoned compounds as a new strategy. Elliston is less sanguine about that approach. Others are investigating or actively pursuing non-biomedical markets — biofuels and cosmetics, for example — and Elliston agrees there may be worthwhile opportunities elsewhere. But the most promising opportunity for Genstruct, he says, is in biomarkers generally, and diagnostic biomarkers specifically.

“In seven biomarker programs we've done with one partner, we’ve identified 134 mechanistic biomarkers. I think we're actually particularly good at this kind of work,” he says. “So our goal is to go from doing hypotheses very well to doing molecules. Rather than doing therapeutics, which I think is a long, tough road and fraught with risk, we want to do molecular diagnostics, and to do them where we now have the key intellectual property, such as biomarkers for breast cancer, for lung and neck cancer, etc. I think these are things that are very practical for us.”

To help guide the business shift, “we're building what we call a business advisory board,” Elliston says. “We have in essence ex-pharmaceutical senior execs. We're going to recruit people from diagnostics as well as from the payer side of the business to a once-a-quarter review [of] our internal programs and help direct us. We think cancer is certainly the first area for us.”

Having survived the steep learning curve of the past few years, Elliston believes Genstruct’s new strategy maps nicely against what the market will support. The kinks have been worked out of the technology, he says, overhead has been deliberately kept low to reduce financial pressure, and confidence is growing in Genstruct’s technology.

There is evidence to support Elliston's contention. Last month, after five years of doing short-term projects with Genstruct, the pharmaceutical giant Pfizer struck a corporate-wide agreement that sets standard terms for working with Genstruct. It's not a single, deep project, but it does include commercialization rights to biomarkers discovered as part of the collaboration, and is a clear vote of confidence in Genstruct technology. Individual Pfizer groups can now engage Genstruct much more quickly and easily.

To use Elliston’s analogy, the Genstruct approach is a little like the National Transportation Safety Board’s efforts to reconstruct how and why an airplane crashed by sifting through the debris field and reviewing flight conditions. He emphasizes that this is different from trying to predict whether a crash will occur before takeoff, which he still thinks is very hard to do. It can, however, produce insight that leads to improved airplane design or better flight rules and traffic management to prevent crashes in the future.

Customers bring their experimental data to Genstruct to learn what their drug did, to whom, and why. The emphasis is on seeking mechanistic hypotheses. To accomplish this goal across a broad swath of biology and disease, Genstruct built what it calls a Knowledge Assembly database of biological relationships, culled from literature and other work on rat, mouse, and human biology.

This doesn’t sound so different from what other pathway and disease modelers do. The key difference, says Elliston, is the conversion of the data into Genstruct’s proprietary “computable” format and use of its causal modeling engine to produce hypotheses.

“In the pathway world,” he says, “you have a network that's composed of simple nodes of proteins and complex connections. We've turned that paradigm completely around to be able to reason or compute on these networks. Our world has very rich nodes and very simple connections.”

So what's an example of a rich node? “Well, we have a protein X — we live in a protein-centered world — and a property of that protein is the kinase activity of X. The kinase activity, in fact, is distinct from the concentration, from the transcription, from other key attributes of that particular protein. We can relate that to other proteins through a series of cause-and-effect relationships. So this is a logical chain,” says Elliston.

Basically, each entity (e.g., protein) also has a “state” (up, down, no change) based on its interactions with other entities. In nearly Lego-like fashion, they can be placed into networks a priori based on known biology and then have the client data “painted on” the nodes to produce a result. Similarly, a causal network could be “inferred” from customer-supplied data using Genstruct’s causal modeling engine. In all cases, the nature of the system forces a “result,” even if the result is no change or no connection.
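As a rough illustration of the “rich node” idea, the sketch below models each protein as an entity with several distinct attributes (kinase activity, transcription, and so on), each carrying its own discrete state, connected by simple signed cause-and-effect edges. This is a hypothetical toy in Python, not Genstruct's actual schema or engine:

```python
from dataclasses import dataclass, field

# Discrete states, in the spirit of the article's "up, down, no change".
UP, DOWN, NO_CHANGE = "up", "down", "no_change"

@dataclass
class RichNode:
    """One entity (e.g., a protein) whose attributes each hold a state.
    Kinase activity is tracked separately from transcription, etc."""
    entity: str
    attributes: dict = field(default_factory=dict)  # attribute -> state

@dataclass
class CausalEdge:
    """A simple signed connection between two (entity, attribute) pairs."""
    source: tuple
    target: tuple
    sign: int  # +1: source up drives target up; -1: inverse relationship

def propagate(nodes: dict, edges: list) -> None:
    """One round of naive causal propagation: push known states along
    edges into target attributes whose state has not yet been set."""
    for e in edges:
        src_state = nodes[e.source[0]].attributes.get(e.source[1])
        tgt_attrs = nodes[e.target[0]].attributes
        if src_state in (UP, DOWN) and e.target[1] not in tgt_attrs:
            flipped = DOWN if src_state == UP else UP
            tgt_attrs[e.target[1]] = src_state if e.sign > 0 else flipped

# Tiny example: the kinase activity of X up-regulates transcription of Y.
nodes = {
    "X": RichNode("X", {"kinase_activity": UP}),
    "Y": RichNode("Y"),
}
edges = [CausalEdge(("X", "kinase_activity"), ("Y", "transcription"), +1)]
propagate(nodes, edges)
print(nodes["Y"].attributes["transcription"])  # -> up
```

The point of the rich-node design is visible even in this toy: because kinase activity and transcription are separate attributes, a cause-and-effect chain can involve one without implying anything about the other.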

“You do what we call a state change analysis. We analyze this data to figure out, did the state of transcription of A change or not? Did it change tenfold or fivefold or twofold? It doesn't matter to me. Did it change? What are my criteria for that? The beauty of how this works is, as long as I have signal in the system, I'll converge on something in the network. We've done a lot of experiments. If I don't have signal here, there's no convergence,” explains Elliston.

This is a simplified description, to be sure, and the data sets and networks can become quite large and complicated. Still, computing these “states” is fairly straightforward, so exotic computer hardware isn’t needed. Genstruct says it produces results fast, generally in two to three months. The “result” can be a family of mechanistic hypotheses, each scored for probability. These hypotheses, of course, must be tested in the lab. The system can also identify biomarkers associated with active networks.
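The state-change-then-score workflow Elliston describes can be sketched roughly as follows. The fold-change threshold, the hypothesis format, and the consistency score are illustrative assumptions, not Genstruct's published method:

```python
# Sketch of a "state change analysis": reduce each measurement to a
# discrete call (did it change, and in which direction?), then ask which
# candidate mechanistic hypotheses are consistent with the observed calls.

def state_call(fold_change: float, threshold: float = 1.5) -> str:
    """Magnitude is deliberately discarded: as in the article, a twofold
    and a tenfold change both count simply as 'up'."""
    if fold_change >= threshold:
        return "up"
    if fold_change <= 1.0 / threshold:
        return "down"
    return "no_change"

def score_hypothesis(predicted: dict, observed: dict) -> float:
    """Fraction of the hypothesis's predicted downstream states that
    match the observed data (a stand-in for a probability score)."""
    matches = sum(1 for k, v in predicted.items() if observed.get(k) == v)
    return matches / len(predicted)

# Observed fold changes for three downstream transcripts.
observed = {k: state_call(fc) for k, fc in
            {"A": 3.2, "B": 0.4, "C": 1.1}.items()}

# Two competing hypotheses about an upstream driver's mechanism.
hyp_1 = {"A": "up", "B": "down", "C": "no_change"}
hyp_2 = {"A": "down", "B": "down", "C": "up"}
print(score_hypothesis(hyp_1, observed))  # -> 1.0 (fully consistent)
print(score_hypothesis(hyp_2, observed))  # -> 0.333... (mostly inconsistent)
```

The "convergence" Elliston mentions corresponds here to at least one hypothesis scoring well; flat data with no signal would yield only "no_change" calls and no hypothesis would stand out.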

Talking about the database, Elliston says “Its [size] is not particularly mind-boggling. The latest numbers on human [data show that] we have about 75,000 entities and about 300,000 relationships amongst them. If you ask the nature of those entities, for example, [for] how many genes or transcripts do I have upstream causal information about their regulation? Well, there are about 15,000 known and demonstrated transcripts in the human genome. We have upstream causal relationships for 11,000 of those. If you look at protein phosphorylations, we have 20,000 sites. There's estimated to be about 60,000 sites.”

One recurring criticism of database-informed approaches is that not enough biology is known for these systems to work well. Elliston scoffs at this idea.

“If we look at the early days of the human genome work, people thought there might be 200,000 human genes. Then the generally accepted number was 100,000. [Later] people said, ‘Well, maybe it’s closer to 50,000,’ and there are estimates of 37,000. Now what’s interesting is to find that we can build a human being with trillions of individual cells out of about 15,000 to 20,000 genes that we actively use,” he says.

“That tells us something about the way biology works. Many people say we only know 5 percent of what’s going on biologically. I think we know much more than that if you take a look at the modularity and conservation of biology.” Besides, says Elliston, Pfizer would have dumped Genstruct long ago if it weren’t producing value.

Other notable current collaborators include GlaxoSmithKline, Berlex, Dana-Farber Cancer Institute, Stanford University, and the Institute for Systems Biology (ISB). The company website reports technology alliances with Ariadne, Jubilant BioSys, and Spotfire.

Elliston says Genstruct has been operating at cash flow breakeven in recent years, and suggests “with our new strategy we'll in essence double our revenues over '07.” He also says that about 30 percent of that business is already booked, and about 80 percent is in the pipeline. “I'm looking at covering a couple of things in the fourth quarter to get my number,” he says.

Doubling seems like a big jump, though it’s not clear what ’07 totals were. Bringing biomarkers to market, even through partnerships, will take some time. But Elliston is confident the Genstruct platform can create a high volume of marker candidates, and rather than try to take one to market, he says Genstruct will be happy to own a piece, say 10 to 20 percent, “of a large number of candidates.” Meanwhile the consulting and R&D collaboration businesses will foot the operations bill.

Validation remains a missing competency, and the Genstruct CEO says he’d consider acquiring validation capacity to speed up the process. Pharma collaborators have validation capacity, but often not the urgency to do the work quickly. If Genstruct did acquire wet lab capabilities, they would be narrowly focused on validation activities.

Reflecting on the travails of selling technology to the biopharma industry, Elliston says,

“It's a very tough business. It’s trying to build a technological approach that's going to change the way an industry works. Go back to financial services, for example. In the '70s and '80s, there were huge problems with credit card fraud. A company called Fair Isaac came up with ways of using artificial intelligence to detect fraud. They initially tried to sell the technology to people and nobody would buy it.”

“So they said, ‘Well, wait a minute. You send us your transaction data, we'll look through it, and we'll tell you which transactions are fraudulent.’ They started doing that business, and pretty soon everybody was sending them their data, and over 90 percent of credit card fraud detection was done by their technology. And they built that company up continually through now. They have integrated throughout the financial services industry not only things like fraud detection for credit cards or cell phones to actually modeling, et cetera, and really changing the face and nature of the industry. I think that's the way this has to be done.”

Sidebar: The Changing Marketplace

Keith Elliston comments on two opportunities, a Genstruct challenge, and one lingering peril for all SB technology providers.

Saving Pharma: “My thesis is if you look at the problems of the pharma industry, they don't have a problem generating enough data. They don't have problems of not being able to access biology. They have trouble making sense of all the stuff. So my thesis is that the salvage of the pharmaceutical industry is going to be almost entirely based on software technology.”

Personalized Medicine: “I think personalized medicine right now is sitting on an opportunity without a technology to deliver what it needs. If I'm a payer treating patients with Avastin, what I know is that 10 or 15 percent of patients will respond, but I've got to treat them for six or 12 months to figure out who that is. If I can figure that out in two weeks now I've got something of value.”

Cheaper by the Hour: “One of the challenges on our consulting side is that we're a little too quick. If you need prostate cancer biomarkers and we do it in six months and all I get paid for is my work, I'm not recognizing any of the value.”

That Was Then: “Ten years ago, the world was ripe and receptive for computational technology coming into biology. Today, the world is not ripe and receptive. In fact, it's quite the opposite. There's been such disappointment over genome [results] and informatics approaches that nobody believes any of this is actually doable. One reason we haven't gone out and claimed we can do really wonderful things in the press is because nobody believes any of that. So what we've tried to do is chip away.”

