
High-Content Analysis: Balancing Power and Ease of Use


By Jim Kling
Nov. 13, 2007 | High-content analysis (HCA) — also known as high-content screening — makes possible massively parallel experiments that can reveal much about the inner workings of cells and their response to stimuli such as drugs or signaling agents. Typically, an agent or cellular proteins are tagged with a fluorescent marker that can then be imaged.

Many processes can be studied using HCA, including intracellular translocation of proteins; movement of proteins in response to activation of a receptor or a cellular pathway; and protein co-localization. Such studies have enormous potential to streamline drug discovery. For example, HCA has been used to identify or validate targets using RNA inhibition, and in secondary screens to detect cellular signs of toxicity. It can also provide visual evidence of a cancer agent as it blocks cell division, thus providing mechanistic clues. Studies that validate a drug target or identify toxicity in the earliest stages of preclinical development could drastically reduce the nearly $1 billion that is typically sunk into the development of a novel drug.
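
To make one such readout concrete: co-localization, for instance, is often scored as a correlation between two fluorescence channels. The minimal sketch below, written in Python with NumPy, is an illustration of that idea, not any vendor's method; the function name and the synthetic example data are assumptions.

```python
import numpy as np

def pearson_colocalization(channel_a, channel_b, mask=None):
    """Pearson correlation between two fluorescence channels,
    optionally restricted to a cell or nucleus mask."""
    if mask is not None:
        a, b = channel_a[mask], channel_b[mask]
    else:
        a, b = channel_a.ravel(), channel_b.ravel()
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

# Example with synthetic, partially co-localized channels.
rng = np.random.default_rng(0)
green = rng.random((64, 64))
red = 0.7 * green + 0.3 * rng.random((64, 64))
print(round(pearson_colocalization(green, red), 2))
```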

Nevertheless, early predictions that HCA would revolutionize drug development have not entirely panned out. Many pharmaceutical companies have set up HCA or HCS programs, but they have not been fully embraced — partly because it hasn’t yet been proven that high content approaches can streamline drug discovery enough to justify the investment. Another important issue is that software systems that control the instruments can be difficult to learn.

Walking a Thin Line
HCA instrumentation has matured to the point that systems of comparable acquisition speed, magnification, and resolution are difficult to distinguish from one another because the optics have been so well optimized. Hence other factors typically drive purchasing decisions, and software is perhaps the most important. “Now that we’re almost feature complete on imaging systems, our focus needs to be totally on software for the foreseeable future,” says Jan Hughes, general manager and vice president of bioresearch for Molecular Devices.

HCA software developers must walk a thin line. On the one hand, users want powerful software that can perform just about any analysis they can think of. On the other hand, they want it to be easy to use. Industry experts agree that ease of use is critical. “If it’s only the province of the highly talented core laboratories and can’t be done by bench scientists, it has the potential to retard” the adoption of HCA, says Mark Collins, senior marketing manager for cellular imaging at Thermo Fisher Scientific, which acquired industry vanguard Cellomics in 2005.

It isn’t always easy to convince customers to scrutinize software when making a purchasing decision. “We find that probably the most difficult challenge is going to a novice imaging customer who wants to get into high-content analysis or high-content screening, and explaining to them the importance not just of image acquisition, but the database, image analysis, and informatics tools. They’re focused on the hardware. As an end-user, software can constrain you or be enabling. You better worry about your software just as much as your hardware,” Hughes says.

Vendors are taking various approaches to simplifying their software without sacrificing analytical power, though it is a tricky tightrope to walk. Thermo Fisher, for example, has streamlined the naming conventions for its parameters and measurements. Users click on a cell of interest, and its characteristics become a training set for tuning assay parameters, such as the threshold for nuclear area.
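
For illustration, the sketch below shows how a handful of user-selected example nuclei might be turned into an area gate of the kind described, using Python and scikit-image. The function names and the two-sigma rule are assumptions, not Thermo Fisher's implementation.

```python
import numpy as np
from skimage import filters, measure

def nuclear_areas(nuclear_channel):
    """Segment nuclei with a global Otsu threshold and return their areas (pixels)."""
    binary = nuclear_channel > filters.threshold_otsu(nuclear_channel)
    labels = measure.label(binary)
    return np.array([r.area for r in measure.regionprops(labels)])

def area_gate_from_training(training_areas, n_sigma=2.0):
    """Turn a few user-picked example nuclei into a min/max area gate."""
    mu, sigma = np.mean(training_areas), np.std(training_areas)
    return max(0.0, mu - n_sigma * sigma), mu + n_sigma * sigma

# Areas (in pixels) of a few nuclei the user clicked on as 'typical'.
lo, hi = area_gate_from_training([410, 455, 390, 470, 430])
print(f"accept nuclei with area between {lo:.0f} and {hi:.0f} px")
```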

Thermo Fisher is also continuing to develop ‘out of the box’ protocols for key biological processes that can be combined with its HCS Reagent Kits. Evotec Technologies, a former subsidiary of Evotec AG now part of PerkinElmer, has reduced the number of modules in its Acapella image analysis software by automating some of the functions, such as cell nucleus detection.

Other related functions have been bundled into a single module. “That makes it very flexible on the one hand, but also very user friendly,” says Martin Daffertshofer, leader of software development for Molecular Medicine at PerkinElmer. Evotec Technologies also offers about a dozen “canned” solutions for studying GPCRs, kinases, and other specific systems.

GE Healthcare has tried to boost usability by designing its Investigator software to operate like an interactive web page, with drop boxes, descriptive text, and very few parameters required to run an assay. “You can apply these software modules without any training,” says Jacob Tesdorpf, marketing manager for platform software at GE Healthcare.

Molecular Devices offers two versions of its image analysis software, both built on the same underlying engine: MetaMorph is geared toward research applications, while MetaXpress is aimed at imaging high throughput screening experiments. The two packages include many of the same functions, but their interfaces are tailored to their respective end users. MetaXpress development will continue to specifically address the need for fast analysis and the management of large amounts of image-derived data, Hughes says.

Definiens has taken a similar approach. Its Enterprise Image Intelligence Suite has several components that can be purchased individually, each to be used on a different rung of the research ladder.

For example, there is a viewer client that allows a manager to view results without influencing the experiment. An analyst client allows a technician to run preprogrammed methods and change parameters. The architect contains pre-built modules that allow a user to construct a novel assay. “They map to different work flows,” says Kurt Scudder, field applications scientist for Definiens.

Thermo Fisher’s Collins admits that balancing flexibility and power against ease of use remains a major challenge. “I don’t think we’ve got it yet,” he says. “I don’t think anyone has.”

Ease of use versus flexibility isn’t just a matter of software design. Another option is to split tasks between different instruments. For example, Genetix has developed CellReporter, a high content screener with an emphasis on speed and ease of use. Such machines can serve as a first-pass analysis. “You can whittle [a couple of hundred lead compounds] down to 10 or 20, and then take those into the machines with higher magnification [like the GE InCell or Evotec Opera]. You can reduce your bottleneck earlier in the drug discovery process,” says Nicol Watson, product manager for CellReporter software at Genetix. “You don’t need more than 15 or 20 minutes to train on it.” (See “Tools for Therapeutics.”)

Standards and Compatibility
Ease of use is one thing. Compatibility is another. It’s not uncommon for high-content analysis and high-content screening labs to use more than one type of instrument, and that creates problems. “Screening labs will often have three (or more) different imagers, each with different characteristics. It’s not uncommon at all for a screening group to have multiple platforms that can’t talk to one another. That [communications blackout] is changing, but at a glacial pace,” says Definiens’ Scudder.

Definiens’ software is designed to run on any system, giving it an edge over built-in software packages. “Where we come into play is if you want to develop a new assay every month,” says Scudder. “You need an environment where you can develop your own capabilities. Our system allows you to develop completely new assays and attach them to any instrument on the market. If you want to run the same assay on a Cellomics Array Scan as on a GE InCell 1000, you can use our software to do the same analysis.” Adds Mark Watson, head of life science marketing at Definiens: “You don’t have to worry about comparing apples to apples.”
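
A common way to achieve that kind of instrument independence is an adapter layer that converts each vendor's output into one internal representation before any analysis runs. The Python sketch below is a generic illustration of the pattern; the class and reader names are hypothetical, not Definiens' actual architecture.

```python
from dataclasses import dataclass
from typing import Callable, Dict
import numpy as np

@dataclass
class WellImage:
    """Vendor-neutral container: pixel data plus the metadata an assay needs."""
    pixels: np.ndarray
    plate_barcode: str
    well: str
    channel: str
    magnification: float

# One reader per instrument format, each converting to WellImage (names are hypothetical).
READERS: Dict[str, Callable[[str], WellImage]] = {}

def register_reader(fmt):
    def wrap(fn):
        READERS[fmt] = fn
        return fn
    return wrap

def load(path, fmt):
    """Dispatch to the registered reader so the same analysis runs on any image source."""
    return READERS[fmt](path)
```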

There are efforts underway to standardize the data formats for HCA experiments, which should make it easier to swap data between instruments. Perhaps the most important element of such standards is the so-called metadata. It includes experimental parameters, such as the magnification, the nature of the dye used in the experiment, exposure time, and other factors. “That information is critical to analyzing those images and also interpreting the data from that image analysis,” says Collins.

To that end, Thermo Fisher has taken it upon itself to develop a standard it calls MIAHA, for ‘Minimum Information for a High-Content Assay.’ It combines the OME (Open Microscopy Environment) standard commonly used in microscopy with the Minimum Information About a Cellular Assay (MIACA) guideline.

The proposed standard is based on XML, and will describe the image and associated metadata, including experiment name, type of container, barcode, dye used, filter, image size, cell type, and other information. “Our strategy is that the community will encourage all vendors to be able to read and write data in the MIAHA standard, so anyone should be able to use any tool with data that is written to the MIAHA standard,” says Collins. Thermo Fisher will support the standard in its HCSGateway and HCSExplorer software.
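
For a sense of what such an XML record might contain, the Python sketch below assembles the kinds of fields Collins lists. The element names are invented for illustration; they are not the MIAHA schema, which was still in draft.

```python
import xml.etree.ElementTree as ET

# Element names below are invented for illustration; the MIAHA schema was still in draft.
assay = ET.Element("HighContentAssay", name="NFkB-translocation")
image = ET.SubElement(assay, "Image", sizeX="1024", sizeY="1024")
ET.SubElement(image, "Container", type="384-well plate", barcode="PLT00042")
ET.SubElement(image, "Dye", name="Hoechst 33342", filter="DAPI")
ET.SubElement(image, "CellType").text = "HeLa"

print(ET.tostring(assay, encoding="unicode"))
```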

Thermo Fisher plans to send the draft out for review by the end of this year and then present the standard at a scientific meeting next spring. “We’ll see what people think, take comments, and hopefully we’ll agree on a standard soon after that,” says Collins. “It’s not easy. But the good thing is that the community is pretty small and the players are committed to standards.”

In fact, most companies contacted for this story endorsed standards. “We’re supportive and convinced that it’s the right thing to go. Otherwise we’re just putting up barriers to adoption,” says GE’s Tesdorpf.

Research Challenges
Like any field, HCA is moving forward rapidly, and instrument vendors are doing their best to keep up. They hold regular user group meetings and “go back to the lab and scratch our heads” about how to incorporate new requests into their software suite, says Collins.

One key area is bright field imaging, which does not rely on fluorescent agents; those agents can be toxic or introduce confounding factors into an experiment. That is an especially important issue for researchers studying stem cell differentiation, who want to avoid any factor that might influence the differentiation process. Bright field can also serve as an extra visualization channel. “Once you get past four fluorescent probes, it’s pretty hard to pack any more. Bright field provides an extra channel, if you will,” says Collins.

However, bright field presents a challenge to image analysis. Unlike fluorescent images, which show up starkly against a dark background, bright field illuminates everything, making object recognition and other image analysis routines considerably harder. Companies are busily adding bright field analysis capabilities to their software packages.
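
The sketch below, using Python with scikit-image and SciPy, illustrates the difference in approach: a global threshold is usually enough for fluorescence, while bright field typically needs edge-based segmentation. It is a generic illustration under those assumptions, not any vendor's algorithm.

```python
from scipy import ndimage as ndi
from skimage import filters, measure, morphology

def segment_fluorescence(img):
    """Bright objects on a dark background: a global threshold usually suffices."""
    return measure.label(img > filters.threshold_otsu(img))

def segment_brightfield(img):
    """Low-contrast objects on a bright background: start from edges instead."""
    edges = filters.sobel(img)                           # cell boundaries appear as edges
    mask = edges > filters.threshold_otsu(edges)         # keep the strong edges
    mask = morphology.closing(mask, morphology.disk(3))  # bridge small gaps in outlines
    filled = ndi.binary_fill_holes(mask)                 # turn outlines into solid objects
    filled = morphology.remove_small_objects(filled, min_size=100)
    return measure.label(filled)
```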

Primary cells will be another challenge. The typical study uses mixtures of cells, “which is pretty arbitrary,” says Daffertshofer. The next step is to use primary cells, taken directly from the patient or animal model, which are more biologically relevant than proliferated cell lines. “That [requires] good algorithms for morphology investigation, because those cell mixtures can be analyzed on morphological differences from cell to cell.”
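
As a rough illustration of the per-cell morphology such algorithms start from, the Python sketch below pulls a few shape descriptors from a labeled image with scikit-image. The feature choice is an assumption; a real assay would use a far richer set.

```python
from skimage import measure

def morphology_features(label_image):
    """Per-cell shape descriptors that can help separate mixed primary-cell populations."""
    rows = []
    for region in measure.regionprops(label_image):
        rows.append({
            "label": region.label,
            "area": region.area,
            "eccentricity": region.eccentricity,  # 0 = circular, toward 1 = elongated
            "solidity": region.solidity,          # low values flag ruffled, irregular outlines
        })
    return rows
```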

Live cell imaging is also driving software development, as companies work to design software modules that can incorporate kinetics in their analysis. That will lead to a dramatic increase in the amount of data to be stored, as systems collect continuously rather than taking timed snapshots. That represents a data storage and management challenge.
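
A back-of-envelope calculation shows why. The snippet below estimates storage for a single live-cell plate run; every number in it is an assumption chosen only to show the arithmetic.

```python
# All numbers below are assumptions chosen only to illustrate the arithmetic.
wells            = 384
fields_per_well  = 4
channels         = 3
timepoints       = 60               # e.g. one frame per minute for an hour
bytes_per_image  = 1024 * 1024 * 2  # one megapixel at 16 bits per pixel

total_bytes = wells * fields_per_well * channels * timepoints * bytes_per_image
print(f"{total_bytes / 1e12:.2f} TB for a single live-cell plate run")
```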

Once the data are stored, users want easy access, and they also want to be able to mine it. “What I perceive is people wanting to manage all of this data in an integrated way, and to ask questions of the data that were not part of the experimental design,” says Scudder. “Some of the system vendors are addressing this, but it’s still not a turnkey [solution].”
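
The sketch below gives a flavor of the kind of after-the-fact question Scudder describes, assuming per-well features have been exported to a table; the column names and values are hypothetical, and pandas stands in for whatever mining tool a lab actually uses.

```python
import pandas as pd

# A hypothetical per-well feature table exported from an HCA run.
df = pd.DataFrame({
    "plate": ["P1", "P1", "P2", "P2"],
    "well": ["A01", "A02", "A01", "A02"],
    "compound": ["DMSO", "cmpd-7", "DMSO", "cmpd-7"],
    "translocation_score": [0.12, 0.81, 0.09, 0.77],
    "cell_count": [950, 410, 980, 430],
})

# A question not anticipated in the experimental design: which treated wells combine
# strong translocation with a drop in cell count that might hint at toxicity?
hits = df.query("compound != 'DMSO' and translocation_score > 0.5 and cell_count < 500")
print(hits[["plate", "well", "compound"]])
```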

Tesdorpf agrees: “I think the next big thing is going to be data management, because that’s a big problem for users of image analysis. Once you generate terabytes of data, how do you stay on top of that? That’s what’s occupying us the most right now.”

All of these challenges make this a fascinating time for HCA software, as companies pour more resources into their packages. “We did the investment in hardware in the last three to four years,” says Hughes. But in the next few years, he says, “I think we’ll focus more on the software end of the business rather than hardware improvements.”
