

New chips bring more speed and much more memory.
But choosing the right processor for life science work involves more than just wanting to go faster. 
 
By Salvatore Salamone

October 15, 2003 | BACK IN THE day when dinosaur microcomputers roamed the Earth, their brains could handle only 8 bits of data at a time, and their memory capacity was a paltry 64 kilobytes. Then Intel came along with a chip, the 8086, that doubled the amount of data a computer could chew on at once and expanded the memory a program could address to a whopping 1 megabyte. Deciding to upgrade was a no-brainer.

Now we're at another historic juncture. Intel, AMD, and Apple/IBM have introduced new 64-bit processors that will power the next generation of high-performance computing systems. The challenge for life scientists is figuring out which processor will deliver the best price/performance to meet their computing needs.

The new 64-bit chips are the Itanium from Intel, the Opteron from AMD, and the PowerPC G5, based on IBM's design and starring in Apple's new Power Mac G5. In theory, all three offer substantial raw performance improvements over 32-bit processors.

In practice, the true performance will depend on many factors, including the amount of memory a company is willing to purchase and whether applications are optimized to take advantage of the processor's capabilities. Additionally, if the 64-bit systems are part of a cluster, the efficiency and performance of the interconnection software and switches in the cluster will also affect performance.

That said, the main advantage of using a 64-bit system is that much more data can be put into memory — which means faster results.

A 32-bit system can access only 4 gigabytes of memory. (There are ways around this limitation, but these solutions are typically complicated and expensive.) In many common life science applications — searches of a huge database, for example — this 4GB limit can be a real performance stopper.
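A quick way to see the ceiling is pointer width: a 32-bit pointer can distinguish only 2^32 byte addresses, or 4 gigabytes. The following minimal C sketch (ours, for illustration, not tied to any vendor's platform) prints the pointer width and shows why a 6GB allocation can never succeed on a 32-bit machine:

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        /* Pointer width caps the address space: 2^32 bytes is 4 GB. */
        printf("pointer width: %u bits\n",
               (unsigned)(sizeof(void *) * 8));

        /* 6 GB cannot even be expressed as a size_t on a 32-bit
           system, so the allocation is impossible no matter how
           much RAM is installed. */
        unsigned long long six_gb = 6ULL * 1024 * 1024 * 1024;
        if (six_gb > (unsigned long long)(size_t)-1) {
            printf("6 GB exceeds this machine's address space\n");
        } else {
            char *p = malloc((size_t)six_gb);
            printf("6 GB allocation %s\n", p ? "succeeded" : "failed");
            free(p);
        }
        return 0;
    }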

Replacement Parts 
Systems based on the new 64-bit processors from AMD, Intel, and Apple/IBM can be used in many high-performance computing environments. However, based on the existing prices and available system configurations, some systems are more suited to certain applications than others.

Working with a database or other application that outgrows that limit requires virtual memory, which sets aside a portion of a hard drive's space to temporarily store data the application cannot keep in RAM. But accessing and manipulating data parked in hard-drive virtual memory this way takes about 40 times longer than when the data are stored in random access memory (RAM).

A 64-bit system circumvents this obstacle because it can accommodate a much larger amount of data in memory — up to 16 terabytes. By loading an entire database into memory, any calculations, searches, and manipulations can be performed more quickly than when accessing the same data residing in virtual memory on a hard disk drive. Grabbing a piece of data from RAM is like calling it up in your own brain; snagging it from a hard disk is like reaching over and pulling the information off a bookshelf.
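As a hedged illustration of that idea on a 64-bit Unix system, the C sketch below maps an entire multi-gigabyte sequence file into the process's address space with mmap() and then scans it as if it were an ordinary in-memory array. The file name seqdb.fasta is a stand-in, not a reference to any particular product:

    #include <stdio.h>
    #include <fcntl.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main(void)
    {
        /* "seqdb.fasta" stands in for any multi-gigabyte database file. */
        int fd = open("seqdb.fasta", O_RDONLY);
        if (fd < 0) { perror("open"); return 1; }

        struct stat st;
        if (fstat(fd, &st) < 0) { perror("fstat"); return 1; }

        /* Map the whole file into the address space.  On a 32-bit
           system this fails once the file outgrows the roughly 2-3 GB
           of usable address space; a 64-bit system maps it whole. */
        char *db = mmap(NULL, st.st_size, PROT_READ, MAP_SHARED, fd, 0);
        if (db == MAP_FAILED) { perror("mmap"); return 1; }

        /* Scans and searches now touch RAM-backed pages directly
           instead of shuttling blocks through explicit disk reads. */
        long count = 0;
        for (off_t i = 0; i < st.st_size; i++)
            if (db[i] == '>') count++;   /* count FASTA records */
        printf("%ld sequences\n", count);

        munmap(db, st.st_size);
        close(fd);
        return 0;
    }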

This is one of the most compelling reasons life scientists are eyeing 64-bit systems. "We expect the superior memory addressability and floating-point performance of the Itanium 2-based systems to meet our scientific researchers' [high] requirements for processing massive data sets," says Gunaretnam Rajagopal, acting director of the Singapore Bio-Informatics Institute.


The Old Compatibility Issue 
Beyond the memory support for much larger databases, two other factors — compatibility and pricing — are guiding life science choices when considering 64-bit systems.

A recent study commissioned by AMD and carried out by Gartner Custom Research (a division of Gartner Inc.) found that compatibility with existing 32-bit applications is a major concern in any move to 64-bit technology. More than 80 percent of the IT managers surveyed said the capability to run both 32-bit and 64-bit applications while migrating to 64-bit systems is important.

At the heart of this compatibility issue is whether the performance boost realized when running existing applications justifies the higher costs of a 64-bit system. Interestingly, AMD and Intel have taken vastly different approaches to this situation.

The first Itanium-based systems introduced last year ran many 32-bit programs slower than these same programs ran on existing high-end Pentium systems — and the Itanium systems were significantly more expensive.

Such early results, widely reported in computer trade publications, left a bad taste in many IT managers' mouths. "We realized the Itanium was a new architecture, one that offered some interesting benefits for the future," says the manager of a Cambridge, Mass., biotech company who did not want to be identified. "But the high costs and poor performance with current applications caused me to wait. It was as if someone introduced a new high-performance car, but it didn't run well on gasoline."

Intel has addressed this issue with a software emulation program that runs 32-bit applications at speeds comparable to Pentium systems.

On the other hand, AMD's Opteron runs both 32-bit and 64-bit applications in their native modes. This, and the fact that Opteron systems are priced only slightly higher than high-end Pentium systems, has led many to consider the AMD CPU.

"Right now, the Opteron-based systems look like a no-brainer for any company adding computational power for common bioinformatics tasks like performing BLAST runs or alignment programs," says David O'Neill, an independent IT consultant and a former network administrator at a specialty drug manufacturer. The Opteron systems "give a company a migration path to take advantage of new 64-bit applications as they become available. And these systems are just about the same price as [higher-end] versions of many existing servers."


Who Needs 64 Bits? 
The new 64-bit systems will have an impact on life sciences ranging from the desktop for the individual researcher all the way up to the core of an entire enterprise's computational efforts. Three general classes of applications could benefit from the 64-bit processor:

* The widely used search and sorting bioinformatics algorithms such as BLAST, hidden Markov modeling (HMM), and Smith-Waterman (a minimal scoring sketch follows this list)

* Simulation and modeling applications such as molecular docking and computational chemistry algorithms, as well as programs that calculate the binding energy between two molecules

* Applications that include large-scale data mining (and searching), knowledge management, and visualization
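To make the first category concrete, here is a minimal Smith-Waterman local-alignment scorer in C. It is our sketch, not production code: the score values are illustrative placeholders rather than a real substitution matrix, and the quadratic dynamic-programming matrix shows why large alignments are exactly the kind of memory-hungry workload a flat 64-bit address space helps:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    #define MATCH     2   /* illustrative scores, not a tuned matrix */
    #define MISMATCH -1
    #define GAP      -2

    static int max4(int a, int b, int c, int d)
    {
        int m = a;
        if (b > m) m = b;
        if (c > m) m = c;
        if (d > m) m = d;
        return m;
    }

    /* Smith-Waterman score with a linear gap penalty.  The full DP
       matrix is (m+1) x (n+1) ints; for two long sequences this
       working set quickly outgrows a 32-bit address space. */
    int smith_waterman(const char *a, const char *b)
    {
        size_t m = strlen(a), n = strlen(b);
        int *H = calloc((m + 1) * (n + 1), sizeof *H);
        int best = 0;

        for (size_t i = 1; i <= m; i++) {
            for (size_t j = 1; j <= n; j++) {
                int s = (a[i-1] == b[j-1]) ? MATCH : MISMATCH;
                int h = max4(0,
                             H[(i-1)*(n+1) + (j-1)] + s,  /* diagonal  */
                             H[(i-1)*(n+1) + j] + GAP,    /* gap in b  */
                             H[i*(n+1) + (j-1)] + GAP);   /* gap in a  */
                H[i*(n+1) + j] = h;
                if (h > best) best = h;
            }
        }
        free(H);
        return best;
    }

    int main(void)
    {
        printf("score: %d\n", smith_waterman("ACACACTA", "AGCACACA"));
        return 0;
    }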

While all these applications could run faster using the increased memory afforded by 64-bit systems, not all 64-bit systems are suitable for every application. Selecting the right system depends on the mix of applications that must be supported. It also depends on where an application runs today — on a single server or workstation, on a departmental cluster, or as part of an enterprise computing facility.

For example, one use envisioned for Opteron-based systems is new departmental clusters, taking advantage of Opteron's high performance and relatively low cost. "Almost every lab could have its own cluster chugging away," says Douglas O'Flaherty, strategic program manager at AMD. "There can be a dedicated system in each lab. [Scientists] don't share assays, so why share in silico assays?"

Similarly, the Apple Power Mac G5, which uses a processor based on IBM's 64-bit PowerPC architecture, offers the speed and memory to conduct research that involves running common bioinformatics software such as BLAST and Smith-Waterman.

Opteron systems can also serve as a bridge between today's research and the algorithms of tomorrow. For a company that does a lot of search and sorting runs on an enterprise cluster of Pentium servers, Opteron-based machines could deliver comparable performance now, would not cost significantly more than adding new Pentium systems, and would provide a migration path as 64-bit versions of popular programs become available.

That's not to say Itanium systems will not be used for these applications. But if researchers want simply to run existing 32-bit applications with the option of running 64-bit applications in the future, the high price of Itanium systems might prove excessive for this segment of the market. (If previous Intel processor rollouts are any indication, pricing often drops significantly over time. Expect a sharp reduction in Itanium prices within the year.)

Itanium systems are commanding a lot of attention from companies interested in building an infrastructure to support life science research for as far out as a decade. In this scenario, the Itanium is seen more as a long-term investment. "We will be using Itanium processors, as they are powerful and are widely viewed as the next-generation IT infrastructure," Rajagopal says.

"We're looking for architectures that will last three to five years, maybe even 10 years," concurs Al Stutz, director of high-performance computing at the Ohio Supercomputing Center. "We think the Intel architecture is that architecture." The processing power of the Itanium "will benefit our researchers in [areas] of protein folding, computational chemistry, and new drug design," Stutz says. "We will be able to do these discoveries a lot faster."

The research collaborative called the TeraGrid is standardizing on distributed clusters of Itanium servers running Linux. The TeraGrid, once completed, will have more than 20 teraFLOPs (20 trillion floating-point operations per second) of processing power and about 1 petabyte of storage capacity distributed over five sites.

Itanium and Opteron (to a lesser degree) are also drawing attention as substitutes for high-powered but proprietary systems. For example, some industry experts believe Itaniums will, in many cases, replace high-end machines from Sun Microsystems and Hewlett-Packard, which have been using chips based on 64-bit reduced instruction set computer (RISC) architectures. These RISC processors include Sun's UltraSPARC and Hewlett-Packard's Alpha.

RISC-based servers run many modeling, simulation, and data-mining workloads on extremely large data sets. Because the new 64-bit processors can support memory in excess of 4 gigabytes, servers with the newer chips could compete with these RISC-based systems.

RISC workstations have dominated the data visualization market. Usually, such machines are quite pricey — often costing well over $10,000. The Apple Power Mac G5 is a likely, less expensive candidate to replace some of these high-end systems.

However, even with the benefits of the new 64-bit chips, some companies will for the time being stay with their older RISC hardware. For instance, earlier this year when Celera Genomics undertook the replacement of its RISC-based AlphaServer systems, it chose a RISC-based alternative, the IBM p690 Regatta (see "Buying Power," July 2003 Bio·IT, page 32).

"We didn't go for bleeding edge. We were looking for flexibility and didn't want to have lots of platforms. So we chose a platform that we could lock down and repurpose to support the mix of applications that we run." 
John Reynders, Celera Genomics 
When Celera first began thinking about its upgrade project almost two years ago, "We didn't go for bleeding edge," says John Reynders, vice president of informatics. "Everything we got is solid. We were looking for flexibility and didn't want to have lots of platforms. So we chose a platform that we could lock down and repurpose to support the mix of applications that we run."


Keys to Adoption 
Many IT managers are sticking with Pentium systems for now, given the relative immaturity of the 64-bit processors and the scarcity of software tuned to take advantage of their capabilities. But there are many areas where systems based on the Itanium, Opteron, and PowerPC G5 make sense. Indeed, early adopters are leaning on their systems vendors and the developer community to optimize their applications to perform better.

On the systems side, most major players (IBM, Dell, HP, etc.) are selling products based on the Itanium and Opteron. Other companies with expertise in both the life sciences and high-performance computing, such as Linux NetworX, are offering 64-bit systems as well.

Of course, software is a big part of the equation. Applications have to be engineered to take advantage of the faster hardware. Groups such as Gelato.org are dedicated to advancing the development of optimized Linux applications that run on Itanium-based systems. In September, the U.S. Department of Energy's Pacific Northwest National Laboratory, a member of Gelato, upgraded its center to include a 2,000-processor Itanium Linux cluster capable of delivering 11.8 teraFLOPs — among the top 10 most powerful supercomputers in the world.

The BioTeam consultancy has optimized certain bioinformatics applications to take advantage of Apple's Velocity Engine, part of the G5 processors. The Velocity Engine allows for the parallel execution of up to 16 operations in a single clock cycle. The optimization process involves looking for parts of an algorithm, such as a repetitive loop, that can be executed in parallel instead of sequentially.
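As a rough sketch of that process (ours, assuming a PowerPC compiler with AltiVec support, such as gcc with -maltivec), the code below replaces sixteen one-byte additions per loop pass with a single vec_add that fills all sixteen lanes of a Velocity Engine register:

    #include <altivec.h>

    /* Scalar baseline: one 8-bit add per loop iteration. */
    void add_scalar(const unsigned char *a, const unsigned char *b,
                    unsigned char *out, int n)
    {
        int i;
        for (i = 0; i < n; i++)
            out[i] = a[i] + b[i];
    }

    /* Velocity Engine (AltiVec) version: each vec_add processes
       sixteen 8-bit lanes at once.  Assumes n is a multiple of 16
       and the arrays are 16-byte aligned, as vec_ld/vec_st require. */
    void add_altivec(const unsigned char *a, const unsigned char *b,
                     unsigned char *out, int n)
    {
        int i;
        for (i = 0; i < n; i += 16) {
            vector unsigned char va = vec_ld(i, a);
            vector unsigned char vb = vec_ld(i, b);
            vec_st(vec_add(va, vb), i, out);
        }
    }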

Such efforts by developers and systems companies will likely speed the adoption of 64-bit systems in the life sciences. But many companies will need to see clear price/performance benefits for their specific applications before large-scale deployment occurs.




ILLUSTRATION BY STUART BRADFORD 

