By Salvatore Salamone

July 15, 2003 | Blue Gene -- IBM’s highly touted petaFLOPS protein-folding supercomputer -- is still at least three years away from its scheduled completion in 2006, but project managers say they have already achieved some significant milestones. Will gains in other high-performance computing areas, notably clustering, make the system obsolete by the time it’s ready to be unveiled?

The short answer is probably no, but only if IBM delivers on the promise of Blue Gene. That is why the recent developments are so important.

Arguably the most significant advancement is that the new processor chips -- the heart of the Blue Gene system -- are nearing completion. “The main processor chips are in production,” says William Pulleyblank, director of IBM’s Deep Computing Institute and director of exploratory server systems. Pulleyblank expected to have the first chips in hand by the time this issue went to press. “We’ll power them up and start work on a 512-node prototype late this fall,” he says.

The processor chips are essential to the success of Blue Gene because they represent a new design aimed specifically at data-intensive applications in high-performance scientific computing. With these processors, IBM is trying to address a performance limitation common to all data-intensive calculations: the time required to move data from memory to the processor.

Blue Gene will use what IBM calls data-chip cells, which are optimized for calculations that access large amounts of data. Essentially, each cell includes one processor for computing and one for communications. And, like most high-end processors, each cell will have its own onboard memory (see “Think Blue … Again: It’s in the Genes,” Aug. 2002 Bio-IT World, page 28).
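
To make the bottleneck concrete, here is a minimal sketch (illustrative only, not IBM code) of the kind of streaming kernel that dominates many data-intensive scientific calculations. It performs just two floating-point operations for every 24 bytes it moves, so on a conventional processor its speed is set by memory bandwidth rather than by peak FLOPS, which is exactly the imbalance the data-chip cell design, with its onboard memory and dedicated communications processor, is meant to ease.

    #include <stddef.h>

    /* Illustrative sketch only, not IBM's code: a streaming kernel with low
       arithmetic intensity. Each iteration performs 2 floating-point
       operations but moves 24 bytes (two 8-byte reads and one 8-byte write),
       so on a conventional processor the loop spends most of its time
       waiting on memory rather than computing. */
    void stream_triad(double *a, const double *b, const double *c,
                      double scale, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            a[i] = b[i] + scale * c[i];
    }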

This kind of design is critical to improving supercomputer performance on data-intensive calculations. Consider the current top-performing supercomputer, the Earth Simulator at Japan’s Earth Simulator Center, part of the Japan Marine Science and Technology Center. The system has been benchmarked at 35.8 teraFLOPS -- almost five times the performance of Los Alamos National Laboratory’s ASCI Q, the second most powerful computer in the world.

The Earth Simulator does not use exactly the same architecture as Blue Gene, but its design also takes data movement to the processor into account. “What really makes the Earth Simulator strong is a mighty fine node, the SX-6,” says Jim Taft, a project manager at NASA Ames Research Center. “The SX-6 nodes have great memory bandwidth.”
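
A rough way to quantify Taft’s point (a back-of-envelope bound, not a figure from either project): for a kernel that performs F floating-point operations and moves B bytes, the time on a single node is limited by the slower of compute and memory transfer,

    T \gtrsim \max\left( \frac{F}{P_{\mathrm{peak}}}, \frac{B}{\mathrm{BW}} \right)

where P_peak is the node’s peak floating-point rate and BW is its memory bandwidth. For the streaming kernel sketched above, F/B is only about 0.08 flops per byte, so the bandwidth term dominates on most machines; nodes with high memory bandwidth, such as the SX-6, push that limit back directly.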

Beyond Chips
IBM has also been working on other Blue Gene enhancements. “We’ve made huge strides in the system software,” Pulleyblank says. “We’ve created a Blue Gene simulator. With this, we’ve taken the operating system, booted it, run it, and we’ve done compilations.”

One key goal is to hold down operational costs. In the area of cooling, for example, the first commercial implementation of Blue Gene -- Blue Gene/L -- will be air-cooled. Blue Gene/L is a joint effort between IBM and the U.S. Department of Energy and is slated for completion in 2005.

Many supercomputers require water cooling of their core processing facilities, which is not a trivial task (see “Data Centers: It’s Getting Hot in Here,” May 2003 Bio-IT World, page 1). To better understand Blue Gene’s cooling needs, IBM has created a heating simulator. “We built a full-scale thermal prototype of Blue Gene using resistors to generate the heat we expect Blue Gene to produce,” Pulleyblank says. The simulator revealed a problem. “We could not get enough air through to cool the system,” he says.

This led to the development of a novel cooling technique. “We slanted the walls of the equipment rack,” Pulleyblank says, to pull more air through the equipment. The cold-air intake, normally on the front of a rack, is wider at the bottom than usual, and the hot-air exhaust side is wider at the top. As a result, more cold air is drawn into the rack at the bottom and more hot air is discharged at the top. “This gained us 13 degrees of cooling,” he says.

Will It Play in Peoria?
While IBM appears to be on track to deliver Blue Gene/L in early 2005 (perhaps earlier) and the full-blown Blue Gene around 2006, the growing interest in clusters of commodity servers raises a question: Are dedicated supercomputers a dying breed?

Currently, two of the top 10 computers are clusters, and there are 55 Intel-based and eight AMD-based PC clusters among the top 500 (see www.top500.org). Add in clusters built from Sun Microsystems and Hewlett-Packard machines (including the former Compaq Alpha-based systems), and there are 93 cluster systems among the top 500 computers.

But some experts are quick to defend dedicated supercomputers. “To be at the cutting edge of computational science, you need the most powerful computers,” said Jim Decker of the Department of Energy’s Office of Science at a recent IDC forum.

Decker notes that the Earth Simulator was certainly expensive -- the Japanese government invested about $350 million over four years. However, it appears to be about 5 percent to 10 percent more efficient than other systems. This makes it “not look so expensive, especially when you figure in the productivity of the scientists,” Decker says. The Earth Simulator can be used for quantitative prediction and assessment of variations of the atmosphere, ocean, and solid earth; this includes high-resolution models of atmospheric circulation and climate modeling.

IBM is hoping that the combination of expanded performance, more efficient data access for processors, and lower operational costs will give Blue Gene a big leg up in the high-performance computing world.
