By Salvatore Salamone
March 7, 2002 | An anthrax research project being conducted by Oxford University with the help of Intel, Microsoft, and United Devices demonstrates how the tremendous computing power many companies have sitting idle could be applied to research projects. The Anthrax Project creates a virtual supercomputer by tapping spare processing power offered by Internet-connected volunteers. The goal is to screen 3.5 billion molecules in silico to identify those likely to bind to a protein known to mediate anthrax toxicity and, ideally, to inhibit that toxicity.
Anthrax toxin has three protein components, one of which forms a ring and facilitates entry to cells. Typically, this ring binds to the “lethal factor,” another protein, and transports it into the cell. Recent Harvard University research has identified and characterized this binding site. The Oxford research project is searching for small molecules that have the correct characteristics to attach to the ring, and thus inhibit the binding with the lethal component. Likely molecules identified through the virtual screening process will be examined in more detail in the lab.
“Instead of spending millions of dollars and years in the lab screening maybe hundreds of thousands of molecules, now it is possible to screen hundreds of millions of molecules in months,” says Graham Richards, chairman of the chemistry department at Oxford and director of the National Foundation for Cancer Research (NFCR) Centre for Computational Drug Design. Oxford’s cancer research project also “harvests” unused computing cycles donated by volunteers.
United Devices’ contribution to the projects is its distributed computing platform. Microsoft, through its .NET architecture, brings an underlying enabling technology for conducting distributed computing over the Internet. Intel, through its Intel Philanthropic Peer-to-Peer Program, brings peer-to-peer (P2P) computing expertise. Intel is also sponsoring Alzheimer’s disease and protein-folding projects. Combined, the four research programs have more than 1 million computers that have contributed more than 700 million hours of processing time.
In both the anthrax and cancer research projects, volunteers download a small software program that runs in the background whenever their computer is turned on. The program has a conformational engine that evaluates the different ways a molecule could interact with the target protein. On a computer with a 750-MHz processor, each molecule takes about one to two minutes to evaluate. Each volunteer’s computer downloads 100 molecules at a time to test.
Typically, one or two molecules of the 100 tested are identified as candidates for closer inspection. Once all 100 molecules are tested, the volunteer gets another batch to process.
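The work-unit model described above can be sketched in a few lines. This is a hypothetical illustration, not the actual United Devices client: the function and variable names are invented, and the random scorer stands in for the real conformational engine.

```python
import random

BATCH_SIZE = 100            # molecules per work unit, as described in the article
CANDIDATE_THRESHOLD = 0.98  # hypothetical binding-score cutoff (not from the article)

def score_molecule(molecule_id, rng):
    """Stand-in for the conformational engine.

    The real client evaluates many conformations of each molecule against the
    target protein; this stub just draws a pseudo-random score in [0, 1).
    """
    return rng.random()

def process_batch(batch, rng):
    """Score every molecule in a work unit and keep the promising ones."""
    return [m for m in batch if score_molecule(m, rng) >= CANDIDATE_THRESHOLD]

rng = random.Random(42)
batch = [f"mol-{i:07d}" for i in range(BATCH_SIZE)]
candidates = process_batch(batch, rng)
print(f"{len(candidates)} of {len(batch)} molecules flagged for closer inspection")
```

With a threshold this strict, only a handful of each 100-molecule batch survives, matching the article's "one or two" figure; the real client would then upload the candidates and request the next batch.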
The software running on a volunteer’s computer reports how many candidate molecules that machine has identified, but volunteers will not know whether one of their candidates eventually leads to a solution to this problem.
Well-Suited to Private Research
While university-based projects such as the anthrax and cancer research efforts draw on the generosity of the general public, companies can use the same techniques to tap unused corporate computing power for internal research. The approach can be applied to any research that requires virtually screening large numbers of different items against a fixed set of specific tests.
These applications all test many candidates against a single database item. Alternatively, “you could take any large database, divide it up and distribute pieces to thousands of computers,” says Ed Hubbard, CEO of United Devices. “When a search needs to be done, you send a query string to all the machines.”
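Hubbard's partition-and-query idea can be sketched as a simple scatter-gather search. This is a simulation only: the "nodes" here are ordinary function calls, and all names are invented for illustration; in a real deployment each shard would live on a different volunteer PC.

```python
def partition(database, n_nodes):
    """Split a database (here, a list of records) into one shard per node."""
    return [database[i::n_nodes] for i in range(n_nodes)]

def node_search(shard, query):
    """Each node scans only its own shard for the query string."""
    return [record for record in shard if query in record]

def distributed_search(database, query, n_nodes=4):
    """Scatter the query to every shard, then gather the hits."""
    hits = []
    for shard in partition(database, n_nodes):
        hits.extend(node_search(shard, query))
    return hits

compounds = ["aspirin", "ibuprofen", "acetaminophen", "naproxen", "ketoprofen"]
print(distributed_search(compounds, "profen"))
```

Because each machine holds only a fraction of the database, every node's scan is small and the query fans out to all of them in parallel, which is what makes the approach attractive for libraries of millions of compounds.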
The database can be anything from the genome to a pharmaceutical company’s library of millions of proprietary compounds. The advantage of using a distributed approach is that “you can use the computers you already have,” says Scott Griffin, Intel Philanthropic P2P program manager. “You get to use these systems 100 percent of the time: when people are sleeping, in meetings, or at lunch.”
Processing power adds up quickly. For example, a company with a mix of 2,000 older Pentium-class, 166-MHz computers and 100 newer Pentium III, 1-GHz PCs used mostly for office applications can achieve an aggregate processing power of about 240 gigaflops (a gigaflop is one billion floating-point operations per second). That is equivalent to four Sun Enterprise 10000 servers, according to United Devices.
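A back-of-the-envelope calculation shows how the 240-gigaflop figure could arise. The sustained rate of roughly half a floating-point operation per clock cycle is an assumption made here to match the article's number, not a figure United Devices published.

```python
# Assumed sustained floating-point throughput per clock cycle (hypothetical;
# chosen so the totals line up with the article's 240-gigaflop figure).
FLOPS_PER_CYCLE = 0.55

machines = [
    (2000, 166e6),  # 2,000 Pentium-class PCs at 166 MHz
    (100, 1e9),     # 100 Pentium III PCs at 1 GHz
]

# Total clock throughput is 2,000 * 166 MHz + 100 * 1 GHz = 432 GHz of cycles;
# at ~0.55 flops per cycle that is roughly 238 gigaflops.
aggregate_flops = sum(count * clock_hz * FLOPS_PER_CYCLE for count, clock_hz in machines)
print(f"{aggregate_flops / 1e9:.0f} gigaflops")
```

Notably, most of the aggregate comes from the 2,000 slow machines, not the 100 fast ones, which is why harvesting idle desktops at scale pays off.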
One worry companies might have about such a distributed approach is the security of the data. Could a competitor eavesdrop? In the Oxford research projects, the data transferred to and from each volunteer’s computer is protected with 168-bit triple Data Encryption Standard (3DES) encryption, so eavesdroppers cannot read any information they happen to capture.
A Rich History
In concept, the Anthrax Project is similar to past large-scale Internet-based distributed computing efforts. For example, SETI@Home (the Search for Extraterrestrial Intelligence at Home) uses volunteers to analyze large volumes of radio-telescope data for recognizable patterns that might indicate signals sent from intelligent extraterrestrial sources. Since the project started in July 1999, more than 3.5 million people have helped out, contributing an aggregate of about 876,000 years of CPU time and counting.