The Gift of Gigaflops

Distributed computing makes supercomputers of PCs

by Neil McAllister, Special to SFGate
(Originally published Wednesday, November 3, 1999. Editor: Amy Moon)

The holiday season is fast upon us yet again. Soon we'll be trimming our respective trees and certifying our halls appropriately decked; but let's not forget that the Yuletide is also a time for giving.

There are a number of organizations on hand that are always appreciative of a donation, when we have a little to spare. And who else has more to spare — indeed, more excess — than the average American computer professional?

I refer, of course, to unused CPU cycles. The processing power of today's desktop PCs far surpasses what most users will ever need. So this season, why not donate your processor's extra power to one of the interesting projects now underway that are more than willing to take up the slack? Like cryptography or the search for UFOs. You can do it through a technology called "distributed computing."

Now, before you protest, I can predict what you're going to say. "Sure, I ordered that 500MHz Power Macintosh G4," you'll complain, "but that was before Apple ran into its CPU supply problems. Now I'm not even sure if they're going to ship me that back-ordered machine at all, and if they do, it'll probably be with a slower processor than I originally specified. I just don't have any megahertz to spare!"

But even if your venerable, pre-G4 machine can only limp glacially through those mathematical models of hydrogen bomb explosions — cake, to any worthy supercomputer — don't expect too much sympathy.

For one, you'd do well to take Apple's claims that the new chip is a "supercomputer" with a healthy dash of salt. With modern chip technology, it doesn't take too much for a single chip to reach what was once considered supercomputer performance.

According to government standards, even Sony's PlayStation 2 gaming console was a supercomputer, until the Clinton administration raised the legal performance limit. But today's true supercomputer doesn't rely on a single fast processor.

Instead, modern supercomputers contain many CPUs, often numbering in the thousands. They deliver their staggering performance by breaking up complex computational tasks into smaller pieces, and assigning each piece to a separate processor, to be performed simultaneously.
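The divide-and-conquer idea above can be sketched in a few lines of modern Python (an illustration only, not period software): split one big job into chunks and hand each chunk to a separate worker process, standing in for the separate CPUs of a supercomputer.

```python
# Toy illustration of parallel decomposition: sum a large range by
# splitting it into chunks, one per "processor", then combine the pieces.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    n, workers = 10_000_000, 4
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:
        # Each worker computes its piece simultaneously;
        # the partial results are combined at the end.
        total = sum(pool.map(partial_sum, chunks))
    print(total == sum(range(n)))  # the pieces add up to the whole
```

The combining step matters: a problem only parallelizes well when the pieces can be computed independently and merged cheaply, which is exactly why key searching and signal analysis suit this model.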

Don't let your eyes get bigger than your motherboard, though. One processor is probably more than enough for you or me. Digital video people will tell you there's never enough megahertz or hard drive space to crunch those giant MPEG files. But for the rest of us, how much good is a two gigaflop processor when we're waiting for a Web page to download, or while the cursor is blinking in an empty Microsoft Word document?

Whatever their processor speed, the truth is our PCs spend much of their time sitting idle on our desks. Maybe they're waiting for us to push the "OK" button, or idling while we flip through a software manual. We just don't challenge them enough. No matter how bloated and inefficient it is, even Windows 2000 will spend much of its time waiting for the user. That's where distributed computing can step in, to pick up the slack.

A distributed computing system works in almost the same way as the aforementioned modern supercomputer. But in this case, the individual processors need not coexist within the same box.

Instead, a distributed computer might use the Internet to assign tasks across the globe, to any CPU that wants to participate — even the one in your lowly desktop PC, even if it's only part-time. As of this writing, a number of these Internet-based distributed computing experiments are in the works, and at least two are already underway.

All current Internet-distributed projects share a common basic strategy. Every volunteer who wants to participate downloads and installs a piece of specially designed client software. This software activates itself automatically when it detects your computer is idle, much like a screen saver.

When activated, it contacts the controlling server of the distributed computing project, downloads its next "assignment," and keeps crunching until you need your CPU back again. Results are transmitted back to the project servers for tabulation.
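That fetch-crunch-report lifecycle is the same in every project's client. Here is a minimal sketch of it in Python; the functions are hypothetical stand-ins for what a real client does over the network, not any project's actual API.

```python
def machine_is_idle():
    # A real client detects idle time much like a screen saver does.
    # Hypothetical: always idle, so the sketch runs once.
    return True

def fetch_assignment():
    # Stand-in for contacting the project server; a real client would
    # download a block of keys to test or a slice of telescope data.
    return {"id": 1, "payload": range(1000)}

def crunch(work):
    # Stand-in for the real number-crunching on the assignment.
    return sum(work["payload"])

def report(result):
    # Stand-in for uploading results to the server for tabulation.
    print("reporting result:", result)

# The basic lifecycle every client shares:
while machine_is_idle():
    work = fetch_assignment()
    result = crunch(work)
    report(result)
    break  # one pass for the sketch; a real client loops indefinitely
```

The design is deliberately loose: the server never waits on any one volunteer, so a PC can drop out mid-assignment and the work unit is simply handed to someone else.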

The computational power of thousands of PC processors networked together — even part-time — could potentially rival that of today's most powerful supercomputers. But then, every CPU cycle counts, because each of the Internet distributed computing projects has pretty lofty goals.

The longest-running effort to date is hosted by distributed.net. Called "Project Bovine," its aim is to recover a 64-bit RC5 encryption key, as part of a contest hosted by the algorithm's patent holder, RSA Labs.

They've chosen a "brute force" method — meaning, rather than trying to reverse the encryption scheme, they're just going to try every single possible key until they hit the correct one. A simple idea, but no mean feat nonetheless.

As of April 1999, the project members were trying more than 70 billion keys per second on average; at that rate, they won't exhaust every key for about another 7.6 years. Don't count distributed.net out, though. They've already broken the earlier RC5-56 and DES algorithms using the same technique, and every new CPU helps.
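The brute-force strategy, and the arithmetic behind that multi-year estimate, both fit in a short Python sketch. The toy cipher below is a simple XOR with an 8-bit key, not RC5, but the search strategy is the same: try every key until one decrypts correctly. Note the calculation gives the time to sweep the entire 64-bit keyspace from scratch; "another 7.6 years" reflects the portion the project had already searched.

```python
# Toy brute-force search: recover a small XOR "key" by trying every
# possible value, the same strategy Project Bovine uses on RC5-64.
SECRET_KEY = 0x5A  # 8-bit key: only 256 possibilities, found instantly
ciphertext = bytes(b ^ SECRET_KEY for b in b"hello")

def try_key(k):
    return bytes(b ^ k for b in ciphertext) == b"hello"

found = next(k for k in range(256) if try_key(k))
print(hex(found))  # 0x5a

# Why 64 bits is such a grind: 2**64 keys at 70 billion keys/second.
keyspace = 2 ** 64
rate = 70e9  # keys per second, the April 1999 figure
years = keyspace / rate / (3600 * 24 * 365)
print(round(years, 1))  # roughly 8.4 years to sweep the whole keyspace
```

Each added bit doubles the keyspace, which is why RC5-56 fell to the same technique in months while RC5-64 was projected to take years.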

If cryptography doesn't interest you, perhaps you'd like to help out in the search for life in outer space? The SETI@Home project aims to do just that. Its members use their spare CPU time to analyze tiny portions of the massive data returned from the Arecibo radio telescope.

So far, SETI (Search for Extraterrestrial Intelligence) astronomers have had to rely on expensive custom mainframes to weed out signals of possible intelligent origin from the cosmic background noise. Now, the SETI@Home team at UC Berkeley wants to turn the task over to you, and hopefully thousands of others like you, one small piece at a time.

A number of other Internet-distributed computing projects will soon be available to your underworked CPU. The Casino-21 project at the UK's Rutherford Appleton Laboratory hopes to use the technology to model changes in global climate.

Distributed.net is preparing a few other cryptographic challenges, and according to their mission statement, they remain committed to "conducting and actively supporting distributed computing research of all kinds."

'Tis the season for giving, and an idle processor is the devil's workshop. So, download a distributed computing client today, and get crunching in your CPU's spare time.

Maybe you'll still envy the new generation of "supercomputer" CPUs. But whether your desktop PC runs at 500 MHz or a humble 50, by joining in on a distributed project your machine can become part of a massive computational system to rival mainframes.

Beat that, G4! And better still, no longer will you have to hang your head in shame at the painful truth, that you're probably barely using the CPU you've already got.


