I'm afraid I have to disagree,
-brazil-.
Supercomputers are used for exceptionally difficult problems, such as weather prediction and especially nuclear weapons research. The new Blue Mountain system, made by Silicon Graphics and installed in New Mexico, consists of well over 1,000 processors. Its purpose? Nuclear research, and it's currently one of the three fastest systems available for the job. Rather than banging away at a problem with a few centralized processors, such a machine breaks the problem into thousands of parts, works on them in parallel, then reassembles the results.

If you want another comparison, look at distributed computing with SETI@home. Thousands of folks are donating spare CPU ticks to the search for extraterrestrial life. The project breaks the problem into tiny chunks, sends them out to be processed, then reassembles the data. If a single supercomputer could crank through the data faster, they would've used a Cray and finished the job. It's simply faster to have many processors each do a small bit of the work.
WonkoDSane adds (relayed by rp): the other reason SETI doesn't use one big supercomputer is that the project simply can't afford one.