Powering up the Grid
Supercomputer centres around the world are linking up over the internet to create a new generation of enormously powerful machines.
The networks of supercomputers are needed because some problems in science are just too large for any one machine to tackle by itself.
The software used to link supercomputers is also being used to give universities and research organisations with limited resources access to high-performance computers.
The first networks of interconnected supercomputers, or computational “grids”, have already been created in the US, and the US Government is spending around $100 million to find better ways to tie the machines together.
While all universities, research labs and companies have their own computers, most also have their own ways of doing experiments on these machines.
Just like an internet browser, the grid software puts a common face onto the network of supercomputers behind it.
This lets researchers submit their data in the same way every time, even though the machines doing the work may differ each time they use the Grid.
“We want to use these services to generate new knowledge,” said Rob Allan, a spokesman for the Daresbury Laboratory in Cheshire, who is helping co-ordinate UK work on Grid computers.
“Instead of using the web to look up existing data, we can log on and create new data by running applications on a computer or use instruments connected to the web,” said Mr Allan.
Already a research group led by Kurt Anstreicher and Nathan Brixius from the University of Iowa has used Grid software called Globus to tie together computers and solve the “nug30” Quadratic Assignment Problem (QAP), first posed in 1968.
Solving the problem involved over 1,000 computers at eight institutions scattered around the world, and took a total of 6.9 days. The researchers estimate that a single machine doing all the number crunching would have needed 6.9 years.
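A rough sanity check on those figures (my arithmetic, not the researchers'): finishing 6.9 years of single-machine work in 6.9 days is a speed-up of about 365, so a pool of roughly 1,000 machines delivered around a third of its combined capacity on average — plausible for borrowed computers that are only intermittently available.

```python
# Rough speed-up arithmetic for the nug30 computation.
# Figures are taken from the article; the efficiency estimate is illustrative.

single_machine_years = 6.9   # estimated time on one machine
wall_clock_days = 6.9        # actual elapsed time on the Grid
pool_size = 1000             # the article says "over 1,000" computers

speedup = (single_machine_years * 365) / wall_clock_days
efficiency = speedup / pool_size  # fraction of the pool's peak throughput achieved

print(f"speed-up: {speedup:.0f}x")               # 365x
print(f"parallel efficiency: {efficiency:.1%}")  # 36.5%
```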
The Grid software will see heavy use because the experiments planned in particle physics, astrophysics and climate modelling will produce so much data that they risk overwhelming existing supercomputers.
For example, the collaborative software will prove useful for analysing the data produced by the new particle accelerator, called the Large Hadron Collider, being built at the Cern laboratory in Switzerland.
The LHC will probe deeper into the structure of matter than ever before and in doing so will produce 1,000 times more data than the Swiss lab can handle.
All computers great and small
But the Grid is not the only project getting large numbers of computers working together.
Many universities and research labs now use networks of low-powered computers as surrogate supercomputers.
They are known as Beowulf clusters, after the name Nasa engineers Thomas Sterling and Don Becker gave to the first machine built in this configuration.
A similar approach was used to create the animated ants in the movie Antz.

So far, over two million people have downloaded the SETI@home screensaver that uses idle computers to search for signs of alien life.
The software, developed at the University of California at Berkeley, has now clocked up over 300,000 years of computer time.
Separately, two companies, ProcessTree and Parabon, are now starting to pay people to use their idle computers to crunch numbers for their customers. Like the SETI@home project, both companies have each machine download a package of data and process it locally, so the computer does not have to be online all the time.
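The fetch, crunch-offline, return pattern described above can be sketched as follows. Every name here — the functions, the data, and the analysis step — is an illustrative assumption, not the actual SETI@home, ProcessTree or Parabon protocol, each of which defines its own:

```python
# Minimal sketch of the "download a package, crunch it offline,
# return the result" model. The server interface is hypothetical;
# hashing stands in for real number crunching.

import hashlib

def fetch_work_unit() -> bytes:
    """Pretend to download a package of data while online."""
    return b"block-of-signal-data-0042"

def process(work_unit: bytes) -> str:
    """CPU-bound analysis needing no network connection.
    A hash is used here purely as a stand-in computation."""
    return hashlib.sha256(work_unit).hexdigest()

def submit_result(result: str) -> None:
    """Pretend to upload the answer the next time the machine is online."""
    print(f"submitting result {result[:12]}...")

work = fetch_work_unit()   # online: grab a chunk of work
result = process(work)     # offline: crunch during idle time
submit_result(result)      # online again: hand the answer back
```

The key design point the article highlights is that the compute step sits between two brief network exchanges, which is why the machine can stay disconnected while it works.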