I'm fascinated to discover a novel use for Sony's PlayStation 3 game consoles.
Dr. Gaurav Khanna and his colleagues at the University of Massachusetts Dartmouth have put together a network of 16 PS3s to form what they call the "gravity grid". I was at a loss to figure out why they'd use game consoles instead of "proper" computers for the purpose until I read their explanation:
The Sony PlayStation 3 has a number of unique features that make it particularly suited for scientific computation. To start with, the PS3 is an open platform, which essentially means that one can run different system software on it, for example, PowerPC Linux. Next, it has a revolutionary processor called the Cell processor, which was developed by Sony, IBM and Toshiba. This processor has a main CPU, called the PPU, and several (six for the PS3) special compute engines, called SPUs, available for raw computation. Moreover, each SPU performs vector operations, which implies that it can compute on multiple data in a single step. Finally, its incredibly low cost makes it very attractive as a scientific computing node that is part of a cluster. In fact, it's highly plausible that the raw computing power per dollar that the PS3 offers is significantly higher than anything else on the market today!
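The "vector operations" the team mentions are what's usually called SIMD (single instruction, multiple data). Here's a toy sketch of the idea in Python (my own illustration, not the team's code): a scalar loop handles one element per step, while a SIMD operation handles a whole group of elements at once, the way a 128-bit SPU register holds four 32-bit floats and adds them in a single instruction.

```python
def scalar_add(a, b):
    # plain scalar processing: one addition per step
    return [x + y for x, y in zip(a, b)]

def simd_add(a, b, lane_width=4):
    # SIMD-style processing: each pass handles lane_width elements
    # "at once", mimicking a 128-bit register that holds four
    # 32-bit floats and adds them in a single instruction
    out = []
    for i in range(0, len(a), lane_width):
        out.extend(x + y for x, y in zip(a[i:i + lane_width],
                                         b[i:i + lane_width]))
    return out
```

Both functions compute the same result, of course; the point is that on real SIMD hardware each pass of the second loop is one machine instruction, so the work finishes in a quarter of the steps.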
Well, color me impressed! You can read more about the team's use of PS3s in their research into "Binary Black Hole Coalescence using Perturbation Theory" here (and no, I have no idea what that means!).
It seems that Dr. Khanna isn't the only one to discover this novel use for the PS3. Dr. Todd Martinez, Professor of Chemistry at the University of Illinois, runs a cluster of machines from IBM using the same Cell chip technology as the PS3, and links them to graphics processor chips from Nvidia. Dr. Martinez' research examines:
... how molecules behave when you shine light on them - which has wide ramifications for the fields of agriculture, solar energy and the study of human vision.
"We have done tests of algorithms and Nvidia cards are four to five times faster than the Cell chip, which is 20 times faster than an ordinary high end computer," he says.
Because both technologies can be classified as "stream processors", they are highly suited to processing massive volumes of data, unlike the general-purpose processors in ordinary computers.
"Some people think it's just about having a faster computer. They don't realise how big a change it is to do computing at your desk after accessing a computer in a room somewhere where you have to wait around for results.
"Of course it does cost less, but what needs to be recognized is that it also changes the way people think about problems when they are given a hundred times more computer power."
More about Dr. Martinez' work (and Dr. Khanna's) here.
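For the curious, the "stream processor" model described above can be sketched in a few lines (my own toy illustration, not Dr. Martinez's actual code): a fixed kernel function is applied independently to every element of a large data stream, and it's that independence that lets GPUs and the Cell's SPUs run thousands of these operations in parallel.

```python
def kernel(sample):
    # some fixed per-element computation; the specific formula here
    # (scale and offset a sample value) is just a placeholder
    return sample * 0.5 + 1.0

def stream_process(stream, kernel):
    # on real stream-processing hardware, each application of the
    # kernel is independent, so they all run concurrently across
    # many compute units instead of one after another
    return [kernel(x) for x in stream]
```

A general-purpose CPU works through the stream one element at a time; a stream processor throws hundreds of compute units at it simultaneously, which is where those 20x-to-100x speedups come from.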
I'm from the "old school" of computers. I started working with them in the 1970s using Siemens and Telefunken military systems and IBM System/360 and System/370 mainframes. I went on to work with progressively larger IBM mainframes, DEC PDP and VAX minicomputers, the first PCs, and so on. I can still recall when, in 1976, the Shell oil company subsidiary in South Africa upgraded its mainframe computer from 1MB of core memory (little magnetized rings wrapped around a wire framework) to 2MB. This is what core memory (the predecessor of today's memory chip) looked like, magnified fifty times:
There were screams of outrage from the bean-counters, along the lines of "What does an oil company need with two megabytes of memory? You'll never use that much!" Well, I'm typing this post on a desktop computer with two gigabytes of memory - a thousand times more than that mainframe upgrade! I guess that answers the memory-expansion critics from thirty-plus years ago . . .
I've never used a games console: but as a former computer systems engineer, I'm definitely impressed by this evidence of their raw power. I might just have to get hold of one and play around with it to see what it can do.