In late 2010, the U.S. Air Force built its own supercomputer, one that would drastically reduce the time needed for pattern recognition, image analysis, and artificial intelligence research. It was one of the most powerful supercomputers in the world, capable of analyzing billions of pixels of extremely high-resolution satellite imagery per minute. However, it was not built from traditional parts, but from a cluster of 1,760 PlayStation 3 gaming consoles.
The PS3 was a seventh-generation console made by Sony, released in 2006. It competed primarily against Microsoft’s Xbox 360 and Nintendo’s Wii. At the time, the PS3 was a powerful console with the ability to run custom software. These qualities made it attractive to researchers who wanted large amounts of computing power without the high costs associated with traditional supercomputers.
The PS3 was not the first console to be used in this way, though, and it wasn’t even the first PlayStation used for this.
Console supercomputers
In 2002, Sony released a Linux kit for the PlayStation 2, which allowed the console to be used as a personal computer. This opened up the PS2 to researchers who wanted to use its processing power. Craig Steffen, a senior researcher at the National Center for Supercomputing Applications (NCSA), said, “They built the bridges so that you could write the code, and it would work.”
Steffen was on a team that, in 2002, tried to make a supercomputer from a cluster of PS2s. Between 60 and 70 consoles were hooked up together, but unfortunately, the system proved unreliable: “It worked okay, it didn’t work superbly well,” Steffen said. The system suffered from bugs that the team simply could not fix.
The team abandoned the project soon after. The PS3’s arrival marked one of the biggest generational jumps in performance in the history of gaming consoles, offering 37 times the FLOPS of the PS2. FLOPS, or floating-point operations per second, is a measure of raw computing performance. It is a useful indicator, although not a foolproof measure of practical performance, much as cars with the same horsepower can differ wildly in speed.
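To make those numbers a little more concrete, here is a minimal Python sketch, purely illustrative and not anything the researchers ran, that estimates a machine's achieved floating-point rate by timing a NumPy matrix multiplication (a dense n-by-n multiply takes roughly 2 * n^3 floating-point operations):

# Rough FLOPS estimate: time a large matrix multiplication.
# A dense n x n matrix multiply performs roughly 2 * n**3 floating-point operations.
import time
import numpy as np

n = 2048
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
c = a @ b
elapsed = time.perf_counter() - start

flops = 2 * n**3 / elapsed
print(f"~{flops / 1e9:.1f} GFLOPS achieved in a {n}x{n} multiply ({elapsed:.3f} s)")

As with the horsepower analogy, the number this prints depends heavily on the workload and the libraries behind it, which is why peak FLOPS figures rarely translate directly into real-world performance.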
With the release of the PS3, which could also run Linux software reliably, researchers now had a stable, powerful system to work with.
Black hole researcher Gaurav Khanna from the University of Massachusetts Dartmouth recognized the PS3 as a potential tool for his work. Theoretical astronomy is heavily reliant on computer simulations, which naturally require enormous amounts of processing power to accurately simulate unfathomably complex astronomical objects like black holes. But this level of computing power is expensive.
Because this kind of research has less immediate impact on society than fields such as cancer research or technology development, funding is usually scarce. As a result, Khanna and his colleagues were looking for a cheap source of processing power, which they found in the PS3.
Well, actually a lot of PS3s — 176 of them in total.
While the cost of 176 consoles is astronomical (pun intended) for the average gamer, Khanna saved millions by using the PS3. He and his colleagues used the machine to study black holes, crunch huge calculations, and even win cryptography competitions.
The Air Force’s Condor Cluster
For the U.S. Air Force, financial constraints were less of an issue, but it too built a PS3 supercomputer, one that used 10 times as many consoles as Khanna’s system.
The Air Force assembled 1,760 PS3 consoles into a supercomputer called the Condor Cluster. The consoles were connected by five miles of wiring, and at its peak the system ranked as the 35th most powerful supercomputer in the world.
Raw performance was not the Condor Cluster’s only goal, however; it was also a demonstration of supercomputing with low energy consumption. The highly efficient PS3 consoles allowed the Condor Cluster to use just 10% of the energy of comparable supercomputers at the time.
It produced around 500 TFLOPS (for comparison, the NVIDIA RTX 8000 graphics card, one of the most powerful on the market, delivers 16.2 TFLOPS) and was used by the Air Force to process high-resolution satellite images.
The Condor Cluster could also run “learning” algorithms that read text and accurately filled in missing information. This was particularly useful for Air Force intelligence operations, as it allowed gaps in valuable documents to be filled in.
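The article doesn’t say which algorithms were involved, but the general idea of predicting missing text from its surrounding context can be sketched in a few lines of Python. The toy bigram model below is a stand-in for illustration only, not the Condor Cluster’s actual software:

# Toy illustration of filling in missing text: a bigram model that guesses
# a blanked-out word from the words around it.
from collections import Counter, defaultdict

# A tiny made-up corpus; a real system would learn from far more text.
corpus = (
    "the aircraft departed the base at dawn "
    "the aircraft returned to the base at dusk "
    "the convoy departed the base at noon"
).split()

# Count which word tends to follow each word in the corpus.
following = defaultdict(Counter)
for prev, word in zip(corpus, corpus[1:]):
    following[prev][word] += 1

def fill_gap(before, after):
    """Guess the missing word between `before` and `after`."""
    candidates = following[before]
    if not candidates:
        return "<unknown>"
    # Favour candidates that the corpus has also seen directly before `after`.
    scored = {w: count + (1 if after in following[w] else 0)
              for w, count in candidates.items()}
    return max(scored, key=scored.get)

# "the convoy departed the ____ at noon" -> guesses "base"
print(fill_gap("the", "at"))

Real systems of this kind are far more sophisticated, but the principle is the same: learn statistical patterns from large amounts of text, then use them to reconstruct what is missing.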
The Air Force’s time with the Condor Cluster was rather brief, as continuous improvements in processing power and falling costs soon made the machine redundant. A lack of replacement consoles and the release of new console generations marked the end of the project.
Many of the PS3s used in the supercomputer were sold off, and some even went to Khanna.