Inside Tsubame - the Nvidia GPU supercomputer

10.12.2008

"I think we should have been able to achieve 85 [T Flops], but we ran out of time so it was 77 [T Flops]," said Matsuoka of the benchmarks performed on the system. At 85T Flops it would have risen a couple of places in the Top 500 and been ranked fastest in Japan.

There's always next time: A new Top 500 list is due out in June 2009, and Tokyo Institute of Technology is also looking further ahead.

"This is not the end of Tsubame, it's just the beginning of GPU acceleration becoming mainstream," said Matsuoka. "We believe that in the world there will be supercomputers registering several petaflops in the years to come, and we would like to follow suit."

Tsubame 2.0, as he dubbed the next upgrade, should arrive within the next two years and will boast a sustained performance of at least a petaflop (1,000 teraflops), he said. The basic design for the machine is not yet finalized, but it will continue the heterogeneous approach of mixing CPUs and GPUs.
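To give a sense of what that heterogeneous CPU-plus-GPU approach looks like in practice, the sketch below is a minimal, illustrative CUDA program: the CPU prepares the data and the GPU does the parallel arithmetic. It is not code from Tsubame; the kernel, names, and sizes are hypothetical examples only.

```cuda
// Illustrative sketch of CPU+GPU heterogeneous computing (not Tsubame code).
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// GPU kernel: each thread scales one element of the array.
__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1 << 20;               // 1M elements (arbitrary size)
    size_t bytes = n * sizeof(float);

    // CPU (host) side: allocate and initialize the data.
    float *host = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) host[i] = 1.0f;

    // GPU (device) side: copy data over, launch the kernel, copy results back.
    float *dev;
    cudaMalloc(&dev, bytes);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    scale<<<blocks, threads>>>(dev, 2.0f, n);
    cudaDeviceSynchronize();

    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);
    printf("host[0] = %f\n", host[0]);   // expect 2.0

    cudaFree(dev);
    free(host);
    return 0;
}
```

In a large system the same pattern is scaled out: CPUs handle control flow, I/O and serial work, while thousands of GPU threads handle the data-parallel number crunching that drives the Linpack-style flops figures quoted above.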