
Friday 12 November 2010

Jaguar's supercomputing reign coming to an end?

A timeline of supercomputing speed. (Credit: AMD)
The Jaguar supercomputer, housed at Oak Ridge National Laboratory in Tennessee, has been the fastest supercomputer on the planet for almost a year. But is it about to lose that title and its place atop the podium?

Every six months, the Top500 project releases its ranking of the most powerful supercomputers, and the current pace of technology development means the list tends to reorder with each edition. Jaguar, though, has stayed at the top of the food chain. The new list doesn't get released until next week, but it's been widely assumed that Jaguar will be taken down by a supercomputer built by China's National University of Defense Technology and housed at the National Supercomputing Center in Tianjin.

Jaguar narrowly avoided being overtaken in June, the last time the rankings were released. The Nebulae supercomputer, located at the National Supercomputing Center in Shenzhen, came in second, achieving 1.271 petaflops (1.271 quadrillion floating-point operations per second) running something called the Linpack benchmark.
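For readers keeping score of the units: a petaflop is 10^15 floating-point operations per second, and the Linpack figure (reported by Top500 as Rmax) is simply that rate measured while the machine solves a very large dense system of linear equations. A quick sketch of the conversion in Python, using the Nebulae figure quoted above:

```python
# Convert a Linpack result quoted in petaflops into raw floating-point operations per second.
PETA = 10 ** 15  # 1 petaflop = one quadrillion floating-point operations per second

nebulae_pflops = 1.271                  # Nebulae's Linpack result, as quoted above
nebulae_ops_per_sec = nebulae_pflops * PETA

print(f"{nebulae_pflops} petaflops = {nebulae_ops_per_sec:.3e} operations per second")
# -> 1.271 petaflops = 1.271e+15 operations per second
```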

But it appears that Jaguar's lead has been overcome this time. There have been reports about it over the last few weeks, and President Barack Obama even mentioned it during a speech last week:

"And we just learned that China now has the fastest supercomputer on Earth--that used to be us. They're making investments because they know those investments will pay off over the long term," he said.

Supercomputers can be compared on many factors, but the Top500 list is ordered based on the results of the Linpack benchmark. Even if the new list places the Tianjin supercomputer above Jaguar, it doesn't necessarily mean the U.S. is getting bumped from its perch atop supercomputing, argue two scientists who work at Oak Ridge.
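Because the ordering comes down to sorting machines by their measured Linpack rate, the ranking step itself is trivial to reproduce. Here is a minimal sketch in that vein; the machine names come from the article, but the petaflops values are placeholders, not the official November 2010 results:

```python
# Sort a handful of systems by Linpack performance, the way the Top500 list is ordered.
# Performance values below are placeholders for illustration only.
systems = [
    ("Jaguar (Oak Ridge)", 1.76),    # placeholder
    ("Nebulae (Shenzhen)", 1.271),   # figure quoted earlier in the article
    ("Tianjin system (NUDT)", 2.5),  # placeholder
]

ranked = sorted(systems, key=lambda entry: entry[1], reverse=True)
for rank, (name, pflops) in enumerate(ranked, start=1):
    print(f"#{rank}: {name} at {pflops} petaflops")
```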


"What you find historically with these supercomputers is they become the normal machines 5 or 10 years later that everybody uses."
--Jeremy Smith, Center for Molecular Biophysics
"China might have the largest number of cores in one computer, so theoretically they have the most powerful computer. But they maybe don't have the most powerful scientific codes yet that use that computer," said Jeremy Smith, director of the Center for Molecular Biophysics at the University of Tennessee, in an interview. "So from that perspective, they may not be at the same level as Oak Ridge."

Jaguar comprises more than 250,000 AMD Opteron cores, running extremely sophisticated computer programs that try to answer complex questions like why ribosomes (the components of cells that assemble proteins from amino acids) depend on magnesium, how to simulate making more environmentally friendly ethanol out of plant material, and how to predict climate change. Jaguar's specialty is getting all those cores running together extremely efficiently, which is a separate and perhaps harder task than just building a really powerful computer.

Smith says that the projects at Oak Ridge National Laboratory run extremely efficiently on Jaguar, and the scientific value of the computing is therefore very high.
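"Running efficiently" on a machine like Jaguar usually means parallel efficiency: how close the speedup on a huge core count comes to the ideal linear speedup. A back-of-the-envelope sketch, with hypothetical timings rather than Oak Ridge's actual measurements:

```python
# Parallel efficiency: measured speedup divided by the ideal speedup from adding cores.
# All timings below are hypothetical, purely to illustrate the calculation.

def parallel_efficiency(base_cores, base_seconds, scaled_cores, scaled_seconds):
    """Strong-scaling efficiency: 1.0 means perfect scaling, lower means overhead."""
    speedup = base_seconds / scaled_seconds
    ideal_speedup = scaled_cores / base_cores
    return speedup / ideal_speedup

# Hypothetical: a job that takes 1000 s on 1,000 cores and 5.2 s on 250,000 cores.
eff = parallel_efficiency(1_000, 1000.0, 250_000, 5.2)
print(f"Parallel efficiency: {eff:.0%}")  # roughly 77% with these made-up numbers
```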

China's supercomputer is based on GPUs (graphics processing units), in this case built by Nvidia, and it is technically faster because the CPUs (central processing units) use the GPUs to accelerate their work. But if you don't get the software to run on it properly, it's actually harder to use, said Roland Schultz, a graduate student at the University of Tennessee's Center for Molecular Biophysics.
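The appeal of the GPU approach is raw arithmetic: a hybrid node's theoretical peak is roughly the CPU's peak plus the accelerator's much larger one, but Linpack and real applications only see whatever fraction of that peak the software can actually sustain. A rough sketch with entirely hypothetical per-device numbers (not the Tianjin machine's specifications):

```python
# Rough peak-performance arithmetic for a hybrid CPU+GPU node.
# Per-device figures and node count are hypothetical, chosen only to show why
# accelerators push a machine up the rankings when software can actually use them.

cpu_peak_gflops = 100.0     # hypothetical multi-core CPU peak, in gigaflops
gpu_peak_gflops = 500.0     # hypothetical GPU peak, in gigaflops
nodes = 7_000               # hypothetical node count

peak_pflops = nodes * (cpu_peak_gflops + gpu_peak_gflops) / 1e6  # gigaflops -> petaflops
print(f"Theoretical peak: {peak_pflops:.2f} petaflops")

# The catch Schultz describes: real codes only reach a fraction of that peak
# unless the software keeps the GPUs busy.
achieved = 0.5 * peak_pflops   # hypothetical 50% efficiency
print(f"Achieved at 50% efficiency: {achieved:.2f} petaflops")
```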

Schultz says he is much more interested in the Gordon Bell Prize, which the Association for Computing Machinery awards to the most innovative scientific application of supercomputing. Teams from Oak Ridge won it most recently in 2008 and 2009, for research into high-temperature superconductivity, or sending electricity over long distances at relatively high temperatures with no transmission loss.

But do we make too much of who's faster? Smith put it in perspective.

"What you find historically with these supercomputers is they become the normal machines 5 or 10 years later that everybody uses," said Smith. "The Jaguar machines that we're so amazed at right now, it could be every university or company has one" eventually.

We'll know exactly how things have shaken out next week when the Top500 list is released. But even if Jaguar does get hunted down by a Chinese supercomputer, it's not as if the folks at Oak Ridge are sitting still. The Department of Energy, which owns Oak Ridge's supercomputer, is already looking at moving from current peta-scale computing (a quadrillion floating point operations per second) to exa-scale computing (a quintillion floating point operations per second), a thousand times faster than Jaguar's current capability.

"To get there in the next 5 to 10 years, to get to 10 million cores in one room, is a major technical challenge," noted Smith. "It's going to be fundamentally different than before. It's a hardware problem, and getting the software working is a major challenge indeed."


Erica Ogg is a CNET News reporter who covers Apple, HP, Dell, and other PC makers, as well as the consumer electronics industry. She's also one of the hosts of CNET News' Daily Podcast. In her non-work life, she's a history geek, a loyal Dodgers fan, and a mac-and-cheese connoisseur. E-mail Erica.

Comments (4)
by Otto Holland November 11, 2010 5:50 PM PST
The article mentioned 250,000 cores of AMD Opteron. I am curious to know if those processors are the 4-core parts or the new Barcelona 12-core parts on 32 nm. If they are the older 4 or 6 cores, they can be swapped out for the new 12 cores, because they use the same ZIF socket. Just wondering....

by rip_saw November 11, 2010 6:19 PM PST
Durr, who has the most cores and flops is meaningless now. Last I checked, Folding@home trumps the crap out of anything in China, and Google's servers totally destroy any supercomputer, although they are not being used for that purpose. I understand the use of a single computer, but for many projects, it's just not needed.

by dralw65 November 11, 2010 6:42 PM PST
This is a good article that is very informative; however, the statement about ribosomes appears incorrect: ribosomes synthesize proteins from amino acids. Amino acids are not made by ribosomes.

by realityenigma November 11, 2010 7:46 PM PST
When I first read this (on slashdot.org) I was concerned myself. However, I was directed to an interesting link about a supercomputer (US built) that will be ready in 2012: http://bits.blogs.nytimes.com/2009/02/03/ibms-sequoia-supercomputer-to-shatter-speed-records/ http://en.wikipedia.org/wiki/IBM_Sequoia I am sure you guys can find more articles if you are interested; nevertheless, I think we can rest easy if we are worried about speed records.
