Volume 2, Issue 14
Sure, supercomputers are fast and powerful, but what do they really DO? Scientists are increasingly turning to supercomputers to perform calculations and do mathematical modeling they couldn’t do with standard machines. They allow researchers to open new windows into phenomena as vast as the Universe and as small as nanoparticles.
Supercomputers, also called high-performance computers, depend on the design of algorithms, software systems, and computer hardware to deliver the computing power needed to tackle the most computationally challenging problems.
Over the past few decades, increases in computing power have largely tracked Moore's Law: the observation that the number of transistors on an integrated circuit has generally doubled every 18 to 24 months. At the end of the 1980s, computers were capable of performing a billion arithmetic operations per second. Ten years later, computing technology had advanced to the point that a trillion arithmetic operations per second was possible. In 2008, computers capable of a quadrillion operations per second were deployed.
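As a rough illustration of the pace described above, a thousandfold jump roughly every decade, one can compute the performance-doubling period those milestones imply (the function name and the exact year spans are illustrative assumptions, not figures from the article):

```python
import math

def implied_doubling_period(start_ops, end_ops, years):
    """Months between doublings needed for performance to grow
    from start_ops to end_ops over the given number of years."""
    doublings = math.log2(end_ops / start_ops)  # 1000x is ~9.97 doublings
    return years * 12 / doublings

# Milestones from the article: ~1e9 ops/s (late 1980s),
# ~1e12 ops/s (late 1990s), ~1e15 ops/s (2008).
print(f"{implied_doubling_period(1e9, 1e12, 10):.1f} months")  # roughly one doubling per year
print(f"{implied_doubling_period(1e12, 1e15, 9):.1f} months")
```

Note that a doubling period of about a year is faster than the 18 to 24 months of transistor doubling alone; supercomputer performance also grows by adding more processors in parallel.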
Image of the Blue Waters supercomputer. Image credit: NCSA/University of Illinois.
However, packing all of this processing power into ever-smaller spaces has created a problem: significant heat. To add processing power, computer chips must run at faster speeds, which means packing more transistors onto chips and pushing electricity through those transistors at higher rates, and that produces more heat. Some supercomputers use huge room-sized fans to blow cold air through the computer room. Others have liquid coolant, like that used in the refrigerator in your kitchen, pumped right around the computer chips. But even then, computer manufacturers can only run the chips so fast before the heat they generate damages the electronics.
Last week, two supercomputer facilities supported by the National Science Foundation were formally declared open for use by the science and engineering research community at large. Stampede, at the University of Texas at Austin's Texas Advanced Computing Center (TACC), was dedicated on Wednesday, March 27, 2013. The second, Blue Waters, at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, was dedicated on Thursday, March 28, 2013.
These two significant systems are part of NSF's comprehensive strategy for advanced computing infrastructure to facilitate transformative foundational research in computational and data-intensive science and engineering across all disciplines.
Image of Thom Dunning.
Who thinks of this stuff? Professor Thom Dunning is the Director of the National Center for Supercomputing Applications (NCSA) and Distinguished Chair for Research Excellence in Chemistry at the University of Illinois at Urbana-Champaign. He is a chemist by training and spent part of his career serving at the U.S. Department of Energy. While Professor Dunning's primary job is leading NCSA, he continues his chemistry research and loves to advise and mentor students. During his free time, Professor Dunning enjoys spending time with his family and 9 grandchildren, who range in age from 5 to 18.
Learn more about Stampede at UT-Austin at: http://www.tacc.utexas.edu/resources/hpc/stampede and about its dedication at: https://www.nsf.gov/news/news_summ.jsp?cntn_id=127194&org=NSF&from=news.
Read more about Blue Waters at: http://www.ncsa.illinois.edu/BlueWaters/ and about its dedication at: https://www.nsf.gov/news/news_summ.jsp?cntn_id=127193&org=NSF&from=news.
Read more about Professor Thom Dunning at: http://www.ncsa.illinois.edu/AboutUs/Leadership/dunning.html.