National Science Foundation

Recreating the Early Universe

May 1996

Over the past several years, we have been treated to astounding pictures of the distant universe from sophisticated telescopes. Alongside images of galaxies, supernovae, and possible evidence of new planets, there often appear artistic renderings of the heavens, attempts to recreate events that took place billions of years ago.

Until recently, cosmology--the study of the origin and nature of the universe--was largely a science of speculation, without the tools or data to put theories to the test. As advances in observational technology bring more data into view, that which we cannot see also comes tantalizingly closer.


Today, the researchers who make up the Grand Challenge Cosmology Consortium (GC3) harness the power of supercomputers to look at the birth and infancy of the universe, starting from the Big Bang, the cosmic explosion believed to have started it all about 15 billion years ago. GC3 is a collaboration among cosmologists, astrophysicists, and computer scientists studying the formation of large-scale cosmological structure.

Now in the third year of a five-year NSF grant administered by the Division of Astronomical Sciences, the consortium is one of several groups participating in NSF's High Performance Computing and Communications (HPCC) "Grand Challenge" initiative.

Led by astrophysicist Jeremiah Ostriker of Princeton University, GC3 consists of seven principal investigators at six universities and over 30 professional staff members, postdoctoral research associates, and graduate student assistants. Consortium members use high-performance computers at all four NSF-supported National Supercomputing Centers to build three-dimensional models of galaxy formation, recreating the early universe as predicted by theory and glimpsed by the Hubble Space Telescope late last year. Along the way, they are testing competing theories to explain what our universe is made of and how it has evolved.


Supercomputers are the modern variant of the pencil and paper historically used to express astronomical observations in terms of physical laws and mathematical equations.

To test theoretical models, modern cosmologists numerically simulate scenarios in the early universe that could have produced what we see and measure now. Massively parallel, scalable high-performance computers are the only machines capable of performing the billions of calculations required to solve the equations describing the complex interactions of energy and matter over vast ranges of space and time.

But modeling is more than plugging in numbers. Much of the GC3 effort is directed toward developing new algorithms, programming models, and software technology that uses supercomputers on both large and small scales. This "multiscale" aspect of the GC3 endeavor is what makes it an NSF "Grand Challenge" computational project.


GC3 cosmologist Michael Norman at the National Center for Supercomputing Applications (NCSA) at the University of Illinois studies the nature of dark matter and its role in galaxy cluster formation. Scientists estimate that 90% of the matter in the universe does not emit enough light to be observed, but they know it exists because of its gravitational effects on stars and galaxies. This invisible mass is a key component of current versions of the Big Bang theory and is crucial to understanding how matter clusters under gravity.

But cosmologists do not agree on the form dark matter takes: whether it is "cold" matter such as dead stars or heavy exotic particles, or "hot," lightweight subatomic particles too elusive to detect. Astrophysicists know that hot dark matter alone could not have produced galaxies early enough to account for some of the largest superclusters of galaxies that are visible today. On the other hand, a universe filled with only cold dark matter would produce too many superclusters too fast.

According to Norman, one promising model combines cold and hot dark matter. The trick is to find the right mix, let it evolve in cyberspace, and see if it brings us to where we are now.

To do this, Norman and his team imagine space broken into a grid of cells. They determine how the matter in each of these cells will evolve by integrating equations for dark matter, gases, temperature, and gravity in an expanding universe. The grid is reproduced as a three-dimensional image of a large section of the universe, 340 million light-years wide. The simulation begins about 1 billion years after the Big Bang, when the universe was a much hotter and denser place than it is today. The grid is subdivided into over a hundred million cubes, reflecting the changes and fluctuations that occur as the universe cools and expands. The result is a map showing the behavior of gases and clusters of matter.
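
The flavor of such a grid calculation can be sketched in a few lines of Python. The toy code below is purely illustrative, not GC3's: it evolves a small periodic box of linearized overdensities under self-gravity, with the gravitational potential obtained from a Fourier-space Poisson solve, a technique widely used in cosmology codes. Gas physics, the hot/cold dark matter split, and cosmic expansion are all omitted, and the units and time step are arbitrary.

    import numpy as np

    N = 64                                          # cells per side (real runs used far more)
    rng = np.random.default_rng(0)
    delta = 1e-3 * rng.standard_normal((N, N, N))   # initial overdensity field
    velocity = np.zeros((3, N, N, N))               # peculiar velocity on the grid

    # Fourier-space wavenumbers for the periodic Poisson solve
    k = 2 * np.pi * np.fft.fftfreq(N)
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = 1.0                               # avoid dividing by zero at the mean mode

    def gravity(delta):
        # Solve the Poisson equation in Fourier space: phi_k = -delta_k / k^2
        phi_k = -np.fft.fftn(delta) / k2
        phi_k[0, 0, 0] = 0.0                        # the mean density sources no force
        phi = np.fft.ifftn(phi_k).real
        return -np.stack(np.gradient(phi))          # acceleration g = -grad(phi)

    dt = 0.01
    for step in range(100):
        velocity += gravity(delta) * dt             # kick: gravity accelerates the matter
        # drift: linearized continuity equation, d(delta)/dt = -div(velocity)
        delta -= sum(np.gradient(velocity[i], axis=i) for i in range(3)) * dt

A production code like Norman's does far more: it tracks the gas and both dark-matter species, and it runs the equivalent loop over hundreds of millions of cells on parallel hardware.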


In 1994, Norman and his NCSA colleague, Gregory Bryan, carried out the largest cosmological simulation to date, creating a universe composed of 60 percent cold dark matter, 30 percent hot, and 10 percent ordinary matter. The simulation, conducted on NCSA's Connection Machine-5 supercomputer, took over 30 hours to complete and produced a model that closely resembled recent observations made by the orbiting x-ray satellite ROSAT. While the simulation accurately predicted the number and arrangement of galaxy clusters, it did not exactly reproduce the measured ratio of gas to dark matter.

GC3 researchers need more refined simulation codes to improve the model's resolution and allow them to zoom into individual cubes on the grid. This would also let them dedicate more computational effort to the crucial regions where galaxy formation begins and experiment with different particle-based equations aimed at capturing the behavior of dark matter. Dark matter is a very fertile field for cosmologists. "Everyone is motivated to find out what it is," says Norman, "but there is nothing definitive yet."
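
The "zoom" idea itself is simple to state: flag the small fraction of cells where matter is piling up and spend the extra resolution only there. A hypothetical sketch in Python (the threshold and the random test field are invented for illustration; a real adaptive code would lay a finer subgrid over each flagged cell):

    import numpy as np

    def refine_flags(delta, threshold=5.0):
        # Mark cells whose overdensity exceeds the threshold for subdivision
        return delta > threshold

    # Toy demonstration on a clumpy random field (stand-in for simulation output)
    rng = np.random.default_rng(1)
    delta = rng.lognormal(mean=0.0, sigma=1.0, size=(64, 64, 64)) - 1.0
    flags = refine_flags(delta)
    print(f"refining {flags.sum()} of {flags.size} cells "
          f"({100 * flags.mean():.2f}% of the volume)")

The payoff is in the printed fraction: only a few percent of the volume needs the extra work, which is what makes zooming affordable.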

At the same time that supercomputers provide cosmologists with the tools to tinker with celestial ingredients, cosmological simulations also help push computer hardware and software systems to new limits. GC3 members work closely with the staffs of the supercomputer centers as well as the computer vendors to develop efficient strategies for storing, analyzing, and visualizing vast amounts of data.

With support from GC3, Joel Welling at the Pittsburgh Supercomputing Center has written a more efficient program to visualize three-dimensional data like that generated by Norman's simulations. The program -- the VFleet Distributed Volume Renderer -- runs in a parallel environment, on either a workstation cluster or a supercomputer, and can handle large datasets at high speed.
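
VFleet's internals are beyond this article, but the core operation of any volume renderer can be sketched briefly: march through the data cube slab by slab, accumulating emitted light and attenuating what lies behind. A simplified, axis-aligned version in Python, assuming nothing about VFleet's actual interface:

    import numpy as np

    def volume_render(cube, opacity_scale=0.05):
        # Front-to-back compositing along the z axis of a 3-D scalar field
        lo, hi = cube.min(), cube.max()
        emission = (cube - lo) / (hi - lo + 1e-12)   # normalize to [0, 1]
        image = np.zeros(cube.shape[:2])             # accumulated brightness
        transmit = np.ones(cube.shape[:2])           # light not yet absorbed
        for z in range(cube.shape[2]):               # march one slab at a time
            alpha = opacity_scale * emission[:, :, z]
            image += transmit * alpha * emission[:, :, z]
            transmit *= 1.0 - alpha                  # nearer slabs dim farther ones
        return image

    cube = np.random.default_rng(2).random((64, 64, 64))  # stand-in for simulation data
    image = volume_render(cube)                           # a 64x64 rendered view

Distributed renderers typically split the rays or the data blocks across processors and composite the partial results, which is how a program of this kind can keep pace with datasets of cosmological size.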


GC3 theoretical physicist Edmund Bertschinger of MIT is studying the evolution of matter and energy in the early universe, starting only a few minutes after the Big Bang. At that time the universe was filled with hot, dense matter that glowed from its heat like the interior of the Sun. Bertschinger and MIT postdoctoral researcher Paul Bode numerically integrate the equations governing radiation and ordinary and dark matter, together with Einstein's general relativistic theory of gravity. Their simulation produces a map of the glowing radiation left over from the Big Bang, which can be compared with maps of the cosmic microwave background radiation made recently by the Cosmic Background Explorer satellite, COBE.
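
How might a simulated map be checked against an observed one? A common approach, sketched here on a flat square patch rather than the full sky COBE actually measured, is to reduce each temperature map to a power spectrum and compare the amount of fluctuation scale by scale. The Python below is an illustrative stand-in, not the GC3 pipeline:

    import numpy as np

    def power_spectrum(temperature_map, nbins=16):
        # Azimuthally averaged power spectrum of a square map: how much
        # fluctuation the map contains at each angular scale
        fourier = np.fft.fftshift(np.fft.fft2(temperature_map))
        power = np.abs(fourier) ** 2
        n = temperature_map.shape[0]
        y, x = np.indices((n, n)) - n // 2
        radius = np.hypot(x, y).ravel()
        bins = np.linspace(0, radius.max(), nbins + 1)
        which = np.digitize(radius, bins)
        return np.array([power.ravel()[which == b].mean() if np.any(which == b) else 0.0
                         for b in range(1, nbins + 1)])

    # Stand-in maps; in practice these would be the simulated and observed skies
    rng = np.random.default_rng(3)
    sim_map, obs_map = rng.standard_normal((2, 128, 128))
    sim_spec = power_spectrum(sim_map)
    obs_spec = power_spectrum(obs_map)    # compare the two, scale by scale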

Bertschinger and computational physicist Robert Ferrell also study the formation of galaxies and clusters using models like those of Norman and Bryan. Their simulations omit the gas but achieve higher spatial resolution for the dark matter. In order to run their calculations on parallel supercomputers, Ferrell and Bertschinger have developed an algorithm that has since been used to model marine oil spills, replacing cosmological equations with those that render the behavior of ocean currents, wind, and globules of oil on the surface of water.
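
The reuse works because an N-body integrator does not care what force it integrates. A minimal sketch with a pluggable force function; both force laws below are toy stand-ins, not the actual cosmology or oil-spill physics:

    import numpy as np

    def step_particles(pos, vel, force, dt):
        # One kick-drift update; `force` maps positions to accelerations,
        # whatever physics it happens to encode
        vel = vel + force(pos) * dt
        pos = pos + vel * dt
        return pos, vel

    def gravity(pos, soft=0.01):
        # Softened pairwise gravity among equal-mass particles (G = m = 1)
        diff = pos[:, None, :] - pos[None, :, :]
        r2 = (diff ** 2).sum(axis=-1) + soft ** 2
        return -(diff / r2[..., None] ** 1.5).sum(axis=1)

    def surface_drift(pos, wind=np.array([0.3, 0.1, 0.0])):
        # Toy stand-in for spill physics: a steady wind-plus-current push
        # with a weak restoring term (not a real ocean model)
        return wind - 0.1 * pos

    pos = np.random.default_rng(4).random((500, 3))
    vel = np.zeros_like(pos)
    for _ in range(10):
        pos, vel = step_particles(pos, vel, gravity, dt=0.001)   # or surface_drift

Swapping gravity for surface_drift changes the application without touching the integrator or its parallel decomposition, which is the sense in which the cosmology algorithm could be reused for oil spills.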

Although much of the effort now is concentrated on refining the numerical calculations and expanding high-performance applications and software, some of the results will soon be coming to the big screen. Several GC3 members are creating visualizations of their numerical simulations for "Cosmic Voyage," an IMAX feature film about the universe debuting at the Smithsonian National Air and Space Museum in mid-1996. Frank Summers, a postdoctoral research associate at Princeton, is coordinating the GC3 sequence in collaboration with the National Supercomputing Metacenter and computer graphic artists at PIXAR (the studio responsible for the movie "Toy Story"). Audiences will travel through the gravitational collapse of structure shortly after the Big Bang, the formation of galaxies, and the collision of two spiral galaxies.

And what about the future? Can computer modeling also be used to explore questions about where we are going? Earlier this year, members of the GC3 team linked three or four parallel computers to simulate the collision between our galaxy, the Milky Way, and the Andromeda Galaxy, one of our larger neighbors. The preliminary findings indicate that we won't be able to compare the computer model with the actual event for another three billion years.
