
Largest-Ever Simulation of Cosmic Evolution Calculated at San Diego Supercomputer Center

Published January 21, 2004




Image credit: Michael Norman, Pascal Paschos, Center for Astrophysics & Space Sciences (CASS), UCSD; Robert Harkness, SDSC.


This image shows the distribution of visible matter - galaxies, quasars, and gas clouds - inside a cube-shaped volume 248 million light-years on a side, the product of the most complex scientific simulation of the evolution of the universe ever performed. University of California, San Diego cosmologist Michael Norman ran his "Enzo" program for more than 130 hours on 512 processors of the San Diego Supercomputer Center's Blue Horizon supercomputer, tracking more than a billion particles of visible matter and dark matter and performing hydrodynamics calculations in more than a billion cells (a simulation volume 1024 cells on a side) over more than three billion years of simulated time. The simulation begins only 30 million years after the Big Bang, when the universe was a uniform sea of hydrogen and helium gas and dark matter; over time, irregularities in density of about one part in a thousand are amplified by the action of gravity to form clusters of galaxies in enormous sheets and strings separated by immense voids. This view of the simulation corresponds to a time 1.3 billion years after the Big Bang, or 12.4 billion years ago. The simulation has generated more than 12 terabytes of data, with more to come as additional runs extend the evolution of the universe closer to the present day.

"SDSC is the only place in the world at this time where this simulation can be done and the scientific content can be analyzed, because of the investment SDSC has made in data management technology," Norman said. Blue Horizon calculates at 1.7 teraflops (trillion operations per second) - in raw power, roughly equivalent to giving every man, woman, and child on earth 100 pocket calculators and having them punch in one calculation per second on each one.

The universe of cosmologist Michael Norman has just become a lot bigger and more complex. Norman, a professor of physics at the Center for Astrophysics and Space Sciences (CASS) at the University of California, San Diego (UCSD), together with colleagues at CASS and the San Diego Supercomputer Center (SDSC), has run the largest and most complex scientific simulation of the evolution of the universe ever performed. Using SDSC's Blue Horizon supercomputer, the team tracked the formation of enormous structures of galaxies and gas clouds during the millions and billions of years following the Big Bang.

"SDSC is the only place in the world at this time where this simulation can be done and the scientific content can be analyzed, because of the investment SDSC has made in data management technology," Norman said. Enzo can follow the evolution of the cosmos from shortly after the Big Bang through the formation of the stars, galaxies, and clusters of galaxies, basing its calculations on the best information that scientists have about the physics and chemistry that govern the processes that shape our universe. The agreement between simulations and telescopic observations of distant galaxies, which are seen as they were billions of years ago because their light takes that much time to travel the vast distances of intergalactic space, provides a "reality check" on astrophysicists' theories of the origin of the universe and the formation of stars and galaxies. As simulations improve, scientists are able to refine their theories; for example, in the past five years the comparison between theories and observations has enabled astrophysicists to date the Big Bang to 13.7 billion years ago, to an uncertainty of only a few percent.

Norman ran his "Enzo" cosmology program for more than 100 hours on all 128 computing nodes of Blue Horizon. (The machine is a "massively parallel" supercomputer; its 128 computing nodes use 1,024 CPU chips.) Blue Horizon calculates at 1.7 teraflops (trillion operations per second) - in raw power, roughly equivalent to giving every man, woman, and child on earth 100 pocket calculators and having them punch in one calculation per second on each one.
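That comparison is meant as an order-of-magnitude illustration; a quick back-of-envelope check, assuming a 2004 world population of roughly 6.4 billion (a figure not given above), shows the two rates agree to within a small factor:

    # Back-of-envelope check of the pocket-calculator comparison.
    # Assumption not stated above: world population in 2004 was about 6.4 billion.
    blue_horizon_flops = 1.7e12       # 1.7 teraflops, as quoted
    population_2004 = 6.4e9           # assumed figure
    calculators_per_person = 100      # as quoted
    calcs_per_second_each = 1         # as quoted

    human_rate = population_2004 * calculators_per_person * calcs_per_second_each
    print(f"Everyone punching calculators: {human_rate:.2e} calculations/s")   # ~6.4e11
    print(f"Blue Horizon:                  {blue_horizon_flops:.2e} operations/s")
    print(f"Ratio: {blue_horizon_flops / human_rate:.1f}x")                    # ~2.7, same order of magnitude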

The simulation required 454 gigabytes of main system memory - roughly a thousand times the memory of a typical PC - just to act as a "scratchpad" for its intermediate calculations. The run produced more than 9 trillion bytes of simulation data as output; it would take about 12,000 CD-ROMs to hold this much information. The amount of data will double over the next several weeks as the results are analyzed. In the run performed on Blue Horizon, Enzo tracked more than a billion particles of visible matter and dark matter and performed gravitational and hydrodynamics calculations through more than three billion years of simulated time. The simulation begins only 30 million years after the Big Bang, when the universe was a uniform sea of hydrogen and helium gas and dark matter. Over time, minute random fluctuations in density - about one part in a thousand - are amplified by the action of gravity to form clusters of galaxies in enormous sheets and strings separated by immense voids.
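The figures above can be sanity-checked with a few lines of Python; the CD-ROM capacity and the memory of a typical 2004 PC are assumptions, not numbers from the release:

    # Rough checks of the data-volume figures quoted above.
    # Assumptions: a CD-ROM holds about 700 MB; a typical 2004 PC had about 512 MB of RAM.
    cells = 1024 ** 3                  # simulation grid, 1,024 cells on a side
    output_bytes = 9e12                # "more than 9 trillion bytes" of output
    memory_bytes = 454e9               # 454 gigabytes of main system memory
    cd_capacity = 700e6                # assumed CD-ROM capacity, in bytes
    typical_pc_ram = 512e6             # assumed typical PC memory in 2004, in bytes

    print(f"Grid cells: {cells:,}")                                            # 1,073,741,824 -> "more than a billion"
    print(f"CD-ROMs needed for the output: {output_bytes / cd_capacity:,.0f}") # ~12,900
    print(f"Memory vs. a typical PC: {memory_bytes / typical_pc_ram:,.0f}x")   # ~890 -> "roughly a thousand times"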

A visualization produced from the data by CASS Research Associate Paschalis Paschos shows the distribution of intergalactic gas clouds inside a cube-shaped volume 248 million light-years on a side. The gas clouds form wispy structures that surround enormous empty areas, just as they do in surveys of distant galaxies painstakingly compiled by observational astronomers.

This view of the simulation corresponds to a time 1.3 billion years after the Big Bang, or 12.4 billion years ago. At this early epoch, most of the visible matter in the universe has not yet condensed into galaxies, but remains dispersed in what astronomers call the intergalactic medium. In the real universe, these gas clouds are observed as foreground absorbers of ultraviolet light emitted by even more distant quasars.
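The arithmetic linking that epoch to the present day follows from the 13.7-billion-year age of the universe quoted earlier; a minimal sketch:

    # Tie the epoch shown in the visualization to the age of the universe quoted earlier.
    age_of_universe_gyr = 13.7       # billion years, as quoted above
    epoch_after_big_bang_gyr = 1.3   # epoch shown in the visualization
    simulated_span_gyr = 3.0         # "more than three billion years of simulated time"

    lookback_gyr = age_of_universe_gyr - epoch_after_big_bang_gyr
    print(f"Lookback time: {lookback_gyr:.1f} billion years")        # 12.4
    print(f"Fraction of cosmic history simulated so far: "
          f"{simulated_span_gyr / age_of_universe_gyr:.0%}")         # ~22%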

"Astronomers study these clouds using the University of California's powerful Keck Telescopes atop Mauna Kea in Hawaii," Norman said. "This simulation, used in conjunction with the lastest Keck data, will give us a better idea about the distribution of matter in the early universe, and will serve as a critical test of the model." SDSC computational astrophysicist Robert Harkness adapted Enzo to run on Blue Horizon and currently is adapting the code to run on other supercomputers, including the new TeraGrid supercomputing facility. Harkness is a member of SDSC's Strategic Applications Collaborations team, which works closely with scientific investigators to tune their simulation and analysis programs to take maximum advantage of the power of modern supercomputers.

Harkness noted that Enzo can easily use all 1,024 processors of Blue Horizon and write more than a terabyte of data in the course of a 24-hour run. "With 15 terabytes of storage in our GPFS [General Parallel File System], we can reserve a free terabyte or two so Enzo can run full-bore for a decent length of time," he said. "A lot of otherwise highly capable computational systems are really handicapped because they lack this capacity."
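Those storage figures imply a modest average output rate; the calculation below is a sketch (the real write pattern of a run would be bursty rather than steady):

    # Average output rate implied by "more than a terabyte of data in a 24-hour run".
    bytes_written = 1e12             # roughly 1 TB per run, from the quote above
    run_seconds = 24 * 3600          # a 24-hour run
    gpfs_capacity_tb = 15            # GPFS capacity quoted above

    avg_rate_mb_s = bytes_written / run_seconds / 1e6
    print(f"Average output rate over the run: {avg_rate_mb_s:.1f} MB/s")             # ~11.6 MB/s
    print(f"Share of the 15 TB GPFS one such run fills: {1 / gpfs_capacity_tb:.0%}") # ~7%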

"I've been working on Enzo for nearly three years, ever since the day I arrived at SDSC," said Harkness. "It's very satisfying to see such spectacularly successful results. Computational science - using computer simulations to explore the consequences of physical laws - has become an essential part of modern physics and astronomical research. Astronomers, after all, don't have any other way to bring galaxies into the laboratory and perform experiments on them."

The latest Enzo simulation provides a way for astrophysicists to test the "consensus model" of the composition of the cosmos, which makes some rather surprising claims about what the universe is made of. In this model, the ordinary matter of stars, planets, and nebulas makes up only 4.4 percent of the universe. Another 22.6 percent - roughly five times as much - is in the form of mysterious "dark matter," particles of some unseen substance that does not emit light and does not interact with ordinary matter other than by gravity. The remaining 73 percent is the even more mysterious "dark energy," which, according to observations made within the past five years, appears to be accelerating the expansion of the universe that began with the Big Bang.
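The budget quoted above can be checked directly; all of the numbers below come from the paragraph itself:

    # Consensus-model budget quoted above, and the "roughly five times" comparison.
    ordinary_matter = 4.4    # percent: stars, planets, nebulas
    dark_matter = 22.6       # percent
    dark_energy = 73.0       # percent

    print(f"Total: {ordinary_matter + dark_matter + dark_energy:.1f}%")              # 100.0%
    print(f"Dark matter vs. ordinary matter: {dark_matter / ordinary_matter:.1f}x")  # ~5.1x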

Originally very controversial, dark matter and dark energy have gradually become accepted by cosmologists as the main constituents of the universe. When simulations such as Enzo omit them from the mix, the structures that form after the Big Bang bear no resemblance to the distribution of matter that astronomers observe in the real universe. "All of the matter that we can see, from here to the farthest galaxy, seems to be less than the tip of the iceberg," said Norman.


Media Contact: Greg Lund, SDSC, 858-534-8314, greg@sdsc.edu

Technical Contact: Robert Harkness, SDSC, 858-822-5431, harkness@sdsc.edu
