
Computational Cosmology Comes of Age

20 years of research at NERSC has changed the way we think about the universe

May 7, 2014


Bill McCurdy, 1995

“If NERSC does not enable a major scientific discovery every few years, then we’re not doing our job.”

That was the challenge issued by Bill McCurdy, then Lawrence Berkeley National Laboratory’s (Berkeley Lab) Associate Laboratory Director for Computing Sciences, at the first all-hands meeting for staff of the National Energy Research Scientific Computing Center (NERSC) when the center reopened after moving to Berkeley Lab in spring of 1996.

No one at that meeting could have guessed that the first major breakthrough enabled by NERSC would be a startling discovery in cosmology—that the expansion of the universe was accelerating due to an unknown force, now referred to as dark energy—and that 15 years later that discovery would be honored with a Nobel Prize. But one of the first Berkeley Lab research groups to sign up for time on NERSC’s supercomputers was the Supernova Cosmology Project (SCP), led by Saul Perlmutter.

In fact, it was during McCurdy’s first weeks at Berkeley Lab that Perlmutter approached him. “Saul came into my office one evening at about 8 p.m. in December 1995—apparently on his way out—and started to make a pitch for computer time,” McCurdy recalled. “We still didn’t have any of NERSC’s computers yet at Berkeley Lab; we were hiring the staff and in the process of rebuilding the center. Saul sketched the supernova search on my whiteboard and explained how it worked. At the time he had no idea that they would discover the accelerating expansion of the Universe.

“I remember that night absolutely clearly, that Saul was so polite and generous with his time,” McCurdy continued. “I myself never had to make that decision [to grant computer time]; the ordinary process of allocating computer time made the decision. But I sure wish that I had a picture of what Saul had written on my whiteboard that night. It was the plan that was going to win a Nobel Prize.”

By the mid-1990s, some theoretical cosmologists were already using supercomputers for such daunting tasks as solving Einstein’s equations for General Relativity. But Perlmutter’s group is believed to have been the first to use supercomputers to analyze and validate observational data in cosmology—a field that would soon expand rapidly at NERSC and elsewhere.

Computers were an essential part of the automated supernova search system that Perlmutter and Carl Pennypacker, co-founders of the SCP, were using in the early 1990s. They used a robotic telescope equipped with a CCD detector instead of photographic plates, producing digital images that could be compared automatically by computers using the image subtraction software they developed.
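
The idea behind such a search is simple to sketch, even though the production software was far more elaborate. The illustrative Python snippet below (with made-up names, noise levels, and thresholds) subtracts a reference exposure from a new exposure of the same patch of sky and flags any pixels that stand out above the noise; a real pipeline must also handle image alignment, point-spread-function matching, and artifact rejection.

```python
import numpy as np

def find_transient_candidates(new_image, ref_image, n_sigma=5.0):
    """Flag pixels where (new - reference) stands out above the noise."""
    diff = new_image - ref_image       # assumes the images are already aligned and flux-matched
    noise = np.std(diff)               # crude global noise estimate
    return np.argwhere(diff > n_sigma * noise), diff

# Toy demonstration: a fake 100x100 field with one injected point source
rng = np.random.default_rng(42)
ref = rng.normal(loc=100.0, scale=5.0, size=(100, 100))   # reference image: sky plus noise
new = ref + rng.normal(scale=5.0, size=ref.shape)         # new image of the same field
new[40, 60] += 80.0                                       # injected "new" source
candidates, _ = find_transient_candidates(new, ref)
print(candidates)                                         # should single out pixel (40, 60)
```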

Saul Perlmutter, with a view of supernova 1987A in the background.

By 1994 the SCP team had proved that they could discover supernovae “on demand,” and Perlmutter realized that the influx of data would soon require more computing power to analyze. NERSC’s move to Berkeley Lab provided the perfect opportunity to take advantage of high performance computing (HPC).

Simulating 10,000 Supernovae

With a Laboratory Directed Research and Development grant, the NERSC and Physics divisions jointly hired a postdoc, Peter Nugent—now leader of the NERSC Analytics Team and co-leader of Berkeley Lab’s Computational Cosmology Center (C3)—to provide HPC and theoretical support to the Supernova Cosmology Project. Nugent helped them develop parallel algorithms that could take advantage of NERSC’s 512-processor Cray T3E-900 supercomputer.

To analyze the data from 40 supernovae for errors or biases, Nugent used the Cray T3E-900 to simulate 10,000 exploding supernovae at varying distances in model universes built on different assumptions about the cosmological parameters; the simulated results were then plotted and compared with the observed data to detect any biases affecting observation or interpretation.
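
In outline, such a test is a Monte Carlo experiment: generate synthetic supernovae at random redshifts in a model universe, add realistic scatter, and ask how the resulting measurements would shift under other cosmological assumptions. The Python sketch below illustrates the idea for a flat universe with assumed values for the Hubble constant, redshift range, and magnitude scatter; it is a toy, not the actual analysis code.

```python
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458   # speed of light, km/s
H0 = 70.0             # Hubble constant, km/s/Mpc (assumed for illustration)

def luminosity_distance_mpc(z, omega_m):
    """Luminosity distance in a flat universe with matter density omega_m."""
    e_inv = lambda zp: 1.0 / np.sqrt(omega_m * (1 + zp)**3 + (1.0 - omega_m))
    comoving, _ = quad(e_inv, 0.0, z)
    return (C_KM_S / H0) * (1 + z) * comoving

def distance_modulus(z, omega_m):
    return 5.0 * np.log10(luminosity_distance_mpc(z, omega_m) * 1e6 / 10.0)

rng = np.random.default_rng(1)
z_sim = rng.uniform(0.3, 0.7, size=10_000)                   # 10,000 synthetic supernovae
mu_true = np.array([distance_modulus(z, omega_m=0.3) for z in z_sim])
mu_obs = mu_true + rng.normal(scale=0.17, size=z_sim.size)   # intrinsic plus measurement scatter

# How far off, on average, would a matter-only universe be?
mu_eds = np.array([distance_modulus(z, omega_m=1.0) for z in z_sim])
print("mean offset (mag):", np.mean(mu_obs - mu_eds))        # positive: the SNe look too faint for omega_m = 1
```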

To make meaningful comparisons of nearby and distant Type Ia supernovae, the light measurements of the more distant, higher-redshift supernovae had to be compared with those of closer ones. The measurements were then altered slightly to examine the effects of dust along the line of sight and to test slightly different explosion scenarios. These simulations were compared with the team’s observations to make sure the data matched the theoretical calculations.

The T3E was also used to make sure that the error bars presented in the research were reasonable. The researchers plotted the mass density of the universe and the vacuum energy density based on data from 40 supernovae. Then they began resampling the data, drawing random sets from the 40 supernovae and finding and plotting the best-fit value of each parameter. This resampling procedure was repeated tens of thousands of times as an independent check on the assigned error bars.
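
This kind of bootstrap check is easy to illustrate. The hypothetical sketch below draws random sets of supernovae with replacement, refits a single parameter each time using a simplified chi-square grid search, and reads the error range off the spread of the refit values; the team’s actual analysis fit the full cosmological likelihood rather than this toy model.

```python
import numpy as np

def fit_parameter(z, mu, mu_model, grid):
    """Return the grid value that minimizes chi-square against the model curve."""
    chi2 = [np.sum((mu - mu_model(z, p))**2) for p in grid]
    return grid[int(np.argmin(chi2))]

def bootstrap_errors(z, mu, mu_model, grid, n_resamples=10_000, seed=0):
    """Resample the supernova set with replacement and refit each time."""
    rng = np.random.default_rng(seed)
    n = len(z)
    fits = np.empty(n_resamples)
    for i in range(n_resamples):
        idx = rng.integers(0, n, size=n)        # a random set of the supernovae, drawn with replacement
        fits[i] = fit_parameter(z[idx], mu[idx], mu_model, grid)
    return np.percentile(fits, [16, 50, 84])    # roughly a 1-sigma interval from the resamples

# Toy usage: recover the slope of a fake linear relation from 40 noisy points
rng = np.random.default_rng(3)
z = rng.uniform(0.1, 0.8, size=40)
mu = 5.0 * z + rng.normal(scale=0.2, size=40)
print(bootstrap_errors(z, mu, lambda z, slope: slope * z, grid=np.linspace(3.0, 7.0, 81)))
```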

These rigorous, supercomputer-powered analyses of potential biases reduced the uncertainties in the data and helped Perlmutter’s team quickly win widespread acceptance of their conclusions in the scientific community. In 2011, Perlmutter shared the Nobel Prize in Physics for the discovery.

Upsurge in Computational Cosmology 

While the Supernova Cosmology team was analyzing its data, another new hire at NERSC, Julian Borrill, was working with George Smoot, who would win a Nobel Prize in 2006 for his 1992 co-discovery of a pattern of minuscule temperature variations in the cosmic microwave background (CMB)—the last faint echo of the Big Bang. Borrill’s task was to develop parallel algorithms that could cope with the influx of CMB data from the upcoming BOOMERANG and MAXIMA experiments.

George Smoot with a COBE map (2006)

BOOMERANG, a 1999 balloon-based CMB survey, made close to one billion measurements of CMB temperature variations. Analysis of the BOOMERANG dataset at NERSC, published in a Nature cover story in April 2000, established that the Universe is flat—that its geometry is Euclidean, not curved.

“Almost all CMB experiments launched since then have used NERSC for data analysis in some capacity,” said Borrill, who now co-leads Berkeley Lab’s C3 with Nugent.

Likewise, the Planck Collaboration has been using supercomputers at NERSC to create the most detailed and accurate maps yet of the CMB. The Planck satellite, a European Space Agency mission with significant participation from NASA, made news in 2013 when its science team released the most detailed map ever made of the CMB and refined some of the fundamental parameters of cosmology and physics.

For over a decade Planck has relied on NERSC to provide the necessary computational capabilities, including tens of millions of CPU hours, hundreds of terabytes of spinning disk space, and support for hundreds of Planck data analysts. The culmination of this work to date, and the most computationally challenging part of the entire CMB analysis pipeline, has been the production of the sixth Full Focal Plane simulation (FFP6), comprising 1,000 realizations of the Planck mission reduced to 250,000 maps.
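
At heart, such a simulation set is an ensemble of synthetic skies: many independent realizations of the CMB drawn from a theoretical angular power spectrum, each then processed like real data. The toy sketch below, using the healpy library with a placeholder spectrum and resolution, shows only that basic generate-and-check step; Planck’s actual pipeline layers on the full instrument model, scanning strategy, and noise.

```python
import numpy as np
import healpy as hp

nside = 64                                   # toy resolution (Planck maps are far finer)
lmax = 3 * nside - 1
ell = np.arange(lmax + 1)
cl = np.zeros(lmax + 1)
cl[2:] = 1.0 / (ell[2:] * (ell[2:] + 1.0))   # placeholder power spectrum, not Planck's best fit

# Draw a handful of independent sky realizations (FFP6 used 1,000 mission realizations)
realizations = [hp.synfast(cl, nside=nside) for _ in range(10)]

# Sanity check: the average recovered spectrum should scatter around the input
recovered = np.mean([hp.anafast(m, lmax=lmax) for m in realizations], axis=0)
print(recovered[2:8] / cl[2:8])
```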

“These maps are proving to be a goldmine containing stunning confirmations and new puzzles,” said Martin White, a Planck scientist and physicist at the University of California, Berkeley, and Berkeley Lab. “This data will form the cornerstone of our cosmological model for decades to come and spur new directions in research.”

Other Cosmic Innovations

Meanwhile, in 2009, a sky survey called the Palomar Transient Factory (PTF) began discovering relatively rare and fleeting cosmic events, like supernovae and cataclysmic variables. PTF is the first project dedicated solely to finding transient events. A team at Caltech worked with NERSC to develop an automated system that sifts through terabytes of astronomical data every night to find interesting events. In August 2011, PTF discovered one of the closest Type Ia supernovae in the last 40 years, SN 2011fe, in the nearby Pinwheel Galaxy.

“This truly novel survey combines the power of a wide-field telescope, a high-resolution camera, high-performance network and computing, as well as the ability to conduct rapid follow-up observations with telescopes around the globe for the first time,” said Nugent, who also architected and led the Deep Sky science gateway, one of the largest repositories of astronomical imaging data (over 80 terabytes) and the backbone of the PTF.

The Baryon Oscillation Spectroscopic Survey (BOSS) is another experiment using NERSC resources to advance our understanding of dark energy. It is the centerpiece physics experiment in the continuation of the Sloan Digital Sky Survey. Since 2009, BOSS has used the Sloan Foundation Telescope at the Apache Point Observatory in New Mexico to record high-precision spectra of well over a million galaxies with redshifts from 0.2 to 0.7, looking back over six billion years into the universe’s past.
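
The quoted lookback time follows directly from that redshift range for any standard cosmology. A quick check with the astropy library, using its Planck13 parameter set as one reasonable choice, is shown below.

```python
from astropy.cosmology import Planck13  # flat Lambda-CDM with Planck 2013 parameters

for z in (0.2, 0.7):
    print(f"z = {z}: lookback time ~ {Planck13.lookback_time(z):.2f}")
# At z = 0.7 the lookback time comes out a little over six billion years.
```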

In early 2014, the BOSS Collaboration announced that BOSS had measured the scale of the universe to an accuracy of one percent. This and future measurements at this precision are the key to determining the nature of dark energy.

“One-percent accuracy in the scale of the universe is the most precise such measurement ever made,” said BOSS’s principal investigator, David Schlegel, of Berkeley Lab.

NERSC was critical to enabling the analysis. “NERSC set aside resources for us to push analyses through quickly when we were up against deadlines,” said Martin White, chair of the BOSS science survey team. “They provide a virtual meeting place where members of the collaboration from all around the world can come together on a shared platform, with both the data and the computational resources they need to perform their research.”


About NERSC and Berkeley Lab
The National Energy Research Scientific Computing Center (NERSC) is a U.S. Department of Energy Office of Science User Facility that serves as the primary high performance computing center for scientific research sponsored by the Office of Science. Located at Lawrence Berkeley National Laboratory, NERSC serves almost 10,000 scientists at national laboratories and universities researching a wide range of problems in climate, fusion energy, materials science, physics, chemistry, computational biology, and other disciplines. Berkeley Lab is a DOE national laboratory located in Berkeley, California. It conducts unclassified scientific research and is managed by the University of California for the U.S. Department of Energy. Learn more about computing sciences at Berkeley Lab.