
Discovery of Dark Energy Ushered in a New Era in Computational Cosmology

October 4, 2011

By John Hules

“If NERSC does not enable a major scientific discovery every few years, then we’re not doing our job.” That was the challenge issued by Bill McCurdy, then Lawrence Berkeley National Laboratory’s Associate Laboratory Director for Computing Sciences, at the first all-hands meeting for staff of the National Energy Research Scientific Computing Center (NERSC) when the center reopened after moving to Berkeley Lab in spring of 1996.

No one at that meeting could have guessed that the first major breakthrough enabled by NERSC would be a startling discovery in cosmology—that the expansion of the Universe was accelerating due to an unknown force, now referred to as dark energy—and that 15 years later that discovery would be honored with a Nobel Prize. But one of the first Berkeley Lab research groups to sign up for time on NERSC’s supercomputers was the Supernova Cosmology Project (SCP), led by Saul Perlmutter.

Saul Perlmutter

Saul Perlmutter, an astrophysicist at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory and a professor of physics at the University of California at Berkeley, won the 2011 Nobel Prize in Physics “for the discovery of the accelerating expansion of the universe through observations of distant supernovae.” Perlmutter heads the international Supernova Cosmology Project, which pioneered the methods used to discover the accelerating expansion of the universe, and he has been a leader in studies to determine the nature of dark energy. He shared the prize with Brian Schmidt, leader of the High-z Supernova Search Team, and Adam Riess, first author of that team’s analysis, whose work led to an almost simultaneous announcement of the accelerating expansion.

In fact, it was during McCurdy’s first weeks at Berkeley Lab that Perlmutter approached him. “Saul came into my office one evening at about 8 p.m. in December 1995—apparently on his way out—and started to make a pitch for computer time,” McCurdy recalls. “We still didn’t have any of NERSC’s computers yet at Berkeley Lab; we were hiring the staff and in the process of rebuilding the center. Saul sketched the supernova search on my whiteboard and explained how it worked. At the time he had no idea that they would discover the accelerating expansion of the Universe.

“I remember that night absolutely clearly, that Saul was so polite and generous with his time,” McCurdy continues. “I myself never had to make that decision [to grant computer time]; the ordinary process of allocating computer time made the decision. But I sure wish that I had a picture of what Saul had written on my whiteboard that night. It was the plan that was going to win a Nobel Prize.”

By the mid-1990s, some theoretical cosmologists were already using supercomputers for such daunting tasks as solving Einstein’s equations of general relativity. But Perlmutter’s group is believed to have been the first to use supercomputers to analyze and validate observational data in cosmology—a field that would soon expand rapidly at NERSC and elsewhere.

Computers were an essential part of the automated supernova search system that Perlmutter and Carl Pennypacker, co-founders of the SCP, were using in the early 1990s. They used a robotic telescope equipped with a CCD detector instead of photographic plates, producing digital images that could be compared automatically by computers using the image subtraction software they developed. By 1994 the SCP team had proved that they could discover supernovae “on demand,” and Perlmutter realized that the influx of data would soon require more computing power to analyze. NERSC’s move to Berkeley Lab provided the perfect opportunity to take advantage of high performance computing (HPC).
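
At the heart of that automated comparison is difference imaging: subtract a reference image of a patch of sky from a new image of the same patch, and whatever remains is a candidate new object. The sketch below is a deliberately minimal illustration of the idea, not the SCP's actual pipeline, which also had to handle image alignment, photometric matching, and detector artifacts; the function name and threshold here are purely illustrative.

```python
import numpy as np

def find_candidates(reference, new_image, threshold_sigma=5.0):
    """Flag pixels that brightened significantly between two images.

    Both inputs are 2-D arrays assumed to be already astrometrically
    aligned and scaled to the same photometric level.
    """
    diff = new_image - reference

    # Estimate the noise of the difference image with the median
    # absolute deviation, which is robust to the few real residuals.
    sigma = 1.4826 * np.median(np.abs(diff - np.median(diff)))

    # Pixels far above the noise are candidate transients: objects
    # that appeared or brightened between the two epochs.
    ys, xs = np.where(diff > threshold_sigma * sigma)
    return list(zip(ys.tolist(), xs.tolist()))

# Example: a fake 200x200 field with one "new" point source added.
rng = np.random.default_rng(0)
ref = rng.normal(1000.0, 5.0, (200, 200))
new = ref + rng.normal(0.0, 5.0, (200, 200))
new[120, 80] += 200.0          # the injected transient
print(find_candidates(ref, new))
```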

With a Laboratory Directed Research and Development (LDRD) grant, the NERSC and Physics divisions jointly hired a postdoc, Peter Nugent—now leader of the NERSC Analytics Team and co-leader of Berkeley Lab’s Computational Cosmology Center (C3)—to provide HPC and theoretical support to the Supernova Cosmology Project. Nugent helped them develop parallel algorithms that could take advantage of NERSC’s 512-processor Cray T3E-900 supercomputer.

To analyze the data from 40 supernovae for errors or biases, Nugent used the Cray T3E-900 to simulate 10,000 exploding supernovae at varying distances in model universes built on different assumptions about the cosmological parameters; these simulated supernovae were then plotted and compared with the observed data to detect any biases affecting observation or interpretation.
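
The logic of such a bias check can be illustrated with a toy Monte Carlo, sketched below under assumed values (a Hubble constant of 70 km/s/Mpc, an intrinsic supernova magnitude of -19.3, and a survey magnitude limit); none of these numbers or names come from the SCP analysis itself. The toy generates supernovae at random redshifts in a chosen model universe, applies a detection limit, and shows how a magnitude-limited survey preferentially catches intrinsically brighter objects, the kind of bias the real analysis had to quantify.

```python
import numpy as np

C_KM_S = 299792.458   # speed of light, km/s
H0 = 70.0             # assumed Hubble constant, km/s/Mpc

def distance_modulus(z, omega_m, omega_l, n_steps=500):
    """Distance modulus mu = m - M at redshift z for a universe with
    matter density omega_m and vacuum energy density omega_l."""
    omega_k = 1.0 - omega_m - omega_l
    zs = np.linspace(0.0, z, n_steps)
    e = np.sqrt(omega_m * (1 + zs) ** 3 + omega_k * (1 + zs) ** 2 + omega_l)
    integrand = 1.0 / e
    # trapezoidal integration of dz'/E(z') gives the comoving distance (Mpc)
    d_c = (C_KM_S / H0) * np.sum((integrand[:-1] + integrand[1:]) / 2) * (zs[1] - zs[0])
    if omega_k > 1e-8:                      # open universe
        k = np.sqrt(omega_k)
        d_m = (C_KM_S / H0 / k) * np.sinh(k * H0 * d_c / C_KM_S)
    elif omega_k < -1e-8:                   # closed universe
        k = np.sqrt(-omega_k)
        d_m = (C_KM_S / H0 / k) * np.sin(k * H0 * d_c / C_KM_S)
    else:                                   # flat universe
        d_m = d_c
    return 5 * np.log10((1 + z) * d_m) + 25   # luminosity distance -> magnitudes

# Simulate a magnitude-limited survey: intrinsically brighter objects
# are over-represented near the detection limit (Malmquist-type bias).
rng = np.random.default_rng(42)
M_TRUE, SCATTER, MAG_LIMIT = -19.3, 0.17, 24.0   # illustrative values
z = rng.uniform(0.4, 0.85, 5000)
mu = np.array([distance_modulus(zi, 0.3, 0.7) for zi in z])
m_obs = mu + M_TRUE + rng.normal(0.0, SCATTER, z.size)
detected = m_obs < MAG_LIMIT
resid = m_obs - (mu + M_TRUE)
print(f"mean residual, all simulated SNe:  {resid.mean():+.3f} mag")
print(f"mean residual, detected SNe only: {resid[detected].mean():+.3f} mag")
```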

To make meaningful comparisons of nearby and distant Type Ia supernovae—in other words, to confirm their usefulness as “standard candles,” objects whose intrinsic brightness is the same wherever they are found—the light measurements from the more distant, higher-redshift supernovae were compared with those from closer, lower-redshift ones. Those measurements were then altered slightly to examine the effects of dust along the line of sight and to test slightly different explosion scenarios. These simulations were compared with the team’s observations to make sure the data matched the theoretical calculations.

The T3E was also used to make sure that the error bars presented in the research were reasonable. The researchers plotted the mass density of the universe and the vacuum energy density based on data from 40 supernovae. Then they resampled the data, drawing random sets from the 40 supernovae and finding and plotting the best-fit value of each parameter. This resampling procedure was repeated tens of thousands of times as an independent check on the assigned error bars.
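
This kind of resampling, commonly known as bootstrapping, is straightforward to express in code. The sketch below is a generic version with a placeholder statistic standing in for the full fit of the mass and vacuum energy densities; the function names and numbers are illustrative, not taken from the SCP analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_errors(data, fit_parameters, n_resamples=10_000):
    """Bootstrap check on fitted-parameter error bars.

    `data` has one row per supernova; `fit_parameters` maps such an
    array to a vector of best-fit parameter values (in the real
    analysis, the minimum-chi-square mass and vacuum energy densities).
    The spread of the resampled fits is an independent error estimate.
    """
    n = len(data)
    fits = []
    for _ in range(n_resamples):
        # draw a random set of the same size, with replacement
        sample = data[rng.integers(0, n, size=n)]
        fits.append(fit_parameters(sample))
    fits = np.array(fits)
    return fits.mean(axis=0), fits.std(axis=0)

# Toy stand-in for the real fit: estimate the mean Hubble-diagram
# residual from simulated residuals of 40 supernovae.
residuals = rng.normal(0.0, 0.2, size=(40, 1))
center, spread = bootstrap_errors(residuals, lambda d: d.mean(axis=0))
naive = residuals.std(ddof=1) / np.sqrt(len(residuals))
print(f"bootstrap mean {center[0]:+.3f}, bootstrap sigma {spread[0]:.3f} "
      f"(naive sigma of the mean: {naive:.3f})")
```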

Science magazine cover

Einstein watches in surprise as a universe expands, its galaxies rushing apart ever faster. Evidence for an accelerating universe, Science magazine’s Breakthrough of the Year for 1998, resurrected Einstein's discarded idea of an energy that counteracts gravity and pushes space apart.

These rigorous, supercomputer-powered analyses of potential biases reduced the uncertainties in the data and helped Perlmutter’s team quickly win widespread acceptance of their conclusions in the scientific community. And the HPC community took notice too: in November 1998, Perlmutter was invited to address the SC98 supercomputing conference, where he discussed the melding of cosmology and computational science at Berkeley Lab. In December 1998, the discovery of the accelerating universe was named Science magazine’s Breakthrough of the Year.

The upsurge in computational cosmology

The same technology trends that were driving the steady growth of computer speeds, as described by Moore’s Law, were also driving the escalating data-gathering capabilities of astronomical instruments. It was inevitable that these trends would come together in the specialized field of computational cosmology, and Berkeley Lab and NERSC were ready.

While the Supernova Cosmology Project team was analyzing its data, another new hire at NERSC, Julian Borrill, was working with George Smoot, who would win a Nobel Prize in 2006 for his 1992 co-discovery of a pattern of minuscule temperature variations in the cosmic microwave background (CMB)—the last faint echo of the Big Bang. Borrill’s task was to develop parallel algorithms that could cope with the influx of CMB data from the upcoming BOOMERANG and MAXIMA experiments.
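
At the core of any such analysis is map-making: turning billions of noisy, time-ordered detector samples into an estimate of the CMB temperature in each sky pixel. The sketch below shows the simplest version, valid only for uncorrelated noise; it illustrates the general technique rather than the specific parallel codes developed at NERSC, and all names and numbers in it are invented for the example.

```python
import numpy as np

def bin_map(pixel_index, tod, noise_sigma, n_pixels):
    """Simplest CMB map-making step: for uncorrelated noise, the
    maximum-likelihood map is the inverse-variance-weighted average
    of all time-ordered samples that fall in each sky pixel."""
    weights = 1.0 / noise_sigma ** 2
    numer = np.bincount(pixel_index, weights=weights * tod, minlength=n_pixels)
    denom = np.bincount(pixel_index, weights=weights, minlength=n_pixels)
    hits = denom > 0
    sky_map = np.full(n_pixels, np.nan)
    sky_map[hits] = numer[hits] / denom[hits]
    return sky_map

# Example: 1 million samples scanned over 10,000 pixels. In a parallel
# code, each process bins its own chunk of the time-ordered data and
# the partial numerators and denominators are then summed.
rng = np.random.default_rng(1)
n_pix, n_samp = 10_000, 1_000_000
true_map = rng.normal(0.0, 100e-6, n_pix)              # ~100 microkelvin fluctuations
pix = rng.integers(0, n_pix, n_samp)
tod = true_map[pix] + rng.normal(0.0, 1e-3, n_samp)    # noisy detector samples
est = bin_map(pix, tod, np.full(n_samp, 1e-3), n_pix)
print(f"per-pixel noise in the recovered map: {np.nanstd(est - true_map):.1e} K")
```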

BOOMERANG, a 1999 balloon-based CMB survey, made close to one billion measurements of CMB temperature variations. Analysis of the BOOMERANG dataset at NERSC, published in a Nature cover story in April 2000, established that the Universe is flat—that its geometry is Euclidean, not curved. “Almost all CMB experiments launched since then have used NERSC for data analysis in some capacity, and today NERSC supports around 100 researchers from a dozen experiments,” says Borrill, who now co-leads Berkeley Lab’s Computational Cosmology Center.

The highest-resolution map of the CMB to date is currently being made by the European Space Agency’s Planck satellite observatory, which is yielding 300 billion samples per year. Borrill and the C3 team spent nearly a decade developing the supercomputing infrastructure for the U.S. Planck Team’s data analysis operations at NERSC (and received a NASA Public Service Group Award for their effort). Planck will make the definitive CMB temperature measurement; all the experiments that follow Planck will be going after polarization signals. Mission concept studies for NASA’s proposed CMBPol satellite aim for 200 trillion samples per year, and the C3 team is already preparing.

In the meantime, the search for supernovae was also scaling up. SCP member Greg Aldering of the Berkeley Lab Physics Division went on to lead the Nearby Supernova Factory (SNfactory), an international collaboration between several groups in the United States and France, including Berkeley Lab and Yale University, to address a wide range of issues using detailed observations of low-redshift supernovae. By 2003 the SNfactory was discovering eight supernovae per month, a rate made possible by a high-speed data link, custom data pipeline software, and NERSC’s ability to store and process 50 gigabytes of data every night.

The SNfactory’s productivity improved even further when the Sunfall visual analytics system went into production in 2006. Developed by Cecilia Aragon of the NERSC Analytics Team and collaborators in Berkeley Lab’s Physics Division and at UC Berkeley, Sunfall eliminated 90% of the human labor from the supernova search and follow-up workflow, and its statistical algorithms reduced the number of false-positive supernova candidates by a factor of 10. To date the SNfactory has discovered over 1,000 nearby supernovae.
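
Sunfall’s actual algorithms are not reproduced here, but the general idea of automated candidate screening can be sketched with made-up features and a stock classifier: score each detection on measurable properties and pass only high-scoring candidates to human scanners. Everything in the example below, including the features, thresholds, and classifier choice, is illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(7)

# Synthetic candidate features: signal-to-noise, shape relative to
# nearby stars, and distance of the residual from the host galaxy core.
def make_candidates(n, real):
    snr   = rng.normal(12 if real else 6, 3, n)
    shape = rng.normal(1.0 if real else 1.6, 0.25, n)
    dist  = rng.uniform(0, 5 if real else 1.5, n)
    return np.column_stack([snr, shape, dist])

# Train on a labeled set of past real supernovae and known artifacts.
X = np.vstack([make_candidates(2000, True), make_candidates(2000, False)])
y = np.concatenate([np.ones(2000), np.zeros(2000)])
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Score a night's worth of new candidates; only those above the
# threshold are passed to humans for follow-up.
new = np.vstack([make_candidates(50, True), make_candidates(950, False)])
passed = clf.predict_proba(new)[:, 1] > 0.8
print(f"{passed.sum()} of {len(new)} candidates passed the automated screen")
```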

An innovative new sky survey, called the Palomar Transient Factory (PTF), began discovering relatively rare and fleeting cosmic events, like supernovae and cataclysmic variables, in 2009. PTF is the first project dedicated solely to finding transient events. A team at Caltech worked with NERSC to develop an automated system that sifts through terabytes of astronomical data every night to find interesting events. “This truly novel survey combines the power of a wide-field telescope, a high-resolution camera, high-performance network and computing, as well as the ability to conduct rapid follow-up observations with telescopes around the globe for the first time,” says Nugent, who is the Real-time Transient Detection Lead for the PTF project.

To date, PTF has discovered more than 8,700 new astrophysical transients, including supernovae, novae, active galaxies, and quasars, and even three new classes of objects. In August 2011, PTF discovered one of the closest Type Ia supernovae in the last 40 years, SN 2011fe, in the nearby Pinwheel Galaxy.

supernova SN 2011fe

Before and after images of supernova SN 2011fe as it appeared in the nearby M101 galaxy.

 

The latest experiment using NERSC resources to advance our understanding of dark energy is the Baryon Oscillation Spectroscopic Survey (BOSS), the centerpiece physics experiment in the continuation of the Sloan Digital Sky Survey. BOSS will make the largest three-dimensional map of the Universe yet, charting 1.5 million galaxies and 200,000 quasars. From this map, researchers will make measurements of the expansion history of the Universe that are accurate to 1 percent.

All of these experiments are producing data that shed light on the fundamental cosmological parameters of the Universe. At the same time, theoretical astrophysicists have been using NERSC supercomputers to increase our understanding of a wide variety of important phenomena by simulating supernova explosions, black hole mergers, gamma-ray bursts, accretion disks, star and galaxy formation, and even large-scale structure formation in the Universe.

As for Perlmutter and the Supernova Cosmology Project, their search for very high-redshift supernovae continues. In 2011 they started a new multi-year search for supernovae in high redshift galaxy clusters with a new camera on the Hubble Space Telescope.

So, after a dozen or more years of rapid growth, what is the state of computational cosmology today? George Smoot summed it up in an interview in the Winter 2007 issue of SciDAC Review magazine: “We’ve reached a point in cosmology—with the CMB discovery, the supernova group, large-scale structures, surveys, and simulations—where we know a lot about cosmology and have brought the parameters down to around the 2 percent level,” Smoot said. “We can expect in the next few years to get down to the one percent level. When you get down to the one percent level, you are actually testing your knowledge well.”

The article continues quoting Smoot: “Most things in your life are alright with a one percent parameter and you are happy with it. For example, your pant size. If it’s wrong by ten percent, they will fall down, or you can’t button them. If it’s one percent, it can be a little tight or a little loose, but it doesn’t matter. You can use a belt. There are lots of things where if you get to one percent then it’s good enough. You are reaching a threshold where you understand things reasonably well. When you look at things carefully at the one percent level, you can tell whether your model is right or wrong. The same is true for the science about the Universe. When we get to the one percent level, we will know whether we are right about what the Universe is.”

 


About NERSC and Berkeley Lab
The National Energy Research Scientific Computing Center (NERSC) is a U.S. Department of Energy Office of Science User Facility that serves as the primary high-performance computing center for scientific research sponsored by the Office of Science. Located at Lawrence Berkeley National Laboratory, the NERSC Center serves almost 10,000 scientists at national laboratories and universities researching a wide range of problems in climate, fusion energy, materials science, physics, chemistry, computational biology, and other disciplines. Berkeley Lab is a DOE national laboratory located in Berkeley, California. It conducts unclassified scientific research and is managed by the University of California for the U.S. Department of Energy.