NERSC: Powering Scientific Discovery for 50 Years

World-Changing Science at NERSC

NERSC provides systems, support, and professional consulting services that enable nearly 10,000 researchers working on about a thousand science projects to publish around 2,000 refereed articles in scientific journals each year. The six Nobel Prizes associated with NERSC users and projects, along with the other accomplishments described here, exemplify the scientific impact NERSC has had on DOE Office of Science research and on the science community at large.

An Accelerating Universe - Saul Perlmutter, Nobel Prize in Physics 2011

In the 1990s, Saul Perlmutter of Berkeley Lab and his team were studying supernovae to measure how the expansion of the universe was slowing. Surprisingly, they found the opposite: the universe’s expansion was, in fact, accelerating. The insight earned Perlmutter the 2011 Nobel Prize in Physics, which he shared with Brian Schmidt and Adam Riess of the High-z Supernova Search Team.

Perlmutter’s results were based on observations, but those measurements had to be interpreted and confirmed by running thousands of simulations at NERSC. His research team is believed to have been the first to use supercomputers to analyze and validate observational data in cosmology. This melding of computational science and cosmology sowed the seeds for many more projects, establishing Berkeley Lab and NERSC as leading centers of the emerging field.

The Supernova Cosmology Project, co-led by Perlmutter, used a robotic telescope equipped with a digital detector instead of photographic plates. Its digital images were compared with earlier images using “subtraction” software, allowing the team to discover supernovae “on demand.” As the number of observations grew, the team required the computing capability available at NERSC.
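The idea behind such subtraction pipelines can be illustrated in a few lines. The sketch below is a minimal, hypothetical version in Python/NumPy: it assumes two already-aligned, background-matched exposures and flags pixels that brightened significantly. Production difference-imaging pipelines also match the images’ point-spread functions before subtracting.

```python
import numpy as np

def find_transient_candidates(new_image, reference_image, n_sigma=5.0):
    """Toy difference imaging: flag pixels that brightened between two
    aligned exposures of the same patch of sky."""
    diff = new_image - reference_image
    # Robust noise estimate for the difference image (median absolute deviation),
    # so a handful of bright residuals does not inflate the threshold.
    noise = 1.4826 * np.median(np.abs(diff - np.median(diff)))
    # Candidate transients: pixels that brightened by more than n_sigma.
    return np.argwhere(diff > n_sigma * noise)
```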

To vet the supernova data for errors and biases, the team simulated 10,000 exploding supernovae at varying distances and under varying conditions using NERSC’s Cray T3E supercomputer, MCurie. The simulated supernovae were then plotted and compared with the observed data to detect any biases affecting observation or interpretation. The team also used MCurie to check and recheck their work, resampling the data and rerunning their calculations thousands of times to determine the reliability of their measurements. These rigorous, supercomputer-powered analyses of potential biases reduced the uncertainties in the data and helped Perlmutter’s team win widespread acceptance of their conclusions in the scientific community.
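Resampling analyses of this kind are commonly implemented as a bootstrap: draw the data with replacement many times, recompute the result each time, and take the spread as the measurement uncertainty. Below is a minimal, generic sketch in Python; the `estimator` callable and the toy data are illustrative stand-ins, not the team’s actual analysis code.

```python
import numpy as np

rng = np.random.default_rng(42)

def bootstrap_uncertainty(data, estimator, n_resamples=10_000):
    """Estimate the spread of `estimator` by resampling `data` with
    replacement thousands of times, as in a classic bootstrap analysis."""
    data = np.asarray(data)
    estimates = np.empty(n_resamples)
    for i in range(n_resamples):
        resample = rng.choice(data, size=data.size, replace=True)
        estimates[i] = estimator(resample)
    return estimates.std()  # one-sigma reliability of the measurement

# Illustrative use: the uncertainty on the mean of 42 noisy measurements.
measurements = rng.normal(loc=440.0, scale=30.0, size=42)
print(bootstrap_uncertainty(measurements, np.mean))
```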

Oscillating Neutrinos - The SNO team, Nobel Prize in Physics 2015

The 2015 Nobel Prize in Physics was awarded to the leaders of two large experiments that discovered that neutrinos transform periodically from one type to another: Takaaki Kajita of the University of Tokyo, who led the Super-Kamiokande experiment, and Arthur B. McDonald of Queen’s University in Kingston, Ontario, head of the SNO collaboration.

The connection to NERSC is through SNO, short for the Sudbury Neutrino Observatory. SNO is a huge neutrino detector located near Sudbury, Ontario, in an old nickel mine 2 km underground, where it is shielded as much as possible from the noisy background of particle interactions that take place on Earth’s surface.

The SNO detector was turned on in 1999, and from the earliest days SNO data was transferred to NERSC, landing on its HPSS tape storage system and being analyzed on NERSC’s PDSF cluster in what became known as the “West Coast Analysis.” By the time the discovery of neutrino flavor mixing in solar neutrinos was published in Physical Review Letters in 2001 (“Measurement of the Rate of νe + d → p + p + e− Interactions Produced by ⁸B Solar Neutrinos at the Sudbury Neutrino Observatory,” Phys. Rev. Lett. 87, 071301), NERSC’s role was well established and recognized by scientists working on the project: they presented a signed and framed copy of the journal article to NERSC’s PDSF team.


“SNO has been blessed by the top-notch support and facility at NERSC. Without NERSC’s support, SNO would not have been able to process and reconstruct the data, simulate the data, and run massive jobs for the physics fits so smoothly and successfully.” - Alan Poon, SNO Project
“May I add my thanks to those from Alan. We greatly appreciate your support.” - Art McDonald

The experiment wound down in 2006, but its data will remain invaluable. To ensure the data’s integrity and safekeeping, NERSC was chosen as the trusted home of the data archive. Archiving and moving all the data to NERSC required close collaboration between SNO, NERSC, and DOE’s Energy Sciences Network (ESnet).

Multiscale Chemical Modeling - Martin Karplus, Nobel Prize in Chemistry 2013

The 2013 Nobel Prize in Chemistry was awarded to three scientists for pioneering methods in computational chemistry that brought a deeper understanding of complex chemical structures and reactions in biochemical systems. These methods can calculate how very complex molecules work and even predict the outcomes of very complicated chemical reactions. One of the laureates — Martin Karplus of Harvard University — was a long-time NERSC user and principal investigator. Karplus’ work revolutionized the field of computational chemistry by making Newton’s classical physics work side by side with fundamentally different quantum physics; previously, researchers could model only one or the other. By combining the best of both worlds, researchers can now simulate complex processes, such as how a drug binds to its target protein in the body; such simulations are crucial to many of the advances in the field.
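The core idea of such hybrid (QM/MM) methods is a partitioned energy: a small, chemically active region is treated quantum mechanically, the surrounding environment with a classical force field, plus a coupling term between the two. The sketch below is schematic only; the energy callables are placeholders standing in for a real electronic-structure code and force field, not any specific package’s API.

```python
def qmmm_energy(system, qm_region, qm_energy, mm_energy, coupling_energy):
    """Additive QM/MM energy for a partitioned molecular system.

    `qm_energy`, `mm_energy`, and `coupling_energy` are placeholder callables
    standing in for a quantum chemistry code, a classical force field, and
    the electrostatic/van der Waals coupling between the two regions.
    """
    mm_region = [atom for atom in system if atom not in qm_region]
    return (
        qm_energy(qm_region)                     # quantum treatment of the reactive site
        + mm_energy(mm_region)                   # classical treatment of the environment
        + coupling_energy(qm_region, mm_region)  # interaction between the two regions
    )
```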

Karplus began computing at NERSC in 1998 with an award from the Department of Energy’s Grand Challenges competition. Grand Challenge applications addressed computation-intensive fundamental problems in science and engineering whose solutions could be advanced by applying high-performance computing and communications technologies and resources.

At the time, Karplus and his colleague Paul Bash, then at Northwestern University, were seeking to understand chemical mechanisms in enzyme catalysis that they could not investigate experimentally. So they ran computer simulations at NERSC to gain a complete understanding of the relationship between biomolecular dynamics, structure, and function.

Among the enzymes they examined was a class called beta-lactamases. Researchers knew that these enzymes were responsible for the increasing resistance of bacteria to antibiotics, but the precise chemical mechanisms of that resistance were unknown. So Karplus and Bash ran simulations on NERSC supercomputers to investigate the mechanism at an atomic level of detail.

Over 20 years, Karplus and his research group explored everything from how ATP synthase acts as a motor that fuels cells to how myosin – the molecular engine behind muscles – operates. Today, Karplus’ group is tackling the science behind molecular machines, which may someday power man-made systems: converting sunlight into biofuels, working as tiny “molecular motors” capable of performing chemical analyses or other tests for “lab-on-a-chip” devices, or even “manufacturing” nanodevices.

Understanding the Function of Biomolecules - Joachim Frank, Nobel Prize in Chemistry 2017

Joachim Frank, a NERSC user and principal investigator, shared the 2017 Nobel Prize in Chemistry for developing software used to reconstruct the three-dimensional structures of biological molecules from transmission electron microscopy images. Frank pioneered the computational methods needed to reconstruct the 3D shape of biomolecules from thousands of 2D cryo-EM images, methods employed today by most structural biologists who use electron microscopy. The SPIDER (System for Processing Image Data from Electron Microscopy and Related fields) software package Frank helped develop was for many years one of the most widely used tools for single-particle 3D reconstruction of macromolecular assemblies from cryo-EM image data.

From 2004 to 2006, Frank was the principal investigator on a NERSC project, “Correlative Cryo-EM and Molecular Dynamics Simulations of Ribosomal Structure.” Using NERSC’s IBM SP3 Seaborg system, Frank and his team completed several molecular dynamics simulations of the GTPase-associated center in the 50S ribosome subunit and transfer RNA (tRNA), then compared the molecular dynamics snapshots with molecular structures computationally reconstructed from experimental electron microscopy images using the SPIDER software package.

“Using the single-particle reconstruction technique, Cryo-EM maps have provided valuable visualizations of ribosome binding with numerous factors,” Frank noted in his 2006 allocations request to NERSC. “Thus, so far, Cryo-EM has been the only means to visualize the ribosome in its functional states.”

Cryo-EM work at NERSC has continued, prominently in the lab of Berkeley Lab’s Eva Nogales. In a pair of breakthrough Nature papers published in 2016, researchers in Nogales’ lab at UC Berkeley and Lawrence Berkeley National Laboratory (Berkeley Lab) mapped two important protein functions in unprecedented detail: the role of TFIID, improving our understanding of how our molecular machinery identifies the right DNA to copy, and how proteins unzip double-stranded DNA, which offers insights into the first key steps in gene activation.

They captured these processes at near-atomic resolution by freezing purified samples and photographing them with electrons instead of light – the essence of the cryo-EM technique. The researchers then used supercomputers at NERSC to process and analyze the data.

Those results were followed by another groundbreaking study, published in the journal Science in 2017, in which Nogales and her team used cryo-EM to capture a high-resolution image of a protein ring called an “inflammasome” bound to flagellin, a protein from the whip-like tail that bacteria use to propel themselves. The insight holds great promise for developing strategies to protect against a number of diseases.

The Birth of Precision Cosmology - George Smoot, Nobel Prize in Physics 2006

George F. Smoot, leader of a research team that imaged the infant universe and revealed a pattern of minuscule temperature variations that evolved into the universe we see today, was awarded the 2006 Nobel Prize in Physics along with John Mather “for their discovery of the blackbody form and anisotropy of the cosmic microwave background radiation” (CMB).

After analyzing hundreds of millions of precision measurements in data gathered by an experiment aboard NASA’s Cosmic Background Explorer (COBE) satellite, Smoot and his research team produced maps of the entire sky showing “hot” and “cold” regions with temperature differences of a hundred-thousandth of a degree. These temperature fluctuations, produced when the universe was smaller than a single proton, were consistent with Big Bang predictions and are believed to be the primordial seeds from which our present universe grew.

The COBE analysis did not require supercomputers, but as data volumes grew, the researchers turned to NERSC to meet their analysis and simulation needs, and by 2006 Berkeley Lab and NERSC had become a world focus of CMB research. To simulate and process an entire year’s worth of data from Planck, the third-generation CMB space mission, the team, now led by Berkeley Lab’s Julian Borrill, used 6,000 processors on NERSC’s Seaborg supercomputer for nearly two hours – the first time virtually all of the system’s processors were used on a single code – mapping 75 billion observations into 150 million pixels. For comparison, the COBE sky map had used only 6,144 pixels.
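At its simplest, CMB map-making bins billions of time-ordered detector samples into a much smaller number of sky pixels. The sketch below shows that binning step in Python/NumPy on toy data; real pipelines like those run at NERSC instead solve a huge generalized least-squares problem that also accounts for correlated detector noise.

```python
import numpy as np

def bin_map(pixel_index, tod, n_pixels):
    """Naive map-maker: average all time-ordered data (tod) samples
    that fall into each sky pixel."""
    hits = np.bincount(pixel_index, minlength=n_pixels)                 # samples per pixel
    signal = np.bincount(pixel_index, weights=tod, minlength=n_pixels)  # summed signal
    with np.errstate(invalid="ignore"):
        sky_map = signal / hits   # mean temperature per pixel (NaN where unobserved)
    return sky_map, hits

# Toy example: one million samples scattered over ten thousand pixels.
rng = np.random.default_rng(0)
pixels = rng.integers(0, 10_000, size=1_000_000)
samples = rng.normal(size=1_000_000)
sky_map, hits = bin_map(pixels, samples, n_pixels=10_000)
```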

In 2015 the collaboration generated 1 million synthetic maps using NERSC’s 150,000-core Hopper supercomputer to support the second release of Planck results. Planck’s detailed maps of the CMB describe the universe at the largest and smallest scales, and in 2018 the Planck team was awarded the Gruber Cosmology Prize, given for groundbreaking work that provides new models that inspire and enable fundamental shifts in knowledge and culture.

Follow-on CMB experiments are now in the planning stages, and the CMB team, led by Borrill, made news in 2018 by scaling its modern data simulation and reduction framework, TOAST – developed for the Planck mission and extended to the next generation of even larger missions – to all 658,784 Intel Xeon Phi processor cores on NERSC’s Cori system. The scaling was made possible by the NERSC-developed Shifter container technology and by a collaboration with NERSC’s Exascale Science Applications Program, which helps scientists move their codes onto modern, energy-efficient HPC architectures. In 2020 Berkeley Lab was selected to lead the next-generation cosmic microwave background experiment, CMB-S4.

Our Changing Climate - The 2007 Nobel Peace Prize

Warren Washington was one of the first developers of groundbreaking atmospheric computer models at the National Center for Atmospheric Research in the early 1960s. As the models grew in sophistication and capability, Washington worked to incorporate representations of oceans and sea ice – precursors of today’s climate models, which include components depicting surface hydrology, vegetation, aerosols, and other effects as well as the atmosphere, oceans, and sea ice. The more physics Washington and his collaborators put into the models, the more their need for computing grew, and by the early 2000s Washington and his group were heavy users of NERSC.

In 2004, as deadlines approached for contributions to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC), Washington and his team turned to NERSC for help improving the throughput of their simulations. At the time, the team was running a substantial portion of the climate change scenarios for the U.S. DOE/NSF contribution to the IPCC report.

The climate change scenario simulations used five instances of the Community Climate System Model (CCSM) running as an ensemble, with each instance started from different initial conditions. Run as five separate jobs on NERSC’s Seaborg IBM SP3 system, the simulations were not achieving the required throughput, so the team asked NERSC’s consultants whether there was an efficient way to combine them into a single job. A single job running on 512 or more processors had a number of scheduling and policy advantages but was difficult to arrange with the scheduling and system runtime software of the time.

NERSC took on the challenge, and with the help of NERSC consultants Jonathan Carter (now Berkeley Lab’s Associate Laboratory Director for Computing Sciences) and Harsh Anand – who modified the team’s batch scripts and made novel use of an alternative to the IBM high-speed communication protocol to aggregate the results of the ensemble members – the team achieved the throughput needed to meet its deadlines.
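Conceptually, the fix amounted to launching all of the ensemble members inside one job allocation instead of five separate ones. The Python sketch below illustrates that bundling pattern in modern terms; it is a schematic stand-in, not the actual batch scripts or communication-protocol changes the consultants made, and `ccsm_run` is a hypothetical driver script.

```python
from concurrent.futures import ProcessPoolExecutor
import subprocess

def run_member(initial_conditions):
    """Placeholder: launch one CCSM ensemble member with its own inputs.

    `./ccsm_run` is a hypothetical driver standing in for however the
    model executable was actually invoked.
    """
    return subprocess.run(["./ccsm_run", initial_conditions], check=True)

if __name__ == "__main__":
    # Five ensemble members, identical except for their initial conditions,
    # launched concurrently inside a single job allocation rather than as
    # five separate batch jobs competing in the queue.
    members = [f"init_conditions_{i}.nc" for i in range(5)]
    with ProcessPoolExecutor(max_workers=len(members)) as pool:
        results = list(pool.map(run_member, members))
```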

The results derived from the models Washington helped develop to study the impacts of climate change in the 21st century were used extensively in the 2007 IPCC Fourth Assessment Report, for which Washington and colleagues around the world shared the 2007 Nobel Peace Prize with Al Gore “for their efforts to build up and disseminate greater knowledge about man-made climate change, and to lay the foundations for the measures that are needed to counteract such change.”

Among NERSC users contributing to the assessment were William Collins, Gerald Meehl, David Randall, Lawrence Buja, Daithi Stone, Ruby Leung, Julie Arblaster, David Baker, Aiguo Dai, Marika Holland, Aixue Hu, Elizabeth Hunke, Bette Otto-Bliesner, Claudia Tebaldi, Haiyan Teng, Ronald Stouffer, Inez Fung, Surabi Menon, Curtis Covey, Peter Gleckler, Stephen Klein, Thomas Phillips, Ben Santer, Karl Taylor, and Steven Ghan.

Today NERSC continues to be a focus of climate research: reconstructing past climates from sparse observations, studying extreme climate events, and pioneering the use of machine learning to extract features from climate datasets.

Additional Breakthroughs

NERSC continues to provide systems and expertise required for researchers to conduct world-changing science.

Neutrino Discoveries at Daya Bay

A NERSC-enabled result from the Daya Bay neutrino experiment was named one of Science Magazine’s Top Ten Breakthroughs of 2012 for the discovery of the last neutrino “mixing angle,” a fundamental quantity in high energy physics. Data collected from the experiment, which used neutrinos generated by the Daya Bay nuclear power plant in China, traveled via the National Science Foundation’s GLORIAD network and DOE’s ESnet to NERSC, where it was processed automatically, stored, and shared with collaborators around the world via the Daya Bay Offline Data Monitor, a web-based “science gateway” hosted by NERSC. NERSC is the only U.S. site where all of the raw, simulated, and derived Daya Bay data were analyzed and archived. Because of this ease of data movement and its automated analysis and sharing pipelines, Daya Bay was able to beat its competitors in determining the mixing angle.

Advances in Water Desalination

Another notable achievement was named one of Smithsonian Magazine’s Top 5 Most Surprising Milestones of 2012. Using NERSC resources, researchers Jeffrey Grossman and David Cohen-Tanugi of the Massachusetts Institute of Technology came up with a new approach to water desalination using a different kind of filtration material: sheets of graphene, a one-atom-thick form of the element carbon, which they say can be far more efficient and possibly less expensive than existing desalination systems. “NERSC proved to be an invaluable resource for our research,” said Cohen-Tanugi, an MIT graduate student and lead author of the Nano Letters paper. “The professionalism of the support staff, the comprehensive database of user guides and tutorials, as well as the amount of computational power offered by NERSC were extremely helpful to our work.” That same year, Grossman and Cohen-Tanugi were presented with the inaugural NERSC Award for High-Impact Scientific Achievement.

A Record Quantum Circuit Simulation

When two researchers from the Swiss Federal Institute of Technology (ETH Zurich) announced in April 2017 that they had successfully simulated a 45-qubit quantum circuit, the science community took notice: it was the largest-ever simulation of a quantum computer, and another step closer to the threshold of “quantum supremacy” – the point at which quantum computers become more powerful than ordinary computers. Researchers Thomas Häner and Damien Steiger, both Ph.D. students at ETH, used 8,192 of the 9,688 Intel Xeon Phi processors on NERSC’s Cori supercomputer to run the simulation. Optimizing the quantum circuit simulator was key to the successful run: Häner and Steiger employed automatic code generation, optimized the compute kernels, and applied a scheduling algorithm to the quantum supremacy circuits, reducing the required node-to-node communication. During the optimization process they worked with NERSC staff and used Berkeley Lab’s Roofline Model to identify areas where performance could be boosted.
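The brute-force approach such simulators optimize is state-vector simulation: an n-qubit register is a vector of 2^n complex amplitudes, so 45 qubits require roughly half a petabyte at double-complex precision and must be distributed across thousands of nodes. The single-node sketch below shows the core gate-application kernel on a toy register; it is a generic illustration, not Häner and Steiger’s optimized code.

```python
import numpy as np

def apply_single_qubit_gate(state, gate, target, n_qubits):
    """Apply a 2x2 unitary `gate` to qubit `target` of an n-qubit state vector.

    The vector holds 2**n_qubits complex amplitudes, which is why memory
    doubles with every added qubit.
    """
    # Reshape so axis 1 enumerates the target qubit (qubit 0 = least significant bit).
    state = state.reshape(2 ** (n_qubits - target - 1), 2, 2 ** target)
    state = np.einsum("ab,ibj->iaj", gate, state)  # contract the gate with that axis
    return state.reshape(-1)

# Toy example: put qubit 0 of a 3-qubit register into an equal superposition.
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
psi = np.zeros(2 ** 3, dtype=complex)
psi[0] = 1.0  # start in |000>
psi = apply_single_qubit_gate(psi, hadamard, target=0, n_qubits=3)
```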

“Our optimizations improved the performance between 10x and 20x for Cori,” Häner and Steiger said. “The time-to-solution decreased by over 12x when compared to the times of a similar simulation reported in a recent paper on quantum supremacy by Sergio Boixo and collaborators, which made the 45-qubit simulation possible.”

Gravitational Wave Sources

In one final example, a team led by Berkeley Lab’s Dan Kasen published a pair of groundbreaking papers in the journal Nature in 2017 showing that observations of the first visible object associated with a gravitational wave detection were consistent with detailed simulations, run at NERSC, of neutron star mergers that produce a “kilonova.” The team also showed that such events can synthesize the heavy elements that exist in the universe today but were not produced in the Big Bang, confirming a long-suspected but unproven mechanism.