NERSC: Powering Scientific Discovery Since 1974

ASCR Leadership Computing Challenge (ALCC) Projects at NERSC


The mission of the ASCR Leadership Computing Challenge (ALCC) is to provide an allocation program for projects of interest to the Department of Energy (DOE) with an emphasis on high-risk, high-payoff simulations in areas directly related to the DOE mission and for broadening the community of researchers capable of using leadership computing resources.

Open to scientists from the research community in industry, academia, and national laboratories, the ALCC program allocates time on computational resources at ASCR’s supercomputing facilities. ASCR supercomputing facilities include NERSC at Lawrence Berkeley National Laboratory and the Leadership Computing Facilities at Argonne and Oak Ridge National Laboratories. These resources represent some of the world’s fastest and most powerful supercomputers. 

Allocations of computer time are awarded through a competitive process by the DOE Office of Advanced Scientific Computing Research. For more information about the program and instructions on how to apply, see the DOE ALCC page.

Computer Usage Charging 

When a job runs on NERSC's supercomputer, Cori, charges accrue against one of a user's repository allocations. The unit of accounting for these charges is the "NERSC Hour" (equivalent to the previous NERSC "MPP Hour").

A parallel job is charged for exclusive use of each multi-core node allocated to the job.  The base charge for each node used by the job is given in the following table.

System   Node Architecture       Base Charge per Node Hour (NERSC Hours)   System Size (Nodes)   Cores per Node
Cori     Intel Xeon Phi (KNL)    80                                        9,304                 68
Cori     Intel Xeon (Haswell)    140                                       2,004                 32


Examples:

  • A parallel job that uses 10 Cori Haswell nodes for 5 hours accrues 140 x 10 x 5 = 7,000 NERSC Hours.
  • A similar job that runs on 10 Cori KNL nodes for 5 hours accrues 80 x 10 x 5 = 4,000 NERSC Hours.
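The arithmetic behind these examples (base charge per node-hour, times nodes, times wall-clock hours) can be sketched as a small calculator. This is a hypothetical helper, not an official NERSC tool; the base charges are taken from the table above.

```python
# Hypothetical helper (not a NERSC tool): compute the base NERSC Hours a job accrues.
# Base charges per node-hour are taken from the Cori charging table.
BASE_CHARGE = {
    "knl": 80,       # Cori Intel Xeon Phi (KNL)
    "haswell": 140,  # Cori Intel Xeon (Haswell)
}

def nersc_hours(architecture: str, nodes: int, wall_hours: float) -> float:
    """Base charge only; the actual charge may be modified by other factors."""
    return BASE_CHARGE[architecture] * nodes * wall_hours

# The two examples above:
print(nersc_hours("haswell", 10, 5))  # 7000
print(nersc_hours("knl", 10, 5))      # 4000
```

Note that a job is charged for exclusive use of each node it is allocated, so the node count here is the full allocation, not the number of cores actually used.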

NERSC Hours are designed to allow a common currency to be used across different NERSC systems and architectures. Historically, 1 NERSC hour is approximately equivalent to the charge for 1 core hour on the retired Hopper system (1 AMD "Magny-Cours" processor core hour).

The base charge can be modified by a number of factors. Please see How Usage is Charged for details.


Allocation Period

The ALCC year runs from July through June. This is shifted six months from the NERSC allocation cycle, so each project's full ALCC award is allocated 50% in one NERSC allocation year and 50% in the next. Any or all of this time can be shifted from one year to the other upon request. Unused time from July through December is automatically transferred into the following year; however, ALCC-awarded time does not carry over past June 30.
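The 50/50 split and the July–December carryover described above can be sketched as follows. This is a simplified illustrative model under the stated policy, not NERSC's actual accounting logic.

```python
# Simplified sketch (an assumption, not NERSC's accounting system):
# an ALCC award is split evenly across the two NERSC allocation years it
# spans, and unused first-half time rolls into the second half.

def split_award(total_hours: float) -> tuple[float, float]:
    """50% of the award lands in each of the two overlapping allocation years."""
    half = total_hours / 2
    return half, half

def usable_in_second_half(total_hours: float, used_jul_dec: float) -> float:
    """Unused July-December time automatically joins the second-half 50%.
    Whatever remains after June 30 expires and is not modeled here."""
    first_half, second_half = split_award(total_hours)
    carryover = max(first_half - used_jul_dec, 0.0)
    return second_half + carryover

# Example: a 400,000 node-hour award with 150,000 used by December
# leaves 50,000 to carry over, so 250,000 are usable January-June.
print(usable_in_second_half(400_000, 150_000))  # 250000.0
```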

NERSC ALCC Projects for 2019-2020

For the ALCC 2019-2020 campaign, ASCR received 75 proposals and, through a competitive review process, chose 37 proposals to receive allocations totaling 16.4 million node-hours. At NERSC, 10 research teams received a total of 4.1 million node-hours to use on the Cori supercomputer:

  • David Trebotich (Lawrence Berkeley National Laboratory) received 860,000 node-hours for “Multiphase Flow in Shale.”
  • J. Ilja Siepmann (University of Minnesota), Evgenii Fetisov (Pacific Northwest National Laboratory), Jason Goodpaster (University of Minnesota), Chris Knight (Argonne National Laboratory), Christopher Mundy (Pacific Northwest National Laboratory), and Yongchul Chung (Pusan National University) received 200,000 node-hours for “Predictive Modeling and Machine Learning for Functional Nanoporous Materials,” a Consortium/End-Station Proposal.
  • Frithjof Karsch (Brookhaven National Laboratory), Swagato Mukherjee (Brookhaven National Laboratory), Alexei Bazavov (Michigan State University), Heng-Tong Ding (Central China Normal University, China), Peter Petreczky (Brookhaven National Laboratory), and Patrick Steinbrecher (Brookhaven National Laboratory) received 800,000 node-hours for “Higher order cumulants of net-charge fluctuations.”
  • Aida El-Khadra (University of Illinois, Urbana-Champaign), Carleton DeTar (University of Utah), Steven Gottlieb (Indiana University), Elvira Gámiz (University of Granada), Andreas Kronfeld (Fermi National Accelerator Laboratory), John Laiho (Syracuse University), Doug Toussaint (University of Arizona), and Ruth Van de Water (Fermilab) received 330,000 node-hours for “Semileptonic B- and D-meson form factors with high precision.”
  • Katherine Calvin (Pacific Northwest National Laboratory), Dave Bader (Lawrence Livermore National Laboratory), Susannah Burrows (Pacific Northwest National Laboratory), Ruby Leung (Pacific Northwest National Laboratory), Mathew Maltrud (Los Alamos National Laboratory), Mark Taylor (Sandia National Laboratories), and Peter Thornton (Oak Ridge National Laboratory) received 490,000 node-hours for “Investigating energy-climate-biogeochemistry sensitivity with the Energy Exascale Earth System Model (E3SM).”
  • Eric Lancon (Brookhaven National Laboratory), Douglas Benjamin (Argonne National Laboratory), Abid Malik (Brookhaven National Laboratory), and Paolo Calafiura (Lawrence Berkeley National Laboratory) received 400,000 node-hours for “Scaling LHC proton-proton collision simulations and Machine Learning for the ATLAS experiment.”
  • Gary S. Grest (Sandia National Laboratories), Shengfeng Cheng (Virginia Polytechnic Institute and State University), Sanat Kumar (Columbia University), Dvora Perahia (Clemson University), and Michael Rubinstein (Duke University) received 270,000 node-hours for “Large Scale Numerical Simulations of Polymer Nanocomposites.”
  • Stephen Jardin (Princeton Plasma Physics Laboratory), Nate Ferraro (Princeton Plasma Physics Laboratory), and Brendan Lyons (General Atomics) received 395,000 node-hours for “Study of a Disrupting Plasma in ITER.”
  • Frederico Fiuza (SLAC National Accelerator Laboratory), Anna Grassi (Stanford University), and Warren Mori (University of California, Los Angeles) received 395,000 node-hours for “Large-scale kinetic simulations of particle acceleration in laser-driven shocks.”
  • Darin Comeau (Los Alamos National Laboratory), Xylar Asay-Davis (Los Alamos National Laboratory), Matthew Hoffman (Los Alamos National Laboratory), Wuyin Lin (Brookhaven National Laboratory), Mark Petersen (Los Alamos National Laboratory), Stephen Price (Los Alamos National Laboratory), and Andrew Roberts (Los Alamos National Laboratory) received 360,000 node-hours for “Cryospheric Physics in E3SM: Impacts of Antarctic Ice Shelf Melting, Southern Ocean Resolution, and Sea Ice Coupling on Global Climate.”