NERSC: Powering Scientific Discovery Since 1974


NERSC Exascale Science Applications Program

NERSC has launched the NERSC Exascale Science Applications Program (NESAP), a collaborative effort in which NERSC will partner with code teams and with library and tools developers to prepare for the manycore architecture of Cori, the NERSC-8 system.

NESAP represents an important opportunity for researchers to prepare application codes for the new architecture and to help advance the missions of the Department of Energy's Office of Science. Through the NESAP partnership, 20 projects will collaborate with NERSC, Cray, and Intel, gaining access to early hardware and to special training and preparation sessions with Intel and Cray staff. Eight of those 20 will also host a postdoctoral researcher to investigate computational science issues associated with energy-efficient manycore systems. In addition, about 24 more projects, as well as library and tools developers, will participate in NESAP via NERSC training sessions and early access to prototype and production hardware.

The 20 selected projects were chosen based on computational and scientific reviews by NERSC and other DOE staff. NESAP began in Fall 2014 and will remain active through the delivery of the Cori system to NERSC in mid-2016. During this period, the project teams, guided by NERSC, Cray, and Intel, will undertake intensive efforts to adapt their software to take advantage of Cori's Knights Landing manycore architecture and to use the resulting codes to produce pathbreaking science on an architecture that may represent an approach to exascale systems.

The 20 projects included in the NESAP partnership with NERSC, Cray, and Intel:

  • Advanced Scientific Computing Research (ASCR):
    • Optimization of the BoxLib Adaptive Mesh Refinement Framework for Scientific Application Codes, Ann Almgren (Lawrence Berkeley National Laboratory); Postdoc assigned
    • High-Resolution CFD and Transport in Complex Geometries Using Chombo-Crunch, David Trebotich (Lawrence Berkeley National Laboratory); Postdoc assigned
  • Biological and Environmental Research (BER):
    • CESM Global Climate Modeling, John Dennis (National Center for Atmospheric Research)
    • High-Resolution Global Coupled Climate Simulation Using The Accelerated Climate Model for Energy (ACME), Hans Johansen (Lawrence Berkeley National Laboratory)
    • Multi-Scale Ocean Simulation for Studying Global to Regional Climate Change, Todd Ringler (Los Alamos National Laboratory)
    • Gromacs Molecular Dynamics (MD) Simulation for Bioenergy and Environmental Biosciences, Jeremy C. Smith (Oak Ridge National Laboratory)
    • Meraculous, a Production de novo Genome Assembler for Energy-Related Genomics Problems, Katherine Yelick (Lawrence Berkeley National Laboratory) 
  • Basic Energy Science (BES):
    • Large-Scale Molecular Simulations with NWChem, Eric Jon Bylaska (Pacific Northwest National Laboratory)
    • Parsec: A Scalable Computational Tool for Discovery and Design of Excited State Phenomena in Energy Materials, James Chelikowsky (University of Texas, Austin)
    • BerkeleyGW: Massively Parallel Quasiparticle and Optical Properties Computation for Materials and Nanostructures, Jack Deslippe (NERSC)
    • Materials Science using Quantum Espresso, Paul Kent (Oak Ridge National Laboratory); Postdoc assigned
    • Large-Scale 3-D Geophysical Inverse Modeling of the Earth, Greg Newman (Lawrence Berkeley National Laboratory)
  • Fusion Energy Sciences (FES):
    • Understanding Fusion Edge Physics Using the Global Gyrokinetic XGC1 Code, Choong-Seock Chang (Princeton Plasma Physics Laboratory)
    • Addressing Non-Ideal Fusion Plasma Magnetohydrodynamics Using M3D-C1, Stephen Jardin (Princeton Plasma Physics Laboratory)
  • High Energy Physics (HEP):
    • HACC (Hardware/Hybrid Accelerated Cosmology Code) for Extreme Scale Cosmology, Salman Habib (Argonne National Laboratory)
    • The MILC Code Suite for Quantum Chromodynamics (QCD) Simulation and Analysis, Doug Toussaint (University of Arizona)
    • Advanced Modeling of Particle Accelerators, Jean-Luc Vay (Lawrence Berkeley National Laboratory)
  • Nuclear Physics (NP):
    • Domain Wall Fermions and Highly Improved Staggered Quarks for Lattice QCD, Norman Christ (Columbia University) and Frithjof Karsch (Brookhaven National Laboratory)
    • Chroma Lattice QCD Code Suite, Balint Joo (Jefferson National Accelerator Facility)
    • Weakly Bound and Resonant States in Light Isotope Chains Using MFDn -- Many Fermion Dynamics Nuclear Physics, James Vary and Pieter Maris (Iowa State University)

Resources available to NESAP code teams

  • A partner from NERSC’s Application Readiness team who will assist with code profiling and optimization
  • Access to Cray and Intel resources to help with code optimization
  • Up to 1M MPP hours in 2014 and 2M MPP hours in 2015 for code testing, optimization, scaling and debugging on Edison
  • Early access to prototype Knights Landing processor hardware (expected in late 2015)
  • Early access and significant hours on the full Cori system (expected delivery mid-2016)
  • Opportunity for a postdoctoral researcher to be placed within your application team (NERSC will fund eight postdoctoral researchers and place each within one of the 20 NESAP teams, so approximately 40% of NESAP application teams will include a NERSC-sponsored postdoc)

Additional NESAP Application Teams with access to NERSC training and early hardware

  • GTC-P (William Tang/PPPL)
  • GTS (Stephane Ethier/PPPL)
  • VORPAL (John Cary/TechX)
  • TOAST (Julian Borrill/LBNL)
  • Qbox/Qb@ll (Yosuke Kanai/U. North Carolina)
  • CALCLENS and ROCKSTAR (Risa Wechsler/Stanford)
  • WEST (Marco Govoni/U. Chicago)
  • QLUA (William Detmold/MIT)
  • P3D (James Drake/U. Maryland)
  • WRF (John Michalakes/ANL)
  • PHOSIM (Andrew Connolly/U. Washington)
  • SDAV tools (Hank Childs/U. Oregon)
  • M3D/M3D-K (Linda Sugiyama/MIT)
  • DGDFT (Lin Lin/U.C. Berkeley)
  • GIZMO/GADGET (Joel Primack/U.C. Santa Cruz)
  • ZELMANI (Christian Ott/Caltech)
  • VASP (Martijn Marsman/U. Vienna)
  • NAMD (James Phillips/U. Illinois)
  • PHOENIX-3D (Eddie Baron/U. Oklahoma)
  • ACE3P (Cho-Kuen Ng/SLAC)
  • S3D (Jacqueline Chen/SNL)
  • ATLAS (Paolo Calafiura/LBNL)
  • BBTools genomics tools (Jon Rood/LBNL, JGI)
  • DOE MiniApps (Alice Koniges/LBNL)
  • HipGISAXS (Alexander Hexemer/LBNL)
  • GENE (Frank Jenko/UCLA)


About NERSC and Berkeley Lab
The National Energy Research Scientific Computing Center (NERSC) is the primary high-performance computing facility for scientific research sponsored by the U.S. Department of Energy's Office of Science. Located at Lawrence Berkeley National Laboratory, the NERSC Center serves more than 6,000 scientists at national laboratories and universities researching a wide range of problems in combustion, climate modeling, fusion energy, materials science, physics, chemistry, computational biology, and other disciplines. Berkeley Lab is a U.S. Department of Energy national laboratory located in Berkeley, California. It conducts unclassified scientific research and is managed by the University of California for the U.S. DOE Office of Science.