
NERSC Announces Winners of Inaugural HPC Achievement Awards

February 15, 2013

By Linda Vu

The Department of Energy’s National Energy Research Scientific Computing Center (NERSC) announced the winners of its inaugural High Performance Computing (HPC) Achievement Awards at the annual NERSC User Group meeting at Lawrence Berkeley National Laboratory (Berkeley Lab).

The awardees are all NERSC users who have either demonstrated an innovative use of HPC resources to solve a scientific problem or whose work has had an exceptional impact on scientific understanding or society. In an effort to encourage young scientists who are using HPC in their research, NERSC also presented two early career awards.

“High performance computing is changing how science is being done, and facilitating breakthroughs that would have been impossible a decade ago,” says NERSC Director Sudip Dosanjh. “The 2013 NERSC Achievement Award winners highlight some of the ways this trend is expanding our fundamental understanding of science, and how we can use this knowledge to benefit humanity.”

“The winning projects represent just a small sample of the groundbreaking research being done with NERSC resources. We received so many great nominations, and I look forward to seeing these projects and researchers spotlighted in future HPC Achievement Awards,” says Richard Gerber, Deputy Lead of NERSC’s User Services Group.

“I am especially impressed by our Early Career award winners,” says Dave Goodwin, who manages NERSC for the Department of Energy’s Office of Science. “These young researchers are the future of science, and it is really gratifying to see that they recognize the important role of HPC in science and achieve such incredible scientific results so early in their careers.”

NERSC Award for High Impact Scientific Achievement

Jeff Grossman and David Cohen-Tanugi, Massachusetts Institute of Technology

Jeff Grossman (left) and David Cohen-Tanugi (right), Massachusetts Institute of Technology

Using supercomputers at NERSC, Grossman and Cohen-Tanugi came up with a new approach for desalinating seawater using sheets of graphene, a one-atom-thick form of the element carbon. This method holds the promise of being far more efficient and possibly less expensive than existing desalination systems. With world populations expected to keep growing and potable water projected to grow scarcer over the coming century, a practical and cheap means of desalinating seawater is one of materials science’s holy grails.

The key to this process is very precise control over the size of the holes in the graphene sheet. Using NERSC’s Hopper and Carver systems, the researchers aimed to control the properties of the material down to the atomic level, producing a graphene sheet perforated with precisely sized holes. They also added other elements to the material, causing the edges of these minuscule openings to interact chemically with water molecules – either repelling or attracting them. They found that the ideal pore size is about one nanometer, or one-billionth of a meter. If the holes are just a bit smaller – 0.7 nanometers – the water won’t flow through at all.
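These size regimes lend themselves to a simple illustration. The Python sketch below is not code from the study; it merely encodes the thresholds reported above, and the upper bound on a useful pore is an illustrative assumption rather than a reported number.

```python
# A minimal sketch (not code from the study) encoding the pore-size
# regimes described above: at 0.7 nm and below, water cannot pass at
# all, and the simulations found roughly 1 nm to be ideal. The upper
# bound used here is an illustrative assumption, not a reported number.

def classify_pore(diameter_nm: float) -> str:
    """Classify a graphene pore by the approximate size regimes
    reported by the MIT simulations."""
    if diameter_nm <= 0.7:
        return "blocked: too small for water to flow"
    if diameter_nm <= 1.2:  # assumed upper bound; ~1 nm is ideal
        return "desalinating: water flows while salt is rejected"
    return "leaky: assumed too wide to reliably reject salt"

for d in (0.5, 0.7, 1.0, 1.5):
    print(f"{d:.2f} nm -> {classify_pore(d)}")
```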

One common method of desalination, called reverse osmosis, uses membranes to filter the salt from the water. But these systems require extremely high pressure – and hence, energy use – to force water through the thick membranes, which are about a thousand times thicker than graphene. The new graphene system operates at much lower pressure and thus could purify water at far lower cost. In December, Smithsonian Magazine named this result the fifth “Surprising Scientific Milestone of 2012.”

Related News

A New Approach to Water Desalination

NERSC Award for High Impact Scientific Achievement – Early Career

Tanmoy Das, Postdoctoral Researcher at Los Alamos National Laboratory

Tanmoy Das, Los Alamos National Laboratory

Das was nominated for his computational work to understand fundamental aspects of materials in three different areas: (1) the role of Fermi surface anisotropy on the superconducting gap structure in multiband iron-based superconductors in the presence of rotating magnetic fields; (2) spin-orbit ordering effects in two-dimensional electron gases and in the hidden order state of URu2Si2; and (3) his seminal contributions to the self-consistent spin-fluctuation theory applied to real materials, the intermetallic actinides.

Running his own MPI algorithm on approximately 256 cores, primarily on NERSC’s Hopper, Das computed the first-ever field-temperature phase diagram of the four-fold oscillations in the specific heat and the thermal conductivity of the new iron-based superconductor, using a material-specific Fermi surface parameterization. The calculations showed a large effect of the Fermi surface on oscillations in thermal observables in a rotating magnetic field, even for fully isotropic superconducting gaps.
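To give a sense of how such a calculation is typically parallelized, here is a generic mpi4py sketch, not Das’s actual algorithm: each point of a field-temperature grid is independent, so the grid can be divided across MPI ranks and the results gathered back. The `specific_heat` function is a hypothetical stand-in for the real material-specific many-body calculation.

```python
# A generic mpi4py sketch, not Das's actual algorithm: each point of a
# field-temperature grid is an independent calculation, so the grid is
# divided round-robin across MPI ranks and the results gathered on
# rank 0. specific_heat is a hypothetical stand-in for the real
# material-specific many-body calculation.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

temperatures = np.linspace(0.1, 2.0, 64)  # illustrative units of Tc
fields = np.linspace(0.0, 1.0, 64)        # illustrative units of Hc2
grid = [(T, H) for T in temperatures for H in fields]

def specific_heat(T: float, H: float) -> float:
    # Placeholder physics: the real calculation would evaluate the
    # quasiparticle spectrum for this (T, H) point.
    return T * np.exp(-1.0 / T) * (1.0 + 0.1 * H)

# Each rank computes every size-th grid point, starting at its rank.
local = [(i, specific_heat(T, H))
         for i, (T, H) in enumerate(grid) if i % size == rank]

gathered = comm.gather(local, root=0)
if rank == 0:
    diagram = np.empty(len(grid))
    for chunk in gathered:
        for i, value in chunk:
            diagram[i] = value
    diagram = diagram.reshape(len(temperatures), len(fields))
    print("phase diagram computed:", diagram.shape)
```

Launched with something like `mpirun -n 256 python sweep.py`, the same embarrassingly parallel pattern scales to the core counts mentioned above.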

He also demonstrated for the first time that the peak-dip-hump structure in the spectral function could be explained in terms of spin fluctuations in the particle-hole spectrum. He did this by running his MPI algorithm on 2,000 to 3,000 cores to compute renormalized spectral functions in a self-consistent many-body self-energy approximation for the intermetallic Pu-115 and UCoGa5 compounds. This work could have important implications for future technologies.

NERSC Award for Innovative Use of High Performance Computing

Peter Nugent and the Palomar Transient Factory Team, Lawrence Berkeley National Laboratory

Peter Nugent, Lawrence Berkeley National Laboratory

The detection of transient events could lead to a greater understanding of astrophysical objects like supernovae, active galaxies, and gamma-ray bursts, among a variety of other known and unknown cosmic phenomena. But a major challenge has been identifying transient objects, in real time, among a scene of normal cosmic variations. The Palomar Transient Factory (PTF) was the first project dedicated solely to finding and following up transient events in real time, and the PTF team worked with NERSC to develop an automated system to sift through terabytes of astronomical data every night to find interesting events.

Every night for about four years, the PTF camera – a 100-megapixel instrument mounted on the 48-inch Samuel Oschin Telescope at Palomar Observatory in Southern California – automatically snapped pictures of the sky, then sent those observations to NERSC. There, computers running machine learning algorithms in the Real-time Transient Detection pipeline scoured the data for “transients,” or cosmic objects that change in brightness or position, by comparing the new observations with all of the data collected on previous nights. Once an interesting event was discovered, an automated system sent its coordinates to ground-based telescopes around the world for follow-up observations. NERSC also archived this data and allowed collaborators to access it over the Internet through a web-based science gateway called DeepSky.
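At the heart of the detection step is image differencing. The miniature Python sketch below shows the basic idea: subtract a reference built from earlier nights from the new exposure and flag pixels that brightened significantly. The real PTF pipeline also aligns images, matches point-spread functions, and runs machine learning classifiers on the candidates; none of that is shown here.

```python
# A miniature illustration of transient detection by image
# differencing: subtract a reference image (co-added earlier nights)
# from the new exposure and flag pixels that brightened well above
# the noise. Synthetic data only.
import numpy as np

rng = np.random.default_rng(0)
reference = rng.normal(100.0, 5.0, size=(512, 512))  # co-added past nights
new_image = reference + rng.normal(0.0, 5.0, size=reference.shape)
new_image[200, 300] += 400.0  # inject a fake transient for the demo

diff = new_image - reference
noise = diff.std()
candidates = np.argwhere(diff > 5.0 * noise)  # 5-sigma brightenings
for y, x in candidates:
    print(f"candidate at pixel ({x}, {y}), amplitude {diff[y, x]:.0f}")
```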

In 2011, the PTF Real-time Transient Detection Pipeline discovered the closest Type Ia supernova in a generation – approximately 21 million light-years from Earth – just hours after it exploded, and the automated system immediately sent its coordinates to telescopes around the world for follow-up observations. By catching the supernova early, astronomers caught a rare glimpse of the outer layers of the explosion, which contained hints about the star as it once was. In 2010 the PTF discovered a new class of superluminous supernovae; while rare, these are among the brightest explosions in the Universe. The PTF pipeline also yielded the first-ever direct observations of a Type Ia supernova progenitor system, showing that the system contained a red giant star and that it underwent at least one smaller nova eruption before ending its life in a destructive supernova. And recently, while digging through the PTF data archive at NERSC, astronomers found the first causal evidence that massive stars, which can explode as Type IIn supernovae, shed huge amounts of material in a “penultimate outburst” before their final destructive detonation.

NERSC Award for Innovative Use of High Performance Computing – Early Career

Edgar Solomonik, University of California, Berkeley

Edgar Solomonik, University of California, Berkeley

Solomonik was nominated for developing novel algorithms for massively parallel tensor contractions and applying them to quantum chemistry problems, specifically coupled-cluster theory, the de facto standard for important scientific applications such as the thermochemistry of combustion and the excited states of systems where density-functional theory (DFT) breaks down.

His algorithms represent a major development in the area of tensor computations. Rather than using one-sided communication and dynamic load balancing, as is done in NWChem and similar codes, Solomonik transforms irregular symmetric tensor contractions into highly regular computations that are solved using the topology-aware, communication-avoiding dense linear algebra methods developed in Jim Demmel’s group at UC Berkeley.
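For a concrete picture, the sketch below is a toy NumPy illustration, not CTF code, of the kind of contraction coupled-cluster codes spend most of their time on: the particle-particle ladder term contracting two-electron integrals with doubles amplitudes. Dimensions here are tiny; in production these tensors are distributed over thousands of nodes, which is the problem CTF automates.

```python
# A toy NumPy illustration (not CTF code) of a coupled-cluster-style
# tensor contraction: the particle-particle ladder term contracting
# two-electron integrals V with doubles amplitudes T.
import numpy as np

nv, no = 8, 4                       # virtual and occupied orbital counts
V = np.random.rand(nv, nv, nv, nv)  # <ab||cd> two-electron integrals
T = np.random.rand(nv, nv, no, no)  # t_{ij}^{cd} doubles amplitudes

# R_{ij}^{ab} += 1/2 * sum_{cd} <ab||cd> * t_{ij}^{cd}
R = 0.5 * np.einsum("abcd,cdij->abij", V, T)
print(R.shape)  # (8, 8, 4, 4)
```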

Solomonik’s algorithmic developments are instantiated in the Cyclops Tensor Framework (CTF), which has been used on some of the largest supercomputers in the world, including the NERSC Hopper system and the IBM Blue Gene/Q systems at Lawrence Livermore National Laboratory and the Argonne Leadership Computing Facility. To date, the largest application has achieved roughly 0.5 petaflop/s, about 30 percent of theoretical peak, on Blue Gene/Q. On the Cray XE6 system at NERSC, CTF is faster than NWChem, the previous state-of-the-art coupled-cluster code.


About NERSC and Berkeley Lab
The National Energy Research Scientific Computing Center (NERSC) is a U.S. Department of Energy Office of Science User Facility that serves as the primary high performance computing center for scientific research sponsored by the Office of Science. Located at Lawrence Berkeley National Laboratory, NERSC serves almost 10,000 scientists at national laboratories and universities researching a wide range of problems in climate, fusion energy, materials science, physics, chemistry, computational biology, and other disciplines. Berkeley Lab is a DOE national laboratory located in Berkeley, California. It conducts unclassified scientific research and is managed by the University of California for the U.S. Department of Energy. Learn more about computing sciences at Berkeley Lab.