Nonequilibrium thermodynamic computing at scale
Science/CS Domain(s)
Machine learning, AI+HPC, statistical physics, unconventional computing
Project description
With the rise of AI, the energy cost of conventional computation is becoming unsustainable. One promising route to reducing this cost is thermodynamic computing: whereas thermal noise must be suppressed at great energetic expense in digital and quantum computers, thermodynamic computers are instead powered by it. For instance, a circuit can be engineered so that the evolution of its states under thermal noise carries out a desired computation, analogous to a traditional neural network (Whitelam & Casert, Nat. Commun., 2026).
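As a rough illustration of "computation by thermal relaxation" (a toy sketch only, not the model of the cited paper: the double-well potential, time step, and temperature below are all assumptions chosen for clarity):

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(x):
    # Hypothetical engineered landscape: a double well with
    # minima near x = -1 and x = +1, each encoding an "output".
    return (x**2 - 1.0) ** 2

def grad_energy(x):
    return 4.0 * x * (x**2 - 1.0)

def langevin_relax(x0, steps=5000, dt=1e-3, temperature=0.05):
    # Overdamped Langevin dynamics: drift down the energy gradient
    # plus thermal noise of strength sqrt(2 * T * dt). The noise is
    # what drives the device; no external clocking is simulated.
    x = x0
    for _ in range(steps):
        x += -grad_energy(x) * dt + np.sqrt(2.0 * temperature * dt) * rng.normal()
    return x

# An initial state in the right basin relaxes toward the x = +1 minimum.
x_final = langevin_relax(0.5)
```

Here the "program" is the shape of the energy landscape; the thermal environment supplies the dynamics that evaluates it.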
It remains unclear, however, how the network topology of such a thermodynamic circuit should be optimized for specific computational tasks, and which training methods (gradient-based vs. non-gradient-based) converge fastest.
The goal of this internship is to perform large-scale simulations of thermodynamic computers at NERSC, with the aim of better understanding different network topologies and their energy landscapes, as well as comparing training methods.
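To make the gradient vs. non-gradient contrast concrete, here is a minimal sketch of non-gradient (SPSA-style zeroth-order) training of landscape parameters, under assumed simplifications: a quadratic toy energy E(x) = ||x||²/2 − b·x with a trainable bias b, and a loss that asks the noisy relaxed state to land near a target. None of this is the project's actual setup.

```python
import numpy as np

rng = np.random.default_rng(1)
target = np.array([1.0, -1.0])
b = np.zeros(2)  # trainable parameter of the toy energy E(x) = 0.5*||x||^2 - b @ x

def rollout(b, steps=200, dt=0.05, temperature=0.01):
    # Overdamped Langevin relaxation in the parameterized landscape;
    # the quadratic energy has its minimum at x = b.
    x = np.zeros(2)
    for _ in range(steps):
        x += -(x - b) * dt + np.sqrt(2.0 * temperature * dt) * rng.normal(size=2)
    return x

def loss(b):
    # Noisy objective: squared distance of the relaxed state from the target.
    return np.sum((rollout(b) - target) ** 2)

# SPSA-style update: estimate the gradient from two noisy loss
# evaluations along a random sign-vector perturbation, so no
# derivative of the stochastic dynamics is ever needed.
eps, lr = 0.1, 0.05
for _ in range(300):
    delta = rng.choice([-1.0, 1.0], size=2)
    g = (loss(b + eps * delta) - loss(b - eps * delta)) / (2.0 * eps) * delta
    b -= lr * g
```

A gradient-based alternative would instead differentiate through the rollout (e.g. with PyTorch autograd); comparing the convergence speed of these two families on realistic circuit topologies is one question the project targets.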
Desired skills/background
- Required: Python, an ML framework such as PyTorch, HPC experience
- Nice to have: Statistical physics knowledge
Apply to join this project
To apply or ask a question about this project:
Project mentor
Corneel Casert
Machine Learning Engineer
National Energy Research Scientific Computing Center (NERSC)
Science Engagement & Workflows Dept.
Data & AI Services Group