Welcome to NERSC
Welcome to the National Energy Research Scientific Computing Center (NERSC), a high performance scientific computing center. This document will guide you through the basics of using NERSC's supercomputers, storage systems, and services.
What is NERSC?
NERSC provides High Performance Computing and Storage facilities and support for research sponsored by, and of interest to, the U.S. Department of Energy Office of Science. NERSC has the unique programmatic role of supporting all six Office of Science program offices: Advanced Scientific Computing Research, Basic Energy Sciences, Biological and Environmental Research, Fusion Energy Sciences, High Energy Physics, and Nuclear Physics. Scientists who have been awarded research funding by any of the offices are eligible to apply for an allocation of NERSC time. Additional awards may be given to non-DOE funded project teams whose research is aligned with the Office of Science's mission. Allocations of time and storage are made by DOE.
NERSC has about 6,000 active user accounts from across the U.S. and internationally.
NERSC is a national center, organizationally part of Lawrence Berkeley National Laboratory in Berkeley, CA. NERSC staff and facilities are primarily located at Berkeley Lab's Shyh Wang Hall on the Berkeley Lab campus.
Computing & Storage Resources
Major computing resources
- Cori, a Cray XC40 with 76,416 compute cores of Intel Xeon ("Haswell") and 658,784 compute cores of Intel Xeon Phi ("Knights Landing"). The Xeon nodes have a total of 307 TB of memory, and the Xeon Phi nodes have a total of nearly 1.1 PB of memory. Cori has 30 PB of disk, 1.8 PB of flash-based storage in a burst buffer, and features the Cray "Aries" high-speed internal network.
File systems
- Local Scratch: Cori has a dedicated local scratch file system. The default user quota on Cori is 20 TB.
- Project: the project file system provides permanent storage to groups of users who want to share data. The default quota is 1 TB and can be increased by request. Project is available from all NERSC compute systems.
- HPSS Archival Storage: NERSC's archival storage system provides up to 240 PB of permanent, archival data storage.
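As a quick orientation, the sketch below prints where these file systems typically appear in a login environment. The scratch and project paths shown are assumptions for illustration, not guaranteed locations; check your own environment variables on the system.

```shell
# Hedged sketch: typical locations of the major file systems.
# $SCRATCH is set for you on Cori; the fallback path and the project
# path are assumptions -- substitute your own repository name.
echo "home:    ${HOME}"
echo "scratch: ${SCRATCH:-/global/cscratch1/sd/$USER}"   # default quota 20 TB
echo "project: /project/projectdirs/<repo_name>"         # default quota 1 TB
```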
New NERSC Accounts
In order to use the NERSC facilities you need:
- Access to an allocation of computational or storage resources as a member of a project account called a repository.
- A user account with an associated user login name (also called username).
If you are not a member of a project that already has a NERSC award, you may apply for an allocation. If you need to get a new user account that will be associated with an existing NERSC award, you should submit a request for a new NERSC account.
Each person has a single password associated with their login account. This password is known by various names: NERSC password, NIM password, and NERSC LDAP password are all commonly used. As a new user, you will receive an email with a link to set your initial password. You should also answer the security questions; this will allow you to reset your password yourself should you forget it. See Passwords.
If you enter an incorrect password five times in a row when accessing a NERSC system, your account on that system will be locked. To clear these failed logins, simply log in to NIM; doing so clears all your login failures on all NERSC systems.
Accounting Web Interface (NIM)
You log into the NERSC Information Management (NIM) web site at https://nim.nersc.gov/ to manage your NERSC accounts. In NIM you can check your daily allocation balances, change your password, run reports, update your contact information, change your login shell, etc. See NIM Web Portal.
How to Get Help
With an emphasis on enabling science and providing user-oriented systems and services, NERSC encourages you to ask questions. There are many ways to do so.
Your primary resources are the NERSC main web site, the NERSC documentation site, and the HPC Consulting and Account Support staff. The consultants can be reached by phone, email, or the web ticketing system during business hours (Pacific Time). NERSC's consultants are HPC experts and can answer just about any question.
The NERSC Operations staff is available 24 hours a day, seven days a week to give you status updates and reset your password. The NERSC web site is always available, with a rich set of documentation, tutorials, and live status information.
Technical questions, computer operations, passwords, and account support
1-800-666-3772 (or 1-510-486-8600)
Computer Operations = menu option 1 (24/7)
Account Support = menu option 2, email@example.com
HPC Consulting = menu option 3, or help.nersc.gov
Online Help Desk = https://help.nersc.gov/
Computer Operations (24/7) can reset your password and provide machine status information. Account Support and HPC Consulting are available 8am-5pm Pacific Time on business days.
NERSC and our vendors supply a rich set of HPC utilities, applications, and programming libraries. If there is something missing that you would like to have on our systems, please submit a request on help.nersc.gov and we will evaluate it for appropriateness, cost, effort, and benefit to the community.
For a list of available software, see NERSC Software.
When you log in to any NERSC computer (other than HPSS), you start in your global $HOME directory; you land in the same place no matter which machine you connect to. This means that if you have files or binary executables that are specific to a particular system, you must manage their locations yourself.
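One common convention is to key system-specific files off the $NERSC_HOST environment variable, which NERSC systems set to the machine name. The directory layout below is just one possible convention, not a NERSC requirement; the fallback value is an assumption so the sketch runs anywhere.

```shell
# Hypothetical layout: keep system-specific binaries in $HOME/<machine>/bin
# and prepend the right one at login. $NERSC_HOST is set on NERSC systems;
# we fall back to "cori" here purely for illustration.
NERSC_HOST="${NERSC_HOST:-cori}"
export PATH="$HOME/$NERSC_HOST/bin:$PATH"
echo "${PATH%%:*}"    # first PATH entry: $HOME/$NERSC_HOST/bin in this sketch
```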
We provide several ways to transfer data both within and outside NERSC. To transfer files to or from NERSC, we suggest using the dedicated Data Transfer Nodes, which are optimized for bandwidth and have access to most of the NERSC file systems.
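As a sketch of what such a transfer might look like: dtn01.nersc.gov is one of the data transfer nodes, while the username and paths below are placeholders you would replace with your own.

```shell
# Hypothetical transfer: push a local results directory to Cori scratch
# through a Data Transfer Node. Replace "elvis" and the paths with your own.
rsync -avP ./results/ elvis@dtn01.nersc.gov:/global/cscratch1/sd/elvis/results/
```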
For more detailed information on data transfer, see Transferring Data.
Archiving Files with HPSS
The High Performance Storage System (HPSS) is a modern, flexible, performance-oriented mass storage system that has been used at NERSC for archival storage since 1998. It is a valuable resource for permanently archiving users' data.
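On NERSC systems the usual HPSS clients are hsi and htar. The session below is a hedged sketch: the directory and archive names are placeholders, and the exact options you need may differ.

```shell
# Hypothetical HPSS session using the hsi and htar clients.
htar -cvf run42.tar ./run42    # bundle a local directory into HPSS as run42.tar
hsi ls                         # list files in your HPSS home directory
htar -xvf run42.tar            # later: retrieve and unpack the bundle
```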
For more information about the specific features of HPSS see Getting Started with HPSS.