NERSC: Powering Scientific Discovery Since 1974

Your First Program on Edison

From Logging in to Submitting a Job

To follow this page you will need a NERSC username and password, and you must be a member of an allocated project account ("repo"). If you do not have all of these, please visit the Accounts Page.

Logging in

% ssh -l username edison.nersc.gov

When you successfully log in you will land in your $HOME directory.

First Program Code: Parallel Hello World

Open a new file called helloWorld.f90 with a text editor such as emacs or vi, and copy the contents of the code below into the file.

program helloWorld
  implicit none
  include "mpif.h"
  integer :: myPE, numProcs, ierr

  call MPI_INIT(ierr)
  call MPI_COMM_RANK(MPI_COMM_WORLD, myPE, ierr)
  call MPI_COMM_SIZE(MPI_COMM_WORLD, numProcs, ierr)
  print *, "Hello from Processor ", myPE
  call MPI_FINALIZE(ierr)
end program helloWorld

Compile the Program

Use the compiler "wrappers" to compile codes on Edison: ftn for Fortran, cc for C, and CC for C++.

% ftn helloWorld.f90
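With no options, the wrapper produces an executable named a.out, which is the name the batch script below launches. If you prefer a descriptive name, the standard -o compiler flag works with the wrappers too; for example:

```
% ftn -o helloWorld helloWorld.f90
```

The rest of this page assumes the default a.out.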

Run the Program

Two things are needed to run a (parallel) code on Edison: (1) you must request compute node resources from the batch system; and (2) you must launch your job onto the compute nodes using the 'srun' command. There are two ways to make the batch system request: (1) ask for an interactive batch session with a command such as 'salloc -p debug' (which has a 30-minute limit), or (2) submit a batch script (see below).
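As a sketch, an interactive test run of the example above might look like the following (requesting the same two nodes the batch script below uses; your prompts and wait times will differ):

```
% salloc -N 2 -p debug
% srun -n 48 ./a.out
% exit
```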

Create a Batch Script

Open a file called my_batch_script with a text editor like vi or emacs and paste in the contents below. The batch script tells the Edison batch system what compute node resources to reserve for your job and how to launch your application on the compute nodes it has reserved.

Contents of file: my_batch_script

#!/bin/bash -l
#SBATCH -p debug    # submit to the debug partition
#SBATCH -N 2        # request 2 compute nodes
#SBATCH -t 30:00    # request 30 minutes of wall-clock time
#SBATCH -J my_job   # name the job "my_job"

srun -n 48 ./a.out  # launch the executable with 48 MPI tasks
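The '-n 48' on the srun line is not arbitrary: each Edison compute node has 24 cores (two 12-core Intel "Ivy Bridge" processors), so the two nodes requested with '#SBATCH -N 2' can run 48 MPI tasks, one per core. A quick shell sketch of the arithmetic (the variable names here are illustrative only, not Slurm settings):

```shell
# Illustrative arithmetic only: these are ordinary shell variables,
# not directives recognized by Slurm.
NODES=2            # matches '#SBATCH -N 2' above
CORES_PER_NODE=24  # cores per Edison compute node
NTASKS=$((NODES * CORES_PER_NODE))
echo "srun -n ${NTASKS} ./a.out"   # prints: srun -n 48 ./a.out
```

To run fewer tasks per node (for example, to give each task more memory), lower the srun '-n' count while keeping the same node request.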

Submit Your Job to the Queue

The sbatch command is used on the login nodes to submit your batch script so your job will run on the Edison compute nodes.

% sbatch my_batch_script

A jobid will be returned, such as 551690.

Monitor Your Job in the Queue

After you submit your job, the system scheduler will check to see if there are compute nodes available to run the job. If there are compute nodes available, your job will start running. If there are not, your job will wait in the queue until there are enough resources to run your application. You can monitor your position in the queue using several different commands, such as:

edison% squeue -u username
edison% scontrol show job <jobid>

Examine Your Job's Output

When your job has completed you should see a file called slurm-<jobid>.out in the directory from which you submitted the job. Note that the ranks print in no particular order, and the ordering can differ from run to run:

edison% cat slurm-552240.out 
 Hello from Processor            0
 Hello from Processor           28
 Hello from Processor            4
 Hello from Processor            7
 Hello from Processor            8
 Hello from Processor           11
 Hello from Processor           12
 Hello from Processor           13
 Hello from Processor           14
 Hello from Processor           15
 Hello from Processor           16
 Hello from Processor           17
 Hello from Processor           18
 Hello from Processor           19
 Hello from Processor           20
 Hello from Processor           21
 Hello from Processor           22
 Hello from Processor           23
 Hello from Processor            1
 Hello from Processor            2
 Hello from Processor            3
 Hello from Processor            5
 Hello from Processor            6
 Hello from Processor            9
 Hello from Processor           10
 Hello from Processor           33
 Hello from Processor           34
 Hello from Processor           35
 Hello from Processor           36
 Hello from Processor           37
 Hello from Processor           40
 Hello from Processor           41
 Hello from Processor           42
 Hello from Processor           43
 Hello from Processor           44
 Hello from Processor           45
 Hello from Processor           47
 Hello from Processor           24
 Hello from Processor           25
 Hello from Processor           26
 Hello from Processor           27
 Hello from Processor           29
 Hello from Processor           30
 Hello from Processor           31
 Hello from Processor           32
 Hello from Processor           38
 Hello from Processor           39
 Hello from Processor           46