
Featured Announcements

2014 Call for NERSC Initiative for Scientific Exploration (NISE) Program Due December 8

November 18, 2013 by Francesca Verdier

Users may now submit requests for the 2014 NERSC Initiative for Scientific Exploration (NISE) program. The deadline to apply is Sunday, December 8, at 11:59 PM Pacific Time.


Call for Nominations for NERSC HPC Achievement Awards Due December 16

November 18, 2013 by Francesca Verdier

Nominations are open for the 2014 NERSC Award for Innovative Use of High Performance Computing and the 2014 NERSC Award for High Impact Scientific Achievement. NERSC Principal Investigators, Project Managers, PI Proxies, and DOE Program Managers may nominate any NERSC user or collaboration. The deadline for nominations is Monday, December 16, 2013. Winners will be announced at the NERSC Users Group meeting on February 4, 2014, listed on the NERSC web site, and highlighted in a NERSC press release. Award recipients will be chosen by representatives from the NERSC Users Group Executive Committee and NERSC staff.


2014 Call for NERSC's Data Intensive Computing Pilot Program Due December 10

November 18, 2013 by Francesca Verdier

NERSC's Data Intensive Computing Pilot Program is now open for its second round of allocations to projects in data-intensive science. This pilot aims to support and enable scientists to tackle their most demanding data-intensive challenges. Selected projects will pilot new methods and technologies targeting data management, analysis, and dissemination.


NERSC at SC13

November 12, 2013 by Francesca Verdier

SC13 takes place November 17-22 at Denver's Colorado Convention Center. NERSC is well represented at SC13; see the full post for the list of activities.


OpenCL Now Available on Dirac GPU Cluster

October 30, 2013 by Francesca Verdier

As requested by many Dirac users, NERSC is migrating Dirac to a newer Linux version to enable capabilities such as OpenCL. We aim to convert the whole Dirac system to Scientific Linux 6.3 by mid-November.
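
As a quick check that OpenCL is actually usable once a node is on the new image, a small device-enumeration program along the following lines may help. This is only a sketch: it assumes the NVIDIA OpenCL headers and library are present on the SL6 nodes, and the compile line (for example, gcc cl_query.c -lOpenCL -o cl_query) may need adjusting for local paths.

/* cl_query.c: list OpenCL platforms and devices (illustrative sketch) */
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platforms[4];
    cl_device_id devices[8];
    cl_uint nplat = 0, ndev = 0, p, d;
    char name[256];

    if (clGetPlatformIDs(4, platforms, &nplat) != CL_SUCCESS || nplat == 0) {
        fprintf(stderr, "No OpenCL platforms found\n");
        return 1;
    }
    for (p = 0; p < nplat; p++) {
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME, sizeof(name), name, NULL);
        printf("Platform %u: %s\n", p, name);
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8, devices, &ndev) != CL_SUCCESS)
            continue;
        for (d = 0; d < ndev; d++) {
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
            printf("  Device %u: %s\n", d, name);
        }
    }
    return 0;
}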
 
We have migrated the beta nodes to Scientific Linux 6.3. Please run some test jobs to make sure your code will work on the new system. In particular, please test your MPI+CUDA codes in a multi-node environment; a minimal smoke test is sketched after the module instructions below.
 
You can get to one of these nodes with:
qsub -I -V -q dirac_int -l nodes=1:ppn=8:beta

Or, to get two nodes:
qsub -I -V -q dirac_reg -l nodes=2:ppn=8:beta

Inside the job, load the latest CUDA module and the SL6 flavor of gcc:
module unload cuda
module load cuda/5.5
module unload pgi
module load gcc-sl6
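
For the multi-node MPI+CUDA check, a minimal smoke test like the one below may be a useful starting point: each rank reports its host name and the CUDA devices it can see. This is a sketch, not a supported recipe; the compile line is an assumption and will depend on the local MPI wrappers and CUDA install paths (for example, mpicc mpi_cuda_test.c -I$CUDA_HOME/include -L$CUDA_HOME/lib64 -lcudart, where $CUDA_HOME stands in for whatever path the cuda/5.5 module provides).

/* mpi_cuda_test.c: each MPI rank reports its host and visible GPUs (sketch) */
#include <stdio.h>
#include <mpi.h>
#include <cuda_runtime.h>

int main(int argc, char **argv)
{
    int rank, size, hostlen, ngpus = 0, d;
    char host[MPI_MAX_PROCESSOR_NAME];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Get_processor_name(host, &hostlen);

    if (cudaGetDeviceCount(&ngpus) != cudaSuccess)
        ngpus = 0;  /* treat any runtime error as "no devices visible" */

    printf("Rank %d of %d on %s sees %d CUDA device(s)\n", rank, size, host, ngpus);
    for (d = 0; d < ngpus; d++) {
        struct cudaDeviceProp prop;
        if (cudaGetDeviceProperties(&prop, d) == cudaSuccess)
            printf("  Rank %d: device %d = %s\n", rank, d, prop.name);
    }

    MPI_Finalize();
    return 0;
}

Run it across both nodes of the two-node job above (e.g. with mpirun) and confirm that the ranks on each host report the expected GPUs.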

If you don't have access to Dirac, you may request access.

