NERSC: Powering Scientific Discovery Since 1974

Kristy Kallback-Rose

Kristy A. Kallback-Rose
Storage Systems Group
National Energy Research Scientific Computing Center
Fax: (510) 486-6459
Lawrence Berkeley National Laboratory
1 Cyclotron Road
Mail Stop 59R4010A
Berkeley, CA 94720 US

Biographical Sketch

Kristy Kallback-Rose joined the NERSC Storage Systems Group in early 2017. In recent years, Kristy has worked with geographically distributed filesystems and archival storage, including GPFS and HPSS, primarily at Indiana University. Prior to that she worked in a variety of roles, including grid computing in support of the ATLAS project, databases, development, and instruction. Kristy has undergraduate degrees in Japanese and Physics and a master's in Physics, and is pleased to be working at NERSC in support of scientific research.


Conference Papers

K Kallback-Rose, D Antolovic, R Ping, K Seiffert, C Stewart, T Miller, "Conducting K-12 Outreach to Evoke Early Interest in IT, Science, and Advanced Technology", ACM, July 16, 2012,

This is a preprint of a paper presented at XSEDE '12: The 1st Conference of the Extreme Science and Engineering Discovery Environment, Chicago, Illinois.


Kirill Lozinskiy, Lisa Gerhardt, Annette Greiner, Ravi Cheema, Damian Hazen, Kristy Kallback-Rose, Rei Lee, User-Friendly Data Management for Scientific Computing Users, Cray User Group (CUG) 2019, May 9, 2019,

Wrangling data at a scientific computing center can be a major challenge for users, particularly when quotas may impact their ability to utilize resources. In such an environment, a task as simple as listing space usage for one's files can take hours. The National Energy Research Scientific Computing Center (NERSC) has roughly 50 PBs of shared storage utilizing more than 4.6B inodes, and a 146 PB high-performance tape archive, all accessible from two supercomputers. As data volumes increase exponentially, managing data is becoming a larger burden on scientists. To ease the pain, we have designed and built a “Data Dashboard”. Here, in a web-enabled visual application, our 7,000 users can easily review their usage against quotas, discover patterns, and identify candidate files for archiving or deletion. We describe this system, the framework supporting it, and the challenges for such a framework moving into the exascale age.

Kristy Kallback-Rose, NERSC Site Update at the Linear Tape User Group, May 2, 2018,

NERSC site update focusing on plans to implement new tape technology at the Berkeley Data Center.

NERSC Site Report focusing on current storage challenges for disk-based and tape-based systems.


Glenn K. Lockwood, Damian Hazen, Quincey Koziol, Shane Canon, Katie Antypas, Jan Balewski, Nicholas Balthaser, Wahid Bhimji, James Botts, Jeff Broughton, Tina L. Butler, Gregory F. Butler, Ravi Cheema, Christopher Daley, Tina Declerck, Lisa Gerhardt, Wayne E. Hurlbert, Kristy A. Kallback-Rose, Stephen Leak, Jason Lee, Rei Lee, Jialin Liu, Kirill Lozinskiy, David Paul, Prabhat, Cory Snavely, Jay Srinivasan, Tavia Stone Gibbins, Nicholas J. Wright,
"Storage 2020: A Vision for the Future of HPC Storage", October 20, 2017, LBNL LBNL-2001072,

As the DOE Office of Science's mission computing facility, NERSC will follow this roadmap and deploy these new storage technologies to continue delivering storage resources that meet the needs of its broad user community. NERSC's diversity of workflows encompass significant portions of open science workloads as well, and the findings presented in this report are also intended to be a blueprint for how the evolving storage landscape can be best utilized by the greater HPC community. Executing the strategy presented here will ensure that emerging I/O technologies will be both applicable to and effective in enabling scientific discovery through extreme-scale simulation and data analysis in the coming decade.