20th Century Reanalysis Project
Key Challenges: Assimilate historical weather observations from sources as diverse as 19th-century sea captains and turn-of-the-century explorers and medical doctors to reconstruct a comprehensive global atmospheric circulation dataset at six-hour intervals from 1871 to the present, and use this dataset to analyze (and reanalyze) global weather conditions over that period.
Why it Matters: Historical climate data will help us understand current weather patterns, especially extreme weather events. It will help explain climate variations that may have misinformed early-20th-century policy decisions, and it will provide an important validation check on climate models being used to make 21st-century climate projections. Results from this work serve as key validation tests for simulations assessing the societal and economic impacts of climate change and will build confidence in model projections of regional changes and high-impact, extreme events.
Accomplishments: First complete database of 3-D global weather maps at a quality level similar to today’s three-day weather forecasts but dating back over 100 years. The project has more than doubled the number of years for which a complete record of three-dimensional atmospheric climate data is available for climate studies and has provided quantitative estimates of the uncertainties in those data. It has produced a much longer and more detailed record of past weather variability than was previously available and has supplied missing information – and key insight – on historically important extreme weather events such as the 1930s Dust Bowl, the deadly 1922 “Knickerbocker storm,” and a variety of El Niño episodes.
The weather maps resulting from this work provide the first estimates of global tropospheric variability at a quality level similar to today’s three-day weather forecasts but dating back over 100 years. This allows climate scientists to view current weather patterns in historical perspective and determine whether current extremes – such as the 2011 Northeast U.S. blizzard – are changing.
• Provide missing information about the conditions in which extreme climate events occurred
• Reproduce the 1922 Knickerbocker storm and give a comprehensive description of the 1918 El Niño
• Supply data that can be used to validate climate and weather models
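The quantitative uncertainty estimates mentioned above come from running an ensemble of analyses: the spread across ensemble members at each grid point serves as an error bar on the reconstructed field. A minimal sketch of that idea in Python, using synthetic data rather than the actual 20th Century Reanalysis output (the 56-member size matches the project's version 2 ensemble; the grid and noise levels here are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for an ensemble of sea-level-pressure analyses:
# 56 members (as in 20CR version 2) on a small illustrative lat-lon grid.
n_members, n_lat, n_lon = 56, 91, 180
truth = 101325.0 + 500.0 * rng.standard_normal((n_lat, n_lon))
members = truth + 300.0 * rng.standard_normal((n_members, n_lat, n_lon))

# The analysis is the ensemble mean; the standard deviation across
# members is the quantitative uncertainty estimate for each grid point.
analysis = members.mean(axis=0)
spread = members.std(axis=0, ddof=1)

print(analysis.shape, spread.shape)  # (91, 180) (91, 180)
```

Where the spread is small, many independent analyses agree despite sparse observations; where it is large, the historical record constrains the state only weakly.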
The work commenced on an IBM RS/6000 SP with 6,080 IBM Power processors, IBM’s proprietary Colony switch interconnect, the AIX operating system, and the GPFS filesystem. It was completed on a Cray XT4 with 38,128 AMD Opteron processor cores, Cray’s proprietary SeaStar interconnect, the Cray Linux Environment operating system, and the Lustre filesystem on DDN storage devices.
NERSC serves as the “master” site for the full dataset; NGF and HPSS are crucial. NERSC user support was significant: optimizing data transfer to NOAA, NCAR, and ORNL; porting, tuning, and debugging the application; and creating scripting solutions. The project was a major beneficiary of the Franklin upgrade, particularly in I/O rates (~2X faster, with greater consistency). NERSC consultants also helped add multi-level parallelism to bundle several parallel jobs for high throughput, simplified data handling, and rewrote post-processing MPI code. External login nodes on Hopper were deployed (in part) to accommodate this project’s heavy aprun needs.
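The multi-level parallelism described above bundles many independent runs into a single submission, so one batch job keeps the whole allocation busy instead of cycling each run through the queue. The scheduling idea can be sketched in Python; here each "run" is a trivial placeholder subprocess, whereas the real workflow launched `aprun` invocations of the reanalysis executable from within batch scripts:

```python
import concurrent.futures
import subprocess
import sys

def run_member(member_id: int) -> int:
    """Launch one independent run as its own process.

    The command below is a placeholder; on Franklin/Hopper it would be
    an aprun invocation of the reanalysis code for one ensemble member
    or date range (hypothetical mapping, for illustration only).
    """
    result = subprocess.run(
        [sys.executable, "-c", f"print('member {member_id} done')"],
        capture_output=True, text=True, check=True,
    )
    return result.returncode

# Bundle several independent runs into one job: launch them concurrently
# so the allocated nodes stay busy for the life of the batch job.
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    codes = list(pool.map(run_member, range(8)))

print(codes)  # [0, 0, 0, 0, 0, 0, 0, 0]
```

The design point is throughput, not speedup of any single run: with many loosely coupled ensemble members, filling the allocation with concurrent independent jobs wastes far fewer node-hours than running them one at a time.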
NERSC Science Gateway: The 20th Century Reanalysis Project Ensemble Gateway provides data from this project, offering temperature, pressure, humidity, and wind estimates in roughly 200 km sections around the world, from 1871 to 2008.
Investigators: Gilbert Compo (University of Colorado)