1
ARSC Initiatives in Weather, Climate and Ocean Modeling
Dr. Gregory Newby, Chief Scientist, Arctic Region Supercomputing Center
Presentation to the HPC User Forum, September 9, 2008
2
Common Themes with Other User Forum Presenters
Lots of work with WRF
Work with CCSM
Mature community-developed applications that scale reasonably well and are hungry for more CPU power
Desire for higher resolution to better approximate reality
Desire to add increasingly sophisticated processes to the phenomena under investigation
– Therefore, these items will not be major themes of today's talk
3
ARSC Themes of Potential Interest to the HPC User Forum
Performance analysis of current (multicore) and forthcoming (Cell, GPU, FPGA, T2+) processors for WRF
Quasi-operational WRF with NWS/NOAA users
Wildfire smoke prediction based on WRF/Chem
Model coupling
High-resolution ocean forecasts
Arctic ice tide impacts
WRF sensitivity analysis
– These will be the major themes of today's talk
4
Multicore Performance Analysis on ARSC's Supercomputers
Mostly on Midnight (dual core)
– Sun x2200m2 & x4600 cluster nodes
– 2312 Opteron cores, IB, 4 GB/core, Lustre
– 12.88 TFLOP theoretical peak
Benchmarks
Real-world applications
Forthcoming: Pingo (quad core)
– Cray XT5
– 3456 cores, SeaStar, 4 GB/core, Lustre
– About 30 TFLOP theoretical peak
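As a rough sanity check on those peak figures, a minimal sketch; the clock rates and flops-per-cycle values are assumptions for illustration, not numbers from the slides:

```python
# Sanity check on quoted theoretical peaks: peak = cores x clock x FP ops/cycle.
# The clock rates and flops-per-cycle values below are assumptions for
# illustration, not figures from the slides.
def peak_tflops(cores, clock_ghz, flops_per_cycle):
    return cores * clock_ghz * flops_per_cycle / 1000.0

# Midnight: assume 2.8 GHz dual-core Opterons doing 2 FP ops/cycle
print(peak_tflops(2312, 2.8, 2))   # ~12.95, close to the 12.88 TFLOP quoted

# Pingo: assume 2.2 GHz quad-core Opterons doing 4 FP ops/cycle
print(peak_tflops(3456, 2.2, 4))   # ~30.4, matching "about 30 TFLOP"
```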
5
2-socket AMD64 topology
[Diagram: two dual-core packages (CPU 0,1 and CPU 2,3) connected by one 1 GHz 16x16 HyperTransport link per supported processor, with 8 GB/second bandwidth]
6
8-socket AMD64: Sun Fire X4600 Server
7
8-socket AMD64 topology
[Diagram: eight sockets, CPU0 through CPU7, linked by HyperTransport; all HT links operate at 1 GHz and 8 GB/s]
8
WRF Performance on Midnight
Time required to compute 1 forecast hour on the 3-nest domain
9
Next-Generation Processor Performance
GPU, Cell, quad-core Xeon & Opteron, FPGA, CMT
– Many benchmarks completed, some applications
Forthcoming: WRF test cases for AK domains (submitted for AGU 2008)
– GPU (nVidia 9800 GTX), possibly FireStream
– FPGA: WRF doesn't have enough "hot spots," and porting BLAS is too much work (but RapidMind might help)
– Cell: QS22 cluster being configured; should work well
– UltraSPARC T2+: scaled well to 8 threads per socket, but WRF is too CPU-bound to benefit much from T2+ CMT
10
Quasi-Operational and Research Weather Modeling
Work has been evolving since 2005
Initially, efforts were directed at quasi-operational, scheduled WRF forecasts for the Fairbanks Weather Forecast Office (WFO). The goal was to leverage lots of CPUs to provide high-resolution forecasts, and to continually assess & improve them
The scheduled runs continue to dominate our efforts, but rigorous analysis and study has yielded important results
11
2-way Nested Domains for ARSCwrf
[Map: three nested domains of 200x200x75, 421x328x75, and 151x151x75 cells]
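For a sense of scale, a minimal sketch of the cell counts involved; this is pure arithmetic on the numbers above, and the d01/d02/d03 labels are assumed, not from the slide:

```python
# Cell counts for the three ARSCwrf nests quoted above (pure arithmetic).
# The d01/d02/d03 domain labels are assumed for illustration.
nests = {"d01": (200, 200, 75), "d02": (421, 328, 75), "d03": (151, 151, 75)}
for name, (nx, ny, nz) in nests.items():
    print(f"{name}: {nx * ny * nz / 1e6:.1f} M cells")
# d01: 3.0, d02: 10.4, d03: 1.7 -- the middle nest dominates the cell count
# (finer nests also take shorter timesteps, so cost is not cell count alone).
```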
12
Operational WRF – AWIPS
This is the end product of our twice-daily WRF runs, as seen on the NWS Advanced Weather Interactive Prediction System (FAI WFO)
13
Value to WFO FAI
Grid resolution is approximately 5 km. The ability to populate the forecast grid with a reasonably accurate, high-resolution weather model saves the forecasters a lot of time. In the best case, the model output is the NDFD product.
14
Verification Products
Four days after a forecast, we automatically retrieve all available observations and tabulate a comparison of the WRF forecast for each location vs. the observations
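A minimal sketch of that kind of tabulation; the station data and layout here are illustrative assumptions, not ARSC's actual verification code:

```python
# Minimal forecast-vs-observation tabulation: bias and RMSE per variable.
# The station data and layout here are illustrative assumptions only.
from math import sqrt

def verify(pairs):
    """pairs: list of (forecast, observed) values for one variable/location."""
    errs = [f - o for f, o in pairs]
    bias = sum(errs) / len(errs)
    rmse = sqrt(sum(e * e for e in errs) / len(errs))
    return bias, rmse

# e.g. 2-m temperature (deg C) pairs collected four days after the run
pairs = [(-12.1, -10.5), (-8.3, -9.0), (-15.0, -14.2)]
bias, rmse = verify(pairs)
print(f"T2: bias {bias:+.2f} C, RMSE {rmse:.2f} C")
```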
15
Data Assimilation
Motivation: a better set of initial conditions
Current initial conditions are interpolated from coarser-resolution model runs, leaving much room for error
With data assimilation, we perturb our original input grid by carefully assimilating available observations
We are currently adding this capability to our operational runs; once implemented, we will perform assimilated and non-assimilated runs side by side for comparison
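A schematic of the idea: a single Cressman-style successive-correction pass that nudges a background grid toward observations. This illustrates the concept only; it is not the assimilation scheme used for these runs:

```python
# Schematic data assimilation step: one Cressman-style successive-correction
# pass that nudges a background grid toward nearby observations.
# Illustrative only; not the scheme used operationally.
import numpy as np

def cressman_update(background, xs, ys, obs_x, obs_y, obs_val, radius):
    analysis = background.copy()
    for ox, oy, ov in zip(obs_x, obs_y, obs_val):
        # Innovation: observation minus background at the nearest grid point
        i = np.abs(ys - oy).argmin()
        j = np.abs(xs - ox).argmin()
        innov = ov - background[i, j]
        # Spread the innovation with Cressman distance weights
        d2 = (xs[None, :] - ox) ** 2 + (ys[:, None] - oy) ** 2
        w = np.where(d2 < radius**2, (radius**2 - d2) / (radius**2 + d2), 0.0)
        analysis += w * innov
    return analysis

xs = np.linspace(0, 100, 51)        # grid x coordinates (km)
ys = np.linspace(0, 100, 51)        # grid y coordinates (km)
bg = np.full((51, 51), -10.0)       # first-guess 2-m temperature field
anal = cressman_update(bg, xs, ys, [40.0], [60.0], [-7.0], radius=25.0)
print(anal[np.abs(ys - 60).argmin(), np.abs(xs - 40).argmin()])  # pulled to -7
```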
16
Data Assimilation
Test case: 48-hour forecast starting at 00Z on 01 May 2007
– Surface obs: temperature, dewpoint, pressure, winds
– Vertical soundings from raobs and satellite: temperature, dewpoint, pressure, winds
17
Alaska Air Quality Impacted by Wildfires
South Fairbanks, July 6, 2004: air quality particulate level at approximately 10 micrograms/cubic meter.
South Fairbanks, June 28, 2004: air quality particulate level at approximately 900 micrograms/cubic meter.
Photos courtesy of Dr. James Conner, FNSB
http://www.dec.state.ak.us/air/am/2004_wf_sum.htm
18
Initial architecture: WRF/Chem Smoke Dispersion System
[Flowchart: MODIS fire detection & burn area, DEM, and static fuel data feed the FOFEM fire-emissions stage (fire spread, fuel moisture, plume dynamics, emission factors), which produces gridded hourly emissions; WRF-MET provides the meteorology forecast; both feed WRF-Chem's particulate, chemical & meteorological forecast, followed by WRF/Chem postprocessing of WRF-Chem netCDF output. All stages share identical WRF domain initialization (SI/WPS).]
Adapted from a scheme used by our partners from the US Forest Service Fire Science Lab in Missoula.
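A minimal sketch of how those stages chain together; all function names and values here are hypothetical placeholders, not the actual system:

```python
# Schematic of the smoke-dispersion data flow shown above.
# All function names and values are hypothetical placeholders.
def detect_fires(modis_granules):
    """MODIS fire detection & burn area -> list of fire locations/areas."""
    return [{"lat": 64.8, "lon": -147.7, "area_km2": 12.0}]

def fofem_emissions(fires, fuel_load):
    """FOFEM: fires + static fuel data -> gridded hourly emissions."""
    return [f["area_km2"] * fuel_load for f in fires]

def wrf_chem(emissions, met_forecast):
    """WRF-Chem: emissions + WRF-MET meteorology -> particulate forecast."""
    return {"surface_pm25": emissions, "met": met_forecast}

fires = detect_fires(modis_granules=[])
emis = fofem_emissions(fires, fuel_load=3.5)
forecast = wrf_chem(emis, met_forecast={"wind": "NE"})
print(forecast["surface_pm25"])
```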
19
Fire Detection and Burn Area Example
21
Example Product for Monday, 8 September 2008, 9:00 UTC
A weak smoke concentration in the Fairbanks area, caused by northeasterly winds and extended fires north/east of the Yukon, was clearly confirmed. Source: smoke.uaf.edu
22
Coupling Work: Regional Arctic Climate Model & More
Practical ongoing work, DoE-funded. Wieslaw Maslowski, PI
– ARSC is also looking at some more general approaches to coupled climate models, including adding permafrost, ice sheets, hydrology, ecosystems, human infrastructure, and coastal erosion. Major partner: International Arctic Research Center, UAF
RACM is a regional Arctic climate system model including WRF, VIC (land surface model), and NAPC (ocean and sea ice model). Current emphasis is coupling WRF, VIC and NAPC via cpl7
Cpl7 is still in development; our work will keep pace with cpl7 progress. CCSM4 project groups have made early-release codes available to RACM partners
Since cpl7 is designed for CCSM4, whose framework differs from that of WRF and VIC, some changes to cpl7 and WRF are necessary
23
Partitioning of RACM Processing
[Diagram: ATM, LND, OCN, ICE and CPL components within MPI_COMM_WORLD, plus CPL-ATM, CPL-LND and CPL-ICE pairings — 5 components, 10 communication groups]
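A minimal sketch of how such a partitioning is typically set up with communicator splits; mpi4py is used here for brevity (the real coupler is Fortran), and the rank ranges are assumptions:

```python
# Minimal sketch: carving component communicators out of MPI_COMM_WORLD,
# as in the RACM partitioning above. Rank ranges are illustrative only.
from mpi4py import MPI

world = MPI.COMM_WORLD
rank = world.Get_rank()

# Assign each rank to one component by rank range (assumed layout).
if rank < 8:
    color, comp = 0, "ATM"
elif rank < 12:
    color, comp = 1, "LND"
elif rank < 20:
    color, comp = 2, "OCN"
elif rank < 24:
    color, comp = 3, "ICE"
else:
    color, comp = 4, "CPL"

# Each component gets its own communicator for internal exchanges...
comp_comm = world.Split(color, key=rank)

# ...and pairwise groups (e.g. CPL+ATM) handle coupler field exchange.
cpl_atm = world.Split(0 if comp in ("CPL", "ATM") else MPI.UNDEFINED, rank)

print(f"world rank {rank} -> {comp}, local rank {comp_comm.Get_rank()}")
```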
24
ROMS and Beyond
ROMS is an ocean model that can operate at both very large and very small scales, and takes shorelines into account
At ARSC: ongoing research for the northwest Pacific, Gulf of Alaska, Beaufort Sea & Bering Sea
Also, partnering to couple ROMS with other best-of-breed models for ocean/ice prediction
25
CCSM3 Coupling for Ocean / Ice Modeling
26
The ROMS Regional Setups
27
Nutrient Phytoplankton Zooplankton (NPZ) Models
[Diagram: ROMS driving the ecosystem model — physical forcing (wind, temperature, sunlight, mixing) drives nutrients (NO3, NH4, …), primary producers (phytoplankton), secondary producers (zooplankton), and fish]
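A minimal NPZ sketch: a classic three-box formulation with illustrative parameter values, not the ROMS-coupled ecosystem model itself:

```python
# Minimal NPZ (Nutrient-Phytoplankton-Zooplankton) box model, forward Euler.
# All parameter values are illustrative, not those of the ROMS ecosystem model.
def npz_step(N, P, Z, dt, vm=1.0, ks=0.5, g=0.4, gamma=0.7, mP=0.05, mZ=0.1):
    uptake = vm * N / (ks + N) * P           # Michaelis-Menten nutrient uptake
    grazing = g * P * Z                      # zooplankton grazing on phytoplankton
    dN = -uptake + (1 - gamma) * grazing + mP * P + mZ * Z  # remineralization
    dP = uptake - grazing - mP * P
    dZ = gamma * grazing - mZ * Z
    return N + dt * dN, P + dt * dP, Z + dt * dZ

N, P, Z = 4.0, 0.5, 0.2                      # initial stocks (mmol N / m^3)
for _ in range(600):                         # 60 days at dt = 0.1 day
    N, P, Z = npz_step(N, P, Z, dt=0.1)
print(f"after 60 days: N={N:.2f} P={P:.2f} Z={Z:.2f}")  # N+P+Z is conserved
```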
28
Beaufort Sea WRF Sensitivity Analysis
Funded by MMS
Emphasis includes near-coast oil spills in the Beaufort Sea (north of AK & Yukon Territory)
Many scenarios, many WRF runs
– Reanalysis study
– Storm case studies
– Impact of winds on waves
29
MMS WRF Report: Higher Resolution Not Needed for Effective Near-Shore Wind Fields
30
From Field to Supercomputer: Designing Next-Generation Sea Ice Modules for Very High Resolution Ice-Ocean Models
31
Arctic System Modeling
Tidal and wind forcing of the ice-ocean boundary layer versus simple wind forcing (14 km resolution)
These animations demonstrate the difference between simulations that are purely wind driven (right) and those that are both tidally and wind driven (left). Both animations span the same period and use the same timestep, which is roughly 50 minutes.
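A minimal sketch of what "tidal plus wind" forcing means at that timestep; the M2 constituent choice and all amplitudes are illustrative assumptions:

```python
# Minimal sketch of combined wind + tidal forcing at a ~50-minute timestep.
# The M2 constituent and all amplitudes are illustrative assumptions.
import math

M2_PERIOD_H = 12.42            # principal lunar semidiurnal tide period (hours)
DT_H = 50 / 60                 # ~50-minute timestep, as in the animations

def forcing(t_hours, wind=0.15, tidal_amp=0.10):
    """Boundary-layer forcing velocity (m/s): steady wind + M2 tide."""
    tide = tidal_amp * math.sin(2 * math.pi * t_hours / M2_PERIOD_H)
    return wind + tide, wind   # (tidal + wind, wind-only)

for step in range(8):
    both, wind_only = forcing(step * DT_H)
    print(f"t={step * DT_H:5.2f} h  tidal+wind={both:.3f}  wind-only={wind_only:.3f}")
```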
32
Conclusions
Climate, weather and related-phenomena models (sea ice, land surface, physical oceanography) tend to be community developed and supported
– Scale fairly well
– Ported & maintained for new hardware
– Mostly Fortran + MPI, but very complex codes
Scientists are hungry for higher resolution, longer runs, and additional runs to parameterize phenomena
Compared with mid-latitudes, the Arctic requires several model adjustments for optimal verisimilitude
33
Huge THANKS
Don Morton (ARSC & U. Montana): Weather
Abdullah Kayi (GWU): HyperTransport
Martin Stuefer (UAF), Georg Grell (NOAA) & Saulo Freitas (CPTEC/INPE): Wildfire smoke
Kate Hedstrom (ARSC): ROMS
Georgina Gibson (ARSC): NPZ
Wieslaw Maslowski (NPS): POP, RACM
Andrew Roberts (ARSC), Jennifer Hutchings (UAF): Tidal ice
Juanxiong He (ARSC): Coupling
Jing Zhang, Jeremy Krieger (UAF), Don Morton: MMS WRF sensitivity
– Many other partners & collaborators