Interstate Data Moving and the Last Block Problem: Lessons Learned in the CAPS Spring Experiment 2014
Keith A. Brewster, Ph.D.
Center for Analysis and Prediction of Storms, University of Oklahoma

SPC/NSSL Spring Program in the Hazardous Weather Testbed
- Testing and calibration of new forecasting methods in a simulated operational setting
- 6 weeks in spring season
- Collaboration among NOAA research units, NOAA operational units, universities, and the private sector
- Testbed located between the NOAA Storm Prediction Center and the Norman National Weather Service Forecast Office

CAPS Spring Experiment
- Part of the NOAA/SPC Spring Experiment at the Hazardous Weather Testbed
- Run a large ensemble of convection-allowing NWP forecasts for 6 weeks in spring
- New methods for severe weather prediction in the 1-2 day time frame
- 25 NWP models run at XSEDE centers (Darter at NICS, Oak Ridge)

Goal: “Real-time” 4D Data Visualization
Procedure since 2007:
- Run models at PSC or NICS
- Bring 2D files and images back
- 2D fields and levels pre-selected
Issues:
- Does not allow full 3D visualization
- May want to examine other fields and levels
- Need to move 3D data from NICS at UTenn to CAPS at OU

Scoping the Task
CONUS domain at 4-km resolution: 1163 x 723 x 53
- Output for one time: 4.2 GB
- Domain decomposition onto 384 (6 x 64) processors results in 384 split files per output time, each file ~11 MB
- For smooth animations, 10-minute output is generated for 5 members covering the afternoon and evening forecast hours
Complete forecast (60-h forecast, hourly output + 10-minute output 18 h-30 h): 121 output times, 508 GB
Day-1 afternoon and evening for animation (forecast 18 h-30 h with 10-minute output): 73 output times, 307 GB
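The data volumes above follow directly from the per-time size; a quick back-of-envelope check of the slide's numbers:

```shell
# Sanity-check the data-volume arithmetic from the scoping slide.
awk 'BEGIN {
  per_time_gb = 4.2                      # one full-domain output time
  split_files = 384                      # files per output time
  printf "per split file: %.0f MB\n", per_time_gb * 1024 / split_files
  printf "complete forecast (121 times): %.0f GB\n", 121 * per_time_gb
  printf "18h-30h animation (73 times):  %.0f GB\n",  73 * per_time_gb
}'
```

The 121 output times are the 61 hourly outputs (0-60 h) plus the 60 extra 10-minute outputs between 18 h and 30 h that do not fall on the hour.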

Plan “A” Workflow
NICS Darter: WRF model writes wrfout split files
  → GridFTP or bbcp → CAPS server: joinWRF joins and subsets the daily region of interest into wrfout joined files
  → WDSS-II converter → WDSS-II files; vdfcreate → VAPoR vdf files
  → sftp or scp → NSSL or SPC machine in HWT: WDSS-II and VAPoR
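The transfer leg of this workflow can be sketched as a shell dry run; the hostnames, paths, and file naming below are placeholders, not the actual CAPS configuration:

```shell
#!/bin/sh
# Dry-run sketch: pull one output time's 384 split files with bbcp,
# then join them on the CAPS server. Hostnames/paths are hypothetical.
SRC="user@darter.nics.utk.edu:/lustre/scratch/capsrun/wrfout_d01_2014-05-20_18:00:00_*"
DEST="/data/spring2014/splits/"
# -s 8: eight parallel TCP streams, which helps on high-latency WAN paths
echo "bbcp -s 8 \"$SRC\" \"$DEST\""
# join/subset step; the actual joinWRF arguments are not shown in the slides
echo "joinWRF <namelist>"
```

Removing the leading `echo`s turns the dry run into a real transfer.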

Data Route
NICS/UTenn in Oak Ridge, Tennessee, to CAPS/OU in Norman, Oklahoma
845 miles, 13 hours via “MichelinNet” (map from mapquest.com)

Internet Route

Internet2

Internet Route: “Last Mile”
- OneNet Tulsa to OU-Norman (4PP)
- 4PP to National Weather Center (across the parking lot)
- National Weather Center to CAPS switch to file server system
- To CAPS office workstation via firewall

Recent Networking Initiatives
1. University of Tennessee BLAST
2. OneOklahoma Friction Free Network (OFFN)
3. National Weather Center upgrade

University of Tennessee BLAST:
- 100 Gbps upgrade of research network
- Includes connections to HPC at NICS

Recent Networking Initiatives (continued)
OneOklahoma Friction Free Network (OFFN):
- NSF Campus Cyberinfrastructure - Network Infrastructure and Engineering Program (CC-NIE)
- Establish 10 Gbps network ring: OU-OSU-Langston-Tandy Supercomputing Center

Traffic & Last Block Problem
Testing revealed a “last block” problem, actually within the building itself, mostly due to a slow firewall.
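The slides do not say which tool was used, but this kind of bottleneck hunting is typically done by measuring TCP throughput one network segment at a time with a tester such as iperf3 (hostnames below are hypothetical):

```shell
#!/bin/sh
# Dry-run sketch: test each segment of the path separately to isolate
# the slow link. An "iperf3 -s" server must be running on each target.
for host in fileserver.caps.ou.edu gateway.nwc.ou.edu border.onenet.net; do
  echo "iperf3 -c $host -P 4 -t 30   # 4 parallel streams, 30 s"
done
```

A segment that measures far below the others (here, the hop through the firewall) is the one setting the end-to-end rate.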

Packing and Compression?
Try creating a compressed tar file before sending?
- Sending large files is faster: ~100 MB/s vs ~10 MB/s
- BUT! Creating a compressed tar file takes time

Test for 1 hour of full-domain split files:

With compression:
  tar -zcvf   22 min
  bbcp        2.5 min
  Total       24.5 min

Without compression:
  tar -cvf    15 min
  bbcp        5 min
  Total       20 min

Sending individual split files:
  bbcp        50 min
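Summing the steps from the timing test makes the tradeoff explicit (numbers taken from the table above):

```shell
# End-to-end time for each option, in minutes, from the timing test.
awk 'BEGIN {
  with_comp    = 22 + 2.5   # tar -zcvf, then bbcp one compressed file
  without_comp = 15 + 5     # tar -cvf (no gzip), then bbcp
  individual   = 50         # bbcp each of the 384 split files directly
  printf "tar + gzip + bbcp: %.1f min\n", with_comp
  printf "tar + bbcp:        %.1f min\n", without_comp
  printf "bbcp split files:  %.1f min\n", individual
}'
```

Uncompressed tar wins: it avoids both the gzip CPU cost and the per-file transfer overhead of sending hundreds of small files.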

Plan “D” Workflow
NICS Darter: WRF model writes wrfout split files; joinWRF joins and subsets the daily region of interest into wrfout joined files
  → bbcp → CAPS Stratus
  → sftp → CAPS laptop: WDSS-II converter → WDSS-II files; vdfcreate → VAPoR vdf files; WDSS-II and VAPoR
  → sftp (wrfout joined files) → SPC workstation
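The final sftp hops in this workflow can be scripted non-interactively with a batch file; hostnames and paths here are placeholders:

```shell
#!/bin/sh
# Sketch: pull the day's joined subdomain file with sftp in batch mode.
# Server name and remote path are hypothetical.
cat > pull.batch <<'EOF'
get /data/spring2014/joined/wrfout_sub_2014-05-20.nc /localdata/
EOF
echo "sftp -b pull.batch user@stratus.caps.ou.edu"
```

With key-based authentication in place, such a pull can run unattended on a schedule during the experiment.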

Re-Scoping the Task
- Selected subdomain of the day: 203 x 203 x 53
- Output for one time: MB; single file for each time = MB (vs 4.2 GB full domain)
- Day-1 afternoon and evening for animation, forecast 18 h-30 h with 10-minute output: 73 output times, 8.2 GB vs 307 GB
- Processing and transfer: ~20 min
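The horizontal footprint alone accounts for most of the savings; a rough check using the grid dimensions above (the remaining factor presumably comes from subsetting fields during the join):

```shell
# Compare the horizontal shrink factor with the total data shrink factor.
awk 'BEGIN {
  full = 1163 * 723        # full CONUS horizontal grid points
  sub  = 203 * 203         # subdomain-of-the-day grid points
  printf "horizontal shrink factor: %.1fx\n", full / sub
  printf "total data shrink factor: %.1fx\n", 307 / 8.2
}'
```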

Fruits of Labor WDSS-II

Fruits of Labor VAPoR

Lessons Learned
- Involve networking pros early
- Your mileage (throughput) may vary
- Be flexible with workflows
- Evaluate overhead of all steps
- Find ways to fund equipment upgrades where needed: the slowest link sets your rate
- Software-programmable networks / Science DMZs may be needed for the largest jobs

Future Plans
National Weather Center (September 2014):
- Upgrading network to 10 Gbps switches
- Software-programmable networking enabled
- Replacing slow firewall
CAPS (September-October 2014):
- Upgrading to two 10 Gbps gateway servers with Virtual Router Redundancy Protocol (VRRP)
- Software-programmable networking enabled
- Upgrading servers to 10 Gbps network interfaces
For 2015:
- Set jumbo-size Maximum Transmission Unit (MTU) for the route
- Explore use of Science DMZ within OneNet
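Whether jumbo frames actually survive the whole route can be verified end to end with a do-not-fragment ping; the interface and host names below are placeholders:

```shell
#!/bin/sh
# For a 9000-byte MTU, the largest ICMP echo payload that fits is
# 9000 - 20 (IP header) - 8 (ICMP header) = 8972 bytes.
MTU=9000
PAYLOAD=$((MTU - 28))
echo "ip link set dev eth0 mtu $MTU              # on each host, as root"
echo "ping -M do -s $PAYLOAD remote.example.edu  # -M do: don't fragment"
```

If the large ping fails while a default-size ping succeeds, some hop on the path is still running a standard 1500-byte MTU.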

Questions?
Contact: Keith Brewster, CAPS/University of Oklahoma
Thanks to: Chris Cook (CAPS); Kevin W. Thomas (CAPS); Victor Hazlewood and his networking team (UTenn); Matt Runion & James Deaton (OneNet); Henry Neeman & the OSCER team; Mike Coniglio (NOAA/NSSL)