uMeteo-K (2003)
Jai-Ho Oh, Pukyong National University, Busan, Korea
APAN for Meteorological Studies, August 27, 2003
Main Goals
Establishment of uMeteo-K, a ubiquitous Korean meteorological research cooperation system
Examples of Weather Information
Water resources, environment, wildfire, meteorological disaster, industrial areas, health…
uMeteo-K GRID Testbed
[Diagram] KMA Grid: super ensemble (seasonal/climate); risk management (national response); applied meteorology (agro, energy, fishery); model development (next generation / K-model); impact assessment (environmental impact/feedback); meteorological industry (met info/instruments)
Linked grids: APCN-Grid, University Grid, Private Grid, Project Grid, NGIS Grid, Institute Grid, Inter-Office Grid
CAgM Grid: core AgMet stations (Koflux/RS reference); network hub for RCC; public service (user requirements/detailed); global environment (flux, aerosol, GHG)
About uMeteo-K
The concept of a virtual laboratory for interdisciplinary meteorological research:
- Parallelized numerical weather prediction modeling (Computational Grid)
- Cooperative meteorological research system (Access Grid)
- Virtual server for large meteorological data (Data Grid)
Grid technology is essential to accomplishing these goals.
Climate Prediction
[Figure] Schematic of the climate system: changes in solar input; changes in the atmosphere (composition: H2O, N2, O2, CO2, O3, etc.; chemical reaction rates; circulation); changes in the hydrological cycle (precipitation and evaporation, terrestrial radiation); air-ice coupling (wind stress, heat exchange, sea ice, snow); air-biomass and land-biomass coupling (e.g. nitrogen, carbon); ecosystem changes, smoke, changes in farming practice; changes in the land surface (continental distribution, vegetation, land use, ecosystems); ocean mixed-layer, deep-ocean and shelf processes; aerosols and CO2; rivers and lakes; human influences.
[Figure] Climate information available by 2003 vs. climate information required by 2010 (horizontal resolution scale in km: 9 km, 3 km, …).
Earth Simulator
Massively parallel supercomputer based on the NEC SX-5 architecture:
- 640 computational nodes
- 8 vector processors in each node
- Peak performance per CPU: 8 GFLOPS
- Total peak performance: 8 GFLOPS x 8 CPUs x 640 nodes = 40 TFLOPS
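The total-performance figure above is simple arithmetic; a quick sketch (values taken from the slide) confirms it:

```python
# Earth Simulator peak performance, using the figures quoted above.
GFLOPS_PER_CPU = 8   # peak performance of one vector processor
CPUS_PER_NODE = 8    # vector processors per node
NODES = 640          # computational nodes

total_gflops = GFLOPS_PER_CPU * CPUS_PER_NODE * NODES
total_tflops = total_gflops / 1000.0
print(f"{total_tflops:.2f} TFLOPS")  # 40.96 TFLOPS, quoted as ~40 TFLOPS
```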
Development of a High-Resolution Atmospheric Global Model on the Earth Simulator for climate study: 10 km or less in the horizontal, 100 levels in the vertical; Nonhydrostatic ICosahedral Atmospheric Model (NICAM).
Integration of Human and Computational Resources
[Map] Sites across Korea: Seoul, Incheon, Suwon, Chuncheon, Chonan, Chongju, Daejeon, Jeonju, Kwangju, Daegu, Pohang, Ulsan, Changwon, Pusan, Cheju
Resources: computers, supercomputers, experimental facilities, high-speed networks, databases, mass storage, visualization environment, Access Grid system, brain pool
AG in uMeteo-K
Setup of the uMeteo-K AG on a PIG + Room Node basis (ICSYM/PKNU, CES/SNU, NCAM/KMA):
- Linkage within uMeteo-K over the KOREN network
- Establishment of a duplex video conference system with PIG & Polycom
- Establishment of a computing environment among uMeteo-K's PIGs (AG Toolkit version 1.2)
- Establishment of a PIG-based independent Room Node system (NCAM/KMA)
uMeteo-K AG configuration
[Diagram] Nodes: KAIST, CNU, KMA, KJIST, PKNU (Pukyong National University), KISTI, SNU; AG multicast and unicast via a Quick Bridge; linked to ANL.
Samples of uMeteo-K AG operation
< Korea AG-Group Quick Bridge server test - participants: PKNU, SNU, KISTI, KJIST, CNU, KAIST, KMA, July 8, 2003 >
< uMeteo-K monthly meeting using VRVS: PKNU (Busan) - SNU (Seoul) - KMA (Seoul) - USA (Washington, D.C.), June 3, 2003 >
uMeteo-K CG Testbed
(Two clusters utilized; each cluster has 4 nodes)
- CPU: Pentium ( GHz)
- RAM: 1 GB SDRAM
- HDD: EIDE 40 GB
- VGA: none
- Network: internal 10/100 Fast Ethernet; external KOREN
uMeteo-K CG Testbed Configuration
[Diagram] Two 4-node (single CPU) clusters, NAS storage server, 10/100 switch hub, monitoring system, UPS, electrometer; connected to KOREN via 10/100 Ethernet.
uMeteo-K CG Testbed S/W
- Linux: Paran 7.0 (kernel version )
- Globus 2.4
- PG Fortran 3.2 (Portland Group)
- MPICH-G for parallel job running (MPICH-G2 with PG Fortran)
- NCAR Graphics for graphic display
- NIS, NFS
Globus linkage between testbed clusters
An independent simple CA is installed at each master node (CA-A on Master A, CA-B on Master B). A group of slave nodes is controlled by each master node's PBS scheduler.
CA information of each cluster
subject  : /O=uMeteoK/OU=PKNU/OU=pknu.ac.kr/CN=pknuCA2215/CN=proxy
issuer   : /O=uMeteoK/OU=PKNU/OU=pknu.ac.kr/CN=pknuCA2215
identity : /O=uMeteoK/OU=PKNU/OU=pknu.ac.kr/CN=pknuCA2215
type     : full legacy globus proxy
strength : 512 bits
path     : /tmp/x509up_u535
timeleft : 10:53:37

subject  : /O=uMeteoK/OU=pknu.ac.kr/CN=pknuGB1/CN=proxy
issuer   : /O=uMeteoK/OU=pknu.ac.kr/CN=pknuGB1
identity : /O=uMeteoK/OU=pknu.ac.kr/CN=pknuGB1
type     : full legacy globus proxy
strength : 512 bits
path     : /tmp/x509up_u533
timeleft : 10:01:23

- CA-A: pknuGB01.pknu.ac.kr
- CA-B: pknuGB05.pknu.ac.kr
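Before submitting a job one typically checks that the proxy certificate still has lifetime left. A minimal sketch (the helper functions are hypothetical, not part of the slides) that parses `grid-proxy-info`-style output such as the listings above:

```python
def parse_proxy_info(text):
    """Parse `grid-proxy-info`-style 'key : value' lines into a dict."""
    info = {}
    for line in text.splitlines():
        key, sep, value = line.partition(":")
        if sep:
            info[key.strip()] = value.strip()
    return info

def timeleft_seconds(info):
    """Convert the hh:mm:ss 'timeleft' field to seconds."""
    h, m, s = (int(x) for x in info["timeleft"].split(":"))
    return h * 3600 + m * 60 + s

sample = """subject  : /O=uMeteoK/OU=PKNU/OU=pknu.ac.kr/CN=pknuCA2215/CN=proxy
type     : full legacy globus proxy
strength : 512 bits
timeleft : 10:53:37"""

info = parse_proxy_info(sample)
print(timeleft_seconds(info))  # 39217 seconds of proxy lifetime left
```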
Monitoring system on the CG testbed
[Screenshots] Before integration vs. after integration
Globus script file for parallel MM5 run (mm5.rsl)
+
( &(resourceManagerContact="pknuGB01")
   (count=4)
   (label="subjob 0")
   (environment=(GLOBUS_DUROC_SUBJOB_INDEX 0)
                (LD_LIBRARY_PATH /usr/local/globus/lib/))
   (directory="/spring/KISTI/MM5/Run")
   (executable="/spring/KISTI/MM5/Run/mm5.mpp") )
( &(resourceManagerContact="pknuGB05")
   (count=4)
   (label="subjob 4")
   (environment=(GLOBUS_DUROC_SUBJOB_INDEX 1)
                (LD_LIBRARY_PATH /usr/local/globus/lib/))
   (directory="/summer/KISTI/MM5/Run")
   (executable="/summer/KISTI/MM5/Run/mm5.mpp") )
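The multirequest above can also be generated programmatically, which keeps the subjobs consistent when a cluster or CPU count changes. A sketch (the `make_subjob` helper is hypothetical; hosts, paths and counts are taken from the slide):

```python
def make_subjob(index, label, host, count, rundir):
    """Emit one DUROC subjob entry of a Globus RSL multirequest."""
    return (
        f'( &(resourceManagerContact="{host}")\n'
        f'   (count={count})\n'
        f'   (label="subjob {label}")\n'
        f'   (environment=(GLOBUS_DUROC_SUBJOB_INDEX {index})\n'
        f'                (LD_LIBRARY_PATH /usr/local/globus/lib/))\n'
        f'   (directory="{rundir}")\n'
        f'   (executable="{rundir}/mm5.mpp") )'
    )

subjobs = [
    make_subjob(0, 0, "pknuGB01", 4, "/spring/KISTI/MM5/Run"),
    make_subjob(1, 4, "pknuGB05", 4, "/summer/KISTI/MM5/Run"),
]
rsl = "+\n" + "\n".join(subjobs)  # '+' marks an RSL multirequest
print(rsl)
```

The resulting text would then be saved as `mm5.rsl` and submitted with the Globus job client (e.g. `globusrun -f mm5.rsl`).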
Parallel MM5 Benchmarks with GLOBUS
Average job waiting time (including CA): 25 sec

Required time for a 3600 sec (1 hour) model integration:
- MPICH: 35 sec
- MPICH-G2: 42 sec

Required time for an 86400 sec (1 day) model integration:
- MPICH: 10 min 38 sec
- MPICH-G2: 12 min 47 sec
- Single CPU: 67 min 12 sec
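The benchmarks above suggest a fairly stable Grid overhead across run lengths; a quick sketch (timings copied from the table) makes it explicit:

```python
# Relative overhead of MPICH-G2 vs. plain MPICH, from the benchmark table.
timings = {
    "1 hour": {"MPICH": 35, "MPICH-G2": 42},                      # seconds
    "1 day":  {"MPICH": 10 * 60 + 38, "MPICH-G2": 12 * 60 + 47},  # seconds
}
for run, t in timings.items():
    overhead = 100.0 * (t["MPICH-G2"] - t["MPICH"]) / t["MPICH"]
    print(f"{run}: +{overhead:.0f}% with MPICH-G2")  # ~20% in both cases
```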
uMeteo-K Data Grid Configuration
[Diagram] Participating centers: JMA, SNU, the KISTI supercomputer, PKNU, NASA, KMA, COLA, NCEP; data input, model output and forecast output flow through the uMeteo-K Data Grid.
Thank you for your attention!