1
Computing at Norwegian Meteorological Institute
Roar Skålin, Director of Information Technology
Norwegian Meteorological Institute
roar.skalin@met.no
CAS 2003 – Annecy, 10.09.2003
2
Norwegian Meteorological Institute met.no
3
Norwegian Meteorological Institute
– Main office in Oslo
– Regional offices in Bergen and Tromsø
– Aviation offices at four military airports and at Spitsbergen
– Three Arctic stations: Jan Mayen, Bear Island and Hopen
– 430 employees (1 January 2004)
4
met.no Computing Infrastructure (diagram; x/y/z = processors / GB memory / TB disk)
– met.no – Oslo: production servers (Dell 4/8 Linux, Dell 2/4 Linux), NetApp 790 GB, Scali Cluster 20/5/0.3, XX Cluster y/y/y, Storage Server SGI O2000/2, STA 5 TB, DLT 20 TB, Backup Server SGI O200/2, S-AIT 33 TB, DLT 20 TB; local network with switch and router, 1 GBit/s and 100 MBit/s links
– NTNU – Trondheim: SGI O3800 512/512/7.2 and SGI O3800 384/304/7.2 with shared CXFS filesystem, Climate Storage 2/8/20
– Wide-area links between Oslo and Trondheim: 2.5 GBit/s and 155 MBit/s
5
met.no Local Production Servers (production environment, November 2003):
– Dell PowerEdge servers with two and four CPUs
– NetApp NAS
– Linux
– ECMWF Supervisor Monitor Scheduler (SMS)
– Perl, shell, Fortran, C++, XML, MySQL, PostgreSQL
– Cfengine
6
Linux replaces proprietary UNIX at met.no
Advantages:
– Off-the-shelf hardware replaces proprietary hardware
  – Reduced cost of new servers
  – Reduced operational costs
– Overall increased stability
– Easier to fix OS problems
– Changing hardware vendor becomes feasible
– met.no becomes an attractive IT employer with highly motivated employees
Disadvantages:
– Cost of porting software
– High degree of freedom: a Linux distribution is as many systems as there are users
7
Data storage: a critical resource
– We may lose N-1 production servers and still be up and running, but data must be available everywhere, all the time
– We used to duplicate data files, but the increased use of databases reduces the value of this strategy
– met.no replaced a SAN with a NetApp NAS because of:
  – availability
  – Linux support
  – "sufficient" I/O bandwidth (40-50 MB/s per server)
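To give a sense of scale for that per-server bandwidth, here is a minimal back-of-the-envelope sketch, assuming the 40-50 MB/s figure above and the 8 GB/24 h HIRLAM 20 result volume quoted later in this talk:

```python
# Rough check of NAS throughput per server.
# Assumes the 40-50 MB/s per-server figure from this slide and the
# 8 GB/24 h HIRLAM 20 result volume from the model overview slide.

def transfer_minutes(volume_gb: float, bandwidth_mb_per_s: float) -> float:
    """Time in minutes to move volume_gb at bandwidth_mb_per_s."""
    return volume_gb * 1024 / bandwidth_mb_per_s / 60

for bw in (40, 50):
    print(f"8 GB at {bw} MB/s: {transfer_minutes(8, bw):.1f} minutes")
# -> roughly 2.7-3.4 minutes, small compared with a multi-hour model run
```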
8
HPC in Norway – A national collaboration
9
Performance available to met.no (chart of compute performance over time: CRAY X-MP, CRAY Y-MP, CRAY T3E, SGI O3000)
10
met.no Production Compute Servers
SGI Origin 3800 systems:
– Embla: 512 MIPS R14K processors, 614 Gflops peak, 512 GB memory
– Gridur: 384 MIPS R14K processors, 384 Gflops peak, 304 GB memory
– IRIX OS / LSF batch system
– 7.2 TB CXFS filesystem
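The peak figures follow from processors × flops per cycle × clock rate. A minimal check, assuming the R14K retires one multiply-add (2 flops) per cycle and clock rates of 600 MHz for Embla and 500 MHz for Gridur (the clock rates are assumptions, not stated on the slide):

```python
# Peak Gflops = processors * flops/cycle * clock (GHz).
# Assumes 2 flops/cycle (one multiply-add) per R14K and clock rates of
# 600 MHz (Embla) and 500 MHz (Gridur), which are not given on the slide.

def peak_gflops(cpus: int, clock_ghz: float, flops_per_cycle: int = 2) -> float:
    return cpus * flops_per_cycle * clock_ghz

print(peak_gflops(512, 0.6))  # 614.4 -> matches Embla's 614 Gflops peak
print(peak_gflops(384, 0.5))  # 384.0 -> matches Gridur's 384 Gflops peak
```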
11
Atmospheric models
– HIRLAM 20 (operational): 20 km resolution, 40 layers, 360 s time step; 468×378 grid points; 60 h forecast; 8 GB result data per 24 h
– HIRLAM 10 (operational): 10 km, 40 layers, 240 s; 248×341 grid points; 60 h forecast; 1.2 GB per 24 h
– HIRLAM 5 (operational): 5 km, 40 layers, 120 s; 152×150 grid points; 60 h forecast; 0.3 GB per 24 h
– UM (experimental): 3 km, 38 layers, 75 s; 280×276 grid points; 48 h forecast; 2.1 GB per 24 h
– MM5 (air pollution): 3 km, 17 layers, 9 s and 1 km, 17 layers, 3 s nests; 61×76 grid points; 48 h forecast; 0.1 GB per 24 h
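To compare the computational burden of these configurations, the short sketch below converts each model's time step and forecast length into the number of time steps per run (the per-model forecast lengths follow the list above):

```python
# Number of time steps each model needs for its forecast length, using the
# time steps and forecast lengths listed above.

configs = {
    "HIRLAM 20": {"dt_s": 360, "forecast_h": 60},
    "HIRLAM 10": {"dt_s": 240, "forecast_h": 60},
    "HIRLAM 5":  {"dt_s": 120, "forecast_h": 60},
    "UM":        {"dt_s": 75,  "forecast_h": 48},
}

for name, c in configs.items():
    steps = c["forecast_h"] * 3600 // c["dt_s"]
    print(f"{name}: {steps} time steps")
# HIRLAM 20: 600, HIRLAM 10: 900, HIRLAM 5: 1800, UM: 2304
```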
12
Oceanographic models
– MIPOM22 (operational): 4 km resolution, 17 layers, 150 s time step; 1022×578 grid points
– ECOM3D (operational): 20+20 km, 21/5 layers, 600 s; 208×120 grid points
– WAVE (operational): 45/8 km; 142×113 and 121×163 grid points
– ECOM3D (experimental): 4 km/300 m, 17 layers, 360/50 s; 390×250 grid points
– Forecast time: 60 h; result data per 24 h: 1 GB
13
Production timeline (chart of the daily schedule for HIRLAM20, HIRLAM10, HIRLAM5, UM, MM5, ECOM3D/WAVE, ECOM3D and MIPOM22 runs)
14
HIRLAM scales, or …?
The forecast model without I/O and support programs scales reasonably well up to 512 processors on an SGI O3800.
In real life:
– data transfer, support programs and I/O have very limited scaling
– there are other users of the system
– machine-dependent modifications to increase scaling have a high maintenance cost for a shared code such as HIRLAM
For cost-efficient operational use, 256 processors seems to be the limit.
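A simple Amdahl-style estimate illustrates why the practical limit sits well below 512 processors; the 5% serial fraction used here is an illustrative assumption, not a measured HIRLAM value:

```python
# Amdahl's law: speedup(p) = 1 / (serial + (1 - serial) / p).
# The 5% serial fraction stands in for I/O, data transfer and support
# programs; it is an illustrative assumption, not a measured HIRLAM number.

def speedup(p: int, serial_fraction: float = 0.05) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / p)

for p in (128, 256, 512):
    print(f"{p} CPUs: speedup {speedup(p):.1f}, efficiency {speedup(p) / p:.0%}")
# Doubling from 256 to 512 CPUs raises the speedup only from ~18.6 to ~19.3,
# which is why 256 processors is quoted as the cost-efficient limit.
```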
15
How to utilise 896 processors operationally?
– Split into two systems of 512 and 384 processors, used as primary and backup system
– Will test a setup that overlaps model runs based on dependencies (dependency diagram: HIRLAM 20, HIRLAM 10, ECOM3D, HIRLAM 5, WAVE)
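A minimal sketch of dependency-driven overlap, using Python's graphlib as a stand-in for the real scheduler; the dependency edges below are an assumed reading of the diagram on this slide, and the operational suite itself is driven by ECMWF SMS:

```python
# Toy topological runner for overlapping model runs driven by dependencies.
# The edges are an assumed reading of the slide's diagram, not the actual
# operational configuration.
from graphlib import TopologicalSorter

deps = {
    "HIRLAM 10": {"HIRLAM 20"},   # boundaries from the coarser run (assumed)
    "HIRLAM 5":  {"HIRLAM 10"},
    "ECOM3D":    {"HIRLAM 10"},   # forced by HIRLAM 10 fields (assumed)
    "WAVE":      {"HIRLAM 10"},
}

ts = TopologicalSorter(deps)
ts.prepare()
while ts.is_active():
    ready = list(ts.get_ready())
    print("can run in parallel:", ready)  # HIRLAM 5, ECOM3D and WAVE overlap
    ts.done(*ready)
```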
16
Overlapping Production Timeline (chart showing HIRLAM20, HIRLAM10, HIRLAM5, UM, MM5, ECOM3D/WAVE, ECOM3D and MIPOM22 runs overlapped in time)
17
RegClim: Regional Climate Development Under Global Warming
Overall aim:
– Produce scenarios for regional climate change suitable for impact assessment
– Quantify uncertainties
Some keywords:
– Involves personnel from met.no, universities and research organisations
– Based on global climate scenarios
– Dynamical and empirical downscaling
– Regional and global coupled models
– Atmosphere – ocean – sea-ice
18
Climate Computing Infrastructure (diagram; x/y/z = processors / GB memory / TB disk)
– NTNU – Trondheim: SGI O3000 512/512/7.2 and SGI O3000 384/304/7.2 with shared CXFS filesystem, Climate Storage 2/8/20, S-AIT 33 TB
– Para//ab – Bergen: IBM Cluster 64/64/0.58, IBM p690 Regatta 96/320/7, IBM 3584 12 TB
– Sites connected by router over 2.5 GBit/s and 155 MBit/s links
19
Climate Storage Server – a low-cost solution:
– Linux server
– Brocade switch
– Nexsan AtaBoy/AtaBeast RAID, 19.7 TB
– 34 TB Super-AIT library, tape capacity 0.5 TB uncompressed
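A quick capacity check on the tape library, using only the numbers on this slide (34 TB total, 0.5 TB per uncompressed cartridge, 19.7 TB of RAID):

```python
# Capacity sanity check for the Super-AIT library, using the slide's figures.
library_tb = 34.0
cartridge_tb = 0.5
raid_tb = 19.7

print(f"cartridges in the library: {library_tb / cartridge_tb:.0f}")   # 68 cartridges
print(f"disk cache vs tape pool: {raid_tb / library_tb:.0%}")          # RAID holds ~58% of tape capacity
```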
20
GRID in Norway
– Test grid comprising experimental computers at the four universities
– Globus 2.4 -> 3.0
– Two experimental portals: Bioinformatics and Gaussian
– Plan to test large datasets (Storage Resource Broker) and metascheduling in autumn 2003
– Plan to phase in production supercomputers in 2004