Cross Council ICT Conference, May 2004
High Performance Computing
Ron Perrott, Chairman, High End Computing Strategy Committee, Queen's University Belfast


Slide 1: High Performance Computing
Ron Perrott, Chairman, High End Computing Strategy Committee, Queen's University Belfast {r.perrott@qub.ac.uk}

Slide 2: What is a high performance computer?
A hardware and software system that provides close to the maximum performance that can currently be achieved: parallelism, state-of-the-art technology, pushing the limits.
Why do we need them? Computational fluid dynamics, protein folding, climate modelling, national security (in particular cryptanalysis and simulation), etc. These machines matter for the economy, security, health and well-being of the country: scientific discovery, social impact, commercial potential.

Slide 3: HPC in the UK
- Important to research in many scientific disciplines
- Increasing breadth of science involved
- Strong UK participation in international HPC activities
- Contributions to, and benefits for, UK industry

Slide 4: UK projects
- Atomic, Molecular & Optical Physics
- Computational Biology
- Computational Radiation Biology and Therapy
- Computational Chemistry
- Computational Engineering - Fluid Dynamics
- Environmental Modelling
- Cosmology
- Particle Physics
- Fusion & Plasma Microturbulence
- Accelerator Modelling
- Nanoscience
- Disaster Simulation
=> Computation has become as important as theory and experiment in the conduct of research.

Slide 5: Whole systems
- Electronic Structure - from atoms to matter
- Computational Biology - from molecules to cells and beyond
- Fluid Dynamics - from eddies to aircraft
- Environmental Modelling - from oceans to the earth
- From the earth to the solar system? ... and on to the Universe

Slide 6: Technology trends - microprocessor capacity
Moore's Law (Gordon Moore, co-founder of Intel, 1965): the number of devices per chip doubles roughly every 18 months - 2x transistors per chip every 1.5 years. Microprocessors have become smaller, denser, and more powerful. The trend is not just in processors but in bandwidth, storage, etc.: roughly 2x memory capacity and processor speed, and half the size, cost, and power, every 18 months.
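The compounding effect of that doubling rate can be made concrete with a line of arithmetic. A minimal sketch (the `growth_factor` helper is hypothetical, not from the slides):

```python
# A rough consequence of Moore's Law: doubling every 18 months compounds
# to roughly two orders of magnitude per decade.
def growth_factor(years, doubling_months=18):
    """Capacity multiple after `years`, doubling every `doubling_months`."""
    return 2 ** (years * 12 / doubling_months)

print(round(growth_factor(1.5)))   # one doubling period -> 2
print(round(growth_factor(10)))    # about 100x over a decade
```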

Slide 7: The TOP500 (J. Dongarra)
- Listing of the 500 most powerful computers in the world
- Yardstick: LINPACK (solve Ax = b for a dense matrix A)
- Updated twice a year: at SC'xy in the States in November, and at the meeting in Mannheim, Germany in June
- All data available from www.top500.org
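The LINPACK yardstick times the solution of a dense linear system Ax = b. A minimal sketch of such a solve, using Gaussian elimination with partial pivoting on a tiny hypothetical system (the actual benchmark uses highly tuned libraries on systems of order tens of thousands of unknowns or more):

```python
# Sketch of the dense solve that LINPACK times: Gaussian elimination
# with partial pivoting, followed by back substitution.
def solve(A, b):
    """Solve Ax = b for a dense matrix A given as lists of lists."""
    n = len(A)
    A = [row[:] for row in A]   # work on copies
    b = b[:]
    for k in range(n):
        # Partial pivoting: bring the row with the largest pivot to row k.
        p = max(range(k, n), key=lambda i: abs(A[i][k]))
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        # Eliminate column k below the pivot.
        for i in range(k + 1, n):
            m = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= m * A[k][j]
            b[i] -= m * b[k]
    # Back substitution on the upper-triangular system.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x

# Tiny illustrative system (made up for this sketch): 4x + y = 1, x + 3y = 2.
x = solve([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0])
print(x)  # [1/11, 7/11] ~ [0.0909..., 0.6363...]
```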

Slide 8: Growth in peak performance - scalar, super scalar, vector, parallel, and super scalar/vector/parallel machines, tracking Moore's Law:
1941: 1 Flop/s (floating point operations per second)
1945: 100 Flop/s
1949: 1,000 Flop/s (1 KiloFlop/s, KFlop/s, 10^3)
1951: 10,000 Flop/s
1961: 100,000 Flop/s
1964: 1,000,000 Flop/s (1 MegaFlop/s, MFlop/s, 10^6)
1968: 10,000,000 Flop/s
1975: 100,000,000 Flop/s
1987: 1,000,000,000 Flop/s (1 GigaFlop/s, GFlop/s, 10^9)
1992: 10,000,000,000 Flop/s
1993: 100,000,000,000 Flop/s
1997: 1,000,000,000,000 Flop/s (1 TeraFlop/s, TFlop/s, 10^12)
2000: 10,000,000,000,000 Flop/s
2003: 35,000,000,000,000 Flop/s (35 TFlop/s)

Slide 9: TOP500 performance, November 2003 [chart: aggregate, #1, and #500 performance on a log scale from 10^9 to 10^15 Flop/s, with a laptop marked for comparison]

Slide 10: Earth Simulator (from the Top500, November 2003)
Homogeneous, centralized, proprietary, expensive!
Target applications: CFD - weather, climate, earthquakes.
- 640 NEC SX/6 nodes (modified): 5120 CPUs with vector operations, each with 8 GFlop/s peak
- 40 TFlop/s peak
- ~ half a billion pounds for machine, software, and building
- Footprint of four tennis courts
- 7 MW of power: at 10 cents/kWh, that is $16.8K/day = ~$6M/year
- Expected to stay on top of the Top500 until the 60-100 TFlop/s ASCI machine arrives

Slide 11: HPC trends
Over the last 10 years, performance across the range of the Top500 has grown faster than Moore's Law:
1993: #1 = 59.7 GFlop/s; #500 = 422 MFlop/s
2003: #1 = 35.8 TFlop/s; #500 = 403 GFlop/s
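The claim that the list outpaced Moore's Law can be checked from these four numbers by computing the implied doubling time over the decade:

```python
# Implied doubling time of Top500 entries, 1993-2003, versus
# Moore's Law's 18 months.
import math

def doubling_months(start, end, years):
    """Months per performance doubling, given start/end values."""
    return years * 12 / math.log2(end / start)

# #1 system: 59.7 GFlop/s (1993) -> 35.8 TFlop/s (2003), in GFlop/s.
t1 = doubling_months(59.7, 35_800, 10)
# #500 system: 422 MFlop/s (1993) -> 403 GFlop/s (2003), in GFlop/s.
t500 = doubling_months(0.422, 403, 10)
print(round(t1, 1), round(t500, 1))  # ~13.0 and ~12.1 months -
                                     # both faster than Moore's 18
```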

Slide 12: Top 10, November 2003 (Rmax and Rpeak in TFlop/s)
 1. NEC Earth-Simulator; Rmax 35.8; Earth Simulator Center, Yokohama; 2002; 5120 proc; Rpeak 40.90
 2. Hewlett-Packard ASCI Q - AlphaServer SC ES45/1.25 GHz; Rmax 13.9; Los Alamos National Laboratory; 2002; 8192 proc; Rpeak 20.48
 3. Self-made Apple G5 Power PC w/Infiniband 4X; Rmax 10.3; Virginia Tech, Blacksburg, VA; 2003; 2200 proc; Rpeak 17.60
 4. Dell PowerEdge 1750 P4 Xeon 3.6 GHz w/Myrinet; Rmax 9.82; University of Illinois, Urbana/Champaign; 2003; 2500 proc; Rpeak 15.30
 5. Hewlett-Packard rx2600 Itanium2 1 GHz Cluster w/Quadrics; Rmax 8.63; Pacific Northwest National Laboratory, Richland; 2003; 1936 proc; Rpeak 11.62
 6. Linux NetworX Opteron 2 GHz w/Myrinet; Rmax 8.05; Lawrence Livermore National Laboratory; 2003; 2816 proc; Rpeak 11.26
 7. Linux NetworX MCR Linux Cluster Xeon 2.4 GHz w/Quadrics; Rmax 7.63; Lawrence Livermore National Laboratory; 2002; 2304 proc; Rpeak 11.06
 8. IBM ASCI White, SP Power3 375 MHz; Rmax 7.30; Lawrence Livermore National Laboratory; 2000; 8192 proc; Rpeak 12.29
 9. IBM SP Power3 375 MHz 16-way; Rmax 7.30; NERSC/LBNL, Berkeley; 2002; 6656 proc; Rpeak 9.984
10. IBM xSeries Cluster Xeon 2.4 GHz w/Quadrics; Rmax 6.59; Lawrence Livermore National Laboratory; 2003; 1920 proc; Rpeak 9.216
50% of Top500 performance is in the top 9 machines; 131 systems exceed 1 TFlop/s; 210 machines are clusters.

Slide 13: Performance extrapolation [chart: extrapolated performance of the #1 computer and of the entry level to the list, on a scale from TFlop/s (10^12) to PFlop/s (10^15); marked systems: Blue Gene, 130,000 proc, and ASCI P, 12,544 proc]

Slide 14: Taxonomy
Capability computing:
- Special purpose processors and interconnect
- High bandwidth, low latency communication
- Designed for scientific computing
- Relatively few machines will be sold
- High price
Cluster computing:
- Commodity processors and switch
- Processor design point is web servers and home PCs
- Leverages volumes of millions of processors
- Price point appears attractive for scientific computing

Slide 15: UK facilities
Main centres in Manchester, Edinburgh, and Daresbury; smaller centres around the UK.

Slide 16: HPCx
Edinburgh and CCLRC. IBM 1280-processor POWER4. Currently 3.5 TFlop/s, rising to 6.0 TFlop/s in July, and up to 12.0 TFlop/s by October 2006.

Slide 17: CSAR
University of Manchester / Computer Sciences Corporation.
- 256-processor Itanium2 SGI Altix (Newton), to June 2006; peak performance of 5.2 GFlop/s
- 512-processor Origin3800 (Green), to June 2006

Slide 18: HECToR - High End Computing Terascale Resource
Scientific case and business case. Peak performance of 50 to 100 TFlop/s by 2006, doubling to 100 to 200 TFlop/s after 2 years, and doubling again to 200 to 400 TFlop/s 2 years after that. For comparison, Oak Ridge National Laboratory: 100 TFlop/s in 2006, 250 TFlop/s in 2007.

Slide 19: UK-US TeraGrid HPC-Grid experiment
TeraGyroid: lattice-Boltzmann simulations of defect dynamics in amphiphilic liquid crystals.

Slide 20: TeraGyroid - project partners
TeraGrid sites:
- ANL: visualization, networking
- NCSA: compute
- PSC: compute, visualization
- SDSC: compute
RealityGrid partners:
- University College London: compute, visualization, networking
- University of Manchester: compute, visualization, networking
- Edinburgh Parallel Computing Centre: compute
- Tufts University: compute
UK high-end computing services:
- HPCx (University of Edinburgh and CCLRC Daresbury Laboratory): compute, networking, coordination
- CSAR (Manchester and CSC): compute and visualization

Slide 21: TeraGyroid - results
Linking these resources allowed the largest set of lattice-Boltzmann (LB) simulations ever performed, involving lattices of over one billion sites. The project won the SC03 HPC Challenge award for "Most Innovative Data-Intensive Application" and demonstrated extensive use of the US-UK infrastructure.
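The stream-and-collide structure that makes lattice-Boltzmann methods scale so well can be seen in a toy example. This is only a minimal 1D diffusion sketch (a two-velocity D1Q2 model on a hypothetical 64-site lattice), nothing like the 3D amphiphilic-fluid model TeraGyroid actually ran:

```python
# Minimal D1Q2 lattice-Boltzmann sketch: 1D diffusion via BGK collisions.
# Each site carries two populations, one streaming right and one left.
N = 64          # lattice sites (toy size for this sketch)
omega = 1.0     # BGK relaxation parameter
steps = 100

f_r = [0.0] * N          # right-moving population
f_l = [0.0] * N          # left-moving population
f_r[N // 2] = 0.5        # initial density spike at the centre
f_l[N // 2] = 0.5

for _ in range(steps):
    # Collision: relax each population toward the equilibrium rho/2.
    for i in range(N):
        rho = f_r[i] + f_l[i]
        feq = rho / 2.0
        f_r[i] += omega * (feq - f_r[i])
        f_l[i] += omega * (feq - f_l[i])
    # Streaming: shift populations one site (periodic boundaries).
    f_r = [f_r[-1]] + f_r[:-1]
    f_l = f_l[1:] + [f_l[0]]

density = [f_r[i] + f_l[i] for i in range(N)]
print(round(sum(density), 6))  # total mass is conserved: 1.0
```

The collision step is purely local and the streaming step touches only neighbouring sites, which is why LB codes distribute well across thousands of processors.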

