The UK eScience Grid (and other real Grids)
Mark Hayes, NIEeS Summer School 2003
The Grid in the UK
- Pilot projects in particle physics, astronomy, medicine, bioinformatics, environmental sciences...
- Contributing to international Grid software development efforts
- 10 regional “eScience Centres”
Some UK Grid resources
- Daresbury - loki - 64 proc Alpha cluster
- Manchester - green - SGI Origin 3800
- Imperial - saturn - large SMP Sun
- Southampton - iridis - Intel Linux cluster
- Rutherford Appleton Lab - hrothgar - 32 proc Intel Linux cluster
- Cambridge - herschel - 32 proc Intel Linux cluster
- Coming soon: 4x >64 CPU JISC clusters, HPC(X)
Applications on the UK Grid
Ion diffusion through radiation-damaged crystal structures (Mark Calleja, Earth Sciences, Cambridge)
- Monte Carlo simulation: lots of independent runs, small input & output
- More CPU -> higher temperatures, better statistics
- Access to ~100 CPUs on the UK Grid
- Condor-G client tool for farming out jobs
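To make the "farming out jobs" step concrete, here is a minimal Python sketch (not the project's actual scripts) that writes a Condor-G submit description for 100 independent runs and hands it to condor_submit. The executable name and gatekeeper host are placeholders; `universe = globus` with a `globusscheduler` pointing at a remote Globus gatekeeper is how Condor-G jobs of that era were typically described.

```python
# Sketch: farm out N independent Monte Carlo runs via Condor-G.
# Executable name, gatekeeper host and jobmanager are placeholders,
# not the actual resources used in the project.
import subprocess

N_RUNS = 100
GATEKEEPER = "grid-compute.example.ac.uk/jobmanager-pbs"  # assumed gatekeeper

submit_description = f"""
universe        = globus
globusscheduler = {GATEKEEPER}
executable      = diffusion_mc
arguments       = --seed $(Process)
output          = run_$(Process).out
error           = run_$(Process).err
log             = runs.log
queue {N_RUNS}
"""

with open("diffusion.submit", "w") as f:
    f.write(submit_description)

# Hand the whole batch to the local Condor-G client in one go.
subprocess.run(["condor_submit", "diffusion.submit"], check=True)
```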
Applications on the UK Grid
GEODISE - Grid Enabled Optimisation & Design Search for Engineering (Simon Cox, Andy Keane, Hakki Eres, Southampton)
- Genetic algorithm to find the best design for satellite truss beams
- Java plugins to MATLAB for remote job submission to the Grid
- Used CPU at Belfast, Cambridge, RAL, London, Oxford & Southampton
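For readers unfamiliar with the optimisation side, this is a generic genetic-algorithm loop in Python, purely illustrative: GEODISE's actual design variables, optimiser and MATLAB/Java tooling are not shown, and the fitness function here is a stand-in for a simulation that would really be launched on a Grid resource.

```python
# Generic genetic-algorithm loop (illustrative only; not the GEODISE code).
import random

POP_SIZE, N_GENES, N_GENERATIONS = 30, 10, 50

def fitness(design):
    # Placeholder: in a design-search setting this would run a structural
    # simulation (possibly on a remote Grid resource) and score the result.
    return -sum((x - 0.5) ** 2 for x in design)

def crossover(a, b):
    point = random.randint(1, N_GENES - 1)
    return a[:point] + b[point:]

def mutate(design, rate=0.1):
    return [x + random.gauss(0, 0.05) if random.random() < rate else x
            for x in design]

population = [[random.random() for _ in range(N_GENES)] for _ in range(POP_SIZE)]
for generation in range(N_GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP_SIZE // 2]          # keep the fitter half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

print("best design found:", max(population, key=fitness))
```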
Applications on the UK Grid
RealityGrid (Stephen Pickles, Robin Pinning - Manchester)
- Fluid dynamics of complex mixtures, e.g. oil, water and solid particles (mud)
- Used CPU at London, Cambridge
- Remote visualisation using an SGI Onyx in Manchester (from a laptop in Sheffield)
- Computational steering
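Computational steering means changing a running simulation's parameters without stopping and restarting it. RealityGrid has its own steering library; as a much-simplified sketch of the idea only, the toy solver below polls a small parameter file every few timesteps (the file name and parameter names are invented for this example).

```python
# Toy computational-steering loop: the "solver" re-reads steer.json every
# few timesteps, so a user or visualisation front end can change parameters
# while the run is in progress. Names here are invented for the sketch.
import json
import os

params = {"viscosity": 1.0, "timestep": 0.01, "stop": False}

def poll_steering(path="steer.json"):
    if os.path.exists(path):
        with open(path) as f:
            params.update(json.load(f))   # apply any steered changes

state = 0.0
for step in range(1000):
    if step % 10 == 0:
        poll_steering()
    if params["stop"]:
        break
    state += params["viscosity"] * params["timestep"]   # stand-in for real physics

print("finished at step", step, "state =", state)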
Applications on the UK Grid
GENIE - Grid Enabled Integrated Earth system model (Steven Newhouse, Murtaza Gulamali - Imperial)
- Ocean-atmosphere modelling: how does moisture transport from the atmosphere affect ocean circulation?
- ~1000 independent 4000-year runs (3 days real time!) on ~200 CPUs
- Flocked Condor pools at London & Southampton
- Coupled modelling
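"Coupled modelling" here means separate ocean and atmosphere components exchanging fields at regular intervals. The sketch below is a cartoon of that exchange pattern only, with made-up relations; it is not GENIE's numerics or coupling scheme.

```python
# Toy coupled ocean-atmosphere loop: the two components advance separately
# and swap a moisture flux / sea-surface temperature each coupling step.
# All relations and constants are invented for illustration.
def atmosphere_step(sst):
    # more evaporation over a warmer ocean (made-up linear relation)
    return 0.1 * sst

def ocean_step(moisture_flux, sst):
    # freshwater input nudges the circulation, which feeds back on SST
    circulation = 1.0 - 0.05 * moisture_flux
    return sst + 0.01 * (circulation - 1.0)

sst = 15.0   # degrees C, arbitrary starting point
for year in range(4000):
    flux = atmosphere_step(sst)
    sst = ocean_step(flux, sst)

print(f"final SST after 4000 coupled years: {sst:.2f} C")
```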
Two years to get this far...
- July: Regional eScience Centres funded
- October: First meeting of the Grid Engineering Taskforce (biweekly meetings using Access Grid)
- August: ‘Level 1’ Grid operational (simple job submission possible between sites)
- April: ‘Level 2’ Grid + applications (security, monitoring, accounting)
- July: ‘Level 3’ Grid: more users, more robust
The European DataGrid
- Tiered structure: Tier 0 = CERN
- Lots of their own Grid software
- Applications: particle physics, earth observation, bioinformatics
NASA Information Power Grid
- First “production quality” Grid
- Linking NASA & academic supercomputing facilities at 10 sites
- Applications: computational fluid dynamics, meteorological data mining, Grid benchmarking
TeraGrid
- Linking supercomputers through a high-speed network
- 4x 10 Gbps links between SDSC, Caltech, Argonne & NCSA
- Call for proposals out for applications & users
Asia-Pacific Grid
- No central source of funding
- Informal, bottom-up approach
- Lots of experiments on benchmarking & bio applications
What does it take to build a Grid?
- Resources - CPU, network, storage
- People - sysadmins, application developers, Grid experts
- Grid middleware - Globus, Condor, Unicore…
- Security - so you want to use my computer?
- Maintenance - ongoing monitoring, upgrades… and co-ordination of all this between multiple sites
- Applications and users!
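As a sense of what the middleware gives you at the command line, here is a hedged sketch of "simple job submission between sites" using the Globus Toolkit 2 tool globus-job-run, driven from Python. The gatekeeper hostname is a placeholder, and a valid proxy certificate (from grid-proxy-init) is assumed to exist already.

```python
# Sketch: run a trivial command on a remote Grid resource via the
# Globus Toolkit 2 CLI. Hostname is a placeholder; a valid Grid proxy
# certificate is assumed to have been created beforehand.
import subprocess

GATEKEEPER = "grid-compute.example.ac.uk"   # assumed remote gatekeeper

result = subprocess.run(
    ["globus-job-run", GATEKEEPER, "/bin/hostname"],
    capture_output=True, text=True, check=True,
)
print("remote job ran on:", result.stdout.strip())
```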
How you can get involved...
- NIEeS
- National eScience Centre (Edinburgh)
- NERC PhD studentships
- Your local eScience Centre
- Adopt an application!
Questions?