Enabling e-Research over GridPP
Dan Tovey, University of Sheffield, 28th March 2006

ATLAS

The Large Hadron Collider (LHC) is under construction at CERN in Geneva. When it commences operation in 2007 it will be the world's highest-energy collider. Sheffield is a key member of the ATLAS collaboration, which is building one of the two general-purpose detectors on the LHC ring. The main motivations for building the LHC and ATLAS are:
– Finding the Higgs boson
– Finding evidence for Supersymmetry, believed to be the next great discovery / layer in our understanding of the universe

Sheffield

Sheffield leads the Supersymmetry (SUSY) searches at ATLAS, and also coordinates all ATLAS physics activities in the UK, including the Higgs and SUSY searches. Sheffield is responsible for building the ATLAS Semiconductor Tracker (SCT) detector and for writing event reconstruction software.

(Figure: a simulated SUSY signal, labelled "SUSY (= Nobel Prize)", alongside the Standard Model (SM) expectation. NB: this is a simulation!)

Construction

(Image-only slide.)

Event Selection

Online event selection must pick out the interesting collisions: a reduction of 9 orders of magnitude.
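For a sense of where 9 orders of magnitude comes from, here is a minimal Python sketch using the trigger rates quoted on the next slide; the ~25 interactions per bunch crossing and the ~100 Hz recording rate are assumed values typical of LHC-era designs, not figures from this talk, and the remaining selectivity comes from the offline analysis.

```python
import math

# Rough online rejection implied by the trigger chain on the next slide.
# ASSUMPTIONS: design-luminosity pile-up (~25 interactions per crossing)
# and a ~100 Hz event-filter output rate; neither is quoted in the talk.
collision_rate_hz = 40e6          # 40 MHz collision rate (from the talk)
interactions_per_crossing = 25    # assumed pile-up
level1_rate_hz = 100e3            # 100 kHz Level-1 trigger (from the talk)
recorded_rate_hz = 100.0          # assumed rate to storage

interaction_rate_hz = collision_rate_hz * interactions_per_crossing  # ~1 GHz
rejection = interaction_rate_hz / recorded_rate_hz
print(f"Level-1 keeps 1 crossing in {collision_rate_hz / level1_rate_hz:.0f}")
print(f"Online selection keeps ~1 interaction in {rejection:.0e}, i.e. "
      f"~{math.log10(rejection):.0f} orders of magnitude")
```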

The Data Deluge

We understand and interpret the data via numerically intensive simulations: e.g. simulating 1 SUSY event (ATLAS Monte Carlo Simulation) takes 20 minutes and produces 3.5 MB on a 1 GHz PIII.

(Diagram: the read-out and trigger chain, from the 40 MHz collision rate and 16 million detector channels, through the 100 kHz Level-1 trigger, event builder and event filter, out to the PetaByte archive and a 300 TeraIPS Grid Computing Service.)

Many events:
– ~10^9 events/experiment/year
– >~1 MB/event raw data
– several passes required

Worldwide LHC computing requirement (2007):
– 100 Million SPECint2000 (= 100,000 of today's fastest processors)
– 12-14 PetaBytes of data per year (= 100,000 of today's highest-capacity HDDs)
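These figures can be cross-checked with simple arithmetic. The Python sketch below is an illustration using only the numbers quoted on this slide, and reproduces the scale of the requirement:

```python
# Back-of-envelope check of the "data deluge" numbers above.
EVENTS_PER_YEAR = 1e9      # ~10^9 events/experiment/year (from the slide)
RAW_MB_PER_EVENT = 1.0     # >~1 MB/event raw data (from the slide)
SIM_MIN_PER_EVENT = 20.0   # 20 min to simulate 1 SUSY event on a 1 GHz PIII
MINUTES_PER_YEAR = 365 * 24 * 60

# Raw data volume for one experiment in one year (10^9 MB = 1 PB).
raw_pb = EVENTS_PER_YEAR * RAW_MB_PER_EVENT / 1e9
print(f"Raw data: ~{raw_pb:.0f} PB/experiment/year")
# ~1 PB; several experiments plus simulation and multiple reconstruction
# passes bring this to the quoted 12-14 PB/year.

# CPUs needed to simulate every event once per year, in 1 GHz PIII units.
cpus_needed = EVENTS_PER_YEAR * SIM_MIN_PER_EVENT / MINUTES_PER_YEAR
print(f"Simulating all events would need ~{cpus_needed:,.0f} PIII CPUs")
```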

LCG

The aim is to use Grid techniques to solve this problem. The CERN LHC Computing Grid (LCG) project coordinates the activities in Europe; similar projects exist in the US (Grid3/OSG) and the Nordic countries (NorduGrid). The LCG prototype went live in September 2003 in 12 countries, including the UK, and has been extensively tested by the LHC experiments.

What is GridPP?

– 19 UK Universities, CCLRC (RAL & Daresbury) and CERN
– Funded by the Particle Physics and Astronomy Research Council (PPARC)
– GridPP1: £17m, "From Web to Grid"
– GridPP2: £16m, "From Prototype to Production"
– The UK contribution to LCG

GridPP in Context

(Diagram, not to scale, placing GridPP in context: the UK Core e-Science Programme, Institutes, Tier-2 Centres, the Tier-1/A, CERN, LCG and EGEE; GridPP activities span Apps Dev, Apps Int, Middleware/Security/Networking, Experiments Grid Support, and the Grid Support Centre.)

(Diagram of the GridPP project organisation; labels include: CB, PMB, a Deployment Board (Tier1/Tier2, testbeds, rollout; service specification & provision), a User Board (requirements, application development, user feedback), the middleware areas Metadata, Workload, Network, Security, Info. Mon. and Storage, and external links to ARDA, the experiments, EGEE and LCG.)

Tier Structure

– Tier-0 (CERN)
– Tier-1: Lyon, BNL, RAL
– Tier-2 (under the RAL Tier-1): NorthGrid, SouthGrid, ScotGrid, ULGrid
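The hierarchy is simple enough to write down directly; below is a small Python sketch (names taken from the slide) that models and prints the tier tree, just to make the fan-out from CERN explicit:

```python
# MONARC-style tier hierarchy from the slide: data flows from the Tier-0
# at CERN to national Tier-1 centres, each feeding regional Tier-2s.
# Only the UK (RAL) branch is expanded on the slide.
TIERS = {
    "Tier-0 (CERN)": {
        "Tier-1 (Lyon)": {},
        "Tier-1 (BNL)": {},
        "Tier-1 (RAL)": {
            "NorthGrid": {},
            "SouthGrid": {},
            "ScotGrid": {},
            "ULGrid": {},
        },
    },
}

def print_tiers(node: dict, depth: int = 0) -> None:
    """Print the tier tree with indentation showing depth."""
    for name, children in node.items():
        print("  " * depth + name)
        print_tiers(children, depth + 1)

print_tiers(TIERS)
```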

UK Tier-1/A Centre

The Rutherford Appleton Laboratory provides high-quality data services, plays national and international roles, and is the UK focus for international Grid development. Resources: 1400 CPUs, 80 TB disk, 60 TB tape (capacity 1 PB).

(Plot: 2004 CPU utilisation, annotated "Grid Resource Discovery Time = 8 Hours".)

UK Tier-2 Centres

– ScotGrid: Durham, Edinburgh, Glasgow
– NorthGrid: Daresbury, Lancaster, Liverpool, Manchester, Sheffield (WRG)
– SouthGrid: Birmingham, Bristol, Cambridge, Oxford, RAL PPD, Warwick
– LondonGrid: Brunel, Imperial, QMUL, RHUL, UCL

NorthGrid

NorthGrid is a Tier-2 collaboration between Sheffield (WRG), Lancaster, Liverpool, Manchester and Daresbury Lab.

WRG & NorthGrid

The White Rose Grid contributes to NorthGrid and GridPP with a new SRIF2-funded machine at Sheffield (Iceberg). The LCG component of Iceberg provides a base of 230 kSI2k, and on demand up to 340 kSI2k, with state-of-the-art 2.4 GHz Opteron CPUs. It delivered the 2nd-highest GridPP Tier-2 throughput for ATLAS.

GridPP Deployment Status

There are three Grids on a global scale in HEP, with similar functionality:

Grid            sites       CPUs
LCG (GridPP)    228 (19)    (3500)
Grid3 [USA]     –           –
NorduGrid       –           –

GridPP's deployment is part of LCG, currently the largest Grid in the world.

ATLAS Data Challenges

– DC2 (2005): 7.7 M GEANT4 events and 22 TB; UK ~20% of LCG
– DC3/CSC (2006): > 20 M G4 events; ongoing

(3) Grid Production has the largest total computing requirement, yet is still a small fraction of what ATLAS needs. ATLAS is now in its Grid Production Phase: LCG is now reliably used for production.
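As a quick consistency check, the Python sketch below derives the average event size from the DC2 totals above (the per-event figure is derived, not quoted on the slide):

```python
# Average event size implied by the DC2 figures above.
dc2_events = 7.7e6   # 7.7 M GEANT4 events
dc2_bytes = 22e12    # 22 TB total output
mb_per_event = dc2_bytes / dc2_events / 1e6
print(f"DC2 average: ~{mb_per_event:.1f} MB/event")
# ~2.9 MB/event, consistent with the ~3.5 MB per simulated SUSY event
# quoted on the Data Deluge slide.
```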

Further Info