The Grid – Prof Steve Lloyd, Queen Mary, University of London

Slide 1: The Grid – Prof Steve Lloyd, Queen Mary, University of London

Slide 2: The Physics Challenge
The relative number of events spans 10 orders of magnitude, from all interactions, through the Standard Model processes (jets, W, Z), down to the Higgs.

Slide 3: Event Complexity
The event displays show 25 separate interactions piled up at once, and an 8-jet event.

Slide 4: The Data Deluge
- Collisions 40 million times a second (40 MHz)
- 150 million electronic channels
- Petabytes of data per year (1 Petabyte = 1000 Terabytes; 1 Terabyte = 1000 Gigabytes; 1 Gigabyte = 1000 Megabytes)
- Total ATLAS disk used: 100 PB
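To see where "petabytes per year" comes from, here is a back-of-envelope estimate in Python. Only the 40 MHz collision rate comes from the slide; the recorded event rate, event size and running time are illustrative assumptions:

    # Rough yearly data volume. ASSUMPTIONS (illustrative only): the trigger
    # cuts the 40 MHz collision rate down to ~200 Hz of recorded events,
    # each event being ~1.5 MB.
    recorded_rate_hz = 200       # assumed post-trigger event rate
    event_size_bytes = 1.5e6     # assumed event size (~1.5 MB)
    seconds_per_year = 1.0e7     # assumed accelerator live time per year

    bytes_per_year = recorded_rate_hz * event_size_bytes * seconds_per_year
    print(f"{bytes_per_year / 1e15:.0f} PB per year")  # prints: 3 PB per year

Even under these conservative assumptions, the output lands firmly in the petabyte range.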

Slide 5: From Web to Grid
The solution is a massive distributed computer system, the Grid, held together by Grid middleware. It is relatively inexpensive, scalable, simple (?) to use, accessible 24/7, easily upgraded and robust.

Slide 6: Analogy with the Electricity Power Grid
Power stations, a distribution infrastructure and a 'standard interface' together make up the electricity grid; the computing Grid is built to the same pattern.

Slide 7: The Computing Grid
In the computing version, computing and data centres take the place of power stations, and the fibre optics of the Internet form the distribution infrastructure of the computing Grid.

Slide 8: The Grid – A Distributed PC
On a single PC, the operating system sits between the programs (Word/Excel/Web, games, your own program) and the hardware (CPU, disks etc). On the Grid, middleware plays the same role: your program is submitted from a User Interface machine, and the Workload Management, Information Service, Replica Catalogue and Bookkeeping Service route it to CPU clusters and disk servers. Middleware is the operating system of a distributed computing system.
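To make the middleware's role concrete, here is a minimal matchmaking sketch in Python: a toy stand-in for what the Workload Management and Information Services do together, comparing a job's requirements against what each site advertises. Every class name and site figure here is invented for illustration; real Grid middleware is far more elaborate:

    # Toy matchmaker: all classes and site figures are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Site:          # what an Information Service might publish per site
        name: str
        free_cpus: int
        free_disk_tb: float

    @dataclass
    class Job:           # what a user submits from the User Interface machine
        cpus: int
        disk_tb: float

    def match(job, sites):
        # Workload Management in miniature: first site meeting the requirements.
        for site in sites:
            if site.free_cpus >= job.cpus and site.free_disk_tb >= job.disk_tb:
                return site
        return None

    sites = [Site("RAL", 500, 20.0), Site("QMUL", 200, 5.0)]
    chosen = match(Job(cpus=100, disk_tb=2.0), sites)
    print(chosen.name)   # prints: RAL

A real broker would also consult the Replica Catalogue, so that jobs run close to the data they need.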

Slide 9: Tier Structure
- Tier-0: the CERN computer centre, fed by the online system and offline farm.
- Tier-1: national centres, such as RAL (UK), France, Italy, Germany and the USA.
- Tier-2: regional groups, in the UK ScotGrid, NorthGrid, SouthGrid and London.
- Institutes (London Tier-2): Brunel, Imperial, QMUL, RHUL and UCL, each with their own servers.
This is a useful model for particle physics, but not necessarily for others.
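For a programmatic view, the same hierarchy can be written as a plain Python data structure; the contents are taken directly from the slide:

    # The tier hierarchy of the slide as a plain data structure.
    tiers = {
        "Tier-0": ["CERN computer centre"],
        "Tier-1 (national centres)": ["RAL (UK)", "France", "Italy", "Germany", "USA"],
        "Tier-2 (UK regional groups)": ["ScotGrid", "NorthGrid", "SouthGrid", "London"],
        "Institutes (London Tier-2)": ["Brunel", "Imperial", "QMUL", "RHUL", "UCL"],
    }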

Slide 10: Tier-1 v Tier-2
Tier-1s have an international role; Tier-2s leverage university funding and local publicity.
- RAL Tier-1: 7,400 PC equivalents, 2.3 PB disk, 5 PB tape
- QMUL Tier-2: 3,500 PC equivalents, 1.7 PB disk

Slide 11: What is a Grid good for?
The Grid suits problems that are highly parallelizable, where the input data are independent: processing separate images, or running the same simulation with different parameters (e.g. A=2 B=3, A=3 B=3, A=2 B=4). If the pieces of a problem are independent, each maps onto its own Grid job; if the pieces have to interact, as in closely coupled problems, the Grid is not so good. A minimal sketch of the independent case follows.
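This "embarrassingly parallel" pattern is easy to demonstrate on a single machine. The sketch below farms out the slide's three parameter sets as independent jobs; the simulate function is a hypothetical stand-in for real simulation code:

    # Independent jobs farmed out in parallel: the Grid pattern in miniature.
    # 'simulate' is a hypothetical placeholder for a real computation.
    from multiprocessing import Pool

    def simulate(params):
        a, b = params            # one parameter set, e.g. A=2, B=3
        return float(a ** b)     # placeholder computation

    if __name__ == "__main__":
        jobs = [(2, 3), (3, 3), (2, 4)]         # the slide's parameter sets
        with Pool() as pool:
            results = pool.map(simulate, jobs)  # each job runs independently
        print(results)           # prints: [8.0, 27.0, 16.0]

Because no job needs another's result, the work scales out to as many processors (or Grid sites) as are available; closely coupled problems lose this property.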

Slide 12: Other Applications
Astronomy, healthcare, bioinformatics, gaming, engineering and commerce.

Slide 13: Other Applications
The infrastructure can be opened to other scientific disciplines, such as bioinformatics.

Slide 14: The Grid in Schools
Data from Medipix detector chips are made available via the Grid (Simon Langton Grammar School for Boys, Canterbury) for use in schools: cosmic rays, radioactivity etc. LUCID will fly on TechDemoSat-1 in Autumn 2012. Data pass from each school via a schools coordinator and a local university onto the Grid, and schools without chips will also be able to contribute to the analysis.

Slide 15: The Real Time Monitor
Try it yourself (Java App):