Symmetry Beauty Physics Grid Large Hadron Collider Particle Physics Condensed Matter Astrophysics Cosmology Nuclear Physics Atomic Physics Biological Physics

Avoiding Gridlock. Tony Doyle, Particle Physics Masterclass, Glasgow, 11 June 2009.

Outline: Introduction – Origin – Why? What is the Grid? How does the Grid work? When will it be ready? The Icemen Cometh.

Historical Perspective: The World Wide Web, a global information system whose users can read and write via computers connected to the Internet, was born on 13 March 1989, when Tim Berners-Lee at CERN submitted a proposal on information management. Development: the first three years were a phase of persuasion to get the Web adopted. Growth: the load on the first Web server ("info.cern.ch") rose steadily by a factor of 10 every year. Commercialisation: Google and other search engines; the "dot-com" boom (and bust). 2002-Present: the ubiquitous Web; Web 2.0, blogs and RSS.

Data is everywhere… Q: What is done with the data? Nothing; read it; listen to it; watch it; analyse it. A computer program (a "job") analyses it: read A (= 2), read B (= 3), set C = A + B, print C (= 5). Other jobs calculate how proteins fold, or what the weather is going to do. Q: How much data have humans produced? Around 1,000,000,000,000,000,000,000 bytes: 1 zettabyte, or 10^21 bytes (roughly doubling each year). According to IDC, as of 2006 the total amount of digital data in existence was … zettabytes; the same paper estimates that by 2010 the rate of digital data generated worldwide will be … zettabytes per year.
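To make the toy "job" above concrete, here is a minimal Python sketch of it; the input values 2 and 3 follow the slide, everything else is just illustration.

# Toy "job": read two numbers, add them, print the result,
# mirroring the Read A / Read B / C = A + B / Print C example above.
A = 2
B = 3
C = A + B
print(C)  # prints 5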

Why Grid? Analogy with the electricity power grid: a 'standard interface', power stations, and a distribution infrastructure.

Computing Grid: computing and data centres, connected by the fibre optics of the Internet.

Why do particle physicists need the Grid? Four large experiments at the CERN LHC, the world's most powerful particle accelerator.

Why do particle physicists need the Grid? Example from the LHC: starting from this event, we are looking for this signature. Selectivity: 1 in 10,000,000,000,000 – like looking for one person in a thousand world populations, or for a needle in 20 million haystacks! ~100,000,000 electronic channels; 800,000,000 proton-proton interactions per second, of which only a tiny fraction produce a Higgs; 10 PBytes of data a year (10 million GBytes = 14 million CDs). One year's data from the LHC would fill a stack of CDs 20 km high (for scale: Concorde flies at 15 km, Mont Blanc is 4.8 km high).
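A quick back-of-the-envelope check of the "stack of CDs" claim, as a Python sketch. The 10 PB/year and 14 million CD figures are from the slide; the CD capacity and disc thickness are assumed values.

# Rough check: would one year of LHC data (~10 PB) really fill a ~20 km stack of CDs?
data_per_year_gb = 10e6      # 10 PBytes expressed in GBytes (slide figure)
cd_capacity_gb = 0.7         # assumed ~700 MB per CD
cd_thickness_mm = 1.2        # assumed standard CD thickness

n_cds = data_per_year_gb / cd_capacity_gb
stack_height_km = n_cds * cd_thickness_mm / 1e6  # mm -> km

print(f"{n_cds:.1e} CDs, stack about {stack_height_km:.0f} km high")
# ~1.4e7 CDs (the slide's 14 million) and roughly 17 km: of the order of the quoted 20 km.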

Data Grid: the Grid enables us to analyse all the data that comes from the LHC. Petabytes of data and around 100,000 CPUs, distributed around the world; now used in many other areas.

Why (particularly) the LHC? 1. Rare phenomena, huge background: 9 orders of magnitude between all interactions and the Higgs. 2. Complexity. "When you are face to face with a difficulty you are up against a discovery" (Lord Kelvin).

Four LHC Experiments ALICE - heavy ion collisions, to create quark-gluon plasmas - 50,000 particles in each collision LHCb - to study the differences between matter and antimatter - producing over 100 million b and b-bar mesons each year ATLAS - general purpose: origin of mass, supersymmetry, micro-black holes? - 2,000 scientists from 34 countries CMS - general purpose detector - 1,800 scientists from 150 institutes One Grid to Rule Them All?

The Challenges I: Real-Time Event Selection – 9 orders of magnitude. [Figure: time vs. real-time / in-time event selection.]

The Challenges II: Real-Time Complexity. Understand and interpret the data via numerically intensive simulations: many events (~10^9 events/experiment/year, ~1 MB/event raw data, several passes required). Worldwide Grid computing requirement (2008): ~300 TeraIPS, i.e. 100,000 of today's fastest processors connected via a Grid. [Diagram of the real-time data flow: detectors (charge/time/pattern, energy/tracks) with 16 million channels at a 40 MHz collision rate; Level-1 trigger at 100 kHz with 1 MegaByte event data; 500 readout memories and 3 Gigacell buffers; event builder networks at ~1 Terabit/s (50,000 data channels, 500 Gigabit/s); event filter at ~20 TeraIPS; 200 GigaByte buffers, a PetaByte archive and a Grid computing service at ~300 TeraIPS.]
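The arithmetic behind those rates can be sketched in a few lines of Python. The 40 MHz, 100 kHz and ~1 MB/event figures are from the slide; the event-filter output rate and live time are assumed placeholders.

# Back-of-the-envelope data-flow arithmetic for the LHC trigger chain.
collision_rate_hz = 40e6     # 40 MHz collision rate at the detector (slide)
level1_rate_hz = 100e3       # 100 kHz after the Level-1 trigger (slide)
event_size_bytes = 1e6       # ~1 MB of raw data per event (slide)

raw_rate = collision_rate_hz * event_size_bytes       # before any trigger
builder_rate = level1_rate_hz * event_size_bytes      # into the event builder
print(f"Raw: {raw_rate/1e12:.0f} TB/s, into event builder: {builder_rate/1e9:.0f} GB/s")

filter_rate_hz = 300          # assumed events/s written out by the event filter
live_seconds = 1e7            # assumed ~10^7 seconds of data taking per year
archived = filter_rate_hz * event_size_bytes * live_seconds
print(f"Archived per year: {archived/1e15:.0f} PB per experiment")
# A few PB per experiment, consistent with the ~10 PB/year quoted earlier for the LHC overall.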

Solution: build a Grid. Share more than information: share data, computing power and applications; make efficient use of resources at many institutes; gain leverage over other sources of funding; join local communities. The challenges: share data between thousands of scientists with multiple interests; link major and minor computer centres; ensure all data is accessible anywhere, anytime; grow rapidly, yet remain reliable for more than a decade; cope with the different management policies of different centres; ensure data security; and be up and running routinely.

Middleware is the Key. Middleware is the operating system of a distributed computing system. On a single PC, the operating system sits between your programs (Word/Excel, the Web, games, your own program) and the hardware (CPU, disks, etc.). On the Grid, middleware sits between your program and the distributed resources: the user interface machine, resource broker, information service, CPU clusters, disk servers, replica catalogue and bookkeeping service.
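As a rough illustration of what a resource broker does (not the actual gLite implementation), here is a toy Python sketch that matches a job's requirements against sites published by an information service; the site names, numbers and dataset labels are made up.

# Toy resource broker: pick a site that satisfies a job's requirements.
sites = [
    {"name": "ScotGrid-Glasgow", "free_cpus": 120, "datasets": {"lhcb-sim"}},
    {"name": "RAL-Tier1",        "free_cpus": 800, "datasets": {"atlas-raw", "lhcb-sim"}},
    {"name": "QMUL",             "free_cpus": 0,   "datasets": {"atlas-raw"}},
]
job = {"cpus_needed": 50, "dataset": "lhcb-sim"}

def broker(job, sites):
    """Return the matching site with the most free CPUs, or None."""
    matches = [s for s in sites
               if s["free_cpus"] >= job["cpus_needed"] and job["dataset"] in s["datasets"]]
    return max(matches, key=lambda s: s["free_cpus"], default=None)

chosen = broker(job, sites)
print(chosen["name"] if chosen else "no matching site")  # -> RAL-Tier1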

Something like this…

… or this: the gLite job submission workflow. The submitter works on a grid user interface machine (gridui): (1) voms-proxy-init obtains a short-lived VOMS proxy; (2) the job, described in JDL, is submitted to the Workload Management System (WLMS) and Resource Broker (RB), which consult the BDII information service and the LFC file catalogue and dispatch the job to grid-enabled resources (CPU nodes and storage); the Logging & Bookkeeping service answers "Job status?" queries; finally, job retrieval fetches the output back to the user.
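For orientation, here is a hedged sketch of those steps as they might be run from a gLite user interface, wrapped in Python; the VO name, JDL contents and file names are placeholders, exact command options can differ between gLite versions, and job-identifier handling is elided.

import subprocess

# A minimal, illustrative JDL (Job Description Language) file.
jdl = '''Executable    = "my_analysis.sh";
StdOutput     = "out.txt";
StdError      = "err.txt";
OutputSandbox = {"out.txt", "err.txt"};
'''
with open("myjob.jdl", "w") as f:
    f.write(jdl)

# (1) Obtain a short-lived VOMS proxy from your grid certificate.
subprocess.run(["voms-proxy-init", "--voms", "lhcb"], check=True)

# (2) Submit the JDL-described job to the Workload Management System.
subprocess.run(["glite-wms-job-submit", "-a", "myjob.jdl"], check=True)

# Later: query the Logging & Bookkeeping service and retrieve the output,
# using the job identifier printed by the submit command (placeholders below).
# subprocess.run(["glite-wms-job-status", "<job_id>"])
# subprocess.run(["glite-wms-job-output", "<job_id>"])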

… or this: LCG, OSG, NDG, NGS. An open operating system does not only have advantages.

Who do you trust? No-one? It depends on what you want… (assume it's scientific collaboration).

How do I Authorise? Digital Certificates

REAL and SIMULATED Data Structure. Real data: the data acquisition system and Level-3 trigger produce raw data and trigger tags; reconstruction, using calibration data, run conditions and the trigger system, produces Event Summary Data (ESD) and event tags. Simulated data: physics models generate Monte Carlo truth data; detector simulation produces MC raw data; reconstruction then produces MC Event Summary Data and MC event tags.

Physics Analysis: starting from ESD (data or Monte Carlo) and event tags, event selection produces Analysis Object Data (AOD); analysis and skims, using calibration data, produce physics objects. Raw data and ESD sit at Tier 0/1 (collaboration-wide), AOD at Tier 2 (analysis groups), and physics objects at Tier 3/4 (individual physicists), with the diagram annotated "increasing data flow".
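The analysis chain on this slide can be written down compactly; below is a small Python sketch of the flow and of where each data product typically lives, using only the stages and tiers named on the slide.

# The analysis data flow from the slide: raw data -> ESD -> AOD -> physics objects,
# together with the tier at which each product is held and who uses it.
analysis_chain = [
    # (data product,                 produced by,           typical location / users)
    ("Raw Data",                     "detector / trigger",  "Tier 0/1 (collaboration-wide)"),
    ("Event Summary Data (ESD)",     "reconstruction",      "Tier 0/1 (collaboration-wide)"),
    ("Analysis Object Data (AOD)",   "event selection",     "Tier 2 (analysis groups)"),
    ("Physics Objects",              "analysis and skims",  "Tier 3/4 (individual physicists)"),
]
for product, producer, location in analysis_chain:
    print(f"{product:30s} <- {producer:20s} @ {location}")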

Grid Infrastructure: Tier 0, the CERN computer centre (fed by the online system and the offline farm); Tier 1, national centres (RAL in the UK; France, Italy, Germany, Spain, … 11 T1 centres in all); Tier 2, regional groups (in the UK: ScotGrid, NorthGrid, SouthGrid, London); institutes (e.g. Glasgow, Edinburgh, Durham in ScotGrid); and individual workstations. This structure was chosen for particle physics; it is different for other disciplines.
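The same hierarchy written out as a small nested Python structure for concreteness; only the UK branch is spelled out, exactly as listed on the slide.

# The tier structure from the slide, as a nested dictionary (UK branch only).
grid_tiers = {
    "Tier 0": "CERN computer centre",
    "Tier 1 (national centres)": ["RAL (UK)", "France", "Italy", "Germany", "Spain"],  # 11 T1s in all
    "Tier 2 (UK regional groups)": ["ScotGrid", "NorthGrid", "SouthGrid", "London"],
    "Institutes (ScotGrid)": ["Glasgow", "Edinburgh", "Durham"],
    "Workstations": "individual desktops",
}
for tier, members in grid_tiers.items():
    print(f"{tier}: {members}")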

An Example - ScotGrid: just in time for the LHC (the machine room downstairs).

The Grid: Archaeology, Astronomy, Astrophysics, Civil Protection, Computational Chemistry, Earth Sciences, Finance, Fusion, Geophysics, High Energy Physics, Life Sciences, Multimedia, Material Sciences, … More than 250 sites in 48 countries, >50,000 CPUs, >20 PetaBytes of storage, >10,000 users, >150 virtual organisations (VOs), >150,000 jobs/day.

1. Why? 2. What? 3. How? 4. When? From a particle physics perspective, the Grid is: 1. needed to utilise large-scale computing resources efficiently and securely; 2. (a) a working system running today on large resources, (b) about seamless discovery of computing resources, (c) using evolving standards for interoperation, and (d) the basis for computing in the 21st century; 3. built using middleware; 4. available now – ready for LHC data.

Avoiding Gridlock? Avoid computer lock-up by using a Grid. Gridlock is avoided provided you have a star network (the basis of the internet). Computing is then almost limitless.

Thank YOU