1
Symmetry, Beauty, Physics, Grid, Large Hadron Collider, Particle Physics, Condensed Matter, Astrophysics, Cosmology, Nuclear Physics, Atomic Physics, Biological Physics
2
Avoiding Gridlock Tony Doyle Particle Physics Masterclass Glasgow, 11 June 2009
3
Outline
Introduction – Origin – Why?
What is the Grid?
How does the Grid work?
When will it be ready?
The Icemen Cometh
4
Historical Perspective: The World Wide Web, a global information system which users can read and write via computers connected to the Internet, was born on 13 March 1989, when Tim Berners-Lee at CERN submitted a proposal, "Information Management: A Proposal".
1989-91: Development. The first three years were a phase of persuasion to get the Web adopted…
1992-1995: Growth. The load on the first Web server ("info.cern.ch") rose steadily by a factor of 10 every year…
1996-1998: Commercialisation. Google and other search engines.
1999-2001: "Dot-com" boom (and bust).
2002-Present: The ubiquitous Web. Web 2.0: blogs and RSS.
5
Data is everywhere… Q: What is done with the data? Nothing. Read it. Listen to it. Watch it. Analyse it. A computer program (a "job") can be as simple as: Read A, Read B, C = A + B, Print C (e.g. A = 2, B = 3 gives C = 5). Or it can calculate how proteins fold, or what the weather is going to do. Q: How much data have humans produced? About 1,000,000,000,000,000,000,000 bytes: 1 zettabyte, or 10^21 bytes (roughly doubling each year). According to IDC, as of 2006 the total amount of digital data in existence was 0.161 zettabytes; the same paper estimates that by 2010 the rate of digital data generated worldwide will be 0.988 zettabytes per year.
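The toy "job" above can be written out directly; a minimal Python sketch of the same Read A, Read B, C = A + B, Print C program:

    # The toy "job" from this slide: read two numbers, add them, print the sum.
    a = int(input("A: "))   # e.g. 2
    b = int(input("B: "))   # e.g. 3
    c = a + b
    print(c)                # -> 5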
6
Why Grid? The analogy is with the electricity power grid: power stations feed a distribution infrastructure, and users draw on it through a "standard interface".
7
The Computing Grid: computing and data centres, connected by the fibre optics of the Internet.
8
Why do particle physicists need the Grid? Four large experiments at the CERN LHC, the world's most powerful particle accelerator.
9
Why do particle physicists need the Grid? An example from the LHC: starting from this event, we are looking for this signature. Selectivity: 1 in 10^13, like looking for one person in a thousand world populations, or for a needle in 20 million haystacks! ~100,000,000 electronic channels; 800,000,000 proton-proton interactions per second; 0.0002 Higgs per second; 10 PBytes of data a year (10 million GBytes, or 14 million CDs). One year's data from the LHC would fill a stack of CDs 20 km high, taller than Concorde's cruising altitude (15 km) and Mont Blanc (4.8 km).
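Those selectivity and data-volume figures follow from the slide's own numbers; a quick back-of-envelope check in Python:

    # Back-of-envelope checks using the numbers quoted on this slide.
    interactions_per_s = 800_000_000   # proton-proton interactions per second
    higgs_per_s = 0.0002               # Higgs produced per second
    print(f"selectivity: 1 in {interactions_per_s / higgs_per_s:.0e}")  # ~4e12, i.e. ~10^13

    bytes_per_year = 10e15             # 10 PB of data a year
    cds = bytes_per_year / 700e6       # ~700 MB per CD
    stack_km = cds * 1.2e-3 / 1000     # a CD is ~1.2 mm thick
    print(f"{cds/1e6:.0f} million CDs, a stack ~{stack_km:.0f} km high")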
10
Data Grid: the Grid enables us to analyse all the data that comes from the LHC: petabytes of data on 100,000 CPUs distributed around the world. It is now used in many other areas.
11
Why (particularly) the LHC? 1. Rare phenomena, huge background: the Higgs sits 9 orders of magnitude below all interactions. 2. Complexity. "When you are face to face with a difficulty you are up against a discovery." (Lord Kelvin)
12
Four LHC Experiments
ALICE: heavy ion collisions, to create quark-gluon plasmas; 50,000 particles in each collision.
LHCb: to study the differences between matter and antimatter; producing over 100 million b and b-bar mesons each year.
ATLAS: general purpose (origin of mass, supersymmetry, micro black holes?); 2,000 scientists from 34 countries.
CMS: general purpose detector; 1,800 scientists from 150 institutes.
One Grid to Rule Them All?
13
The Challenges I: Real-Time Event Selection. [Figure: event selection spanning 9 orders of magnitude, performed in real time and in time.]
14
The Challenges II: Real-Time Complexity. Understand and interpret the data via numerically intensive simulations. Many events: ~10^9 events/experiment/year, >~1 MB/event of raw data, with several passes required. Worldwide Grid computing requirement (2008): ~300 TeraIPS (100,000 of today's fastest processors connected via a Grid). [Figure: the real-time trigger and data-acquisition chain. Detectors (charge, time, pattern) with 16 million channels at a 40 MHz collision rate feed 1 Terabit/s over 50,000 data channels into 500 readout memories; the Level-1 trigger cuts this to 100 kHz; the event builder assembles 1 MegaByte events into 200 GigaByte buffers (3 Gigacell buffers) at 500 Gigabit/s; a 20 TeraIPS event filter writes to a PetaByte archive and feeds the 300 TeraIPS Grid computing service over a service LAN.]
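The reduction factors in that chain can be checked from the quoted figures; a rough Python sketch:

    # Rough data-rate arithmetic for the trigger chain on this slide.
    collision_rate_hz = 40e6    # 40 MHz collision rate
    level1_rate_hz = 100e3      # 100 kHz after the Level-1 trigger
    event_size = 1e6            # ~1 MB per event
    print(f"untriggered: {collision_rate_hz * event_size / 1e12:.0f} TB/s")  # 40 TB/s
    print(f"after Level-1: {level1_rate_hz * event_size / 1e9:.0f} GB/s")    # 100 GB/s
    # The event filter must cut several more orders of magnitude
    # before events reach the PetaByte archive.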
15
Share more than information: share data, computing power and applications. Make efficient use of resources at many institutes, gain leverage over other sources of funding, and join local communities. The challenges:
share data between thousands of scientists with multiple interests;
link major and minor computer centres;
ensure all data is accessible anywhere, anytime;
grow rapidly, yet remain reliable for more than a decade;
cope with the different management policies of different centres;
ensure data security;
be up and running routinely.
Solution: build a Grid.
16
Middleware is the Key. On a single PC, the operating system sits between your programs (Word/Excel, email/web, your own program, games) and the hardware (CPU, disks, etc.). On the Grid, middleware plays the same role: between your program and the distributed hardware (CPU clusters, disk servers) sit a user interface machine, a resource broker, an information service, a replica catalogue and a bookkeeping service. Middleware is the operating system of a distributed computing system.
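To make that concrete, here is a hypothetical Python sketch of the resource broker's role; every name in it is invented for illustration, not taken from any real middleware:

    # Hypothetical sketch: a resource broker matching a job to a cluster
    # using the information service's view of the Grid. Illustrative only.
    from dataclasses import dataclass

    @dataclass
    class Cluster:
        name: str
        free_cpus: int
        has_input_data: bool

    def broker_submit(clusters, cpus_needed):
        # Prefer sites that already hold the input data, to avoid moving it.
        for c in sorted(clusters, key=lambda c: not c.has_input_data):
            if c.free_cpus >= cpus_needed:
                return f"job sent to {c.name}"
        return "job queued until resources free up"

    grid = [Cluster("Glasgow", 200, True),
            Cluster("CERN", 0, True),
            Cluster("RAL", 500, False)]
    print(broker_submit(grid, cpus_needed=100))   # -> job sent to Glasgow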
17
Something like this…
18
… or this. [Diagram: a gLite-style job flow. Components: a user interface (gridui); VOMS; the workload management system (WLMS) with job submission (JS) and resource broker (RB); the LFC file catalogue; the BDII information system; a logging & bookkeeping service; and several sites of grid-enabled resources (CPU nodes plus storage). The labelled steps: 0 VOMS-proxy-init, 1 job submission (as a JDL file), 2 job status?, 11 job retrieval; the intermediate steps 3-10 route the job through the broker to a site and record its progress in logging & bookkeeping.]
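In command terms the numbered steps look roughly like this; a sketch assuming a gLite-style user interface (the VO name "myexperiment" is made up, and the commands are illustrative of that era's CLI):

    # Sketch of the job flow above on a gLite-style user interface.
    # The VO name is invented; commands are illustrative of the gLite CLI.
    import subprocess

    # Step 0: authenticate to the VO and obtain a VOMS proxy.
    subprocess.run(["voms-proxy-init", "--voms", "myexperiment"], check=True)

    # Step 1: describe the job in JDL and hand it to the workload manager,
    # which consults BDII (information) and LFC (files) to choose a site.
    with open("job.jdl", "w") as f:
        f.write('Executable = "/bin/hostname";\n'
                'StdOutput = "std.out";\n'
                'StdError = "std.err";\n'
                'OutputSandbox = {"std.out", "std.err"};\n')
    subprocess.run(["glite-wms-job-submit", "-a", "-o", "jobid.txt", "job.jdl"],
                   check=True)

    # Steps 2 and 11: poll the logging & bookkeeping service, then
    # retrieve the output sandbox once the job is done.
    subprocess.run(["glite-wms-job-status", "-i", "jobid.txt"], check=True)
    subprocess.run(["glite-wms-job-output", "-i", "jobid.txt"], check=True)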
19
… or this. An open operating system does not only have advantages: several Grid infrastructures coexist, each with its own middleware (LCG, OSG, NDG, NGS, …).
20
Who do you trust? No-one? It depends on what you want… (assume it's scientific collaboration).
21
How do I Authorise? Digital Certificates
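Grid authorisation rests on X.509 digital certificates. As a sketch of what one contains, assuming a PEM-encoded certificate file called usercert.pem and the Python `cryptography` package:

    # Inspect an X.509 certificate: who it identifies (subject), who vouches
    # for it (issuer, i.e. the certificate authority), and when it is valid.
    # Assumes a PEM file named usercert.pem; needs the `cryptography` package.
    from cryptography import x509

    with open("usercert.pem", "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())

    print("subject:", cert.subject.rfc4514_string())
    print("issuer: ", cert.issuer.rfc4514_string())
    print("valid:  ", cert.not_valid_before, "to", cert.not_valid_after)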
22
REAL and SIMULATED data. Data structure: the data acquisition system and Level-3 trigger produce raw data and trigger tags; reconstruction turns raw data into Event Summary Data (ESD) and event tags. In parallel, physics models generate Monte Carlo truth data; detector simulation turns this into MC raw data, and reconstruction into MC Event Summary Data and MC event tags. Calibration data, run conditions and the trigger system feed both chains.
23
Physics Analysis. From the ESD (data or Monte Carlo) and event tags, event selection produces Analysis Object Data (AOD); analysis and skims, using calibration data, reduce the AOD to physics objects. Raw data sits at Tier 0/1 and is collaboration-wide; analysis groups work at Tier 2; individual physicists at Tiers 3 and 4 (increasing data flow along the chain).
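The chain is essentially a sequence of ever smaller derived datasets; a sketch with illustrative per-event sizes (rough orders of magnitude, not official experiment figures):

    # Illustrative data-reduction chain; per-event sizes are rough
    # orders of magnitude, not official experiment figures.
    chain = [("RAW", 1_000_000),  # full readout, ~1 MB/event (slide 14)
             ("ESD", 100_000),    # event summary data
             ("AOD", 10_000),     # analysis object data
             ("TAG", 1_000)]      # event tags for fast selection
    events_per_year = 1_000_000_000   # ~10^9 events/experiment/year
    for name, size in chain:
        print(f"{name}: {events_per_year * size / 1e15:.3f} PB/year")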
24
Grid Infrastructure. The online system feeds the CERN computer centre and offline farm (Tier 0). Below that sit 11 Tier-1 national centres (RAL in the UK; others in France, Italy, Germany, Spain, …), then Tier-2 regional groups (in the UK: ScotGrid, NorthGrid, SouthGrid, London), then institutes (Glasgow, Edinburgh and Durham within ScotGrid), and finally workstations. This structure was chosen for particle physics; it would be different for other fields.
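That hierarchy, restricted to the sites named on the slide, as a nested structure:

    # The tier hierarchy as named on this slide (UK slice shown for Tier 2).
    tiers = {
        "Tier 0": ["CERN computer centre"],
        "Tier 1": ["RAL (UK)", "France", "Italy", "Germany", "Spain"],  # 11 in all
        "Tier 2 (UK)": {"ScotGrid": ["Glasgow", "Edinburgh", "Durham"],
                        "NorthGrid": [], "SouthGrid": [], "London": []},
    }
    for tier, members in tiers.items():
        print(tier, "->", members)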
25
An Example: ScotGrid, just in time for the LHC. (The machine room is downstairs.)
26
The Grid today: archaeology, astronomy, astrophysics, civil protection, computational chemistry, earth sciences, finance, fusion, geophysics, high energy physics, life sciences, multimedia, material sciences, …
>250 sites, 48 countries, >50,000 CPUs, >20 PetaBytes of storage, >10,000 users, >150 virtual organisations (VOs), >150,000 jobs/day.
27
1. Why? 2. What? 3. How? 4. When? From the particle physics perspective the Grid is:
1. needed to utilise large-scale computing resources efficiently and securely;
2. a) a working system running today on large resources, b) about seamless discovery of computing resources, c) using evolving standards for interoperation, d) the basis for computing in the 21st century;
3. built using middleware;
4. available now, ready for LHC data.
28
Avoiding Gridlock? Avoid computer lock-up by using a Grid. Gridlock itself is avoided provided you have a star network (the basis of the Internet): computing is then almost limitless.
29
Thank YOU