e+e– Colliders. Highest-energy e+e– collider: LEP at CERN, 1989-2000, 200 GeV, ~4 km radius. First e+e– collider: ADA in Frascati, 1961, 0.2 GeV, ~1 m radius.

Presentation transcript:

e+e– Colliders
Highest-energy e+e– collider: LEP at CERN, 1989-2000, 200 GeV, ~4 km radius.
First e+e– collider: ADA in Frascati, 1961, 0.2 GeV, ~1 m radius.

CMS at LHC: 2007 Start
LHC schedule reconfirmed at CERN Council, June 2003. First beams: April 2007; physics runs: from Summer 2007.
pp collisions at √s = 14 TeV, L = 10^34 cm^-2 s^-1; also heavy ions (HI).
Experiments: ATLAS and CMS (pp, general purpose; HI), LHCb (B-physics), ALICE (HI), TOTEM.

HCAL barrels done; installing HCAL endcap and muon CSCs in SX5. 36 muon CSCs successfully installed on YE-2,3 at an average rate of 6/day (planned: 4/day); cabling and commissioning under way. HE-1 complete; HE+ will be mounted in Q4 2003.

Large Hadron Collider (LHC)
- 14 TeV proton-proton collisions; parton collisions; Higgs production ~10000 per day
- Bunch crossing rate: 4x10^7 Hz; bunch spacing 7.5 m (25 ns)
- Luminosity: 10^34 cm^-2 s^-1; 2835 bunches/beam; ~10^11 protons/bunch
- Proton-proton collision rate: 10^9 Hz
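
As a sanity check on these numbers, the short Python sketch below multiplies the design luminosity by an assumed inelastic cross-section of ~100 mb (a standard ballpark value, not a number from the slide) to recover the ~10^9 Hz collision rate, and derives the bunch-crossing rate from the 25 ns spacing.

# Minimal sketch of the LHC rate arithmetic above (Python).
# Assumption: sigma_inelastic ~ 100 mb; a typical ballpark,
# not a value quoted on the slide.

LUMINOSITY = 1e34       # cm^-2 s^-1, design luminosity
SIGMA_INEL = 100e-27    # cm^2 (100 mb; 1 mb = 1e-27 cm^2)
BUNCH_SPACING = 25e-9   # s, 25 ns between bunch crossings

pp_rate = LUMINOSITY * SIGMA_INEL    # rate = L x sigma ~ 1e9 collisions/s
crossing_rate = 1.0 / BUNCH_SPACING  # 4e7 crossings/s
pileup = pp_rate / crossing_rate     # ~25 pp collisions per crossing

print(f"pp collision rate:   {pp_rate:.1e} Hz")
print(f"bunch crossing rate: {crossing_rate:.1e} Hz")
print(f"collisions/crossing: {pileup:.0f}")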

LHC Magnets: 9 Tesla field; dipoles separated by 20 cm; cooled to superfluid liquid helium temperatures; 20 km of magnets.

LHC Magnets

LHC Detectors: ATLAS and CMS (general purpose), LHCb (B-physics, CP violation), ALICE (heavy ions, quark-gluon plasma).

LHC: CERN Laboratory in Geneva, Switzerland

LHC: CMS Detector

LHC: 300-foot shaft

LHC: CMS cavern (300 feet underground)

LHC

LHC Computing: Different from Previous Experiment Generations
One of the four LHC detectors (CMS). The online system is a multi-level trigger that filters out background and reduces the data volume:
- Level 1 (special hardware): 40 MHz (80 TB/sec) in, 75 kHz (75 GB/sec) out
- Level 2 (embedded processors): 5 kHz (5 GB/sec) out
- Level 3 (PCs): 100 Hz (100 MB/sec) out, to data processing and offline analysis/selection
Raw recording rate: 0.1-1 GB/sec; PetaBytes per year.
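
A quick way to check these figures is to walk the accepted rate through each level and multiply by an event size. The sketch below assumes ~1 MB per event and ~10^7 seconds of running per year; both are illustrative assumptions, though the ~1 MByte event size matches the figure quoted on the grid-hierarchy slide below.

# Hedged sketch (Python): data rates through the multi-level trigger.
# Assumptions: ~1 MB/event, ~1e7 s of beam time per year (illustrative).

MB = 1e6  # bytes

levels = [
    ("Level 1 (hardware)", 75e3),  # Hz accepted by each level
    ("Level 2 (embedded)",  5e3),
    ("Level 3 (PC farm)",   1e2),
]

rate_in = 40e6  # Hz, bunch-crossing rate into Level 1
for name, rate_out in levels:
    rejection = rate_in / rate_out
    gb_per_s = rate_out * MB / 1e9
    print(f"{name}: {rate_out:,.0f} Hz out, "
          f"{gb_per_s:.1f} GB/s, rejection x{rejection:.0f}")
    rate_in = rate_out

# Annual volume at the final 100 Hz: the slide's "PetaBytes per year"
volume_pb = 1e2 * MB * 1e7 / 1e15  # 100 Hz x 1 MB x 1e7 s = 1 PB
print(f"recorded per year: ~{volume_pb:.0f} PB")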

Regional Center Hierarchy (Worldwide Data Grid)
- Experiment / Online System: one bunch crossing per 25 nsecs; each event is ~1 MByte in size; raw data flows into the physics data cache at ~PBytes/sec
- Tier 0+1 (Offline Farm + CERN Computer Center): fed at 100-1000 MBytes/sec
- Tier 1 regional centers (France, FNAL, Italy, UK): ~2.4 Gbits/sec links
- Tier 2 centers: ~622 Mbits/sec links
- Tier 3 (institutes, ~0.25 TIPS) and Tier 4 (workstations): links of Mbits/sec to ~Gbits/sec
Physicists work on analysis "channels". Total processing power: ~200,000 of today's fastest PCs.
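
For a feel of what these link speeds mean in practice, the sketch below estimates replication times across the hierarchy, assuming a hypothetical 100 TB dataset and that the full quoted bandwidth is available (both assumptions for illustration only).

# Hedged sketch (Python): dataset replication time across tier links.
# Assumptions: 100 TB sample, full link capacity usable (illustrative).

def transfer_days(size_tb: float, link_gbps: float) -> float:
    """Days to move size_tb terabytes over a link_gbps Gbit/s link."""
    bits = size_tb * 8e12                  # decimal TB -> bits
    return bits / (link_gbps * 1e9) / 86400

print(f"CERN -> Tier 1 at 2.4 Gbps:   {transfer_days(100, 2.4):.1f} days")
print(f"Tier 1 -> Tier 2 at 622 Mbps: {transfer_days(100, 0.622):.1f} days")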

Production BW Growth of Int'l HENP Network Links (US-CERN Example)
- Rate of progress >> Moore's Law:
  9.6 kbps analog (1985)
  64-256 kbps digital (1989-1994) [X 7-27]
  1.5 Mbps shared (1990-3; IBM) [X 160]
  2-4 Mbps [X 200-400]
  12-20 Mbps [X 1.2k-2k]
  155-310 Mbps (2001-2) [X 16k-32k]
  622 Mbps (2002-3) [X 65k]
  2.5 Gbps (2003-4) [X 250k]
  10 Gbps (2005) [X 1M]
- A factor of ~1M over the period 1985-2005 (a factor of ~5k during 1995-2005)
- HENP has become a leading applications driver, and also a co-developer of global networks
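
The bracketed multipliers are each link speed divided by the 9.6 kbps baseline from 1985; the sketch below recomputes them (the slide rounds its values) along with the implied per-decade growth that the roadmap on the next slide extrapolates.

# Hedged sketch (Python): growth factors vs. the 9.6 kbps 1985 baseline.

BASELINE_BPS = 9.6e3  # 1985 analog US-CERN link

milestones = [  # upper ends of the bandwidth ranges listed above
    ("256 kbps", 256e3), ("4 Mbps", 4e6), ("20 Mbps", 20e6),
    ("310 Mbps", 310e6), ("622 Mbps", 622e6),
    ("2.5 Gbps", 2.5e9), ("10 Gbps", 10e9),
]

for label, bps in milestones:
    print(f"{label}: x{bps / BASELINE_BPS:,.0f}")

total = 10e9 / BASELINE_BPS  # ~1e6 over the 20 years 1985-2005
per_decade = total ** 0.5    # two decades -> ~1000x per decade
print(f"total: x{total:,.0f}; per decade: ~x{per_decade:,.0f}")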

HENP Major Links: Bandwidth Roadmap (Scenario) in Gbps. Continuing the trend of ~1000 times bandwidth growth per decade; we are rapidly learning to use multi-Gbps networks dynamically.

History: One Large Research Site (SLAC). Current traffic ~400 Mbps; projections: 0.5 to 24 Tbps by ~2012. Much of the traffic: SLAC to IN2P3/RAL/INFN, via ESnet + France and Abilene + CERN.

Digital Divide Illustrated by Network Infrastructures: TERENA NREN Core Capacity. Core capacity goes up in large steps (current, and in two years): 10 to 20 Gbps; 2.5 to 10 Gbps; up to 2.5 Gbps. SE Europe, Mediterranean, FSU, Middle East: less progress, based on older technologies (below 0.15 and 1.0 Gbps): the digital divide will not be closed. Source: TERENA.

The Global Lambda Integrated Facility for Research and Education (GLIF)
- A virtual organization that supports persistent data-intensive scientific research and middleware development on "LambdaGrids"
- Grid applications "ride" on dynamically configured networks based on optical wavelengths
- Architecting an international LambdaGrid infrastructure