… where the Web was born
CERN openlab Workshop on TCO: Introduction
Wolfgang von Rüden, IT Division Leader
11 November 2003


What is LHC? LHC will collide beams of protons at an energy of 14 TeV. Using the latest superconducting technologies, it will operate at about –271 °C, just above the absolute zero of temperature. With its 27 km circumference, the accelerator will be the largest superconducting installation in the world. LHC is due to switch on in 2007, with four experiments whose detectors are as ‘big as cathedrals’: ALICE, ATLAS, CMS and LHCb.

The LHC Data Challenge. A particle collision = an event. Events are independent of one another, which provides trivial parallelism and hence allows the use of simple PC farms. The physicist's goal is to count, trace and characterize all the particles produced and fully reconstruct the process. Among all tracks, the presence of “special shapes” is the sign of interesting interactions.

The LHC Data Challenge. Starting from this event… you are looking for this “signature”. Selectivity: 1 in 10^13 – like looking for 1 person in a thousand world populations, or for a needle in 20 million haystacks!

LHC data (simplified): 40 million collisions per second. After filtering, 100 collisions of interest per second. A Megabyte of digitised information for each collision = a recording rate of 0.1 Gigabytes/sec. About 10^10 collisions recorded each year = 10 Petabytes/year of data, shared among the four experiments ALICE, ATLAS, CMS and LHCb.
Orders of magnitude:
1 Megabyte (1 MB) = a digital photo
1 Gigabyte (1 GB) = 1000 MB = a DVD movie
1 Terabyte (1 TB) = 1000 GB = world annual book production
1 Petabyte (1 PB) = 1000 TB = 10% of the annual production by the LHC experiments
1 Exabyte (1 EB) = 1000 PB = world annual information production
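As a cross-check of these figures, here is a minimal back-of-the-envelope sketch in Python (not part of the original slides); the 10^10 collisions per year is the value assumed above:

```python
# Back-of-the-envelope check of the LHC data-rate figures quoted above.
# Assumptions (taken from the slide): 100 recorded collisions per second,
# 1 Megabyte per collision, ~1e10 collisions recorded per year.

MB = 1e6   # bytes
GB = 1e9
PB = 1e15

collisions_per_second = 100
event_size_bytes = 1 * MB
collisions_per_year = 1e10

recording_rate = collisions_per_second * event_size_bytes   # bytes/sec
yearly_volume = collisions_per_year * event_size_bytes      # bytes/year

print(f"Recording rate: {recording_rate / GB:.1f} GB/s")     # ~0.1 GB/s
print(f"Yearly volume:  {yearly_volume / PB:.0f} PB/year")   # ~10 PB/year
```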

LHC data: LHC data correspond to about 20 million CDs each year. [Figure: a CD stack holding 1 year of LHC data would be about 20 km high, compared with Mont Blanc (4.8 km), Concorde's cruising altitude (15 km) and a high-altitude balloon (30 km).] Where will the experiments store all of these data?
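A rough sanity check of the CD-stack comparison, assuming about 700 MB per CD and 1.2 mm per disc (both assumed values, not from the slide); it lands in the same range as the slide's rounded ~20 million CDs and ~20 km:

```python
# Rough sanity check of the "20 million CDs, ~20 km stack" comparison.
# Assumed values (not from the slide): 700 MB per CD, 1.2 mm per disc.

PB = 1e15
MB = 1e6

yearly_data_bytes = 10 * PB
cd_capacity_bytes = 700 * MB
cd_thickness_m = 1.2e-3

num_cds = yearly_data_bytes / cd_capacity_bytes
stack_height_km = num_cds * cd_thickness_m / 1000

print(f"CDs per year: {num_cds / 1e6:.0f} million")   # ~14 million
print(f"Stack height: {stack_height_km:.0f} km")      # ~17 km
```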

LHC data processing: LHC data analysis requires computing power equivalent to ~70,000 of today's fastest PC processors. Where will the experiments find such computing power?

Expected LHC computing needs. [Chart: projected LHC computing requirements compared with Moore's law growth, based on 2000 data.] Networking: 10 – 40 Gb/s to all big centres today.
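For illustration only, a small sketch of how the Moore's-law reference curve on such a chart can be generated, assuming capacity per unit cost doubles roughly every 18 months from a year-2000 baseline (the doubling time is an assumption, not a figure from the slide):

```python
# Illustrative Moore's-law reference curve: capacity per unit cost doubling
# roughly every 18 months, normalised to a year-2000 baseline.
# Numbers are illustrative only, not taken from the slide.

DOUBLING_TIME_YEARS = 1.5  # assumed doubling time

def moores_law_factor(year, baseline_year=2000):
    """Relative capacity per unit cost compared with the baseline year."""
    return 2 ** ((year - baseline_year) / DOUBLING_TIME_YEARS)

for year in range(2000, 2008):
    print(f"{year}: x{moores_law_factor(year):.1f} of year-2000 capacity")
# By the planned 2007 start-up this gives roughly a factor of 25 over the
# year-2000 baseline; the slide's chart compares such a curve against the
# expected LHC computing needs.
```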

Computing at CERN today: high-throughput computing based on reliable “commodity” technology. More than 1,500 dual-processor PCs. More than 3 Petabytes of data on disk (10%) and tape (90%). Nowhere near enough!
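Putting the two figures quoted on these slides side by side (and ignoring differences in processor speed between the installed PCs and "today's fastest" processors), a tiny illustrative comparison:

```python
# Illustrative comparison using only figures quoted on these slides:
# ~70,000 of today's fastest PC processors needed for LHC analysis,
# versus more than 1,500 dual-processor PCs installed at CERN today.

processors_needed = 70_000
installed_pcs = 1_500
processors_per_pc = 2

installed_processors = installed_pcs * processors_per_pc
shortfall_factor = processors_needed / installed_processors

print(f"Installed at CERN: ~{installed_processors} processors")
print(f"Needed for LHC:    ~{processors_needed} processors")
print(f"Shortfall:         ~{shortfall_factor:.0f}x -> 'nowhere near enough'")
```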

Computing at CERN today: the new computer room is being populated with CPU servers, disk servers, and tape silos and servers…

Computing at CERN today: …while the existing computer centre (CPU servers, disk servers, tape silos and servers) is being cleared for renovation, and an upgrade of the power supply from 0.5 MW to 2.5 MW is underway.

What will happen next? New CERN management takes over in January with reduced top-level management, i.e. more responsibilities move to the Departments (replacing Divisions). Only 3 people above the departments (CEO, CFO, CSO). The new IT Department will also include Administrative Computing (AS Division) and some computing services now in ETT. The EGEE project will start in April 2004 with substantial funding from the European Union. The IT Department will have over 400 members (including about 100 non-staff).

What is new? Planning is now based on P+M (Personnel + Materials), i.e. the cost of services will include personnel and overhead. The personnel plan will be based on budget rather than head count, which allows for re-profiling of staff skills. Outsourcing will continue, but if justified by a business case, insourcing is possible. TCO considerations are becoming a real option, but our purchasing rules don't make life easy: if “quality” is to be taken into account, tender documents need to contain objectively measurable criteria, i.e. the bottom line is a number. This will require Finance Committee approval.

CERN's IT strategy so far: use commodity equipment wherever possible (compute servers, disk servers, tape servers) and buy at the “sweet spot”. All based on RH Linux (for how long?). The “big stuff” left is the tape robots.
Other non-commodity equipment:
– Machines running the AFS and Database services
– Systems for administrative computing
– Solaris-based development cluster as secondary platform
Equipment needed by experiments is in addition, but not under IT's responsibility.

Questions to our partners: We would like answers to the following questions:
– Are there any cost-effective alternatives?
– Can you (industry) provide convincing arguments that “paying more is cheaper”?
– Are there examples we can look at?
– Does CERN have the right skill levels, or do we have too many highly skilled and expensive people?
– What is the added value of your proposition?
Is physics computing the best target, or should we rather look at technical and administrative computing (<50% of the new department is for physics)? Could you consider offering solutions which deviate from your standard products, possibly with the help of 3rd parties?

Summary: We take the TCO approach seriously. New possibilities exist with P+M. We need measurable criteria to deviate from our “lowest cost” purchasing principle. Thank you for your interest in the topic; we are looking forward to your proposal and advice.