CERN TERENA Lisbon The Grid Project Fabrizio Gagliardi CERN Information Technology Division May, 2000

Summary
- High Energy Physics and the CERN computing problem
- An excellent computing model: the GRID
- The Data Grid Initiative

CERN organization
- Largest particle physics lab in the world
- European International Center for Particle Physics Research
- Budget: 1020 M CHF
- 2700 staff
- 7000 physicist users

The LHC Detectors
- CMS, ATLAS, LHCb
- 3.5 PetaBytes / year
- ~10^8 events / year

The HEP Problem - Part I
The scale...

[Chart: computing capacity (~10K SI processors) that can be purchased for the value of the equipment present in 2000, following the non-LHC technology-price curve (40% annual price improvement), compared with the LHC requirement]

Long Term Tape Storage Estimates
[Chart: tape storage in TeraBytes per year for current experiments, COMPASS and the LHC, on a scale from 0 to ~14,000 TeraBytes]

HPC or HTC?
High Throughput Computing:
- a mass of modest, independent problems
- computing in parallel - not parallel computing
- throughput rather than single-program performance
- resilience rather than total system reliability
We have learned to exploit inexpensive mass-market components, but we need to marry these with inexpensive, highly scalable management tools.
Much in common with other sciences (see the EU-US Annapolis Workshop): Astronomy, Earth Observation, Bioinformatics, and commercial/industrial applications: data mining, Internet computing, e-commerce facilities, ...
Contrast with supercomputing.
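To make the high-throughput idea concrete, here is a minimal Python sketch (not part of the original slides): many modest, independent jobs run concurrently, and only the aggregate throughput matters. The job function simulate_event and the event counts are placeholders.

from concurrent.futures import ProcessPoolExecutor
import random

def simulate_event(event_id: int) -> float:
    # Stand-in for one independent physics job (simulation or reconstruction).
    # Jobs do not communicate, so a failure costs one result, not the whole run.
    return sum(random.random() for _ in range(100_000))

if __name__ == "__main__":
    event_ids = range(1_000)                 # a mass of modest, independent problems
    with ProcessPoolExecutor() as pool:      # computing in parallel, not parallel computing
        results = list(pool.map(simulate_event, event_ids))
    print(f"processed {len(results)} independent events")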

[Diagram: generic component model of a computing farm - application servers, disk servers, tape servers and network servers]

The HEP Problem - Part II
Geography, Sociology, Funding and Politics...

World Wide Collaboration → distributed computing & storage capacity
CMS: 1800 physicists, 150 institutes, 32 countries

Regional Centres - a Multi-Tier Model
[Diagram, from the MONARC report: CERN as Tier 0; Tier 1 regional centres (FNAL, RAL, IN2P3); Tier 2 centres (Lab a, Uni b, Lab c, ... Uni n); departments and desktops below them; tier-to-tier links quoted at 2.5 Gbps, 622 Mbps and 155 Mbps]
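As an illustration only (the tier names follow the slide; the exact link-speed assignments and the helper code are assumptions), the multi-tier model can be written down as a small data structure:

from dataclasses import dataclass, field

@dataclass
class Centre:
    name: str
    tier: int
    uplink_mbps: float                       # bandwidth towards the next tier up
    children: list = field(default_factory=list)

cern = Centre("CERN", tier=0, uplink_mbps=0)
for t1_name in ("FNAL", "RAL", "IN2P3"):
    tier1 = Centre(t1_name, tier=1, uplink_mbps=2500)        # assumed Tier 0 - Tier 1 link
    tier1.children.append(Centre("Lab/Uni", tier=2, uplink_mbps=622))
    cern.children.append(tier1)

def total_sites(centre: Centre) -> int:
    # Count the centre itself plus everything hanging below it.
    return 1 + sum(total_sites(child) for child in centre.children)

print(total_sites(cern))   # 7 centres in this toy hierarchy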

Are Grids a solution?
- Change of orientation of US meta-computing activity: from inter-connected super-computers towards a more general concept of a computational Grid ("The Grid" - Ian Foster, Carl Kesselman)
- Has initiated a flurry of activity in HEP:
  - US Particle Physics Data Grid (PPDG)
  - GriPhyN - data grid proposal submitted to NSF
  - Grid technology evaluation project in INFN
  - UK proposal for funding for a prototype grid
  - NASA Information Power Grid

The Grid
"Dependable, consistent, pervasive access to [high-end] resources"
- Dependable: provides performance and functionality guarantees
- Consistent: uniform interfaces to a wide variety of resources
- Pervasive: ability to "plug in" from anywhere

R&D required
Local fabric:
- Management of giant computing fabrics: auto-installation, configuration management, resilience, self-healing
- Mass storage management: multi-PetaByte data storage, "real-time" data recording requirement, active tape layer; 1,000s of users
Wide area - building on an existing framework and research network (e.g. Globus, Geant):
- Workload management: no central status, local access policies
- Data management: caching, replication, synchronisation, object database model
- Application monitoring
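A toy sketch of the data-management piece (caching and replication): this is not DataGrid middleware, just an illustration of a replica catalogue that maps a logical file name to physical copies and picks the "cheapest" one. All file names, sites and costs below are invented.

# Logical file name -> list of (site, physical path) replicas.
replica_catalogue = {
    "lfn:/lhc/run0001/events.root": [
        ("cern.ch",   "/castor/run0001/events.root"),
        ("ral.ac.uk", "/store/run0001/events.root"),
    ],
}

# Invented per-site "distance" from the requesting client (e.g. measured latency).
site_cost = {"cern.ch": 5, "ral.ac.uk": 1}

def best_replica(lfn: str) -> str:
    # Choose the replica at the site with the lowest cost for this client.
    site, path = min(replica_catalogue[lfn], key=lambda replica: site_cost[replica[0]])
    return f"{site}:{path}"

print(best_replica("lfn:/lhc/run0001/events.root"))   # ral.ac.uk:/store/run0001/events.root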

HEP Data Grid Initiative
- European-level coordination of national initiatives & projects
- Principal goals:
  - Middleware for fabric & Grid management
  - Large-scale testbed - a major fraction of one LHC experiment
  - Production-quality HEP demonstrations: "mock data", simulation analysis, current experiments
  - Other science demonstrations
- Three-year phased developments & demos
- Complementary to other GRID projects:
  - EuroGrid: uniform access to parallel supercomputing resources
  - Synergy to be developed (GRID Forum, Industry and Research Forum)

Participants
- Main partners: CERN, INFN (I), CNRS (F), PPARC (UK), NIKHEF (NL), ESA-Earth Observation
- Other sciences: KNMI (NL), biology, medicine
- Industrial participation: CS SI (F), DataMat (I), IBM (UK)
- Associated partners: Czech Republic, Finland, Germany, Hungary, Spain, Sweden (mostly computer scientists)
- Formal collaboration with the USA
- Industry and Research Project Forum with representatives from: Denmark, Greece, Israel, Japan, Norway, Poland, Portugal, Russia

Status
- Prototype work already started at CERN and in most of the collaborating institutes
- Proposal to RN2 submitted
- Network requirements discussed with Dante/Geant

WAN Requirements
- High bandwidth from CERN to the Tier 1 centres (5-6)
- VPN, Quality of Service
- Guaranteed performance during limited test periods and, at the end of the project, for production-quality services
- Target requirements (2003): 2.5 Gb/s / ... Mb/s / ... Mb/s
- Could saturate 2.5 Gb/s for a limited amount of test time (100 MB/s out of a 100-PC farm; we plan for farms of 1000s of PCs)
- Reliability is an important factor: moving from the WEB client-server model to the GRID peer distributed computing model
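The saturation remark can be checked with a small worked example (the per-PC rate of ~1 MB/s is an assumption inferred from "100 MB/s out of a 100-PC farm"):

def farm_output_gbps(n_pcs: int, mb_per_sec_per_pc: float = 1.0) -> float:
    # Convert aggregate farm output from MB/s to Gb/s (1 MB/s = 8 Mb/s).
    return n_pcs * mb_per_sec_per_pc * 8 / 1000

for n in (100, 1000):
    print(f"{n} PCs -> {farm_output_gbps(n):.1f} Gb/s vs a 2.5 Gb/s link")
# 100 PCs give ~0.8 Gb/s; a 1000-PC farm gives ~8 Gb/s and would saturate the link.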

Conclusions
- This project, motivated by HEP and other sciences with high data and computing demands, will contribute to developing and implementing a new world-wide distributed computing model: the GRID
- An ideal computing model for the next-generation Internet
- An excellent test case for the next generation of high-performance research networks