Tour of CERN Computer Center and the Grid at CERN
CERN Information Technologies (IT) Department

Presentation transcript:

Slide 1/17: Tour of CERN Computer Center and the Grid at CERN
Information Technologies Department
Welcome!

Slide 2/17: Computing at CERN (IT Department)
1. General purpose computing environment
2. Administrative computing services
3. Physics and engineering computing
4. Consolidation, coordination and standardization of computing activities
5. Physics applications (e.g. for data acquisition and offline analysis)
6. Accelerator design and operations

Slide 3/17: LHC data every year
- 40 million collisions per second
- After filtering, about 100 collisions of interest per second
- More than 1 Megabyte of data digitised per collision, giving a recording rate above 1 Gigabyte/sec
- About 10^10 collisions recorded each year, giving more than 15 Petabytes of stored data per year
- The four LHC experiments: ALICE, ATLAS, CMS and LHCb

For scale:
- 1 Megabyte (1 MB): a digital photo
- 1 Gigabyte (1 GB) = 1000 MB: 5 GB holds a DVD movie
- 1 Terabyte (1 TB) = 1000 GB: the world's annual book production
- 1 Petabyte (1 PB) = 1000 TB: the annual data production of one LHC experiment
- 1 Exabyte (1 EB) = 1000 PB: 3 EB is the world's annual information production
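
As a sanity check on these round numbers, here is a small back-of-the-envelope calculation in Python (a sketch added for this write-up, not part of the original slides; the 1.5 MB per collision is an assumed average chosen to make the quoted figures consistent):

    # Rough check of the LHC data-rate figures quoted above.
    MB, GB, PB = 1e6, 1e9, 1e15            # decimal units, as on the slide

    collisions_per_year = 1e10             # ~10^10 collisions recorded per year
    bytes_per_collision = 1.5 * MB         # assumption: "more than 1 MB" per collision

    stored_per_year = collisions_per_year * bytes_per_collision
    print(f"Stored data per year: {stored_per_year / PB:.0f} PB")           # ~15 PB

    recording_rate = 1 * GB                # > 1 GB/s summed over the experiments
    days_of_data_taking = stored_per_year / recording_rate / 86400
    print(f"Implied data-taking time: {days_of_data_taking:.0f} days/year") # ~170 days

At roughly 1.5 MB per recorded collision, the quoted 15 PB/year corresponds to about half a year of continuous data taking at 1 GB/s, which matches a typical LHC operating year.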

Slide 4/17: LHC data every year
LHC data correspond to about 20 million CDs each year. A stack of CDs holding one year of LHC data would be roughly 20 km high: far taller than Mont Blanc (4.8 km), above Concorde's cruising altitude (15 km), and approaching the altitude of a stratospheric balloon (30 km).
Where will the experiments store all of these data?
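
The CD comparison can be reproduced the same way (an illustrative sketch; the CD capacity and thickness are assumptions, not figures from the slide):

    # Rough check of the CD-stack comparison.
    PB, MB = 1e15, 1e6

    annual_data = 15 * PB            # from the previous slide
    cd_capacity = 700 * MB           # assumed capacity of a standard CD-ROM
    cd_thickness_mm = 1.2            # assumed thickness of one disc

    n_cds = annual_data / cd_capacity
    stack_km = n_cds * cd_thickness_mm / 1e6
    print(f"{n_cds / 1e6:.0f} million CDs, a stack about {stack_km:.0f} km high")
    # ~21 million CDs and a ~26 km stack: the same order of magnitude as the
    # ~20 million CDs and ~20 km quoted on the slide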

Slide 5/17: LHC data processing
LHC data analysis requires computing power equivalent to about 100,000 of today's fastest PC processors.
Where will the experiments find such computing power?

Slide 6/17: Computing power available at CERN
High-throughput computing based on reliable "commodity" technology:
- More than 8,500 CPUs in about 3,500 boxes (Linux)
- 4 Petabytes on 14,000 drives (NAS disk storage)
- 10 Petabytes on 45,000 tape slots with 170 high-speed drives
Nowhere near enough!
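
For a sense of the hardware behind these totals, the average capacity per device can be derived directly from the numbers above (a derived sketch; these are averages implied by the totals, not hardware specifications):

    # Implied average capacity per disk drive and per tape cartridge.
    PB, GB = 1e15, 1e9

    disk_total, disk_drives = 4 * PB, 14_000
    tape_total, tape_slots = 10 * PB, 45_000

    print(f"Average disk drive:     {disk_total / disk_drives / GB:.0f} GB")  # ~286 GB
    print(f"Average tape cartridge: {tape_total / tape_slots / GB:.0f} GB")   # ~222 GB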

Slide 7/17: Computing for LHC
Problem: even with the Computer Centre upgrade, CERN can provide only a fraction of the necessary resources.
The user community is worldwide: 267 institutes with 4,603 users in Europe, and 208 institutes with 1,632 users elsewhere.
Solution: computing centres, which were isolated in the past, will be connected, uniting the computing resources of particle physicists worldwide.

Slide 8/17: What is the Grid?
The World Wide Web provides seamless access to information stored in many millions of different geographical locations.
In contrast, the Grid is an emerging infrastructure that provides seamless access to computing power and data storage capacity distributed over the globe.

Slide 9/17: One Web but many Grids
- US initiatives: NASA Information Power Grid, DOE Science Grid, NSF National Virtual Observatory, NSF GriPhyN, DOE Particle Physics Data Grid, NSF TeraGrid, DOE ASCI Grid, DOE Earth Systems Grid, DARPA CoABS Grid, NEESGrid, DOH BIRN, NSF iVDGL
- European projects: DataGrid (CERN, ...), EuroGrid (Unicore), DataTag (CERN, ...), Astrophysical Virtual Observatory, GRIP (Globus/Unicore), GRIA (industrial applications), GridLab (Cactus Toolkit), CrossGrid (infrastructure components), EGSO (solar physics)
- National programmes: UK e-Science Grid; Netherlands: VLAM, PolderGrid; Germany: UNICORE, Grid proposal; France: Grid funding approved; Italy: INFN Grid; Eire: Grid proposals; Switzerland: network/Grid proposal; Hungary: DemoGrid, Grid proposal; Norway and Sweden: NorduGrid
Grid development has been initiated by the academic, scientific and research community, but industry is also interested.

Slide 10/17: What are the principles behind the Grid? Five big ideas
1. Sharing resources on a global scale: the main issues are trust, different management policies, virtual organisations, and 24-hour access and support.
2. Security: the main issues are well-defined yet flexible rules, authentication, authorisation, compatibility and standards.
3. Balancing the load: this is more than just cycle scavenging; middleware is needed to monitor and broker resources.
4. The death of distance: networks delivered 56 Kb/s ten years ago, today they deliver 155 Mb/s, and 10 Gb/s is anticipated for the LHC (see the calculation below).
5. Open standards: Grid standards are converging, and include Web services, the Globus Toolkit and various protocols.
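
To see why idea 4 matters, here is a small illustrative calculation of how long it would take to move one year of LHC data (about 15 PB) over a single dedicated link at each of the quoted speeds (an added sketch that ignores protocol overhead and parallel transfers):

    # Time to transfer ~15 PB over one link at the network speeds quoted above.
    PB = 1e15
    bits_to_move = 15 * PB * 8                      # one year of LHC data, in bits

    links = [("56 Kb/s (ten years ago)", 56e3),
             ("155 Mb/s (today)",        155e6),
             ("10 Gb/s (LHC target)",    10e9)]

    for label, rate_bps in links:
        years = bits_to_move / rate_bps / (365 * 86400)
        print(f"{label:25s}: {years:12.1f} years")
    # -> roughly 68,000 years, 24.5 years and 0.4 years respectively

Only at the 10 Gb/s scale does moving LHC-sized datasets between sites become practical, which is why wide-area network capacity is one of the five big ideas.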

Slide 11/17: Grid applications for science
- Medical/healthcare: imaging, diagnosis and treatment
- Bioinformatics: study of the human genome and proteome to understand genetic diseases
- Nanotechnology: design of new materials from the molecular scale
- Engineering: design optimization, simulation, failure analysis, and remote instrument access and control
- Natural resources and the environment: weather forecasting, earth observation, modelling and prediction of complex systems

Slide 12/17:
- CERN projects: the LHC Computing Grid (LCG)
- EU-funded projects led by CERN: Enabling Grids for E-sciencE (EGEE), plus others
- Industry-funded projects: CERN openlab for DataGrid applications

Slide 13/17: LCG, the LHC Computing Grid
Timeline:
- 2002: project started
- 2003: service opened (LCG-1 started in September with 12 sites)
- 2004: LCG-2 released, deployment of the LCG environment
- 2006-2008: build and operate the LCG service
As of April 2007, LCG is the biggest Grid project in the world: 177 sites in more than 30 countries, 30,000 processors and 14 million Gigabytes of storage.

Slide 14/17: The EGEE vision
Access to a production-quality Grid will change the way science, and much else, is done in Europe.
- A geneticist at a conference, inspired by a talk she hears, will be able to launch a complex biomolecular simulation from her mobile phone.
- A team of engineering students will be able to run the latest 3D rendering programs from their laptops using the Grid.
- An international network of scientists will be able to model a new flood of the Danube in real time, using meteorological and geological data from several centres across Europe.

Slide 15/17: CERN openlab for DataGrid applications
Objectives:
- Build an ultra-high-performance computer cluster
- Link it to the DataGrid and test its performance
- Evaluate the potential of future commodity technology for LCG

Slide 16/17: Computer Centre tour
Ground floor:
- openlab: equipment of the future
- CIXP: the CERN Internet Exchange Point
Basement:
- Storage silos: more than 10 Petabytes
- PC farm: more than 3,000 PCs lined up
IMPORTANT: for your own safety, please do not touch equipment and cables during the visit.

Slide 17/17: To learn more about the Grid…