Enabling Grids for E-sciencE INFSO-RI-508833 Dr. Rüdiger Berlich, Forschungszentrum Karlsruhe – Introduction to Grid Computing – Christopher Jung, Forschungszentrum Karlsruhe. Slides contributed by Rüdiger Berlich, Dave Berry and Christopher Jung

Enabling Grids for E-sciencE INFSO-RI-508833 Forschungszentrum Karlsruhe Institute for Scientific Computing (IWR): part of the "Helmholtz-Gemeinschaft", one of the largest independent German research institutions, with many research areas ranging from environmental studies and nanotechnology to Grid Computing

Enabling Grids for E-sciencE INFSO-RI-508833 The GridKa Cluster (1)

Enabling Grids for E-sciencE INFSO-RI-508833 The GridKa Cluster (2) Status and future of resources [table: Apr 2004, Oct 2004, Apr 2005 and % of the planned 2008 capacity; % of 2008: Processors 30 %, Compute Power (kSI2k) 12 %, Disc (TB) 18 %, Internet (Gb/s, 10 and 10*2) 50 %, Tape (TB) 12 %; * still being tested]. As of 10/2004: largest Linux cluster in the German science community, largest online storage of a single installation in Germany, fastest Internet connection in Germany, part of a Grid with ca. 80 other European installations; routing (full 10 Gbps): GridKa – DFN (Karlsruhe) – DFN (Frankfurt) – Géant (Frankfurt) – Géant (Paris) – Géant (Geneva) – CERN. What for?

Enabling Grids for E-sciencE INFSO-RI-508833 Usage of GridKa Cluster, January–December [chart: processor usage in hours and number of jobs; LHC 34 %, non-LHC 66 %]

Enabling Grids for E-sciencE INFSO-RI-508833 LHC / CMS At the LHC: expect data rates of 1 Petabyte (??) per experiment per year. But: the analysis is trivial to run in parallel...

Enabling Grids for E-sciencE INFSO-RI-508833 Data acquisition at CMS Data: ~40 MHz (~60 TB/sec); Level 1 Trigger (special hardware): 100 kHz (150 GB/sec); High Level Trigger (PC farm): 150 Hz (225 MB/sec); 1 event: ~1.5 MB; data recording, online system to offline analysis. A multi-level trigger is used to filter out uninteresting events and reduce the data volume.
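A quick cross-check of the trigger numbers above: all three data rates follow from the quoted event size of roughly 1.5 MB per event. The short Python sketch below (rates and event size taken from the slide) simply multiplies the accepted event rate by the event size at each stage:

# Back-of-the-envelope check of the CMS trigger chain quoted above.
EVENT_SIZE_MB = 1.5  # approximate size of one event, as stated on the slide

stages = [
    ("Collisions (detector)",        40e6),   # ~40 MHz
    ("Level 1 Trigger (hardware)",   100e3),  # ~100 kHz
    ("High Level Trigger (PC farm)", 150.0),  # ~150 Hz, written to storage
]

for name, rate_hz in stages:
    mb_per_s = rate_hz * EVENT_SIZE_MB
    print(f"{name:30s} {rate_hz:12,.0f} Hz  ->  {mb_per_s:14,.1f} MB/s")

Running it reproduces the slide's figures: roughly 60 TB/s at the detector, 150 GB/s after the Level 1 trigger, and 225 MB/s written to storage.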

Enabling Grids for E-sciencE INFSO-RI-508833 Distributed Collaborations Europe: 267 institutes, 4603 users; other regions: 208 institutes, 1632 users; over 6000 LHC scientists worldwide. They want transparent and quick access (very rightly so) and are more interested in physics results than in computing revolutions.

Enabling Grids for E-sciencE INFSO-RI-508833 The LHC Computing Grid LCG supports the experiments' computing projects. Phase 1: prepare and deploy the environment for LHC computing. Phase 2: acquire, build and operate the LHC computing service. SC2 (Software & Computing Committee): includes the four experiments and the Tier-1 Regional Centres; identifies common solutions and sets requirements for the project. PEB (Project Execution Board): manages the implementation, organising projects and work packages and coordinating between the Regional Centres.

Enabling Grids for E-sciencE INFSO-RI-508833 The MONARC* study and Tier-1 centres Basic idea: hierarchical distribution of tasks, accepted by the LHC Computing Grid (responsible for planning and management of LHC computing). Tier-0: initial reconstruction and storage of raw events, distribution to Tier-1. Tier-1: data-heavy analysis, reprocessing of data, regional support. Tier-2: managed disk storage, simulation of PP events, computing. * MONARC = Models Of Networked Analysis at Regional Centers. Today: "The Grid"
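The tier roles above can be pictured as a simple fan-out of raw data from Tier-0 to the Tier-1 centres. The Python sketch below is purely illustrative (the centre and file names are made up, apart from GridKa) and only shows the idea of hierarchical distribution, not any real LCG data-placement policy:

# Illustrative only: Tier-0 assigns raw-event files to Tier-1 centres round-robin.
from itertools import cycle

tier1_centres = ["GridKa", "Tier1-A", "Tier1-B", "Tier1-C"]   # placeholder names

def distribute_raw_data(files, tier1s):
    """Assign raw-data files produced at Tier-0 to Tier-1 centres in turn."""
    assignment = {t1: [] for t1 in tier1s}
    for f, t1 in zip(files, cycle(tier1s)):
        assignment[t1].append(f)
    return assignment

raw_files = [f"raw_{i:04d}.dat" for i in range(10)]           # placeholder files
for centre, assigned in distribute_raw_data(raw_files, tier1_centres).items():
    print(centre, assigned)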

Enabling Grids for E-sciencE INFSO-RI-508833 Distributed Computing and PP Distributed computing and particle physics go well together, because: PP analysis is trivial to parallelise (just run each job on a separate machine and collect the results); PP collaborations are distributed by design, as modern experiments usually cannot be financed by a single country anymore; distributed resources already exist and can be used to lower the cost of new experiments; physicists are generally willing to set up a production system for distributed computing, not only interested in the theory of computing; and governments like to spend their money locally (billion-dollar investments...).
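To make "trivial to parallelise" concrete, here is a minimal Python sketch of the pattern: independent jobs whose partial results are merged at the end. The analyse() job is a placeholder working on synthetic data, not real experiment code:

# Minimal "embarrassingly parallel" analysis pattern: independent jobs, merged results.
import random
from concurrent.futures import ProcessPoolExecutor

def analyse(job_id: int, n_events: int = 100_000) -> int:
    """Placeholder job: apply a fake 10% selection to synthetic events."""
    rng = random.Random(job_id)                 # reproducible per job
    return sum(1 for _ in range(n_events) if rng.random() < 0.1)

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        partial_counts = pool.map(analyse, range(8))   # 8 independent jobs in parallel
    print("total selected events:", sum(partial_counts))

On a grid the pattern is the same; the jobs simply run on machines at different sites instead of local processes.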

Enabling Grids for E-sciencE INFSO-RI-508833 Requirements Needed: transparent access to data (replication, virtualisation, global filesystems,...); secure storage, authentication and authorisation (access control as in Unix, a PKI infrastructure with CAs, agreed policies, VOs); accounting (computing costs money; not really solved yet); training and support (GGUS, EGEE work packages); fast networks with low latency and high bandwidth (Géant, DFN,...). In short: (a) a software layer, the "middleware", (b) fast networks, (c) common policies and (d) services.

Enabling Grids for E-sciencE INFSO-RI-508833 The Grid: Definition A Virtual Organisation is people from different institutions working towards a common goal and sharing distributed processing and data resources; not too different from Unix rights management (access control). "Grid computing is coordinated resource sharing and problem solving in dynamic, multi-institutional virtual organizations" (I. Foster). Genealogy: the term "Grid Computing" comes from the analogy to the electrical power grid: "computing power from a plug in the wall".
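Following the Unix-groups comparison on the slide, VO membership can be thought of as a group that sites choose to admit. The sketch below is a toy model only; the VO, user and site names are invented, and real grid authorisation (X.509 certificates, VOMS) works differently in detail:

# Toy model of VO-based access control, in the spirit of Unix group permissions.
vo_members = {
    "cms":    {"alice@uni-a.example", "bob@lab-b.example"},
    "biomed": {"carol@clinic-c.example"},
}

site_accepts = {
    "big-cluster":   {"cms", "biomed"},   # this site serves both VOs
    "small-cluster": {"cms"},             # this one only the cms VO
}

def may_submit(user: str, vo: str, site: str) -> bool:
    """A user may submit to a site if they belong to the VO and the site admits that VO."""
    return user in vo_members.get(vo, set()) and vo in site_accepts.get(site, set())

print(may_submit("alice@uni-a.example", "cms", "small-cluster"))        # True
print(may_submit("carol@clinic-c.example", "biomed", "small-cluster"))  # False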

Enabling Grids for E-sciencE INFSO-RI-508833 Distributed applications – today and tomorrow Existing distributed applications tend to be specialised systems, intended for a single purpose or user group. Grids go further and take into account: different kinds of resources (not always the same hardware, data and applications); different kinds of interactions (user groups or applications want to interact with Grids in different ways); and their dynamic nature (resources and users are added, removed or changed frequently).

Enabling Grids for E-sciencE INFSO-RI-508833 The Grid? There is not only one grid! There are many grid initiatives, reflecting the different requirements of the different sciences (physics, bioinformatics, meteorology, disaster management,...) and of industry. The most important 'grids' in High Energy Physics are LCG, NorduGrid, Grid 3 and SAM; these are incompatible with each other, at least to a high degree.

Enabling Grids for E-sciencE INFSO-RI-508833 Grid Projects Many brilliant people with many brilliant (but incompatible) ideas

Enabling Grids for E-sciencE INFSO-RI-508833 The diverse world of grid computing Just to illustrate the diversity of ideas in grid computing, here is an excerpt of the programme of last year's "International Summer School on Grid Computing": Grids, Middleware and Applications; DAGMan, Condor-G and Stork; Web Services; Community Grids Laboratory; Grid Portals; Application Grids; boat trip to Amalfi (okay, not grid computing); Workflow; OGSA-DAI; Unicore; Commercial Grids; ...

Enabling Grids for E-sciencE INFSO-RI-508833 The world of LCG … and still growing

Enabling Grids for E-sciencE INFSO-RI-508833 Main misunderstandings There are some common misunderstandings about grid computing: computing power and disk/tape space come for free; grid software is easy and fast to install; users will jump at it as soon as they can; "my favourite Linux flavour is XYZ, so it will surely be easy to install the grid software on it"; I can install all basic services on just one machine; the documentation is great; only a small amount of manpower is needed for grid administration.

Enabling Grids for E-sciencE INFSO-RI-508833 "Outlook": Why do grid computing now? Grid computing is quite easy for the user (you will see this later). At the moment the grid is anything but crowded with users, so there is much computing power available (and much more will become available by 2007). Grid developers need feedback to improve the software further. Today's experiments can already profit from grid computing. And who wants to spend time in 2007 learning grid computing when there is interesting physics to be done?

Enabling Grids for E-sciencE INFSO-RI-508833 Questions? Thanks to the German Federal Ministry of Education and Research, BMB+F, as well as Forschungszentrum Karlsruhe / Germany, for their continuous interest and support!