Grid Infrastructure for the ILC
Andreas Gellrich, DESY
European ILC Software and Physics Meeting, Cambridge, UK, 04.04.2006

The Grid Dream
[Diagram: desktop and mobile access connect via the Internet and networks, through the Grid middleware layer, to supercomputers and PC clusters, data storage, sensors and experiments, and visualisation.]

Grid Computing
Grid Computing is about the virtualization of resources
The Grid is there – pioneered by LCG
(Almost) all HEP resources have been moved to the Grid
Grid tools become the de facto standard for computing (local batch systems become globally accessible) and for data access (GridFTP, SRM, catalogue services)
User management is done within Virtual Organizations (VOs)
Certificates for authentication rather than local site accounts
Embedded in the EU project Enabling Grids for E-sciencE (EGEE)
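A rough sketch of what the data-access tools named above look like in practice on an LCG User Interface; the Storage Element host and paths here are hypothetical, and a valid Grid proxy (see the sketch after the next slide) is assumed:

    # raw GridFTP transfer of a local file to a Storage Element
    globus-url-copy file:///tmp/sample.root \
        gsiftp://some-se.example.org/data/ilc/sample.root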

The LHC Computing Model
[Tier diagram: the online system ships data from the detector (~PBytes/sec at the front end, ~100 MBytes/sec to storage) to the CERN computer centre (Tier 0, >20 TIPS, with an offline farm of ~20 TIPS); regional Tier-1 centres (RAL, US, French, Italian) connect at ~Gbits/sec or by air freight; Tier-2 centres of ~1 TIPS each (e.g. ScotGRID++) and institutes of ~0.25 TIPS with workstations (Tier 3) hang below them. 1 TIPS = 25,000 SpecInt95; a PC (1999) = ~15 SpecInt95.]
One bunch crossing every 25 ns; 100 triggers per second; each event is ~1 MByte
Physicists work on analysis "channels"; each institute has ~10 physicists working on one or more channels
Data for these channels should be cached by the institute server (physics data cache)

Building Blocks
A Virtual Organization (VO) is a dynamic collection of individuals, institutions, and resources which is defined by certain sharing rules.
  A VO represents a collaboration
  Users are members of a certain VO (authorization)
  Users authenticate themselves with a certificate (authentication)
  Certificates are issued by a national Certification Authority (CA)
Grid Infrastructure
  Core services (mandatory per VO): VO membership services, Grid information services, resource brokers
  Resources (brought in by partners): the Grid sites
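As a concrete illustration of the authentication/authorization chain just listed, a minimal sketch of the standard commands on an LCG-2 User Interface; the VO name 'ilc' is taken from the later slides, all commands are stock Globus/VOMS clients:

    # the user certificate, issued by a national CA, lives in $HOME/.globus/
    ls $HOME/.globus/              # usercert.pem, userkey.pem
    # create a short-lived proxy for authentication ...
    grid-proxy-init
    # ... or a VOMS proxy that additionally carries the VO membership
    # ('ilc') that the sites use for authorization
    voms-proxy-init --voms ilc
    voms-proxy-info --all          # inspect proxy lifetime and VO attributes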

DESY …
Grid activities were initially driven by the demand for resources for MC production of the HERA experiments H1 and ZEUS
The International Lattice Data Grid (ILDG) and the International Linear Collider community (ILC) joined the Grid activities
DESY participates in EGEE (since 2004) and D-GRID (since 2005)
DESY sets up LCG Tier-2 centres for ATLAS (in federation w/ U Freiburg) and CMS (in federation w/ RWTH Aachen)
DESY participated in the Service Challenge SC3 and prepares for SC4 and the Pre-Production Service (PPS) for LHC
DESY operates a complete Grid infrastructure incl. all services

… DESY …
Quattor (SL 3.05 for all nodes; complete installation for WNs)
LCG-2_6_0 / LCG-2_7_0, Yaim (for all service nodes)
Central VO services (unique per VO):
  VO (LDAP)                      [grid-vo.desy.de]
  VOMS                           [grid-voms.desy.de]
  Catalogue services: RLS / LFC  [grid-cat2.desy.de]
Distributed VO services (mandatory per VO):
  Resource Broker (RB)           [grid-rb.desy.de]
  Information Index (BDII)       [grid-bdii.desy.de]
  Proxy (PXY)                    [grid-pxy.desy.de]
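For illustration, a short sketch of how a user could query these services from a User Interface; the commands are standard LCG-2 client tools, the BDII host is the one quoted above, and 'ilc' serves as the example VO:

    # Computing and Storage Elements advertised for the 'ilc' VO
    lcg-infosites --vo ilc ce
    lcg-infosites --vo ilc se
    # or query the BDII directly via LDAP (standard BDII port 2170)
    ldapsearch -x -H ldap://grid-bdii.desy.de:2170 -b o=grid \
        '(objectClass=GlueCE)' GlueCEUniqueID GlueCEInfoTotalCPUs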

… DESY
Site resources:
  GIIS: DESY-HH                   [grid-giis.desy.de]
  CE:  83 * XEON / GHz            [grid-ce0.desy.de]
  CE: 166 * Opteron / 2.4 GHz     [t2-ce0.desy.de]
  CE:  34 * XEON / 1 GHz (ZEUS)   [zeus-ce.desy.de]
  SE: dCache-based w/ access to DESY data space of 0.5 TB
Grid (Tier-2) resource planning:
  2005:  200 kSpecINT2k    30 TB
  Now:   350 kSpecINT2k    75 TB
  2006:  800 kSpecINT2k   200 TB
  2007: 1400 kSpecINT2k   600 TB
  2008: 1600 kSpecINT2k   600 TB
  2009: 1800 kSpecINT2k   800 TB

VOs at DESY
VOs hosted at DESY:
  Global: 'hone', 'ilc', 'zeus' (registration via the LCG registrar system)
  Regional: 'calice', 'dcms', 'ildg'
  Local: 'baikal', 'desy', 'herab', 'hermes', 'icecube'
VOs supported at DESY (Tier-2):
  Global: 'atlas', 'cms', 'dteam'
  Regional: 'dech'
H1 Experiment at HERA ('hone'):
  desy, uni-dortmund, cscs, gridpp, bham, ucl, lancs, ox, marseille, cyf-kr, saske
ILC Community ('ilc', 'calice')
ZEUS Experiment at HERA ('zeus'):
  desy, uni-dortmund, gridpp, bham, ucl, lancs, ox, marseille, cyf-kr, saske, infn, utoronto, uam, scotgrid, weizmann, scai, bris, tau, ed

Grid
The VOs 'ilc' and 'calice' are hosted at DESY w/ all core services
Registration to 'ilc' is managed by LCG; 'ilc' has become a so-called global VO in EGEE
'ilc' is currently supported by ~10 UKI sites, LAL, Freiburg, and DESY (27 CEs, 3500 CPUs, 42 TB, 6 RBs)
Data have been moved to SLAC with Grid tools
'calice' is supported by DESY and Imperial College (IC)
The CALICE testbeam data were moved between DESY and IC using Grid tools (GridFTP, SRM, LFC)
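A hedged sketch of such a catalogue-based transfer with the standard LCG data-management clients; the SE host, LFN, and file names are made up for illustration, while the catalogue host is the one listed on the DESY services slide:

    # point the clients at the VO's LFC catalogue
    export LFC_HOST=grid-cat2.desy.de
    # copy a local testbeam file to a Storage Element (SRM/GridFTP underneath)
    # and register it in the catalogue under a logical file name (LFN)
    lcg-cr --vo calice -d srm.desy.de \
        -l lfn:/grid/calice/testbeam/run00123.slcio \
        file:/data/run00123.slcio
    # fetch it again at the destination site via its LFN
    lcg-cp --vo calice lfn:/grid/calice/testbeam/run00123.slcio \
        file:/tmp/run00123.slcio
    # browse the catalogue
    lfc-ls -l /grid/calice/testbeam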

GOC Grid Monitoring

EGEE Accounting
[Accounting plots (four slides): CPU usage per quarter up to Q1/2006; y-axis in kSI2k hours, normalised to Opteron CPUs (1 Opteron year = … kSI2k hours).]

Future Aspects
The Grid infrastructure will move to gLite middleware
  → Client commands might slightly change
Scientific Linux 4 is coming (pushed by the LHC experiments)
  → Software must run on SL4 WNs
Multiple VO membership has been a hot topic
  → VOMS is being rolled out
Database access by many concurrent jobs
  → Caching technologies are under development
Continue to use resources parasitically?
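A rough, hedged illustration of the kind of client-command change the gLite migration implies; the exact command set depends on the gLite release that ends up being deployed:

    # LCG-2 style job submission (EDG workload management)
    edg-job-submit --vo ilc myjob.jdl
    # gLite WMS style submission (WMProxy-based; note the different name)
    glite-wms-job-submit -a myjob.jdl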

Conclusions
Grid Computing is a strategic technology for the future
(Significant) resources will be available in the Grid (only)
The Grid requires global thinking
DESY maintains a Grid infrastructure in production
DESY is an LCG Tier-2 centre for ATLAS and CMS
H1, ILC, and ZEUS heavily use the Grid for MC production
Service Challenges SC3/SC4
VOs already exist for ILC and CALICE with international support

At Last
The Grid is ready for the ILC!
Get your certificate quickly and start using the Grid!
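A minimal getting-started sketch, assuming membership in the 'ilc' VO, an LCG-2 User Interface, and a valid VOMS proxy as shown earlier; the job and file names are hypothetical:

    # describe the job in JDL
    cat > hello.jdl <<'EOF'
    Executable          = "/bin/hostname";
    StdOutput           = "hello.out";
    StdError            = "hello.err";
    OutputSandbox       = {"hello.out", "hello.err"};
    VirtualOrganisation = "ilc";
    EOF
    # submit via the Resource Broker, check the status, retrieve the output
    edg-job-submit --vo ilc -o jobids hello.jdl
    edg-job-status -i jobids
    edg-job-get-output -i jobids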

Web
DESY Grid web sites: …
Grid Computing web sites: …

Grid Infrastructure
[Diagram: a user logs in (ssh) to a User Interface (UI) where the certificates live in $HOME/.globus/; a JDL job description is sent to the Resource Broker (RB), which matches it using the information system (site GRIS → GIIS → BDII, R-GMA) and the file catalogue (LFC) and dispatches the job to a Computing Element (CE) with its batch system and worker nodes (WN) at one of the sites (Site1, Site2, Site3); data are read from and stored on Storage Elements (SE); VO(MS) membership is mapped to local accounts via /etc/grid-security/grid-mapfile; output is returned to the user.]