Computing Coordination Aspects for HEP in Germany
International ICFA Workshop on HEP Networking, Grid and Digital Divide Issues for Global e-Science
Daegu, Korea
Matthias Kasemann / DESY

Outline:
- LCG for Experiments
  - ZEUS, LHC
  - Planning for 2008
- The German e-Science program (D-Grid)
  - Scope, Goals and Schedule
  - HEP in D-Grid

Germany: LCG for Experiments
- Substantial improvement using LCG for experiments:
  - For ZEUS/HERA: 50% MC production
  - ATLAS, CMS, ALICE and LHCb: Data Challenge contributions ~10%
  - Actively participating in Service Challenges: SC2 (and SC3 to come)
- Good networking connectivity (national and international):
  - Tier1 with 10 Gbit now
  - All (but 2) LHC HEP sites are connected to the German Research Network; plans for 2008:
    - 10 Gbit: Tier1 and Tier2 sites, and to CERN
    - 1 Gbit: all LHC universities
[Slide graphics: network connectivity 1 year ago… vs. today…]
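To put the planned link capacities in perspective, here is a back-of-the-envelope sketch (not from the slides; the 50% sustained-utilisation factor is an assumption):

```python
# Rough daily transfer volume of the planned WAN links.
# Assumption: ~50% sustained utilisation after protocol overhead
# and competing traffic; the slides give no efficiency figure.

def daily_volume_tb(link_gbit: float, efficiency: float = 0.5) -> float:
    """Approximate data volume (TB/day) a link can sustain."""
    bytes_per_sec = link_gbit * 1e9 / 8 * efficiency
    return bytes_per_sec * 86400 / 1e12

print(f"10 Gbit Tier1/Tier2 link: ~{daily_volume_tb(10):.0f} TB/day")  # ~54 TB/day
print(f" 1 Gbit university link:  ~{daily_volume_tb(1):.1f} TB/day")   # ~5.4 TB/day
```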

Germany: Tier1 Centre Plans
… as planned in 2001
[Chart: Tier1 resources (CPU, disk, tape); planned German fraction (G-Fraction) between 5% and 10%]
- New and better-understood computing models require a re-adjustment of Tier1 resources:
  - CPU is sufficient: MC production moved to Tier2
  - Substantial increase in disk requirements
  - Higher tape bandwidth required
- Need to identify more funding…

Germany: Tier2 Centre Plans
- Experiments' Tier2 plans: no funding is secured by now, only candidate sites
- ATLAS Tier2 center candidates:
  - 1) DESY, 2) MPI/LMU (Munich), 3) Universities Wuppertal/Freiburg
- CMS Tier2 center candidates:
  - 1) DESY, 2) University Aachen
- LHCb: no Tier2 center planned, perhaps after 2009…
- The German Tier1 could potentially serve non-German Tier2 centers
  - Discussions to continue…

  Experiment   # of Tier2's   Planned for Germany
  ATLAS        ~30            3 T2 centers
  CMS          ~…             2 T2 centers

The German e-Science program:
- Chances for HEP:
  - Additional resources to improve Grid software for HEP, with few strings attached
  - Increase the footprint of middleware knowledge and involvement
  - Improve grid software
- Challenges for HEP:
  - Very heterogeneous disciplines and stakeholders
  - LCG/EGEE is not the basis for many other partners
    - Several are undecided and have few constraints…
    - Others need templates, portals…
  - HPC partners are strong and favour UNICORE

The German e-Science program:
- Call for proposals, deadline was …
  - Approved projects to start January 2005
- What will be funded:
  - 4-5 'Community Projects'
    - To be approved for 3 years
    - Funding for R&D only, some matching required
    - "R&D for application- and community-specific middleware and network-based services"
    - Results to be integrated in the 'Integration Project'
    - Budget: a total of 11 M€
  - 1 Integration Project
    - "…to create a German e-Science infrastructure…"
    - To be approved for 2 years (initially)
    - To be based on existing components ('gap analysis')
    - Extended by R&D and results of the Community Projects
    - Budget: 5 M€
Reapply: June 6, 2005 / Sept. 2005

Community Projects
- Round 1: >12 projects proposed, 7 selected (one still has to go…)
  - Communities after round 1: HEP, Astronomy, Engineering, Climate & Earth, Biomedicine, Bioinformatics (Medical Imaging), Integration Project
  - The HEP proposal was very well received, but:
    - "… too little coordination between projects…"
    - "… projects too expensive…" (i.e. not enough money available)
    - "… too many partners…", "… too unfocused…", "… too …"
- Result: try again (…jump higher…)
  - Delay of 9 months
  - A lot of work (and frustration):
    - Finding synergies
    - Reducing scope and cost
    - Rescheduling


HEP Community Grid: Goals
- Focus on tools to improve data analysis for HEP and Astroparticle Physics, with 3 work packages:
  1. Data management
     - Advanced scalable data management
     - Job and data co-scheduling (see the sketch after this list)
     - Extendable metadata catalogues for Astroparticle and Lattice QCD physics
  2. Job monitoring and automated user job support
     - Information services
     - Improved job failure treatment
     - Incremental results of distributed analysis
  3. End-user data analysis tools
     - Physics- and user-oriented job scheduling, workflows
     - Automatic job scheduling
- All development is based on LCG/EGEE software and will be kept compatible.
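To make the "job and data co-scheduling" item in work package 1 concrete, here is a minimal sketch of the underlying heuristic: prefer sites that already host a job's input data, breaking ties by free capacity. All names (Site, the file sets) are invented for illustration and are not the HEP CG design or part of the actual LCG/EGEE middleware:

```python
# Illustrative job/data co-scheduling: rank candidate sites by how
# many of the job's input files their storage element already holds,
# then by free CPU slots. Real EGEE/LCG workload management is far
# richer; this only shows the core idea.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    free_slots: int
    hosted_files: set   # logical file names on the local storage element

def choose_site(sites: list, input_files: list) -> Site:
    needed = set(input_files)
    # Primary key: data locality; secondary key: free capacity.
    return max(sites, key=lambda s: (len(s.hosted_files & needed), s.free_slots))

sites = [
    Site("GridKa", 120, {"lfn:run7/evts.root", "lfn:run8/evts.root"}),
    Site("DESY",   300, {"lfn:run9/evts.root"}),
]
print(choose_site(sites, ["lfn:run7/evts.root", "lfn:run8/evts.root"]).name)
# -> GridKa: both input files are already local
```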

Integration Project (Karlsruhe)
Work packages:
- FG 1: D-Grid basic software
  - Globus, UNICORE, LCG
  - GridSphere, GAT
  - Data management, large storage
  - Data interface
  - VO management, tools
- FG 2: Setup and operation of the D-Grid infrastructure
  - Setup, operation, integration
  - Resource interoperability
  - Grid benchmarking
  - Monitoring, accounting, billing (a toy example follows below)
- FG 3: Networks and security
  - VPN, transport protocols
  - AAI
  - Firewalls, CERT
- FG 4: Project coordination
  - Sustained business models
  - D-Grid office
Budget: 5 M€ for 2 years; a follow-up project is very likely.
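As a toy illustration of the FG 2 "monitoring, accounting, billing" task: the heart of any grid accounting chain is aggregating per-job usage records by virtual organisation. The record fields here are invented for the sketch; real LCG/EGEE accounting used dedicated services and standardised usage records:

```python
# Toy per-VO accounting: sum CPU time from per-job usage records.
# Field names are invented for this sketch.
from collections import defaultdict

job_records = [
    {"vo": "atlas", "site": "GridKa", "cpu_hours": 12.0},
    {"vo": "cms",   "site": "DESY",   "cpu_hours":  8.5},
    {"vo": "atlas", "site": "DESY",   "cpu_hours":  3.0},
]

cpu_by_vo = defaultdict(float)
for rec in job_records:
    cpu_by_vo[rec["vo"]] += rec["cpu_hours"]

for vo, hours in sorted(cpu_by_vo.items()):
    print(f"{vo}: {hours:.1f} CPU hours")   # atlas: 15.0, cms: 8.5
```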

HEP-CC in Germany: Summary
- LCG for Experiments
  - Very successfully increased the footprint and usage of LCG for running and upcoming experiments in Germany
- Planning for 2008 is progressing
  - Plans for the Tier1-Tier2 infrastructure are being finalized
  - Discussion about resources is progressing
- The German e-Science program (D-Grid) is an interesting program and is about to start
  - HEP is an important partner
  - LCG/EGEE compatibility and interoperability is important for us…

Thank you for this opportunity to present German Computing!