The German HEP-Grid Initiative and the German HEP Community Grid
13-Feb-2006, CHEP06, Mumbai

Agenda:
- D-Grid in context
- HEP Community Grid
- HEP-CG Work Packages
- Summary

D-Grid in context: e-Science in Germany
[Timeline figure: the European grid projects EDG -> EGEE -> EGEE 2 and LCG R&D -> WLCG, shown alongside the multi-M€ German e-Science initiative up to today.]

D-Grid in context: e-Science Call and Results

1st call: 15 M€, Community Grids & Integration Project.
2nd call: 15 M€, new communities, extensions to D-Grid, service providers.

Today: 5 Community Grids and the Integration Project.
Goal: a production-quality national grid infrastructure and commercial uptake of services.

e-Science = Grid Computing & Knowledge Management & e-Learning

D-Grid Integration Project: generic platform and generic Grid services.
Community Grids: AstroGrid, MediGrid, C3-Grid, HEP Grid, InGrid, TextGrid; knowledge-management projects ONTOVERSE and Wikinger.

D-Grid WPs: Middleware & Tools, Infrastructure, Network & Security, Management & Sustainability.
Partner sites: PC², RRZN, TUD, RZG, LRZ, RWTH, FZJ, FZK, FHG/ITWM, Uni-KA.

Middleware: Globus 4.x, gLite (LCG), UNICORE, GAT and GridSphere.
Data management: SRM/dCache, OGSA-DAI, metadata schemas.
VO management: VOMS and Shibboleth.

HEP Community Grid (HEP CG)
Coordination: M. Kasemann, DESY.
A 3-year project, started September 1, 2005.

Focus on tools to improve data analysis for HEP and Astroparticle Physics. Focus on gaps; do not reinvent the wheel.

Data management:
- advanced scalable data management
- job- and data co-scheduling
- extendable metadata catalogues for Lattice QCD and Astroparticle physics

Job monitoring and automated user job support:
- information services
- improved job-failure treatment
- incremental results of distributed analysis

End-user data analysis tools:
- physics- and user-oriented job scheduling, workflows, automatic job scheduling

All development is based on LCG/EGEE software and will be kept compatible!

HEP CG WP1: Data Management
Coordination: M. Ernst, DESY.
- Developing and supporting a scalable Storage Element based on Grid standards (DESY, Uni Dortmund, Uni Freiburg; unfunded: FZK).
- Combined job and data scheduling, plus accounting and monitoring of the data used (Uni Dortmund).
- Development of grid-based, extendable metadata catalogues with semantic, world-wide access (DESY, ZIB; unfunded: Humboldt Uni Berlin, NIC).

dCache Components
[Architecture diagram: clients enter through I/O doors (SRM, GridFTP, dCap, (K)Ftp with Krb5/ssl, HTTP) and admin doors; a file name space database, exported by the pnfs NFS server, provides the name space; the Pool Manager steers pool nodes, which hold the data and talk to HSM back-ends (OSM, Enstore, TSM). Doors handle metadata operations only; no data is transferred through them.]
See P. Fuhrmann, "dCache, the Upgrade", 13-Feb, Session: Computing Facilities and Networking.
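To make the door concept concrete, here is a minimal ROOT/C++ sketch of reading a file through a dCap door. It is illustrative only: the host, port and pnfs path are placeholders, and it assumes a ROOT build with the dCache (dCap) plugin available.

// Minimal sketch: reading a file from dCache through a dCap door.
// Host, port and path below are placeholders, not real endpoints.
#include "TFile.h"
#include "TError.h"

int main()
{
   // The door performs the name-space lookup (metadata only) and
   // redirects the client to the pool node that serves the data.
   TFile *f = TFile::Open(
      "dcap://door.example.org:22125/pnfs/example.org/data/run123.root");
   if (!f || f->IsZombie()) {
      Error("main", "could not open file via dCap door");
      return 1;
   }
   f->ls();     // list the file contents
   f->Close();
   return 0;
}

Note that the payload then flows directly from the pool node to the client; the door never carries data, which is what lets a single door front many pools.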

Improved Scheduling
[Architecture diagram: a job description enters the WMS, which consults the MDS information source and the LFC file catalogue; a Prediction Engine and a Replication Scheduler cooperate with a Local Data Scheduler at the SE (dCache/HSM) and with the local batch scheduler (PBS, via GRAM) at the CE; LFNs are mapped to PFNs before execution. Diagram by L. Schley.]
See "A Computational and Data Scheduling Architecture for HEP Applications", Poster Session 1 (L. Schley).
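The core idea of job and data co-scheduling can be sketched in a few lines: rank compute elements by the fraction of a job's input data already resident at their storage element, and ask the replication scheduler to stage data when even the best site falls short. The sketch below is a toy illustration under those assumptions; the names, the scoring rule and the threshold are not taken from the HEP-CG design.

// Toy co-scheduling sketch: pick the CE with the best data locality,
// and request replication when locality is poor.
#include <iostream>
#include <set>
#include <string>
#include <vector>

struct Site {
    std::string name;
    std::set<std::string> storedLfns;  // LFNs already on the site's SE
};

int main()
{
    std::vector<std::string> inputLfns = {"lfn:/hep/a", "lfn:/hep/b", "lfn:/hep/c"};
    std::vector<Site> sites = {
        {"ce.dortmund.example", {"lfn:/hep/a", "lfn:/hep/b"}},
        {"ce.desy.example",     {"lfn:/hep/c"}},
    };

    const Site *best = nullptr;
    double bestScore = -1.0;
    for (const Site &s : sites) {
        std::size_t hits = 0;
        for (const std::string &lfn : inputLfns)
            hits += s.storedLfns.count(lfn);
        double score = double(hits) / inputLfns.size();  // locality fraction
        if (score > bestScore) { bestScore = score; best = &s; }
    }

    std::cout << "schedule job on " << best->name
              << " (locality " << bestScore << ")\n";
    if (bestScore < 0.5)  // arbitrary illustrative threshold
        std::cout << "ask the replication scheduler to stage missing LFNs\n";
    return 0;
}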

[Slide on metadata catalogues for Lattice QCD; source: Dirk Pleiter, DESY/Zeuthen.]
See D. Pleiter, "Using Grid Technologies for Lattice QCD", 16-Feb, Session: Grid middleware and e-Infrastructure operation.

HEP CG WP2: Job Monitoring + User Support Tools
Coordination: P. Mättig, Uni Wuppertal.
- Development of a job information system (TU Dresden).
- Development of an expert system to classify job failures and treat the most common errors automatically (Uni Wuppertal; unfunded: FZK).
- R&D on interactive job steering and access to temporary, incomplete analysis job results (Uni Siegen).

Job Monitoring (Ralph Müller-Pfefferkorn)
- Provide users with sufficient information about their jobs.
- Focus on the "many jobs" scenario -> graphical interface and visualizations.
- Ease of use: the user should not need to know more than necessary, which should be almost nothing.
- From general to detailed views on jobs: information such as status, resource usage, output, time lines, etc.

HEP CG WP3: Distributed Interactive Data Analysis
Coordination: P. Malzacher, GSI (LMU, GSI; unfunded: LRZ, MPI M, RZ Garching, Uni Karlsruhe, MPI Heidelberg).
- Optimise application-specific job scheduling.
- Analyse and test the required software environment.
- Job management and bookkeeping of distributed analysis.
- Distribution of the analysis and summing up of the results.
- Interactive analysis: creation of a dedicated analysis cluster; dynamic partitioning of Grid analysis clusters.

Start with a gap analysis.

LMU: Investigating job-scheduler requirements for distributed and interactive analysis. The GANGA (ATLAS/LHCb) project shows good features for this task and was used for test MC production on LCG. Distributed analysis experiences in Poster Session 1 (J. Elmsheuser).

GSI: Analysis based on PROOF; investigating different versions of PROOF clusters. Connecting ROOT and gLite: TGlite, a ROOT interface for gLite, presented in Poster Session 2 (K. Schwarz). The abstract ROOT interface sketched on the slide:

class TGrid : public TObject {
public:
   ...
   virtual TGridResult *Query( ...
   static TGrid *Connect(const char *grid, const char *uid = 0,
                         const char *pw = 0 ...
   ClassDef(TGrid, 0)
};
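As a usage illustration, the sketch below connects ROOT to a grid back-end through this interface and queries the file catalogue. The "glite" protocol string assumes the TGlite plugin from the poster is loaded (ROOT itself ships e.g. an "alien" back-end), and the path and pattern are placeholder values.

// Illustrative use of the TGrid abstraction; the "glite" scheme
// assumes the TGlite plugin is registered with ROOT's plugin manager.
#include "TGrid.h"
#include "TGridResult.h"

void query_grid()
{
   // Connect() dispatches to whichever plugin claims the protocol.
   TGrid *grid = TGrid::Connect("glite://");
   if (!grid) return;

   // Query the file catalogue for matching files.
   TGridResult *res = grid->Query("/grid/hep/data", "*.root");
   if (res) res->Print();
}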

Summary
Rather late compared with other national Grid initiatives, a German e-science programme is now well under way. The HEP-CG focuses on three work packages: data management, automated user support, and interactive analysis. All HEP-CG development is based on LCG/EGEE software and will be kept compatible.

Chances for HEP:
- additional resources to improve Grid software for HEP
- a larger footprint of middleware knowledge and involvement

Challenges for HEP:
- very heterogeneous disciplines and stakeholders
- LCG/EGEE is not the basis for many of the other partners: several are undecided and have few constraints; others need templates, portals, etc.