LCG Status GridPP 8 September 22nd 2003 CERN.ch

CERN.ch 2 LCG/PEB Work areas
- Applications
- Fabrics
- Grid Technology
- Grid Deployment

CERN.ch 3 Applications
- SPI: Software Process & Infrastructure
- POOL: Persistency
- SEAL: Core Libraries & Services
- PI: Physicist Interface
- Simulation
- …
[Diagram: the LCG Applications Area projects, alongside other LCG projects in other areas and the LHC experiments]

CERN.ch 4 Applications
- SPI
  - Software infrastructure solidly in place. The different components were covered in depth at GridPP 7.
    - Effort in this area reduced, but incremental improvements being delivered in response to feedback.
- POOL
- SEAL
- PI
- Simulation

CERN.ch 5 Applications
- SPI
- POOL
  - First production release delivered on schedule in June.
  - Experiment integration now underway:
    - production use for the CMS Pre-Challenge Production milestone met at end of July;
    - completion of the first ATLAS integration milestone expected in September;
    - POOL deployment on LCG-1 beginning;
    - POOL and SEAL working closely with experiment integrators to resolve bugs and issues exposed in integration.
      - Lots of them, but this was expected!
- SEAL
- PI
- Simulation

CERN.ch 6 Applications
- SPI
- POOL
- SEAL
  - Project on track …
- PI
- Simulation

CERN.ch 7 Applications — SEAL release plan (Release / Date / Status / Description of goals)
- V …, …/02/03, internal: establish dependency between POOL and SEAL; dictionary support & generation from header files. Released 14 & 26/02/03.
- V …, …/03/03, public: essential functionality sufficient for the other existing LCG projects (POOL); foundation library, system abstraction, etc.; plugin management. Released 04/04/03.
- V …, …/05/03, internal: improve functionality required by POOL; basic framework base classes. Released 23/05/03.
- V …, …/06/03, public: essential functionality sufficient to be adopted by experiments; collection of basic framework services; scripting support. Released 18/07/03.

CERN.ch 8 Applications
- SPI
- POOL
- SEAL
  - Project on track …
  - Waiting for detailed feedback on current functionality from POOL & experiments.
  - Planning to develop newly requested functionality:
    - object whiteboard (transient data store; the idea is sketched below)
    - improvements to scripting: LCG dictionary integration, ROOT integration
    - complete support for C++ types in the LCG dictionary
- PI
- Simulation
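The "object whiteboard" named above is a transient, in-memory store through which framework components exchange event data by key instead of holding pointers to each other. Here is a minimal sketch of that idea in Python; the class and method names (Whiteboard, put, get, clear) are hypothetical illustrations, not SEAL's actual C++ interface.

```python
# Illustrative sketch of a transient "whiteboard" keyed by (type, label).
# All names are hypothetical; SEAL's real transient store is a C++ service.

class Whiteboard:
    """In-memory store for event-lifetime objects shared between modules."""

    def __init__(self):
        self._store = {}

    def put(self, obj, label=""):
        key = (type(obj).__name__, label)
        if key in self._store:
            raise KeyError("object already registered: %s/%s" % key)
        self._store[key] = obj

    def get(self, type_name, label=""):
        return self._store[(type_name, label)]

    def clear(self):
        """Drop all transient data, e.g. at the end of an event."""
        self._store.clear()


if __name__ == "__main__":
    wb = Whiteboard()
    wb.put([1.2, 3.4, 5.6], label="TrackPt")
    print(wb.get("list", "TrackPt"))
    wb.clear()
```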

CERN.ch 9 Applications
- SPI
- POOL
- SEAL
- PI
  - Principal initial mandate, a full ROOT implementation of AIDA histograms, recently completed (the adapter idea is sketched below).
  - Still a small effort with limited scope, though.
  - Future planning depends on what comes out of the ARDA RTAG:
    - Architectural Roadmap towards Distributed Analysis;
    - reviewing DA activities, HEPCAL II use cases, and the interfaces between Grid, LCG and experiment-specific services;
    - started in September, scheduled to finish in October.
- Simulation
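The histogram work is essentially an adapter: analysis code written against the AIDA histogram interfaces runs on a ROOT back end. Below is a rough sketch of that pattern using PyROOT (assumed available); the wrapper class and its lowercase method names are hypothetical, and PI's real implementation is a C++ AIDA binding to ROOT.

```python
# Illustrative sketch of an AIDA-style histogram facade backed by ROOT.
# The class and method names are hypothetical, not PI's actual interface.
import ROOT  # PyROOT assumed available


class AidaStyleHistogram1D:
    """AIDA-like 1D histogram wrapping a ROOT TH1F."""

    def __init__(self, name, title, nbins, xlow, xup):
        self._h = ROOT.TH1F(name, title, nbins, xlow, xup)

    def fill(self, x, weight=1.0):
        self._h.Fill(x, weight)

    def mean(self):
        return self._h.GetMean()

    def rms(self):
        return self._h.GetRMS()

    def root_histogram(self):
        # Escape hatch to the underlying ROOT object.
        return self._h


if __name__ == "__main__":
    h = AidaStyleHistogram1D("pt", "Track pT", 50, 0.0, 100.0)
    for x in (10.0, 22.5, 22.5, 47.0):
        h.fill(x)
    print(h.mean(), h.rms())
```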

CERN.ch 10 Applications
- SPI
- POOL
- SEAL
- PI
- Simulation
  - Physics Validation subproject particularly active:
    - pion shower profile for ATLAS improved;
    - expect an extensive round of comparison with testbeam data in autumn.
  - ROSE (Revised Overall Simulation Environment):
    - looking at the generic framework high-level design, the implementation approach, and the software to be reused. Decisions expected in September.

CERN.ch 11 Fabrics
- CC Infrastructure
- Recosting
- Management successes
- RH release cycles

CERN.ch 12 Fabrics — CC Infrastructure

CERN.ch 13 Fabrics — CC Infrastructure

CERN.ch 14 Fabrics — Recosting I
- Representatives from IT and the 4 LHC experiments reviewed the expected equipment cost for LCG Phase 2.
  - Took into account adjusted requirements from the experiments and some slight changes to the overall model.
  - Results published in July.

CERN.ch 15 Fabrics — Recosting II
[Recosting table; all units in million CHF]

CERN.ch 16 Fabrics — System Management
- Overall management suite christened over the summer: ELFms, with components
  - quattor: EDG/WP4 installation & configuration
  - Lemon: LHC Era Monitoring
  - LEAF: LHC Era Advanced Fabrics
- quattor thoroughly in control of the CERN fabric:
  - migration to RH 7.3 managed by quattor in spring;
  - LSF 5 migration took 10 minutes in late August
    - across 800+ batch nodes; the equivalent migration in 2002 took over 3 weeks with much disruption.
- EDG/WP4 OraMon repository in production since September 1st.
- State Management System development underway.

CERN.ch 17 Fabrics — RedHat Release Cycles
- RedHat are moving to a twin product line:
  - frequent end-user releases, free but with support limited to 1 year;
  - less frequent business releases with long-term support, at a cost.
- Neither product is really adapted to our needs:
  - an annual change of system version is too rapid; a …-month cycle is more realistic;
  - the cost of the Enterprise server is prohibitive for our farms.
- Move to negotiate a compromise with RedHat:
  - major labs club together to pay for limited support (security patches + ?) of the end-user product for, say, 2 years;
  - discussions at HEPiX in Vancouver
    - plus a visit to RedHat?

CERN.ch 18 Grid Deployment — I
- Deployment started (with a pre-release tag) in July, to the original 10 Tier 1 sites:
  - CERN, BNL, CNAF, FNAL, FZK, Lyon, Moscow, RAL, Taipei, Tokyo
  - other sites joined: PIC (Barcelona), Prague, Budapest
- Situation today (18/9/03):
  - 10 sites up: CERN, CNAF, RAL, FZK, FNAL, Moscow, Taipei, Tokyo, PIC, Budapest
  - still working on installation: BNL, Prague, Lyon (situation not clear)
- Other sites currently ready to join:
  - Bulgaria, Pakistan, Switzerland, Spanish Tier 2s, Nikhef, Sweden
- Official “certified” LCG-1 release (tag LCG-1.0.0) was available on 1 September at 5pm CET
  - installed at CERN, Taiwan, CNAF, Barcelona and Tokyo 24 hours later(!), and at several other sites within a few days.

CERN.ch 19 Grid Deployment — II
LCG-1 is:
- VDT (Globus 2.2.4)
  - Information System (MDS)
- Selected software from EDG 2.0:
  - Workload Management System (RB); a minimal job-description sketch follows after this list
  - EDG Data Management (RLS, LRC, …)
- GLUE Schema LCG extensions
- LCG local modifications/additions/fixes, such as:
  - special job managers (LCGLSF, LCGPBS, LCGCONDOR) to solve the problem of sharing home directories;
  - gatekeeper enhancements (adding some accounting and auditing features and the log rotation that LCG requires);
  - a number of MDS fixes (also coming from NorduGrid);
  - a number of miscellaneous Globus fixes, most of them now included in the VDT version LCG is using.
- Some problems remain. Overall, though, an impressive improvement in terms of stability.
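For context, users hand work to the Workload Management System as JDL (Job Description Language) files, which the Resource Broker matches against published resources. The sketch below writes out a minimal EDG-style JDL; the attributes shown are the standard EDG ones, while the file name and the edg-job-submit invocation in the closing comment are assumed examples rather than anything taken from the LCG-1 release documentation.

```python
# Illustrative sketch: compose a minimal EDG-style JDL file for submission
# through the LCG-1 Resource Broker. The file name and the submit command
# in the comment below are assumed examples.

jdl = """\
Executable    = "/bin/hostname";
StdOutput     = "hello.out";
StdError      = "hello.err";
OutputSandbox = {"hello.out", "hello.err"};
"""

with open("hello.jdl", "w") as f:
    f.write(jdl)

# Submission would then typically look something like:
#   edg-job-submit hello.jdl
print(jdl)
```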

CERN.ch 20 Grid Deployment — III
- Starting to get experiments testing LCG-1 now:
  - Loose Cannons currently running on LCG-1 to verify basic functionality
  - Scheduling now with the experiments:
    - initially we need to carefully control who does what;
    - we need to monitor the system as the tests run to understand the problems.
  - Migrate CMS LCG-0 to LCG-1
  - ATLAS, US-ATLAS (want to demonstrate interoperability)
  - ALICE: continue with the tests started by the Loose Cannons
  - LHCb?
  - We are scheduling these tests now; they will commence next week.
- Once the experiments have verified their software on LCG-1 we must begin to add resources at each site
  - currently only very basic resources are available.

CERN.ch 21 Grid Deployment — IV
- Basics are in place, but many tasks must be done at high priority to make this a real production system:
  - experiment software distribution mechanism;
  - monitors to watch essential system resources on essential services (/tmp, etc.; a minimal check is sketched after this list);
  - system cleanup procedures;
  - system auditing: must ensure procedures are in place;
  - basic usage accounting needs to be in place;
  - a tool-independent WN installation procedure is needed, also for the UI;
  - integration with MSS (setting up a task force)
    - NB: sites with MSS will need to implement interfaces;
  - integration with LXBatch (and others);
  - standard procedures: we will start, but this needs a team from the sites and the GOC, covering
    - setting Runtime Environments
    - change procedures
    - operations
    - incident handling
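As an illustration of the kind of resource monitor listed above, here is a minimal free-space check; the 10% threshold and the path list are assumptions for the example, not part of any LCG-1 monitoring suite.

```python
# Illustrative sketch: warn when filesystems holding essential services
# (e.g. /tmp) run low on space. Threshold and paths are assumed examples.
import os


def free_fraction(path):
    """Fraction of space still free on the filesystem holding `path`."""
    st = os.statvfs(path)
    return st.f_bavail / float(st.f_blocks)


def check(paths, threshold=0.10):
    for p in paths:
        frac = free_fraction(p)
        status = "OK" if frac >= threshold else "WARNING: low space"
        print("%-10s %5.1f%% free  %s" % (p, 100.0 * frac, status))


if __name__ == "__main__":
    check(["/tmp", "/var"])
```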

CERN.ch 22 Summary
- In general, good progress:
  - Applications area
  - Fabrics, …
- Yes, LCG-1 is delayed
  - but don’t forget the vast improvements to the overall system, driven by the focus on delivering a production-quality environment.
- The UK contribution to this work is extensive and much appreciated.