23-06-2003, L. Perini, CSN1: ATLAS Italia Computing, Status and Plans: the Role of LCG. No funding requested now (it will be in September).



2 Layout
Status and plans of the software
– Slides selected from D. Quarrie's presentation at the GDB on 10 June
Past and future of the Data Challenges
– Slides selected from G. Poulard's presentation at the GDB on 10 June
Role of LCG and fall-backs
– Slides produced by the Computing Coordinator for this meeting, agreed within the group of ATLAS representatives in LCG

GDB Meeting, 10 June 2003: ATLAS Offline Software. David R. Quarrie, Lawrence Berkeley National Laboratory

Software Overview
ATLAS is in the closing stages of the transition from FORTRAN-based to C++-based software
– For DC-0 & DC-1 the simulation was based on Geant3
– For DC-2 it will be based on Geant4; ATLAS has been very active in validating Geant4
Common framework (Athena) based on collaboration with LHCb
First version of the C++ reconstruction in place
– Used in Level 2 and the Event Filter as well as offline
– First major design iteration underway
– Functionality and robustness are already good (>10^6 events in DC-1)
– Performance in some areas needs work

Athena
Based on the concept of components: Services, Algorithms and Tools
Highly modular and flexible
Good mapping to GRID services
– Based on abstract interfaces: no direct coupling with algorithms
Compatible with non-GRID environments (e.g. a laptop)
Integration with an interactive scripting language (Python)
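As a self-contained illustration of the component idea above (the class and method names below are invented for this sketch and are not the actual Athena/Gaudi API), an algorithm written against an abstract service interface runs unchanged whether the concrete service behind it is local or Grid-aware:

    from abc import ABC, abstractmethod

    # Abstract service interface: algorithms depend only on this,
    # never on a concrete (local or Grid-aware) implementation.
    class IDataAccessSvc(ABC):
        @abstractmethod
        def read(self, key): ...

    class LocalFileSvc(IDataAccessSvc):
        """Concrete service for a laptop / non-Grid environment."""
        def read(self, key):
            return "event data for %s from a local file" % key

    class GridReplicaSvc(IDataAccessSvc):
        """Concrete service that would resolve the key via a replica catalogue."""
        def read(self, key):
            return "event data for %s located through the Grid" % key

    class MyRecoAlgorithm:
        """An 'Algorithm' component: configured with a service, called per event."""
        def __init__(self, data_svc):
            self.data_svc = data_svc

        def execute(self, event_key):
            print("reconstructing", self.data_svc.read(event_key))

    # The same algorithm runs unchanged in either environment:
    MyRecoAlgorithm(LocalFileSvc()).execute("run1_evt42")
    MyRecoAlgorithm(GridReplicaSvc()).execute("run1_evt42")

This is the sense in which the framework "maps well to GRID services" while remaining usable on a laptop: only the service implementations change, not the algorithms.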

Software Distribution (2/2)
A recurring problem is compatibility with external software
– e.g. POOL/SEAL need their own set of external software (e.g. Boost)
– Still grappling with obvious problems (e.g. incompatible versions) and not-so-obvious ones (e.g. compilation/configuration flags)
This is still an area requiring more work, to minimize ATLAS-specific external packages and to take advantage of e.g. LCG common installations

Release 7 (July 2003)
– LCG component integration: POOL/SEAL
– Geant4 integration
– Pile-up infrastructure in place; all detectors supported
– Detector Description integration: reconstruction and G4 simulation from a common geometry
– Calibration/alignment infrastructure in place
– Begin to incorporate feedback from the Reco Task Force: new Reco EDM

Release 8 (Feb 2004)
Dual targets:
– DC-2 (Q2-Q3 2004)
– Combined test beams (Q2-Q3 2004)
A major focus is consolidation:
– Robustness, house-cleaning
– Performance
G4 validated for production
Full integration of the Reco Task Force designs/recommendations
Interactive as well as batch
– Replacement of jobOptions files by Python scripts
GRID integration
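To make the jobOptions point above concrete: the idea is that which algorithms run, and with which property values, is decided in a small Python script read at job start-up, rather than in compiled code. The following is only a self-contained mock of that idea; the _App and Algorithm classes are stand-ins invented for this sketch (in Athena the framework provides these objects), and the property names are made up:

    # Minimal stand-ins so the sketch runs as plain Python; in a real job
    # these objects come from the framework, not from the jobOptions file.
    class _App:
        def __init__(self):
            self.EvtMax = -1          # how many events to process
            self.TopAlg = []          # which algorithm instances to schedule

    class Algorithm:
        def __init__(self, name):
            self.name = name          # properties are attached as attributes

    theApp = _App()

    # The part that resembles a user's Python jobOptions script:
    theApp.EvtMax = 100
    theApp.TopAlg += ["MyRecoAlgorithm/Reco1"]   # "Type/InstanceName"
    Reco1 = Algorithm("Reco1")
    Reco1.OutputLevel = 3             # verbosity of this algorithm
    Reco1.PtCut = 20.0                # a made-up algorithm-specific property

    print(theApp.EvtMax, theApp.TopAlg, Reco1.__dict__)

The practical gain over static jobOptions text files is that the full scripting language (loops, conditionals, includes) becomes available for job configuration, both in batch and interactively.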

GRID Projects
Multiple prototypes developed in conjunction with the data challenges
– Both European and USA: Magda, AMI, Grappa, etc.
– Some overlapping functionality, but necessary to explore
Distributed Physics Analysis
– Projects developing: GANGA, DIAL, Chimera
– Goal is to bring these under a coherent umbrella by end of Q…, ready for DC-2

10 Towards ATLAS Data Challenges 2. LCG-GDB, 10th June 2003. Gilbert Poulard, ATLAS Data Challenges Co-ordinator, CERN EP-ATC

11 Outline
DC1: a starting point for DC2
– What has been achieved
DC2
– Main goals
– Planning
– Resources

12 ATLAS DC1 (July 2002 - April 2003)
Primary concern was the delivery of events to the High Level Trigger (HLT) and to the physics communities
– HLT TDR due by June 2003
– Athens physics workshop in May 2003
Put in place the full software chain from event generation to reconstruction
– Switch to AthenaRoot I/O (for event generation)
– Updated geometry
– New Event Data Model and Detector Description
– Reconstruction (mostly OO) moved to Athena
Put in place the distributed production
– "ATLAS kit" (rpm) for software distribution
– Scripts and tools (monitoring, bookkeeping): AMI database; Magda replica catalogue; VDC
– Job production (AtCom)
– Quality control and validation of the full chain
Use Grid tools as much as possible

13 Tools in DC1
Catalogues: AMI (physics metadata), Magda (replica catalog), VDC (recipe catalog), permanent and transient production logs
Production frameworks built on top of them: AtCom (interactive production framework) and GRAT (automatic production framework)

14 DC1 in numbers

Process                         No. of events   CPU time         CPU-days       Volume of data
                                                (kSI2k.months)   (400 SI2k)     (TB)
Simulation, physics events      ~10^7
Simulation, single particles    3x10^7
Lumi02 pile-up                  4x10^6
Lumi10 pile-up                  2.8x10^6
Reconstruction                  4x10^6
Reconstruction + Lvl1/2         2.5x10^6        (84)             (6300)
Total                                           690 (+84)        51000 (+6300)  60
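As a quick sanity check of the totals (a simple unit conversion, not an official number): 690 kSI2k.months corresponds to roughly 51,000 days of a single 400 SI2k processor, consistent with the CPU-days column above.

    # Convert the DC1 total CPU time (690 kSI2k.months) into days of a
    # single 400 SI2k processor, to compare with the ~51000 CPU-days quoted.
    total_ksi2k_months = 690
    cpu_power_si2k = 400          # reference CPU used in the table
    days_per_month = 30           # rough average, an assumption of this sketch

    cpu_months = total_ksi2k_months * 1000 / cpu_power_si2k   # 1725 CPU-months
    cpu_days = cpu_months * days_per_month                    # ~51750 CPU-days
    print(round(cpu_days))        # ~52000, in line with the table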

15 ATLAS DC1, Phase 1 (July-August 2002)
– … CPUs (110 kSI…), … CPU-days
– 5x10^7 events generated
– 1x10^7 events simulated
– 3x10^7 single particles
– 30 TB of files
– 39 institutes in 18 countries: Australia, Austria, Canada, CERN, Czech Republic, France, Germany, Israel, Italy, Japan, Nordic countries, Russia, Spain, Taiwan, UK, USA
– Grid tools used at 11 sites

16 Primary data (in 8 sites)
Data (TB):
– Simulation: 23.7 (40%)
– Pile-up: 35.4 (60%), of which Lumi02: 14.5 and Lumi10: 20.9
Pile-up:
– Low luminosity: ~4x10^6 events (~4x10^3 NCU-days)
– High luminosity: ~3x10^6 events (~12x10^3 NCU-days)
Data replication using Grid tools (Magda)

17 Grid in ATLAS DC1
– US-ATLAS: part of the simulation; phase-1 pile-up; reconstruction (GRAT & Chimera)
– EDG Testbed Prod: reproduced part of phase 1; several tests
– NorduGrid: full phase 1 & 2 data production; reconstruction

18 ATLAS Data Challenges: DC2 (July 2003 - July 2004)
At this stage the goal includes:
– Full detector simulation with Geant4
– Pile-up and digitization in Athena
– Deployment of the complete Event Data Model and the Detector Description
– Use as much as possible of the LCG Applications software (e.g. POOL)
– Test the calibration and alignment procedures
– Perform large-scale physics analysis
– Use the GRID middleware widely
– Use more and more GRID tools
– Run as much as possible of the production on LCG-1

19 DC2 and LCG-1 (LP summary)
LCG-1:
– We intend to use LCG-1 components and contribute to validating them as they become available (R-GMA; RLS; …)
– ATLAS-EDG is becoming the ATLAS-LCG task force
Scale of DC2:
– About 10^7 events simulated, as in DC1 (but with Geant4)
– All of them with pile-up added
– All of them reconstructed
– Analysis…

20 DC2: time scale
Milestones:
– End July 2003: Release 7
– Mid-November 2003: pre-production release
– February 1st 2004: "production" release
– April 1st 2004
– June 1st 2004: "DC2"
– July 15th 2004
Activities, in sequence:
– Put in place, understand & validate: Geant4; POOL persistency & LCG applications; Event Data Model
– Digitization; pile-up; byte-stream
– Conversion of DC1 data to POOL and run reconstruction
– Testing and validation
– Run test-production
– Start final validation
– Start simulation
– Pile-up & digitization
– Transfer data to CERN
– Start reconstruction on "Tier0"
– Distribution of ESD & AOD
– Calibration; alignment
– Start physics analysis
– Reprocessing

21 ATLAS Data Challenges: DC2
We are building an ATLAS Grid production & analysis system
We intend to put in place a "permanent" Monte Carlo production system
– If we continue to produce simulated data during summer 2004, we want to keep open the possibility to run another "DC" later (November 2004?) with more statistics

INFN, 23 June 2003: ATLAS Software & Computing and LCG products. Dario Barberis, CERN & Genoa University/INFN

ATLAS Computing Timeline
– POOL/SEAL release
– ATLAS release 7 (with POOL persistency)
– LCG-1 deployment
– ATLAS complete Geant4 validation
– ATLAS release 8
– DC2 Phase 1: simulation production
– DC2 Phase 2: intensive reconstruction (the real challenge!)
– Combined test beams (barrel wedge)
– Computing Model paper
– ATLAS Computing TDR and LCG TDR
– DC3: produce data for PRR and test LCG-n
– Computing Memorandum of Understanding
– Physics Readiness Report
– Start commissioning run
– GO!
NOW = June 2003

How to get there: 1) Software
Software developments in progress:
– Geant4 simulation validation for production
– GeoModel (Detector Description) integration in simulation and reconstruction
– Full implementation of the new Event Data Model
– Restructuring of the trigger selection, reconstruction and analysis environment
– POOL persistency
– Interval of Validity service and Conditions DataBase
– Detector response simulation in Athena
– Pile-up in Athena (was in atlsim/G3)

LCG Applications Components
SEAL:
– Plug-in manager: internal use by POOL now; full integration into Athena in Q…
– Data Dictionary: integrated into Athena now; includes Python support
POOL:
– Integration underway
– Goal is to have demonstrated support for POOL by 31 July: ability to read and write components of the ATLAS EDM
– Complete support by Oct 2003
SEAL Maths Library:
– Integrate in time for DC-2
PI:
– Integrate the ROOT implementation of the AIDA API in Q…

LCG Applications: fall-back solutions
The main product we need urgently is POOL persistency
– Right now many integration problems
– Several ATLAS and LCG people actively working on them
– We assume major problems will be sorted out by end of July (ATLAS release 7), with full deployment in October
What if…?
– If there are problems of principle that cannot be overcome (discovered during the summer):
  – go back to AthenaROOT (home-made direct coupling of Athena/StoreGate to ROOT I/O, already prototyped)
  – write converters by hand
  – introduce delays as more work is needed
  – not nice.
– Decision in October 2003, to be ready anyway for DC2
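For context on what "direct coupling to ROOT I/O" means in the fall-back above: instead of going through the POOL persistency layer, objects would be written straight into ROOT files. The snippet below is only a generic illustration of that kind of direct ROOT I/O, using ROOT's Python bindings; it is not the AthenaROOT prototype itself (which is a C++ conversion service), and the file, tree and branch names are made up.

    import ROOT
    from array import array

    # Write a trivial per-event quantity directly into a ROOT TTree...
    f = ROOT.TFile("demo_events.root", "RECREATE")
    tree = ROOT.TTree("events", "directly persistified event data")
    et = array("f", [0.0])
    tree.Branch("et", et, "et/F")
    for i in range(100):
        et[0] = 10.0 + i * 0.5       # stand-in for a reconstructed quantity
        tree.Fill()
    tree.Write()
    f.Close()

    # ...and read it back, with no persistency layer in between.
    f = ROOT.TFile("demo_events.root")
    for event in f.events:
        pass                         # event.et is available here
    f.Close()

The cost of this approach, as the slide notes, is that the object-to-storage mapping (the "converters") has to be written and maintained by hand, which is exactly the work POOL is meant to absorb.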

How to get there: 2) Data Challenges
DC1 (2002-2003), completed in April 2003:
– 2nd pass of reconstruction with Trigger L1 and L2 algorithms for the HLT TDR in progress
– Zebra/Geant3 files will be converted to POOL format and used for large-scale persistency tests
– they will be used as input for validation of the new reconstruction environment
DC2 (1st half 2004):
– provide data for the Computing Model document (end 2004)
– full use of Geant4, POOL and the Conditions DB
– simulation of full ATLAS and of the 2004 combined test beam
– prompt reconstruction of the 2004 combined test beam
DC3 (2nd half 2005):
– scale up computing infrastructure and complexity
– provide data for the Physics Readiness Report
Commissioning run (from 2nd half 2006):
– real operation!

LCG-1 Deployment
We plan to test (and use) the LCG-1 infrastructure as soon as it is deployed and functional
– First tests will start in the 2nd half of July, as soon as the CERN installation is open to the experiments
– The ATLAS-EDG test group will become the ATLAS-LCG test group
– It can run jobs of varying complexity (CPU and I/O): simulation, pile-up, reconstruction (analysis later)
In parallel, we continue developing our production tools
– we have to live with several Grid flavours for a long time to come
– and we have, for the time being, to continue productions in non-Grid environments
Effort on distributed analysis tools is underway within several national Grid projects
– the new RTAG-11 should help to get some coherence in these developments
– internal ATLAS coordination has also started in this area

LCG-1 Deployment: fall-back solutions
What if…?
– We assume there will always be several flavours of Grids and other production sites we have to cope with
  – a typical analogy is electrical grids: we can move electrical appliances all over the world, but we need different connectors and transformers
– For large-scale productions we know how to cope the "old" way
  – in DC1 we produced >10^7 fully-simulated events using >50 different sites, some linked in Grid systems
– The real need is for the "added value" of Grids, mainly useful for end-user data analysis:
  – user certification
  – data and CPU management (submit jobs to a single interface)
Conclusion: we can cope with delays in the availability of a fully performant system until DC2 (1st half 2004); if it is still problematic at that point, we will have to re-think our computing model.