Reports from ATLAS SW Week, March 2004 -- ATLAS SW CZ Seminar, April 2004, Jiří Chudoba, FzÚ AV ČR

Dario Barberis: Introduction & News (ATLAS Software & Computing Week, 1 March 2004)

ATLAS Computing Timeline:
- POOL/SEAL release (done)
- ATLAS release 7 (with POOL persistency) (done)
- LCG-1 deployment (in progress)
- ATLAS complete Geant4 validation (done)
- ATLAS release 8
- DC2 Phase 1: simulation production
- DC2 Phase 2: intensive reconstruction (the real challenge!)
- Combined test beams (barrel wedge)
- Computing Model paper
- Computing Memorandum of Understanding (moved to end 2004)
- ATLAS Computing TDR and LCG TDR
- DC3: produce data for PRR and test LCG-n
- Physics Readiness Report
- Start commissioning run
- GO!
(The "NOW" marker on the slide indicated the current position on this timeline.)

Dario Barberis: Introduction & News (ATLAS Software & Computing Week, 1 March 2004)

Near-term Software Release Plan:
- 7.5.0: 14th Jan 2004
- 7.6.0: 4th Feb
- 7.7.0: 25th Feb  <- SPMB decision 3rd Feb
- 8.0.0: 17th Mar  <- DC2 & CTB Simulation Release
- 8.1.0: 7th Apr
- 8.2.0: 28th Apr
- 8.3.0: 19th May
- 9.0.0: 9th Jun  <- DC2 & CTB Reconstruction Release
- 9.1.0: 30th Jun
- 9.2.0: 21st Jul
- 9.3.0: 11th Aug

Related milestones:
- 15th Feb: LAr Technical Run starts
- 1st May: Baseline DC-2 Simulation starts
- 10th May: Testbeam starts
- 1st Jul: Baseline DC-2 Reconstruction starts
- 14th Jul: Complete Testbeam
- 15th Jul: Baseline DC-2 Physics Analysis starts

Release 8: Expected Major Milestones
- GEANT4 simulation: DC2 (in validation), test beam (underway), pile-up and digitization in Athena (debugging)
- GeoModel: Inner Detector, Muon Spectrometer
- Conversion to CLHEP units (mm, MeV, [-pi, pi])
- POOL/SEAL persistency
- Bytestream converters
- Preliminary conditions capabilities

Other stepping stones:
- Move to InstallArea
- jobOption.txt --> jobOption.py
- Distribution kits
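The unit convention noted above (lengths in mm, energies in MeV, azimuthal angles kept in [-pi, pi]) can be written out as a small standalone sketch. This is not ATLAS or CLHEP code, only the convention expressed in plain Python with illustrative names:

```python
# Minimal sketch of the CLHEP-style convention: mm and MeV are the base
# units (value 1.0), derived units follow, and phi is kept in [-pi, pi].
import math

mm  = 1.0
cm  = 10.0 * mm
m   = 1000.0 * mm
MeV = 1.0
GeV = 1000.0 * MeV

def wrap_phi(phi: float) -> float:
    """Map an azimuthal angle onto the [-pi, pi] range."""
    return math.atan2(math.sin(phi), math.cos(phi))

# Example: a 25 GeV quantity is stored as 25000 (MeV); phi = 1.5*pi wraps to -pi/2.
energy = 25 * GeV
phi = wrap_phi(1.5 * math.pi)
print(energy, phi)
```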

ATLAS DC2 (ATLAS Software Workshop, 2 March 2004), Gilbert Poulard, CERN PH-ATC

DC2: goals
At this stage the goals include:
- Full use of Geant4, POOL and the LCG applications
- Pile-up and digitization in Athena
- Deployment of the complete Event Data Model and the Detector Description
- Simulation of the full ATLAS detector and of the 2004 combined testbeam
- Test of the calibration and alignment procedures
- Wide use of the Grid middleware and tools
- Large-scale physics analysis
- Computing model studies (document due end 2004)
- Running as much of the production as possible on LCG-2

DC2 operation
Consider DC2 as a three-part operation:
- Part I: production of simulated data (May-June 2004)
  - needs Geant4, digitization and pile-up in Athena, POOL persistency
  - "minimal" reconstruction, just enough to validate the simulation suite
  - will run on any computing facilities we can get access to around the world
- Part II: test of Tier-0 operation (July 2004)
  - needs the full reconstruction software following the RTF report design, and the definition of AODs and TAGs
  - (calibration/alignment and) reconstruction will run on the Tier-0 prototype as if data were coming from the online system (at 10% of the rate)
  - output (ESD+AOD) will be distributed to Tier-1s in real time for analysis
- Part III: test of distributed analysis on the Grid (August-October 2004)
  - access to event and non-event data from anywhere in the world, in both organized and chaotic ways
- In parallel: run distributed reconstruction on simulated data

DC2: Scenario & Time scale
Key dates: September 2003 (Release 7); March 17th 2004 (Release 8, production release); May 3rd 2004; July 1st 2004 ("DC2"); August 1st 2004.
Planned sequence of activities:
- Put in place, understand and validate: Geant4; POOL; LCG applications; Event Data Model; digitization; pile-up; byte-stream
- Conversion of DC1 data to POOL; large-scale persistency tests and reconstruction
- Testing and validation; run test-production; start final validation
- Start simulation; pile-up and digitization; event mixing; transfer data to CERN
- Intensive reconstruction on "Tier-0"; distribution of ESD and AOD; calibration; alignment
- Start physics analysis; reprocessing

DC2 resources (table: for each process, the number of events, time duration in months, CPU power in kSI2k, and volume of data in TB, split between CERN and off-site)
- Phase I (May-June): simulation; pile-up (*); digitization; byte-stream conversion (small CPU); Total Phase I
- Phase II (July): reconstruction at Tier-0; reconstruction at Tier-1s; Total
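As a rough illustration of how the entries of such a resource table are derived (all numbers below are assumptions chosen only for the sake of the example, not the DC2 planning figures):

```python
# Back-of-envelope resource estimate: sustained CPU power and data volume
# implied by an event count, a per-event CPU cost, and a per-event size.
# Every number here is an assumption chosen only to show the arithmetic.

N_EVENTS        = 10_000_000   # assumed number of simulated events
KSI2K_S_PER_EVT = 100.0        # assumed CPU cost per event (kSI2k * seconds)
MB_PER_EVENT    = 2.0          # assumed output size per event (MB)
DURATION_MONTHS = 2            # assumed length of the production phase

seconds = DURATION_MONTHS * 30 * 24 * 3600
cpu_power_ksi2k = N_EVENTS * KSI2K_S_PER_EVT / seconds  # sustained farm power needed
volume_tb = N_EVENTS * MB_PER_EVENT / 1e6               # total output volume

print(f"~{cpu_power_ksi2k:.0f} kSI2k sustained over {DURATION_MONTHS} months")
print(f"~{volume_tb:.0f} TB of output")
```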

ATLAS Production System schema (diagram):
- A task (dataset) is defined by a transformation definition plus a physics signature (executable name, release version); the task is split into partitions, and each partition becomes a job (Task = [job]*, Dataset = [partition]*).
- Job descriptions are held centrally; supervisors (1-4) pick them up and hand them to flavour-specific executors: a US Grid executor, an LCG executor, an NG executor and an LSF executor for local batch.
- The Data Management System and AMI hold job run information and location hints (per task and per job); resource brokers (RB) and Chimera appear on the grid side of the diagram; human intervention enters at the job-description level.
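The supervisor/executor split in the schema can be sketched structurally as follows. All class and job names are hypothetical; this is not the actual production-system code, only the dispatch pattern the diagram describes:

```python
# Structural sketch (hypothetical names): one supervisor feeds jobs
# (dataset partitions) to a flavour-specific executor and records the outcome.
from dataclasses import dataclass

@dataclass
class Job:
    dataset: str         # the task this partition belongs to
    partition: int       # which slice of the dataset
    transformation: str  # executable name + release version

class Executor:
    """Base class: one concrete executor per flavour (LCG, US Grid, NG, LSF)."""
    def submit(self, job: Job) -> str:
        raise NotImplementedError

class LCGExecutor(Executor):
    def submit(self, job: Job) -> str:
        # The real system would talk to the LCG Resource Broker here.
        return f"LCG:{job.dataset}.{job.partition}"

class LSFExecutor(Executor):
    def submit(self, job: Job) -> str:
        # The real system would submit to a local LSF batch queue here.
        return f"LSF:{job.dataset}.{job.partition}"

class Supervisor:
    """Pulls job definitions, hands them to its executor, keeps run info."""
    def __init__(self, executor: Executor):
        self.executor = executor
        self.run_info = {}  # stands in for the job-run-info bookkeeping

    def process(self, jobs):
        for job in jobs:
            handle = self.executor.submit(job)
            self.run_info[(job.dataset, job.partition)] = handle

# Usage: one supervisor per flavour, all fed from the same job descriptions.
jobs = [Job("dc2.simul.A1", i, "atlsim-8.0.0") for i in range(3)]
lcg = Supervisor(LCGExecutor())
lcg.process(jobs)
print(lcg.run_info)
```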

Tiers in DC2
Tier-0:
- 20% of the simulation will be done at CERN
- all data in byte-stream format (~16 TB) will be copied to CERN
- reconstruction will be done at CERN (in ~10 days)
- the reconstruction output (ESD) will be exported in 2 copies from Tier-0 (2 x ~5 TB)
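A back-of-envelope check (not taken from the slides) of what exporting two ~5 TB ESD copies during the ~10-day Tier-0 reconstruction implies for the sustained network rate:

```python
# Sustained export rate implied by shipping 2 x ~5 TB of ESD over ~10 days.
esd_copies = 2
tb_per_copy = 5.0
days = 10

total_bytes = esd_copies * tb_per_copy * 1e12
seconds = days * 24 * 3600
rate_mb_s = total_bytes / seconds / 1e6
rate_mbit_s = rate_mb_s * 8

print(f"~{rate_mb_s:.1f} MB/s sustained, i.e. ~{rate_mbit_s:.0f} Mbit/s")
# -> roughly 11.6 MB/s, i.e. about 93 Mbit/s averaged over the 10 days
```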

Tiers in DC2
Tier-1s will have to:
- host the simulated data produced by them or coming from Tier-2s, plus the ESD (& AOD) coming from Tier-0
- run reconstruction in parallel to the Tier-0 exercise (~2 months); this will include links to MC truth; produce and host ESD and AOD
- provide access to the ATLAS V.O. members
Tier-2s:
- run simulation (and other components if they wish to)
- copy (replicate) their data to a Tier-1
ATLAS is committed to LCG. All information should be entered into the relevant database and catalog.

Core sites and commitments

Initial LCG-2 core sites:
  Site      Immediate   Later
  CERN
  CNAF
  FNAL      10          ?
  FZK       100         ?
  Nikhef
  PIC
  RAL       70          250
  Taipei    60          ?
Other firm commitments:
  Russia    30          50
  Prague    17          40
  Budapest  100         ?
Totals: 864 (+147) immediate, >2600 (+>90) later (figures in parentheses are the contribution of the other firm commitments).

Will bring in the other 20 LCG-1 sites as quickly as possible.

Comments on schedule
The change of schedule has been driven by:
- the ATLAS side:
  - the readiness of the software (the combined test beam has the highest priority)
  - the availability of the production tools (the integration with the Grid is not always easy)
- the Grid side:
  - the readiness of LCG (we would prefer to run on the Grid only!)
  - priorities are not defined by ATLAS alone
For the Tier-0 exercise, it will be difficult to define the starting date before we have a better idea of how the "pile-up" and "event-mixing" processes work.

Armin Nairz: Status of the Software for Data Challenge 2 (ATLAS Software Workshop, CERN, March 1-5, 2004)

Conclusions:
- Generation and GEANT4 simulation (as in Rel. ...) are already in a production-like state: stable and robust, with CPU times per event and event sizes within specifications.
- Digitisation (as in Rel. ...) could not yet be tested for all sub-detectors (some are missing or not working); for the sub-detectors tested, digitisation is working and stable, which gives confidence in a working digitisation procedure for the whole detector in/after Rel. ...
- Pile-up is not yet fully functional.
- Documentation on the pre-production activities is available from the DC webpage, including how-to's for running event generation, simulation and digitisation.

ATLAS SW CZ Seminar, upcoming meetings: GRID; Analysis Tools; Distributed Analysis...; Atlantis Tutorial; Athena Tutorial