Overview of ATLAS Data Challenge
Oxana Smirnova, LCG/ATLAS, Lund University
GAG monthly, February 28, 2003, CERN
Strongly based on slides of Gilbert Poulard for the ATLAS Plenary.

Data Challenge 1
Main goals:
- Need to produce data for the High Level Trigger & Physics groups
  - Study performance of Athena and of the algorithms for use in HLT
  - High statistics needed: a few samples of up to 10^7 events in days, on O(1000) CPUs
- Simulation & pile-up, reconstruction & analysis on a large scale
  - Learn about the data model; I/O performance; identify bottlenecks, etc.
- Data management
  - Use/evaluate persistency technology (AthenaRoot I/O)
  - Learn about distributed analysis
- Involvement of sites outside CERN
  - Use of the Grid as and when possible and appropriate
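As a rough check of the "O(1000) CPUs" figure, the following back-of-the-envelope Python estimate converts a 10^7-event sample into CPU-days and wall-clock days. The per-event simulation time of 340 s is borrowed from the pile-up task-flow numbers later in this talk and used here only as an assumed typical value.

# Back-of-the-envelope estimate; 340 s/event is an assumed typical
# full-simulation time, taken from the pile-up task-flow slide.
events = 1e7            # target sample size
sec_per_event = 340.0   # assumed Geant3 simulation time per event
cpus = 1000             # O(1000) CPUs

total_cpu_seconds = events * sec_per_event
wall_days = total_cpu_seconds / cpus / 86400.0
print(f"~{total_cpu_seconds / 86400.0:.0f} CPU-days, "
      f"~{wall_days:.0f} days of wall-clock time on {cpus} CPUs")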

DC1, Phase 1: Task Flow
Example: one sample of di-jet events
- PYTHIA event generation: 1.5 × 10^7 events, split into partitions (read: ROOT files)
- Detector simulation: 20 jobs per partition, ZEBRA output
[Task-flow diagram: Pythia6 di-jet generation → HepMC via Athena-Root I/O → Atlsim/Geant3 + Filter → Hits/Digits/MCTruth in ZEBRA; figures quoted on the diagram: 10^5 events, 5000 evts, ~450 evts]
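To make the bookkeeping concrete, here is a minimal Python sketch (not an ATLAS tool) of how a 1.5 × 10^7-event sample can be split into fixed-size generation partitions with a fixed number of simulation jobs per partition. It assumes the 5000-event figure on the diagram is the generation partition size; the 20 jobs per partition come from the slide, while the file names and dictionary layout are purely illustrative.

def make_dc1_tasks(n_events=15_000_000, evts_per_partition=5000, sim_jobs_per_partition=20):
    """Illustrative splitting of one di-jet sample into generation and simulation tasks."""
    tasks = []
    n_partitions = n_events // evts_per_partition          # 3000 partitions of 5000 events
    for p in range(n_partitions):
        gen = {"step": "generation", "partition": p,
               "first_event": p * evts_per_partition, "n_events": evts_per_partition}
        sims = [{"step": "simulation", "partition": p, "job": j,
                 "input": f"dijet.gen.{p:04d}.root"}        # hypothetical file name
                for j in range(sim_jobs_per_partition)]
        tasks.append((gen, sims))
    return tasks

tasks = make_dc1_tasks()
print(len(tasks), "partitions,", sum(len(s) for _, s in tasks), "simulation jobs")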

DC1, Phase 1: Summary
- July–August 2002
- 39 institutes in 18 countries
- 3200 CPUs, approx. 110 kSI95 – CPU-days
- 5 × 10^7 events generated
- 1 × 10^7 events simulated
- 30 TB of output files produced

DC1, Phase 2
Main challenge: luminosity-effect simulation
- Separate simulation for:
  - Physics events & minimum-bias events
  - Cavern background for muon studies
- Merging of:
  - Primary stream (physics)
  - Background stream(s): pile-up (& cavern background)

Pile-up Task Flow (ATLSIM)
- Minimum bias: 0.5 MB, 460 s
- Cavern background: 20 KB, 0.4 s
- Background stream: 0.5 MB
- Physics: 2 MB, 340 s
- Pile-up: 7.5 MB, 400 s (mixing: 80, digitization: 220)
- High luminosity (10^34): 23 events/bunch crossing, 61 bunch crossings
- Low luminosity: 2 × 10^33
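Each high-luminosity physics event therefore has minimum-bias events overlaid for all 61 bunch crossings kept in the readout, roughly 23 × 61 ≈ 1400 minimum-bias interactions per event. The Python sketch below is a minimal illustration of that mixing loop, not ATLAS code; only the Poisson mean of 23 and the 61 crossings come from the slide.

import math
import random

def _poisson(mu, rng=random):
    """Poisson sampler (Knuth's algorithm); adequate for illustration at mu ~ 23."""
    L, k, p = math.exp(-mu), 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def overlay_pileup(physics_event, minbias_pool, mu=23, n_crossings=61, rng=random):
    """Illustrative pile-up mixing: overlay Poisson(mu) minimum-bias events
    on the physics event for each read-out bunch crossing."""
    pileup = []
    for crossing in range(n_crossings):
        n_mb = _poisson(mu, rng)                    # interactions in this crossing
        pileup.append({"crossing": crossing,
                       "minbias": [rng.choice(minbias_pool) for _ in range(n_mb)]})
    return {"physics": physics_event, "pileup": pileup}

At low luminosity (2 × 10^33) the same loop runs with the mean scaled down by a factor of five, which is why far fewer minimum-bias input events, and hence input files, are needed per job (next slide).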

DC1, Phase 2: Pile-up Status
- 56 institutes
- Most production completed by mid-December, including minimum-bias production
- Low luminosity (2 × 10^33): typically 40 minimum-bias files used per job
- High luminosity (10^34): up to 100 minimum-bias files
- Not completed yet:
  - US-Grid: problems with Grid middleware (Globus GRAM)
  - "Tails" in a few other institutes

Coming: Reconstruction
Preparation:
- Building the production infrastructure
- Getting the "reconstruction" software ready and validated, including the dedicated code for HLT studies
  - Today we are working with the current ATLAS software release, which is not ready for the reconstruction of pile-up data
  - Nevertheless, we intend to run a small-scale production on validation samples (without pile-up), to ensure that nothing is forgotten and to test the machinery
  - Expecting to have 6.0.x as the production release
Distributed task:
- Concentrate the data at 9 sites
- Use the production databases (AMI & MAGDA) as much as possible
- Be ready to use both conventional production and the Grid (NorduGrid & US-Grid), with dedicated tools:
  - GRAT (Grid Applications Toolkit)
  - AtCom to prepare the jobs
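The job-preparation step can be pictured as: query the production metadata catalogue for the validation samples, then turn each file into a reconstruction job pinned to one of the 9 sites holding the data. The Python sketch below is purely illustrative; it does not use the real AMI, MAGDA, GRAT or AtCom interfaces, and all names (fields, transformation, file names) are hypothetical.

def build_reco_jobs(catalogue, release="6.0.x", n_sites=9):
    """Purely illustrative job preparation: group validation-sample files from a
    metadata catalogue (stand-in for AMI/MAGDA) into per-site reconstruction jobs."""
    jobs = []
    for i, entry in enumerate(catalogue):
        jobs.append({
            "site": entry.get("site", f"site{i % n_sites}"),  # data concentrated at 9 sites
            "release": release,                               # expected production release
            "input": entry["lfn"],                            # logical file name from catalogue
            "transformation": "reco.validation",              # hypothetical transformation name
        })
    return jobs

# Example with a toy catalogue (validation sample only, no pile-up):
toy_catalogue = [{"lfn": f"dc1.validation.{n:04d}.zebra", "site": f"site{n % 9}"}
                 for n in range(20)]
print(len(build_reco_jobs(toy_catalogue)), "jobs prepared")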

DC2-3-4-…
DC2: originally Q3/2003 – Q2/2004; will be delayed
- Goals:
  - Full deployment of EDM & Detector Description
  - Geant4 replacing Geant3 (fully?)
  - Pile-up in Athena
  - Test the calibration and alignment procedures
  - Use LCG common software (POOL, …)
  - Make wide use of Grid middleware
  - Perform large-scale physics analysis
  - Further tests of the computing model
- Scale: as for DC1, ~10^7 fully simulated events
DC3: Q3/2004 – Q2/2005; goals to be defined; scale: 5 × DC2
DC4: Q3/2005 – Q2/2006; goals to be defined; scale: 2 × DC3

DC1 on the Grid
Three "Grid flavours":
- NorduGrid: full production
- US Grid (VDT): partial production
- EDG: tests
DC1 Phase 1: 11 out of 39 sites
- NorduGrid (U. of Bergen, NSC Linköping U., Uppsala U., NBI, U. of Oslo, Lund U., etc.)
- US-Grid (LBL, UTA, OU)
DC1 Phase 2:
- NorduGrid: full pile-up production
- US Grid: pile-up in progress; expected to be used for reconstruction
- Tests are underway on both NorduGrid and US Grid

NorduGrid Production
Middleware: Globus-based Grid solution; most services developed from scratch or amended
- CA & VO tools common with EDG, hence a common user base
History:
- April 5, 2002: first ATLAS job submitted on NorduGrid (Athena HelloWorld)
- May 10, 2002: first pre-DC1 validation job (Atlsim test using release 3.0.1)
- End of May 2002: clear that NorduGrid is mature enough to run and manage real production
DC1, Phase 1 (simulation):
- Fully simulated events produced from 1.15 × 10^7 input events
- Total output size: 762 GB
- All files uploaded to a Storage Element (U. of Oslo) and registered in the Replica Catalog
DC1, pile-up:
- Low-luminosity pile-up for the events above
Other details:
- At peak production, up to 200 jobs were managed by NorduGrid at the same time
- Most Scandinavian production clusters are connected (2 of them in the Top 500); however, not all of them allow installation of the ATLAS software
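For illustration, the snippet below shows how a DC1-style job could be described in xRSL (NorduGrid's extended resource-specification language) and handed to the ngsub client from Python. It is a sketch, not an actual DC1 production script: the attribute values, file names, the ATLAS runtime-environment tag and the rc:// output destination are assumptions made for the example.

import subprocess

# Illustrative xRSL job description; values are hypothetical.
xrsl = (
    '&(executable="dc1.simulation.sh")\n'
    '(arguments="0042")\n'
    '(inputFiles=("dc1.simulation.sh" "") ("dijet.gen.0042.root" ""))\n'
    '(outputFiles=("dc1.sim.0042.zebra" "rc://grid.uio.no/dc1"))\n'
    '(jobName="dc1-sim-0042")\n'
    '(stdout="sim.out")(stderr="sim.err")\n'
    '(cpuTime="2000")\n'
    '(runTimeEnvironment="ATLAS-6.0.4")\n'
)

with open("dc1-sim-0042.xrsl", "w") as f:
    f.write(xrsl)

# Submit with the NorduGrid command-line client, if present; the broker
# then picks a cluster advertising the requested runtime environment.
try:
    subprocess.run(["ngsub", "-f", "dc1-sim-0042.xrsl"], check=True)
except FileNotFoundError:
    print("ngsub not available here; job description written to dc1-sim-0042.xrsl")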

US ATLAS Grid
Software installation:
- Globus gatekeepers at 3 (out of 8) sites
- Software packaged by WorldGrid (VDT-based)
- Pre-compiled binaries distributed to the gatekeepers
"Grid scheduler": pull model
Approximately 10% of the US DC1 commitment
Simulation: input events according to the database
Pile-up: a more complex task; exposed several problems (some common with EDG)
- Still struggling
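"Pull model" here means that work is not pushed from a central submission host; instead each site fetches the next pending job when it has free capacity. The loop below is a minimal conceptual sketch of that idea in Python, not the actual GRAT/WorldGrid scheduler; the slot counts are invented, and only the three gatekeeper site names come from the slide.

from collections import deque

def pull_scheduler(job_queue, sites, free_slots):
    """Conceptual pull-model dispatch: each site asks for work when it has
    free slots, rather than a central scheduler pushing jobs to sites."""
    assignments = {site: [] for site in sites}
    queue = deque(job_queue)
    while queue and any(free_slots[s] > 0 for s in sites):
        for site in sites:
            if queue and free_slots[site] > 0:
                job = queue.popleft()            # the site "pulls" the next pending job
                assignments[site].append(job)
                free_slots[site] -= 1
    return assignments, list(queue)

sites = ["LBL", "UTA", "OU"]                      # the three gatekeeper sites on the slide
assignments, pending = pull_scheduler(range(25), sites, {"LBL": 8, "UTA": 10, "OU": 4})
print({s: len(j) for s, j in assignments.items()}, len(pending), "still pending")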

ATLAS-EDG Tests
- Started in August 2002, using DC1 simulation jobs
- Continuous feedback process, a la the WP8 plans of two years ago
- Well-known pattern: one bug crushed, two appear
- EDG 1.4.x is still highly unstable
  - Very inconvenient for ATLAS DC1 jobs, which typically last 24+ hours
- Needs a lot of end-user "hacking":
  - Manual RFIO management
  - Manual output post-staging
- Regular failures of major services:
  - Globus & Condor problems
  - RB and JSS problems
  - MDS problems
  - RC problems
  - Even hardware problems
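To illustrate what the manual output post-staging amounted to in practice, the sketch below wraps a long payload job and then copies its output to a storage element itself, retrying on failure, instead of trusting the automatic output handling. It is only an illustration: the copy command and destination URL are placeholders, not the actual EDG 1.4.x tools.

import subprocess
import time

def run_and_poststage(job_cmd, output_file, dest_url, copy_cmd="some-grid-copy", retries=3):
    """Illustrative wrapper: run a long DC1 job, then post-stage its output by hand,
    retrying the transfer because middleware failures were frequent.
    `some-grid-copy` and `dest_url` are placeholders, not real EDG commands."""
    subprocess.run(job_cmd, check=True)                      # the 24+ hour payload
    for attempt in range(1, retries + 1):
        result = subprocess.run([copy_cmd, output_file, dest_url])
        if result.returncode == 0:
            return True                                      # output safely on the SE
        time.sleep(60 * attempt)                             # back off and try again
    return False                                             # give up; flag for manual recovery

# Hypothetical usage:
# run_and_poststage(["./atlsim_job.sh"], "dc1.sim.0042.zebra", "se.example.org:/dc1/")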

More on Grids
EDG:
- ATLAS decided to "keep an eye" on what is going on
- But it seems "difficult" to run a major production with the current "middleware"
LCG-1:
- Prototype is being deployed; should be ready by end of June
- ATLAS will use the "applications software" (e.g. POOL)
- ATLAS will participate in the "testing"
- Today, ATLAS does not consider it for production
- It will become a major concern for DC2 and the following DCs