L. Perini, DATAGRID WP8 Use-cases, 19 Dec 2000
ATLAS short-term grid use-cases
The “production” activities foreseen until mid-2001 and the tools to be used.


Status of ATLAS computing
Phase of very active s/w development (ATHENA OO framework, LHCb similarities)
– Physics TDR completed more than a year ago (old s/w)
– HLT TDR postponed to around end 2001 (availability of the new s/w chain)
No global ATLAS production is going on now, but specific detector, trigger and physics studies are active, and willing to exploit and test the GRID:
– TDR and Tile Testbeam Objectivity federations
– B-physics in Lund
– Muon barrel trigger (INFN responsibility)

Remote use of DB (Objy)
Master federation soon available at CERN with TDR runs
– First scenario: a user request for a particular run triggers a check for its presence in the local federation. If it is not present, a grid-enabled transfer of the corresponding database from the CERN master federation is initiated. As in GDMP, the database must not only be transferred, but also appropriately attached to the local Objectivity federation.
– Extension scenario: the particular run may not yet be available in the master federation at CERN either. A request for such a run might trigger a job at CERN that first imports the Zebra version of the data into the master Objectivity federation, before proceeding as in the scenario above.
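
A minimal sketch of the first scenario's control flow, in Python. globus-url-copy is the GridFTP client shipped with the Globus Toolkit; the directory, the gsiftp URL and the ooattachdb attach step are placeholders standing in for the site-specific and GDMP-like parts:

```python
import os
import subprocess

LOCAL_DB_DIR = "/data/objy"               # hypothetical local federation area
CERN_GSIFTP = "gsiftp://cern.ch/atlas"    # hypothetical master-federation URL

def fetch_run(run_number: int) -> str:
    """Serve a run locally, fetching it from the CERN master if absent."""
    db_file = f"run{run_number}.DB"
    local_path = os.path.join(LOCAL_DB_DIR, db_file)
    if not os.path.exists(local_path):
        # Grid-enabled transfer of the database file from CERN.
        subprocess.run(
            ["globus-url-copy", f"{CERN_GSIFTP}/{db_file}", f"file://{local_path}"],
            check=True,
        )
        # As in GDMP, the file must also be attached to the local
        # Objectivity federation; "ooattachdb" stands in for that step.
        subprocess.run(["ooattachdb", local_path], check=True)
    return local_path
```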

Remote use of DB (Objy)
In May, Testbeam data for the Tile Calorimeter will be available
– Remote subscriptions to data generated at CERN: a remote institution wishes to “subscribe” to the testbeam data, so that any new runs added to the master database at CERN are automatically replicated in a federation at a remote site that has subscribed to these data
Related and similar DB use-cases may very naturally be generated on these time-scales; however, no resources are yet committed in ATLAS for this kind of effort
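
The subscription scenario is essentially a replication loop on top of the same transfer-and-attach step. A sketch reusing fetch_run and CERN_GSIFTP from the previous block; the one-run-number-per-line runs.index catalogue file is an invented convention for illustration, not an ATLAS or GDMP one:

```python
import subprocess
import tempfile
import time

def list_published_runs() -> set[int]:
    """Fetch a (hypothetical) index of the runs published at CERN."""
    with tempfile.NamedTemporaryFile(mode="r") as tmp:
        subprocess.run(
            ["globus-url-copy", f"{CERN_GSIFTP}/runs.index", f"file://{tmp.name}"],
            check=True,
        )
        return {int(line) for line in tmp if line.strip()}

def subscribe(poll_seconds: int = 600) -> None:
    """Replicate every new run published at CERN into the local federation."""
    seen: set[int] = set()
    while True:
        for run in sorted(list_published_runs() - seen):
            fetch_run(run)   # transfer + attach, as in the previous sketch
            seen.add(run)
        time.sleep(poll_seconds)
```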

Use-case: B-physics study
People involved: Lund University ATLAS group (Chafik Driouichi, Paula Eerola, Christina Zacharatou Jarlskog, Oxana Smirnova)
Process: Bs → J/ψ η, followed by J/ψ → μ+μ– and η → γγ
Main background: inclusive bb → J/ψ X
Needs sufficient CPU and storage space; so far CERN and Lund computers are used
Physics goal: estimate the χ angle of the unitarity triangle
Main goal of this use-case:
– identify weak points in a Grid-like environment by direct experience and by comparing performance with modelling using the MONARC tools

Generation
Event generator: Atgen-B
Signal:
– one job generates parton-level events, yielding ~400 Bs → J/ψ η after Level-1 trigger cuts
– job parameters: ~10⁴ s of CPU on a machine equivalent to 10 SpecINT95, 8 MB memory (max.), 10 MB swap (max.), 13 MB of ZEBRA output
– to simulate one day of LHC running at low luminosity: ~1.4·10³ Bs → J/ψ η events (Br(Bs → J/ψ η) = 9·10⁻⁴)
Background:
– estimated number of events per day: ~4·10⁵ bb → J/ψ X
Both signal and background to be stored (on tape)
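
To put the job parameters in scale, a back-of-the-envelope estimate; the per-job yield, CPU cost and per-day event counts are taken from this slide, and the rest is arithmetic:

```python
# Back-of-the-envelope estimate from the generation figures on the slide.
EVENTS_PER_JOB = 400      # Bs -> J/psi eta passing Level-1, per job
CPU_PER_JOB_S = 1.0e4     # seconds on a 10-SpecINT95 machine
SIGNAL_PER_DAY = 1.4e3    # signal events per day of low-luminosity running

jobs_per_lhc_day = SIGNAL_PER_DAY / EVENTS_PER_JOB      # ~3.5 jobs
cpu_per_lhc_day_s = jobs_per_lhc_day * CPU_PER_JOB_S    # ~3.5e4 s

print(f"~{jobs_per_lhc_day:.1f} generation jobs per simulated LHC day")
print(f"~{cpu_per_lhc_day_s / 3600:.0f} CPU hours on a 10-SpecINT95 machine")
```

Generation is therefore cheap: a handful of jobs and roughly ten CPU hours per simulated LHC day. Simulation, below, is where the resources go.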

Simulation
Detector simulation: Geant3 – Dice
Full detector simulation takes ~15 min of CPU per event on a machine equivalent to 10 SpecINT95
Only the Inner Detector and the Electromagnetic Calorimeter are to be simulated
The majority of the CPU time is spent simulating the calorimeters
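
At ~15 CPU minutes per event, simulation dominates the budget. A rough cost for one simulated LHC day of signal, using the event count from the Generation slide:

```python
# Rough full-simulation cost for one LHC day of signal events.
SIM_MIN_PER_EVENT = 15.0    # CPU minutes per event at 10 SpecINT95
SIGNAL_PER_DAY = 1.4e3      # signal events per LHC day (Generation slide)

sim_cpu_days = SIGNAL_PER_DAY * SIM_MIN_PER_EVENT / (60 * 24)
print(f"~{sim_cpu_days:.0f} CPU days on one 10-SpecINT95 machine")   # ~15
```

Two weeks of CPU on a single machine for one day of signal alone is the concrete reason to distribute this production over a grid.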

Reconstruction and analysis
Reconstruction:
– from the Athena framework (at the moment, AtRecon is used)
– information from the two sub-systems to be stored in the Combined Ntuple (CBNT)
Analysis:
– estimation of the efficiencies of the J/ψ, η and Bs reconstruction
– acceptance, resolution, tagging, errors on sin(2χ), etc.

Workplan
Define and understand the available Grid configuration and environment (requires interaction with all counterparts)
– Lund, CERN, NBI, Milan...
Generation and simulation (signal and background)
– test and compare different job-submission configurations
– compare with the modelling
Reconstruction, analysis

Use-case: Barrel Muon Trigger Study
Main goals:
– finalize the level-1 trigger logic in the barrel;
– optimize the level-2 algorithms in the barrel region and study their possible extension to η > 1;
– evaluate the efficiencies of the different trigger levels (also combining muon and calorimetric triggers) for single muons and for relevant physics channels;
– estimate the different background contributions to the trigger rate at various nominal thresholds;
– study the effects of different layouts on system performance;
– prototype (components of) distributed farms;
– evaluate distributed computing models.

Tasks
Simulation of single muons for system optimization:
– ~10⁸ events, ~3·10⁹ SpecINT95·s, ~500 GB of disk space.
Generation and simulation of relevant physics channels with muons in the final state for different studies; wait for G4? (DataGrid release 1)
– B0d → J/ψ K0s, B0d → π+π– and B0s → J/ψ φ for CP violation;
– B0s → Ds– π+ for B0s mixing;
– H → 4l and pp → Z → μμ for alignment, calibration and overall performance;
– B+ → J/ψ(μμ) K+ and B0d → J/ψ(μμ) K*0 for tagging control;
– ~10⁶ events, ~10¹⁰ SpecINT95·s, ~1.5 TB of disk space.

Tasks (2)
Simulation of background:
– simulate particle fluxes in the cavern;
– ~10⁵ events, SpecINT95·s, ~1 TB of disk space.
Production of “complete” events; wait for GEANT4?
– physics and background events are merged at hit and digit level;
– ~10⁶ events, ~5·10⁹ SpecINT95·s.
Reconstruction, continuing until the TDR:
– simulate level-1 trigger data processing;
– apply level-2 reconstruction and selection algorithms;
– ~10⁸ events.
Analysis, continuing until the TDR:
– study performance: efficiency, resolution, rates...
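
A quick consistency check on the per-event CPU cost implied by these budgets, pure arithmetic on the numbers from the two Tasks slides:

```python
# Per-event CPU cost implied by the muon-trigger task budgets.
tasks = {
    "single-muon simulation": (3e9, 1e8),       # (SpecINT95*s, events)
    "physics-channel simulation": (1e10, 1e6),
    "complete-event production": (5e9, 1e6),
}
for name, (budget, events) in tasks.items():
    per_event = budget / events                  # SpecINT95*s per event
    # On a 10-SpecINT95 machine, 1 SpecINT95*s costs 0.1 s of wall time.
    print(f"{name}: {per_event:.0f} SpecINT95*s/event "
          f"(~{per_event / 10:.0f} s on a 10-SpecINT95 CPU)")
```

The ~10⁴ SpecINT95·s per physics-channel event (~17 minutes on a 10-SpecINT95 CPU) is consistent with the ~15 CPU minutes per fully simulated event quoted in the B-physics use-case.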

Tools
Detector simulation:
– the GEANT3-based ATLAS simulation program (DICE) exists and works; GEANT4 coming by end 2001.
Background simulation:
– FLUKA + GEANT3: particle fluxes integrated over the detectors' characteristic times.
Reconstruction:
– trigger simulation programs running in the conventional (ATrig) and OO (Athena) frameworks.
Analysis:
– PAW, ROOT(?).

Workplan
Implement a Globus-based distributed GRID architecture and perform increasingly complex operations:
– submit event generation and simulation locally and remotely;
– store events locally and remotely;
– access remote data (e.g., background events, stored centrally);
– (partially) duplicate event databases;
– schedule job submission;
– allocate resources;
– monitor job execution;
– optimize performance;
– ...
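
For the first of these steps, a minimal remote-submission sketch. globusrun and its RSL job-description language are standard Globus Toolkit components, but the gatekeeper contact string and the executable path below are placeholders, not real ATLAS endpoints:

```python
import subprocess

# Placeholder gatekeeper contact; real ones look like "host:port/jobmanager-pbs".
GATEKEEPER = "grid.example.infn.it/jobmanager"

def submit_simulation(run_number: int) -> None:
    """Submit one simulation job to a remote Globus gatekeeper via RSL."""
    rsl = (f"&(executable=/opt/atlas/bin/run_dice.sh)"   # placeholder script
           f"(arguments={run_number})"
           f"(count=1)")
    # globusrun hands the RSL to the remote gatekeeper; -o streams the
    # job's stdout back to the submitting terminal via GASS.
    subprocess.run(["globusrun", "-o", "-r", GATEKEEPER, rsl], check=True)

submit_simulation(42)   # e.g. run the simulation for run 42 remotely
```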

Sites and Resources
Sites involved in the different activities:
– simulation of physics events: Rome1, Rome2, Naples, Milan, CNAF, CERN;
– background simulation: Rome1;
– reconstruction and analysis: Rome1, Rome2, Naples, Milan;
– HLT prototyping: Rome1, Pavia, Rome3 (Release 1).
Available resources:
– Linux farms (a total of ~50 Pentium III 800 MHz processors);
– >1 TB of disk store (>500 GB on disk servers).
Manpower:
– ~5 FTE (physicists and computing experts).
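
As a closing sanity check, the listed farm capacity against the muon-trigger CPU budgets; the ~30 SpecINT95 rating assumed here for a Pentium III 800 MHz is my estimate, not a number from the slides:

```python
# Sanity check: farm capacity vs. the muon-trigger simulation budgets.
SPECINT95_PER_CPU = 30.0    # assumed rating of one Pentium III 800 MHz
FARM_CPUS = 50              # "~50 Pentium III 800 MHz processors" (slide)

farm_power = SPECINT95_PER_CPU * FARM_CPUS    # ~1500 SpecINT95 in total
total_budget = 3e9 + 1e10 + 5e9               # SpecINT95*s, Tasks slides

days = total_budget / farm_power / 86400.0
print(f"~{days:.0f} days of continuous running for the simulation tasks")
```

At roughly 140 days of continuous running for the simulation tasks alone, the combined farms are only just adequate, which is precisely why scheduling, monitoring and load distribution across the listed sites are on the workplan.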