Offline data taking and processing

Raw data registration and processing:
- Pass 1-6 completed for the 900 GeV and 2360 GeV data
- Pass 0 @ T0 introduced for calibration
- Pass 1 @ T0 for the 7 TeV data follows the data taking
- Analysis train running weekly: QA and organized analysis of the physics working groups
- 13.7 million collision events

Run processing starts immediately after the RAW data are transferred to CERN MSS (a consistency check of these timing figures follows below):
- Average: 5 hours per job
- At 10 hours, 95% of the run is processed
- At 15 hours, 99% of the run is processed

MC production:
- Several production cycles for 900 GeV and 7 TeV
- Minimum-bias pp: 17 million events with various generators (Pythia, PHOJET) and conditions from real data taking
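The three quoted timing figures (5 h average, 95% of a run done within 10 h, 99% within 15 h) hang together if job durations follow a long-tailed distribution. The sketch below is a back-of-envelope check only, not anything from the ALICE production system: it fits a lognormal (an assumption) to the quoted mean and 95th percentile and reports the 99th percentile that this implies.

```python
# Back-of-envelope consistency check. ASSUMPTION: lognormal job
# durations; this is an illustrative model, not the actual ALICE
# production job-time distribution.
import math

MEAN_H = 5.0             # quoted average job duration (hours)
P95_H  = 10.0            # quoted: 95% of a run processed by 10 h
Z95, Z99 = 1.645, 2.326  # standard normal 95% / 99% quantiles

# Lognormal(mu, sigma): mean = exp(mu + sigma^2/2), p95 = exp(mu + Z95*sigma).
a, b = math.log(MEAN_H), math.log(P95_H)
# Eliminating mu gives sigma^2 - 2*Z95*sigma + 2*(b - a) = 0;
# take the smaller root (the physically sensible branch).
sigma = Z95 - math.sqrt(Z95**2 - 2.0 * (b - a))
mu = a - sigma**2 / 2.0

p99 = math.exp(mu + Z99 * sigma)
print(f"implied 99th percentile: {p99:.1f} h (slide quotes ~15 h)")
```

Under this assumed model the implied 99th percentile is about 14 h, close to the quoted 15 h, so the three numbers are mutually consistent without any anomalous tail.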

Computing resources

New LHC schedule:
- 2010: 1.5×10⁹ pp events (1×10⁹ MB + 0.5×10⁹ rare triggered) and 6.4×10⁷ PbPb events (saturating the 1.25 GB/s bandwidth; see the arithmetic sketch below)
- 2011: 0.6×10⁹ pp events (0.1×10⁹ MB + 0.5×10⁹ rare triggered)
- 2012: no data taking

Switched from calendar year to RRB year:
- 2010: June 2010 – March 2011
- 2011: April 2011 – March 2012

Optimize usage of resources by assigning any task (analysis train, end-user analysis, MC production) to any site category whenever resources are available.
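As a rough cross-check of the PbPb figure, the sketch below combines the quoted 6.4×10⁷ events and 1.25 GB/s bandwidth with an assumed effective heavy-ion data-taking time of 10⁶ s (a common rule of thumb for one HI period, not a number from the slides) to obtain the implied raw data volume, average event size, and average event rate.

```python
# Illustrative arithmetic only. Quoted inputs: event count and bandwidth.
# ASSUMPTION: 1e6 s of effective PbPb data taking (rule-of-thumb value).
N_EVENTS    = 6.4e7    # quoted 2010 PbPb event target
BW_BYTES_S  = 1.25e9   # quoted bandwidth, 1.25 GB/s
LIVE_TIME_S = 1.0e6    # assumed effective heavy-ion running time

total_bytes = BW_BYTES_S * LIVE_TIME_S
print(f"raw PbPb volume:        {total_bytes / 1e15:.2f} PB")
print(f"implied avg event size: {total_bytes / N_EVENTS / 1e6:.0f} MB")
print(f"implied avg event rate: {N_EVENTS / LIVE_TIME_S:.0f} Hz")
```

With these inputs the numbers come out to about 1.25 PB of raw data, roughly 20 MB per event, and an average rate of 64 Hz; the event size and rate scale inversely with whatever live time one assumes.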