
CALICE Software
Niels Meyer, DESY Hamburg; Roman Pöschl, LAL Orsay
LCTW09, Orsay, France, November 2009

Outline:
- CALICE Testbeam Data Taking
- Data Management
- Event Building and Reconstruction Software
- Summary and Outlook

CALICE Testbeam Data Taking

The CALICE collaboration is preparing/performing large-scale testbeam data taking (Summer 2006/2007). The testbeam program poses software/computing "challenges":
- Data processing from raw data to final clusters in a coherent way
- Handling of conditions data: detector configuration, calibration, alignment etc.
- Comparison with simulated data -> 'physics' output

O(15000) calorimeter cells read out by the CALICE DAQ, no zero suppression.
[Figure: Testbeam setup at CERN 2007 - ECAL, Analogue HCAL, TCMT]

CALICE "Tier 0" – Infrastructure in the Control Room

- DAQ computer and disk array
- caliceserv.cern.ch: online monitoring, Grid transfers
- Gigabit uplink: high-speed connection to the outside world, serves all CALICE control room computers

Well-organised computing setup, thanks to B. Lutz.
[Picture courtesy of C. Rosemann, DESY]

Data Handling and Processing

- Experimental site (e.g. CERN, FNAL): local storage for online monitoring and fast checks; data transfer using the Grid infrastructure, speeds up to 240 Mbit/s
- DESY dCache (backbone): mass storage, ~40 TByte of CALICE data (2/3 binaries, 1/3 converted files); Grid UI for direct access, Grid CE for massive processing
- Data replication and MC production on the Grid
- cc-in2p3 Lyon acts as fallback site

The Virtual Organisation – vo calice

- Hosted by DESY, which also hosts the page for registration
- VO Manager: Niels Meyer (DESY); Deputy: A. Gellrich (DESY)
- ~60 members and counting...

Institutes providing Grid support for CALICE

Computing and storage are provided by: DESY Hamburg (also hosting), LAL, LLR, DESY Zeuthen, Imperial College, Birmingham, cc-in2p3 Lyon, Cambridge, Institute of Physics Prague, University College London, KEK, Manchester, CIEMAT Madrid, Fermilab, NIKHEF, University of Bonn, Univ. Liverpool, Univ. Oxford.

- Most of the sites have been involved in recent data and MC processing
- Connectivity to Asian sites is still an issue
- The full set of raw data is available at DESY HH and cc-in2p3 Lyon!
- Grid exploitation by CALICE paved the way for the successful mass production for the ILC detector LOIs

Conversion to LCIO

DAQ data types are converted/wrapped into LCIO on the basis of LCGenericObjects.

- DAQ data files/types: configuration data + slow control data, event data, trigger data
- Event building in lcioconverter -> LCEvent, with preliminary checks of data integrity
- Conversion of a 2 GByte 'native' file runs at ... Hz on a standard Grid WN
- Further processing within the MARLIN framework
- Conditions data are handled with LCCD; CALICE DB at DESY (MySQL)
- Installation of software using tools developed in the EUDET NA2 task (ilcinstall, cmake)
- Runs stably over all ~100 million events
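To make the "wrapped into LCIO on the basis of LCGenericObjects" step concrete, here is a minimal sketch using the plain LCIO API. The record layout (crate/slot plus ADC samples) and the collection name are invented for illustration; this is not the actual calice_lcioconverter code.

// Minimal sketch: wrap one raw DAQ record into an LCGenericObject and
// attach it to an LCIO event (hypothetical record layout and names).
#include "IMPL/LCEventImpl.h"
#include "IMPL/LCCollectionVec.h"
#include "IMPL/LCGenericObjectImpl.h"
#include "EVENT/LCIO.h"
#include <vector>

void wrapDaqRecord( IMPL::LCEventImpl* evt, int crate, int slot,
                    const std::vector<float>& adcValues ) {

  // integers first (crate, slot), then the ADC samples as floats
  IMPL::LCGenericObjectImpl* obj = new IMPL::LCGenericObjectImpl;
  obj->setIntVal( 0, crate );
  obj->setIntVal( 1, slot );
  for( unsigned i = 0; i < adcValues.size(); ++i )
    obj->setFloatVal( i, adcValues[i] );

  // collect the wrapped records in a collection of type LCGENERICOBJECT
  IMPL::LCCollectionVec* col =
      new IMPL::LCCollectionVec( EVENT::LCIO::LCGENERICOBJECT );
  col->addElement( obj );

  // the event takes ownership of the collection and its elements
  evt->addCollection( col, "HypotheticalRawDaqData" );
}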

Intermezzo – Conditions Data Handling

LCCD – Linear Collider Conditions Data framework: a software package providing an interface to conditions data in a database and in LCIO files (author: Frank Gaede, DESY).

- LCCD works and is heavily used within CALICE!
- Still too much of an expert tool (no real development since 2005)
- The importance of conditions data (not only for 'real' data) makes the development of a fully functional conditions data toolkit a fundamental piece of the ILC software:
  - Efficient storage of and access to conditions data: browsing, convenient interfaces
  - How to 'distribute' conditions data (e.g. with respect to the Grid)?
  - By the way: the LHC experiments have some headaches with this too!
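As a rough illustration of how conditions data are accessed through LCCD in a Marlin-style job, the sketch below registers a database handler and a change listener. The class and method names (DBCondHandler, LCConditionsMgr, IConditionsChangeListener) are quoted from memory of the LCCD API, and the DB init string and folder name are placeholders; check against the installed LCCD headers before use.

// Rough sketch of conditions access via LCCD (names quoted from memory of
// the LCCD API; DB init string and folder name are placeholders).
#include "lccd/DBCondHandler.hh"
#include "lccd/LCConditionsMgr.hh"
#include "lccd/IConditionsChangeListener.hh"
#include "EVENT/LCCollection.h"
#include <iostream>

// the listener is notified whenever the valid conditions collection changes
class CalibrationListener : public lccd::IConditionsChangeListener {
public:
  void conditionsChanged( EVENT::LCCollection* col ) {
    std::cout << "new calibration set with "
              << col->getNumberOfElements() << " entries" << std::endl;
  }
};

void setupConditions() {
  // handler reading one folder of the MySQL-backed conditions database
  lccd::DBCondHandler* handler = new lccd::DBCondHandler(
      "dbhost:calice:user:pwd",          // DB init string (placeholder)
      "/cd_calice/example/calibration",  // folder name (placeholder)
      "ExampleCalibration" );            // name of the conditions collection

  lccd::LCConditionsMgr::instance()->registerHandler( "ExampleCalibration",
                                                      handler );
  lccd::LCConditionsMgr::instance()->registerChangeListener(
      new CalibrationListener, "ExampleCalibration" );

  // per event one would then call:
  //   lccd::LCConditionsMgr::instance()->update( timeStamp );
  // which triggers conditionsChanged() whenever the validity interval changes.
}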

CALICE Software

Three main packages, with contributions by groups from DESY, Imperial, LAL, LLR, NIU, RHUL:

- calice_lcioconverter (current version v…): converts the CALICE DAQ format into LCIO (LCGenericObjects); needs DAQ software expert work; MARLIN processors
- calice_userlib (current version v04-10): interface classes to LCGenericObjects (these classes should be defined by LCIO); utility functions, e.g. for trigger handling
- calice_reco (current version v…, v04-07 in prep.): raw data into CalorimeterHits (standard LCIO) and TrackerHits; first stages of higher-level analysis; MARLIN processors

~250 classes or functions. Data of four different calorimeter prototypes are available in LCIO format.
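As an illustration of the "MARLIN processors" mentioned for calice_reco, below is a minimal Marlin processor skeleton; the class name, the collection names and the digitisation step are hypothetical and not taken from the actual CALICE code.

// Minimal Marlin processor skeleton (hypothetical names, not calice_reco):
// reads raw hits wrapped as LCGenericObjects and would produce standard
// LCIO CalorimeterHits.
#include "marlin/Processor.h"
#include "EVENT/LCEvent.h"
#include "EVENT/LCCollection.h"
#include "IMPL/LCCollectionVec.h"
#include "IMPL/CalorimeterHitImpl.h"
#include "EVENT/LCIO.h"
#include <string>

class ExampleHitConverter : public marlin::Processor {
public:
  marlin::Processor* newProcessor() { return new ExampleHitConverter; }

  ExampleHitConverter() : marlin::Processor("ExampleHitConverter") {
    _description = "Toy example: raw data -> CalorimeterHits";
    registerProcessorParameter( "InputCollection",
                                "Name of the raw data collection",
                                _inputColName, std::string("RawHits") );
  }

  void processEvent( EVENT::LCEvent* evt ) {
    EVENT::LCCollection* raw = evt->getCollection( _inputColName );
    IMPL::LCCollectionVec* hits =
        new IMPL::LCCollectionVec( EVENT::LCIO::CALORIMETERHIT );

    for( int i = 0; i < raw->getNumberOfElements(); ++i ) {
      IMPL::CalorimeterHitImpl* hit = new IMPL::CalorimeterHitImpl;
      // ... unpack the LCGenericObject, apply calibration, set energy/position
      hits->addElement( hit );
    }
    evt->addCollection( hits, "ExampleCalorimeterHits" );
  }

private:
  std::string _inputColName;
};

// static instance registers the processor with the Marlin framework
ExampleHitConverter anExampleHitConverter;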

LCIO

● No data model for conditions- and meta-data; these have to be implemented as LCGenericObjects.
● This is fine, because most of these structures are easily generalised (e.g. slow control data, detector configuration, beam-line status).
● Current problem: the object is a bare LCGenericObject after storage/re-read:
  ● danger of confusing different objects with identical size
  ● dynamic_cast on read-back collections does not work => need to work with constructors to stay transparent => unnecessary increase of the memory footprint
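The constructor-based workaround described in the last bullet looks roughly like the following sketch (hypothetical wrapper class, not taken from calice_userlib): the typed wrapper copies the values out of the bare LCGenericObject, since a dynamic_cast to the original class fails after re-reading a file.

// Sketch of the constructor-based workaround (hypothetical class): after
// re-reading a file the stored objects are bare LCGenericObjects, so a
// dynamic_cast<TemperatureRecord*> would return 0. The typed wrapper is
// therefore rebuilt by copying the values, which is transparent to the user
// but duplicates the data in memory.
#include "EVENT/LCCollection.h"
#include "EVENT/LCGenericObject.h"

class TemperatureRecord {
public:
  // wrap a read-back LCGenericObject; assumed layout: int sensorId, float temp
  explicit TemperatureRecord( const EVENT::LCGenericObject* obj )
    : _sensorId( obj->getIntVal(0) ),
      _temperature( obj->getFloatVal(0) ) {}

  int   sensorId()    const { return _sensorId; }
  float temperature() const { return _temperature; }

private:
  int   _sensorId;
  float _temperature;
};

// usage: loop over a read-back collection and wrap each element
void printTemperatures( EVENT::LCCollection* col ) {
  for( int i = 0; i < col->getNumberOfElements(); ++i ) {
    const EVENT::LCGenericObject* obj =
        dynamic_cast<EVENT::LCGenericObject*>( col->getElementAt(i) );
    if( !obj ) continue;                 // not a generic object
    TemperatureRecord rec( obj );        // copy => extra memory footprint
    // ... use rec.sensorId(), rec.temperature()
  }
}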

LCCD

● In principle the LCIO/Marlin concept is thread-safe - if it were not for the change-listener pattern used for conditions handling. Is there a better way, e.g. 'collection has changed' flags?
● The only available database implementation for LCCD is based on the (non-maintained) CondDBMySQL. How to provide 'code reliability' here? Who takes care of fixes, e.g. removing the I/O overhead to the DB server?
● Scalability: conditions for a full-scale detector will be huge (current HCal calibration: ~50 floats per channel). Need clever memory / CPU / I/O balancing in the future, preferably steerable.
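The 'collection has changed' flag alternative from the first bullet could look roughly like this; the interface is purely hypothetical and not part of LCCD or Marlin. The processor polls the flag once per event instead of receiving a callback from the conditions update.

// Purely illustrative sketch of the 'collection has changed' flag idea
// (hypothetical interface, not part of LCCD): instead of a change listener
// that is called back during the conditions update, the processor polls a
// flag and re-reads the collection when needed.
#include "EVENT/LCCollection.h"
#include <atomic>

class FlaggedConditionsHandle {
public:
  // the framework side sets the new collection and raises the flag
  void set( EVENT::LCCollection* col ) {
    _col = col;
    _changed.store( true );
  }

  // the processor side checks (and clears) the flag once per event
  bool hasChanged() { return _changed.exchange( false ); }
  EVENT::LCCollection* collection() const { return _col; }

private:
  EVENT::LCCollection* _col { nullptr };
  std::atomic<bool>    _changed { false };
};

// usage inside a processor's event loop (sketch):
void processEventSketch( FlaggedConditionsHandle& calib ) {
  if( calib.hasChanged() ) {
    // re-cache calibration constants from calib.collection()
  }
  // ... apply the cached constants to the hits of this event
}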

Alignment

● For PFlow calorimeters, cells are smaller than the Molière radius => mis-alignment in data affects the energy distribution over cells.
● At least for test beams: this mis-alignment also needs to be present in MC in order to do cell-to-cell comparisons (crucial to prove detector understanding, at least for the Tile HCal).
● Thus: alignment is conditions data and should be handled using LCCD interfaces.
● Closely linked to the geometry description (e.g. cell neighbours, cell positions, ...).
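As a small illustration of "alignment is conditions data", the sketch below reads per-module offsets from a conditions collection of LCGenericObjects and applies them to hit positions; the data layout is hypothetical, not the CALICE implementation.

// Hypothetical layout: per-module alignment offsets stored as
// LCGenericObjects (int moduleId; floats dx, dy, dz) in a conditions
// collection, applied to reconstructed hit positions.
#include "EVENT/LCCollection.h"
#include "EVENT/LCGenericObject.h"
#include <map>

struct Offset { float dx, dy, dz; };

std::map<int, Offset> readAlignment( EVENT::LCCollection* conditionsCol ) {
  std::map<int, Offset> offsets;
  for( int i = 0; i < conditionsCol->getNumberOfElements(); ++i ) {
    auto* obj = dynamic_cast<EVENT::LCGenericObject*>(
        conditionsCol->getElementAt(i) );
    if( !obj ) continue;
    offsets[ obj->getIntVal(0) ] =
        Offset{ obj->getFloatVal(0), obj->getFloatVal(1), obj->getFloatVal(2) };
  }
  return offsets;
}

// shift a hit position by the offset of its module (if any)
void alignHit( int moduleId, float pos[3],
               const std::map<int, Offset>& offsets ) {
  auto it = offsets.find( moduleId );
  if( it == offsets.end() ) return;
  pos[0] += it->second.dx;
  pos[1] += it->second.dy;
  pos[2] += it->second.dz;
}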

Geometry

● GEAR is NOT a generic geometry interface; it is the geometry parameterisation used for ILD.
● Currently simulation-driven: the geometry is defined in MOKKA and fed to the reconstruction via an XML file. This is not a useful ansatz for test beam setups (which are very flexible), and most probably also not for a full-scale detector with floating alignment.
● Would want an LCIO-embedded data model that is good for calibration, simulation and analysis alike. Something more object-oriented (the hit has a pointer to its cell, rather than a complex 'find cell by index from somewhere') would be much easier to use in the end, but this is re-discussing basic LCIO paradigms...
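The "hit has a pointer to its cell" idea from the last bullet could look like the following purely hypothetical sketch (not an existing LCIO or GEAR interface), in contrast to decoding a cell index and querying a separate geometry service:

// Purely hypothetical sketch of the object-oriented access pattern asked for
// above: the hit carries a pointer to its cell, which knows its own position
// and neighbours.
#include <vector>

struct Cell {
  double x, y, z;                    // cell centre from geometry + alignment
  std::vector<const Cell*> neighbours;
};

struct Hit {
  double       energy;
  const Cell*  cell;                 // direct link into the geometry model
};

// e.g. summing the energy in a hit's direct neighbourhood becomes trivial:
double neighbourhoodEnergy( const Hit& hit, const std::vector<Hit>& allHits ) {
  double sum = hit.energy;
  for( const Hit& other : allHits )
    for( const Cell* n : hit.cell->neighbours )
      if( other.cell == n ) sum += other.energy;
  return sum;
}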

Next Generation Prototypes – EUDET Modules

- From the beginning, a coherent interface between DAQ and offline processing (D. Decotigny et al.); LCIO will remain the backbone!
- Consistent handling of low-level data; coordinated handling of potentially frequent changes in startup phases
- Will continue to apply and help to develop the ILC software tools
- Need for a geometry package, consistent between data and simulation

Summary and Outlook

- CALICE uses the ILC software for the processing of testbeam data: ILC data taking in a (big) nutshell. This allows users to switch more easily between testbeam data analysis and physics/simulation studies.
- CALICE systematically uses Grid tools, around the clock (24/7) during the CERN and FNAL testbeams.
- Experience with testbeam data clearly reveals the need for a coherent concept to handle 'low-level' data within the ILC software; the effort will continue with the EUDET modules on an even broader basis.
- Using ILC software tools already in testbeams is (in the meantime) a well-established and accepted concept: CALICE did/does not only hardware prototyping but also 'computing prototyping'.
- Computing benefits from the collaborative effort and from the application of the ILC software tools.