Presentation transcript:

GlueX Offline Software: Preparing for Big Data Volumes on a Small Manpower Budget

David Lawrence, JLab
CHEP10, Taipei -- Oct. 21, 2010
Parallel Session 53: Software Engineering, Data Stores, and Databases

CEBAF at JLab is (soon to be) a 12 GeV e- continuous-wave* beam facility in Newport News, Virginia
*2 ns bunch structure

Data Rates in Some Modern Experiments

  Site   Experiment   Front-End DAQ Rate   Event Size   L1 Trigger Rate   Bandwidth to Mass Storage
  JLab   GlueX        3 GB/s               15 kB        200 kHz           300 MB/s
  JLab   CLAS12       100 MB/s             20 kB        10 kHz            100 MB/s
  LHC    ALICE        500 GB/s             2.5 MB       200 kHz           200 MB/s
  LHC    ATLAS        113 GB/s             1.5 MB       75 kHz            300 MB/s
  LHC    CMS          200 GB/s             1 MB         100 kHz           100 MB/s
  LHC    LHCb         40 GB/s              40 kB        1 MHz             100 MB/s
  BNL    STAR         50 GB/s              80 MB        600 Hz            450 MB/s
  BNL    PHENIX       900 MB/s             ~60 kB       ~15 kHz           450 MB/s

Sources: CHEP2007 talk / Sylvain Chapeland, private comm.; Jeff Landgraf, private comm., Feb. 11, 2010 (STAR); CHEP2006 talk, Martin L. Purschke (PHENIX).
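The front-end DAQ rate column is just the L1 trigger rate multiplied by the event size (for GlueX: 200 kHz × 15 kB ≈ 3 GB/s). Below is a minimal C++ sketch of that arithmetic using only the GlueX numbers from the table; the ~1e7 live seconds per year is an assumed figure, chosen because it reproduces the 3 PB/yr quoted on the next slide.

```cpp
// Consistency check of the DAQ numbers quoted in the table.
// All rates are from the slide; the live time per year is an assumption.
#include <cstdio>

int main() {
    const double l1_rate_hz    = 200e3;   // GlueX L1 trigger rate (200 kHz)
    const double event_size_b  = 15e3;    // GlueX event size (15 kB)
    const double to_tape_bps   = 300e6;   // bandwidth to mass storage (300 MB/s)
    const double live_s_per_yr = 1e7;     // assumed live seconds per year

    const double front_end_bps = l1_rate_hz * event_size_b;  // 3e9 B/s
    std::printf("front-end rate   : %.1f GB/s\n", front_end_bps / 1e9);
    std::printf("reduction to tape: factor %.0f\n", front_end_bps / to_tape_bps);
    std::printf("to tape per year : ~%.1f PB\n", to_tape_bps * live_s_per_yr / 1e15);
    return 0;
}
```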

CW beam → CD data (continuous wave → continuous digitization)

[Slide graphics: Crate Trigger Processor, F1TDC, signal distribution board]

Electronics:
- All digitization electronics are fully pipelined
- VME64x-VXS crates
- F1TDC (60 ps, 32 ch. or 115 ps, 48 ch.)
- 125 MHz fADC (12 bit, 72 ch.)
- 250 MHz fADC (12 bit, 16 ch.)
- Maximum trigger latency ~3 µs
- 3 GB/s readout from front end
- 300 MB/s to mass storage
- 3 PB/yr to tape

Total digital sum of roughly 4000 calorimeter channels presented to the L1 trigger every 4 ns!
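The "every 4 ns" figure is simply the period of the 250 MHz fADC clock, and the ~3 µs maximum trigger latency fixes how many samples each pipelined channel must buffer while the L1 decision is made. A small sketch of that back-of-envelope arithmetic (derived relationships only, not actual firmware parameters):

```cpp
// Back-of-envelope pipeline arithmetic implied by the slide.
#include <cstdio>

int main() {
    const double fadc_clock_hz = 250e6;  // 250 MHz flash ADC
    const double latency_s     = 3e-6;   // max L1 trigger latency (~3 us)

    // Period at which the calorimeter digital sum reaches the L1 trigger:
    std::printf("sum period  : %.0f ns\n", 1e9 / fadc_clock_hz);            // 4 ns

    // Minimum pipeline depth buffered per channel during the L1 decision:
    std::printf("buffer depth: %.0f samples\n", latency_s * fadc_clock_hz); // 750
    return 0;
}
```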

Software Manpower

Estimating manpower for software is notoriously difficult in a field where the developers are also users: time spent on developing vs. using is interspersed, making it hard to estimate the overall time spent on either. Models have been developed using source lines of code, but these are sensitive to individual programming style, language, and the nature of the code. Of course, that doesn't stop physicists from using them!

  Project                   Estimated man-years
  CMS (LHC)                 1020
  BaBar (SLAC)              926
  CDF (FermiLab)            918
  CLEO (Cornell)            319
  CLAS (JLab)               53
  GlueX (JLab, projected)   40

Estimates based on lines of source code from a survey. (BaBar, CDF, CLEO, and CMS figures courtesy L. Sexton-Kennedy, 1/11/2007.)

Number of major detector systems, including trigger and DAQ: BaBar = 8, CLAS = 7. Why would BaBar need 18 times as much manpower for software as CLAS?
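The SLOC-based models alluded to above are typified by Boehm's COCOMO; that identification is an assumption, since the slide does not say which model produced the table. A minimal sketch of the basic COCOMO "organic mode" formula, with a purely hypothetical code-base size:

```cpp
// Basic COCOMO "organic mode" estimate (Boehm, 1981):
//   effort [person-months] = 2.4 * (KSLOC)^1.05
// The 500 kSLOC input is hypothetical, for illustration only.
#include <cmath>
#include <cstdio>

int main() {
    const double ksloc     = 500.0;
    const double effort_pm = 2.4 * std::pow(ksloc, 1.05);  // person-months

    std::printf("%.0f kSLOC -> %.0f person-months (~%.0f man-years)\n",
                ksloc, effort_pm, effort_pm / 12.0);
    return 0;
}
```

Such models count only code volume, which is exactly why the caveat about style, language, and the nature of the code matters.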

Size doesn't really matter… (right?)

  Collaboration   Approx. Size
  ATLAS           3000
  CMS             3000
  ALICE           1000
  LHCb            700
  BaBar           600
  STAR            515
  PANDA           450
  PHENIX          430
  CLAS            200
  MINERvA         85
  GlueX           65

"… you go to war with the army you have, not the army you might want or wish to have at a later time." - Donald Rumsfeld (former US Secretary of Defense… and jerk)

[Slide cartoon: "Coder's Ambition" vs. "Time to complete project"]

Software is the manpower expansion tank

What do you do with lots and lots of collaborators? People interested in participating in an experiment must be allowed to contribute to it in some way. If more people are involved than there are hardware projects available, they must find some other way to contribute. This is not such a bad thing: software tends to be a flexible source of projects.

On the other hand, if fewer people are involved, then manpower for software is limited and more code is borrowed (or licensed) from other places. This is fine too, since it allows an experiment to "adopt" code that took tens or hundreds of (wo)man-hours to develop by investing only a few of its own.

Recycling

Reduce the need for development by borrowing code or ideas wherever appropriate:
- Successful ideas minimize R&D and the risk of failure (or worse, being stuck with an inefficient system)
- Recycling code can translate directly into (wo)man-hours saved
- Caveat: must take over maintenance for the life of the experiment

Examples from GlueX:
- Incorporated reconstruction code from KLOE for the barrel calorimeter
  - Gave early results for simulation studies
  - Eventually converted to C/C++ (from FORTRAN) and optimized
- Adopted the KTKinematicData class from CLEO(-II?)
  - Switched from using CLHEP to ROOT linear algebra (see the sketch below)
- Developed HDDS based on the ATLAS AGDD XML-based geometry description
  - Modified to accommodate repeating structures
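To illustrate the CLHEP-to-ROOT linear-algebra switch noted above, here is a hedged sketch of what a ported kinematic-data holder might look like. KinematicData and its members are hypothetical names, not the actual GlueX or CLEO class; only TVector3, TLorentzVector, and TMatrixDSym are real ROOT types (the CLHEP counterparts being Hep3Vector, HepLorentzVector, and HepSymMatrix).

```cpp
// Hypothetical sketch: a CLEO-style kinematic-data holder after porting
// from CLHEP to ROOT linear algebra. Illustrative only.
#include "TVector3.h"
#include "TLorentzVector.h"
#include "TMatrixDSym.h"

class KinematicData {
public:
    KinematicData() : fErrMatrix(7) {}  // 7x7 covariance: (px,py,pz,E,x,y,z)

    void SetMomentum(const TVector3 &p, double mass) {
        fP4.SetVectM(p, mass);          // was HepLorentzVector in CLHEP
    }
    void SetPosition(const TVector3 &x)  { fPos = x; }
    void SetError(const TMatrixDSym &m)  { fErrMatrix = m; }

    const TLorentzVector &Momentum() const { return fP4; }
    const TVector3       &Position() const { return fPos; }
    double                Mass()     const { return fP4.M(); }

private:
    TLorentzVector fP4;         // 4-momentum
    TVector3       fPos;        // production/measurement point
    TMatrixDSym    fErrMatrix;  // covariance matrix
};
```

One common motivation for such a switch is dependency reduction: the ROOT classes serialize directly through ROOT I/O, so no separate CLHEP library has to be maintained alongside them.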

Fears for Tiers (or "Who wants the headache of ruling the world?")

Motivation for a tier-based distribution system:
- Ameliorates a technical problem
- Solves a political one
Both of these arise due to large numbers of collaborators.

GlueX has no formal plan for a tier-based distribution system. GRID-based tools are planned for Partial Wave Analysis, using summary files* as input.

*First-pass reconstruction done at JLab, with uninteresting events filtered out

Summary

The GlueX experiment will operate with a collaboration that is 7 to 47 times smaller than experiments with similar data rates to tape. This is only possible by focusing limited software (wo)manpower on necessities and borrowing code as needed. (An unexpectedly long lead time helps too!)

N.B. Collaborators wishing to have a big impact on a very fundamental physics experiment by contributing software are welcome!

Backup Slides
