CHEP03 P5, 3/24/03 — Upgrade of Belle DAQ System. Ryosuke Itoh, KEK, representing the Belle DAQ group.

Upgrade of Belle DAQ System
Ryosuke Itoh, KEK, representing the Belle DAQ group
1. Introduction
2. Requirements
3. Design
4. Software Trigger and Event Reduction
5. Summary

1. Introduction
Belle Experiment: B-factory experiment at KEK in Japan
- study of CP violation in the B meson system
- precise measurement of the CKM matrix elements
Unitarity triangle: angles φ1, φ2, φ3; sides V_cd V_cb*, V_ud V_ub*, V_td V_tb*
→ requires very high statistics of B meson decays
Latest result: sin2φ1 = 0.82 ± 0.12 ± 0.05 (ICHEP02), obtained with 78/fb = 85M BB pairs

Peak luminosity: 8.3 × 10^33 /cm²/sec
Recorded luminosity / day: 457.8/pb
Total recorded luminosity: >120/fb
→ all world records!

However, we need more luminosity for
* precise determination of φ2, φ3, and V_ub
* searches for new physics in rare B decays through penguin diagrams
→ need >3000/fb (cf. ~120/fb up to now, which took 3 years)
A drastic upgrade of the accelerator (KEKB) is necessary: SuperKEKB
Target luminosity: >10^35 /cm²/sec (current maximum: ~10^34)
An upgrade of the detector and DAQ is also necessary
→ writing a LoI aiming at the upgrade in 2006

2. Requirements

                          Belle       SuperKEKB (10^35)
Physics trigger rate      100 Hz      1 kHz
Maximum trigger rate      500 Hz      10 (-30) kHz
Event size at L1          40 KB/ev    ~300 KB/ev
Data flow rate at L1      20 MB/s     >2 GB/s (>5 GB/s w/o SVD trg.)
Data flow at storage      10 MB/s     240 MB/s
Reduction factor in DAQ   ~1/2        ~1/10

How to achieve a higher event-reduction factor → the key of the DAQ upgrade
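As a back-of-envelope check of the requirements, the data-flow figures follow from trigger rate × event size; a minimal Python sketch (the rates and sizes are the slide's figures, the function name is ours):

```python
def flow_mb_per_s(rate_hz, event_kb):
    """Data flow in MB/s for a given trigger rate and event size."""
    return rate_hz * event_kb / 1000.0

# Belle: 500 Hz x 40 KB/ev -> 20 MB/s at L1
belle_l1 = flow_mb_per_s(500, 40)

# SuperKEKB: 10 kHz x ~300 KB/ev -> ~3 GB/s at L1 (the ">2 GB/s" scale)
super_l1 = flow_mb_per_s(10_000, 300)

# Reduction needed to fit the 240 MB/s storage budget: ~1/12, i.e. <1/10
needed_reduction = 240.0 / super_l1
```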

Upgrade Strategy
- Pipelined readout is essential
  (the gate-and-delay method of Belle → dead time = 100% at 10 kHz)
- Use common electronics as much as possible
  → CoPPER (Common Readout Platform)
  * unified handling of pipelined readout using an on-board PC module
  * details are covered in tomorrow's P5 session (Higuchi) and the poster session (Igarashi)
- Scalability
  → to keep up with the gradual increase in luminosity

Expected event size at L1

              Belle            SuperKEKB
Pixel         -                ~100 KB
SVD           15 KB            ~30 KB
CDC           6 KB             ~10 KB
PID           3 KB (TOF/ACC)   ~20 KB
ECL           8 KB             ~100 KB
KLM           3 KB             ~3 KB
TRG/others    3 KB             ~3 KB
Total         ~40 KB           ~300 KB

* Pixel: event size compression possible
* ECL: waveform sampling to obtain the required resolution (~10 buckets/hit × 12 bit) → can be reduced to 1/5 by feature extraction
* Others: event size compression using word packing / "zip"
→ Event processing on CoPPER: ~300 KB → ~100 KB/ev
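The SuperKEKB sizes sum to the quoted ~300 KB total; a quick Python cross-check (sizes from the slide, the 1/5 ECL factor from the feature-extraction note):

```python
# SuperKEKB per-subdetector L1 event sizes in KB, from the slide
superkekb_kb = {
    "Pixel": 100, "SVD": 30, "CDC": 10, "PID": 20,
    "ECL": 100, "KLM": 3, "TRG/others": 3,
}
total_kb = sum(superkekb_kb.values())  # ~266 KB -> the quoted ~300 KB

# ECL waveforms can be reduced to ~1/5 by feature extraction on CoPPER
after_ecl_fe = total_kb - superkekb_kb["ECL"] * (1 - 1 / 5)
```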

3. Design

Dataflow, based on current Belle's event builder:
~1000 CoPPERs → ~50 Readout PCs → ~10 Event Building Farms → ~10 L3 Farms → mass storage
  L2/size reduction: ~1 GB/sec → L2.5: ~500 MB/sec → L3: ~250 MB/sec
  300 KB/ev → 100 KB/ev across the transfer network

Event Building
Current system at Belle: switchless event-building farm
- based on point-to-point 100Base-TX/GbE connections
- working very stably under the current experimental conditions
→ Use this system as a "unit"; have multiple units operated in parallel

Event transfer network
[Diagram: CDC readout PCs 1..N feed Event Builder Units (up to ~10) over 100Base-TX (~10 100Base-TX ports + 1 1000Base-T port), with 1000Base-T uplinks through a network switch]
* Belle's event builder assumes the event fragment from one detector is fed into one NIC on a layer-1 PC (1 readout subsystem per detector at Belle → point-to-point)
* Upgraded system: event fragments from one detector are provided by many readout PCs (up to ~10 per detector)
→ No big network switch → cost effective
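The unit-parallel scheme works because every fragment of one event is steered to the same event-builder unit; a sketch, assuming an illustrative routing by event number (not necessarily Belle's actual rule):

```python
N_UNITS = 10  # up to ~10 event-builder units, as above

def route(fragments, n_units=N_UNITS):
    """Group event fragments (event_no, detector, payload) by destination
    unit; all fragments of one event land on the same unit, so each unit
    can build complete events independently, with no big central switch."""
    units = [[] for _ in range(n_units)]
    for ev, det, payload in fragments:
        units[ev % n_units].append((ev, det, payload))
    return units
```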

Reconstruction Farm
More processing power is required to achieve more reduction → real-time event reconstruction for the L3 trigger
- PC servers (2-4 CPUs each), ~30 servers per unit, GbE/FE switches
- structure: layer 1 → layer 2 → layer 3 / input distributor
- input: ~125 MB/sec/unit (100 KB/ev at 1.25 kHz/unit); output: ~65 MB/sec (100 KB/ev at ~600 Hz)
- per node: ~5 MB/sec in, ~3 MB/sec out (incl. DST data)
- 1 unit ≈ the processing power needed for L = 10^34 /cm²/sec

Test Bench at Belle: the Belle Event Building Farm
- 28 × 1U dual Athlon MP servers
- 2 × (2×P3 + 80 GB×2) disk nodes, GbE-SX
- 3Com Switch 4400 (100Base-TX), control PC on DAQnet
- GbE-LX uplink (Planex FMG-226SX / 3Com Switch 4400, GbE-SX) to the Computing Center, which works as a disk/memory cache
- placed in the "server room"; development in progress

Storage
Belle: currently using a high-speed tape device with robot (DTF/PetaSite)
* SONY gave up releasing faster DTF drives
* the market is small → expensive
- Recent disks are much faster than tape drives
* e.g. Dell/EMC CX600 / Fujitsu ETERNUS: 200 MB/sec (2 Gbps FibreChannel I/F)
* a preliminary test of a prototype (borrowed from a vendor) shows >70 MB/sec read/write speed over 1 Gbps FC (cf. DTF: 24 MB/sec)
→ Record data on disk directly, with parallel data streams
- R&D has been started with the computing people; will be tested in the Belle environment next FY
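The quoted speeds fix the multiplicity of parallel streams; a sketch using the slide's figures (240 MB/s storage target from the requirements, ~70 MB/s per FC disk stream, 24 MB/s per DTF drive):

```python
import math

STORAGE_MB_S = 240      # required storage bandwidth (requirements slide)
DISK_STREAM_MB_S = 70   # measured prototype FC disk read/write speed
DTF_MB_S = 24           # current DTF tape drive

disk_streams = math.ceil(STORAGE_MB_S / DISK_STREAM_MB_S)  # 4 parallel streams
tape_drives = math.ceil(STORAGE_MB_S / DTF_MB_S)           # 10 DTF drives
```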

4. Software Trigger / Event Reduction
- Event reduction is very important in high-intensity experiments to keep mass storage manageable.
- We need a versatile and powerful software trigger / event-size reduction scheme to obtain a reduction factor of <1/10 after the L1 trigger.

1) Level 2 trigger (on CoPPER modules)
- event trigger after the pipelined readout
- the trigger signal is generated by dedicated hardware (e.g. SVD trg) → latency < ~50 μs (cf. L1 latency: ~10 μs)
- the trigger signal is distributed to CoPPER via the timing logic together with an event tag
- software running on the CoPPER CPU rejects events by looking at the trigger event tag
Trigger rate reduction: ~1/3-1/5 (30-50 kHz → 10 kHz)
Event size reduction: 1 (~300 KB)
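The L2 rejection step amounts to matching event tags against the hardware decision; a minimal sketch with a hypothetical interface (the real CoPPER software is not shown here):

```python
def l2_filter(events, rejected_tags):
    """Drop events whose trigger event tag was rejected by the hardware
    L2 decision distributed via the timing logic. `events` is an iterable
    of (event_tag, data) pairs; interface names are illustrative."""
    rejected = set(rejected_tags)
    for tag, data in events:
        if tag not in rejected:
            yield tag, data
```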

2) Event processing on CoPPER / Readout PC
- software data sparsification
  * feature extraction for waveform sampling
  * event size compression by various methods (bit squeezing, zip, etc.)
- raw data formatting (to Panther / ROOT I/O (?))
- CoPPER is a Linux-operated PC on a board → possibility of versatile event-data processing
Trigger rate reduction: 1 (10 kHz)
Event size reduction: 1/3 (~300 KB → 100 KB)
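Both reduction methods are standard techniques; an illustrative sketch (a toy peak-finding feature extraction and zlib as the generic "zip" compressor — the actual algorithms run on CoPPER may differ):

```python
import zlib

def extract_feature(samples):
    """Toy feature extraction: reduce a sampled waveform (~10 buckets/hit)
    to (peak amplitude, peak time) instead of shipping every sample."""
    peak = max(samples)
    return peak, samples.index(peak)

def compress_fragment(raw: bytes) -> bytes:
    """Generic 'zip'-style event-size compression of a raw fragment."""
    return zlib.compress(raw)
```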

3) Level 2.5 trigger
- software trigger using partially built event data (data from one subdetector or from several related subdetectors)
- current Belle's "L3" scheme can be used: fast tracking + hardware trigger information (Belle)
Trigger rate reduction: 1/2 (10 kHz → 5 kHz)
Event size reduction: 1

4) Level 3 trigger
- software trigger using fully built and fully reconstructed data
- trigger at the level of a "physics skim"
  * hadronic event selection
  * selection of specially interesting events

Power of event reduction by "physics skim" at Belle (fraction of events after L2.5):
  Hadronic: 14.2%
  τ/2-photon: 9.6%
  Monitor (= e+e-, μ+μ-, etc.): ~1% (can be scaled)
Trigger rate reduction: 1/4 (→ ~2 kHz)
Event size reduction: 1 (+ reconstruction info (~100 KB/ev))
* The data flow rate will increase by a factor of 2 if we keep the reconstruction info together on storage → requires more multiplicity in storage
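The quoted skim fractions indeed sum to roughly a quarter of the L2.5 output; a one-line check in Python (fractions from the list above):

```python
# Fractions of events surviving the "physics skim", from the slide
fractions = {"hadronic": 0.142, "tau/2photon": 0.096, "monitor": 0.01}
kept = sum(fractions.values())  # ~0.25 -> the quoted ~1/4 rate reduction
```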

5. Summary
- An upgrade of the B-factory at KEK (KEKB) is being planned to achieve a >10× luminosity increase, hopefully in 2006.
- The design of a new DAQ system to cope with a >10 kHz trigger rate at an event size of 300 KB is in progress, based on the system currently used at Belle.
- The key issue in the design is reducing the data flow at storage to less than 1/10 of that at L1. The reduction is feasible through widely distributed processing, from the readout modules to the reconstruction farm.
- Other R&Ds (timing distribution, data monitoring, etc.) are also going on.
Stay tuned!

Backup Slides

Detector Electronics Quick Summary
- SVD: CMS APV25 chip → promising!
- Pixel: candidate = ALICE/BTeV chip → too slow! large data size: 4 KB (binary) / 30 KB (8-bit analog) + 90 KB position
- CDC: 3 approaches: 1) ADC with waveform sampling; 2) TDC only, with charge-to-time conversion (ASDQ); 3) TDC + FADC
- ECL: waveform sampling needed to avoid pile-up effects (12 bit for barrel, 20 MHz(?) for pure CsI)
- TOP/RICH: need to manage pixel photo-detectors * time stretcher (Varner) / AMT (Arai), analog pipeline (Ikeda)
- KLM: readout scheme is not much different from Belle's, regardless of the choice of detection device (RPC / scintillator tile) * TDC-based multiplexing + on-board data compression
→ All electronics will be implemented as "FINESSE" daughter boards on CoPPER.

Pipeline and CoPPER
[Diagram: a free-running clock drives the L1 pipeline feeding ADC/TDC "FINESSE" daughter cards; the trigger gates data into a FIFO on the "CoPPER" Common Readout Platform, read out by the on-board CPU over a network/serial bus]
- The FINESSE part is supplied by each subdetector group as daughter cards
* Parallel Session 5, Tuesday 15:00- * Poster Session

Readout PC
- Linux-operated PC with NIC / serial bus
- collects event fragments from 1 CoPPER crate (containing ~20 CoPPER boards)
- CoPPER has a built-in PCI (PMC) CPU board → various choices for the connection to the Readout PC: 100Base-TX, GbE, USB 2.0, IEEE 1394, etc.
* possible choice: point-to-point connection using 100Base-TX (no switch), with multi-port 100Base-TX NICs on the PC (4 ports × 5 cards) ← we have experience from the current event-building farm
- Sub-event building is performed on the Readout PC
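Sub-event building on the Readout PC is a merge of fragments keyed by event tag; a minimal sketch, assuming ~20 boards per crate and an illustrative data layout:

```python
from collections import defaultdict

N_BOARDS = 20  # CoPPER boards per crate, as above

def build_subevents(fragments, n_sources=N_BOARDS):
    """Merge fragments (event_tag, board_id, data) from one CoPPER crate.
    A sub-event is complete once every board has contributed; completed
    sub-events are returned in completion order."""
    pending = defaultdict(dict)
    complete = []
    for tag, board, data in fragments:
        pending[tag][board] = data
        if len(pending[tag]) == n_sources:
            complete.append((tag, pending.pop(tag)))
    return complete
```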

DAQ Upgrade Schedule
● Target year of the SuperKEKB upgrade: 2006
● 1 full prototype (CoPPER to storage) by the end of FY2003:
  - CoPPER prototype → full testbench → CoPPER crate readout → mass production (LoI/fall)
  - TDC/ADC FINESSE reference → prototypes of subdetector FINESSE → test in a crate
  - TTRX prototype → system test; SEQ/TTD
  - Reconstruction Farm / Storage prototype → recon farm operation in Belle → backend testbench
  - EFC with CoPPER readout → CDC with CoPPER → full upgrade (MU06)
