CWG13: Ideas and discussion about the online part of the prototype
P. Hristov, 11/04/2014


DAQ lab machines (slide 3)
- 8 machines
  - Sandy Bridge-EP, dual E5, 64 GB RAM
  - 2x8 hardware cores, 32 threads
- Network
  - 4x with 40G Ethernet
  - 4x with 10G Ethernet

Naive test (with ZeroMQ?) (slide 4)
[Diagram: three FLP nodes (FLP1-FLP3), each containing a data source (DDL simulator), buffering, a data-reduction simulator, and a data "sender"; three EPN nodes (EPN1-EPN3), each containing a data "receiver", a time frame builder, and an algorithm simulator; the EPNs feed a storage simulator and a DQM simulator.]
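A minimal sketch of one FLP-to-EPN link from the diagram, written against raw ZeroMQ (cppzmq) rather than the ALFA/FairMQ device API; the PUSH/PULL pattern choice, the endpoint/port, and the payload size are illustrative assumptions, not part of the actual prototype.

```cpp
// One FLP -> EPN link as a PUSH/PULL pair (sketch, assumptions noted above).
#include <zmq.hpp>
#include <iostream>
#include <vector>

// FLP side: stands in for "data source (DDL sim) + buffering + sender".
void RunFlpSender(zmq::context_t& ctx)
{
  zmq::socket_t sender(ctx, zmq::socket_type::push);
  sender.connect("tcp://localhost:5555"); // EPN endpoint (assumed)

  std::vector<char> payload(1024 * 1024, 0); // 1 MB stand-in for reduced data
  sender.send(zmq::buffer(payload), zmq::send_flags::none);
}

// EPN side: stands in for the "receiver" feeding the time frame builder.
void RunEpnReceiver(zmq::context_t& ctx)
{
  zmq::socket_t receiver(ctx, zmq::socket_type::pull);
  receiver.bind("tcp://*:5555");

  zmq::message_t msg;
  if (receiver.recv(msg, zmq::recv_flags::none))
    std::cout << "received " << msg.size() << " bytes\n";
}
```

PUSH/PULL fits this path because it load-balances across multiple EPN pullers and applies back-pressure when a receiver falls behind, which is roughly what the FLP-to-EPN fan-out needs.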

Goals for a naïve test (slide 5)
- Get experience with the ALFA software (currently in FairRoot), ZeroMQ, etc.
- Become familiar with the online computing (especially for the offline people)
- Perform some initial measurements: networking throughput, latency, performance with different message sizes, ... (a toy benchmark sketch follows below)
- Provide a "sandbox" for the development of the online components:
  - logging system
  - control system
- Test the access patterns and the distribution of produced/requested information, e.g. calibration, geometry, etc.
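A toy version of the measurements bullet: a throughput scan over a few message sizes on a single PUSH/PULL link. The inproc endpoint, message count, and sizes are assumptions; a real test would use tcp:// endpoints between the lab nodes.

```cpp
// Throughput scan over message sizes on one PUSH/PULL link (sketch).
#include <zmq.hpp>
#include <chrono>
#include <cstdio>
#include <vector>

int main()
{
  zmq::context_t ctx(1);
  zmq::socket_t push(ctx, zmq::socket_type::push);
  zmq::socket_t pull(ctx, zmq::socket_type::pull);
  pull.bind("inproc://bench");   // in-process link; swap for tcp:// on the network
  push.connect("inproc://bench");

  const int kMessages = 10000;   // assumed sample size
  for (size_t size : {size_t(1) << 10, size_t(1) << 16, size_t(1) << 20}) {
    std::vector<char> buf(size, 0);
    const auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < kMessages; ++i) {
      push.send(zmq::buffer(buf), zmq::send_flags::none);
      zmq::message_t msg;
      (void)pull.recv(msg, zmq::recv_flags::none);
    }
    const double dt =
        std::chrono::duration<double>(std::chrono::steady_clock::now() - t0).count();
    std::printf("%8zu B: %8.1f MB/s, %6.2f us/msg\n",
                size, kMessages * size / dt / 1e6, dt / kMessages * 1e6);
  }
}
```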

Some questions (slide 6)
- Base (i.e. for components)
  - Common
    - Buffer
    - Networking/messaging (pub/sub, client/server, push/pull, ...) - see the sketch below
  - FLP
    - Data reduction components
      - Clusterization
      - Tracklet finders
    - Calibration0 components
  - EPN
    - Data aggregation component
    - Data reduction/reconstruction components
      - Vertex finding (?)
      - Global tracking
      - Rejection of non-associated clusters / cluster reformatting
      - Huffman encoding
    - Calibration1 components
  - Control/logging
- Which parts of the DAQ and HLT software can be reused in the test?
- Is there any example in the FairRoot repository? example/Tutorial3?
- How do we organize the software in the repository?
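To make the messaging question concrete: PUSH/PULL suits the FLP-to-EPN data path (load balancing plus back-pressure, as sketched earlier), while PUB/SUB suits DQM fan-out, where slow consumers simply miss updates instead of stalling the data flow. Below, the PUB/SUB half in raw ZeroMQ; the topic string and port are illustrative assumptions.

```cpp
// PUB/SUB fan-out for monitoring data (sketch, assumptions noted above).
#include <zmq.hpp>

int main()
{
  zmq::context_t ctx(1);

  // Publisher, e.g. on an EPN: emits monitoring objects tagged with a topic.
  zmq::socket_t pub(ctx, zmq::socket_type::pub);
  pub.bind("tcp://*:5556");
  pub.send(zmq::str_buffer("dqm/tpc"), zmq::send_flags::sndmore); // topic frame
  pub.send(zmq::str_buffer("histogram payload"), zmq::send_flags::none);

  // Subscriber, e.g. a DQM client: receives only the topics it asked for.
  zmq::socket_t sub(ctx, zmq::socket_type::sub);
  sub.connect("tcp://localhost:5556");
  sub.set(zmq::sockopt::subscribe, "dqm/");
}
```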

Proposed actions for the next 2 weeks (slide 7)
- JIRA project, Git workflow (Peter)
- Creation of MC points for ITS and TPC (Charis, Mohammad, Andrei)
- Description of the Run 3 raw data structures (Andrei, Peter)
- Discussion with CWG3: 16/04/14
- Installation of FairRoot/ALFA on the test nodes (Mohammad, Peter)
- Detailed presentation on ALFA (Mohammad, Anar): 25/04/14
- Ideas about the condition data base in Run 3 (Raffaele): 05/14
- Presentation on the ITS & TPC reconstruction components for Run 3: cluster finder, fast tracker, track fit (Sergey, Ruben): 05/14
- Next meeting: 25/04/2014

Backup (slide 8)

O2 Prototype: Complementary approaches (slide 9)
"Offline":
- Simulation (ITS + TPC) in the FairRoot environment
  - Creation of hits (MC points)
  - Use of Geant4 with/without multithreading, comparison (CWG8)
- Digitization in time frames
  - Use of (0MQ) multiprocessing
- Creation of raw data in Run 3 format (CWG4) - a hypothetical container sketch follows below
  - Simulated from digits
  - Emulated from existing raw data
"Online":
- Setup of test nodes (CWG1, CWG12)
- Hardware and software data generation
  - See the "offline" part
- Control, configuration and monitoring (CWG10)
  - ALFA components + control software
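Digitization in time frames and Run 3 format raw data imply some container for the per-link pieces of a time frame. The sketch below is purely hypothetical: every field name and width is an assumption for illustration, not the actual CWG4 data model.

```cpp
// Hypothetical sub-time-frame container (illustrative assumptions only).
#include <cstdint>
#include <vector>

struct SubTimeFrameHeader {
  uint32_t headerSize;   // size of this header in bytes, for format evolution
  uint32_t linkId;       // originating readout link on the FLP
  uint64_t timeFrameId;  // index of the time frame this piece belongs to
  uint64_t firstOrbit;   // first LHC orbit contained in the frame
  uint32_t payloadSize;  // bytes of detector payload following the header
  uint32_t flags;        // e.g. simulated-from-digits vs. emulated-from-raw
};

struct SubTimeFrame {
  SubTimeFrameHeader header;
  std::vector<uint8_t> payload; // serialized digits/clusters
};
```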

O2 Prototype: Complementary approaches (slide 10)
"Offline":
- Use of CWG5 + CWG7 demonstrators to process (reconstruct) the raw data
- Add more detectors (i.e. the ones in triggered mode) in the simulation
  - GOTO digitization, creation of raw data, reconstruction
- Put some calibration algorithms in the processing (CWG6)
"Online":
- Data transport and fan-in, event building (CWG3) - a time-frame aggregation sketch follows below
  - Use of generated raw data
- Data access for processing, monitoring, QA (CWG9)
- Data storage
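The fan-in / event-building step amounts to collecting one sub-time-frame per FLP for each time-frame id and releasing the frame once complete. A minimal sketch follows; kNumFlps, the id type, and the API are assumptions.

```cpp
// Time-frame aggregation on an EPN (sketch, assumptions noted above).
#include <cstddef>
#include <cstdint>
#include <map>
#include <utility>
#include <vector>

constexpr std::size_t kNumFlps = 3; // matches FLP1-FLP3 in the test diagram

using SubTimeFrameData = std::vector<uint8_t>;

class TimeFrameBuilder {
public:
  // Returns a pointer to the complete set of pieces when the last one
  // arrives, nullptr while pieces are still missing.
  std::vector<SubTimeFrameData>* Add(uint64_t tfId, SubTimeFrameData&& piece)
  {
    auto& pieces = fPending[tfId];
    pieces.push_back(std::move(piece));
    return pieces.size() == kNumFlps ? &pieces : nullptr;
  }

  // Caller releases the frame after processing it.
  void Erase(uint64_t tfId) { fPending.erase(tfId); }

private:
  std::map<uint64_t, std::vector<SubTimeFrameData>> fPending;
};
```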

Collaboration with the other CWGs (slide 11)
- CWG 1 - Architecture
- CWG 2 - Tools and Procedures
- CWG 3 - Dataflow and Condition Data
- CWG 4 - Data Model
- CWG 5 - Computing Platforms
- CWG 6 - Calibration
- CWG 7 - Reconstruction
- CWG 8 - Physics Simulation
- CWG 9 - QA, DQM, Visualization
- CWG 10 - Control, Configuration & Monitoring
- CWG 11 - Software Lifecycle
- CWG 12 - Computing Hardware