Online and offline software overview, status and plans. Status and integration to STAR: Trigger, DAQ, Monitoring, Slow Controls, Online Databases, Online/Offline/MC.

Presentation transcript:

Online and offline software overview, status and plans. Status and integration to STAR: Trigger, DAQ, Monitoring, Slow Controls, Online Databases, Online/Offline/MC software.

Trigger

FMS (and FPD and FPD-SMD) are all connected to QT boards and the DSM tree. A full FMS/FPD trigger algorithm document is available:
– High Tower Trigger
– Cluster Sum Trigger
– Multi Cluster Trigger
– Module Sum Trigger (FPD)
– LED trigger
Test bed for the new QT system (now used for other detectors). Full bit-wise check of the DSM tree (now covers the whole DSM tree).
Plan: already fully integrated. Improvements (Jet Trigger with FHC).
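As a rough illustration of the two simplest trigger conditions named above, the sketch below implements a high-tower test (any single tower over threshold) and a cluster-sum test (any 2x2 patch of towers over threshold). The thresholds, patch size, and tower indexing are illustrative assumptions, not the actual QT/DSM settings.

```python
def high_tower_fired(adc, threshold=100):
    """Fire if any single tower ADC exceeds the threshold (illustrative value)."""
    return any(a > threshold for a in adc.values())

def cluster_sum_fired(adc, threshold=180):
    """Fire if any 2x2 patch of towers sums over the threshold.
    adc maps (row, col) -> ADC count; absent towers count as 0."""
    rows = [r for r, _ in adc]
    cols = [c for _, c in adc]
    for r in range(min(rows), max(rows) + 1):
        for c in range(min(cols), max(cols) + 1):
            patch = sum(adc.get((r + dr, c + dc), 0)
                        for dr in (0, 1) for dc in (0, 1))
            if patch > threshold:
                return True
    return False

# Example: no single tower over 100, but one 2x2 patch sums to 190
towers = {(0, 0): 60, (0, 1): 50, (1, 0): 40, (1, 1): 40, (3, 3): 20}
print(high_tower_fired(towers))   # False
print(cluster_sum_fired(towers))  # True
```

The real DSM tree evaluates these conditions in hardware on QT sums; the point here is only the logical shape of the two algorithms.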

DAQ

All QT and DSM tree data are read by L2. FMS (and FPD/FPD-SMD) data are part of the trigger data bank → DAQ file → Trigger Data file (with pre/post data). The Trigger Data file analysis tool (trg2ntp), maintained by the FMS group, is essential and used by many subsystems.
Plan: already fully integrated.

Monitoring Status

Few plots in Pplot as of Run 9; because it samples only a tiny fraction of the data, it was almost useless. Experts run monitoring/reconstruction codes on the trigger data file ~minutes after a run is taken:
– Les's online pi0 reconstructions (off TrgScratch)
– Chris's bitwise check (off L2)
– Hank's monitor (off TrgScratch)
– Len's monitor (off HPSS)
– Steve's analysis (off HPSS)

Monitoring Plan

Improved Pplot to monitor the LED signal. Ensured bandwidth of the LED trigger into EVPool (Jeff said it's easy). This monitors everything: HV, gains, LED, Trigger and DAQ.
Unified and systematic/automated reconstruction codes running. The bottleneck is getting data to RCAS disks (Trigger Data file -> L2 -> TrgScratch -> Online farm & HPSS; HPSS -> "ntuple"s on RCAS). The bottleneck will be HPSS -> RCAS disk speed, and disk space. Minimal CPU: ~1 min / 10k-event file * ~50 files / run * 50 runs / day = 2500 CPU min / day. Gzipped ntuples: ~400 GB from Run 9.
→ User monitor codes (like Len's)
→ Reconstruction iterations (like Les's and Steve's)
Real-time fast-stream MuDst production with StTriggerData (raw trigger data)??? (only if Jerome can be convinced after all those years)
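A quick back-of-envelope check of the CPU budget quoted above, using the slide's own inputs (~1 min per 10k-event file, ~50 files per run, ~50 runs per day); multiplying them out gives 2500 CPU-minutes per day:

```python
# Inputs as quoted on the slide (rough estimates)
min_per_file = 1      # ~1 CPU-minute per 10k-event file
files_per_run = 50    # ~50 files per run
runs_per_day = 50     # ~50 runs per day

cpu_min_per_day = min_per_file * files_per_run * runs_per_day
print(cpu_min_per_day)                    # 2500 CPU-minutes per day
print(round(cpu_min_per_day / (24 * 60), 2))  # ~1.74 CPU-days of work per wall day
```

So a handful of batch slots on the online farm would comfortably keep up with this load.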

Slow Control & Online DB Status

Console/command-line based system (no connection to STAR Slow Controls):
– Large cells: shell scripts to communicate with the LeCroy 1440A
– Small cells: PSU software to communicate with the PSU HV motherboard
Only expert(s) turn HV on/off or change HV. We do not turn HV on/off for beam dump/refill. No HV trips to reset. Zener diode issue. HV alarm = Les Bland (no alarm to the STAR alarm handler).
HV setting file on local disk; QT-LUT-gain file on trigger local disk; HV log files on local disk every ~min = ~10 GB for Run 9 (no connection to the Online DB).

Current FMS HV System (diagram)

Components: fpd.daq.bnl.local; fmsserv.trg.bnl.local (terminal server); LeCroy 1440; cwcontrol.trg.bnl.local; two PSU motherboards; fpdswitch.trg.bnl.local (network power switch).
HV channels: North Top / North Bottom / South Top / South Bottom large cells; North / South small cells; LVs.
Also on this slide: HV operation manual / documentation / HV setting file & logfile locations; HV cable map; QT-LUT-gain.

Slow Control & Online DB Integration Plan

No big change to the HV system itself is planned. It should remain "experts only" to operate HV (Zener diode issue, no trips, no on/off during dump). Instructions to shift crew: watch LED plots in Pplot, call an expert if anything changes; do not turn HV on/off. Who is the "expert(s)" / who is allowed to change HV → see Les's management talk.
Sending log files to the Online DB is not essential and not planned:
– trg.bnl.local bandwidth issue
– the LED monitor will do most of the job
– low priority, but if required, needs ~2 FTE-months
Messaging to the STAR alarm system is not essential and not planned:
– it was stable (there were no HV trips)
– the LED monitor will do most of the job
– low priority, but if required, needs ~2 FTE-months

FMS Reconstruction Code (data-flow diagram)

Raw Data (encoded QT data) → [QT decoder] → decoded QT data → [Mapping & Calibration] → Hit (mapped & calibrated energy) list → [Cluster Finder] → Cluster list → [Shower Shape fit] → Photon list → user analysis (geometry & cuts).
MC: cell list from GSTAR → summing, mapping, inverse calibration & digitization → enters the same chain at the calibration step.
We'll have a 1st-order calibration relatively quickly (online pi0 reconstructions); final calibrations with energy dependence (non-linearity) and run dependence come much later.
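A minimal sketch of the first two stages of the chain above (QT decoding is assumed done; the mapping-and-calibration step turns channel/ADC pairs into a hit list). The channel map, gains, pedestals, and threshold are all hypothetical placeholders, not real FMS constants, and the cluster finder and shower-shape fit are deliberately omitted:

```python
# Hypothetical map: (QT slot, channel) -> tower id, plus per-tower constants
CHANNEL_MAP = {(1, 0): 101, (1, 1): 102, (2, 0): 201}
GAINS = {101: 0.05, 102: 0.06, 201: 0.05}       # GeV per ADC count (made up)
PEDESTALS = {101: 8, 102: 10, 201: 9}           # ADC pedestals (made up)

def make_hits(qt_data, threshold=0.1):
    """Mapping & Calibration: (slot, channel, adc) -> (tower, energy) hit list.
    Hits below the energy threshold (GeV) are dropped."""
    hits = []
    for slot, channel, adc in qt_data:
        tower = CHANNEL_MAP[(slot, channel)]
        energy = GAINS[tower] * (adc - PEDESTALS[tower])
        if energy > threshold:
            hits.append((tower, energy))
    return hits

qt_data = [(1, 0, 108), (1, 1, 11), (2, 0, 49)]
print(make_hits(qt_data))   # tower 102 falls below threshold and is dropped
```

The real code then hands this hit list to the cluster finder and the shower-shape fitter, which is where the two packages historically diverged.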

Online and Offline Analysis Code Status (data-flow diagram)

Experiment → Trigger Data file → [BNL package] → HBOOK ntuple (raw)
Experiment → DAQ file → StEvent (raw) → MuDST (raw) → [PSU package] → HBOOK ntuple (raw + EMC/TPC)
MC: PYTHIA with specialized filter → GSTAR (FPD & FMS geometry) → HBOOK ntuple (MC)
Inputs to the packages: calibration, geometry, map and parameter text files.

BEMC Model (data-flow diagram)

DAQ file, or GSTAR file from the experiment (via g2r) → EMC makers in BFC (DB + ClusterFinder + PointFinder), reading the Offline DB (preliminary) → StEvent (raw + cluster + point) → MuDST (raw + cluster + point).
Re-processing: MuDST → Mudst2StEvent + EMC makers, reading the Offline DB (updated) → StEvent on memory (raw + cluster + point) → MuDST on memory (raw + cluster + point).

Online and Offline Analysis Code Status

Trigger Data file and ntuple from GSTAR → analysis is NOT just "online" monitoring. A large volume of this analysis happens just ~minutes after a run is taken:
– used as "online" monitoring
– used as feedback to HV/LUT gains
– used as "offline" analysis for inclusive physics results, up to final papers
MC saves many CPU hours by not simulating particles into mid-rapidity/east.
MuDST file → analysis: MuDST is produced ~months later; used for FMS-TPC or FMS-BEMC correlation analyses.

Online and Offline Analysis Code Status

The BNL and PSU packages are based on the same reconstruction code ("Yiqun's code"), with some differences because improvements were made after the code was split.
– BNL package = fortran/hbook + paw wrapped; + new cluster finder (Ermes's work) for hole treatment; + energy-dependence/run-dependence corrections; + SMD reconstruction. Code is available: rcas://hank/put/the/file.tar
– PSU package = c++/root wrapped; + code cleanup/re-organization; + new cluster finder. Code is available: rcas://Steve/put/the/file.tar

Analysis Code Plan

Preserve the "Trigger Data file analysis" path:
– established user codes/scripts
– lightweight, for quick turnaround (concern about speed/overhead if we switch)
We want one code to do "trigger data file analysis", "MuDST analysis" and "MC analysis":
– re-merge the "reconstruction (Yiqun) code" in the BNL and PSU packages
– one code in CVS, included in both Makers and the "BNL/PSU packages"
– separate the cluster finder and shower-shape fitting
– add options/switches to allow different codes/versions
Define classes in StEvent/MuDST for:
– raw data
– hit (mapped & calibrated energy) list
– cluster list
– photon list
Map, geometry and calibration in the DB. g2r (never needed a correlated MC sample so far). Manpower.
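The four containers the plan asks for could be sketched as below. Class names and fields are hypothetical illustrations of the intended structure (raw data → hits → clusters → photons), not the actual StEvent/MuDST class layout:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class FmsRawHit:        # raw data: one QT channel readout
    slot: int
    channel: int
    adc: int

@dataclass
class FmsHit:           # mapped & calibrated energy in one tower
    tower: int
    energy: float       # GeV

@dataclass
class FmsCluster:       # output of the cluster finder
    hits: List[FmsHit] = field(default_factory=list)
    def energy(self) -> float:
        return sum(h.energy for h in self.hits)

@dataclass
class FmsPhoton:        # output of the shower-shape fit
    energy: float
    x: float            # position on the detector face
    y: float

cluster = FmsCluster([FmsHit(101, 5.0), FmsHit(102, 3.0)])
print(cluster.energy())   # 8.0
```

Keeping each stage as its own container is what lets the cluster finder and the shower-shape fit be swapped independently, as the plan above requires.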

Plan (data-flow diagram; fast path preserved)

Experiment → Trigger Data file → [BNL package] → HBOOK ntuple (raw)
Experiment → DAQ file → StEvent (raw) → MuDST (raw) → [PSU package] → HBOOK ntuple (raw + EMC/TPC)
MC: PYTHIA with specialized filter → GSTAR (FPD & FMS geometry) → HBOOK ntuple (MC)
Inputs to the packages: calibration, geometry, map and parameter text files.

Plan (data-flow diagram)

DAQ file, Trigger Data file (via a trigger data file reader), or GSTAR file from the experiment (via g2r) → FMS makers in BFC (DB + ClusterFinder + Photon Fit), reading the Offline DB (preliminary) → StEvent (raw + cluster + photon) → MuDST (raw + cluster + photon).
Re-processing: MuDST → Mudst2StEvent + FMS makers, reading the Offline DB (updated) → StEvent on memory (raw + cluster + photon) → MuDST on memory (raw + cluster + photon).

Online Package(s) Plan (data-flow diagram)

Inputs: Trigger Data file, DAQ file (from the experiment), or GSTAR file (via g2r) → StEvent (raw) / MuDST (raw) / HBOOK ntuple (raw) → FMS analysis Maker(s) (Hit + Cluster + Photon list), containing the Cluster Finder and Shower fit and reading Offline DB text files → MuDST on memory (raw + hit + cluster + photons).

Summary of FMS Integration Plan

Trigger – done
DAQ – done
Monitor – LED
Slow Control / Online DB – no need
Software:
– unify to one code
– preserve the "fast" path
– bring FMS raw data, hit, cluster and photon lists to MuDST

Backup

Analysis Code Plan

        FTE-months   Pros                                Cons
Plan 1  0            No work …                           Not integrated …
Plan 2  6???         Fully integrated …                  A lot of work; file size …
Plan 3  2???         Less work; maintains "current"      …
                     path; a step towards Plan 2 …