Online DAQ System: From Detector to Tape
T. Yasuda, Fermilab
NIU Workshop, June 2000

Overview
– Hardware
– Control System
– Primary/Secondary Data Path
– DAQ Applications
– DAQ in Action
– Conclusions

Overview
The DØ DAQ system is divided into two components:
– Trigger system
  – Level 1: hardware trigger components
  – Level 2: specialized processors
  – Level 3: crate readout and software trigger components
– Online or Host system
  – Detector controls
  – Data logging
  – Monitoring
  – Control room applications

Overview
DAQ architecture:
– Event data rate and operational redundancy achieved by a high degree of parallelism in Level 3 and the Host
– Capability for multi-user, multi-stream operation, with a central resource configuration manager
– Network-centric Host design

DAQ Components
[Block diagram: detector; controls crates and trigger-and-readout crates; Level 3 Linux PCs; UNIX servers and NT nodes; control room PCs; FCC]

Hardware Description
3 Compaq/Digital Alpha servers:
– d0ola: Alpha Server 4000, 1 processor, 466 MHz, 500 MB memory
– d0olb: Alpha Server 4000, 2 processors, 600 MHz, 500 MB memory
– d0olc: spec out by Aug 3; probably Alpha Server ES40, 4 processors, 667 MHz
– Clustered / redundant
– 500 GB shared RAID disks for online applications and database (mirrored)
– 500+ GB local 'data buffer' disks, Fibre Channel based (40 MB/s)

Hardware Description
Linux/NT nodes:
– Buying 6 nodes with dual PIII, 600 MHz, 500 MB memory, 2 graphics cards
– 3 Linux nodes and 3 NT nodes exist
– Will run VMware on the Linux nodes
Control system: embedded 68Ks and PowerPCs (VxWorks)
Network:
– Cisco 6509 Gigabit Ethernet switch for all FCH nodes
– Satellite 100 Mbit/s switches in MCH
– Gigabit fiber to FCC
Security:
– Access control filter to online machines
– Kerberos-authenticated ssh sessions only

Control & Monitoring
[Diagram: detector readout and controls crates are reached over 1553 and the Vertical Interconnect from the controls Ethernet; EPICS clients (low voltage, high voltage, rack monitor, 1553 devices, SMT monitor, FT monitor, etc.) run on the UNIX servers and control room PCs; an EPICS DB generator produces the EPICS database from the ORACLE hardware database]

Control System
Built upon the EPICS control system:
– A 'standard' toolkit upon which we've built DØ extensions
– Lots of user-community-supplied tools
ORACLE hardware database:
– Extract EPICS db from ORACLE
– Web-based and batch interfaces
Hardware control:
– Low voltage, high voltage, etc.: dedicated GUI applications
Downloading:
– Registers, pedestals, etc.
Significant Event (alarm) System
Interface to the Accelerator and Cryogenics controls systems
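The "Extract EPICS db from ORACLE" item above (the EPICS DB generator in the control and monitoring picture) amounts to turning hardware-database rows into EPICS record stanzas. The Python sketch below is a rough, hypothetical illustration of that step only: the table name, columns, record template, and example row are invented and do not reproduce the DØ schema or generator.

```python
# Hypothetical sketch of the "EPICS DB generator" step: turn rows from the
# hardware database into EPICS 'ao' record stanzas.  Table, columns, record
# template, and example data are invented, not the DO schema.

RECORD_TEMPLATE = """record(ao, "{name}")
{{
    field(DESC, "HV setpoint, crate {crate} slot {slot}")
    field(EGU,  "V")
    field(DRVH, "{max_volts}")
}}
"""

def rows_from_db(cursor):
    """Yield channel rows from any DB-API cursor (ORACLE in the real system)."""
    cursor.execute("SELECT name, crate, slot, max_volts FROM hv_channels")
    for name, crate, slot, max_volts in cursor.fetchall():
        yield {"name": name, "crate": crate, "slot": slot, "max_volts": max_volts}

def write_epics_db(rows, path):
    """Write one EPICS record stanza per hardware-database row."""
    with open(path, "w") as out:
        for row in rows:
            out.write(RECORD_TEMPLATE.format(**row))

if __name__ == "__main__":
    # Stand-in rows so the sketch runs without a database connection.
    example = [{"name": "SMT_HV_0", "crate": 3, "slot": 7, "max_volts": 100}]
    write_epics_db(example, "example.db")
```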

Status of Control System
Calorimeter:
– Preamp PS, BLS PS, and ADC PS controls exist
– Pulser control (in progress)
SMT:
– EPICS records for Sequencer, Sequencer control, VRB, VRBC, and Emulator (in progress) exist and are used in the test stands
Muon:
– Tested communication over 1553 for PDT, MDT, and SRC cards
FPD:
– Used RM support to control motors

Status of Control System
Luminosity:
– Scalers and front-end processing results communicated to the Accelerator via ACNET
Cryo:
– Communicated with the DMAX system
Common tools:
– Generic 1553 support
– Generic VME support
– HV used in SMT, Muon, and Luminosity; V1 running for months, V2 work starting
– Diagnostic support for buses
– Standard operator interface (GUI)

Hardware Database
– Describes control aspects of the electronics
– Based on ORACLE
  – 2 instances of the database (development, user testing)
– Web-based interface for entering, modifying, and deleting records
– Python script for batch entries exists
– Calorimeter records are in the database
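The batch-entry Python script mentioned above is not shown in the slides; the sketch below is a hypothetical stand-in for what such a script might look like. It reads channel descriptions from a CSV file and inserts them as hardware-database records, using sqlite3 only so the example runs; the real script targets the ORACLE instances, and the table, columns, and CSV layout here are invented.

```python
import csv
import sqlite3  # stand-in so the sketch runs; the production script targets ORACLE

# Hypothetical batch-entry script: read channel descriptions from a CSV file
# and insert them as hardware-database records.  Table name, column names,
# and CSV layout are invented.

INSERT_SQL = (
    "INSERT INTO hardware_channels (name, crate, slot, channel, device_type) "
    "VALUES (?, ?, ?, ?, ?)"
)

def batch_insert(conn, csv_path):
    """Insert one record per CSV line; commit only if every insert succeeds."""
    cur = conn.cursor()
    with open(csv_path) as f:
        for row in csv.reader(f):
            cur.execute(INSERT_SQL, row[:5])
    conn.commit()

if __name__ == "__main__":
    with open("channels.csv", "w") as f:
        f.write("CAL_ADC_0,1,4,0,adc\n")
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE hardware_channels "
                 "(name, crate, slot, channel, device_type)")
    batch_insert(conn, "channels.csv")
    print(conn.execute("SELECT count(*) FROM hardware_channels").fetchone()[0], "row(s) inserted")
```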

Hardware Database
[no text content on this slide]

Significant Event (Alarm) System
System to detect alarm conditions and state changes in the DAQ system:
– Server with DAQ components as clients
– COOR sends alarm and run-control messages
– CR, DL, DD send alarm messages
– Version 1 display exists
– Working on version 2 display (summer student)
– Need to integrate EPICS alarms into the system (Fall 2000)
  – On the IOC: EPICS alarms -> ITC client; the ITC client sends alarms to the server on the host
  – The EPICS Alarm Handler can be used for now
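As a toy illustration of a client reporting into the Significant Event System, the sketch below sends a single alarm message to a server over a TCP socket. The real DØ clients use the ITC task-to-task package; the port number and message fields here are invented and are not the actual protocol.

```python
import json
import socket
import time

# Toy alarm client: report one alarm condition to a central alarm server.
# The real clients use the DO ITC package; host, port, and message layout
# below are invented for illustration, and a server must be listening.

ALARM_SERVER = ("localhost", 9540)

def send_alarm(source, severity, text):
    """Open a connection, send one newline-terminated alarm message, close."""
    message = {
        "time": time.time(),
        "source": source,       # e.g. "HV Control", "Data Logger"
        "severity": severity,   # e.g. "minor", "major"
        "text": text,
    }
    with socket.create_connection(ALARM_SERVER) as sock:
        sock.sendall((json.dumps(message) + "\n").encode())

if __name__ == "__main__":
    send_alarm("HV Control", "major", "crate 42 supply tripped")
```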

Significant Event System
[Diagram: DAQ processes (front-end processes, HV Control, Run Control/COOR, etc.) send SE messages, including periodic heartbeats and run-suspend notices, to the Significant Event Server; per-client filters (F) pass filtered messages on to clients such as the Fault Watcher, Archiver, heartbeat monitor, and Display]

Configuration & Run Control
[Diagram: run-control clients on the control room PCs talk to COOR; COMICS and the DSM run on the UNIX servers; controls crates are reached over 1553 and the Vertical Interconnect; the trigger-and-readout path runs from the detector through L1/L2/TCC, the L3 supervisor, L3 VRC, readout crates, and L3 filter nodes (NT Level 3 and Linux PCs) over Ethernet to the Collector/Router, Data Logger, disk, Data Distributor, EXAMINE, and RIP, with the data cable to the FCC]

Software Description
Configuration management and run control:
– Coordination (COOR)
– User interface (TAKER)
– Download manager (COMICS)
Primary event path:
– DAQ State Manager (DSM)
– Collector / Router
– Data Logger
– Event metadata manager (SAM)
– Event data manager (enstore)
Secondary event path:
– Secondary DAQ Supervisor
– Data Merger
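To make the run-control flow concrete, here is a toy client in the spirit of TAKER asking a coordinator to configure triggers and start a run. The actual COOR text protocol is not reproduced; the port number and command strings are invented for illustration.

```python
import socket

# Toy run-control client, in the spirit of TAKER talking to COOR.  The real
# COOR protocol is not reproduced; the port and command strings are invented,
# and a coordinator process must be listening for this to run.

COOR_ADDRESS = ("localhost", 9541)

def coor_request(command):
    """Send one command line to the coordinator and return its reply line."""
    with socket.create_connection(COOR_ADDRESS) as sock:
        sock.sendall((command + "\n").encode())
        return sock.makefile().readline().strip()

def take_run(trigger_config):
    """Configure the requested trigger list, then start a run."""
    print(coor_request("configure " + trigger_config))
    print(coor_request("start_run"))

if __name__ == "__main__":
    take_run("calorimeter_pedestal")
```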

Software Description
Event monitoring:
– Data Distributor
– Analysis applications (EXAMINE)
DAQ monitoring:
– Client/server access to DAQ flow statistics, trigger rates, etc.
Detector monitoring:
– Front-end active & parasitic monitors
Calibration:
– Client/server interface to the database
Infrastructure:
– Databases (ORACLE)
– Task-to-task communication (ITC)

Secondary DAQ Data Flow
[Diagram: the controls/readout crates (1553 and VME buses) expose shared detector/DAQ data through EPICS CA servers; a Data Merge stage on the front ends feeds the same Collector/Router, Data Logger, Data Distributor, RIP, and disk chain used by the primary path on the UNIX servers and Linux PCs; EPICS CA clients, monitor GUIs, and EXAMINE GUIs on the control room PCs connect via ITC and Channel Access, with data going to the FCC]

Secondary DAQ System
– Alternative data path
– Mainly used for monitoring and calibration
– Takes advantage of powerful front-end processors
– Uses the same data path as the primary path after the Data Merger

DAQ Monitor
– Monitors the status of the DAQ subsystems (L1/L2, CR, DL, DD)
– Collects statistics information from the subsystems
– C++ ITC server with Python display clients
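A display client along these lines might poll the monitor server and print the latest statistics, as in the hypothetical sketch below. The real clients talk to the C++ ITC server; the port and the JSON reply format used here are invented.

```python
import json
import socket
import time

# Toy DAQ-monitor display client: periodically ask the monitor server for the
# latest flow statistics and print them.  Port and reply format are invented;
# a server speaking this made-up protocol must be listening.

MONITOR_SERVER = ("localhost", 9542)

def fetch_statistics():
    """Request one statistics snapshot and decode it from a JSON line."""
    with socket.create_connection(MONITOR_SERVER) as sock:
        sock.sendall(b"get_statistics\n")
        return json.loads(sock.makefile().readline())

def run_display(period=5.0):
    """Poll the server and print event and trigger rates."""
    while True:
        stats = fetch_statistics()
        print("events/s: {event_rate}  L1 rate: {l1_rate}".format(**stats))
        time.sleep(period)

if __name__ == "__main__":
    run_display()
```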

DAQ Monitor
[no text content on this slide]

Event Monitoring: EXAMINE
Samples and reconstructs events based on stream IDs and trigger IDs
Clients of the Distributor:
– Network and file event-transfer modes work
– Calorimeter EXAMINE used for preamp testing
– CFT EXAMINE: getting ready for raw-data unpacking; MC packed data?
– SMT EXAMINE used for SiDet data
– Muon EXAMINE used for commissioning
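The sampling step, selecting events by stream ID and trigger ID for a particular EXAMINE, can be illustrated with a small predicate as below. The event layout and the ID values are invented; this is not the Distributor's actual interface.

```python
# Toy illustration of EXAMINE-style event sampling: accept only events whose
# stream and trigger IDs match what a monitoring client asked for.  The event
# dictionary layout and ID values are invented.

def make_selector(wanted_streams, wanted_triggers):
    """Return a predicate accepting events that match both requested lists."""
    wanted_streams = set(wanted_streams)
    wanted_triggers = set(wanted_triggers)

    def selects(event):
        return (event["stream_id"] in wanted_streams
                and bool(wanted_triggers.intersection(event["trigger_ids"])))
    return selects

if __name__ == "__main__":
    calorimeter_examine = make_selector(wanted_streams=[1], wanted_triggers=[10, 11])
    events = [
        {"stream_id": 1, "trigger_ids": [10], "data": b"..."},
        {"stream_id": 2, "trigger_ids": [11], "data": b"..."},
    ]
    for ev in events:
        if calorimeter_examine(ev):
            print("sampled event with triggers", ev["trigger_ids"])
```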

Event Monitoring: EXAMINE
[Diagram: the trigger-and-readout path (detector, L1/L2/TCC, L3 supervisor, L3 VRC, readout crates, L3 filter) delivers events via the Collector/Router to the Data Logger and disk; the Data Distributor serves EXAMINE processes and ROOT clients on the control room PCs, including an express line]

Event Monitoring: EXAMINE
Need:
– L3 EXAMINE
– Vertex EXAMINE
– Preshower EXAMINE
Planned improvements:
– Histoscope -> ROOT after the NIU workshop
– On-the-fly histogram e-browser
– Framework improvement
  – Name server for accessing only the requested histograms

Online Event Display
[no text content on this slide]

Online Calibration
– Perform electronics calibration of the sub-detectors and insert the results into the ORACLE database
– COOR-controlled via TAKER
– Common server and database interface for all sub-detectors
– Calibration results transmitted as special event messages through the DAQ paths
Current status:
– Successfully ran SMT calibration at the 1% test stand and the NW test stand
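For orientation, an electronics calibration of this kind typically reduces to computing per-channel pedestals and noise from zero-input events, and gains from pulser events at a known injected charge. The sketch below is a generic illustration with invented numbers and data layout, not the DØ calibration code.

```python
import statistics

# Generic illustration of what an electronics calibration run produces:
# per-channel pedestal (mean ADC with no input) and noise (RMS), plus a gain
# from pulser events.  Data layout and numbers are invented.

def pedestals(zero_input_events):
    """zero_input_events: list of {channel: adc} dicts with no input signal."""
    channels = zero_input_events[0].keys()
    return {
        ch: (statistics.mean(ev[ch] for ev in zero_input_events),
             statistics.pstdev(ev[ch] for ev in zero_input_events))
        for ch in channels
    }

def gains(pulser_events, peds, injected_charge):
    """Gain = pedestal-subtracted mean pulser response / injected charge."""
    channels = pulser_events[0].keys()
    return {
        ch: (statistics.mean(ev[ch] for ev in pulser_events) - peds[ch][0])
            / injected_charge
        for ch in channels
    }

if __name__ == "__main__":
    zeros = [{"ch0": 101, "ch1": 98}, {"ch0": 99, "ch1": 102}]
    pulses = [{"ch0": 301, "ch1": 295}, {"ch0": 299, "ch1": 305}]
    peds = pedestals(zeros)
    print("pedestals/noise:", peds)
    print("gains:", gains(pulses, peds, injected_charge=10.0))
```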

Calibration Manager
[Flow diagram: TAKER sends a configure request to COOR, which requests the download (through COMICS to the crates) and the start and end of the calibration run; data flow to the Calibration Manager, whose Calibration Data Processor produces pedestals and gains; a Validator checks the comparison results, the Database Interface writes them to the calibration database, and a Calibration Manager display shows progress]

Online DAQ in Action
Electronics/DAQ commissioning:
– 2 VRB crates with 10 cards each, synchronized with the SCL from the TFW, to L3
– L1 muon crate
– 1 muon scintillator crate with 2 MRCs
– 1 muon PDT crate with 1 MRC
– 1 calorimeter crate
– Combinations of 2 systems done, but not with MCH2 + MCH3
– 2 simultaneous runs done; 3 simultaneous runs require one more L3 node or a script runner

Online DAQ in Action
SMT test stands:
– 1% and NW test stands
  – 1 HDI, 1 Sequencer, 1 VRB, 1 VRBC, 1 VBD
  – Download done with COMICS and the database
– 10% test stand
  – 3 HDIs, a few Interface Boards, a few Sequencers, a few VRBs, 1 VRBC, a controller, 1 VBD + L3
  – Download done by spreadsheet for now
– Electronics databases exist for all three stands
– Calibration run performed at the 1% and NW test stands

Online DAQ in Action
Commissioning Run:
– Two detectors installed for the upcoming Commissioning Run:
  – Run I luminosity scintillation counters
  – Forward Proton Detector
– Both detectors will be read out using the Run II Online system
– Data will be transferred to and from the Accelerator Controls System via the EPICS/ACNET Gateway

Conclusions
– All of the DAQ components exist and function
– Improvements are implemented daily following user suggestions
– We have been intimately involved in daily commissioning activities for the past few months
– Bring in your sub-detectors!!