Readout & Controls Update
DAQ: Baseline Architecture
DCS: Architecture (first round)
August 23, 2001
Klaus Honscheid, OSU

Data Rates
Input rate from the detector: 1.5 TBytes/s
- 3x the rate from the PTDR
- Simulation group works on a new event size estimate (noise?)
- Includes 50% extra capacity / expansion options
L1 acceptance: 1% -> ~20 GBytes/s to the L2/L3 farm, event rate ~100 kHz
L2/L3 acceptance: 5% -> ~200 MBytes/s (event size reduction in L3), ~4000 Hz
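As a quick sanity check, the sketch below redoes the naive arithmetic implied by these acceptances (purely illustrative). The ~20 GBytes/s and ~4000 Hz figures on the slide differ somewhat from the naive products, presumably because of the extra capacity and the event-size reduction mentioned above.

```python
# Back-of-the-envelope check of the quoted DAQ rates.
# Only numbers from this slide are used; everything derived is approximate.

detector_rate = 1.5e12   # bytes/s from the detector into the DAQ
l1_acceptance = 0.01     # fraction of crossings accepted by Level 1
l23_acceptance = 0.05    # fraction of L1-accepted events kept by L2/L3
l1_event_rate = 100e3    # Hz, L1 accept rate quoted on the slide

naive_l1_bandwidth = detector_rate * l1_acceptance
naive_l23_rate = l1_event_rate * l23_acceptance

print(f"Naive L1 output bandwidth: {naive_l1_bandwidth / 1e9:.0f} GB/s "
      "(slide quotes ~20 GB/s, including extra capacity)")
print(f"Naive L2/L3 accept rate:   {naive_l23_rate:.0f} Hz "
      "(slide quotes ~4000 Hz)")
```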

DAQ Architecture I
Data flow: Front-end -> DCB -> L1 buffer
Non-trigger systems:
- a DCB can distribute data from one crossing to many L1Bs, OR
- a DCB can send all data from one crossing to a single L1B
Trigger systems:
- DCBs send all data from one crossing to a single L1B
Conclusion: before the L1B, the question of highways is only relevant for the implementation.

DAQ Implementation: DCB -> L1B
An implementation based on highways offers:
- significant advantages/simplifications for L1B processing
- lower event rate and larger packets => easier implementation
- (pseudo-)random distribution of crossings possible
- no net cost difference (DCB +$200K, L1B -$200K)
Proposal: the DCB-L1B connections will be structured as 8 (*) highways.
(*) Strong preference for a fixed number of highways; 4, 6, 8 or 12 are considered. For the baseline cost estimate, 8 highways are assumed.
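To make the highway idea concrete, here is a minimal, hypothetical sketch of how crossings could be spread over a fixed number of highways. The hash-based assignment is only an illustration of the "(pseudo-)random distribution" mentioned above, not the baseline design.

```python
N_HIGHWAYS = 8  # baseline assumption from the proposal above

def highway_for_crossing(crossing_number: int) -> int:
    """Assign a crossing to one of the fixed highways (illustrative only).

    A plain modulo would give round-robin; mixing the bits first gives the
    (pseudo-)random distribution mentioned on the slide.
    """
    mixed = (crossing_number * 2654435761) & 0xFFFFFFFF  # Knuth-style multiplicative hash
    return mixed % N_HIGHWAYS

# All data from a given crossing ends up in L1 buffers of the same highway.
for bco in range(6):
    print(f"crossing {bco} -> highway {highway_for_crossing(bco)}")
```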

DAQ Architecture II
How much flexibility is needed in sending specific events to specific L2/L3 processors?
- Provide support to split the DAQ into multiple logical partitions that can be operated independently.
- There is interest in getting the same event delivered to more than one L2/L3 node.
- Deliver data from consecutive crossings? Yes.
- Support partial event readout (for L2) and complete event readout for L2/L3.
A partition (description) includes a list of L2/L3 nodes, trigger condition(s), output stream (?), required rate or kind of service (e.g. sampling). Clearly there would be rate/bandwidth concerns that need to be supervised.
Highways need to be connected; this is mostly a Global L1 and Event Manager issue. (At rates < 50 MBytes/s there is no difference between single or multiple highways.)
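As a sketch, a partition description of the kind listed above could be captured in a small record. The field names below are hypothetical, chosen only to mirror the items on this slide.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class PartitionDescription:
    """Hypothetical record for one logical DAQ partition (illustrative only)."""
    name: str
    l23_nodes: List[str]                      # L2/L3 nodes assigned to this partition
    trigger_conditions: List[str]             # trigger condition(s) it serves
    output_stream: Optional[str] = None       # output stream, if any
    required_rate_hz: Optional[float] = None  # requested rate ...
    service: str = "normal"                   # ... or kind of service, e.g. "sampling"

# Example: a sampling partition used for detector monitoring.
monitor = PartitionDescription(
    name="pixel-monitor",
    l23_nodes=["l3node-001", "l3node-002"],
    trigger_conditions=["minimum-bias"],
    service="sampling",
)
print(monitor)
```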

DAQ Implementation: L1B -> L2/L3 Farm
An implementation based on highways offers:
- some cost savings ($ K) from smaller switches
- reduced control traffic (per highway) to signal L1 accepts etc.
- not difficult to change: multiple highways can be combined by adding a second stage of identical switches; a single highway can be split by removing a stage of switches or by reprogramming a larger switch.
Proposal: the L1B - L2/L3 connections will be structured as 8 (*) highways.
(*) Strong preference for a fixed number of highways; 4, 6, 8 or 12 are considered. For the baseline cost estimate, 8 highways are assumed.
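The "smaller switches" argument can be illustrated with a rough port count. The link totals below are derived from the example system on a later slide (about 252 Gigabit links out of the L1 buffers and roughly as many toward the farm); the split per highway is purely illustrative, not the engineering design.

```python
# Rough illustration of how splitting the event builder into highways
# shrinks the individual switches. Link totals are derived from the
# example 1.5 TBytes/s system; the numbers are illustrative only.

def ports_per_switch(l1b_links: int, farm_links: int, n_highways: int) -> int:
    """Ports needed on one highway's event-builder switch (inputs + outputs)."""
    return -(-l1b_links // n_highways) + -(-farm_links // n_highways)  # ceiling division

total_l1b_outputs = 252  # Gigabit Ethernet links out of the L1 buffers
total_farm_links = 252   # Gigabit links toward the L2/L3 farm

for n in (1, 4, 8, 12):
    print(f"{n:2d} highway(s): ~{ports_per_switch(total_l1b_outputs, total_farm_links, n)} "
          "ports per event-builder switch")
```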

System Overview

Example 1.5 TBytes/s System
DCB:
- 48 serial inputs at 1 Gbps (average 300 Mbps per link)
- 12 serial outputs at 2 Gbps
- option to double the output links if the input rate doubles
- 6, 8 (*), or 12 highways
- 504 DCBs (includes Pixel and Cal), serving up to 24,192 FEBs or FEB equivalents
Optical links:
- 504 cables (1 per DCB, expandable to 2), 12 fibers/cable (6048 total)
- bandwidth = 1.5 TBytes/s (expandable to 3.0)
L1B:
- 24 serial inputs at 2 Gbps
- fiber-to-copper conversion external to the L1B (for compatibility with the trigger)
- 1 serial output at 1 Gbps (Gigabit Ethernet)
- 252 L1Bs
Event Builder:
- 12 switches x 48 ports/switch
L2/3 Farm:
- 252 serial inputs at 1 Gbps, up to 12 processors per link (with a standard Gigabit to Fast Ethernet switch), 3024 processors total
- 252 serial returns at 1 Gbps; the same switch is used for L2/3 input and output
Storage:
- 12 serial 1 Gbps links from the Event Builder
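The unit counts appear to have dropped out of this transcript; the ones shown above (504 DCBs and cables, 252 L1Bs and farm links) are reconstructed from the per-unit figures quoted on the slide, as the short consistency check below makes explicit.

```python
# Consistency check of the example-system counts, reconstructed from the
# per-unit figures quoted on the slide.

feb_equivalents = 24_192   # FEBs or FEB equivalents
dcb_inputs = 48            # 1 Gbps serial inputs per DCB
dcb_outputs = 12           # 2 Gbps serial outputs per DCB
fibers_per_cable = 12      # fibers per optical cable, 1 cable per DCB
l1b_inputs = 24            # 2 Gbps serial inputs per L1 buffer
procs_per_link = 12        # L2/L3 processors behind one Gigabit farm link
farm_processors = 3_024    # total L2/L3 processors

n_dcb = feb_equivalents // dcb_inputs             # 504 DCBs
n_fibers = n_dcb * fibers_per_cable               # 6048 fibers, matches the slide
n_l1b = n_dcb * dcb_outputs // l1b_inputs         # 252 L1 buffers
n_farm_links = farm_processors // procs_per_link  # 252 farm links

print(n_dcb, n_fibers, n_l1b, n_farm_links)
```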

BTeV Controls?
A lot of things are hiding behind this topic:
1. Configuration, run state transitions
2. Data Quality Monitor
3. Fast interlock, fire alarm
4. "Classical" slow control
5. Calibration
Run Control: DAQ group provides skeleton software and hardware (Detector Manager).
"Consumer": DAQ group provides skeleton software, hardware (?).
Detector Control (DCS): DAQ group provides skeleton software and hardware (Detector Manager).
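For the configuration and run-state-transition item, a toy state machine gives a feel for the kind of skeleton software the DAQ group would provide; the states and commands below are hypothetical, not an agreed state model.

```python
# Toy run-control state machine (illustrative; the real state set and
# transition rules would come from the DAQ group's skeleton software).

ALLOWED = {
    "IDLE":       {"configure": "CONFIGURED"},
    "CONFIGURED": {"start": "RUNNING", "reset": "IDLE"},
    "RUNNING":    {"pause": "PAUSED", "stop": "CONFIGURED"},
    "PAUSED":     {"resume": "RUNNING", "stop": "CONFIGURED"},
}

class RunControl:
    def __init__(self) -> None:
        self.state = "IDLE"

    def transition(self, command: str) -> str:
        try:
            self.state = ALLOWED[self.state][command]
        except KeyError:
            raise ValueError(f"'{command}' not allowed in state {self.state}")
        return self.state

rc = RunControl()
for cmd in ("configure", "start", "pause", "resume", "stop"):
    print(cmd, "->", rc.transition(cmd))
```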

Typical Control System (CMS)

“Classical” Control System (MINOS)
[Diagram: SCADA system with analog/digital channels, PLCs and fieldbuses (CAN, GPIB via a GPIB-ENET bridge, RS232, MIL/STD-1553B); I/O servers distributed in the experimental area; OPC; local and remote workstations and OPI terminals on LAN/WAN Ethernet; Oracle DB server; FNAL safety server; beam server reading beam-line SWICs, BPMs, magnets and scalers; LeCroy 1440; GPS.]

Supervisory Control and Data Acquisition (SCADA)
Commercial systems, typically used in industrial production plants. Examples include:
- LabVIEW/BridgeVIEW from National Instruments
- iFIX from Intellution (CDF, MINOS)
- EPICS (BaBar, D0)
- PVSS II (CERN)
OLE for Process Control (OPC)
- Defines a standard for interfacing programs (SCADA) to hardware devices in a control system
- Based on Microsoft's COM/DCOM object model
- Provides multi-vendor interoperability

SCADA Architecture
- User process: C, C++, VBA; wizards; utilities
- SCADA engine process: alarm handling, event/alarm logging, historical trending, networking, device servers
- Built-in facilities: HMI, logging & archiving, reports, access control, alarms, trending, ...
- Handles distributed systems
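A toy sketch of what the engine layer does with values pushed by device servers (alarm handling, event logging, archiving for trending). The datapoint name and limits are made up, and no real SCADA product's API is implied.

```python
import time

# Toy model of a SCADA engine loop: device servers publish values, the engine
# checks alarm limits, logs events and archives samples for trending.
# Entirely illustrative; the datapoint and limits are invented.

alarm_limits = {"HV.channel01.voltage": (0.0, 1500.0)}
archive = []     # stands in for the historical-trending archive
event_log = []   # stands in for event/alarm logging

def publish(datapoint: str, value: float) -> None:
    archive.append((time.time(), datapoint, value))
    low, high = alarm_limits.get(datapoint, (float("-inf"), float("inf")))
    if not low <= value <= high:
        event_log.append((time.time(), "ALARM", datapoint, value))

publish("HV.channel01.voltage", 1450.0)  # within limits
publish("HV.channel01.voltage", 1600.0)  # generates an alarm entry
print(event_log)
```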

Control Example
[Diagram: division of control responsibilities between the DAQ group and the detector group.]

BTeV Control System (DCS)
- Solicit feedback from detector groups
- Treat infrastructure in a similar fashion: rack monitoring, magnet, detector hall
- Evaluate SCADA software
- Develop/set up a DCS test lab
- Develop sample solutions (HV?)
- Define the DAQ - DCS connection: relevant for HV, Pixel "motor", calibration