Online Software
SVT Workshop, October 28, 1998 - Stefano Belforte, INFN Pisa


Slide 1: Online Software Issues
- Which tasks will be needed for SVT: operation, monitor, diagnosis, debug?
- How will they be implemented, and by whom?
- How do we interface with Run_Control?
- How do we interface with the online/offline DB, offline data analysis, ...?
- How much CDF "standard" online software is re-used?
- What will run on the VME CPUs, what on local host(s), what at remote institutions?
- Which local computing resources will we need (CPU, disks, ...)?

Slide 2: Present Status
Let's be honest: we do not have a plan.
Today's goals:
- list of tasks;
- estimate of manpower needs;
- solicit volunteers;
- assign responsibilities;
- agree on a baseline for the basic interface issues with the rest of the world;
- define a time frame for a plan.
This talk: my best understanding at this time, i.e. lots of wrong ideas, meant to stimulate you to come forward with the right ones (even if not immediately).

Slide 3: Tasks
Operation
- Define the constants to download: how often do they change? Do they change during the run/store?
- Check the beam position before data taking. Do we re-compute patterns at this time? Better be prepared to.
- Monitor the beam position during data taking: dynamically adjust the impact parameter measurement?
Monitor
- Goal: make sure SVT is working.
- Tentative spec: check every hour that it is OK at the 1% level (see the added estimate after this slide); this needs 10K events accepted by SVT and 10K rejected.
- This is the CO's tool.
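Added note (not on the original slide): the ~10K-event figure follows from simple counting statistics; a minimal derivation, assuming a binomial error on the monitored fraction, is sketched below.

```latex
% Added back-of-envelope note: why ~10^4 events per hourly check.
% The statistical uncertainty on a monitored pass/fail fraction \epsilon is
\[
  \sigma_\epsilon = \sqrt{\frac{\epsilon(1-\epsilon)}{N}} \lesssim \frac{1}{\sqrt{N}} ,
\]
% so requiring a check at the 1% level gives
\[
  \sigma_\epsilon \lesssim 0.01 \quad\Longrightarrow\quad N \gtrsim 10^4 ,
\]
% i.e. of order 10K SVT-accepted and 10K SVT-rejected events each hour.
```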

Slide 4: More Tasks
Diagnostic
- If the monitor spots a problem, find out where it is, in such a way that data taking can resume correctly with "simple" actions:
  - download proper constants in place of the erroneous ones;
  - swap a bad board;
  - push in a loose cable.
- This is the ACE's tool.
Debug
- For detailed problem finding/solving: pinpoint the problem source, identify a possible hardware failure and/or a real bug in the hardware or software of SVT.
- This is the SVT expert's tool.

Slide 5: Run_Control Interface
- Data to be downloaded must be ready before run start; ideally SVT has been fully configured and downloaded right after power-up. Run_Control must have a way to check (force?) this.
- Run_Control must start the SVT monitor process(es).
- After some data taking (1 min?), Run_Control must query SVT for a check on the beam position. If the beam position has to be adjusted, data taking has to continue to give feedback. If SVT parameters need to be changed, the run has to stop/restart with a different run number (does it?).
- Severe SVT errors are handled via the CDF Error/Recovery/Start protocol; no need for software message passing during the run.
- At run end, Run_Control signals the SVT monitor process(es) to send their statistics for proper storage (DB? EndRun record?).
(A sketch of this handshake follows below.)
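To make the handshake above concrete, here is a minimal C++ sketch of the SVT-side interface it implies. This is an added illustration, not CDF Run_Control code: every class, method and argument name below is a hypothetical placeholder.

```cpp
// Added illustration, not CDF Run_Control code: all names below are
// hypothetical placeholders for the SVT-side handshake described on this slide.
#include <cstdio>
#include <string>

struct BeamCheckResult {
  bool ok;          // beam position compatible with the downloaded constants?
  double x0, y0;    // measured beam centre (units left unspecified)
};

class SvtRunControlClient {
 public:
  // Before run start: check that the constants for this configuration are
  // already downloaded (ideally once, right after power-up), or force it.
  bool isConfigured(const std::string& tag) {
    std::printf("checking SVT configuration '%s'\n", tag.c_str());
    return true;  // stub
  }
  // At run start: spawn the SVT monitor process(es).
  void startMonitors(int run) { std::printf("starting SVT monitors, run %d\n", run); }
  // ~1 minute into the run: beam-position check; a bad result may force a
  // stop/restart with new constants and a new run number (open question).
  BeamCheckResult checkBeamPosition() { return {true, 0.0, 0.0}; }  // stub
  // At run end: flush monitor statistics to the DB or the EndRun record.
  void endRun(int run) { std::printf("flushing SVT statistics, run %d\n", run); }
};

int main() {
  // The sequence Run_Control would drive, per this slide.  Severe SVT errors
  // are deliberately absent: they go through the CDF Error/Recovery/Start
  // protocol, not through this interface.
  SvtRunControlClient svt;
  const int run = 1;
  if (!svt.isConfigured("nominal_beam")) return 1;
  svt.startMonitors(run);
  BeamCheckResult beam = svt.checkBeamPosition();  // after ~1 min of data
  if (!beam.ok) { /* stop/restart the run with adjusted constants */ }
  svt.endRun(run);
  return 0;
}
```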

Slide 6: Online vs. Offline
- Offline analysis will want to validate the SVT trigger, compute efficiencies, etc. The exact SVT configuration at the time the data were taken must therefore be available.
- There may be drawbacks: e.g. this requirement could prevent dynamical adjustment for beam movement (subtracting it to smooth out small oscillations on the minute time scale).
- The SVT geometry must be made to relate to the offline one. This is not an understood problem yet. What if two beam monitor programs report differently?

Slide 7: SVT Monitoring
We will likely need two monitor programs:
- One standard consumer running off the Level 3 data to validate SVT triggers (make sure that when we say the impact parameter is high, it really is; do not waste trigger bandwidth on fakes). Let's not worry about this one now.
- One special SVT Spy Buffer monitor program (the "SVT monitor") that looks at events rejected by SVT, to make sure they really had to get lost. The SVT reduction is O(10^3), and these events do not make it past Level 2 in other ways: for each event we see in Level 3, there are 999 we will never see. We must make sure the efficiency is under control.
Spy Buffers:
- keep a history of all data processed by SVT;
- are read from VME without interfering with trigger processing;
- are the favourite place for the SVT monitor.

Slide 8: Monitoring Beyond SVT
Making sure that the hardware is doing its job is not enough:
- Make sure SVT is always well tuned to the external environment: is the beam moving? Is the detector (SVX) moving? We do not expect it, but better check, maybe only seldom.
- We must monitor SVX strip noise: a few very noisy channels could jam SVT. Maybe this is monitored elsewhere? We still need some way to feed it back to the Hit Finder.

Slide 9: SVT Monitor vs. TRIGMON etc.
- The SVT monitor does not run from the CDF DAQ data stream. There are good reasons for this (it is done this way and cannot be changed): events lost by SVT will not be in the DAQ stream, and we need to monitor L1 triggers that fail L2.
- We will need our own little DAQ. Still, we can try to make it look like the standard DAQ:
  - a special AC++ input module;
  - L1 data collected into events and analyzed by a standard AC++ consumer;
  - use standard CDF tools for histograms, message passing, remote connections.
- In parallel, we can stick to the standard approach with (SVTMON?) code running off the CDF DAQ stream.

Slide 10: Spy Buffer Monitoring: how good is it?
Time to read the Spy Buffers:
- 128 Kwords/buffer, 2 buffers/board (typ.), ~100 boards -> ~3x10^7 words ~ 100 MBytes -> a few seconds (checked explicitly in the sketch after this slide).
- There is a lot of redundancy (two buffers per cable!); most tasks will not require reading everything.
- Each full Spy Buffer dump has several events in it (>100).
Run the full SVTSIM:
- a few events/sec on a 255 MHz ALPHA; assume 10 Hz on a Run II CPU.
- With the SVTSIM timing matched to the Spy Buffer reading, we will fully check 10^4 events each hour or better, i.e. just the 1%-level check we asked for.
The Spy Buffers can do a lot more: error monitoring and reporting will use them heavily (separate session). Still:
- hope we are not swamped by errors;
- SVTSIM and error monitoring can go in parallel.
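The following added back-of-envelope check just re-does the arithmetic of this slide in code; the 4 bytes/word size is my assumption, the other numbers are taken from the slide.

```cpp
// Added back-of-envelope check of the numbers on this slide.
// Assumption: 4 bytes per word; the other figures are from the slide.
#include <cstdio>

int main() {
  const double wordsPerBuffer  = 128e3;  // 128 Kwords per Spy Buffer
  const double buffersPerBoard = 2;      // typical
  const double nBoards         = 100;    // approximate SVT board count
  const double bytesPerWord    = 4;      // assumption

  const double totalWords = wordsPerBuffer * buffersPerBoard * nBoards;
  const double totalMB    = totalWords * bytesPerWord / 1e6;
  std::printf("full Spy Buffer dump: %.1e words = %.0f MB\n", totalWords, totalMB);

  // SVTSIM throughput vs. the monitoring spec of ~1e4 fully checked events/hour.
  const double svtsimRateHz  = 10;       // events/s assumed for a Run II CPU
  const double eventsPerHour = svtsimRateHz * 3600;
  std::printf("SVTSIM: %.0f events/hour (spec asks for ~1e4)\n", eventsPerHour);
  return 0;
}
```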

Slide 11: More on Spy Buffer Monitoring
The Spy Buffers also provide a tool for simple monitoring:
- collect occupancy statistics, YMON-style plots;
- check for non-severe errors and count/list them.
This will not require SVTSIM:
- accuracy will be limited by the readout rate;
- it will benefit from parallelisation across the various crates and exploitation of the local CPUs.
SVTSIM is a complex piece of obscure C code; I doubt we will want to run it on the local CPUs (but that would allow parallel execution).
Spy Buffer monitoring will require heavy usage of network and CPU: better to have our own private ones.
(A sketch of such a simple occupancy scan follows below.)
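As an added illustration of the "simple monitoring without SVTSIM" idea, here is what an occupancy scan over a Spy Buffer dump could look like. The data-word layout (a 4-bit wedge field in the top bits) is invented for the example and is not the real SVT data format.

```cpp
// Added illustration of "simple monitoring" without SVTSIM: occupancy counts
// straight from a Spy Buffer dump.  The word layout used here (a 4-bit wedge
// field in the top bits) is invented for the example; it is NOT the real SVT
// data format.
#include <array>
#include <cstdint>
#include <cstdio>
#include <vector>

constexpr int kWedges = 12;  // SVT is organised in 12 azimuthal wedges

// Count how many data words each wedge contributed to one Spy Buffer dump.
std::array<long, kWedges> wedgeOccupancy(const std::vector<uint32_t>& dump) {
  std::array<long, kWedges> counts{};
  for (uint32_t word : dump) {
    const int wedge = (word >> 28) & 0xF;   // hypothetical wedge field
    if (wedge < kWedges) ++counts[wedge];   // ignore control / end-event words
  }
  return counts;
}

int main() {
  std::vector<uint32_t> dump = {/* block read from a VME Spy Buffer */};
  const auto occ = wedgeOccupancy(dump);
  for (int w = 0; w < kWedges; ++w)
    std::printf("wedge %2d: %ld words\n", w, occ[w]);
  return 0;
}
```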

Slide 12: Spy Buffer as Beam Finder?
- This topic belongs to another workshop (SVT & the Beam); this is just a placeholder for more possible software needs.
- A simple algorithm can track the beam position parameters in real time and feed them back into the SVT constants (not fully proven); it needs ~10K tracks.
- All SVT output tracks go through the Spy Buffer of the last MRG:
  - data flow into Level 2: a few tracks every 20 microseconds -> O(100 kHz);
  - the Spy Buffer is filled at the rate at which it can be read by the VME CPU;
  - a simple program in the crate controller could look at half the tracks produced by SVT (50% of the time the Spy Buffer is being written, 50% it is being read);
  - fit the beam centre with ~100K tracks each second! (A sketch of such a fit follows this slide.)
- It is not clear yet how to turn this into the beam position and slopes (x, y, x', y') in accelerator coordinates, but "it must be possible". CPU and memory needs are still not defined (they should be small). Would this interfere with SVT monitoring?
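Added sketch of one way such a beam-centre fit could work (a standard technique, not necessarily the algorithm the SVT group will adopt): for prompt tracks the impact parameter d depends sinusoidally on the track azimuth phi when the beam is displaced, d ~ x0*sin(phi) - y0*cos(phi) (sign convention assumed), which is linear in (x0, y0) and can be fit in closed form.

```cpp
// Added sketch (standard technique, not necessarily the SVT group's choice):
// fit the beam centre (x0, y0) from the sinusoidal dependence of the track
// impact parameter d on the azimuth phi,  d ~ x0*sin(phi) - y0*cos(phi)
// (sign convention assumed).  Linear least squares, solvable in closed form.
#include <cmath>
#include <cstdio>
#include <vector>

struct SvtTrack { double d, phi; };  // impact parameter and azimuth

bool fitBeamCentre(const std::vector<SvtTrack>& tracks, double& x0, double& y0) {
  // Accumulate the 2x2 normal equations for the model d = x0*sin - y0*cos.
  double Sss = 0, Scc = 0, Ssc = 0, Sds = 0, Sdc = 0;
  for (const SvtTrack& t : tracks) {
    const double s = std::sin(t.phi), c = std::cos(t.phi);
    Sss += s * s;   Scc += c * c;   Ssc += s * c;
    Sds += t.d * s; Sdc += t.d * c;
  }
  const double det = Sss * Scc - Ssc * Ssc;
  if (std::fabs(det) < 1e-12) return false;     // degenerate phi coverage
  x0 = (Scc * Sds - Ssc * Sdc) / det;
  y0 = (Ssc * Sds - Sss * Sdc) / det;
  return true;
}

int main() {
  // Synthetic usage example: beam displaced to (0.10, -0.05) in some unit.
  std::vector<SvtTrack> tracks;
  for (int i = 0; i < 1000; ++i) {
    const double phi = 0.00628 * i;
    tracks.push_back({0.10 * std::sin(phi) + 0.05 * std::cos(phi), phi});
  }
  double x0 = 0, y0 = 0;
  if (fitBeamCentre(tracks, x0, y0))
    std::printf("fitted beam centre: x0 = %.3f, y0 = %.3f\n", x0, y0);
  return 0;
}
```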

Slide 13: Spy Buffer Monitoring: how to do it?
Our own DAQ: the Spy Buffers are read by the VME CPUs, asynchronously with the data flow ("SpyDAQ"). It is not a standard DAQ though:
- one Spy Buffer holds many (up to O(1000)!) events;
- the same event is in different places in different boards;
- part of the current event is in FIFOs and not even read out yet.
How do we implement the needed functionality? Remember also that:
- the same data can/should be used for several purposes;
- different tasks may need (very) different amounts of data;
- these data are very useful for the expert, who is far away (across the ocean, e.g.!);
- Spy Buffer freezing can be "spontaneous", on error detection by the Spy Controls, even without SpyDAQ requesting it.

Slide 14: Simple-minded SpyDAQ
The easy way: do everything in high-level software on the SVT workstation.
[Diagram: VME crates connected via Ethernet to the SVT workstation.]
- Run basic VISION; only buffer block reads go to the host.
- On the workstation, run many processes, one for each task; each of them reads the Spy Buffers it wants, as needed.
- Some locking mechanism to prevent unfreezing while reading must be devised (see the sketch after this slide).
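Here is an added sketch of the kind of freeze/read/unfreeze locking the slide says must be devised. The SpyBuffer class and its methods are hypothetical placeholders, not the real CDFVME calls, and a real multi-process SpyDAQ would need a cross-process lock rather than a std::mutex.

```cpp
// Added sketch of the "locking mechanism to prevent UnFreezing while reading".
// The SpyBuffer methods are hypothetical placeholders, not the real CDFVME
// calls, and the std::mutex only protects threads within one process.
#include <cstdint>
#include <mutex>
#include <vector>

class SpyBuffer {
 public:
  // Read a consistent snapshot: freeze, block-read, unfreeze, all under one
  // lock, so no other monitoring task can unfreeze the buffer mid-read.
  std::vector<uint32_t> readSnapshot() {
    std::lock_guard<std::mutex> guard(mutex_);
    freeze();
    std::vector<uint32_t> data = blockRead();
    unfreeze();
    return data;
  }

 private:
  void freeze()   { /* would set the Freeze bit in the Spy Control register */ }
  void unfreeze() { /* would clear the Freeze bit */ }
  std::vector<uint32_t> blockRead() { return {}; /* would do a VME block transfer */ }

  std::mutex mutex_;
};

int main() {
  SpyBuffer mergerSpy;  // e.g. the spy point on the last MRG
  std::vector<uint32_t> dump = mergerSpy.readSnapshot();
  (void)dump;           // hand the snapshot to whichever monitoring task asked
  return 0;
}
```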

Slide 15: Structured SpyDAQ (just a dream at present!)
[Diagram, roughly: VME crates (ROBIN) -> LAN -> Event Builder and Buffer Manager on the SVT workstation -> Data Publisher -> user processes (AC++ input module + consumer) over LAN and WAN, plus a browser.]
- Spy Buffers are reformatted locally into event fragments.
- Consumers go to/from VME as needed.

Slide 16: Needed Hardware
Disk:
- Constants: see the DataBase talk. E.g. 1 MByte/pattern set; we need to have many sets ready (>100), e.g. precalculated for different beam positions. Best guess now: 2-3 GB.
- Data: a spool area where to keep Spy Buffer dumps for offline investigation: again a few GB.
Computer:
- One place where to gather the pieces from all the CPUs (event builder) and store the data to be served to the various monitor tasks (buffer manager).
- It could be a piece of a large system, but maybe our software will take time to get stable and the system will crash often... better a separate workstation. How powerful? I wish I knew.

Slide 17: More Needed Hardware
- As long as the monitor tasks are standard AC++ consumers, they could (should?) run on the same CPUs as the rest of the monitor tasks.
- Some place where to run diagnostic programs for the expert... nothing special.
- SVT test stand: our own crate next door with one SVT vertical slice:
  - debug bad boards;
  - re-run SVX+XFT data from the Spy Buffers (even from "tape") through the SVT hardware to understand (hopefully) even the subtlest problems;
  - test code, patterns, programming etc. in the full system without interfering with data taking;
  - provides hot-swappable spares.

Slide 18: Needed Software
Just all of it. This list is to be filled in (by someone else) after some work has been done. For the time being:
- SVTSIM is ~there (it needs to be ported to AC++, e.g.);
- basic Spy Buffer handling within the CDFVME framework (test, freeze, read, unfreeze) will be there by vertical-slice time.

Slide 19: Summary: the situation till yesterday
[Cartoon: "the promised land" on the far side, with "the hardware mountain" and "the software ocean" in between.]

Slide 20: Perspectives: the situation tomorrow
[Cartoon: the same scene, now with "the 'someone else' boat" added.]

Slide 21: Conclusions: what about the goals?
Today's goals:
- List of tasks: already too many?
- Estimate of manpower needs: what about 5?
- Solicit volunteers: doing it right now.
- Assign responsibilities: better later?
- Agree on a baseline for the basic interface issues with the rest of the world: too early?
- Define a time frame for a plan: winter's end?