
HCAL Trigger/Readout
Texas Tech Meeting, February 2001
Drew Baden, Tullio Grassi
University of Maryland

Outline
- Trigger/DAQ Demonstrator project
  - HTR demonstrator
  - Front-end emulator
  - DCC demonstrator
  - DAQ
- Cost/Schedule

Demonstrator System Design
- Front-end and LHC emulator
  - Fiber data source for HTR
    - Uses crystal clock
  - Signals for TTC
- HTR demonstrators
  - Data inputs
  - TTCrx daughterboard
  - TPG output
- TTC system
  - TTCvi and TTCvx
- DCC demonstrator
  - Full 9U card
  - S-Link output
- Level 1 "Emulator"
  - Wisconsin to provide Rx ~6/01
  - Carlos to provide Tx
  - HTR is ready
- Crate CPU
  - Commercial card running Linux
  - S-Link input
[Block diagram: FE/LHC emulator, TTCvi/TTCvx, TTCrx chips, HTRs, DCC, Level 1 "Emulator", CPU; data, clock, TPG output, and S-Link connections]
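For orientation, the sketch below is a toy Python model of the chain just described (FE/LHC emulator feeding the HTRs, HTRs pipelining data and responding to an L1A, the DCC building the event that goes to the CPU over S-Link). All class names, fields, and data formats are illustrative assumptions, not the real firmware or DAQ interfaces.

```python
class FrontEndEmulator:
    """Stands in for the 6U FE/LHC emulator board: a fiber data source."""
    def data_for_crossing(self, bx):
        # One word per fiber per 25 ns bucket; the real pattern lives in the FPGA.
        return {fiber: (bx << 8) | fiber for fiber in range(8)}

class HTR:
    """Receives fiber data, keeps a pipeline, serves accepted crossings to the DCC."""
    def __init__(self):
        self.pipeline = {}
    def receive(self, bx, words):
        self.pipeline[bx] = words
    def on_l1a(self, bx):
        return {"bx": bx, "payload": self.pipeline.get(bx)}

class DCC:
    """Concentrates fragments from the HTRs into one event for the CPU (S-Link)."""
    def build_event(self, fragments):
        return {"fragments": fragments}

# One accepted crossing flowing through the chain:
fe, htrs, dcc = FrontEndEmulator(), [HTR() for _ in range(2)], DCC()
for bx in range(10):                       # successive 25 ns buckets
    for h in htrs:
        h.receive(bx, fe.data_for_crossing(bx))
event = dcc.build_event([h.on_l1a(bx=5) for h in htrs])   # L1A for bucket 5
print(event["fragments"][0]["bx"])          # -> 5
```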

Front End/LHC Emulator
- 6U VME board
- 8 fiber data outputs, simulates HCAL
- System signals:
  - Internal 40 MHz crystal + FPGA (Altera 10K50)
  - Generates master clock, L1A, and BC0
  - All are ECL outputs
    - Clock, L1A, Orbit, BC0, spares
- LHC pattern generated internally (see the sketch below)
- Board is being stuffed, back in 1 week
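A rough Python picture of the signal pattern the emulator FPGA has to produce is shown below. The 3564-bucket orbit is the nominal LHC value; the L1A pattern and rate here are stand-ins, not the pattern actually generated by the board.

```python
# Illustrative system-signal generator (clock tick, L1A, Orbit/BC0); not the
# Altera 10K50 firmware.
import random

ORBIT_LENGTH = 3564              # 25 ns buckets per LHC orbit (nominal)

def system_signals(n_crossings, l1a_probability=1e-3):
    for n in range(n_crossings):
        bx  = n % ORBIT_LENGTH
        bc0 = (bx == 0)                          # bunch-counter reset, once per orbit
        l1a = random.random() < l1a_probability  # stand-in trigger pattern
        yield {"bx": bx, "bc0": bc0, "l1a": l1a}

for sig in system_signals(5):
    print(sig)
```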

HTR Demonstrator
- 6U VME board
- 2 dual optical Rx (4 fibers)
- HP deserializer chips
- TTCrx daughterboard
- APEX 20K400
  - Has enough memory
- LVDS output to DCC
- Link piggy-board footprint for TPG output
- Board is being stuffed, back in 1 week

HTR/FEE FPGA Firmware
- Emulator
  - Hans Breden already working on it
  - "Pretty simple" in comparison to the HTR
- HTR ("LPD", etc.)
  - Tullio is working on this
    - Baden to help as soon as D0 work is finished
    - Software tools (Quartus, Synplify, Aldec, etc.)
    - Algorithm (LPD)
  - Grassi/Baden met with Magnus Hansen in January 2001
    - VHDL and knowledge changed hands
  - Simplified version soon to test I/O
  - This is a major work in progress toward a more final version

DCC Logic Layout
- TTCrx daughtercard
- Data from 18 HTRs buffered in iFIFOs
  - Dual PCI buses, 9 HTRs per PCI
- Large FPGA reads events from the FIFOs and distributes them to 4 FIFO-like streams
  - Each stream can prescale events and/or select by header bits (sketched below)
- Local control FPGA provides independent access via VME/PCI for control/monitoring while running
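The stream logic can be pictured as follows. This is only a sketch of the prescale/header-selection idea, with made-up field names and header-bit conventions rather than the real DCC data format.

```python
# Toy model of the "4 FIFO-like streams": each stream may prescale events
# and/or select on header bits before writing to its FIFO.

class OutputStream:
    def __init__(self, prescale=1, header_mask=0, header_value=0):
        self.prescale, self.mask, self.value = prescale, header_mask, header_value
        self.count, self.fifo = 0, []

    def offer(self, event):
        if (event["header"] & self.mask) != self.value:
            return                      # header-bit selection
        self.count += 1
        if self.count % self.prescale == 0:
            self.fifo.append(event)     # prescaled accept

streams = [
    OutputStream(prescale=1),                                      # all events
    OutputStream(prescale=100),                                    # 1 in 100
    OutputStream(prescale=1, header_mask=0x1, header_value=0x1),   # e.g. a flagged class
    OutputStream(prescale=1000),                                   # heavily prescaled
]

for evn in range(5000):
    event = {"evn": evn, "header": evn & 0x1}   # made-up header convention
    for s in streams:
        s.offer(event)
print([len(s.fifo) for s in streams])
```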

DCC Demonstrator
- DCC VME motherboard
  - 2 prototypes tested and working
  - Components for 5 for CMS on order
- Link Receiver Boards (input & event building)
  - 2 prototypes (2nd revision) under test
  - Components for 10 "in stock"

DCC Motherboard
- What's been tested so far:
  - PCI tested to 35 MHz (33 MHz is the PCI spec)
    - This is faster than the anticipated 28 MHz
  - Burst transfers tested on PCI3
  - All PCI bridges work
  - All PC-MIP sites work
    - Used a commercial digital I/O board
  - All PMC sites work
    - Used the S-Link interface
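For scale, assuming 32-bit transfers at one word per clock during a burst, those clock rates translate to the peak burst bandwidths below. This is back-of-the-envelope arithmetic, not a measurement.

```python
# Peak burst bandwidth for a 32-bit PCI bus at the quoted clock rates.
word_bytes = 4
for label, f_mhz in [("PCI spec", 33), ("tested", 35), ("anticipated load", 28)]:
    print(f"{label:16s}: {f_mhz} MHz -> {f_mhz * word_bytes:3d} MB/s peak burst")
# PCI spec        : 33 MHz -> 132 MB/s peak burst
# tested          : 35 MHz -> 140 MB/s peak burst
# anticipated load: 28 MHz -> 112 MB/s peak burst
```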

DCC Overview

DCC Demonstrator Motherboard

DCC Demonstrator
- DCC Logic module
  - CAD schematic done
  - PCB layout to start in 2-3 weeks
  - FPGA firmware design underway
  - Using S-Link for interfaces works fine

DAQ
- 6U VME Pentium CPU board running Linux
  - VME interface
  - Connects to TTCvi, HTR, and DCC
- Use S-Link for the direct data interface with the DCC
  - Will also have a VME interface if needed
- UMD, BU, and UIC have already purchased boards
  - Not all have Linux up and running yet
- Chris Tully has agreed to help
  - Students, maybe himself
- Overall integration still up in the air
  - Need to get organized on this
  - Video meeting next week to start
    - Goal: get UIC, BU, Princeton, and UMD workers together at UMD to work on this and pound it out

Testing Goals: DAQ
- Run the Demonstrator project at UMD
- Be ready to run the FNAL source calibration project next summer

Testing Goals: HTR
- Receiving optical data
  - G-link clock recovery
  - Asynchronous FIFO
    - TTC clock for FIFO RCLK
- Maintain pipeline
  - With error reporting
- Crossing determination
  - Send data to the HTR demo coincident with a selected 25 ns time bucket
  - Recover this particular 25 ns time bucket
  - Investigate various algorithms (with help); one simple idea is sketched below
[Diagram: optical inputs, O-to-E, deserializer, FIFO written with the recovered clock (WCLK) and read with the TTC clock (RCLK)]
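One simple crossing-determination idea, shown only to illustrate the kind of filter under investigation (not the algorithm chosen for the HTR firmware): assign the hit to the 25 ns bucket whose sample is a local maximum above threshold.

```python
# Toy peak-finding crossing determination over a sequence of 25 ns samples.
def find_crossing(samples, threshold=10):
    """Return indices of buckets identified as true crossings."""
    hits = []
    for i in range(1, len(samples) - 1):
        s_prev, s, s_next = samples[i - 1], samples[i], samples[i + 1]
        if s > threshold and s >= s_prev and s > s_next:   # local peak above threshold
            hits.append(i)
    return hits

# A pulse whose peak sits in bucket 4:
print(find_crossing([0, 1, 2, 8, 20, 15, 6, 2, 0]))   # -> [4]
```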

Testing Goals: HTR
- Synchronization
  - All TPG data from the same bucket are aligned for transmission to the L1 trigger (see the sketch below)
  - Use of the Synch Chip on our boards
- From L1A, verify the correct data gets into the DCC
  - L1A generated in the TTCvi by the FE emulator
- TPG output needs a source and a receiver!
  - Wait for Wisconsin to provide
  - Use Carlos da Silva's Link Piggy board
[Diagram: main FPGA feeding the Synch Chip (SLB), clocked by the TTC clock, with TPG output toward the L1 trigger ("Wesley Smith"?)]
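As an illustration of the alignment requirement only (not the Synch Chip implementation): channels that arrive early are delayed so that words from the same 25 ns bucket leave for Level 1 on the same clock tick. The latencies below are made-up numbers.

```python
# Toy delay-line alignment of per-channel TPG words before transmission to L1.
from collections import deque

latency = {"ch0": 3, "ch1": 5, "ch2": 4}        # arrival latency in buckets (illustrative)
max_lat = max(latency.values())
delay_lines = {ch: deque([0] * (max_lat - lat)) for ch, lat in latency.items()}

def clock_tick(arriving_words):
    """Push this bucket's words in, pop one aligned word per channel out."""
    out = {}
    for ch, word in arriving_words.items():
        delay_lines[ch].append(word)
        out[ch] = delay_lines[ch].popleft()
    return out

# The slowest channel passes straight through; the others are held back so that
# later ticks emit words from matching buckets on every channel.
print(clock_tick({"ch0": "A0", "ch1": "B0", "ch2": "C0"}))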

Testing Goals: DCC
- Input
  - Test LVDS electrical from the HTR
  - Test the data protocol
    - Grassi/Bard/Hazen have worked it out
  - Test the (multiple) PCI interfaces and event building (see the sketch below)
- Buffers
  - Multiple FIFOs for various functions
    - Output to L2/DAQ (all data)
    - Monitoring (preselected)
    - Trigger verification (prescaled)
    - Other?
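The event-building step being tested can be sketched as below: fragments from the 18 HTR inputs are combined once every input has delivered a fragment carrying the same event number. The structure and field names are assumptions for illustration.

```python
# Toy event builder over 18 HTR inputs (9 per PCI bus), keyed by event number.
from collections import defaultdict

N_INPUTS = 18
pending = defaultdict(dict)          # event number -> {input index: fragment}

def add_fragment(evn, input_index, payload):
    pending[evn][input_index] = payload
    if len(pending[evn]) == N_INPUTS:            # all HTRs accounted for
        return {"evn": evn, "fragments": pending.pop(evn)}
    return None

# Feed one complete event:
built = None
for i in range(N_INPUTS):
    built = add_fragment(evn=1, input_index=i, payload=b"...")
assert built is not None and built["evn"] == 1
```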

Testing Goals: DCC
- Error checking and monitoring
  - Event number check against the L1A count from TTC (see the sketch below)
    - Demonstration of system-wide synch capability
  - Line error monitoring
    - Built-in Hamming ECC
  - FIFO occupancy monitoring
- DCC output
  - Use S-Link to connect to the CPU via the PMC connector
    - Incurs an additional $5k expense
    - Cards already built and delivered from CERN
    - BU has already tested them
  - This allows us to move forward on DCC/DAQ integration ahead of schedule
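A toy version of the event-number cross-check: compare the event number carried by each fragment against a local L1A counter driven by the TTC stream, and flag any input that has slipped out of synch. This is illustrative only, not the DCC firmware or monitoring code.

```python
# Event-number vs. L1A-count consistency check (illustrative).
class SyncChecker:
    def __init__(self):
        self.l1a_count = 0
    def on_l1a(self):
        self.l1a_count += 1              # incremented on each TTC L1A
    def check_fragment(self, input_index, evn):
        if evn != self.l1a_count:
            print(f"SYNC ERROR: input {input_index} reports EvN {evn}, "
                  f"expected {self.l1a_count}")
            return False
        return True

chk = SyncChecker()
chk.on_l1a()
chk.check_fragment(0, 1)     # in synch
chk.check_fragment(1, 2)     # flags a slipped input
```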

Schedule
- FEE/LHC Emulator
  - Layout and fab complete, physically at the board house now
  - Expect the board by Feb 12
  - FPGA firmware being worked on now
    - Expect end of February, in time for testing
- HTR
  - Layout and fab complete, also at the board house now
  - Expect the board by Feb 12
  - Firmware will be an ongoing project
    - Preliminary firmware for checkout in time for testing
- DCC
  - Motherboard tested and working
  - Link receiver cards (PC-MIP) tested and working
  - PMC Logic board sometime in March or April
    - We can get started using S-Link, a direct interface between the DCC motherboard and the CPU
    - Wu is now officially full-time on this

Schedule (cont.)
- Integration
  - Hardware checkout to begin this month
  - Software needs a lot of work
    - Serious efforts have to begin now on Linux, DAQ, etc.
    - Will begin weekly video conference meetings with UMD, BU, UIC, Princeton
  - Goal:
    - Hardware and software integration to start in March
    - Be ready for FNAL source calibrations this summer
  - We are behind by 2-3 months
    - Can make some of this up by eliminating prototype stages
    - Still plan to try to stick to the Lehman schedule
    - Note the DCC effort is already underway; the HTR is 1 week from testing and the beginning of integration

Demonstrator Project Costs
- Working on a reevaluation of project costs
  - Will present to Level 2 WBS management soon
  - Some examples of additional Demonstrator costs:
    - HTR: $3k original, $3.5k actual
    - FEE: $2.5k original, $2.8k total
    - S-Link: $5k
    - Additional 9U VME crate with 6U slots
    - UMD CPU board: $4k with software
      - Quartus requires >512 MB of memory to run!
      - Our computer uses RDRAM!
      - Will need to clone a system for Tully
    - 2 additional DCC motherboards ordered
    - Several additional BU startup expenses (software, etc.)
  - No difficulties expected (yet)

Overall T/DAQ Costs
- Cost savings from increasing to 3 channels/fiber
  - Can do 12 inputs for 36 channels instead of 16 inputs for 32 channels per card
- Current: 463 cards
  - $2.5k/board (of which $480 is for the Rx chips)
  - $1.158M total
- New: 413 cards
  - $2.4k/board (of which $360 is for the Rx chips)
  - $0.991M total; saves almost $160k
- Realistically, this is in the contingency noise, but it DOES reduce the baseline
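A quick check of the arithmetic above, reading the quoted per-board cost as the full per-card cost (with the Rx-chip figure included in it, which is what reproduces the quoted totals):

```python
# Card-count arithmetic from the slide above.
old_cards, old_cost_per_card = 463, 2500   # 16 fibers x 2 ch = 32 channels/card
new_cards, new_cost_per_card = 413, 2400   # 12 fibers x 3 ch = 36 channels/card

old_total = old_cards * old_cost_per_card   # 1,157,500  (~$1.158M, as quoted)
new_total = new_cards * new_cost_per_card   #   991,200  (~$0.991M, as quoted)
print(f"old: ${old_total:,}  new: ${new_total:,}  saved: ${old_total - new_total:,}")
```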

Cost Variables
- New vs. old G-links (arithmetic sketched below)
  - Old = 2 channels/fiber, 16 fibers/HTR
    - 20 bits x 40 MHz = 800 Mb/s
  - New = 3 channels/fiber, 12 fibers/HTR
    - 32 (40) bits x 40 MHz = 1.28 (1.6) Gb/s
  - Advantages of new:
    - Fewer HTRs (~10%)
    - Take advantage of fiber mass terminators
      - 12 per connector fits nicely; use 1 per HTR
    - Significant reduction in the cost of O-to-E
      - 12 vs. 16, and these are expensive
    - Is it real?
- FPGA
  - Smaller is definitely cheaper
  - Critical that HCAL stick to no more than 5 channels per sum. Why?
    - Can then use 8 FPGAs per HTR, 5 channels per FPGA
    - Puts constraints on fiber cabling but will help cost a lot
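The link-rate and channel bookkeeping above, worked out explicitly (the bit counts and channel counts are the slide's numbers; the rates are simply bits per sample times the 40 MHz sampling clock):

```python
# Raw per-fiber rates implied by the two link options.
for label, bits in [("old: 2 ch/fiber, 20 bits", 20),
                    ("new: 3 ch/fiber, 32 bits", 32),
                    ("new: with 40-bit framing", 40)]:
    print(f"{label}: {bits} bits x 40 MHz = {bits * 40 / 1000:.2f} Gb/s")
# -> 0.80, 1.28 and 1.60 Gb/s

# Channel bookkeeping behind the "8 FPGAs x 5 channels" constraint:
print(8 * 5, "channels of FPGA capacity per HTR vs", 12 * 3, "channels received")  # 40 vs 36
```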

Overall T/DAQ Costs
- Changes since baseline ($1.5M for HTR)
  - FPGA cost for an Altera 20K400 TODAY (1/2001) is $1000 for a single unit!!!!
    - Quantity price in 2 years? Maybe 1/4 to 1/3 of that
    - Baseline cost for the FPGA: $1000
    - It might fit
  - Optical O-to-E
    - Baseline estimate for 2-fiber connectors: $30
    - Grossly underestimated; more like $100
    - This is an $800 underestimate, which amounts to a 30% increase (on the $1.5M baseline)
    - With the increase to 36 channels per card, this amounts to about $200
    - Can use mass-terminated fiber connectors, 12 fibers per connector, at a cost of $400
      - Would bring the costs back nearer to baseline
    - Warnings from Magnus on delivery time, etc., for these parts now