SLAC Program Review, June 7, 2006 — 3 Gpix Camera: Camera DAQ/Control System. T. Schalk, CCS team.



LSST Control Systems (diagram): the Observatory Control System coordinates the Telescope Control System, Data Mgmt. Control System, Camera Control System, and Aux. Equip./Calibration Control System. A Primary Command Bus carries target/mode requests and acknowledgments to/from the scheduler; status/data buses feed the Facility Database. Also shown: time/date distribution, data transport, and scheduling of activities within the camera.

Camera Assembly (diagram labels): filter in stored location, L1 lens, L2 lens, shutter, L1/L2 housing, camera base ring, camera housing, cryostat outer cylinder, cold plates, L3 lens in cryostat front-end flange, raft tower (raft with sensors + FEE), filter carousel main bearing, utility trunk, filter in light path, filter changer rail paths, focal plane fast actuators, BEE module.

The LSST Focal Plane (diagram): 3.2 Gpixels; 3.5 deg FOV illumination limit; guider sensors (yellow); wavefront sensors (red).

Full CCD showing segmentation. Science data acquisition begins here: readout from the CCDs (16 × 2 × 9 CCDs = 288 A-to-Ds per raft).

Design strategy for this system:
— Control is distributed to the local subsystem level where possible, with time-critical loops closed at the local level.
— Subsystem control is embedded in each subsystem and communicates with the CCS via a master/slave protocol.
— One camera control system (CCS) module (CCM) is the master, responsible for scheduling tasks and for communication with the OCS.
— Coordination is via messages between the CCS and its subsystems; there is no direct subsystem-to-subsystem communication (publish/subscribe model).
— Separate command/control bus and data buses.
— Extensive logging capabilities.
— Assume the need to support engineering and maintenance modes.
— Accommodations made for test-stand(s) support.
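The publish/subscribe coordination described above can be sketched as a tiny in-process message bus. All names here (MessageBus, the "shutter.command" topic, the message fields) are illustrative, not the actual CCS software:

```python
# Minimal sketch of the publish/subscribe model: subsystems never talk
# to each other directly; everything flows through the bus, which also
# provides the "extensive logging" called for in the design strategy.
from collections import defaultdict

class MessageBus:
    def __init__(self):
        self._subscribers = defaultdict(list)
        self.log = []  # every message is logged before dispatch

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        self.log.append((topic, message))
        for callback in self._subscribers[topic]:
            callback(message)

bus = MessageBus()
# A hypothetical shutter subsystem listening for commands from the CCM.
received = []
bus.subscribe("shutter.command", received.append)
bus.publish("shutter.command", {"action": "open", "duration_s": 15})
print(received)  # [{'action': 'open', 'duration_s': 15}]
```

Because subsystems only see topics, adding an engineering/maintenance-mode listener is just another `subscribe` call, with no change to the publishing side.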

Camera Control Architecture (diagram): the Camera Control (CCS) node links command and status buses to the observatory buses and control room. Camera-body subsystems: thermal (T5U), science DAQ (SDS), guide analysis (GAS), WF DAQ (WDS), lens (L2U), thermal (T3U, T4U), shutter (SCU), filters (FCS), vacuum (VCS), power/signal (PSU). Cryostat subsystems: thermal (T1U, T2U), FP actuation (FPU), guide array (GSS), wavefront (WFS), science array (SAS), raft alignment (RAS). Auxiliary systems complete the camera buses.

Subsystems mapping to managers (diagram): every arrow has an interface at each end; red indicates a CCS-group responsibility. Data-producing subsystems pair with subsystem managers: SAS with SDS (command/response toward OCS and CCS, data toward DM), and similarly WFS/WDS and GSS/GAS. Subsystems that produce no data, only status info (FCS, and similarly TSS, RAS, SCU, VCS, and L2U), have no data path.
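The pairings in the diagram can be captured as a small lookup table. The subsystem/manager pairings come from the slide; the dict layout and the use of `None` for status-only subsystems are this sketch's own:

```python
# Hypothetical subsystem-to-manager table. Data-producing subsystems
# map to their DAQ/analysis managers; status-only subsystems map to None.
SUBSYSTEM_MANAGERS = {
    "SAS": "SDS",   # science array -> science array DAQ
    "WFS": "WDS",   # wavefront sensors -> wavefront DAQ
    "GSS": "GAS",   # guide sensors -> guide analysis
    # status-only subsystems (no data path)
    "FCS": None, "TSS": None, "RAS": None,
    "SCU": None, "VCS": None, "L2U": None,
}

data_producers = sorted(k for k, v in SUBSYSTEM_MANAGERS.items() if v)
print(data_producers)  # ['GSS', 'SAS', 'WFS']
```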

CCD transport design assumptions:
— The camera's data are carried on 25 (21?) optical fibers (one per raft).
— Data are delivered by the camera to the SDS in 2 seconds.
— These fibers carry only data; data flows only from camera to SDS on these fibers (half duplex).
— The fiber protocol is TBD.
— The data rate from a (fully populated) raft is 2.25 Gbits/sec (≈280 Mbytes/sec).
— Total aggregate (201 CCDs) output rate is a few Gbytes/sec.
— Data must be carried from the camera and delivered to its client (software) interface with a latency of not more than one (1) second.
— Interfaces define commodity networking as a MAC layer => trade study.
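A back-of-the-envelope check of the assumptions above. The 2.25 Gbit/s per-raft rate, 25 fibers, 2 s delivery, and 3.2 Gpix focal plane are from the slides; the 2-bytes-per-pixel payload size is this sketch's assumption:

```python
# Sanity-check the transport numbers: per-raft byte rate, the aggregate
# line-rate ceiling of all fibers, and the actual pixel payload rate.
RAFT_LINE_RATE_GBPS = 2.25   # per-fiber line rate, Gbit/s (from slide)
N_FIBERS = 25                # one per raft (from slide)
DELIVERY_S = 2.0             # camera-to-SDS delivery time (from slide)
PIXELS = 3.2e9               # focal plane size (from focal-plane slide)
BYTES_PER_PIXEL = 2          # assumed raw pixel size

raft_MBps = RAFT_LINE_RATE_GBPS * 1e9 / 8 / 1e6        # ~281 MB/s
ceiling_GBps = raft_MBps * N_FIBERS / 1e3              # all fibers at line rate
payload_GBps = PIXELS * BYTES_PER_PIXEL / DELIVERY_S / 1e9

print(f"per-raft line rate: {raft_MBps:.0f} MB/s")
print(f"aggregate ceiling:  {ceiling_GBps:.1f} GB/s")
print(f"pixel payload rate: {payload_GBps:.1f} GB/s")
```

The payload rate sits comfortably under the aggregate line-rate ceiling, which is consistent with the slide's "a few Gbytes/sec" aggregate figure.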

CDS Architecture (diagram): camera-specific front end => standard I/O.

The first detailed designs are for the DAQ.

RNA hardware layout: a "pizza box" chassis.

Simultaneous DMA to memory for speed.

Infrastructure Layer:
— Long-haul communications (Base to Archive, Archive to Data Centers): 10 gigabit/second protected clear-channel fiber optics, with protocols optimized for bulk data transfer.
— Base Facility (Chile): Nightly Data Pipelines and Products are hosted here on 25-teraflop-class supercomputers, providing primary data reduction and transient alert generation in under 60 seconds.
— Mountain Site (Chile): data acquisition from the Camera Subsystem and the Observatory Control System, with readout in 2 seconds and data transfer to the Base at 10 gigabits/second.
— Archive/Data Access Centers (United States): Nightly Data Pipelines, Data Products, and the Science Data Archive are hosted here. Supercomputers capable of 60 teraflops provide analytical processing, re-processing, and community data access via Virtual Observatory interfaces to an archive growing at roughly 7 petabytes/year.
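A rough timing check on the long-haul figures above. The 10 Gbit/s link and ~7 PB/yr archive growth are from the slide; the 6.4 GB image size (3.2 Gpix at an assumed 2 bytes/pixel) is this sketch's assumption:

```python
# How long does one image take on the mountain-to-base link, and what
# average rate does 7 PB/yr of archive growth imply?
LINK_GBPS = 10.0                     # link speed, Gbit/s (from slide)
IMAGE_BYTES = 3.2e9 * 2              # 3.2 Gpix, assumed 2 bytes/pixel
PB_PER_YEAR = 7.0                    # archive growth (from slide)
SECONDS_PER_YEAR = 365.25 * 24 * 3600

transfer_s = IMAGE_BYTES * 8 / (LINK_GBPS * 1e9)
avg_ingest_gbps = PB_PER_YEAR * 1e15 * 8 / SECONDS_PER_YEAR / 1e9

print(f"one image over the link: {transfer_s:.2f} s")
print(f"average archive ingest:  {avg_ingest_gbps:.2f} Gbit/s")
```

Under these assumptions an image takes about 5 s to cross the 10 Gbit/s link, so transfers of successive visits must overlap, while the average archive ingest rate is well under 2 Gbit/s.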

Application Layer (diagram): the data acquisition infrastructure feeds the image processing, detection, and association pipelines, which populate the image archive, source catalog, object catalog, and alert archive; a deep detect pipeline builds the deep object catalog behind a VO-compliant interface. Middleware ties in the classification, moving object, and calibration pipelines, end-user tools, alert processing, the eng/fac data archive, and common pipeline components.
Nightly pipelines and data products: nightly pipelines are executed and data products produced within 60 seconds of the second exposure of each visit.
Science Data Archive: these pipelines are executed on a slower cadence, and the corresponding data products are those requiring extensive computation and many observations.

Data Management Organization. The team is headquartered at LSST Corporation, Tucson (Project Manager, Project Scientist, Software Engineers). The R&D team is creating the MREFC and DOE proposals. The construction team will be a Tucson-based management/functional team, with a small number of single-location outsourced implementation teams (e.g. NCSA, IPAC).
Application Layer: Caltech IPAC (application architecture); GMU, LLNL (community science scenarios); NOAO (lensed supernovae, non-moving transients, photometry); Princeton U (image processing, galaxy photometry); U Arizona (image processing, moving objects, association, photometry); UC Davis (deep detection, shape parameters); U Pittsburgh/CMU (photo-z, moving objects); U Washington (image processing, detection, classification); USNO (astrometry).
Middleware Layer: SLAC, JHU (database schema/indexing, provenance, performance/scalability for ingest/query); LLNL, UCB (database/pipeline integration, pipeline construction, alerting); NCSA (archive data access, pipeline control & management, security); NOAO (community data access/Virtual Observatory); SDSC (data product preservation).
Infrastructure Layer: SLAC (data acquisition, mountain/base communications); LLNL (base pipeline server, database server); NCSA, BNL (archive center/data center pipeline servers, file servers, data access servers, storage, communications); NOAO (base-to-archive communications).


ACRONYMS
CCS: camera control system
CCM: camera control master/module
OCS: observatory control system
TCS: telescope control system
DM: LSST data management system
SAS: science array system
SDS: science array DAQ system
RNA: raft network adapter
SCU: sample correction unit
WFS: wavefront system
WDS: wavefront data system
GSS: guide sensor system
GAS: guide sensor acquisition system
DSP: digital signal processor
FPU: focal plane actuation
TSS: thermal control system
RAS: raft alignment system
SCU: shutter control system
FCS: filter control system
VCS: vacuum control system
L2U: L2 actuation system
UML: Unified Modeling Language
MAC layer: medium access control layer, which provides functions supporting the operation of local area networking
FPGA: field-programmable gate array
DMA: direct memory access
MGT: multi-gigabit transceivers
IBA: InfiniBand Architecture
SDR: single data rate