Australian SKA Pathfinder (ASKAP): The Project, its Software Architecture, Current and Future Developments
Juan Carlos Guzman
ASKAP Computing IPT – Software Engineer
ASKAP Monitoring and Control Task Force Lead
EPICS Collaboration Meeting, 30th April 2009, Vancouver, Canada

Outline
- Overview of the ASKAP Project
- ASKAP Software Architecture: EPICS and ICE
- Current and Future Developments

Australian SKA Pathfinder Project = 1% SKA
- Wide field of view telescope (30 square degrees)
- Sited at Boolardy, Western Australia
- Observes between 0.7 and 1.8 GHz
- 36 antennas, 12 m diameter
- 30-beam phased array feed on each antenna
- Mainly a survey telescope, but it must also support targeted observations

Scientific capabilities:
- Survey HI emission from 1.7 million galaxies up to z ~ 0.3
- Deep continuum survey of the entire sky
- Polarimetry over the entire sky

Technical pathfinder:
- Demonstration of WA as the SKA site
- Phased Array Feeds (PAF)
- Computing

ASKAP Site
Traditional lands of the Wajarri Yamatji
Murchison Radio Observatory (MRO): Australia's SKA candidate site

ASKAP Site
Gazetted towns: 0; population: "up to 160"
Geraldton

ASKAP Site (current office)

ASKAP Site (current staff)

ASKAP Computing Challenges
- Remote operation of the telescope: technical operations from Geraldton; science operations from Sydney; science archive in Perth
- A novel active element in the data flow, the Phased Array Feeds: parallel mosaicing! Performance of the PAFs is unknown
- Very large data flow, ~30 VLAs running in parallel: cannot archive all observed data; automated processing is vital; parallel and distributed computing necessary
- Science analysis on the fly: optimal algorithms running automatically
- Limited computing staff: ~60 FTE-years planned for mid 2006 to mid 2013

Innovative "sky-mount" in ASKAP antennas
- Three-axis telescope to keep antenna side-lobes (and PAF) fixed on the sky
- A triumph of software over hardware!

ASKAP Status and Timeline
- ASKAP antenna contract awarded to the 54th Research Institute of China Electronics Technology Group Corporation (CETC54) in late 2008: antenna #1 to be delivered in Dec 2009; antenna #36 to be delivered in Dec 2011
- Digital, Analogue and Computing system PDRs held early this year; CDRs expected to be completed at end 2009
- Boolardy Engineering Test Array (BETA), a 6-antenna interferometer: first antenna to be deployed end 2009; first antenna with PAF, Analogue, Digital and Computing systems mid 2009; commissioning starts end 2010
- Full ASKAP commissioning to begin late 2011
- Full ASKAP operational late 2012

Outline
- Overview of the ASKAP Project
- ASKAP Software Architecture: EPICS and ICE
- Current and Future Developments

ASKAP Top-Level Software Architecture
Monitoring and Control is one part of a larger data flow system

Technical Requirements for Monitoring and Control Components
- Permanent archiving of monitoring data: 100-150k monitoring points, approx. 4 TB/year (a rough check follows below)
- Reporting tools and access to external software
- Monitoring and control of many widely distributed devices, built both in-house and COTS (a heterogeneous distributed system)
- Accurate time synchronisation between hardware subsystems
- Delivery of visibilities and their metadata to the Central Processor for subsequent processing
- Collection, archiving and notification of alarm conditions
- Collection, archiving and display of logs
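
As a sanity check on those numbers, a back-of-envelope calculation (a minimal sketch; the per-sample size is an assumed figure, not from the talk):

```python
# What average per-point sample rate yields ~4 TB/year for ~150k points?
# bytes_per_sample (value + timestamp + overhead) is an assumption.
points = 150_000
bytes_per_sample = 16
seconds_per_year = 365 * 24 * 3600

target_bytes = 4e12  # 4 TB/year
rate_hz = target_bytes / (points * bytes_per_sample * seconds_per_year)
print(f"average rate per point: {rate_hz:.3f} Hz")  # ~0.05 Hz, one sample every ~20 s
```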

TOS Technologies: Evaluation of control software frameworks
Four alternatives selected, each in use in other major research projects or facilities and supported on Linux:
- One commercial (SCADA): PVSS-II
- Three non-commercial: Experimental Physics and Industrial Control System (EPICS), ALMA Common Software (ACS), TANGO

TOS Technologies: Evaluation of control software frameworks (cont.)
Developed by: PVSS-II: ETM (Siemens) | EPICS: ANL, LANL | ACS: ESO | TANGO: ESRF, SOLEIL, ELETTRA, ALBA
Platforms: PVSS-II: Linux, Windows, Solaris | EPICS: Windows, Linux, MacOS, Solaris, vxWorks, RTEMS | ACS: Linux, vxWorks, RTAI | TANGO: Windows, Linux, Solaris
Programming languages: PVSS-II: C/C++, CTRL (scripts) | EPICS: C/C++ (IOC); Java, Python and others (clients) | ACS, TANGO: C++, Java, Python
Middleware: PVSS-II: N/A | EPICS: Channel Access | ACS, TANGO: CORBA
Supported I/O devices: PVSS-II: ~hundreds | EPICS: 300 | ACS: 10 (?) | TANGO: 350
Database: PVSS-II: Yes (proprietary) | EPICS: Yes | ACS: Yes (XML) | TANGO: Yes (MySQL)
GUI toolkit: PVSS-II: Yes | EPICS: Yes | ACS: No | TANGO: Yes
Logging support: Yes for all four
Archiving tools: PVSS-II: Yes | EPICS: Yes | ACS: No | TANGO: Yes
Alarm handler: PVSS-II: Yes | EPICS: Yes | ACS: ? | TANGO: ?
Bulk data support: PVSS-II: No | EPICS: No | ACS: Yes | TANGO: No
Security: PVSS-II: Yes | EPICS: Yes | ACS: No | TANGO: No (proposed)
Redundancy support: PVSS-II: Yes | EPICS: No | ACS: No | TANGO: No

TOS Technologies: Evaluation of control software frameworks (cont.)
- Developing all technical software infrastructure in-house is costly and risky
- Several alternatives, both commercial and non-commercial, can "do the job"
- PVSS-II is relatively expensive at ASKAP's scale and would tie us to a commercial entity
- ACS (CORBA-based), despite being used in a radio interferometer, is relatively new and too complex for our needs
- TANGO (CORBA-based) is also relatively new and has a small user base, mainly within the European synchrotron community
- For ASKAP we have chosen EPICS as the software framework for the monitoring and control system
- Evaluation report (ASKAP-SW-0002) version 1.0 released in late December 2008

TOS Technologies: Why EPICS?
- Free, open source and with a very active community
- Both clients and servers run on many platforms; not only VxWorks!
- Proven technology, proven scalability
- All a client needs to know is the PV name: no messing around with fixed addresses (see the sketch below)
- Lots of software tools available on the web
- The real-time database design appeals also to non-programmers
- Presents a unified interface to high-level control (easier integration)
- Common software for hardware subsystem developers
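
To illustrate the name-based access model, a minimal Channel Access client sketch using the pyepics library (the library choice and the PV names are assumptions, not from the talk):

```python
# All a client needs is the PV name; Channel Access resolves the host.
# PV names below are hypothetical, not real ASKAP records.
from epics import caget, caput

temperature = caget("ant01:paf:temperature")   # read by name
caput("ant01:drive:az", 135.0, wait=True)      # write, blocking until processed
print(temperature)
```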

TOS Technologies: Limitations of EPICS
- Closed-loop control via Channel Access only up to ~20 Hz; fast control (hard real-time) will be implemented at a lower level
- Channel Access supports limited data types (e.g. no XML): soft real-time control usually does not require complex data types; complex data types will be handled at a higher level (ICE middleware); alternatives exist to extend data types (arrays, the aSub record)
- Channel Access is optimised for 16k packets, with no support for bulk data transfer: the maximum size can be increased, but at a performance penalty; bulk data transfer will use a different mechanism to avoid overloading the monitoring and control network, e.g. point-to-point UDP (visibilities) or files (beamformer weights)
- No built-in request/response style of communication: workarounds are available, e.g. CAD/CAR/SIR records, aSub, or the use of two records (sketched below); real-time control is usually asynchronous anyway
- Has not been used recently in astronomical projects: misconceptions and "bad press" rather than technical limitations
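
The "two records" workaround can be sketched as follows, again with pyepics; the PV names and the completion convention are assumptions:

```python
# Request/response over Channel Access using two records: write a command
# to one PV, then poll a status PV until it reports completion.
import time
from epics import caget, caput

def request_response(cmd_pv, status_pv, value, timeout=5.0):
    caput(cmd_pv, value, wait=True)        # issue the request
    deadline = time.time() + timeout
    while time.time() < deadline:
        if caget(status_pv) == 0:          # 0 = done (assumed convention)
            return
        time.sleep(0.1)                    # polling; a camonitor callback would avoid this
    raise TimeoutError(f"no completion on {status_pv}")

# e.g. request_response("ant01:drive:park", "ant01:drive:status", 1)
```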

Preliminary TOS Logical View
Hardware subsystem IOCs:
- Antenna IOC
- Beamformer IOC
- LO IOC
- Event Generator IOC
- Correlator IOC
- PAF Receiver IOC
- Critical Monitoring IOC
- Weather Station IOC
Total: ~120 soft IOCs

Technologies for Top-Level Message Bus

Technologies for Top-Level Message Bus: Evaluation of Middleware
Requirements:
- Supports Linux
- Language bindings for Java, C++ and Python
- Support for request/response style communication, both synchronous and asynchronous
- Support for publish/subscribe style communication
- Promotes loose coupling
- Support for fault tolerance (replication, etc.)
- (Desirable) Free license
- (Desirable) Mature

Three alternatives evaluated:
- Apache Tuscany
- ActiveMQ (JMS)
- Internet Communications Engine (ICE)

Technologies for Top-Level Message Bus: Evaluation of Middleware (cont.)
- Apache Tuscany was found to be too immature for our needs
- Both ActiveMQ and ICE meet our needs, but we selected ICE over ActiveMQ/JMS primarily because of its interface definition language: it avoids us having to define our own interface definition language, and to build our own bindings between the language and the interface definition
- ICE's interface definition language is reasonably strict, which eliminates the ambiguity present in other methods of defining such interfaces
- Many of our interfaces also appear best suited to an object-oriented model (see the sketch below)
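
To make the interface-definition argument concrete, here is a sketch of a Slice definition and a Python client; the interface, file name, endpoint and point name are invented for illustration and are not the actual ASKAP interfaces:

```python
# monitoring.ice (hypothetical Slice definition):
#   module askap {
#       interface MonitoringService {
#           double getPointValue(string name);
#       };
#   };
import sys
import Ice

Ice.loadSlice("monitoring.ice")   # generate Python bindings at runtime
import askap                      # module produced from the Slice file

with Ice.initialize(sys.argv) as communicator:
    base = communicator.stringToProxy("MonitoringService:tcp -h monhost -p 10000")
    service = askap.MonitoringServicePrx.checkedCast(base)
    if service is None:
        raise RuntimeError("proxy does not implement MonitoringService")
    print(service.getPointValue("ant01.paf.temperature"))
```

The strictness pays off here: the Slice compiler generates the typed proxy class, so a client cannot call getPointValue with the wrong signature without an error.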

Outline
- Overview of the ASKAP Project
- ASKAP Software Architecture: EPICS and ICE
- Current and Future Developments

Current Developments related to EPICS
- MoniCA, our EPICS CA archiver
- SNMP device support based on devSNMP (see Euan Troup's talk)
- USB2SPI device support: an in-house USB-to-SPI hub device to communicate with the Analogue system (receivers, local oscillators and some primary monitoring devices); uses the FTDI USB DLP-2232M module; driver support based on the ASYN driver framework
- Parkes Testbed Prototype system: operator display (EDM); observation scripts in Python (see the sketch below); 4 Linux IOCs deployed on one PC:
  - Antenna (TCP, StreamDevice)
  - Digital Back-End (ASCII socket, ASYN)
  - Synthesizers (vxi11, StreamDevice)
  - PAF and power (usb2spi, ASYN)
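
For flavour, a toy Python observation script of the kind mentioned above; the PV names and scan logic are entirely invented, not the actual Parkes testbed scripts:

```python
# Point the antenna, wait until on-source, run a fixed-length scan.
# All PV names are hypothetical.
import time
from epics import caget, caput

def point_and_scan(az_deg, el_deg, scan_seconds):
    caput("ant01:drive:az", az_deg, wait=True)
    caput("ant01:drive:el", el_deg, wait=True)
    while caget("ant01:drive:onSource") != 1:   # wait for tracking
        time.sleep(0.5)
    caput("obs:scan:start", 1)                  # trigger the backend
    time.sleep(scan_seconds)
    caput("obs:scan:stop", 1)

point_and_scan(az_deg=135.0, el_deg=60.0, scan_seconds=30)
```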

MoniCA
- Monitoring system developed in-house over recent years, primarily for the Australia Telescope Compact Array (ATCA)
- Used at all existing ATNF observatories; familiar to operations staff
- Many "data sources" exist and new ones can be added; an EPICS data source is already implemented and in use at Parkes
- Server and client (GUI) implemented in Java, with extensive use of inheritance; relatively easy to extend/customise
- Supports aggregate/virtual points derived by combining values from other points (see the sketch below)
- Archives to different types of storage: single file or relational database (MySQL)
- An open-source version is available on Google Code, though without much documentation yet
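
The aggregate/virtual-point idea can be sketched language-neutrally (MoniCA itself is Java); the Python below, with invented point names, shows a derived point combining several underlying ones:

```python
# A "virtual" monitor point: site-wide mean PAF temperature across antennas.
from epics import caget

def mean_paf_temperature(n_antennas=36):
    values = [caget(f"ant{n:02d}:paf:temperature") for n in range(1, n_antennas + 1)]
    values = [v for v in values if v is not None]   # drop disconnected points
    return sum(values) / len(values) if values else None

print(mean_paf_temperature())
```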

MoniCA Client

Future Developments on the ASKAP Control System
- MoniCA enhancements: hierarchical support; scalability beyond 20,000 points
- Develop hardware subsystem interfaces (EPICS IOCs) in conjunction with other teams, using the ASYN framework for device support of the ASKAP digital devices: ADCs, beamformers and the correlator (FPGA-based)
- Integrate EPICS IOC logs into our ICE-based logging server
- Alarm management system: looking at alternatives such as AMS (DESY) and LASER (CERN)
- BETA integration should start early 2010

Thank you
Juan Carlos Guzman
ASKAP Computing IPT – Software Engineer
Australia Telescope National Facility