Requirements for ITER CODAC


Requirements for ITER CODAC Franck Di Maio CODAC & IT CHD Department

Outline: Introduction (PSH & Mini-CODAC), R&D Tasks, The EPICS Decision, Plans, Self-Description, Architecture & tools, Conclusion

Introduction: ITER I&C Architecture

Introduction: CODAC Architecture (diagram). Main elements: Central CODAC Systems, Plasma Control, Operator Software, Synchronous Data Middleware, Plant System Host, Fast Controller, PLC.

Introduction: CODAC Architecture, Plant System Host (PSH). The PSH is integrated in the Plant System Instrumentation & Control (I&C) and provides a single point of entry for communication between the CODAC systems and the plant system's local controllers. It is in charge of configuration management, command dispatching, state monitoring, alarms and logs interfacing, and data flow and events dispatching. It is supplied by the ITER Organization (IO).
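
The slide describes the PSH role only at the concept level. As a minimal sketch (not part of the presentation), command dispatching and state monitoring could look like this from a central client, assuming EPICS Channel Access via pyepics and purely hypothetical PV names:

```python
# Hedged sketch: a central CODAC client sending a command to a Plant System
# Host and monitoring its state over Channel Access. The PV names
# (CTRL-PSH1:STATE, CTRL-PSH1:CMD) are invented for illustration;
# pyepics is assumed to be installed.
from epics import PV

state = PV('CTRL-PSH1:STATE')    # state monitoring
command = PV('CTRL-PSH1:CMD')    # command dispatching

def on_state_change(pvname=None, value=None, **kw):
    # A real PSH would also forward alarms and log messages to central services.
    print('{} -> {}'.format(pvname, value))

state.add_callback(on_state_change)
command.put('GOTO_READY', wait=True)   # ask the plant system to change state
```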

Introduction: CODAC Architecture, Before Integration. The Mini-CODAC implements a subset of the CODAC systems' functions. It provides a SCADA environment for development: configuration management, local supervision, Human Machine Interface (HMI), alarms, logs, data handling… It is a tool for acceptance tests at the factory and on site.
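
A hypothetical illustration of the acceptance-test role, assuming pyepics with invented PV names and limits; in practice the Mini-CODAC tests would be driven by configuration and self-description data:

```python
# Sketch of a Mini-CODAC style factory acceptance check: read a list of plant
# system PVs and compare them with expected ranges. All names and limits are
# placeholders for illustration only.
from epics import caget

expected = {                        # would in practice come from configuration data
    'PS1:VACUUM:PRESSURE': (0.0, 1e-5),
    'PS1:PSH:STATE': (1, 1),        # 1 = READY (hypothetical encoding)
}

failures = []
for pv, (lo, hi) in expected.items():
    value = caget(pv, timeout=2.0)
    if value is None or not (lo <= value <= hi):
        failures.append((pv, value))

print('FAT check passed' if not failures else 'FAT check failed: {}'.format(failures))
```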

Outline: Introduction, R&D Tasks, The EPICS Decision, Plans, Self-Description, Architecture & tools, Conclusion

R&D Tasks: Data Exchange Modeling Task ("MODEX"). A model of the Generic Plant Interface (GPI) interactions between CODAC and the plant systems. The engineering model was developed using SysML, and a prototype was built and demonstrated. Task Agreement with ITER-IN.

R&D Tasks: SCADA Survey. A market survey plus an evaluation of selected products (including iFIX) against the ITER requirements. The open-source products EPICS and TANGO best match the ITER requirements: TANGO relies on more recent technology, while EPICS benefits from a large community and strong support. Contract with ATOS Origin.

R&D Tasks: Communication Technologies Survey. Evaluation of selected communication technologies against specific use cases: Channel Access, CORBA (TAO, omniORB), DDS (RTI), ICE (ZeroC). Channel Access and CORBA have limits, but a replacement or a complement is not justified now; it is recommended to use an API that abstracts the implementation. Channel Access limits: no built-in command invocation (OK for a model with simple commands); performance is OK, except for large data (would need a new version or dedicated data-stream services); scalability is limited (multi-layer architecture). Contract with Cosylab.
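
The recommended abstraction could be as simple as a thin interface that application code depends on instead of Channel Access directly. The sketch below is an assumption of what such a layer might look like (interface and class names invented, pyepics assumed), not an ITER API:

```python
# Illustrative abstraction layer: application code talks to a small interface,
# so the middleware could later be replaced or complemented without touching
# the applications.
from abc import ABC, abstractmethod

class Middleware(ABC):
    @abstractmethod
    def read(self, name): ...
    @abstractmethod
    def write(self, name, value): ...
    @abstractmethod
    def subscribe(self, name, callback): ...

class ChannelAccessMiddleware(Middleware):
    """Channel Access backend (pyepics assumed to be installed)."""
    def __init__(self):
        from epics import PV
        self._PV = PV
        self._pvs = {}

    def _pv(self, name):
        if name not in self._pvs:
            self._pvs[name] = self._PV(name)
        return self._pvs[name]

    def read(self, name):
        return self._pv(name).get()

    def write(self, name, value):
        self._pv(name).put(value)

    def subscribe(self, name, callback):
        # pyepics callbacks receive keyword arguments; adapt to a simple signature
        self._pv(name).add_callback(lambda value=None, **kw: callback(name, value))

# Application code only sees the Middleware interface, e.g.:
# mw = ChannelAccessMiddleware(); print(mw.read('PS1:TEMP'))
```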

R&D Tasks: PSH Prototype. Objectives: on-site evaluation of EPICS and TANGO, and pre-engineering of the CODAC concepts. Use cases: direct control of the I/O channels of a PLC; integration of another type of PLC simulating the control of a plant system. Both use cases were implemented on both EPICS and TANGO. Contract with Alceli Hunt Beratung.

R&D Tasks: Prototype Architecture (diagram). Two parallel chains, one based on EPICS and one on TANGO. EPICS chain: Mini-CODAC 1 (MEDM, StripTool, Python/Java/C/C++ clients) and PSH 1 (S7 IOC, CPS IOC, Modbus IOC). TANGO chain: Mini-CODAC 2 (Jdraw, atkmoni, Python/Java/C/C++ clients) and PSH 2 (S7 DS, CPS DS, Modbus DS, simple-case DS). The Mini-CODACs connect over an Ethernet network (PON); the PSHs connect over the plant system LAN (S7/TCP, Modbus/TCP) to a power supply simulation on a Siemens S7-300 (complex case) and to a Yokogawa Stardom FCJ I/O interface (simple case).

R&D Tasks: Prototyping Results. The two use cases have been implemented. Both EPICS and TANGO are an acceptable base for implementing the functions and the CODAC concepts (PSH, slow controller).

The EPICS Decision. It is a necessity for ITER to standardize the plant system controller software at the very beginning: there are currently 161 plant systems, with Factory Acceptance Tests (FAT) starting in 2012. February 2009: EPICS will be used as the baseline for the software environment of the ITER control system. Plant Control Design Handbook, v4.1, 06-May-2009: the software infrastructure for the PSH and Mini-CODAC is EPICS version R3.14.10. [R111] EPICS version R3.14.10 shall be used for PS fast controllers. [R112] Communication between PS fast controllers and the PSH shall use EPICS Channel Access. [R113] The operating system of the PS fast controllers shall be Linux (version [TBD]); deviations may be considered by the IO for difficult real-time cases. The same document specifies Siemens Simatic S7 for the PLCs.
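
As a small illustration of [R112], a PSH-side tool could verify that a fast controller's PVs are reachable over Channel Access before integration tests. The PV names below are invented and pyepics is assumed:

```python
# Hedged sketch: connectivity check from the PSH side towards a plant system
# fast controller IOC over Channel Access. Names are placeholders only.
from epics import PV

fast_controller_pvs = ['FC1:STATUS', 'FC1:SAMPLE-RATE']

for name in fast_controller_pvs:
    pv = PV(name)
    if pv.wait_for_connection(timeout=2.0):
        print('{}: connected, value={}'.format(name, pv.get()))
    else:
        print('{}: NOT reachable over Channel Access'.format(name))
```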

Outline: Introduction, R&D Tasks, The EPICS Decision, Plans, Self-Description, Architecture & tools, Conclusion

Plans: Core Systems. Core functions: communications; "SCADA" functions (HMI, alarm handling, error & trace logging, parameter monitoring, plant system supervision); data archiving; testing; configuration management (self-description). These functions are to be implemented by packaged CODAC core systems, built and distributed incrementally: one major release per year (first quarter), starting from 2010, and following the Mini-CODAC architecture for now.

Plans: Roadmap. Version 1 (2010/Q1): preliminary release. Version 2 (2011/Q1): stable release for developers. Version 3 (2012/Q1): stable release for FAT. Priorities: integrate PLCs (Siemens S7); develop configuration management (self-description); freeze the Application Programming Interfaces (APIs); integrate fast controllers (EPICS IOCs).

Plans: Resources. IO staff estimate: 4-6 ppy (2009-2012). Task Agreement with ITER-IN (3 years, from 2009/Q3) on the PSH and the prototype Mini-CODAC. Support contract (3 years, from 2009/Q4) for EPICS, QA and user support. New contracts in 2010. Surveys and collaborations: EPICS tools survey (June 2009); Task Agreement with ITER-KO on EPICS for Tokamak (August 2009).

Outline: Introduction, R&D Tasks, The EPICS Decision, Plans, Self-Description (Denis Stepanov), Architecture & tools, Conclusion

Plant System Self-Description… is a concept of providing all the necessary information about the plant systems along with the plant systems themselves. The ultimate goal is to make both the Plant System I&C and the CODAC software system-neutral, decreasing the hard-coded, system-specific programming and increasing the data-driven configuration part. It represents static configuration data that do not change during plant system operation; they can be modified through dedicated maintenance procedures. It forms part of the software interface between the Plant System I&C and the Central I&C Systems. It shall capture all "hidden knowledge" of the plant system configuration, at least in the form of documentation. It is expressed in XML, constrained by a well-defined W3C XML Schema (XSD). It has to be introduced and actively supported by the software from the very beginning, to avoid ending up as a huge set of inconsistent, unreliable, poorly maintained data.
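
A minimal sketch of the XSD constraint in practice, assuming lxml and placeholder file names; the actual ITER schema and tooling are not shown in the slides:

```python
# Validate a plant system self-description document against its XML Schema.
# File names are hypothetical; lxml is assumed to be installed.
from lxml import etree

schema = etree.XMLSchema(etree.parse('self_description.xsd'))
document = etree.parse('ps1_self_description.xml')

if schema.validate(document):
    print('self-description is valid')
else:
    for error in schema.error_log:
        print(error)   # line/column and message for each schema violation
```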

Self-Description Scope The Self-Description Data consist of: Plant System I&C unique identification; Command list; Alarms list; Set-points list; Plant System I&C Operating Limits and Conditions; Physical (raw) signals list (I/O); Processed / converted signals list; Data streams list; Logging messages list; Definition of the Plant System I&C state machine in accordance with the defined Plant System operating states; Definitions of Plant System I&C HMI; Initial values for run-time configuration used for Plant System I&C start-up; Identification of source codes and binary packages of the Plant System I&C specific software; Documentation. (as stated in the Plant Control Design Handbook v 4.1, May 2009)

Device descriptions for EPICS and TANGO in XML: EPICS (mapping of EPICS text templates and substitution lists); TANGO (mapping of TANGO's MySQL database).
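
As an illustration of the EPICS mapping, a self-description fragment could be translated into a dbLoadTemplate substitution file. The XML layout below is purely hypothetical; only the substitution-file syntax it emits is standard EPICS:

```python
# Sketch: generate an EPICS substitution file from a (hypothetical)
# self-description XML fragment, using only the Python standard library.
import xml.etree.ElementTree as ET

xml_text = """
<plantSystem id="PS1">
  <device template="power_supply.template" name="PS1:HV1" addr="1"/>
  <device template="power_supply.template" name="PS1:HV2" addr="2"/>
</plantSystem>
"""

root = ET.fromstring(xml_text)
lines = ['file "power_supply.template" {']
for dev in root.findall('device'):
    # one macro set per device instance
    lines.append('    {{ DEV="{0}", ADDR="{1}" }}'.format(dev.get('name'), dev.get('addr')))
lines.append('}')

print('\n'.join(lines))   # write to a .substitutions file in a real tool
```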

Self-description dataflow: operation (diagram). Numbered flows: (1) PSH static configuration; (2) PS parameters; (3) PS dynamic parameters; (4) PS devices dynamic parameters; (5) PS data; (6) PS data; (7) PS response.

Self-description dataflow: development (diagram). Numbered flows: (1) PS description; (2) PSH static configuration, development tools' project files; (3) program development, PS devices programs + static configuration; (4) PS parameters; (5) PS dynamic parameters; (6) PS devices dynamic parameters; (7) PS data; (8) PS data; (9) PS response; (10) problem report; (11) PCDH deliverables; (12) CODAC test data, PS development progress, PS requirements and needs.

Outline: Introduction, R&D Tasks, The EPICS Decision, Plans, Self-Description, Architecture & tools, Conclusion

Architecture: Core Systems, Mini-CODAC. CODAC systems: Alarm Handling (AH); Error & Trace Logging (EL); Live Database/Monitoring (LD); Data Archiving (DA); Data Retrieval (DR); Testing Tools (TT); Communication Middleware (CM); Generic Plant System Software (PS); Visualization / HMI Builder (VB); Plant System Self-Description (SD).

Architecture: Pure EPICS. Select the best mature EPICS tools to cover the core functions: configuration, data archiver, synoptics, error/trace logging, alarms. Benefits: stable, widely used (good support). Limits: not fully consistent; later migration to new ITER-adapted tools. The 2010 version: EPICS tools, a first version of the self-description system, S7 PLC integration.

Architecture: The Eclipse Mini-CODAC Alternative. Use Eclipse technology (Eclipse RCP) for Mini-CODAC: a consistent environment integrating the different functions. Join the CSS (Control System Studio) community to adopt, improve and develop new tools. Considered for future releases (after version 1).

Architecture: Version 1 (pure EPICS). EPICS components: VDCT; SNL and Sequencer; EDM; autosave; ALH; Channel Archiver and archive retrieval; Channel Access Gateway; iocLogServer; Wireshark-CA; casnooper; CAJ; cothread; S7 PLC driver.
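
As an example of using one of the client libraries listed above, a small monitoring script with cothread (assumed installed, hypothetical PV name) could look like this:

```python
# Illustrative cothread-based monitor for a PV published by a plant system IOC.
# The PV name is invented for this sketch.
import cothread
from cothread.catools import camonitor

def on_update(value):
    # value carries the new PV value, augmented by cothread with metadata
    print('update:', value)

camonitor('PS1:TEMPERATURE', on_update)
cothread.WaitForQuit()   # keep the cooperative scheduler running
```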

Conclusion: Main Requirements. Specific constraints: the ITER schedule (2012: start of FAT; 2015: start of integration; 2018: first plasma), the ITER procurement model (DA, IO, industry, labs) and the ITER size (~200 systems). Main requirements: a very good EPICS Base for many years (procurement: 10+), prescribed as a standard for all plant system controllers from R3.14.10; a Plant System Host with high reliability and high performance, being the critical interface between IOCs and central systems, with the CA gateway, S7 driver and RIOC (Linux, ATCA?) as key components; implementation of the self-description concept, a "prescriptive" management system for the plant systems with special requirements (XML, deliverable); new tools for central services and HMI (CSS tools? to be evaluated for version 2 or 3); and a stable API for the high-level applications, required for the design of ITER-specific applications (e.g. the scheduling system).