ITER – CERN Timing Workshop, Geneva, 15 Feb 2008. Franck Di Maio – ITER IO

Presentation transcript:

ITER. CERN Timing Workshop, Geneva, 15 Feb 2008. Franck Di Maio – ITER IO.

Highlights
- CODAC overview
- High performance networks
- A view on the planning
Ref: IBF07

CODAC Packages
- WBS 4.5: CODAC (COntrol, Data Access & Communications)
- WBS 4.6: Central Interlock System (investment protection)
- WBS 4.8: Central Safety System
These are all Fund procurements, i.e. procured directly by the ITER Organization rather than in-kind. The plant systems and their controls, however, will be in-kind procurements.

CODAC Systems Overview (diagram, shown on two slides)

Networks segregation (diagram)

Plant Systems Integration (diagram). A plant system I&C consists of a Plant System Host and sub-system controllers connected over a field bus to the equipment and its local control panels, plus a Plant Interlock System and Plant Safety Systems; all of this is part of the plant system procurement. Through CODAC network interface units it connects to the central CODAC systems (WBS 4.5) via the CODAC networks (synchronous data, events, clocks, video), to the Central Interlock System (WBS 4.6) via the Interlock Network, and to the Central Safety Systems (WBS 4.8) via the Safety Networks.

Plant Systems Tests, Phase II (diagram). The plant system I&C (Plant System Host B, sub-system controllers, equipment, local control panels, plant interlock and safety systems) is tested against Mini-CODAC V2, which provides the CODAC networks (synchronous data, events, clocks, video) in place of the central systems, over the field bus.

Plant Systems Tests, Phase I (diagram). The plant system I&C (Plant System Host A, sub-system controllers, equipment, local control panels, plant interlock and safety systems) is first tested against Mini-CODAC V1 over the conventional CODAC networks and the field bus.

CODAC – Scope
Plant Systems I&C Support:
- General I&C specifications and interface definitions
- Selection of standardized components and tools (e.g. PLCs)
- Support for specifications, design and integration tests
- Mini-CODAC (for integration tests)
CODAC Hardware & Infrastructure:
- Servers: general purpose, data storage, high performance
- Networks: conventional + dedicated (synchronous data, event distribution, time distribution)
- Control room infrastructure
CODAC Software:
- Core software (communications, alarms, reporting, visualization…)
- Operation software (schedule management, pulse control…)
- Plant System Host (PSH) software
- Data management
Central Interlock System
Central Safety Systems (nuclear, access, conventional)

ITER Networks (1/2)
Conventional networks within CODAC scope:
- Plant Operation Network (PON): provides and manages all standard network communications connecting plant systems and CODAC systems; it is the backbone of CODAC.
- Plant Commissioning Network (PCN): same functionality as the PON, but connected only to plant systems in an un-commissioned state. It will be used during initial integration and commissioning, and during and after interventions that reduce a plant system's reliability to the extent that formal re-commissioning is required.
- Network Monitoring Network (NMN): provides independent monitoring of all CODAC networks and raises a CIS alarm if a network fails.
- Disaster Backup Network (DBN): provides an independent link, with no common-mode risks, between the CODAC data store and an off-site store.
- Audio-Video Network (AVN): provides the physical and logical support for audio and video communication throughout the ITER plant; these data include both monitoring information and experimental data.
Conventional networks outside of CODAC scope:
- General ITER Network (GIN): links CODAC to all other on-site and off-site activities. GIN will use the high performance RENATER network for all off-site user connections.
- Open Public Network (OPN): links CODAC to on-site and off-site activities that do not require ITER authorisation. OPN will use the high performance RENATER network for all off-site connections.

ITER Networks (2/2)
High performance CODAC networks:
- Synchronous DataBus Network (SDN): complementary to the CODAC communication network; it carries data required for operation, including feedback control, whenever the delay or jitter requirements on these data are stricter than the quality of service guaranteed by the PON.
- Time Communication Network (TCN): provides a project-wide definition of time and communicates this time to all systems, allowing actions, events and data in all ITER systems to be synchronised.
- Event Distribution Network (EDN): signals intermittent events between CODAC systems or plant systems with a lower latency than the Synchronous DataBus.
Safety and interlock networks:
- Central Interlock System Network (CIN): provides the physical network and the supervisory logic to allow continued operation of the plant in its present state, or to trigger an appropriate corrective action if an off-normal event is detected and not avoided by preventive CODAC actions.
- Central Safety System Networks (CSN): provide the physical network and the supervisory logic to allow continued operation of the plant in a safe state, or to trigger an appropriate corrective action if an off-normal event is detected and not avoided by preventive CODAC actions.

Synchronous Data Bus
Non-persistent, transient, low-latency (1 ms) data representing engineering and physics quantities, needed for fast control of the plant and of the plasma.

Number of signals:               4000
Jitter:                          0.05 ms
Data producers (plant systems):  80
Data consumers:                  50
Transmission delay:              1 ms
Source-to-destination delay:     5 ms
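To make these figures concrete, here is a minimal sketch of an SDN-style cyclic publisher: one datagram per 1 ms cycle, with loss detected from sequence-number gaps rather than retransmission of transient data. This is not the ITER implementation; the slides fix neither a transport nor a payload format, so the UDP multicast endpoint, the packet layout and the read_plant_signal() helper below are all illustrative assumptions.

```python
# Sketch of a 1 ms cyclic publisher in the spirit of the SDN.
# Assumptions (not from the slides): UDP multicast transport and a
# 24-byte payload of sequence number + timestamp + one signal value.
import socket
import struct
import time

GROUP, PORT = "239.0.0.1", 5005   # hypothetical multicast endpoint
CYCLE = 1e-3                      # 1 ms transmission cycle (from the slide)

def read_plant_signal() -> float:
    """Hypothetical stand-in for one engineering/physics signal."""
    return 0.0

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)

seq = 0
deadline = time.monotonic() + CYCLE
while True:
    # Non-persistent datagram: consumers detect loss from gaps in seq;
    # transient data is never retransmitted.
    payload = struct.pack("!Qdd", seq, time.time(), read_plant_signal())
    sock.sendto(payload, (GROUP, PORT))
    seq += 1
    # Spin to the next deadline to bound jitter; meeting the 0.05 ms
    # jitter figure realistically needs a real-time OS and a dedicated
    # network, not general-purpose Python.
    while time.monotonic() < deadline:
        pass
    deadline += CYCLE
```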

Time Communication Network
The Time Communication Network (TCN) provides a project-wide definition of time and communicates this time to all systems. It allows actions, events and data in all ITER systems to be synchronised.

Time resolution:  10 ns
Time precision:   10 ns
Synchronises itself to UTC as the absolute reference, to within 1 s.
Guarantees monotonic progression of its time, recovering any error with respect to UTC with a correction rate of 1e-4 ??
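The monotonicity requirement combined with a bounded correction rate effectively rules out stepping the clock: errors against UTC must be removed by slewing the rate. A minimal sketch, assuming a simple proportional slew clamped at the slide's 1e-4 correction rate (the class and method names are illustrative, not from the ITER specification):

```python
# Sketch of a monotonic, slewed clock discipline. With the rate
# multiplier clamped to [1 - 1e-4, 1 + 1e-4] it stays positive, so
# time never runs backwards; offsets to UTC are removed gradually.

MAX_SLEW = 1e-4  # maximum fractional correction rate (from the slide)

class DisciplinedClock:
    def __init__(self) -> None:
        self.local = 0.0   # disciplined time in seconds
        self.rate = 1.0    # rate multiplier vs. the free-running clock

    def tick(self, dt: float) -> None:
        """Advance by one free-running interval dt > 0 (monotonic)."""
        self.local += dt * self.rate

    def steer(self, offset_to_utc: float) -> None:
        """Slew against a measured offset (local - UTC); never step."""
        self.rate = 1.0 + max(-MAX_SLEW, min(MAX_SLEW, -offset_to_utc))
```

At the maximum slew of 1e-4, removing a full 1 s offset takes 10^4 s, just under three hours, which is consistent with requiring the TCN to stay within 1 s of UTC in the first place.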

Event Distribution Network
The Event Distribution Network (EDN) manages the signalling of intermittent events between CODAC systems or plant systems with a lower latency than the Synchronous DataBus.

Number of event sources:    500
Number of event consumers:  500
Maximum latency:            10 µs ??
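A 10 µs fan-out latency implies dedicated event-timing hardware rather than general-purpose software, but the programming model it supports is simple: numbered event codes delivered to registered consumers. A minimal sketch of that model only (class, method and event names are illustrative, not from the slides):

```python
# Sketch of event-distribution semantics: numbered event codes fanned
# out to registered consumer callbacks. Only the programming model is
# shown; the latency target belongs to dedicated hardware.
from typing import Callable, Dict, List

class EventDistributor:
    def __init__(self) -> None:
        self._handlers: Dict[int, List[Callable[[int], None]]] = {}

    def subscribe(self, event_code: int,
                  handler: Callable[[int], None]) -> None:
        """Register a consumer callback for one event code."""
        self._handlers.setdefault(event_code, []).append(handler)

    def fire(self, event_code: int) -> None:
        """Signal an intermittent event to all its consumers."""
        for handler in self._handlers.get(event_code, []):
            handler(event_code)

# Usage: a plant system reacts to a hypothetical "pulse start" event.
edn = EventDistributor()
edn.subscribe(0x10, lambda code: print(f"pulse start, event 0x{code:02x}"))
edn.fire(0x10)
```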

CODAC – Provisional Schedule: Plant Systems I&C Support (partial)
(Gantt chart covering specifications, factory tests, PSH software, PSH hardware and Mini-CODAC. Provisional: to be adjusted with IPS milestones.)

CODAC – Provisional Schedule: CODAC Hardware & Infrastructure (partial)
(Gantt chart covering buildings, servers, networks and on-site tests. Provisional: to be adjusted with IPS milestones.)

CODAC – Provisional Schedule: CODAC Software (partial)
(Gantt chart, new schedule, covering core software, operation software, Plant System Host software, on-site tests and commissioning. Provisional: to be adjusted with IPS milestones.)