
1 The Wendelstein 7-X steady state DAQ and control system for initial operation
Andreas Werner and the W7-X CoDaC Team
Max-Planck-Institut für Plasmaphysik, EURATOM Association, Germany

2 Intro
Status and scope of the W7-X CoDaC project:
Definition of the initial configuration (end of 2014)
Sketch of the project until 2014
Achievements
Improvements

3 Coarse Control & DAQ Project Plan
[Timeline, 2005–2014: CoDa concept & basic implementation, WEGA test, internal reviews, final implementations & setup of the frontend systems, External Design Review, W7-X commissioning; milestone years shown in the chart: 2005, 2007, 2011, 2012, 2014.]

4 W7-X CoDaC Structure
[Architecture diagram: central systems (Central Control System PLC, Central Safety System PLC, Sequence Controller, Plasma Control, Configuration DB, Central Time & Event Distribution with GPS), Control Room & Monitoring, Scientific & Administration Offices, SOA / Data Analysis, a Control Room and Office Cloud with virtual desktops, storage systems (High Availability Storage, Archive Data Server, High Scalability Storage), a Central Server Cloud for central IT services, frontend systems each with Local PLC, Segment RT Control and DAQ, and a Long Term Archive & Policy Driven Data Management based on HPSS. Images by Siemens, Dell, EMC, HP, Oracle.]

5 WEGA Testbed & Achievements
Concept proof of PLC and RT segment control [J. Schacht et al., 6th IAEA TM]
Test of individual DAQ systems and integration of FPGA based systems
Test of communication between systems
Collaborative work on CoDa-system configuration
Indestructible small-size stellarator
Test of the experiment program planning tool (Xedit) and early involvement of experiment leaders

6 WEGA: Experiment Planning High Level Parameter (HLP) Concept
Initial W7-X operation with a constructive editor approach (the deductive approach is postponed):
Modification of segment programs and modification of HLPs
Preview and check of HLP set points => Xedit [A. Spring O3-4]
Transformation to low level (technical) parameters [HLP concept, H. Riemann et al., 7th IAEA TM]
Combination of different transformations and limits into component models [M. Lewerentz O3-3]
[Figure: WEGA experiment program (max. 39 s) with preparation, start-up, plasma operation and post-processing phases; traces for gas flow (valve open), magnetic field (preparation and ready states of magnets 1 and 2), magnetron (1,2) power (magnetrons off at the end) and high level parameters such as large plasma radius and iota / safety factor.]
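
As a minimal sketch of the HLP concept described above: a hypothetical component model maps a physics-level set point onto low level (technical) parameters and checks the configured limits. All class names, parameter names and numbers here are illustrative assumptions, not the actual W7-X CoDaC interfaces.

from dataclasses import dataclass

@dataclass
class Limit:
    low: float
    high: float

    def check(self, value: float) -> bool:
        return self.low <= value <= self.high

def ecrh_power_model(hlp_power_kw: float) -> dict:
    """Transform the HLP 'heating power' into technical set points (LLPs)."""
    limit = Limit(low=0.0, high=26.0)   # assumed overall power limit in kW
    if not limit.check(hlp_power_kw):
        raise ValueError(f"HLP power {hlp_power_kw} kW outside [{limit.low}, {limit.high}] kW")
    # split the requested power over two magnetrons (purely illustrative mapping)
    return {"magnetron1_kw": hlp_power_kw / 2, "magnetron2_kw": hlp_power_kw / 2}

# Xedit-style preview/check of a segment's HLP set point before execution
print(ecrh_power_model(20.0))   # -> {'magnetron1_kw': 10.0, 'magnetron2_kw': 10.0}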

7 WEGA Segment Control: ECRH OXB Transition
In the Bernstein wave heating scenario the power is raised and the density rises; at the O-X-B transition the microwave stray radiation drops.
The segment switch is triggered by the plasma state, using a combined signal of stray radiation and density, as soon as possible after the transition.
Heating changes from magnetron + gyrotron heating to gyrotron heating with central deposition only. [H. Laqua O3-5]
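
A minimal sketch of the plasma-state trigger for this segment switch, combining the stray radiation and density signals; signal names and thresholds are illustrative assumptions, not the actual WEGA implementation.

def oxb_transition_reached(stray_radiation: float, density: float,
                           stray_threshold: float = 0.2,
                           density_threshold: float = 1.0e18) -> bool:
    """True when the stray radiation has dropped AND the density is high enough."""
    return stray_radiation < stray_threshold and density > density_threshold

def select_segment(stray_radiation: float, density: float) -> str:
    # segment 1: magnetron + gyrotron heating (start up)
    # segment 2: gyrotron heating only, central deposition (after the OXB transition)
    if oxb_transition_reached(stray_radiation, density):
        return "segment_2_gyrotron_central"
    return "segment_1_magnetron_plus_gyrotron"

print(select_segment(stray_radiation=0.1, density=2.0e18))   # -> segment_2_gyrotron_central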

8 Improvements/Tasks for W7-X initial operation
The initial phase (2014 – 2018) is designed for short pulses (< 40 s); the CoDa system will nevertheless be prepared for steady state:
Get the steady state control and data acquisition running reliably
Segment control works routinely
DAQ system works routinely, higher reliability required
DAQ hardware: ATCA, SIO, NI-PXI + FPGA specials
Establishment of plasma interlock systems from uncertain information, e.g. video of the first wall, and interlocks
Establishment of station configuration and experiment program planning for large system configurations (the step from WEGA to W7-X)
Establishment of W7-X scenarios, i.e. intelligent segment programs

9 Setup of Frontend Systems
Current planning: 27 diagnostics + 8 experiment auxiliary systems
Setup of the 12 most important diagnostic systems until 2014, the others follow in OP1
No access for one week (superconducting magnets, mechanical supports) => remote management comprising many actors [J. Schacht O3-6]
Configuration setup in the database for segment control
High performance DAQ: at present the focus is on the ATCA system as the standard high performance DAQ for W7-X [B. Goncalves O2-4]
[Inset: frontend system block with Local PLC, Segment RT Control and DAQ, as in slide 4]
DAQ achievements so far: 46 Mbyte/s continuous, limited by the archive DB; ATCA with 64 channels at 2 MSPS (single controller) => 280 Mbyte/s envisaged
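
A rough consistency check of the envisaged ATCA rate, under the assumption of 2 bytes (16 bit) per raw sample, which is not stated on the slide:
\[ 64\ \text{channels} \times 2\ \text{MSPS} \times 2\ \text{byte} = 256\ \text{Mbyte/s}, \]
leaving roughly 10 % of the envisaged 280 Mbyte/s for time stamps and protocol overhead (again an assumption).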

10 Improvement of the DAQ Software
Suppose that a DAQ station is likely to fail once every 1000 shots of 10 s duration. W7-X will have about 100 stations and 1000 s shots => about 10 station failures within a single "shot"!
High reliability therefore requires good software quality [QM processes, G. Kühner P2-02] and very good testability (long runs) of the software.
Preparation for real time support in the DAQ stations.
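
The estimate follows directly from scaling the assumed per-station rate (one failure per 1000 shots × 10 s = 10^4 s of acquisition) to 100 stations and a 1000 s shot:
\[ N_\text{fail} \approx 100\ \text{stations} \times \frac{1000\ \text{s}}{1000 \times 10\ \text{s}} = 10\ \text{failures per shot}. \]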

11 Collaborative Work with the Configuration and Segment DB
[Diagram: structure of the configuration DB. A project (W7-X or a lab experiment) contains components (magnets, ECRH, etc.) and diagnostics (Comp. A … Comp. Z); each component is controlled by one or more control stations (the computer(s) controlling the component, stations a, b); each control station runs specific software modules (module 1 … module n), e.g. for hardware control.]
The database holds descriptions of hardware, software, HLPs and parameter values in segments, exposed through a DB view.
Tools for concurrent access and the introduction of team work processes.
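
A minimal sketch of the configuration hierarchy described above (project -> components -> control stations -> software modules); the class and field names are illustrative assumptions, not the actual W7-X configuration database schema.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Module:
    name: str                 # e.g. "hardware control"

@dataclass
class ControlStation:
    host: str                 # computer controlling (part of) a component
    modules: List[Module] = field(default_factory=list)

@dataclass
class Component:
    name: str                 # e.g. "ECRH", "magnets", or a diagnostic
    stations: List[ControlStation] = field(default_factory=list)

@dataclass
class Project:
    name: str                 # "W7-X" or a lab experiment such as WEGA
    components: List[Component] = field(default_factory=list)

wega = Project("WEGA", components=[
    Component("ECRH", stations=[
        ControlStation("ecrh-ctrl-a", modules=[Module("magnetron control")]),
    ]),
])
print(wega.components[0].stations[0].modules[0].name)   # -> magnetron control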

12 Enhanced HLP Structure (2 Layers)
[Diagram: two-layer HLP structure. Upper layer: model parameters (HLPs) with calculation/check of model limits and domains, evaluated by a sophisticated physics and technical model or even only a 1:1 mapping, inside a modelling framework with component models and coupling to the SOA. Lower layer: named control parameters (HLPs) with definition of limits/domains, mapped per component (Component A, B, C) to technical parameters (LLPs).]
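
A minimal sketch of the two-layer structure: a model parameter (HLP) is mapped by a (possibly trivial) model onto named control parameters (HLPs) with their own limits, which are then translated per component into technical parameters (LLPs). All parameter names, domains and numbers are illustrative assumptions, not the actual W7-X parameter set.

def model_layer(plasma_radius_m: float) -> dict:
    """Upper layer: model parameter -> named control parameters (here a near 1:1 mapping)."""
    if not 0.4 <= plasma_radius_m <= 0.6:          # assumed model domain
        raise ValueError("plasma radius outside model domain")
    return {"coil_current_ratio": plasma_radius_m / 0.55}

def control_layer(named_params: dict) -> dict:
    """Lower layer: named control parameters -> technical parameters (LLPs) per component."""
    ratio = named_params["coil_current_ratio"]
    if not 0.7 <= ratio <= 1.2:                    # assumed control limit
        raise ValueError("coil current ratio outside limits")
    return {"component_A.coil_current_A": ratio * 12.5e3}

print(control_layer(model_layer(0.55)))   # -> {'component_A.coil_current_A': 12500.0}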

13 Scalable Mass Storage System
30 Gbyte/s data rate; 4 Pbyte/a theoretically, 1 Pbyte/a design value
DAQ stations stream data to data servers (decoupling)
Data servers with a parallel filesystem (IBM GPFS)
Data migration with HPSS (IBM), based on policies, integrating different storage techniques
=> requires a scalable network
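
A rough consistency check of the headline rate, assuming on the order of 100 DAQ stations each streaming up to roughly the envisaged single-controller ATCA rate of 0.28 Gbyte/s (slide 9); the station count and per-station rates are assumptions here:
\[ \sim 100 \times 0.28\ \text{Gbyte/s} \approx 30\ \text{Gbyte/s}. \]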

14 Scalable and Virtual Cluster Network
Scalable core switches: Clos fabric instead of a crossbar backplane (N+2 redundancy), multi-chassis trunking (> 2 cores), single-core routing capacity up to 15 Tbit/s
Ethernet fabric for the virtual (VMware) cluster, made of several layer 2 switches with distributed knowledge of MAC addresses
Supports virtual IP subnets (VM motion)
Decoupling between the computing centre configuration and the hardware structure
Brocade solution (NetIron MLXe)

15 Cloud system / control room
Users want to take their sessions from the control room to their offices
High availability solution + simple maintenance + backup / instant recovery
System under test (ECRH control room): the Control Room and Office Cloud (virtual desktops) is in test, the Central Server Cloud (central IT services) is already productive

16 Service Oriented Architecture for Data Analysis
SOA / data analysis: production SOA in progress (with eventing for analysis chains)
Running on a VMware vSphere environment (cloud solution)
Physicists have started to implement their services, e.g. a fast stellarator equilibrium FP (20 ms response time)
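
A minimal sketch of an event-driven analysis chain on top of such a SOA: when new data is announced, a (hypothetical) equilibrium service is invoked and its result is published for downstream services. Service names, events and the eventing mechanism are illustrative assumptions, not the actual W7-X SOA interfaces.

from typing import Callable, Dict, List

subscribers: Dict[str, List[Callable[[dict], None]]] = {}

def subscribe(event: str, handler: Callable[[dict], None]) -> None:
    subscribers.setdefault(event, []).append(handler)

def publish(event: str, payload: dict) -> None:
    for handler in subscribers.get(event, []):
        handler(payload)

def equilibrium_service(payload: dict) -> None:
    # placeholder for the fast equilibrium calculation (~20 ms response time)
    result = {"shot": payload["shot"], "iota_axis": 0.85}   # dummy result
    publish("equilibrium_ready", result)

subscribe("magnetics_data_ready", equilibrium_service)
subscribe("equilibrium_ready", lambda r: print("equilibrium ready for shot", r["shot"]))

publish("magnetics_data_ready", {"shot": 20140101001})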

17 Summary
Coarse plan until 2014, tight plan for the frontend system series production
Consolidation of the CoDaC software stack and, in a few cases, basic improvements (DAQ station software and configuration database)
Middleware setup (SOA and cloud infrastructure)
Setup of backend systems such as mass storage, network/server systems and control room/office workstations

18 W7-X CoDaC Team Plasma control and data acquisition development:
T. Bluhm(a), M. Grahl(a), P. Heimann(b), C. Hennig(a), H. Kroiss(b), J. Krom(a), G. Kühner(a), H. Laqua(a), M. Lewerentz(a), J. Maier(b), H. Riemann(a), J. Schacht(a), A. Spring(a), A. Werner(a), M. Zilker(b)
(a) Max-Planck-Institute for Plasma Physics, Teilinstitut Greifswald, Wendelsteinstr. 1, Greifswald, Germany
(b) Max-Planck-Institute for Plasma Physics, Computing Center RZG, Boltzmannstr. 2, Garching, Germany

