PXIE/PIP-II Control System J. Patrick October 7, 2014.

Outline
- PXIE Control System
- Thoughts on PIP-II Control System
- Focus on general infrastructure, not specifics
- And not much on hardware

Goals for PXIE Control System
- Control/operate the PXIE accelerator
- Learn what is needed for PIP-II design
- Serve as a test bed for PIP-II
- Try out new hardware and software
  - Without compromising the first goal

Control System Scope
- High-level subsystem control and monitoring:
  - Power Supplies, Water, Cryo, Instrumentation, RF, Motion Control
- Machine Protection
- Beam control applications
- Timing system

PXIE General Plan
- Based on ACNET
  - It is what we know and use in the main complex
- Similar to NML control system
- NML tries to be forward looking:
  - No CAMAC, C190/290 MADCs, CIA vacuum, IRMs, ...
  - Instead VME, PLCs, HRMs
- Limited console-style applications
  - Instead use of synoptic displays, Java applications, ...
- Assume most development by FNAL
  - Need discussion with ANL on HWR subsystem

Synoptic Display
- Drag-and-drop builder, no code writing required
- Launch via console index page, or from a web browser
- Web display available, no extra work required
  - Viewable on phones
- Supports plots, fetching data logger data
- Can create links to other displays
- Basic functional expressions available without writing code
- Possible to have a Finite State Machine back-end
- Possible to attach ACL scripts for more complex things
  - Discouraged for very complicated things
- Heavily used at NML and by cryo in general

NML Overview Synoptic

NML Laser Room Synoptic

NML Cryomodule Synoptic

Nova Near Detector (Web version)

NML Coupler Conditioning (backed by FSM)

Applications at PXIE
- Currently no synoptic displays at PXIE
- Scan application recently written for use at PXIE
- Other work uses LabVIEW or console plots
- Need to agree on what is needed, how they will be written, and who will write them
- We do have an interface to Matlab if there is interest
- Currently no code management or launch from console
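The scan application mentioned above steps a setting through a range and records readings at each point. A minimal sketch of that loop, with a hypothetical SimulatedDevice standing in for a real control-system device interface (the actual PXIE tool and its device access are not shown here):

```python
# Minimal parameter-scan loop: step a setting device through a range,
# read back at each step, and collect (setpoint, reading) pairs.
# SimulatedDevice is a hypothetical stand-in for real device access.

class SimulatedDevice:
    """Hypothetical device whose reading responds linearly to its setting."""
    def __init__(self, gain=2.0):
        self.setting = 0.0
        self.gain = gain

    def set(self, value):
        self.setting = value

    def read(self):
        return self.gain * self.setting

def scan(device, start, stop, steps):
    """Step the device setting from start to stop, recording readbacks."""
    results = []
    for i in range(steps):
        setpoint = start + (stop - start) * i / (steps - 1)
        device.set(setpoint)                        # apply the new setting
        results.append((setpoint, device.read()))   # record the readback
    return results

if __name__ == "__main__":
    for setpoint, reading in scan(SimulatedDevice(), 0.0, 10.0, 5):
        print(f"set={setpoint:5.2f}  read={reading:5.2f}")
```

A real scan tool adds settling delays, multiple readings per point, and abort handling on top of this core loop.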

PXIE Scan Tool

NML Camera Image Tool

PXIE Summary
- Follow NML control system philosophy
  - Current control system, but with modern hardware/front-ends
- Emphasis on synoptic displays
  - Java applications for complicated things
- A good place to try something else if someone has an idea

Thoughts on PIP-II
- Major constraint: large existing complex
  - PIP-II linac must interoperate with it
  - Ideally a common control system
  - Timing and Machine Protection
  - Not practical to completely replace the current control system
- Major unknown: role of India
- The following concerns the general infrastructure
  - Limited discussion of details
  - No discussion of hardware platforms
    - An equally intense religious issue

Current Control System
- Operates a very large and complicated complex
  - Largest and most complicated aside from CERN
- Major increase in complexity beginning with the start of BNB
  - No fundamental architectural changes required
- Very accessible – anyone can run a console
- Very large code base, well managed (mostly...)
- Great flexibility in data acquisition (event+delay)
- Major evolution during Run II
  - Data acquisition engines, greatly expanded data logging
  - VAXes -> Linux
  - Many new hardware systems (BPMs, BLMs, instrumentation...)
- The very modular nature of the system allowed this to happen

Some Issues
- Console user interface is not very modern
- Java applications have existed for many years
  - Some have been useful, but most were not successful
- Attempts at a modern C++ GUI framework have not been successful
  - However, several MI programs use the ROOT C++ package
- Lots of old hardware
- Lots of old software
- Large data packets not well supported (images)
- Structured data supported in an ad hoc way
- Limited capacity for logging large arrays
  - Images and waveforms
  - Note cataloging and fetching them out is harder than logging them

Often-Asked Question
- Why don't we use EPICS?
  - Everyone else does
  - Then we just download it and we are mostly done

What is EPICS?
- EPICS is a toolkit, not a complete control system
- It specifies a communication protocol and front-end architecture; these are common among all installations
- It includes a basic set of tools which are fine for getting started or for small installations, but not at all suitable for a large complex
- Not only are machines different, but so are the people who build and operate them
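The "toolkit" character shows up at the front end: an EPICS input/output controller is configured from database records rather than from application code. A minimal illustrative record, with a hypothetical PV name (not an actual PXIE device):

```
# Hypothetical analog-input record: one process variable served over
# the common EPICS protocol, polled once per second.
record(ai, "PXIE:RFQ:WATER_TEMP") {
    field(DESC, "Cooling water temperature")
    field(SCAN, "1 second")
    field(EGU,  "degC")
}
```

Everything above this layer (displays, alarms, logging, applications) is what each large installation builds for itself.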

EPICS cont.
- So every large installation does something different for:
  - Core applications (time plots, device viewer, etc.)
  - Synoptic display
  - Application framework and programming language
  - Alarms
  - Data logging
  - ...
- With multiple large projects currently in progress, there is an attempt at more commonality
  - NSLS-II, FRIB, ESS, LCLS-II, ITER, ...
  - But there are still very major differences for each
  - And other light source labs are not so much involved in this effort

Other Labs & EPICS
- SNS
- LCLS-I & II
- NSLS-II
- XFEL
- ESS
- FRIB
- Fermilab
There are many other users, but the theme is similar.

SNS
- SNS began operating in 2007, and was the largest EPICS system at the time
- They developed a new everything on the previous list:
  - Synoptic display (EDM)
  - Machine application framework (XAL)
    - Including common machine description and online model
  - Device database (and other databases)
  - Alarm system (multiple iterations)
  - Data logging (multiple iterations)
  - E-log (multiple iterations)
  - ...
- Developed Control System Studio after it started operations
  - Integrated framework for "core" applications; more later

LCLS(-I)
- Had a custom legacy control system + hardware
  - About as old as the Fermilab control system
  - Less evolution over the years, less modular than ours
- General strategy was:
  - Keep the old system for the part of the linac used for LCLS
  - Use EPICS for new things
  - Some interoperability between the two
- Several SNS people moved to SLAC
  - Original plan was to reuse SNS code, but in the end not much was
  - SLAC had a long history of doing things in a certain way
    - Different machine model format, different programming languages
- Machine control migrated to EPICS, but in a different way
- Matlab primary application language going forward

NSLS-II
- Developed separately from RHIC – separate accelerator
- New technologies, initially less influence of past culture
  - But influenced by the tastes of those who developed it
- No reuse from original SNS or SLAC
  - They do pick up Control System Studio from SNS
- New database for machine model and everything else (IRMIS-II)
- Different framework for machine applications (Python)
- Initiated extending the EPICS communication protocol to support structured data (EPICS v4; pvAccess)
- New data logger based on "big data" technology

FRIB and ESS
- The SNS machine application framework and core applications have been refactored to make them more modular, modern, and usable by other places (Open XAL)
  - FRIB and ESS have contributed and expect to use it
  - They also expect to use Control System Studio
- They are also jointly working on basically a next iteration of the NSLS-II database architecture (DISCS)
  - How much and how long they will stay in sync is not clear
- Note ESS plans to have a small core group and outsource much of the development to Cosylab

XFEL
- Marriage of four different control systems:
  - DOOCS (updated from FLASH)
  - TINE (updated from HERA)
  - EPICS (cryo system)
  - Tango (experimental beamlines)
- Each is a complete control system and has its own user interface framework, logging, alarms, etc.
- User interfaces in one system can access quite a lot on the other systems
  - Several different interface technologies developed
- I don't consider this a good model...

Control System Studio
- Integrated package of "core" applications
  - Plotting, alarms display, device viewer, synoptic display, logbook, hardware configuration, ...
- Based on the Eclipse development framework
  - Plugin-based architecture
  - Plugins can communicate context with each other
  - You can create an instance with a selected set of plugins
- Originated at DESY, intended to be system agnostic
  - Picked up by SNS, became mostly EPICS-centric
- In operational use by SNS and NSLS-II; XFEL, ESS, FRIB, ITER and others plan to use it
- And here at Fermilab! – Nova Detector Control System

CSS Data Browser

Alarms Display w/Link to Other Tools

SNS Operational Displays

Fermilab
- HINS and NML both started as EPICS-only systems
- Used mostly older core EPICS tools
  - StripTool for plots, old alarm system, etc.
  - EDM synoptic display from SNS
  - Data logger from SNS
- Some Python applications written
- Lacked the organization of the main control system
  - Independent code management
  - No central console – run from command line
- To set it up properly was significant work, and was not done initially

Transition at HINS/NML
- HINS mostly transitioned to ACNET; NML did completely
- EDM synoptic displays put in MECCA
  - Made launchable from ACNET consoles
- Dual EPICS/ACNET front-end developed
  - Selected instrumentation front-ends + all IRMs
  - Possible to access ACNET devices in EDM
- Front-ends eventually all converted to pure ACNET
- Synoptic displays completely replaced EDM displays
- Python applications ported to Java

Lessons Learned
- It was interesting to try; we all learned a lot
- To do it in a scalable, maintainable way requires work
  - Ideally with some thought before doing the work
- Especially for a small system, there really wasn't any advantage to doing this in EPICS
  - Expected collaboration with other labs never developed, at least for EPICS-related things
- It did take significant resources to do this; CD was involved also

Timing System
- PIP-II would benefit from a new system
- New system for pulsed Project X prototyped
  - GHz bandwidth
  - Much larger number of possible events (TCLK limited to 256)
  - Data payload associated with clock event
    - Removing need for separate MDAT link
  - Counter cycle stamp associated with clock event
    - Aid in correlating data across front-ends
- Should rethink this in context of the new accelerator design
  - Collect requirements – make suitable for future CW?
  - Try out at PXIE?
- Significant lead time will be required
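The two additions described above, a data payload and a cycle stamp carried on the clock event itself, can be sketched as follows. This is an illustrative model only, not the actual Fermilab design; the event codes and values are made up:

```python
# Illustrative model of a clock event carrying a data payload (so no
# separate MDAT link is needed) and a cycle stamp (so front-ends can
# correlate their data).  Codes and values here are invented.

from dataclasses import dataclass

@dataclass(frozen=True)
class ClockEvent:
    code: int          # event number; wider than TCLK's 8-bit / 256-event limit
    cycle_stamp: int   # monotonically increasing accelerator-cycle counter
    payload: float     # machine data delivered with the event (MDAT-style)

def correlate(front_end_a, front_end_b):
    """Pair up events from two front-ends by shared cycle stamp."""
    by_stamp = {ev.cycle_stamp: ev for ev in front_end_b}
    return [(a, by_stamp[a.cycle_stamp])
            for a in front_end_a if a.cycle_stamp in by_stamp]

if __name__ == "__main__":
    a = [ClockEvent(0x1D, 100, 120.0), ClockEvent(0x1D, 101, 119.8)]
    b = [ClockEvent(0x2A, 101, 8.9)]
    for ev_a, ev_b in correlate(a, b):
        print(ev_a.cycle_stamp, ev_a.payload, ev_b.payload)
```

Because both front-ends see the same stamp on the same machine cycle, correlation is a simple key match instead of timestamp interpolation.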

Accelerator Hierarchy (SNS)
- Accelerator hierarchy from the accelerator physicist point of view

XML Accelerator Representation (partial example)
- XML is a convenient text form to represent structured data
- Fast enough to parse an entire accelerator (35 k lines) in ~1 second
- The XML file is generated from a database representation for SNS
- Can easily add temporary "bad status" flags for broken equipment that may be quickly repaired
  - Databases tend to be more tightly configuration controlled
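A small illustration of the idea: an accelerator sequence described in XML, parsed with the Python standard library, honoring a "bad status" flag. The element and attribute names are hypothetical, loosely in the spirit of an XAL-style description; the real SNS format differs:

```python
# Parse a toy XML accelerator description with the standard library,
# skipping nodes temporarily flagged as broken.  Element/attribute
# names are invented for illustration.

import xml.etree.ElementTree as ET

ACCELERATOR_XML = """
<accelerator name="demo">
  <sequence id="MEBT">
    <node type="quad" id="MEBT:QH01" length="0.061" status="good"/>
    <node type="bpm"  id="MEBT:BPM01" length="0.0"  status="bad"/>
  </sequence>
</accelerator>
"""

def load_nodes(xml_text):
    """Return (sequence, node id, length) tuples for usable nodes."""
    root = ET.fromstring(xml_text)
    nodes = []
    for seq in root.findall("sequence"):
        for node in seq.findall("node"):
            if node.get("status") == "bad":   # temporary bad-status flag
                continue
            nodes.append((seq.get("id"), node.get("id"),
                          float(node.get("length"))))
    return nodes

if __name__ == "__main__":
    for seq_id, node_id, length in load_nodes(ACCELERATOR_XML):
        print(seq_id, node_id, length)
```

Since the text form is flat and fast to parse, an edit like flipping a status flag can be made by hand without touching the configuration-controlled database behind it.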

Summary
- Common control system for PIP-II & the existing complex
  - Start with what we have and evolve it
  - Can't disrupt operations between now and then
- Modernization of both hardware and software is needed!
  - A plan is needed for the existing complex too!
- If parts of EPICS help us get there, we should consider it
  - We have experience in running a mixed system
  - There needs to be a benefit to justify any extra resources needed
- EPICS is different at every large accelerator
  - There is a variety of past experience and personal tastes
  - And EPICS constantly evolves along with computing technology