The LHC Control System
B. Frammery, for the CERN AB/CO Group
CERN – AB Department, 10.10.2005
Content
A brief introduction
The hardware infrastructure
Machine timing & sequencing
Data management
Communications
Software frameworks
General services
Critical systems for LHC
The CERN Control Center
A brief status and a few issues
[THx.y-zz] denotes a presentation or poster at this conference
A brief introduction
CERN machines [accelerator complex diagrams, (LEP) → LHC]
o Until 2003: the SL Division and the PS Division; since 2003: the AB Department.
o In 2003 & 2004: machines running for physics and for R&D.
o In 2005: machines running for physics, for R&D, and commissioning for LHC.
Strategy
Develop new software and hardware infrastructures:
o For LHC
o To be used & tested on all the new developments
o To be spread over all the CERN accelerators at a later stage
Integrate industrial solutions as much as possible.
Meaning that, meanwhile, the "legacy" controls for LINAC2, the PSB, the PS and the SPS are to be maintained.
Hardware infrastructure
The network: the Technical Network
o Dedicated to accelerators & technical services
o No direct connectivity to the outside world
o Linked to the office network (the Public Network)
o Security strategy to be deployed from 2006
o Gigabit backbone
A 3-tier structural layout:
o Resource tier (front ends for equipment)
o Business tier (servers for general services)
o Presentation tier (consoles for GUIs)
LHC control hardware infrastructure [Oral TU3.4-30]
The CERN Technical Network [diagram]: the LHC local points, the CERN control rooms, the Computer Center (CC) and the CERN Control Center (CCC), connected over the Technical Network.
LHC controls architecture diagram (built up over four views):
o The LHC machine actuators and sensors — cryogenics, vacuum, quench protection agents, power converters, function generators, beam position monitors, beam loss monitors, beam interlocks, RF systems, and the analogue signal system — are reached through TCP/IP communication services on the CERN Gigabit Ethernet Technical Network, itself linked to the Public Ethernet Network.
o Resource tier: Linux/LynxOS PC front ends, RT/LynxOS VME front ends, cPCI front ends and PLCs, connected to the equipment through WorldFIP segments (1 and 2.5 Mbit/s), PROFIBUS, FIP/IO and optical fibres. All the front-end equipment is located in surface buildings, in non-radioactive areas, for ease of maintenance.
o Business tier: file servers (Linux/HP ProLiant), application servers and PVSS/Linux SCADA servers, together with the timing generation whose signals (T) are distributed to the front ends.
o Presentation tier: central operator consoles, local operator consoles and fixed displays.
Machine timing & sequencing
CERN machines [diagram]
Timing & Sequencing (2) [timing diagram, 21.6 s]
Timing & Sequencing (3) — central timing generation [diagram]: a GPS receiver provides one pulse per second (1 PPS) and UTC time (NTP or GPS); a smart-clock PLL derives a synchronized 1 PPS, a phase-locked 1 kHz clock, phase-locked 10 MHz and 40 MHz clocks (40 MHz event-encoding clock), and the basic period (1200/900/600 ms, advanced by 100 µs). The central timing generator module combines these with external events to produce the edited events, CERN UTC time, timing events and telegrams for the PSB, PS, SPS and LHC, distributed over RS485 to the timing receivers of the control system (delays in 25 ns steps).
Timing & Sequencing (4)
The data distributed on the timing network:
o The LHC telegram (Cycle-Id, Beam-Type, target LHC Bunch-Number, Bucket-Number, Ring, CPS-Batches, Basic-Period number, Cycle-Tag, Particle-Type)
o The millisecond clock
o The UTC time
o Machine events (post-mortem trigger, warnings, beam dump, virtual mode events, …)
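As an illustration of what such a telegram carries, the sketch below groups the listed fields into a single Java value type; the class and the field types are assumptions made for readability, not the actual timing-receiver API.

```java
// Minimal sketch of the LHC telegram fields distributed on the timing
// network. Field names follow the slide; types and the record itself are
// illustrative assumptions, not the real timing-receiver interface.
public record LhcTelegram(
        int cycleId,
        String beamType,
        int targetBunchNumber,   // target LHC bunch number
        int bucketNumber,
        int ring,                // LHC ring 1 or 2
        int cpsBatches,          // number of CPS batches
        long basicPeriodNumber,
        String cycleTag,
        String particleType) {
}
```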
Data management
Databases: the 4 domains of data
o Physical equipment (serial numbers, equipment catalogue): the general CERN MTF database is used for asset management.
o Machine layout (installed equipment, type, optics, powering): the LHC machine description — mechanical, optics and electrical layout — and the DC magnet powering, with 1,612 electrical circuits and 80,000 connections.
o Controls configuration (computer, address): the PS controls-configuration model extended to LHC [MO4A.1-70].
o Operational data (settings, measurements, alarms, logging, post-mortem): more than 200,000 signals.
A consistent naming and identification scheme is defined in the Quality Assurance Plan.
Communications
The Controls MiddleWare (CMW)
An ensemble of protocols, Application Programming Interfaces (APIs) and software frameworks for communications. Two conceptual models are supported:
o The device-access model (using CORBA). Typical use is between Java applications running in the middle tier and equipment servers running in the resource tier. A unique API is provided for both Java and C++.
o The messaging model (using the Java Message Service, JMS). Typical use is within the business tier, or between the business tier and applications running in the presentation tier.
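To make the device-access model concrete, here is a minimal sketch of how a middle-tier Java application might read and write a device property; the interface, the method names and the device/property names are hypothetical stand-ins, not the actual CMW API.

```java
// Hypothetical device-access facade in the spirit of the CMW device model:
// a named device exposes named properties that can be read, written and
// subscribed to. All names below are illustrative, not the real CMW classes.
public interface DeviceAccess {
    Object getProperty(String device, String property);             // synchronous read
    void setProperty(String device, String property, Object value); // synchronous write
    void subscribe(String device, String property,
                   java.util.function.Consumer<Object> onUpdate);   // monitor updates
}

class CorrectorExample {
    static void trimCorrector(DeviceAccess cmw) {
        // Read the current setting of a (hypothetical) corrector magnet...
        double current = (Double) cmw.getProperty("RCBH.12L3", "Setting#current");
        // ...and write back a slightly modified value.
        cmw.setProperty("RCBH.12L3", "Setting#current", current + 0.01);
    }
}
```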
Software frameworks
The software frameworks (1)
Front-End Software Architecture (FESA): a complete environment for real-time, model-driven control software, implemented in C++ for the LynxOS and Linux platforms.
Java framework for accelerator controls:
o Uses J2EE application servers with lightweight containers
o Plain Java objects (no EJB beans)
o Applications can run in a 2-tier setup for tests
o Unified Java API for Parameter Control (JAPC) to access any kind of parameter
[FR1.2-5O] [TU1.3-5O] [TH1.5-8O]
The software frameworks (2)
UNified Industrial Control System (UNICOS): [WE3A.2-60] [WE2.2-6I]
o A complete environment for designing, building and programming industrial control systems for the LHC
o Supervision layer: PVSS II (SCADA from ETM)
UNICOS and the Java framework for accelerator controls use the same graphical symbols and colour codes.
General services
The Alarm System: LHC Alarm SERvice (LASER) [TH2.2-70]
o "Standard" 3-tier architecture
o Java Message Service (JMS) broker with a subscription mechanism
[Architecture diagram: new alarm sources — LHC, LEIR and the new SPS alarms (FESA) — feed the LASER service directly; the legacy PS alarms and the legacy CAS alarms (SPS, TCR, CSAM) reach it through the PS and CAS alarm gateways.]
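As an illustration of the JMS subscription mechanism, the sketch below registers a listener on an alarm topic using the standard javax.jms API; the broker, connection factory and topic name are assumptions made for the example, not the actual LASER configuration.

```java
import javax.jms.*;
import org.apache.activemq.ActiveMQConnectionFactory;

// Minimal JMS subscriber in the spirit of LASER's publish/subscribe model.
// Broker URL and topic name are illustrative assumptions.
public class AlarmSubscriber {
    public static void main(String[] args) throws JMSException {
        ConnectionFactory factory =
                new ActiveMQConnectionFactory("tcp://broker.example:61616");
        Connection connection = factory.createConnection();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);

        // Subscribe to a (hypothetical) alarm topic and print incoming alarms.
        Topic alarms = session.createTopic("ALARMS.LHC");
        MessageConsumer consumer = session.createConsumer(alarms);
        consumer.setMessageListener(message -> {
            try {
                if (message instanceof TextMessage text) {
                    System.out.println("Alarm received: " + text.getText());
                }
            } catch (JMSException e) {
                e.printStackTrace();
            }
        });

        connection.start();   // begin delivery of messages to the listener
    }
}
```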
Logging
Several 10^5 parameters will be logged. Every datum or setting is timestamped (UTC). Parameters are logged:
o at regular intervals (down to 100 ms)
o on request
o on change
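A minimal sketch of how these three logging modes could be combined for one parameter — periodic capture, on-demand capture, and on-change capture with a deadband. The class, the deadband policy and the output format are illustrative assumptions, not the actual logging service.

```java
import java.time.Instant;

// Illustrative logger for a single parameter combining the three modes
// listed above: periodic (down to 100 ms), on request, and on change.
class ParameterLogger {
    private final long periodMs;      // regular logging interval, e.g. 100 ms
    private final double deadband;    // minimum change worth logging (assumed policy)
    private long lastLogMs = -1;
    private double lastValue = Double.NaN;

    ParameterLogger(long periodMs, double deadband) {
        this.periodMs = periodMs;
        this.deadband = deadband;
    }

    // Called for every new acquisition of the parameter.
    void onAcquisition(String name, double value, long nowMs) {
        boolean periodic = lastLogMs < 0 || nowMs - lastLogMs >= periodMs;
        boolean changed  = Double.isNaN(lastValue) || Math.abs(value - lastValue) > deadband;
        if (periodic || changed) {
            log(name, value, nowMs);
        }
    }

    // Called when a client explicitly requests a capture ("on request").
    void onRequest(String name, double value, long nowMs) {
        log(name, value, nowMs);
    }

    private void log(String name, double value, long nowMs) {
        lastLogMs = nowMs;
        lastValue = value;
        // Every logged value is timestamped in UTC.
        System.out.printf("%s %s = %f%n", Instant.ofEpochMilli(nowMs), name, value);
    }
}
```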
Analogue signals
Open Analogue Signals Information System (OASIS) [TH3A.1-50]
o Visualizes and correlates time-critical signals in real time in the control room
o ~500 signals for LHC, 50 MHz bandwidth (plus ~1000 in the PS/SPS)
o Distributed cPCI system using analogue multiplexers and oscilloscope modules (Acqiris or other types) close to the equipment
o Triggers through the timing network for precise time correlations
o Standard 3-tier architecture
[Photo: "the ancestor"]
Core control application software (LSA)
A normalized data model valid for settings, measurements and optics parameters.
A set of software modules for:
o Optics definition
o Settings generation & management
o "Trims" (coherent global modifications of settings)
A set of generic applications.
Developed together with OP, based on experience with LEP, and already tested for the two new extractions from the SPS (CNGS, TI8). [TU1.3-5O]
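To illustrate what a "trim" means in practice, the sketch below applies a coherent set of deltas to a group of settings and keeps the previous values so the change can be rolled back; the classes and the rollback policy are illustrative assumptions, not the actual LSA implementation.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative "trim": a coherent, reversible modification applied to a
// group of settings at once. Names and rollback policy are assumptions.
class SettingsTrim {
    private final Map<String, Double> previous = new HashMap<>();

    // Apply deltas to all listed parameters as one coherent change.
    void apply(Map<String, Double> settings, Map<String, Double> deltas) {
        deltas.forEach((parameter, delta) -> {
            previous.put(parameter, settings.get(parameter));
            settings.merge(parameter, delta, Double::sum);
        });
    }

    // Undo the trim by restoring the values captured when it was applied.
    void rollback(Map<String, Double> settings) {
        settings.putAll(previous);
    }
}
```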
Post Mortem
To take a snapshot of the LHC vital systems.
Automatic trigger (typically when an interlock appears) or manual trigger. No beam is allowed if the Post Mortem system is not ready.
Capture of:
o Logged data
o Alarms (LASER)
o Transient recorder signals (OASIS)
o Fixed displays
Analysis:
o A few gigabytes per Post Mortem capture
o Structured sorting of causes & effects
o Needed from October 2005 for hardware commissioning
o Continuous development effort for the years to come
The TI8 extraction test in October 2004 already proved the importance of a Post Mortem system.
Critical systems for LHC
Powering Interlock System (1)
For powering, the LHC is divided into 8 sectors. [Diagram: the ring with interaction points IP1–IP8; Sector 1-8, between IP1 (Atlas) and IP8 (LHC-b), with its 6 large cryostats.]
To protect the 1,612 electrical circuits and their ~10,000 superconducting magnets. [Diagram: Siemens PLCs distributed around the machine, connected over the control network and linked to the Beam Interlock System.]
Powering Interlock System (2) [PO2.036-3]
[Diagram: a Siemens PLC (process control & configuration) and a hardware system of patch panels and electronics connect the clients — the power converters (PC_PERMIT, PC_FAST_ABORT, POWERING_FAILURE) and the magnet/quench protection system (CIRCUIT_QUENCH / MAGNET_OVERTEMP) — through PROFIBUS and hardware current loops, with links to the AUG and UPS services and to the beam permit of the Beam Interlock System. A PVSS console and server on the Technical Network provide monitoring & configuration.]
Beam Interlock System (1)
Two independent hardware loops carry the "beam permit" signal. They connect the Beam Loss Monitors and many other systems to the beam dump request.
Beam Interlock System (2) [PO2.031-3]
[Diagram: beam permit loops running up to 1200 m link the user interfaces (installed in the users' racks, connected by copper cable) to the beam interlock controllers — 16 VME crates, each with a core module, a test & monitoring module, patching and a fibre-optic interface, plus a Safe Beam Parameter receiver fed via the timing system. Supervision is provided by a Java application over the Technical Network.]
Real-time feedback systems
LHC orbit feedback:
o 2000 beam position parameters
o 1000 steering dipoles
o 10 Hz frequency
LHC tune feedback: a modest system — 4 parameters and some 30 PCs (up to 50 Hz?).
LHC chromaticity feedback: considered, but it is difficult to obtain reliable measurements.
Orbit feedback system
o Centralized architecture
o More than 100 VME crates involved
o Runs through the Technical Network
o Tests on the SPS in 2004 were successful
o Simulations show 25 Hz capability
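As a purely illustrative sketch of one centralized feedback iteration at 10 Hz: read the beam positions, compute corrector increments from a pre-computed pseudo-inverse of the orbit response matrix, and apply them with a gain. The matrix, the gain and the array sizes are assumptions; the actual LHC orbit feedback controller is not described in this talk.

```java
// Illustrative single iteration of a centralized orbit feedback loop:
// deltaKick = -gain * Rinv * (measured orbit - reference orbit).
// rInv is assumed to be a pre-computed pseudo-inverse of the orbit
// response matrix (1000 correctors x 2000 BPM readings).
class OrbitFeedbackStep {
    static double[] correctorIncrements(double[][] rInv, double[] orbit,
                                        double[] reference, double gain) {
        int nCorrectors = rInv.length;
        int nMonitors = orbit.length;
        double[] delta = new double[nCorrectors];
        for (int c = 0; c < nCorrectors; c++) {
            double sum = 0.0;
            for (int m = 0; m < nMonitors; m++) {
                sum += rInv[c][m] * (orbit[m] - reference[m]);
            }
            delta[c] = -gain * sum;   // kick increment to send to corrector c
        }
        return delta;
    }
}
```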
Quench Protection System
[Diagram: the quench-protection electronics (DQQDL, DQHDS, DQQDC, DQQDG, DQQDI, DQSMB, DQRMB) on the LHC superconducting magnets are read out over I2C/analogue links and WorldFIP through PC gateways to a PVSS data server, which provides supervision/monitoring and an expert GUI and sends data to LHC logging, post-mortem, the alarms (LASER) and the powering interlocks.]
Controls for cryogenics [WE3A.2-60]
Application built on the UNICOS framework.
[Diagram: about 130 PLCs (Schneider Quantum and Siemens S7-400) connected through PROFIBUS DP and PA networks and 4 WorldFIP networks with Linux WorldFIP gateways; PVSS data servers serve the local and central cryogenic control rooms.]
Controls for cryogenics — deployment status
[Diagram: the surface installations are operational/deployed/ready by end 2005; the rest is to be deployed, or developed & deployed, in 2006.]
Collimation System (1) [PO2.016-2]
o Compulsory to gain 3 orders of magnitude in performance beyond other hadron colliders
o 162 collimators when fully deployed
o 5 degrees of freedom & 10 measurements of absolute and relative positions and gaps per collimator
o Synchronous movement with 10 µm precision within a few tens of ms, driven by the local orbit and by beam loss measurements
Collimation System (2)
[Diagram: the central control application receives intensity, energy and β* from the machine-protection channel, BLM and BPM readings, and timing, and sends functions of motor settings, warning levels and dump levels versus time, motor parameters (speed, …) and beam-loss-driven functions to the collimator supervisory system, together with warning & error levels, information and post-mortem data. The supervisory system drives the motor drive control (motor functions and parameters) and the position readout and survey (all position sensors, abort functions, measurements, post-mortem, warnings), with a STOP/abort path on the motors and switches. An environmental survey system monitors temperature sensors (jaw, cooling water, …), vibration measurements & water flow rates, and vacuum pressure & radiation measurements.]
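As an illustrative sketch of a "function of motor setting versus time" with warning and dump levels, the code below linearly interpolates a jaw-position function and checks a measured position against the limits; the class names, the interpolation and the limit policy are assumptions, not the actual collimation controls.

```java
import java.util.TreeMap;

// Illustrative time-dependent collimator jaw function with warning and
// dump thresholds. Interpolation and threshold policy are assumptions.
class JawFunction {
    private final TreeMap<Double, Double> setpoints = new TreeMap<>(); // time [s] -> position [mm]

    void addPoint(double timeS, double positionMm) {
        setpoints.put(timeS, positionMm);
    }

    // Linear interpolation of the demanded jaw position at time t.
    double demanded(double t) {
        if (setpoints.isEmpty()) throw new IllegalStateException("no setpoints defined");
        var lo = setpoints.floorEntry(t);
        var hi = setpoints.ceilingEntry(t);
        if (lo == null) return hi.getValue();
        if (hi == null || lo.getKey().equals(hi.getKey())) return lo.getValue();
        double f = (t - lo.getKey()) / (hi.getKey() - lo.getKey());
        return lo.getValue() + f * (hi.getValue() - lo.getValue());
    }

    // Compare a measured position with warning/dump limits on the error.
    String check(double t, double measuredMm, double warningMm, double dumpMm) {
        double error = Math.abs(measuredMm - demanded(t));
        if (error > dumpMm) return "DUMP";
        if (error > warningMm) return "WARNING";
        return "OK";
    }
}
```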
The CERN Control Center (CCC)
The CERN Control Center
A single control room for CERN to control:
o All accelerators
o All technical services
Grown from the SPS (LEP) control room on the French CERN site (Prévessin).
Work started in November 2004; the building is to be delivered in October 2005 and to be operational in February 2006.
All CERN machines will be operated from the CCC in 2006.
The CERN Control Center [architect's drawing and current aspect]
o Areas for the SPS, the PS complex, technical services + cryoplants, and the LHC
o 40 console modules and 16 large LCD displays
o One of the 20 workplaces of the CCC, each for 2 operators [photo: Erich Keller]
A brief status of the LHC control system
Status: the basic infrastructure
o Network: done; the CERN security strategy remains to be applied.
o VME front-end computers: purchased, done; LEIR 100% installed, LHC hardware commissioning 50% installed.
o PC gateways: purchased, done; LHC hardware commissioning 50% installed.
o PLC front-end computers: purchased, done; cryogenics 60% installed, Powering Interlock System 30% installed.
o WorldFIP: done; tunnel & surface buildings 100% deployed, 35% qualified.
o Remote reboot: done; installed in sectors 7-8 and 8-1.
o Servers: purchased, provisional installation; to be installed in the CCC before February 2006.
o Consoles: equipment defined and purchased, to be delivered in October 2005; to be installed November 2005 – March 2006 for the CCC; installed in the field control room UA83.
o Central timing: done; to be installed in the CCC before March 2006.
o Timing distribution & receivers: done for all modules; installed at LHC Points 1, 7 & 8.
Status: the software components [TH4.2-10]
[Table: control subsystems (Post Mortem, Logging, Timing, Alarms (LASER), Powering Interlocks, Automated Test Procedures, Analogue Signals (OASIS), CMW, FESA, PVSS/UNICOS, application software/LSA core) versus test opportunities (TT40/TI8 extraction test, LEIR beam commissioning, first QRL tests, QPS surface tests, LSS8L tests, large electrical circuit commissioning, SPS/TI2/CNGS), indicating for each combination whether tests are already done, in progress, partial, or not applicable.]
Issues (1)
Basic infrastructure:
o The security policy has to be implemented on the Technical Network without jeopardizing the deployment of the consoles & servers.
o Deployment of the new timing system on the pre-injectors.
Software:
o While the generic applications and general services are on track, the specific application programs for LHC cannot yet be specified.
o The software modules have not been tested at full scale.
Issues (2)
Hardware commissioning:
o The time left to commission the LHC is getting thinner and thinner.
o Manpower is very limited to cover LHC installation, hardware commissioning and support of the operational machines at the same time.
Beam commissioning:
o Some critical systems are quite late (e.g. collimation).
o A strategy has to be found to inject some beam despite all the security systems!
The "legacy software":
o To free manpower for LHC, the existing controls infrastructures have been somewhat neglected.
o The restart of the machines in 2006 will be difficult.
Conclusion
The basic LHC control system exists today.
There is a strong commitment by everyone to be ready to start the LHC with beam in summer 2007.
More news in October 2007 …
Thank you for your attention