Events for the SPS Legacy & Common Implications

1 Events for the SPS Legacy & Common Implications

2 Motivation for common events
Ability to use the same hardware and software on different accelerators.
 In particular FESA
 CTR & CTG over GMT
Save on exploitation costs.
Save on development costs.
Common approach to operations for all machines in the CCC.
High-precision UTC time stamping.
Ability to mix events from different accelerators on the same timing cable.

3 Attitude for Legacy SPS events
Keep the strict minimum going so that legacy applications continue to work.
Don't use them for new developments.
Remove restrictive hardware that blocks us.
 Such as the Tg3 (old payloads, 4 events per ms)
 Update existing hardware like the Pulse Gaters as a temporary solution

4 Multiple timing cables
Normally users NEVER see events, only CTIM equipment. This has worked well in the past, however…
 All accelerators at CERN are controlled via General Machine Timing (GMT) cables: 6 for the LHC Injector Chain (LIC) network and 1 for the CTF network: LINAC-II & PSB, LINAC-III & LEIR, CPS, ADE, SPS, LHC, CTF.
 Some events must be sent over many cables, so we need some standards for event layout… 0xMTCCPPPP

5 Machine IDs and events
0: Not a machine event
1: Is the LHC, was LEP
2: Is the SPS
3: Is the CPS and SCT
4: Is the PSB and FCT
5: Is LEIR, was the LPI
6: Is ADE
7-8: Reserved for future use

6 Event types
01        Millisecond, lots of different modulo formats
02,03,04  SPS Second, Minute, Hour
05,06,07  SPS Day, Month, Year
08        PS Day, Month, Year
09        PS Hour, Minute, Second
0A        Cable ID, followed by GMT cable code
20        Legacy SPS SSC, followed by super-cycle number
21        Legacy SPS machine event, e.g. 0x21B SPS Start super-cycle, no payload
2F        SPS ancient events
X3        Telegram event, followed by number and value
X4        Same as 21, machine event, but CYTAG payload
X5        Telegram description event, followed by number and type, for TSU modules
B5,B6     UTC Second, MSW, LSW
(X = Machine ID, 1..8)

7 Legacy and Common events
Legacy (4 events per millisecond):
 [02,03,04] Time
 [05,06,07] Date
 [01] 8-bit millisecond modulo
 [20] Super-cycle number
 [22] Start super-cycle
 [21] Machine events (event payload is Type, Number)
 [2F] Ancient events
Common (8 events per millisecond):
 UTC second, 32 bits, via [B5,B6]
 [01] 16-bit millisecond modulo
 [23] Telegrams
 [24] Machine events (event payload is Cycle Tag)
 Legacy support

8 Incorporated the SPS in the CBCM in 2004
Extended the idea of a cycle, containing an integer number of basic periods, towards the SPS.
Incorporated these cycles in the beam descriptions used by the MTG.
[Diagram: PSB SFTPRO and CPS SFTPRO cycles feeding the SPS FIXED TARGET cycle]

9 Why Cycle Tags?
[Diagram: a CPS Fixed Target MD (0.9) cycle of S basic periods [X n], showing the W-Start cycle, Beam-In/Beam-Out markers, the Master Inject (virtual event), cycle and beam-cycle boundaries, and the field F(cycle) over time]

10 Implications: SPS Timing Layout
Work needs to be done to define how the new 0x24 events will be laid out; it's not just a change from 1 to 4 in the header and replacing 0101 payloads with Cycle Tags.
 A small band of experts will start work very soon: Lars Jensen, Gabriel Metral, Markus Albert, Lasse Norman, Delphine Jacquet, Andy Butterworth, and HT people.
Adjusting the timing twice should be avoided.
This is an ongoing light activity; it will need the basics complete for the Faraday cage.

11 Conclusion
We will be able to continue with both legacy and 0x24 headers on the SPS cable; there will be some other machine events.
More than twice the bandwidth: 7 events instead of 3.
24-bit payloads are illegal, with implications for the 0x20 SPS-SSC event.
UTC time stamps will be available.
SPS Telegrams are available, and others.
This approach will be used in the RF cage with the new CTRV hardware, hence it must be ready.
The old SL Tg8 modules must work with the new events, UTC and Telegrams; they would be too expensive to replace.
The PS events will carry CYTAG payloads too, so modifications of the PS Tg8s are also needed.

12 Implications: SPS-Tg8 Old Firmware
Old modules running on old-style DSCs will be mostly unaffected. There are two small differences…
 The super-cycle number is 16-bit big-endian instead of 24-bit little-endian.
 There are no Tg3 Date/Time events any more, so the date is garbage.
During the tests, 8 events per millisecond was very close to the limit with 3 wild cards.
We can download the new firmware on old crates if needed.

13 Implications: SPS-Tg8 New Firmware
The SPS-Tg8 is accessible via TimLib. In this case new Tg8 firmware is downloaded automatically when the library is initialized. This library supports the FESA API. Some points…
 UTC events and time support are backwards compatible.
 The SUPER-CYCLE number returned by the firmware is the 32-bit UTC time of the event, namely the super-cycle stamp.
 SPS Telegrams work correctly.
 Connecting to 0x21 headers gives a warning, and soon will give an error.
 0x24 and 0x21 headers are supported simultaneously.
 The firmware runs ~30% faster and is shorter thanks to the -O3 option.
It is possible for old software and FESA to coexist on the same module, but I don't recommend it.
All this is mostly finished and seems to work so far.

14 Implications: PS-Tg8
Due to the extra cost of processing events with payloads, there will be more loading of the PS-Tg8 firmware, which may cause problems.
 A new version needs to be made that simply ignores payloads in triggers.
No big problem, as the firmware is downloaded at DSC start-up.

15 Implications: CBCM
Event payloads will be turned on everywhere.
New SPS event layout; ongoing occasional meetings as needed.
Introduction of reflective memory for multiple-system synchronization.
Handling the LIC during an LHC fill requires a hard look at the CBCM, especially for the number of batches, target bucket and ring. Some of these features will be required in 2006; there is a need for hard synchronization in the business layer at least. This needs to be implemented, no matter what.
There is a debate going on between CPS operators and CO: they use two tiers, not three, and want reliable synchronization too.
Soft synchronization needs working on; a more reliable version of the DTM is nearly ready.

16 SPS Super-Cycles 2006: FT + MD, 16.8 seconds

17 SPS Super-Cycles 2006: LHC pilot + 3 CNGS, 22.8 seconds

18 SPS Super-Cycles 2006: LHC filling, 21.6 seconds

19 SPS Super-Cycles 2006: FT + MD + CNGS, 34.8 seconds

20 SPS Super-Cycles 2006
Some of these super-cycles are breaking new ground, never tried before on the SPS.
 Modifications of CBCM FIDO needed
 Even legacy stuff needs testing
Serious discussions needed on:
 External conditions
 Economy modes
 Legacy payloads
Some discussions are still needed on economy-mode switching.