LS1 Review P. Charrue

Audio/Video infrastructure
- LS1 saw the replacement of the BI and RF analog video transport with digital video transport
- The work was organised in close collaboration with the groups involved and with OP
- Both the old and the new distribution ran in parallel until OP and the equipment groups agreed to stop the old one
- Today the deployed solution provides a correct service to OP, BI and RF, with a very short delay between analog and digital (around 10 ms)

CCC/CCR: What does a long shutdown mean for your services in general?
- As the CCR provides services to all equipment groups (including access and safety, CV, Cryo and PVSS), there is NO such thing as a shutdown for the CCR team
- Out of 340 servers in the CCR, about 300 were kept operational during LS1
- On the contrary, some services become more critical, such as the LHC CV system, which has to handle human presence in the tunnel
- Our shutdown work therefore demands a lot of coordination and the ability to offer redundant services

CCC/CCR: How did you prepare for LS1?
There were 4 main activities in the CCC/CCR:
- Complete re-engineering of the EL and CV infrastructure of the CCR
- Network backbone upgrade from 1 Gb/s to 10 Gb/s
- BE/CO 'classic' upgrades and maintenance of the installed servers
- Replacement of the CCC consoles
Preparation:
- Mainly via meetings with our 'clients' to explain what we had to do and to find the best possible window to deploy our changes
- Via meetings with EN/EL, EN/CV and IT/CS to organise the big changes in the CCR with as little perturbation as possible for our users
- Plus presentations in ad-hoc meetings (LMC, IEFC, TIOC, …)
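To make the backbone upgrade concrete, the sketch below shows one possible way to verify that renovated CCR servers actually negotiated a 10 Gb/s link, by reading the Linux sysfs entry /sys/class/net/<iface>/speed over SSH. This is an illustrative sketch only: the host names and interface name are hypothetical, and the source does not describe how the verification was actually carried out.

#!/usr/bin/env python3
"""Minimal sketch: check that renovated servers negotiated a 10 Gb/s link.

Assumptions (not from the source): the host list, the interface name and the
use of SSH. On Linux, /sys/class/net/<iface>/speed reports the link speed in Mb/s.
"""
import subprocess

EXPECTED_MBPS = 10000                         # 10 Gb/s backbone after the LS1 upgrade
HOSTS = ["ccr-server-01", "ccr-server-02"]    # hypothetical host names
IFACE = "eth0"                                # hypothetical interface name

def link_speed(host: str, iface: str) -> int:
    """Return the negotiated link speed in Mb/s, or -1 if it cannot be read."""
    cmd = ["ssh", host, f"cat /sys/class/net/{iface}/speed"]
    try:
        out = subprocess.run(cmd, capture_output=True, text=True,
                             timeout=10, check=True).stdout.strip()
        return int(out)
    except (subprocess.SubprocessError, ValueError):
        return -1

if __name__ == "__main__":
    for host in HOSTS:
        speed = link_speed(host, IFACE)
        status = "OK" if speed >= EXPECTED_MBPS else "CHECK"
        print(f"{host:<20} {speed:>6} Mb/s  {status}")

In practice a link-speed check like this would be complemented by real throughput measurements, but it catches mis-negotiated ports early.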

CCC/CCR: What worked well in LS1? What didn't work well in LS1?
- All the activities mentioned above went extremely smoothly, with almost no perturbation to our users
- The CCC console change, however, was not smooth: the standard CERN-store PCs did not work well due to motherboard issues (PCs blocking, frozen video output, …)
- A crash program to replace the 110 PCs with new ones therefore had to take place later, in 2015, with the associated perturbation to OP
- What could be improved is the communication FROM our users regarding their activities:
  - We had to chase our users to understand their needs in terms of availability of the services from the CCR
  - The operational program of the machines (CTF, LINAC, ISOLDE, …) was not always clear and/or communicated early enough for us to plan our work

CCC/CCR: Planning and organisation
- As said before, mainly ad-hoc meetings with clients or service providers (EN/EL, EN/CV and IT/CS) and presentations to official meetings (LMC, IEFC, TIOC)
- No specific tools or processes were used
- We influenced the CCR EL and CV work so as to perturb our users as little as possible

CCC/CCR: Technical impact
- The work in the CCR significantly improved the local infrastructure in terms of network bandwidth, electrical power and cooling power
- Redundancy of the CV and EL services was increased
- These changes were largely transparent to our users, who might see a communication speed increase; overall operation will be much more stable (no more impact due to EL or CV issues)
- In close collaboration with EL, CV and IT, several acceptance tests were organised before the system was put into operation
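The source does not detail how the acceptance tests were performed. As a minimal, hypothetical sketch only, a reachability check of critical CCR services after an EL/CV/network intervention could look like the following (the service names and ports are placeholders, not actual CCR services):

#!/usr/bin/env python3
"""Hedged sketch: verify that critical services answer after an intervention.

The (host, port) pairs below are hypothetical placeholders; the source does
not describe the actual acceptance-test procedure.
"""
import socket

CRITICAL_SERVICES = [
    ("access-gateway", 22),    # placeholder for an access/safety service
    ("cryo-scada", 5432),      # placeholder for a Cryo/PVSS back-end
    ("cv-frontend", 80),       # placeholder for a CV supervision front-end
]

def reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    failures = [(h, p) for h, p in CRITICAL_SERVICES if not reachable(h, p)]
    if failures:
        print("Acceptance check FAILED for:", failures)
    else:
        print("All critical services reachable: acceptance check passed")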

Audio/Video: Outlook for LS2
- In close collaboration with BE/ICS (former GS/ASE), and with input from OP, a replacement of the Public Address (and Intercom?) system will be deployed during LS2
- The complete video infrastructure will be renovated with modern digital technology

CCC/CCR: Outlook for LS2
- LS2 will not differ from LS1, as our services from the CCR will continue to be operational
- We would appreciate receiving as soon as possible the list of services/machines that should be kept operational, e.g.:
  - Will the LHC stay cold or warm? Will the vacuum be maintained?
  - What are the access and safety needs?
  - Which machines/experiments will have beam or will be commissioned during LS2, and at what exact dates?
- In terms of high-level plans for LS2, we anticipate a review of the back-end, as the market is evolving fast in this domain; the server infrastructure in 2020 might be different from today's

(New) IN section: outlook for LS2
- The outcome of the 3 CO3 projects (Platform, Fieldbus, I/O) will entail pilot installations, and the IN section is closely involved in order to anticipate the needs
- Post-ACCOR actions planned by CO3 and followed up by the installation team
- Follow-up of the new HT initiatives (pulse repeater, RS485 GMT, new WFIP master, White Rabbit for timing, OASIS, Btrain), and readiness to plan these new installations
- In parallel, the KONTRON platform is secured until end 2018 (before LS2), but its future will have to be addressed in the coming years, with the planning and installation inherent to a possible change
- The same applies to the distributed consoles outside the CCC, around the accelerators: an initiative will be launched to study the best solution covering the needs of our users (e.g. Windows access for TE/ABT)
- Not to forget the next phases of the GIS Rack Portal