Remote Operations For High Energy Physics


Remote Operations For High Energy Physics LHC@FNAL Patricia McBride Fermilab

HEP Remote Operations With the growth of large international collaborations, the need to participate in daily operations from afar has increased. Remote monitoring of experiments is nothing new; the internet has made it relatively easy to check on an experiment from almost anywhere. Remote operations is the next step: enabling collaborators anywhere in the world to participate, and perhaps to take on shift duties.

Key components for remote operations For successful participation in operations at a distance, collaborations must first address issues of communication and trust. These issues can be partially addressed through the choice of appropriate technologies and the establishment of clear collaboration policies. A successful long-term remote operations center takes a strong commitment from the local community and the right environment. Exchange of personnel between sites is key: nothing can replace time spent at the experiment. The principal goal is to enable people to participate in operations when they are unable to travel to the experiment.

Why an LHC remote operations center at Fermilab? Fermilab has contributed to CMS detector construction, hosts the LHC Physics Center (LPC) for US-CMS, is a Tier-1 computing center for CMS, has built and delivered LHC machine components, and is part of the LARP collaboration. The LPC had plans for remote data-quality monitoring of CMS during operations. Could we expand this role to include remote shifts? What are the limits? We also saw an opportunity for US accelerator scientists and detector experts to contribute their expertise during LHC commissioning. Could they help with studies without moving to CERN for a year? The idea of a joint remote operations center at Fermilab was formed, and people from each area joined together to plan a single center.

What is LHC@FNAL? A Place: that provides access to information in a manner similar to what is available in control rooms at CERN, where members of the LHC community can participate remotely in CMS and LHC activities. A Communications Conduit: between CERN and members of the LHC community located in North America. An Outreach Tool: visitors will be able to see current LHC activities, and to see how future international projects in particle physics can benefit from active participation in projects at remote locations.

Planning for LHC@FNAL We formed a task force with members from all FNAL divisions, university groups, CMS, LARP, and LHC; the advisory board had an even broader base. The LHC@FNAL task force developed a plan with input from many sources, including CMS, LHC, CDF, D0, MINOS, MiniBooNE, and Fusion Energy Sciences. We worked with CMS and US-CMS management, as well as members of LARP (LHC Accelerator Research Program) and the LHC machine group, at all steps in the process. A detailed requirements document for LHC@FNAL was prepared and reviewed in 2005. We visited 9 sites (e.g. Hubble, NIF, ESOC) to find out how other projects do remote operations. We are now engaged in construction, integration, software development, and outreach activities. The CMS Remote Operations Center is already in operation. The goal is to have LHC@FNAL ready for detector commissioning and startup of beam in 2007. We plan to work with the ILC controls group to develop plans for ILC remote operations.

Remote operations for LHC and LARP LHC remote operations: training prior to stays at CERN; remote participation in studies; "service after the sale," to support components we built; access to monitoring information from the CERN Control Centre (CCC). LARP: The US LHC Accelerator Research Program (LARP) consists of four US laboratories (BNL, FNAL, LBNL, and SLAC) that collaborate with CERN on the Large Hadron Collider (LHC). The LARP program enables U.S. accelerator specialists to take an active and important role in the LHC accelerator during its commissioning and operations, and to be a major collaborator in LHC performance upgrades.

Connecting to CERN Reliable communications tools and robust, secure software are critical for operations. Some general requirements: remote users should see applications as they appear in the main control room(s) when possible, though they might not have the same privileges; communication channels should be kept open; and clear policies for shifts should be established. The goal is to assist in operations, not to place additional requirements on CERN personnel. [Network diagram: fnal.gov connects through the CERN General Purpose Network (GPN, cern.ch), via a terminal server or ssh and an Oracle DB server, to the LHC Technical Network (TN) and the CMS Experimental Network (EN).] Issues: access to information on technical networks (TN, EN), latency, authorization, authentication, 24x7 communications.
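The ssh path in the diagram above amounts to port-forwarding through a gateway on the GPN so that a remote user never connects directly to the technical networks. A minimal sketch of how such a tunnel command could be assembled (the host names and ports here are illustrative placeholders, not real CERN endpoints):

```python
# Hypothetical sketch: build an ssh argv that forwards a local port to a
# monitoring service behind a GPN gateway. "-N" opens the tunnel without
# starting a remote shell; "-L" specifies local-to-remote forwarding.

def tunnel_command(gateway: str, remote_host: str, remote_port: int,
                   local_port: int) -> list:
    """Return the ssh command that maps localhost:local_port to
    remote_host:remote_port via the gateway machine."""
    return [
        "ssh", "-N",
        "-L", "%d:%s:%d" % (local_port, remote_host, remote_port),
        gateway,
    ]

# Example: reach a (hypothetical) DQM web server on port 8030.
cmd = tunnel_command("gateway.example.cern.ch", "dqm-server", 8030, 8030)
```

Authorization and authentication on the gateway itself, and the 24x7 availability issues listed above, are outside what a simple tunnel addresses.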

LHC Accelerator Fermilab Software (LAFS) It will be difficult for outside visitors to make significant contributions to the LHC once beam commissioning has started: they will be unfamiliar with the control system, and critical problems will most likely be assigned to in-house staff. Fermilab will be more welcome at CERN if the lab can bring real resources to the table and has the ability to solve operational problems. Fermilab has experience with the software issues of running a collider complex; the Fermilab control system is Java-based, like the LHC control system, and the lab has a large pool of Java software expertise to draw on. Fermilab is already collaborating with CERN on a number of software projects. Goal of LAFS: develop a suite of software products that enable Fermilab accelerator physicists to make key contributions to the beam commissioning of the LHC. A small team of computer professionals, operational experts, and accelerator physicists has been assembled to contribute to selected LHC software tasks. Software projects underway, in collaboration with CERN: Role Based Access, Sequenced Data Acquisition (SDA), Sequencer, Tune Tracker.

Role Based Access (RBA) An approach to restricting system access to authorized users. What is a ROLE? A role is a job function within an organization (examples: LHC Operator, SPS Operator, RF Expert, PC Expert, Developer, ...). A role is a set of access permissions for a device class/property group; roles are defined by the security policy, and a user may assume several roles. What is being ACCESSED? Physical devices (power converters, collimators, quadrupoles, etc.) and logical devices (emittance, state variables). What type of ACCESS? Read: the value of a device, once. Monitor: the device, continuously. Write/set: the value of a device. Status: requirements for authentication and authorization are written and in EDMS; a design document is in progress; this is a Fermilab/CERN collaboration. The software infrastructure for RBA is needed for remote operations: permissions can be set up to allow experts outside the control room to read or monitor a device safely.
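The role model described above (a role is a set of permissions over device classes, a user may hold several roles, and remote experts get read/monitor but not write access) can be sketched in a few lines. The role names and device classes below are illustrative, not the actual LHC RBA schema:

```python
# Minimal sketch of the RBA idea: a role maps to a set of
# (device_class, access_type) permissions; a request is granted if
# any role held by the user permits it.
from enum import Enum

class Access(Enum):
    READ = 1      # read a device value once
    MONITOR = 2   # subscribe to a device continuously
    WRITE = 3     # set a device value

# Hypothetical security policy: experts may write, remote shifters may
# only read/monitor (the safe permissions mentioned in the slide).
ROLES = {
    "RF Expert":    {("rf_cavity", Access.READ),
                     ("rf_cavity", Access.MONITOR),
                     ("rf_cavity", Access.WRITE)},
    "Remote Shift": {("rf_cavity", Access.READ),
                     ("rf_cavity", Access.MONITOR),
                     ("power_converter", Access.MONITOR)},
}

def is_allowed(user_roles, device_class, access):
    """Grant the request if any of the user's roles permits it."""
    return any((device_class, access) in ROLES.get(r, set())
               for r in user_roles)

# A remote shifter may monitor an RF cavity but not set it:
assert is_allowed(["Remote Shift"], "rf_cavity", Access.MONITOR)
assert not is_allowed(["Remote Shift"], "rf_cavity", Access.WRITE)
```

In the real system, authentication establishes who the user is and authorization resolves which roles they hold; the lookup above corresponds only to the final permission check.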

US-CMS and remote operations The remote operations center will serve the US-CMS community. There are nearly 50 US institutions in CMS and about 600 signing physicists, engineers, and guest scientists in the US. The LPC will provide a place in the US for physics and analysis discussions and meetings. The CMS remote operations center at Fermilab will provide a US hub for operations activities, where collaborators could take CMS shifts.

CMS Remote Operations Center CMS ROC web page: http://www.uscms.org/LPC/lpc_roc The LHC Physics Center (LPC) first developed the idea of a remote operations center for CMS at Fermilab. The center is already in operation for the cosmic tests of the full detector in the surface building at LHC Point 5 (MTCC). CMS ROC activities will move to the LHC@FNAL center next year.

CMS MTCC at FNAL ROC

Remote operations for CMS MTCC (Summer-Fall 2006) ROC at Fermilab: data quality monitoring, software tests, Tier-0 to Tier-1 transfers. MTCC@CERN, in the CMS temporary control room at Point 5: integration test, detector tests with cosmics, field mapping, global DAQ commissioning, data transfer to Tier-0.

FNAL ROC and MTCC A coordinated effort with the CERN MTCC operation, computing, and software groups. MTCC Phase 1 goal and strategy (DQM was not running continuously at Point 5): • transfer events to FNAL • locally run the available DQM programs and event display systematically • make results easily accessible to everyone as fast as possible • take shifts to contribute to MTCC operation by doing quasi-online monitoring. MTCC Phase 2 goal and strategy (DQM programs running more systematically at Point 5): • do real-time data quality monitoring by looking at the DQM results running at Point 5, and take official DQM shifts • run the event display locally on events transferred in real time • continue quasi-online monitoring as in Phase 1 with the transferred data; this has the advantage of running on every event, and makes it possible to reprocess with improved programs and good constants. We have achieved both the Phase 1 and Phase 2 goals!
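The "quasi-online" monitoring of Phase 1 is essentially a polling loop: as event files arrive from CERN, each new file is run through the DQM programs exactly once. A simplified sketch of that bookkeeping (the `run_dqm` stand-in and `.root` file names are hypothetical, not the actual MTCC transfer machinery):

```python
# Illustrative sketch of quasi-online monitoring: process each newly
# transferred event file once, tracking what has already been checked.

def run_dqm(filename):
    # Stand-in for launching the real DQM program on one file.
    return "checked " + filename

def quasi_online_monitor(transferred, seen):
    """Run DQM on every file in `transferred` not already in `seen`.
    In real use this would be called repeatedly as new files arrive;
    `seen` persists between calls so no file is processed twice."""
    results = []
    for name in sorted(transferred):
        if name not in seen:
            seen.add(name)
            results.append(run_dqm(name))
    return results

# First poll finds two files; a later poll sees one old and one new file.
seen = set()
quasi_online_monitor(["run2.root", "run1.root"], seen)
quasi_online_monitor(["run1.root", "run3.root"], seen)  # only run3 is new
```

Because the loop runs on every transferred event, the same mechanism supports the Phase 2 point about reprocessing: clearing `seen` and rerunning with improved programs and constants revisits the full dataset.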

Goals for joint facility: LHC@FNAL A dedicated facility to support both CMS and LHC commissioning and operations: remote shifts for CMS; enabling experts to access information in real time; participation in studies and commissioning; training; facilitating communication with the CMS and LHC control rooms; a call center for the US, to contact experts and answer questions; and testing collaboration tools to improve communication. It takes advantage of a unique opportunity to have detector and accelerator experts working together to solve problems.

LHC@FNAL Location & Layout: Operations Center (Phase 1), Conference Room (Phase 1), Office Space (Phase 2)

LHC@FNAL – renderings

LHC@FNAL Outreach Outreach planning and activities: developed scenarios for various escorted and non-escorted outreach visits (examples: a tour for senior citizens, a VIP tour, a tour as part of a physics conference, non-technical Fermilab employees on their lunch hours, the general public attending a lecture or concert, a bike rider who stopped at the hi-rise); in the process of writing recommendations based on these scenarios; using the plywood construction wall for displays charting the progress of construction and general information on the LHC and CMS. The large 4' by 7' display has been received, and the group will soon start experimenting with it. Work has begun on specifications for an interactive public outreach display. The location of LHC@FNAL in the main building makes the facility an ideal place for creative outreach activities.

Construction of LHC@FNAL Photo from October 27, 2006

Summary The CMS ROC is already in operation for CMS detector commissioning. The LAFS development program has been successfully launched. Plans for a joint CMS and LHC remote operations center are well advanced: construction of LHC@FNAL began in September 2006 and will be completed by the end of the year. LHC@FNAL will be a place to support commissioning and operations for both the LHC accelerator and CMS. The center aims to enable the US HEP community to remain actively engaged in activities at CERN. We look forward to developing plans for ILC remote operations together with the global ILC community.

Additional Slides

LHC@FNAL Task Force Erik Gottschalk – Chair (FNAL-PPD), Kurt Biery (FNAL-CD), Suzanne Gysin* (FNAL-CD), Elvin Harms* (FNAL-AD), Shuichi Kunori (U. of Maryland), Mike Lamm* (FNAL-TD), Mike Lamont* (CERN-AB), Kaori Maeshima (FNAL-PPD), Patty McBride (FNAL-CD), Elliott McCrory* (FNAL-AD), Andris Skuja (U. of Maryland), Jean Slaughter* (FNAL-AD), Al Thomas (FNAL-CD). (* Accelerator Subgroup) The task force was charged by the Fermilab Director in April 2005; it wrote a requirements document and WBS, and completed its work in March 2006. The formal LHC@FNAL task force had its last meeting on March 29, 2006, and has evolved into an integration task force with a new charge and a few new members.

Some assumptions (CMS operations) For CMS: CMS will have a shift schedule, a run plan, and a protocol that defines the responsibilities and roles of shift personnel; we assume that a shift leader is responsible for CMS shift activities. LHC@FNAL will have shift operators who will be able to assist US-CMS collaborators with CMS activities during commissioning and operations. LHC@FNAL will participate in CMS shifts; neither the duration nor the frequency of the LHC@FNAL shifts has been determined. The CMS Collaboration will have a protocol for access to the CMS control system (PVSS), and a policy for how access to the control system will vary depending on the physical location of an individual user. The CMS Collaboration will have a policy that defines how DAQ resources are allocated, including the allocation of DAQ resources to detector groups for calibration and testing. The CMS Collaboration will have a protocol that defines how on-demand video conferencing will be used in CMS control rooms and LHC@FNAL. The CMS Collaboration will provide web access to electronic logbooks and monitoring information for collaborators worldwide, and will maintain a call tree listing on-call experts worldwide for each CMS subsystem during commissioning and operations. For both CMS and LHC: LHC@FNAL will comply with all CERN and Fermilab safety and security standards.

Site Visits Technology Research, Education, and Commercialization Center (TRECC) – West Chicago, Illinois (Aug. 25, 2005) Gemini Project remote control room – Hilo, Hawaii (Sept. 20, 2005) http://docdb.fnal.gov/CMS-public/DocDB/ShowDocument?docid=425 Jefferson Lab control room – Newport News, Virginia (Sept. 27, 2005) http://docdb.fnal.gov/CMS-public/DocDB/ShowDocument?docid=505 Hubble Space Telescope & STScI – Baltimore, Maryland (Oct. 25, 2005) National Ignition Facility – Livermore, California (Oct. 27, 2005) http://docdb.fnal.gov/CMS-public/DocDB/ShowDocument?docid=532 General Atomics – San Diego, California (Oct. 28, 2005) Spallation Neutron Source – Oak Ridge, Tennessee (Nov. 15, 2005) http://docdb.fnal.gov/CMS-public/DocDB/ShowDocument?docid=570 Advanced Photon Source – Argonne, Illinois (Nov. 17, 2005) European Space Operations Centre – Darmstadt, Germany (Dec. 7, 2005) http://docdb.fnal.gov/CMS-public/DocDB/ShowDocument?docid=622