Major Challenges / MOPA02: A New Remote Operations Center at Fermilab. J. Patrick, et al., Fermilab.

Abstract

Commissioning the LHC accelerator and experiments will be a vital part of the worldwide high-energy physics program beginning in 2007. A remote operations center has been built at Fermilab to make it easier for accelerator scientists and experimentalists working in North America to help commission and participate in operations of the LHC and its experiments. The evolution of this center from concept through construction and early use will be presented, as will details of its controls system, management, and expected future use.

Contents

Introduction, Concept, Design, Construction, Early Use, Details/Special Features, Future Plans, Summary & Acknowledgements.

What is LHC@FNAL?

A place:
- that provides access to information in a manner similar to what is available in the control rooms at CERN;
- where members of the LHC community can participate remotely in CMS and LHC activities.
A communications conduit:
- between CERN and members of the LHC community located in North America.
An outreach tool:
- visitors will be able to see current LHC activities;
- visitors will be able to see how future international projects in particle physics can benefit from active participation in projects at remote locations.

What is LHC@FNAL? (continued)

- Allows experts located at Fermilab to participate in CMS and LHC commissioning and operations.
- Provides the hardware and software necessary to participate effectively in CMS and LHC.
- Facilitates communication and helps members of the LHC community in North America contribute their expertise to CMS and the LHC.
- An extension of the CERN Control Centre (CCC): for example, to assist members of US/LARP in training and data analysis.
- An extension of the CMS Control Room: for example, to provide a call center for US-CMS collaborators to access information about CMS and the LHC accelerator.
- A unique opportunity to have detector and accelerator experts in close proximity to each other, solving problems together.

Remote Operations for LHC and LARP

LHC remote operations:
- training prior to stays at CERN;
- remote participation in studies;
- 'service after the sale': support for accelerator components built in the U.S.;
- access to monitoring information;
- software development for the LHC controls system (LAFS).
(Photo: the CCC at CERN.)
LARP: the US LHC Accelerator Research Program consists of four US laboratories (BNL, FNAL, LBNL, and SLAC) that collaborate with CERN on the LHC. The program enables U.S. accelerator specialists to take an active and important role in the LHC accelerator during its commissioning and operations, and to be major collaborators in LHC performance upgrades.

How Did the Concept Evolve?

Fermilab has contributed to CMS detector construction, hosts the LHC Physics Center (LPC) for US-CMS, is a Tier-1 grid computing center for CMS, has designed and fabricated LHC machine components, is part of the LHC Accelerator Research Program (LARP), and is involved in software development for the LHC controls system through a collaboration agreement with CERN called LHC@FNAL Software (LAFS).
- The LPC had always planned on remote data-quality monitoring of CMS during operations. Could this role be expanded to include remote shifts?
- LARP was interested in providing support for US-built components, training people before they go to CERN, and remote participation in LHC studies.
- We saw an opportunity for US accelerator scientists and engineers to work together with detector experts to contribute their combined expertise to LHC and CMS commissioning.
The idea of a joint remote operations center at FNAL emerged.

Concept

- Proof-of-principle work done by LHC/LARP personnel (thanks to AB/OP colleagues at CERN) and by the CMS Remote Operations Center.
- Unified task force formed at the request of the FNAL Director: first meeting 4 May 2005; close-out 19 October 2006.
- Requirements document created and reviewed: CMS, LHC, and combined CMS/LHC requirements plus constraints; 63 total requirements; reviewed 21 July 2005.
- Proposal to the Directorate and constituents.
- Construction: authorization and engineering in May 2006; construction initiated September 2006.

Site Visits

- Technology Research, Education, and Commercialization Center (TRECC) – West Chicago, Illinois (Aug. 25, 2005)
- Gemini Project remote control room – Hilo, Hawaii (Sept. 20, 2005)
- Jefferson Lab control room – Newport News, Virginia (Sept. 27, 2005)
- Hubble Space Telescope & STScI – Baltimore, Maryland (Oct. 25, 2005)
- National Ignition Facility – Livermore, California (Oct. 27, 2005)
- General Atomics – San Diego, California (Oct. 28, 2005)
- Spallation Neutron Source – Oak Ridge, Tennessee (Nov. 15, 2005)
- Advanced Photon Source – Argonne, Illinois (Nov. 17, 2005)
- European Space Operations Centre – Darmstadt, Germany (Dec. 7, 2005)

Design

- Design work done in-house.
- A variation of the CCC design, adapted to the center's purpose.
- High-visibility location preferred by the Laboratory Director; adjacent to meeting and office areas.
- Provides security while maintaining privacy.
Special features:
- storefront (mullion-free) glass;
- projection wall and screens;
- privacy glass between the center and the adjacent conference room;
- window treatment to control morning glare.

Location

(Floor plan: the center is in Wilson Hall, near the Main Entrance and the Cafeteria.)

Renderings

(Architectural renderings of the center.)

Consoles

- Three bids submitted.
- Cost and specifications considered.
- The same vendor as for the CCC was selected.

Construction Summary

- Safety: no injuries; one incident.
- On time: 12-week schedule.
- Under budget.

J. Patrick - Major Challenges/MOPA0215 Construction Slide Show

Computing

Separate computing systems/platforms for:
- consoles;
- outreach;
- videoconferencing;
- projectors;
- gateway server.
Access is protected as appropriate.

Early Use

Current organization: Engineering Working Group, Operations Support Team, CMS Working Group, LHC Working Group, Outreach Working Group.
The primary user to date is CMS:
- Tier-1 computing support;
- remote shifts by the Tracking group from March to May;
- test-beam activities by the HCAL group;
- currently, Global Commissioning runs;
- CMS data operations: the 'CSA07' computing challenge.
Also:
- LARP SPS beam-study period;
- LHC hardware commissioning;
- software applications development.

Noteworthy Features

Features that are currently available:
- CERN-style consoles with 8 workstations shared by CMS and LHC.
- Videoconferencing installed for 2 consoles; can be expanded to 4 consoles.
- Webcams for remote viewing of the center.
- Secure keycard access to the center.
- Secure network for console PCs (dedicated subnet, physical security, dedicated router with access control lists to restrict access, available only in the center).
- 12-minute video essay displayed on the large "Public Display", used by docents from the Education Department to explain CMS and the LHC to tour groups.
- High-definition (HD) videoconferencing system for the conference room.
- HD viewing and display capabilities in the center.
- Secure group login capability for consoles, with persistent console sessions.
- Role Based Access Control (RBAC) for the LHC controls system (LAFS).
- Screen Snapshot Service (SSS) for CMS and the LHC controls system.

Details & Special Features: Role Based Access Control (RBAC)

RBAC is an approach for restricting system access to authorized users.
What is a role?
- A role is a job function within an organization. Examples: LHC Operator, SPS Operator, RF Expert, PC Expert, Developer, ...
- A role is a set of access permissions for a device class/property group.
- Roles are defined by the security policy.
- A user may assume several roles.
What is being accessed?
- Physical devices (power converters, collimators, quadrupoles, etc.).
- Logical devices (emittance, state variable).
What type of access?
- Read: read the value of a device once.
- Monitor: monitor the device continuously.
- Write/set: set the value of a device.
The software infrastructure for RBAC is crucial for remote operations in that it provides a safeguard: permissions can be set up to allow experts outside the control room to read or monitor a device safely.
Status: deployed at the end of June 2007. This is a FNAL/CERN collaboration (LAFS) working on RBAC for the LHC controls system. See posters TPPA04, TPPA12, WPPB08.
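To make the role/permission/access-type model above concrete, here is a minimal, self-contained Java sketch of such an access check. It is illustrative only: the class, role, and device names are invented for this example and are not the actual LHC controls API described in the posters.

```java
import java.util.EnumSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class RbacSketch {

    // The three access types named above.
    enum Access { READ, MONITOR, WRITE }

    // A permission grants a set of access types on one device class.
    record Permission(String deviceClass, Set<Access> allowed) {}

    // The security policy maps each role (a job function) to its permissions.
    static final Map<String, List<Permission>> POLICY = Map.of(
        "LHC Operator", List.of(new Permission("PowerConverter",
            EnumSet.of(Access.READ, Access.MONITOR, Access.WRITE))),
        "RF Expert", List.of(new Permission("Collimator",
            EnumSet.of(Access.READ, Access.MONITOR))));  // no remote WRITE

    // A user may assume several roles; access is granted if any role permits it.
    static boolean isAllowed(Set<String> roles, String deviceClass, Access access) {
        return roles.stream()
            .flatMap(r -> POLICY.getOrDefault(r, List.of()).stream())
            .anyMatch(p -> p.deviceClass().equals(deviceClass)
                        && p.allowed().contains(access));
    }

    public static void main(String[] args) {
        Set<String> remoteExpert = Set.of("RF Expert");
        // An expert outside the control room may monitor a device safely...
        System.out.println(isAllowed(remoteExpert, "Collimator", Access.MONITOR)); // true
        // ...but the policy withholds write access.
        System.out.println(isAllowed(remoteExpert, "Collimator", Access.WRITE));   // false
    }
}
```

This mirrors the safeguard described above: the policy, not the user's location, decides whether a device may only be read or monitored rather than written.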

Details & Special Features: Screen Snapshot Service (SSS)

SSS is an approach for providing a snapshot of a graphical interface to remote users.
What is a snapshot?
- An image copy of a graphical user interface at a particular instant in time. Examples: DAQ system buffer display, operator control program, ...
- A view-only image, so there is no danger of accidental user input.
- Initially implemented for desktops, but could be targeted at application GUIs.
What is the role of the service?
- Receives and tracks the snapshots from the monitored applications.
- Caches the snapshots for short periods of time.
- Serves the snapshots to requesting applications/users.
- Prevents access from unauthorized applications/users.
- Acts as a gateway to private-network applications for public-network users.
How does this work?
- Applications capture and send snapshots to the service provider in the background.
- Users access the snapshots using a web browser.
Status: SSS is being used by CDF for operations and by the CMS Silicon Tracker group, and it is installed at CERN and in the ROC. More information at:
(Diagram: monitored applications push snapshots to the Snapshot Service, which serves them to web browsers on request.)
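The capture/cache/serve cycle above can be illustrated with a minimal Java sketch using only JDK classes (java.awt.Robot for capture, com.sun.net.httpserver for serving). This is not the actual SSS code: the port, refresh interval, and endpoint name are arbitrary assumptions, and a real deployment would add the authorization layer described above.

```java
import com.sun.net.httpserver.HttpServer;
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.util.concurrent.atomic.AtomicReference;
import javax.imageio.ImageIO;

public class SnapshotSketch {
    public static void main(String[] args) throws Exception {
        // The cache: always holds the most recent snapshot as PNG bytes.
        AtomicReference<byte[]> latest = new AtomicReference<>(new byte[0]);

        Robot robot = new Robot();
        Rectangle screen = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());

        // "Monitored application" side: capture the desktop every 5 seconds
        // in the background and update the cache.
        Thread capture = new Thread(() -> {
            try {
                while (true) {
                    BufferedImage img = robot.createScreenCapture(screen);
                    ByteArrayOutputStream buf = new ByteArrayOutputStream();
                    ImageIO.write(img, "png", buf);
                    latest.set(buf.toByteArray());
                    Thread.sleep(5000);
                }
            } catch (Exception e) { e.printStackTrace(); }
        });
        capture.setDaemon(true);
        capture.start();

        // "Service" side: serve the cached snapshot to web browsers.
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/snapshot", exchange -> {
            byte[] png = latest.get();
            exchange.getResponseHeaders().set("Content-Type", "image/png");
            exchange.sendResponseHeaders(200, png.length);
            try (OutputStream out = exchange.getResponseBody()) { out.write(png); }
        });
        server.start();
        System.out.println("Snapshots at http://localhost:8080/snapshot");
    }
}
```

Because the server only ever returns a PNG rendering of the desktop, remote viewers get the view-only guarantee described above: no input events travel back to the monitored machine.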

Future Plans

- Ramp up CMS shifts.
- LHC hardware commissioning.
- Keep LHC Project Associates engaged after their return.
- US/LARP deliverable monitoring.
- LAFS: continue applications development.
- LHC beam participation: SPS and other injector beam studies; LARP instrumentation; LHC commissioning and beam studies (especially for luminosity upgrades).

Summary

Remote operations is the next step in enabling collaborators to participate in operations from anywhere in the world. The goals are to have:
- secure access to data, devices, logbooks, monitoring information, etc.;
- safeguards, so actions do not jeopardize or interfere with operations;
- collaborative tools for effective remote participation in shift activities;
- remote shifts to streamline operations.
Fermilab has built the Remote Operations Center, which is shared by scientists and engineers working on the LHC and CMS. It was a collaborative design and was built rapidly.
For the LHC, the center provides a means to participate remotely in LHC studies, access to monitoring information, and a training facility, and it supports the collaborative development of software for the LHC controls system.
For CMS, it provides a location (in a U.S. time zone) for CMS remote commissioning and operations shifts, and for Tier-1 grid monitoring shifts.
It is already a popular stop for visitors and dignitaries.

Acknowledgements

- The task force, especially Erik Gottschalk, from whom many slides were taken.
- The design, engineering, and construction team from Fermilab/FESS: Gary Van Zandbergen, Steve Dixon, Merle Olson, Tom Prosapio.
- CERN AB/OP, especially Django Manglunki.
- The staff of the sites visited.
- The users of the facility.