Major Challenges / MOPA02 – A New Remote Operations Center at Fermilab. J. Patrick, et al., Fermilab.



Abstract
Commissioning the LHC accelerator and experiments will be a vital part of the worldwide high-energy physics program. A remote operations center has been built at Fermilab to make it easier for accelerator scientists and experimentalists working in North America to help commission and participate in operations of the LHC and its experiments. The evolution of this center from concept through construction and early use will be presented, as will details of its controls system, management, and expected future use.

Contents
- Introduction
- Concept
- Design
- Construction
- Details/Special Features
- Early Use
- Future Plans
- Summary & Acknowledgements

What is LHC@FNAL?
- A place where members of the LHC community can participate remotely in CMS and LHC activities. It provides access to information similar to what is available in the control rooms at CERN, with the amenities of a formal control room.
- A communications conduit between CERN and members of the North American LHC community.
- An outreach tool: visitors are able to see current LHC activities, and to see how future international projects in particle physics can benefit from active participation in projects at remote locations.

What is LHC@FNAL for?
Goal: facilitate communication, and help members of the LHC community in North America contribute their expertise.
- The LHC is a large part of the US and Fermilab program.
- Major US colliders shut down in 1-2 years.
- The HEP trend is toward fewer but larger and more complex facilities, with widely distributed expertise that is not always at CERN.
CMS: one of several dedicated operations and monitoring centers:
- a traditional "Control Room" located at Point 5 in Cessy, France
- a "CMS Centre" for up to fifty people located in Meyrin, Switzerland
- remote centers such as LHC@FNAL at Fermilab.
LHC: an extension of the CERN Control Centre (CCC):
- remote monitoring for LHC accelerator components (magnets and instrumentation) developed and built in the U.S.
- development of software for the LHC controls system
- development of tools for efficient remote access.
A unique opportunity to have detector and accelerator experts in close proximity to each other, solving problems together.

How did the Concept Evolve?
Fermilab:
- contributed to construction of several major CMS components
- hosts the LHC Physics Center (LPC) for US-CMS
- is a Tier-1 grid computing center for CMS and is also involved in offline software development
- designed and fabricated LHC machine components
- is part of the US LHC Accelerator Research Program (LARP)
- is involved in software development for the LHC control system through a collaboration agreement with CERN called LHC@FNAL Software (LAFS).
The LPC had always planned for remote data quality monitoring of CMS during operations. Could this role be expanded to include remote shifts? LARP was interested in providing support for US-built components, in training people before they go to CERN, and in remote participation in LHC studies. We saw an opportunity for US accelerator scientists and engineers to work together with detector experts and contribute their combined expertise to LHC and CMS commissioning. The idea of a joint remote operations center at FNAL emerged.

Remote Operations for LHC and LARP
LHC remote operations:
- training prior to stays at CERN
- remote participation in studies
- 'service after the sale': support for accelerator components built in the U.S.
- access to monitoring information
- software development for the LHC controls system (LAFS).
Photo: the CCC at CERN.
LARP: The US LHC Accelerator Research Program consists of four US laboratories, BNL, FNAL, LBNL, and SLAC, which collaborate with CERN on the LHC. The LARP program enables U.S. accelerator specialists to take an active and important role in the LHC accelerator during its commissioning and operations, and to be a major collaborator in LHC performance upgrades.

Concept Evolution
- Proof-of-principle work done by LHC/LARP personnel (thanks to AB/OP colleagues at CERN) and by the CMS Remote Operations Center: Hadron Calorimeter Test Beam, Magnet Test and Cosmic Challenge.
- Unified Task Force formed at the request of the FNAL Director: first meeting 4 May 2005, close-out 19 October 2006.
- Requirements document created and reviewed (CMS, LHC, CMS/LHC combined, constraints): 63 total requirements; review 21 July 2005.
- Visits to other major facilities in fall 2005: JLAB, Hubble Space Telescope, SNS, APS, National Ignition Facility, ESO, …
- Proposal to Directorate and constituents.
- Construction: authorization and engineering May 2006; construction initiated September 2006.

Design
- Variation of the CERN Control Centre (CCC) design; design work done in-house.
- High-visibility location preferred by the Laboratory Director; adjacent to meeting and office areas.
- Provides security; maintains privacy when required.
- Special features: storefront/mullion-free glass, projection wall and screens, privacy glass between the center and the adjacent conference room, programmable lighting, standalone HVAC system, window treatment for morning glare.

Location
Wilson Hall, near the main entrance and cafeteria.

Renderings

Consoles
- Three bids were submitted; cost and specifications were considered.
- The same vendor as for the CCC was selected.

Construction Slide Show

Construction Summary
- Safety: no injuries, one incident
- On time: 12-week schedule
- Under budget

Noteworthy Features
Features that are currently available:
- CERN-style consoles with 8 workstations shared by CMS and LHC
- Videoconferencing installed for 2 consoles
- Webcams for remote viewing
- Secure keycard access to the centre
- Secure network for console PCs: dedicated subnet, router with restricted access
- 12-minute video essay displayed on the large "Public Display", used to explain the LHC and CMS to tour groups visiting Fermilab
- High Definition (HD) videoconferencing system for the conference room; HD viewing and HD display capabilities in the centre
- Secure group login capability for consoles, with persistent console sessions

Computing
Separate computing systems/platforms for:
- Consoles
- Outreach
- Videoconferencing
- Projectors
- Gateway server
Protected access as appropriate.

Early Use
Current organization: Engineering Working Group, Operations Support Team, CMS Working Group, LHC Working Group, Outreach Working Group.
Early use includes:
- LARP: SPS beam study period
- LHC hardware commissioning
- LHC@FNAL Software (LAFS): applications development

CMS Usage
Tier-1 Computing Operations:
- responsible for FNAL Tier-1 computing, ~50% of USCMS computing
- central support for university-based Tier-2 centers.
CMS Tracker Integration Test:
- full operational test of 18% of the CMS silicon tracker, located at CERN; the US was involved in detector construction
- shifts ran between February and June 2007; about 15 people from FNAL and several other institutions participated
- remote monitoring (detector control restricted to CERN): data quality, event display, Detector Control System (detector voltages, currents, temperatures, …)
- data transfer to FNAL and associated bookkeeping.
CMS Global Runs: integration tests at the end of every month.

Role Based Access (RBAC)
- An approach to restricting system access to authorized users, applied to the LHC control system.
- A ROLE is a job function; examples are LHC operator, RF expert, developer, etc.
- Control system users are assigned one or more roles.
- Device/property access may be restricted to authenticated users with certain roles: read, monitor, write/set.
- Crucial for remote access: permissions can be set up so that remote users can access certain devices safely.
- Developed as a collaboration between CERN and FNAL (LAFS); deployed in June 2007.
- See posters TPPA04, TPPA12, WPPB08.
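The role-to-permission model above can be sketched in a few lines. This is a minimal illustration of the idea only, not the actual LHC controls implementation; the role names, device/property name, and permission table are all hypothetical:

```python
from enum import Flag, auto

class Access(Flag):
    """Access levels from the slides: read, monitor, write/set."""
    NONE = 0
    READ = auto()
    MONITOR = auto()
    WRITE = auto()

# Hypothetical permission table: each role maps a device/property
# to the access it permits.
ROLE_PERMISSIONS = {
    "lhc-operator": {"RF.Voltage": Access.READ | Access.MONITOR | Access.WRITE},
    "rf-expert":    {"RF.Voltage": Access.READ | Access.MONITOR | Access.WRITE},
    # A remote user may monitor this device but not set it.
    "remote-user":  {"RF.Voltage": Access.READ | Access.MONITOR},
}

def check_access(user_roles, device_property, wanted):
    """Grant access if any of the user's roles permits the requested operation."""
    for role in user_roles:
        allowed = ROLE_PERMISSIONS.get(role, {}).get(device_property, Access.NONE)
        if wanted in allowed:
            return True
    return False
```

Because a user holds one or more roles, access is the union of what the roles allow; a remote user's roles simply omit WRITE on devices that must not be set from afar.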

Screen Snapshot Service (SSS)
Provides snapshots of a graphical console to remote users:
- read-only images, using only web-based technologies (Java, JSP, Tomcat)
- no possibility of control; no client software installation
- simple, single-server installation per facility; no extra firewall rules beyond web server access.
Service:
- receives snapshots from monitored applications
- caches them for a short period of time
- serves them to requesting authorized users.
Monitored applications: a Java Web Start application continually captures the desktop and sends it to the service. Used by CMS for the tracker integration test.
Data flow: monitored application(s) send snapshots to the snapshot service; web browser(s) send requests to it.
More information at:
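The cache-and-serve behavior described above can be sketched as follows. The real SSS is a Java/JSP/Tomcat service; this is an illustrative Python sketch of the caching idea only, and the class name, TTL value, and console identifiers are assumptions:

```python
import time

SNAPSHOT_TTL_SECONDS = 30  # assumption: the "short period of time" from the slides

class SnapshotCache:
    """Holds the most recent screen image per monitored console, expiring stale ones."""

    def __init__(self, ttl=SNAPSHOT_TTL_SECONDS, clock=time.monotonic):
        self._ttl = ttl
        self._clock = clock          # injectable for testing
        self._entries = {}           # console id -> (timestamp, image bytes)

    def store(self, console_id, image_bytes):
        """Called when a monitored application uploads a new snapshot."""
        self._entries[console_id] = (self._clock(), image_bytes)

    def fetch(self, console_id):
        """Called on behalf of an authorized browser request; strictly read-only."""
        entry = self._entries.get(console_id)
        if entry is None:
            return None
        timestamp, image_bytes = entry
        if self._clock() - timestamp > self._ttl:
            del self._entries[console_id]  # snapshot too old; drop it
            return None
        return image_bytes
```

Serving only cached, read-only images is what makes the design safe: the browser side never gains a channel back to the monitored console.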

Screen Snapshot Service Example
Snapshots from a CMS global integration run.

Web Based Monitoring (WBM)
- Considerable information is saved into the CMS database by online monitoring processes: trigger rates, cross sections, runs taken, beam conditions, temperatures, voltages, environmental conditions, etc.
- Schemas vary, but most data can be presented as value vs. time.
- Web Based Monitoring reads the data and makes it available in a variety of formats: HTML, XML, text, graphics, ROOT TTree objects.
- Generic database queries plus many custom web pages; value vs. time, and also correlation plots between values.
- A web-based ROOT browser was developed to look at histograms produced by the Data Quality Monitor (DQM) system.
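The "value vs. time" idea can be sketched as follows. This is illustrative only: the table names, column names, and use of SQLite are assumptions for the sketch, not the actual CMS database schema (which is Oracle-based and varies per subsystem):

```python
import sqlite3  # stand-in for the experiment database in this sketch

def value_vs_time(conn, table, value_column):
    """Return (timestamp, value) pairs for one monitored quantity, oldest first."""
    cur = conn.execute(f"SELECT ts, {value_column} FROM {table} ORDER BY ts")
    return cur.fetchall()

def correlate(series_a, series_b):
    """Pair two value-vs-time series on matching timestamps, for correlation plots."""
    b_by_ts = dict(series_b)
    return [(va, b_by_ts[ts]) for ts, va in series_a if ts in b_by_ts]

# Hypothetical usage: two differently-shaped tables both reduce to value vs. time.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trigger_rates (ts INTEGER, rate REAL)")
conn.executemany("INSERT INTO trigger_rates VALUES (?, ?)", [(1, 100.0), (2, 95.0)])
conn.execute("CREATE TABLE temperatures (ts INTEGER, temp REAL)")
conn.executemany("INSERT INTO temperatures VALUES (?, ?)", [(1, 20.5), (3, 21.0)])

rates = value_vs_time(conn, "trigger_rates", "rate")
temps = value_vs_time(conn, "temperatures", "temp")
pairs = correlate(rates, temps)  # points for a rate-vs-temperature plot
```

Once every quantity is reduced to (timestamp, value) pairs, the same generic code can render any of them as HTML tables, plots, or exports, which is the point of the varying-schemas remark above.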

CMS Run Summary Pages

CMS Tracker Detector Status

Future Plans
- Ramp up CMS shifts: detector commissioning, global runs, Computing Software and Analysis Challenge (CSA07).
- LHC hardware commissioning: keep LHC Project Associates engaged after their return; US/LARP deliverable monitoring.
- LAFS: continue applications development.
- LHC beam participation: SPS and other injector beam studies, LARP instrumentation, LHC commissioning and beam studies (especially for luminosity upgrades).

Summary
Remote operations is the next step in enabling collaborators to participate in operations from anywhere in the world. The goals are:
- secure access to data, devices, logbooks, monitoring information, etc.
- safeguards so that actions do not jeopardize or interfere with operations
- collaborative tools for effective remote participation in shift activities.
Fermilab has built the Remote Operations Center, shared by scientists and engineers working on the LHC and CMS. It was a collaborative design, built rapidly. It provides a means to participate remotely in LHC studies, access to monitoring information, a training facility, and support for development of software for the LHC control system. For CMS it provides a location for remote commissioning and operations shifts, and for Tier-1 grid monitoring shifts. It is already a popular stop for visitors and dignitaries.

Acknowledgements
- The Task Force, especially Elvin Harms (the original speaker) and Erik Gottschalk, from whom many slides were taken.
- The design, engineering, and construction team from Fermilab/FESS: Gary Van Zandbergen, Steve Dixon, Merle Olson, Tom Prosapio.
- CERN AB/OP, especially Djanko Manglunki.
- The staff of the sites visited.
- The users of the facility.