Visit to CERN/CMS, Jan 2006
Patricia McBride, Fermilab
Slides taken from presentations by Hans Hoffmann and Werner Jank.

CERN/CMS visit
- Attended meetings as part of CMS/CPT week.
  - Tier0 planning, DQM, CCAR…
- CMS is planning to have a centre of activity on the Meyrin site in addition to the operations centre at Point 5.
  - Space at Point 5 is limited and the control room there is not large enough for all activities.
  - The Meyrin centre is known as CCAR, though this name will change.
- At this moment there is no space at CERN assigned to this centre.
  - There is a proposal to use a space currently occupied by the print shop and the mail room. This space belongs to the IT division but may be made available to CMS; negotiations are underway. However, funds are not yet secure.
- Their space planning combines remote operations and other activities. They would like to work together with us on planning for remote operations.
  - We met with Hans Hoffmann, Werner Jank and Sergio Cittolin to discuss our plans for …
  - They will try to help us with security issues if we need it.
- We visited Point 5, the control room building and the CMS hall.

CMS Meyrin Centre
First years of data with CMS:
- Monitor the operation of the experiment (control is at Cessy)
- Near-line control of data quality, in particular by the sub-detectors
- Control and operate the CMS use of the worldwide computing facility

Functions to be covered:
- Conference room for ~200 persons
- Control and monitoring room for "near-line" activities of CMS
- Workplaces for combined data quality work
- Offices for persons responsible for "near-line" activities
- CERN visitor facilities
- Discussion areas

Building 510 - CCAR
[Floor-plan sketch showing: work area, offices, print shop, visitors in/out (general), monitoring/control room, conference room, meeting room, technical room, coffee/discussion area]

CCAR (possible) layout
[Layout sketch showing: secretariat, office, new print shop, visitors area, control room, conference room, technical room, collaborative work area, meeting room; towards Main Building]

Space breakdown:
- Control room: … m²
- Conference room: … m²
- Meeting/VRVS room 1: 70 m²
- Technical room 1: 80 m²
- Group work room: … m²
- Office space: … m²
- Available cooling power: 110 kW

CCAR basic concept
- Allow the operations team to work effectively together!
- Facilitate communication of all kinds
  - Technical communications: Point 5, LHC control room, Tier-0, CAF, outside facilities (Tier-1's, Tier-2's)
  - People communications:
    - Between "operators": subdetectors, calibration/alignment, express line(s), (prompt) analysis groups
    - Physics community: regular (daily/weekly) updates
- Become the "Centre of Operations"

CCAR meeting rooms
- Conference room (200+ people)
  - Regular discussion meetings, daily to weekly, to update collaborators; discuss status, short-term plans, physics, …
  - 2 big display screens: CMS, LHC status
  - Videoconferencing (VRVS): 2 wall projectors (VRVS, local presentations), excellent audio system (fixed + mobile microphones)
- Meeting room (50 people)
  - Smaller, more dedicated discussions: physics, (sub)detector related
  - Videoconferencing (VRVS): 2 wall projectors (VRVS, local presentations), good audio system (fixed microphones), phone conferencing
- Good network access for both
  - Sufficient wall outlets for fixed DHCP
  - Good WiFi coverage

Conference room - Sergio's view

Monitoring (Remote Control) Room - 1
- Monitoring
  - Initially used for monitoring only; the real control room stays at Point 5!
  - Clone of the Point-5 control room (to some extent)
  - Acquire experience and confidence for control operations
  - Understand sharing of responsibilities for distributed control
- Video link to Point-5
- Detector status
- LHC status
- Physics displays
  - Express line
  - Event displays
  - Histograms, data quality, …
- Webcam displays
- …

Monitoring (Remote Control) Room - 2
(CMS Offline!) Computing operations
- Excellent communications with Point-5, IT and outside experts!
- Needed from the beginning!
- Making full use of IT facilities and services
  - Tier-0, CAF: COOL, 3D, LSF, AFS, Castor2
  - General services: interactive service, build system, ORACLE, monitoring
  - Grid services: RB, BDII, SE, CE, FTS, SRM, GridFTP, MyProxy
- Complementary CMS services (non-IT)
  - DBS, DLS admin and operation; fast file and CPU servers, graphics servers, …
- Tier-1's
  - PhEDEx, Frontier, data management, heartbeat
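The list above is what the monitoring room would have to keep an eye on. As a purely illustrative sketch of the kind of "heartbeat" check an operator console might run against such services, the short Python script below probes a set of endpoints for TCP reachability. The hostnames and ports are hypothetical placeholders, not actual CMS or CERN IT endpoints.

```python
#!/usr/bin/env python3
"""Minimal heartbeat sketch: checks that a set of service endpoints accept
TCP connections. The endpoints below are hypothetical placeholders, not
actual CMS/IT hostnames."""

import socket
import datetime

# Hypothetical endpoints standing in for the services named on the slide.
SERVICES = {
    "Castor2 (storage)":    ("castor.example.cern.ch",  5015),
    "FTS (file transfer)":  ("fts.example.cern.ch",     8443),
    "SRM endpoint":         ("srm.example.cern.ch",     8443),
    "MyProxy":              ("myproxy.example.cern.ch", 7512),
    "PhEDEx heartbeat":     ("phedex.example.cern.ch",  80),
}

def check(host, port, timeout=5.0):
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    stamp = datetime.datetime.now().isoformat(timespec="seconds")
    print(f"Heartbeat run at {stamp}")
    for name, (host, port) in SERVICES.items():
        status = "OK  " if check(host, port) else "DOWN"
        print(f"  [{status}] {name:22s} {host}:{port}")
```

In practice such checks would feed the wall displays mentioned on the next slides rather than a terminal printout; the sketch only shows the basic availability test.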

Monitoring room - Sergio's view

Monitoring (Remote Control) Room - 3
Infrastructure and people:
- 2 big display screens for the video link (video over IP): CMS, LHC status
- 4 wall projectors: monitoring of detector, IT services, data quality, physics, …; also for VRVS when needed
- 4-6 operator desks: 3 graphics desktops, local VRVS, printer, phone, 3 fixed network sockets for DHCP
- Good WiFi coverage
- 2-4 webcams

Work Areas - 1
- For small groups
  - 3-5 people each
  - Allow experts to work together in close proximity
- Subdetector oriented
  - 10 subdetectors
  - Detailed (sub)detector monitoring: calibration, alignment; problems, performance, dead channels, …
- First-pass data processing
  - 5 physics analysis groups: hot trigger channels; high-priority, short-latency data processing and analysis
- Flexible setup
  - Easy reconfiguration to adapt to needs
  - Standard configuration

(Typical) work area
[Diagram labels: screens, printer, phone, status displays, physicist keyboards]

Work areas - Sergio's view

Networking
- Excellent network connectivity
  - Very reliable and fast: 1 GbE to the desktops
  - 2 x 10 GbE to B513 (Tier-0, CAF) and Point-5
  - Trusted network for the control room (CMS domain)
  - Secure interconnectivity between the GPN, LCG and CMS networks; access to lxplus, Tier-0, CAF and the Point-5 public network
  - Wireless: coverage and speed
[Diagram: network links between the GPN, CCAR, CMS Point-5 and LCG domains]
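To make the "reliable and fast" requirement concrete, here is a minimal latency-probe sketch for links of the kind named above: it times a single TCP handshake to each host. Only lxplus.cern.ch is a real public login service; the other hostnames are hypothetical placeholders chosen for illustration.

```python
#!/usr/bin/env python3
"""Rough sketch of a connectivity/latency probe for links like those on the
slide. Host list is illustrative: lxplus.cern.ch is a real public login
service, the others are hypothetical placeholders."""

import socket
import time

# (label, host, port) - ports chosen only so there is something to connect to.
LINKS = [
    ("lxplus (GPN)",        "lxplus.cern.ch",         22),
    ("Tier-0 / CAF (B513)", "tier0.example.cern.ch",  22),
    ("Point-5 gateway",     "p5gw.example.cern.ch",   22),
]

def tcp_latency(host, port, timeout=5.0):
    """Time one TCP handshake; return seconds, or None if it fails."""
    start = time.perf_counter()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.perf_counter() - start
    except OSError:
        return None

if __name__ == "__main__":
    for label, host, port in LINKS:
        rtt = tcp_latency(host, port)
        if rtt is None:
            print(f"{label:22s} {host}: unreachable")
        else:
            print(f"{label:22s} {host}: TCP connect in {rtt * 1000:.1f} ms")
```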

Summary
CERN CMS Operations Centre?
- Remote control room
- Tier-0 operation
- CAF (calibration, alignment, …)
- DQM
- Physics analysis
- Tier-1 data distribution, communication
- More (validation of concept, outreach, …)

Goals very similar to … and the LPC at Fermilab.
- We can/should work together to make these efforts a success.