Online System Status LHCb Week Beat Jost / Cern 9 June 2015.


Hardware and infrastructure work
❏ LS1 is over
❏ A lot of work has been performed during the last ~30 months
➢ All control PCs have been replaced/virtualized
➢ All PCs upgraded to a new OS (W2008S/W7/SLC6)
➢ Configuration management changed from Quattor to Puppet
➢ Controls infrastructure upgraded to a new version of PVSS
❏ Much hardware was replaced
➢ ~600 farm nodes extracted
➢ ~800 new farm nodes installed (90% cabled and in production)
➢ ~25 SPECS masters replaced by new hardware
➢ ~30 CAN control PCs replaced by credit-card PCs

Software
❏ The main challenge was to support the split HLT
➢ The whole control of the farm had to be adapted/redone
➥ Concurrent running of HLT1 and HLT2
➥ New task architecture in the farm nodes
➥ New management of conditions
❏ Completely new is the online alignment and calibration, which provides the HLT with the best alignment and calibration constants
➢ HLT1-selected/enriched data for the alignment tasks are kept on local disks
➢ Alignment runs on all farm nodes (~10 minutes)
➢ Calibration for the RICH and OT is done online, using either the output of the online reconstruction or monitoring data coming from HLT1
➢ Calibration constants are typically available within seconds after a change of run
❏ To synchronize and sequence all these activities automatically, a new Scheduler has been developed
➢ For the moment, though, the activities are performed manually
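The run-change sequencing described above can be pictured as a fixed pipeline: alignment first, then the detector calibrations, then publication of the new constants to the HLT. The sketch below is purely illustrative; the class, method, and activity names are assumptions, not the actual LHCb Online Scheduler:

```python
# Illustrative sketch of a run-change scheduler (hypothetical names,
# not the real LHCb Online Scheduler implementation).

class Scheduler:
    def __init__(self):
        self.log = []  # record of executed activities, in order

    def run_activity(self, name, action):
        self.log.append(name)
        action()

    def on_run_change(self):
        # Fixed sequence: alignment, then calibrations, then
        # publication of the new constants to HLT1/HLT2.
        self.run_activity("velo_tracker_alignment", self.align)
        self.run_activity("rich_calibration", self.calibrate_rich)
        self.run_activity("ot_calibration", self.calibrate_ot)
        self.run_activity("publish_constants", self.publish)

    def align(self): pass           # ~10-minute job on all farm nodes
    def calibrate_rich(self): pass  # uses online-reconstruction output
    def calibrate_ot(self): pass    # uses monitoring data from HLT1
    def publish(self): pass         # make constants available to the HLT

s = Scheduler()
s.on_run_change()
print(s.log)
```

In the real system each step would of course be a distributed job rather than a method call, but the sequencing and bookkeeping role of the Scheduler is the same.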

Software II
❏ Completely new system for HLT2 and data-quality monitoring
➢ As HLT2 runs asynchronously, and on different runs in different nodes, the normal data monitoring cannot be used
➥ A new system, based on the HLT2 output files, has been devised and implemented
➥ The same holds for the final data-quality monitoring, running the offline Brunel reconstruction
➢ Dry-tested, but not yet in production
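A file-based monitoring scheme of this kind boils down to a loop that watches the HLT2 output area and feeds each newly appeared file to the monitoring tasks. The sketch below only illustrates that idea; the directory layout, file extension, and function names are assumptions, not LHCb code:

```python
# Illustrative sketch of file-based monitoring for asynchronous HLT2
# output (names and layout are assumptions, not actual LHCb software).
import os
import tempfile

def find_new_files(outdir, already_seen):
    """Return output files that appeared since the last scan."""
    current = {f for f in os.listdir(outdir) if f.endswith(".mdf")}
    new = sorted(current - already_seen)
    already_seen.update(new)
    return new

def monitor_pass(outdir, seen, fill_histograms):
    # One pass of the monitoring loop; in production this would run
    # periodically, grouping files by run number before processing.
    for fname in find_new_files(outdir, seen):
        fill_histograms(os.path.join(outdir, fname))

# Example usage with a stand-in for the histogram-filling task:
outdir = tempfile.mkdtemp()
open(os.path.join(outdir, "run1234_0001.mdf"), "w").close()
seen = set()
processed = []
monitor_pass(outdir, seen, processed.append)
print(processed)
```

Decoupling the monitoring from the live data flow in this way is what allows it to cope with HLT2 processing different runs on different nodes at different times.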

Where are we?
❏ All collision data have been taken without major problems
❏ Most of the data are still spinning around on the local disks
❏ Velo and Tracker alignment works
❏ So far, all actions are performed manually
❏ For a few runs, HLT2 has been run and the files transferred offline for test processing
➢ Some minor problems transferring the conditions to the offline database; will be fixed today or tomorrow

Summary
❏ After all the changes and new developments, the system is in very good shape
➢ Thanks to extensive testing along the way, especially on data saved at the end of 2012
❏ Operationally, we still have a lot to learn to run the system in a mostly automatic fashion

Many thanks to all involved