Overview of LHCb Computing Requirements, Organisation, Planning, Resources, Strategy LHCb UK Computing Meeting RAL May 27/28, 1999 John Harvey CERN / EP-ALC.

LHCb UK Computing Meeting, RAL, 27/28 May '99

Slide 2: Outline
- LHCb data handling requirements
- Project organisation: scope of LHCb Computing
- The DAQ and ECS project: goals, projects, milestones
- Software: strategy, projects, milestones
- Communication: meetings, videoconferencing, documentation, web

Slide 3: LHCb Data Acquisition System

Slide 4: LHCb datasets and processing stages
[Diagram: event data sizes per processing stage (200 kB, 150 kB, 100 kB, 70 kB, 10 kB, 0.1 kB) at a 200 Hz rate.]

Slide 5: Data Volumes
Assume a run of 10^7 seconds each year.

Data type                       | Rate    | Volume/year
a) Raw data                     | 20 MB/s | 200 TB
b) Interesting physics data     | 1 MB/s  | 10 TB
c) Simulated data (b x 10)      | 10 MB/s | 100 TB
d) Reconstructed raw            | 14 MB/s | 140 TB
e) Reconstructed sim            | 7 MB/s  | 70 TB
f) Analysis data                | 4 MB/s  | 40 TB
TOTAL                           |         | 550 TB
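The volume column follows from the rate column by simple arithmetic: volume/year = rate x 10^7 s of running. A minimal sketch of that calculation, using the rates from the table (note that the itemised entries actually sum to 560 TB, close to the slide's quoted 550 TB total):

```python
# Annual data volume per dataset: rate (MB/s) x 10^7 s, expressed in TB
# (taking 1 TB = 10^6 MB). Rates are the ones quoted on the slide.
RUN_SECONDS = 1e7  # assumed effective running time per year

rates_mb_per_s = {
    "raw": 20,
    "physics": 1,
    "simulated": 10,
    "reco_raw": 14,
    "reco_sim": 7,
    "analysis": 4,
}

volumes_tb = {k: v * RUN_SECONDS / 1e6 for k, v in rates_mb_per_s.items()}
total_tb = sum(volumes_tb.values())

print(volumes_tb["raw"])  # 200.0 TB of raw data per year
print(total_tb)           # 560.0 TB summed over all datasets
```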

Slide 6: Data Storage Requirements

Slide 7: Processing Power Requirements

Slide 8: CPU Resources
Assumption: CPU power per processor in 2004 will be ~4000 Mips.

CPU resources at the experiment:
- data production and triggering: 1,400 x 1000 Mips (350 nodes)
- reconstruction: 800 x 1000 Mips (200 nodes)
- Total: 550 nodes

Event simulation and reprocessing (4-month duty cycle):
- Monte Carlo production: 1,400 x 1000 Mips (350 nodes)
- reprocessing, event tag creation: 800 x 1000 Mips (200 nodes)
- Total: 550 nodes

Physics analysis:
- physics production: 10 groups, 10^7 kMs per refinement per week (46 nodes)
- user analysis: kMs/job x 2 per week (91 nodes)
- Total: 137 nodes
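The node counts above follow directly from the stated assumption of ~4000 Mips per 2004-era processor: nodes = required Mips / 4000. A minimal sketch of that conversion:

```python
# Node-count arithmetic behind the CPU resource table: divide the
# required capacity (in Mips) by the assumed per-processor capacity.
import math

MIPS_PER_NODE = 4000  # assumed power of one processor in 2004


def nodes_needed(required_mips: float) -> int:
    """Round up to a whole number of nodes."""
    return math.ceil(required_mips / MIPS_PER_NODE)


print(nodes_needed(1_400_000))  # 350 nodes: data production and triggering
print(nodes_needed(800_000))    # 200 nodes: reconstruction
```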

Slide 9: Data Rates
[Diagram: flows between the data repository (raw, reconstructed, tags and analysis datasets) and the DAQ, reconstruction, reprocessing, simulation, physics-analysis and user-analysis activities; rates shown range from 4 MB/s up to 660 MB/s.]

Slide 10: Process for Organising Computing Activities
- Manage: plan, initiate, track, coordinate; set priorities and schedules; resolve conflicts
- Support: support development processes; manage and maintain components; certify, classify, distribute; document, give feedback
- Assemble (driven by requirements): design the application; find and specialise components; develop missing components; integrate components
- Build (from existing software systems): develop models, evaluate toolkits; architect components and systems; choose an integration standard; engineer reusable components

Slide 11: Project Organisation
[Diagram: organisation chart. A Steering Group, with Technical Review and Architecture Review bodies, oversees: Support Facilities (CPU farms, desktop, storage, network, system management; with vendors, IT-IPT and IT-PDP), Support Software (SDE, process, quality, librarian, training, webmaster; with vendors and IT-ASD), Build (architecture, components, frameworks (GAUDI), glue, libraries: clhep, GUI, data management), and the sub-projects Reconstruction, Simulation, Analysis, Controls, Control Room and DAQ. Legend: C = Coordinator, A = Architect, M = Project Manager, E = Project Engineer.]

Slide 12: Schedule for DAQ and ECS

Slide 13: Goals of R&D Phase
- Devise an architecture for the DAQ system and a specification for all dataflow and control elements
- Acquire knowledge of, and experience with, new technologies
- Assemble a small-scale hardware prototype of the DAQ system ('String Test') running at full speed
- Finally, take an educated decision on the technologies to use for the implementation of the final system

Slide 14: DAQ Activities
- Architecture and protocol design
- Readout Unit implementation study: functionality, interfacing; design and prototype; performance
- Event Building project: devise strategy for L1 and L2/3; study technologies, e.g. Myrinet; simulation models, demonstrators
- Timing and Fast Control: Readout Supervisor
- FEM implementation study
- Event Filter Farm study (LCB project)
- Study capabilities of mass storage (ALICE/IT)

Slide 15: Control/Monitoring Structure
[Diagram: layered control structure, from supervision (SCADA systems on LAN/WAN, with storage, other systems (LHC, safety, ...) and the configuration DB, archives and logfiles), through process management (controllers/PLCs on VME and fieldbus, OPC communication protocols), down to field management (sensors/devices on field buses) and the experimental equipment.]

Glossary:
- SCADA = supervisory control and data acquisition
- OPC = OLE for Process Control
- PLC = programmable logic controller
- Field buses: CAN, ProfiBus, WorldFip, ...

Slide 16: Experiment Control System (ECS)
A joint project to devise a common controls kernel for all LHC experiments, covering all aspects of control. Selected sub-projects:
- Joint URD for ALICE/LHCb: finished
- Hardware interface URD: by end '99
- Architecture design: ongoing
- Technology survey (SCADA): finished
- Fieldbus evaluation: ongoing
- OPC evaluation: ongoing
- ...

Slide 17: Schedule for DAQ/ECS

Slide 18: Where are we gOOing?
- June 1998: SICB, with FORTRAN toolkits, ZEBRA banks, ZEBRA and ASCII files
- June 2000: OO frameworks, OO toolkits, OO event and geometry models, OO database (Objectivity/DB)

Slide 19: Schedule for Computing

Slide 20: Plans for SICB
- SICB will be discarded when new software with the same or superior functionality becomes available (current planning: July 2000)
- Until then, production simulation will continue with SICB:
  - interface to other event generators
  - studies of alternative detector layout options
  - enhanced detector response simulation

Slide 21: Milestone 1: Working Prototype
By mid-2000, produce a new working prototype of LHCb software incorporating:
- a model describing the LHCb detector (structure, geometry, ...)
- a database containing ~1,000,000 simulated events (~100 GB)
- data processing programs based on LHCb's OO software framework:
  - simulation: integrate the GEANT4 simulation toolkit
  - reconstruction: pattern recognition algorithms for tracking, RICH, calorimetry, etc.
  - analysis: toolkit of analysis algorithms
- interactive analysis and event display facilities: evaluate available toolkits (ROOT, WIRED, JAS, OpenScientist)
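As a back-of-envelope check, the quoted database size is consistent with the earlier per-event figures: ~1,000,000 simulated events in ~100 GB implies ~100 kB per event. A minimal sketch of that arithmetic (taking 1 GB = 10^6 kB):

```python
# Prototype event store sizing: average event size implied by the
# milestone figures of ~10^6 simulated events in ~100 GB.
n_events = 1_000_000
total_kb = 100 * 1_000_000  # 100 GB expressed in kB

kb_per_event = total_kb / n_events
print(kb_per_event)  # 100.0 kB per event on average
```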

Slide 22: Strategy for Development of New Software
- We are convinced of the importance of the architecture: an architect (experienced designer) and a design team (domain specialists)
- Identify components, define their interfaces and the relationships among them
- Build a framework from implementations of these components:
  - "a framework is an artefact that guarantees the architecture is respected"
  - to be used in all LHCb event data processing applications, including the high-level trigger, simulation, reconstruction and analysis
  - build high-quality components and maximise reuse
- Incremental approach to development:
  - a new release 3-4 times per year
  - gradually add functionality
  - use what is produced and get rapid feedback
- GAUDI: release 1 in Feb '99, release 2 next week

Slide 23: Strategy for the Next Two Years
- Dilemma: we have to keep the existing software in production, yet update our practices and use new technology for future development
- OO migration:
  - do not isolate people; encourage the same people to do both
  - minimise 'legacy' code: algorithm development is ongoing
  - wrap existing FORTRAN code where appropriate
  - use available libraries (clhep, STL) and toolkits (GEANT4, ODBMS)
  - start component development (event model, geometry, ...)
- Train 'just in time': the LHCb OOAD course was attended by 45 people
- Share ideas, designs, components, etc. with other experiments
- Don't start too many activities at once; keep things under control

Slide 24: LHCb Offline Software Road Map
[Diagram: release number versus time through 2006. Incremental releases lead to the working prototype ('retire' SICB), followed by detailed implementation, integration and commissioning, and exploitation.]

Slide 25: Software Development Environment
- Documentation templates for user requirements, project plans, costings, ...
- Design method (UML) and tool (Rose, ...)
- Training: OO A&D course, 5 days, John Deacon
- Language: C++ (but with Java in mind)
- NT: Visual Developer Studio used by the GAUDI team
- Linux: preferred choice of physicists
- Code management (cvs) and software release scheme (CMT)
- Work with the IT/IPT group on:
  - project management: a process to manage software projects
  - configuration management
  - documentation: handbooks on how to use, manage and engineer the software

Slide 26: Computing Model
- Compute facilities: PC farms running NT or Linux (Marseilles, Liverpool, Rio; with other LHC experiments and the IT/PDP group)
- "Data Management and Computing Using Distributed Architectures":
  - with other LHC experiments in a proposed LCB project
  - outside institutes + CERN/IT (LHCb/Oxford, ...)
  - determine which classes of models for distributed data analysis are feasible, taking into account the network capabilities and data handling resources likely to become available at the collaboration sites
  - identify and specify the main parameters, and build tools for simulations comparing alternative strategies
  - make test implementations of elements of the computing model

Slide 27: Communication
- Meetings:
  - weekly computing meeting at CERN (Wednesday morning)
  - irregular DAQ/ECS meeting at CERN (Wednesday afternoon)
  - LHCb software weeks, started in 1999: Feb 8-12, June 2-4, Nov
  - data handling meeting during LHCb weeks
- Videoconferencing: has been tried but not institutionalised; a dedicated facility for LHCb at CERN is being discussed
- LHCb web: technical notes, news of meetings, talks, the slide gallery and the collaboration database are all maintained on the LHCb web; we aim to improve it, and guidelines on producing web pages and the design of the public web are being discussed