Overview of ATLAS Computing
Jürgen Knobloch, 9 November 1998

Outline
 Software
  – Quality
  – Strategy
  – Human resources
  – Current activities
 Hardware
  – Requirements
  – Regional centers
 Organizational aspects
 Conclusions

We are at a turning point...
 Physics “TDR” - spring 99
  – based principally on FORTRAN software
  – benchmark for detector and software performance
 Internal Review of Computing
  – Committee created by ATLAS management
  – Led by Homer Neal
  – Report by February 99
 Preparatory work for OO software is done
  – GEANT-4 production release - December 98
  – LHC++ - “CERNLIB” for OO developments
  – ARVE - ATLAS framework available
  – Software process support - Rules, SRT, ASP

Offline Software
[Dataflow diagram] Raw data from the ATLAS detector. Generate Events → Simulate Events (using the simulation geometry) → Reconstruct Events (using the reconstruction geometry, detector description, detector alignment, detector calibration and reconstruction parameters) → ESD/AOD → Analyze Events → Physics.
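Read as code, the dataflow on this slide is a linear pipeline. A minimal C++ sketch, purely illustrative (none of these names are real ATLAS interfaces):

```cpp
#include <string>

// Hypothetical sketch of the offline dataflow chain; all names are
// stand-ins, not actual ATLAS classes.
struct Event { std::string stage; };

Event generate()           { return {"generated"}; }
Event simulate(Event e)    { e.stage = "simulated";     return e; }  // needs simulation geometry
Event reconstruct(Event e) { e.stage = "reconstructed"; return e; }  // needs reconstruction geometry,
                                                                     // alignment, calibration
// Analysis consumes the ESD/AOD produced by reconstruction.
std::string analyze(const Event& e) {
    return e.stage == "reconstructed" ? "physics" : "error";
}
```

The point of the diagram is the strict ordering: analysis only makes sense on reconstructed events, and reconstruction and simulation each depend on their own geometry description.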

Software Quality
 Functionality, correctness, robustness
  – Experience, requirements, software process
 Ease of entry for physicists
  – Documentation, clear architecture, training, mentors
 Maintainability
  – Documentation, software process
 Flexibility
  – changing environment, new requirements
 Performance
  – few re-processings, CPU, memory, I/O

Software Strategy
 Defined process - ATLAS Software Process (ASP)
  – Reviews of deliverables
  – Tools (SRT, code checking, LIGHT, GNATS, …)
 Object-oriented design and implementation - C++
 Standards and commercial components wherever possible and reasonable
 Common solutions with other experiments wherever possible
  – LCB projects: GEANT-4, LHC++, RD45, Spider, Monarc, …
 Learn from other experiments’ experience
  – BaBar, Fermilab, STAR, ...

Domain Decomposition
[Diagram] Domains: Trigger, GUI, Control, Event Display, Muon System, Magnetic Field, Calorimetry, Detector Description, Inner Detector, Data Base, Simulation, Event, Tools, SDE, Documentation, Muon Reconstruction, e/γ Reconstruction.
Organisation: domain architects, domain coordinators, domain interface group, support team.
Create “centres of gravity” for domains.

Domains and People (FTE now → FTE required)
 Management, general support: 6 → 16
  – Coordinators, Librarian, Software process support
 Reconstruction: 2 → 5
 Simulation: 2 → 5
 Event domain: 3 → 5
 Graphics: 4 → 6
 Analysis tools: 1 → 3
 Field, alignment, calibration: 1 → 4
 Detector description: 1 → 3
 System Domains: 36 → 44
  – Trigger Simulation, Inner Detector, Muon Spectrometer, Calorimeters
 SUM: 56 → 91

Manpower evolution
Re-direct people from physics studies in FORTRAN to developments in OO technology. In addition we need software professionals for key tasks (software process support, framework, data base): ~25, plus support of regional centers, plus R&D.
[Chart: ATLAS Computing Workforce over time, FORTRAN vs OO; integral: 900 person-years]

Timeline - past (1992–1998: Preparatory Phase)
[Timeline] LOI, TP, GEANT4, CTP, RD45 1-Tbyte event DB, ARVE, MOOSE

Timeline - future (1999–2005: Implementation and Commissioning Phase)
[Timeline] Physics “TDR”, Computing Technical Proposal - Rel 2, first OO simulation, first OO reconstruction, complete OO simulation, complete OO reconstruction, decide on database, full test, OpSys, operation

Status of ATLAS OO Environment
 ATLAS Software Process (ASP)
  – Reviews - requirements, design, code (starting)
  – C++ coding rules - automatically checked
 Objectivity database in use
  – GEANT 3 data, test beam data, n-tuples
  – Detector description database
 ATLAS repository in CVS/SRT
  – … lines of ATLAS FORTRAN
  – 74956 lines of AGE
  – … lines of C++
 Documentation on the web - LIGHT
[Chart: code volume; + external code: FORTRAN, C++, GEANT-4]

ATLAS OO projects - Reconstruction
 ARVE
  – ATLAS Reconstruction and Visualisation Environment
  – Developed by Toby Burnett (U. Washington)
  – Inner Detector pattern recognition - several algorithms in C++
  – Muon reconstruction
  – GUI, graphics
  – Generation of single particles --> GEANT4

ATLAS OO projects - Analysis tools
 LHC++: commercial tools - evaluating
  – Thin HEP-specific layer

ATLAS OO Projects - Simulation
 Simulation: GEANT4 - ATLAS participation and implementation
  – Test beam simulation under development
  – Full detector prototype
 Most US work is currently done within BaBar

ATLAS OO projects - Data Bases
 ODBMS (RD45) ATLAS participation
  – Event storage: GEANT3, GEANT4, test-beam
  – Detector description database
Basic design of raw data structure:
[Class diagram] DetectorElement (Identifier identify(), iterator digits_begin(), iterator digits_end()), associated with a DetectorDescriptor; Digit (Identifier identify(), Point3D position(), float response()); DetectorPosition (Point3D center(), Transform3D transform()).
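The interfaces in the diagram can be transcribed into a C++ sketch. Identifier, Point3D and Transform3D are simplified stand-ins here; the real ATLAS types and ownership model may differ:

```cpp
#include <vector>

// Stand-in types for the slide's diagram (not the real ATLAS classes).
using Identifier = unsigned long;
struct Point3D     { double x, y, z; };
struct Transform3D { /* rotation + translation, omitted in this sketch */ };

// A single detector measurement.
class Digit {
public:
    Digit(Identifier id, Point3D pos, float resp)
        : id_(id), pos_(pos), resp_(resp) {}
    Identifier identify() const { return id_; }    // channel identifier
    Point3D    position() const { return pos_; }   // measurement position
    float      response() const { return resp_; }  // signal amplitude
private:
    Identifier id_;
    Point3D    pos_;
    float      resp_;
};

// Geometric placement of a detector element.
class DetectorPosition {
public:
    Point3D     center()    const { return {0.0, 0.0, 0.0}; }
    Transform3D transform() const { return {}; }
};

// A detector element owns its digits and exposes iteration over them.
class DetectorElement {
public:
    using iterator = std::vector<Digit>::const_iterator;
    explicit DetectorElement(Identifier id) : id_(id) {}
    Identifier identify()     const { return id_; }
    iterator   digits_begin() const { return digits_.begin(); }
    iterator   digits_end()   const { return digits_.end(); }
    void       add(const Digit& d)  { digits_.push_back(d); }
private:
    Identifier id_;
    std::vector<Digit> digits_;
};
```

The design keeps navigation simple for clients: reconstruction code walks a DetectorElement's digit range without knowing how the digits are stored in the object database.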

Monte Carlo Productions
 Physics performance work - productions in full swing
 Many sites moving to Intel platforms - e.g. at CERN & LBL
 Bottleneck: people running productions - not infrastructure
 Next: run reconstruction and produce n-tuples

Computing model architecture
 Centralized model
  – all of the event data are stored centrally and coherently managed at CERN
  – good time response needed
 Partially decentralized model
  – replicate the event data at about five regional centers
  – data transfer via network or movable media
[Diagram: CERN linked to regional centers RC1, RC2, …]

CPU requirements (1 SPECint95 = 40 SPECint92 = 10 CERN units = 40 MIPS)
 Event reconstruction: 7×10^4 SPECint95
  – 250 SPECint95·sec/event = 10× Fermilab Run 1
 Monte Carlo production: 5×10^4 SPECint95
  – 10% of number of real events
 Physics analysis: 15×10^4 SPECint95
  – 20 analysis groups, 25 people in each group => 500 people analyzing
  – 150 simultaneous users
  – each group loops over 10^9 events/month, selecting 1-10%
  – each physicist loops over selected events once per month
  – each physicist loops over AOD/Ntuple once per day
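A back-of-envelope tally of the three items above (the constants mirror the slide's figures; the total and the unit conversions are derived here, not quoted from the slide):

```cpp
// Tally of the slide's CPU estimates, in SPECint95 units.
constexpr long kReconstruction = 70000;   // 7x10^4: event reconstruction
constexpr long kMonteCarlo     = 50000;   // 5x10^4: Monte Carlo production
constexpr long kAnalysis       = 150000;  // 15x10^4: physics analysis
constexpr long kTotal = kReconstruction + kMonteCarlo + kAnalysis;

// Unit conversions quoted on the slide:
// 1 SPECint95 = 40 SPECint92 = 10 CERN units = 40 MIPS.
constexpr long kTotalCernUnits = kTotal * 10;
constexpr long kTotalMips      = kTotal * 40;
```

So the three items sum to 27×10^4 SPECint95, with analysis dominating the estimate.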

Cost
Note: the unit cost extrapolation and the requirements both have large errors! The final numbers should therefore be used with great care!

Definition of milestones
Ingredients:
 Milestones given in CTP
 OO reconstruction and simulation - first version available end 99
 Full chain operational on subset of final infrastructure - 1 year before LHC start-up
 Coverage of interim needs
  – Physics studies
  – Test beam support
  – Radiation level studies

Milestones
 48 milestones defined, 23 before July 99
 Define “near” milestones on a yearly basis
 Covering tasks in DFDs
 For tasks due end 99: read “prototype” or “first version”

Milestones - 1
General
 Provide outline of end-99 CTP: 2/99
 Report from review of computing: 2/99
 Install test and verification procedure: 4/99
 Assess development resources: 4/99
 Revisit development environment: 7/00
 Decide on first production OS: 7/03
 Data challenge: 1/04
 Test full chain in real environment: 7/04
Analysis
 Analysis requirements collected: 12/98
 Evaluate analysis environment for interim period: 7/99
 Evaluate analysis environment for production period: 7/02
 Implement analysis environment for production period: 7/03

Milestones - 2
Event storage
 Define raw data structure: 12/98
 GEANT3 events in ObjDB: 8/98
 Define initial version of ESD: 7/99
 Define initial version of event tag: 7/99
 Define initial version of AOD: 7/99
 GEANT4 events in ObjDB: 7/99
 Provide DB performance report: 7/00
 Decide on database vendor: 6/01

Milestones - 3
Simulation
 ATLAS simulation domains defined: 10/98
 Define event generator interface: 1/99
 Simulation framework available: 1/99
 Single particles from G4 in ARVE: 1/99
 GEANT4 test beam simulations: 3/99
 Simulation geometry available: 7/99
 First general OO simulation available: 12/99
 Minimum bias overlay: 5/00
 Terminate maintenance of DICE: 3/01

Milestones - 4
Reconstruction version 1
 Reconstruction geometry available: 3/99
 Prepare data: 12/98
 Find tracks: 2/99
 Find ID tracks: 2/99
 Find muon tracks: 4/99
 Find global tracks: 7/99
 Find cal clusters: 5/99
 Calorimeter calibration: 4/99
 Muon Spectrometer alignment: 7/99
 Inner Detector alignment: 7/99
 Event display prototype: 7/99
 First complete OO reconstruction: 12/99

Milestones - 5
Infrastructure
 1 Tbyte database prototype: 12/98
 Conclude study on regional centres: 11/99
 Define financing of computing at CERN: 10/99
 Define scope of regional centres: 11/…
 … Tbyte database prototype: 12/03
 Full database available: 12/04
 1% processing farm prototype: 12/02
 5% processing farm: 12/03
 40% processing farm: 12/04
 100% processing farm: 6/05

Organisational items
 Software workshops
  – 1 week, 4 times / year, 1 outside CERN
 ACOS (Computing steering group)
  – 6 meetings / year (4 during software workshops)
 DIG (Domain interface group and support team)
  – 8 meetings / year (4 during software workshops)
 Weekly software meetings
  – Videoconferenced
  – Tutorial session about OO issues
 Working groups: database, graphics, WWCG, …
 CERN ATLAS Computing Group (ATC)

Conclusions
 Groundwork has been done: a software process is in place, and tools such as GEANT4 have been developed.
 We are at a turning point from preparatory work to OO software implementation.
 Physics studies done in FORTRAN have demonstrated the performance of the ATLAS detector. The results will serve as benchmarks for OO software.
 After the physics “TDR”:
  – move physics studies to OO, get more people on board
 The distributed (petabyte) data mining using regional computing centers needs to be established.

US-ATLAS contributions (1)
 US-ATLAS members have played key roles, e.g.:
  – K. Sliwa
   > chairing the “Computing Model Group” in preparation of the Computing Technical Proposal; now the “World-Wide Computing Group”
   > member of ACOS
   > was the initiator of the MONARC project
  – D. Malon
   > participation in the MOOSE project
   > groundwork for RD45 with the “Grand Challenge”
  – T. Burnett
   > major contributor to MOOSE -> developing ARVE

US-ATLAS contributions (2)
 US physicists have been involved in the development of the GEANT-4 simulation.
 US laboratories have made major contributions to physics studies and Monte Carlo productions:
  – Computing facilities in several labs
  – Porting of ATLAS software to specific operating systems
 I hope that US ATLAS institutes will continue to increase their involvement in ATLAS computing, bringing in their valuable expertise from past and current projects.
 Early involvement is the key to later success!