ATLAS eScience Programme
ATLAS UK Oversight Committee, 21 May 2004
Roger Jones
12 UK institutions

2 RWL Jones, Lancaster University
ATLAS Computing Timeline
• POOL/SEAL release (done)
• ATLAS release 7 (with POOL persistency) (done)
• LCG-1 deployment (in progress...)
• ATLAS complete Geant4 validation (done)
• ATLAS release 8 (done)
• DC2 Phase 1: simulation production  ← NOW
• DC2 Phase 2: intensive reconstruction (the real challenge!)
• Combined test beams (barrel wedge)
• Computing Model paper
• Computing Memorandum of Understanding
• ATLAS Computing TDR and LCG TDR
• DC3: produce data for PRR and test LCG-n
• Physics Readiness Report
• Start commissioning run
• GO!
Themes along the timeline: LCG and GEANT4 integration; testing the Computing Model; testing the physics readiness; data-ready versions; confront with data. Packages shake down in DC3 (or earlier), ready for physics in 2007.

3 RWL Jones, Lancaster University
ATLAS eScience Programme
• The original bid was presented to the PPRP on 14th July 2003
  – A reduced programme was indicated
  – An allocation and full management plan had not been requested at that time
• A presentation to the PPRP of a prioritised programme, with a fuller description of the management plan and process, was given on 4th February 2004 and approved at £2.45M
• A 10% working margin was 'required'
  – No similar programme has that level of working margin
  – LHCb (half the size): 3.5%
  – GridPP (five times the size): just under 5% (although it can vary the hardware spend)
  – 10% would do serious damage to the number of hires, in a way that would not be recoverable when the residual margin is released
• We ask the Oversight Committee to endorse a 7.6% working margin (on April 2004 numbers), as detailed in the paperwork

4 RWL Jones, Lancaster University
New ATLAS eScience Programme: Summary of Deliverables
This project will produce:
• Major components of the track and vertex reconstruction and fitting packages for the ATLAS SCT (a major UK participation in construction), validated against test beam and simulation data
• The calibration and monitoring software required to ensure reliable and optimal operation of the detector systems for which the UK is responsible
• The validated simulation and monitoring tools for UK aspects of the trigger
• The ATLAS fast simulation, validated against full GEANT simulation and interfaced to UK-led physics simulations (HERWIG++)
• The general ATLAS software validation system
• General ATLAS reconstruction and physics tools
• The ATLAS event display/visualisation program
(See Appendix A of the 2nd February 2004 document to the PPRP for detailed deliverables – currently under revision in the light of staggered hiring and delayed approval)

5 RWL Jones, Lancaster University
Project Definition
• Project broken into 4 work packages
• Co-ordinated with the ATLAS GridPP effort, effectively 'WP5', led by Roger Jones
  – 2.5 FTEs working on Grid integration, and a 2 FTE joint project with LHCb constructing a user interface (GANGA)
• Overall project leader: R Jones (project manager in ATLAS terms)
• Work packages, leaders and activities:
  – WP1 Low-level Tracking Software – S Lloyd, QMUL: monitoring, alignment, 'core tracking algorithms' and tracking validation
  – WP2 High-level Tracking & Reconstruction Software – D Tovey, Sheffield: tracking tools, energy flow
  – WP3 Trigger & Simulation Software:
      3.1 Trigger – S George, RHUL: Level-1 monitoring and simulation, trigger validation; HLT re-validation under the new B-physics framework
      3.2 Simulation – (A Doyle, Glasgow): Geant4 integration and framework activity
  – WP4 Support Software, Frameworks & Visualisation:
      4.1 Support & Frameworks – P Sherwood, UCL: validation framework, tier support, ARTEMIS
      4.2 Visualisation – (N Konstantinidis, UCL): ATLANTIS package

6 RWL Jones, Lancaster University
Management Structure
• ATLAS UK Oversight Committee
• ATLAS-UK Computing Project Management Board (C-PMB): CEB and institute representatives
• ATLAS-UK Computing Executive Board (CEB): project and WP leaders, resource officer, computing co-ordinator
  – WP1 Low-Level Tracking: Steve Lloyd, QMUL
  – WP2 High-Level Tracking: Dan Tovey, Sheffield
  – WP3 Trigger and Simulation: Simon George, RHUL
      3.1 Trigger: Simon George, RHUL
      3.2 Simulation: Tony Doyle, Glasgow
  – WP4 Software Support/Frameworks: Peter Sherwood, UCL
      4.1 Software Support: Peter Sherwood, UCL
      4.2 Visualisation: Nikos Konstantinidis, UCL
• ATLAS UK Collaboration Board: UK group and project leaders

7 RWL Jones, Lancaster University
Spend Profile
• There will be some shift later because of delayed hires
• Contingency generated by shaving months off the end of the posts
• Conservative salary total: £2.43M

8 RWL Jones, Lancaster University Risk register

9 RWL Jones, Lancaster University Work Packages: Recent work and Plans at Start-up

10 RWL Jones, Lancaster University
WP1 Low Level Tracking
• Digits and Geometry
  – QMUL worked on SCT digits (switch to Raw Data Objects)
  – New hits definition
• SCT Monitoring
  – Work is underway at Cambridge and Liverpool
• Core Tracking Algorithms (RAL)
  – Tuning and testing of the iPatRec code
  – Impact of ID layouts
  – Future work shaped by the common track framework and the merging of the two tracking suites
• Alignment
  – Steve Haywood co-convenes the alignment group
  – Much RAL work on interfacing to the GeoModel
  – First tests of the debugged code and algebra have been done
  – Manchester are working on the magnetic alignment code

11 RWL Jones, Lancaster University
WP2 High Level Tracking
• V0s (Lancaster)
  – Inefficiency largely caused by hit requirements
  – Field causes more minor problems
  – New strategies developed, but implementation awaits the new ATLAS track model (currently iPatRec and xKalman differ and should merge)
• Bremsstrahlung recovery (Lancaster)
  – Various strategies tried and compared; small improvements
  – More use of the calorimeter is possible; a Gaussian-filter approach shows promise
  – Again awaits the unified track model
• Vertexing (see the sketch below)
  – First look at implementing the Billoir fitter by Liverpool
  – Sheffield providing a C++ fitter working on CBNT until an Athena solution is available
• Energy Flow
  – Sheffield have had a notable success with EflowRec, implementing the first energy flow ideas
  – Now porting it to the RTF-recommended formats
  – Redesign and extensions are planned
Revised deliverables available and discussed with the overall ATLAS ID software co-ordinator
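To illustrate the kind of standalone fitter described above, here is a toy least-squares vertex estimate in C++: it finds the point minimising the summed squared perpendicular distance to a set of straight-line tracks. This is a sketch only – the production Billoir fit works on full helix parameters with their covariance matrices – and all names and values below are invented for the example.

```cpp
// Toy standalone vertex fit: find the 3D point minimising the summed
// squared perpendicular distance to a set of straight-line tracks.
// Illustrative only -- NOT the Billoir method used in production.
#include <array>
#include <cmath>
#include <iostream>
#include <vector>

struct LineTrack {
    std::array<double, 3> point;      // a point on the track (e.g. at perigee)
    std::array<double, 3> direction;  // unit direction vector
};

// Solve the 3x3 linear system A v = b by Cramer's rule.
std::array<double, 3> solve3x3(const double A[3][3], const std::array<double, 3>& b) {
    auto det3 = [](const double m[3][3]) {
        return m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
             - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
             + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]);
    };
    const double det = det3(A);  // singular if all tracks are parallel
    std::array<double, 3> v{};
    for (int col = 0; col < 3; ++col) {
        double M[3][3];
        for (int i = 0; i < 3; ++i)
            for (int j = 0; j < 3; ++j)
                M[i][j] = (j == col) ? b[i] : A[i][j];
        v[col] = det3(M) / det;
    }
    return v;
}

// Setting the gradient of sum_i (v-p_i)^T (I - d_i d_i^T) (v-p_i) to zero
// gives the normal equations  sum_i (I - d_i d_i^T) v = sum_i (I - d_i d_i^T) p_i.
std::array<double, 3> fitVertex(const std::vector<LineTrack>& tracks) {
    double A[3][3] = {{0}};
    std::array<double, 3> b{0, 0, 0};
    for (const auto& t : tracks) {
        for (int i = 0; i < 3; ++i) {
            for (int j = 0; j < 3; ++j) {
                const double proj = (i == j ? 1.0 : 0.0) - t.direction[i] * t.direction[j];
                A[i][j] += proj;
                b[i] += proj * t.point[j];
            }
        }
    }
    return solve3x3(A, b);
}

int main() {
    // Two tracks crossing at (1, 1, 0).
    std::vector<LineTrack> tracks = {
        {{0, 0, 0}, {1 / std::sqrt(2.0), 1 / std::sqrt(2.0), 0}},
        {{2, 0, 0}, {-1 / std::sqrt(2.0), 1 / std::sqrt(2.0), 0}},
    };
    const auto v = fitVertex(tracks);
    std::cout << "vertex: " << v[0] << " " << v[1] << " " << v[2] << "\n";
}
```

A real fitter would weight each track by its error matrix and iterate; the closed-form linear solve above is the unweighted special case for straight lines.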

12 RWL Jones, Lancaster University
WP3.1 Trigger and Offline
• HLT
  – Simon George co-ordinates the HLT/PESA software and is on the Architecture team and the Software PMB
  – RHUL prototyping offline trigger code validation (with some UCL assistance)
  – Also working on a region selector and on event and conditions data access
• Level 1
  – Birmingham developing the Level-1 calorimeter simulation (started by QMUL); see the sketch below
  – More realistic simulation of the electronics being developed
  – New scheme for tower simulation; preliminary release this month
Deliverables under revision
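As a rough illustration of the trigger-tower idea behind the Level-1 calorimeter simulation: cells are summed into fixed η×φ towers and towers over a programmable ET threshold fire. The sketch below is a minimal stand-in, not the ATLAS Level-1 scheme; the granularity, noise cut and threshold values are assumptions.

```cpp
// Minimal sketch of Level-1 calorimeter trigger-tower formation: cells are
// summed into eta-phi towers and towers over threshold are flagged.
// Granularity, noise cut and threshold are illustrative only, not the real
// ATLAS Level-1 configuration.
#include <cmath>
#include <iostream>
#include <map>
#include <utility>
#include <vector>

struct CaloCell {
    double eta, phi, et;  // transverse energy in GeV
};

using TowerKey = std::pair<int, int>;  // (eta index, phi index)

std::map<TowerKey, double> buildTowers(const std::vector<CaloCell>& cells,
                                       double dEta = 0.1, double dPhi = 0.1) {
    const double cellNoiseCut = 0.2;  // illustrative per-cell noise suppression (GeV)
    std::map<TowerKey, double> towers;
    for (const auto& c : cells) {
        if (c.et < cellNoiseCut) continue;  // drop noise-like cells
        const TowerKey key{static_cast<int>(std::floor(c.eta / dEta)),
                           static_cast<int>(std::floor(c.phi / dPhi))};
        towers[key] += c.et;  // simple analogue sum of cell ET into the tower
    }
    return towers;
}

int main() {
    const std::vector<CaloCell> cells = {
        {0.02, 0.03, 3.0}, {0.04, 0.06, 2.5},  // same tower: 5.5 GeV total
        {0.52, 1.21, 1.0},                     // isolated low-ET cell
    };
    const double threshold = 5.0;  // illustrative tower ET threshold (GeV)
    for (const auto& [key, et] : buildTowers(cells)) {
        if (et > threshold)
            std::cout << "tower (" << key.first << "," << key.second
                      << ") fired with ET = " << et << " GeV\n";
    }
}
```

The "more realistic electronics" mentioned above would replace the simple analogue sum with digitisation, calibration and bunch-crossing identification stages.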

13 RWL Jones, Lancaster University
WP3.2 Simulation
• ATLFast
  – A great UCL-driven success story
  – Wish to transfer the experience into other areas
  – Integration with POOL persistency
  – FASTShower integration – final verification stalled because of non-retention of staff
  – ATLFast must continue to evolve and match the GEANT4 simulation – Glasgow post hired early on local money
  – This needs a comparator, which will be Artemis-based (see later)
  – Adaptors for reconstruction
• Artemis
  – Parser for DC2 Geant4 data
• EvtGen
  – Framework (Lancaster) will allow inter-experiment sharing of decay data and models
  – First crude integration done
  – More work needed on the framework itself and on the integration
  – Will be used to validate the HLT B-physics triggers (sensitive to the angular structure)
Deliverables under revision

14 RWL Jones, Lancaster University
WP4.1 Support and Frameworks
• Artemis
  – The UK (mainly UCL, but also Glasgow and Lancaster) has developed a prototype lightweight physics analysis framework
  – Full functionality
  – OO/C++
  – Easy to develop (students can extend it)
  – Allows painless evolution to track the reconstruction framework's attempts to provide the same
  – ATLAS workshop at UCL prompted by Artemis
• Contributions to 'Core & Infrastructure': ~9 staff-years
• The Computing MoU is being constructed by a Task Force; the UK is represented by Neil Geddes
• Roger Jones actively reviews as ICB Chair
Revised deliverables available and being added to the WBS

15 RWL Jones, Lancaster University
[Diagram: Artemis track interface and data flow]
• ATHENA reconstruction (xKalman, iPatRec) writes DataVector<Trk::Track …> to StoreGate
• The ARTEMIS track interface (ITrack) is built from these via Create(), exposing getTrkTrack, getPerigee, getBLayerHits, etc.
• A TrackCollection (DataVector<ITrack …> of shared pointers) is handed to the user's analysis code, the Artemis tools and the vertexing software
• User's analysis code produces histograms/ASCII output
• Calorimetry users already happy
• This will allow B-physics use
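A minimal C++ sketch of the adapter pattern this diagram describes: user code and Artemis tools see only the abstract ITrack interface, while a concrete adapter wraps the reconstruction track held in StoreGate. The method names (getTrkTrack, getPerigee, getBLayerHits) are taken from the slide, but the signatures, the Trk::Track stand-in and everything else are illustrative assumptions, not the real ATLAS/Artemis API.

```cpp
// Sketch of the ITrack adapter: analysis code depends only on the abstract
// interface; a concrete adapter wraps the reconstruction track (a stand-in
// for Trk::Track here). Signatures are assumptions, not the real API.
#include <iostream>
#include <memory>
#include <vector>

namespace Trk { struct Track { double d0, z0, pT; int bLayerHits; }; }  // stand-in

struct Perigee { double d0, z0; };

// Abstract track interface seen by Artemis tools and user analysis code.
class ITrack {
public:
    virtual ~ITrack() = default;
    virtual const Trk::Track* getTrkTrack() const = 0;  // escape hatch to the raw track
    virtual Perigee getPerigee() const = 0;
    virtual int getBLayerHits() const = 0;
};

// Concrete adapter built from a reconstruction track (cf. Create() in the diagram).
class TrkTrackAdapter : public ITrack {
public:
    explicit TrkTrackAdapter(const Trk::Track* t) : m_track(t) {}
    const Trk::Track* getTrkTrack() const override { return m_track; }
    Perigee getPerigee() const override { return {m_track->d0, m_track->z0}; }
    int getBLayerHits() const override { return m_track->bLayerHits; }
private:
    const Trk::Track* m_track;  // not owned; the real object lives in StoreGate
};

// The shared-pointer TrackCollection handed to users.
using TrackCollection = std::vector<std::shared_ptr<ITrack>>;

int main() {
    // In Athena these would be read from StoreGate as DataVector<Trk::Track>.
    static const Trk::Track rec[] = {{0.1, -2.3, 25.0, 1}, {0.0, 1.7, 8.5, 0}};
    TrackCollection tracks;
    for (const auto& t : rec)
        tracks.push_back(std::make_shared<TrkTrackAdapter>(&t));

    // User analysis code: depends only on ITrack, e.g. an ASCII dump.
    for (const auto& trk : tracks)
        std::cout << "d0 = " << trk->getPerigee().d0
                  << ", b-layer hits = " << trk->getBLayerHits() << "\n";
}
```

The point of the indirection is that analyses written against ITrack survive the planned merge of the iPatRec and xKalman track models: only the adapter changes.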

16 RWL Jones, Lancaster University
WP4.1 Support and Frameworks (continued)
• Validation Framework
  – Software hooks as well as a testing suite envisaged
  – The current product, RTT, allows frequent large-sample tests
  – Mainly developed by UCL with Manchester support
  – Parallel testing-framework developments at RHUL on trigger simulation
  – With the convergence of HLT and offline, convergence and co-ordination of these activities is planned

17 RWL Jones, Lancaster University
WP4.2 Visualisation
• UCL has the graphics and visualisation co-ordinator
• Lead responsibility for ATLANTIS
  – See the cover of the current CERN Courier for an example
  – Evolved from the ALEPH display DALI
  – Java based
• Should:
  – Be able to run reconstruction algorithms (C++ – use a Python bus?)
  – Be able to interact with work on the Grid (interaction with Alvin Tan, GridPP funded)
  – Be tailored to ID and trigger needs

18 RWL Jones, Lancaster University ATLANTIS Display makes the cover of the April 2004 CERN Courier

19 RWL Jones, Lancaster University
Summary
• This prioritised programme will ensure ATLAS-UK is ready to exploit LHC data:
  – Builds on the UK's large involvement in the construction phase of the experiment
  – Supports the existing UK leadership
  – Integrates with the GridPP activities
  – Makes a significant contribution to the overall shortfall in ATLAS computing effort, commensurate with our share of the experiment
  – Will position the UK well for physics exploitation
• Management team blends project-management experience with software expertise
• Local expertise in place to support and manage new staff
• Collaboration eager to start posts at the earliest possible date
• We request your support for a 7.6% working margin
  – The largest uncertainties (new salary scales) will be resolved over the next few months; the current PPARC model for them still gives a 5.4% working margin, and would still just cover the full programme

20 RWL Jones, Lancaster University
Priorities
• Programme preserves close links with the large UK involvement in the construction of the detector
• Prioritised programme guided by:
  – ATLAS, GridPP and LCG planning
  – The CERN Computing Review of "core" computing for the LHC experiments – shortfalls of ~35% in all experiments
• Most important to ATLAS:
  – 'Core and Infrastructure' – a high priority, 8 FTEs needed
  – Tracking
  – Reconstruction and analysis tools
• The CERN review did not cover detector-specific software; this programme does
• The prioritised programme makes a reasonable UK contribution towards some of the shortfalls, especially in core and infrastructure

21 RWL Jones, Lancaster University
Post Allocations

Post                                 Host               Start     Initial end  Wished end  Status
SCT Digits/Tier Support              QMUL               1-Oct-04  30-Jun-07    30-Sep-07   W
SCT Monitoring Forward               Liverpool          1-Sep-04  31-Aug-07    –           A
SCT Monitoring Barrel                Cambridge          1-Apr-05  31-Mar-07    –           H
Alignment                            Oxford             1-Oct-04  30-Sep-07    –           W
Alignment                            RAL (half)*        1-Apr-05  30-Jun-07    31-Mar-08   W
Alignment & Track Validation         Manchester         1-Jan-05  30-Sep-07    31-Dec-07   H
Base ID                              RAL*               1-Jun-04  31-May-07    –           A
Brem recovery & V0                   Lancaster          1-Jun-04  31-May-07    –           F
Conversions and vertexing            Sheffield          1-Apr-05  31-Mar-07    30-Sep-07   H
Kink finding & conversions           Cambridge (half)   1-Oct-04  30-Sep-07    30-Sep-07   F
B & trigger vertexing/tier support   Liverpool          1-Apr-05  31-Mar-08    –           A
Eflow                                Sheffield          1-Jul-04  30-Jun-07    –           A
Geant4 validation                    Glasgow            1-Oct-04  30-Jun-07    30-Sep-07   F
ATLFast and Artemis                  UCL                1-Apr-04  31-Mar-07    –           A
EvtGen/HLT Artemis                   Lancaster          1-Apr-05  30-Sep-07    31-Mar-08   H
Level 1                              Birmingham         1-Jun-04  30-Nov-06    –           W
HLT validation & framework           RHUL               1-Jan-05  31-Aug-07    31-Dec-07   H
ATLANTIS & core validation           UCL                1-Apr-04  31-Mar-07    –           A
Trigger visualisation                Birmingham (half)  1-Apr-05  31-Mar-07    –           H
Artemis and ATLANTIS                 UCL                1-Apr-05  31-Mar-08    –           H

22 RWL Jones, Lancaster University
Reporting
• Project Leader reports to the ATLAS Oversight Committee via the ATLAS UK Spokesperson (as with ID, Level-1 and HLT) in the normal cycle (currently biannual)
• Project Leader also reports for the C-PMB to the ATLAS UK CB (again biannual at present)
• C-PMB meets at least twice a year to consider reports from the project leader and the CEB
• CEB reviews quarterly reports on each WP
• CEB meets more frequently (initially every two weeks) to monitor progress and problems, and to address resource issues, the spend profile and new hires
• WPs meet frequently (~weekly) to assess progress and problems
Problem handling
• Problems initially addressed by WP leaders and local responsibles
• Resource issues discussed by the CEB and taken to the C-PMB if required
• Serious issues taken up to the CB