Software Status/Plans
Torre Wenaus, BNL/CERN, US ATLAS Software Manager
US ATLAS PCAP Review, November 14, 2002

Slide 2: U.S. ATLAS Software Project Overview
 Control framework and architecture
   Chief Architect, principal development role, ATLAS LCG applications area liaison
 Databases and data management
   Database Leader, primary ATLAS expertise on ROOT/relational baseline
 Software support for development and analysis
   Software librarian, quality control, software development tools, training…
   Automated build/testing system adopted by Int'l ATLAS
 Subsystem software roles complementing hardware responsibilities
   Muon system software coordinator
 Scope commensurate with the U.S. role in ATLAS: ~20% of overall effort
   Commensurate representation on the steering group
 Strong role and participation in the LCG common effort
(Recent developments were shown in green on the original slide.)

Slide 3: U.S. ATLAS Software Organization

Slide 4: U.S. ATLAS - ATLAS Coordination
US roles in Int'l ATLAS software:
 D. Quarrie (LBNL), Chief Architect
 D. Malon (ANL), Database Coordinator
 P. Nevski (BNL), Geant3 Simulation Coordinator, simulation production lead
 C. Tull (LBNL), EDG WP8 Liaison
 H. Ma (BNL), Raw Data Coordinator
 T. Wenaus (BNL), Planning Officer
See task matrix.

Slide 5: ATLAS Subsystem/Task Matrix

                           Offline Coordinator | Reconstruction | Simulation    | Database
Chair                      N. McCubbin         | D. Rousseau    | A. Dell'Acqua | D. Malon
Inner Detector             D. Barberis         | D. Rousseau    | F. Luehring   | S. Bentvelsen / D. Calvet
Liquid Argon               J. Collot           | S. Rajagopalan | M. Leltchouk  | H. Ma
Tile Calorimeter           A. Solodkov         | F. Merritt     | V. Tsulaya    | T. LeCompte
Muon                       J. Shank            | J.F. Laporte   | A. Rimoldi    | S. Goldfarb
LVL 2 Trigger/Trigger DAQ  S. George           | S. Tapprogge   | M. Wielers    | A. Amorim / F. Touchard
Event Filter               V. Vercesi          | F. Touchard    |               |

Computing Steering Group members/attendees: 4 of 19 from the US (Malon, Quarrie, Shank, Wenaus)
Physics Coordinator: F. Gianotti
Chief Architect: D. Quarrie

Slide 6: Project Planning Status
 U.S./Int'l ATLAS WBS/PBS and schedule fully unified
 US/Int'l software planning covered by the same person (TW)
   Synergies outweigh the added burden of the ATLAS Planning Officer role
   No 'coordination layer' between US and Int'l ATLAS planning
   Possible because of how the ATLAS Planning Officer role is currently scoped
 As pointed out by an informal ATLAS computing review in March, ATLAS would benefit from a full-FTE Planning Officer
   I have a standing offer to the Computing Coordinator: to step aside if/when a capable person with more time is found
   Until then, I scope the job to what I have time for and what is highest priority

Slide 7: ATLAS Computing Planning
 US led a comprehensive review and update of the ATLAS computing schedule in the spring
   Milestone count increased by 50% to 600; many others updated
   Milestones and planning coordinated around the DC schedule
   Reasonably comprehensive and detailed through 2002
   New round underway now to flesh out the 2003 schedule
 US core activity scheduling in reasonable shape
 Long-term milestones reworked to reflect the LHC schedule and the LCG
   Centered around escalating data challenges
 Weak decision making, still a problem, translates into (among other things) weak planning
   Strong recommendation of the March review to fix this
   Should be fixed in the present computing management reorganization

Slide 8: Summary Major Milestones
(Milestone chart. Legend: green = done, gray = original date, blue = current date.)

Slide 9: Major Milestones
(Milestone chart.) One DC per year until startup.

Slide 10: Data Challenge 1
 DC1 phase 1 (simulation production for the HLT TDR) executed successfully
   World-wide operation
 Phase 2, starting in the next weeks, focuses on testing of new software
   Introduction and testing of the new event data model
   Intensive Geant4 usage
   Intensive grid usage
   Data production for physics and computing model studies
   First tests of the computing model
   Analysis using Analysis Object Data (AOD)
 DC1 phase 2 comes too early for production use of the LCG common persistency framework (POOL)
   POOL production releases expected in spring/summer 2003

Slide 11: Software Support, Quality Control
 New releases are available in the US typically ~1-2 days after CERN
   Provided in AFS for use throughout the US
 Librarian receives help requests and queries from ~25 people in the US
 US-developed nightly build facility used throughout ATLAS
   Central tool in the day-to-day work of developers and in the release process
   Recently expanded as a framework for progressively integrating more quality control and testing
   Testing at component, package and application level (see the sketch after this slide)
   Code checking to be integrated
 CERN support functions being transferred to the new ATLAS librarian
 BNL-based nightlies recently resumed
   Much more stable build environment than CERN at the moment
   Use timely, robust nightlies to promote usage of the Tier 1 for development
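The slides reference the nightly build facility without showing it. As a rough illustration only, here is a minimal sketch in Python of what a nightly build-and-test driver of this kind does: check out the head of the repository, build each package, run its tests, and log everything for the next morning. All commands, paths, and package names here are hypothetical; this is not the actual ATLAS nightly system.

```python
#!/usr/bin/env python
"""Minimal sketch of a nightly build-and-test driver (illustrative only)."""
import datetime
import subprocess
import sys

PACKAGES = ["EventData", "Reconstruction", "Analysis"]  # hypothetical packages

def run(cmd, log):
    """Run a shell command, append its output to the log, return success flag."""
    result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    log.write(f"$ {cmd}\n{result.stdout}{result.stderr}\n")
    return result.returncode == 0

def nightly():
    stamp = datetime.date.today().isoformat()
    with open(f"nightly-{stamp}.log", "w") as log:
        # 1. Check out the head of the repository (hypothetical command).
        if not run("cvs checkout -r HEAD offline", log):
            return False
        ok = True
        for pkg in PACKAGES:
            # 2. Build each package; keep going past failures so one broken
            #    package doesn't hide problems elsewhere.
            built = run(f"make -C offline/{pkg}", log)
            # 3. Run the package's tests only if it built.
            tested = built and run(f"make -C offline/{pkg} test", log)
            ok = ok and tested
        return ok

if __name__ == "__main__":
    sys.exit(0 if nightly() else 1)
```

The value of such a driver is less in the script itself than in running it on a fixed schedule against a stable build environment, which is exactly the argument the slide makes for the BNL-based nightlies.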

Slide 12: Software Support, Quality Control (2)
 Testing integrated into automated builds
   Unit tests, package tests, integration/system tests (a minimal unit-test example follows this slide)
 ATLAS has (finally!) established a dedicated support team (SIT) for software infrastructure, testing, release management, etc.
   U.S. represented by the U.S. ATLAS librarian, an active team member
   SIT needs a dedicated leader (currently the rotating, and overloaded, release manager heads SIT)
   Provides a much-needed context for U.S. support and QA efforts
 pacman (Boston U) for remote software installation
   Adopted by grid projects for VDT, and a central tool in US grid testbed work
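To make the lowest of the three test tiers concrete, here is a minimal unit-test sketch using Python's standard unittest module. The function under test is invented for illustration; package tests would exercise a whole package's public interface, and integration/system tests would run complete applications inside the nightly builds.

```python
import unittest

def invariant_mass_squared(e, px, py, pz):
    """Hypothetical function under test: m^2 = E^2 - |p|^2 (natural units)."""
    return e * e - (px * px + py * py + pz * pz)

class InvariantMassTest(unittest.TestCase):
    """Unit test: checks one function in isolation, automatically, on every build."""

    def test_massless_particle(self):
        # A photon-like four-vector should have zero invariant mass.
        self.assertAlmostEqual(invariant_mass_squared(5.0, 0.0, 0.0, 5.0), 0.0)

    def test_particle_at_rest(self):
        # For a particle at rest, m^2 = E^2.
        self.assertAlmostEqual(invariant_mass_squared(3.0, 0.0, 0.0, 0.0), 9.0)

if __name__ == "__main__":
    unittest.main()
```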

Slide 13: Grid Software
 Software development within the ATLAS complements of the grid projects is managed as an integral part of the software effort
   Grid software activities are tightly integrated into the ongoing core software program, for maximal relevance and return
   Grid project programs consistent with this have been developed
 This approach has been successful
   e.g. the distributed data manager tool (Magda) we developed was adopted ATLAS-wide for data management in the DCs
   Grid goals and schedules are integrated with the ATLAS (particularly DC) program
 However, we do suffer some program distortion
   e.g. we have to limit effort on providing ATLAS with event storage capability in order to work on longer-range, higher-level distributed data management services
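Magda's internals are not described in the slides. As an illustration of the central idea behind such a distributed data manager (a catalog mapping each logical file name to its physical replicas at different sites), here is a minimal Python sketch. The class, methods, and file names are invented for illustration and are not Magda's actual interface.

```python
class ReplicaCatalog:
    """Toy logical-to-physical file catalog, illustrating the kind of
    bookkeeping a distributed data manager such as Magda provides.
    Not Magda's real schema or API."""

    def __init__(self):
        # logical file name -> list of (site, physical path) replicas
        self._replicas = {}

    def register(self, lfn, site, pfn):
        """Record that a copy of logical file `lfn` exists at `site`."""
        self._replicas.setdefault(lfn, []).append((site, pfn))

    def locate(self, lfn, preferred_site=None):
        """Return a physical replica, preferring the local site if present."""
        replicas = self._replicas.get(lfn, [])
        for site, pfn in replicas:
            if site == preferred_site:
                return pfn
        return replicas[0][1] if replicas else None

# Hypothetical example: a DC output file registered at two sites.
catalog = ReplicaCatalog()
catalog.register("dc1.simul.0001", "CERN", "/castor/cern.ch/atlas/dc1/0001")
catalog.register("dc1.simul.0001", "BNL", "/usatlas/data/dc1/0001")
print(catalog.locate("dc1.simul.0001", preferred_site="BNL"))
```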

Slide 14: Effort Level Changes
 ANL/Chicago: loss of 0.5 FTE in DB
   Ed Frank departure; no resources to replace
   Another 0.5 FTE recently lost; will be replaced
 BNL: cancelled 1 FTE new hire in data management
   Insufficient funding in the project and the base program to sustain the bare-bones plan
   Results in transfer of DB effort to grid (PPDG) effort, because the latter pays the bills, even if it distorts our program towards lesser priorities
 LBNL: stable project-supported FTE count in architecture/framework
   But loss of base support is threatening effort level and deliverables
   Grid funding being sought to ameliorate this
 DB effort hard-hit, but ameliorated by the common project
   Because the work is now in the context of a broad common project, the US can still sustain our major role in ATLAS DB
   A material example of common effort translating into savings (even if we wouldn't have chosen to structure the savings this way!)

Slide 15: Personnel Priorities for FY02, FY03
This is how we are doing relative to goals:
 Sustain LBNL (4.5 FTE) and ANL (3 FTE) support
   This we are doing so far
 Add FY02 and FY03 1 FTE increments at BNL to reach 3 FTEs
   Failed in FY02; the BNL hire was cancelled. Should recover to 3 FTEs in FY03
 Restore the 0.5 FTE lost at UC to ANL
   No resources
 Establish a sustained presence at CERN
   No resources, despite this being a very high priority
 We rely on the labs to continue base program and other lab support to sustain the existing complement of developers
   And the needed base program support is not there; lab base programs are being hammered

Slide 16: General and longer term priorities
 These are reflected in the software request in the research program proposal (and go back as far as our original Jan 2000 project plan)
 Priorities, in order:
   Sustain existing ANL, BNL, LBNL efforts
   Complete the ramp of the lab-based core developer FTEs to the long-planned levels
     ANL 3.5 FTEs, LBNL 4.5 FTEs, BNL 4 FTEs
   Establish, over and above these lab levels, a presence at CERN of at least 2 core developer FTEs
     In addition to any lab people who might be located at CERN
   Establish effort at the core-subsystem interface (sited mainly at universities and possibly CERN) to:
     support the translation of core developments into established software employed by end users
     better support the leadership roles held by the US with developer effort capable of translating decisions into established solutions

Slide 17: SW Funding Profile Comparisons
(Funding profile chart comparing: 2000 agency guideline; January 2000 PMP; 11/2001 guideline; 'Compromise profile' requested in 2000; mid-02 bare bones.)

Slide 18: Concluding Remarks
 No strategic changes; the program is working, but stressed by funding
 US has consolidated its leading roles in our targeted core software areas
   Involved with new LCG common efforts in all our core areas
 Architecture/framework effort level being sustained so far
   And it is delivering the baseline core software of ATLAS
 Database/data management effort reduced, but so far preserving our key technical expertise
   Leveraging that expertise for a strong role in the common project
   Cannot tolerate further reduction in a key strategic US core area
 US is a major contributor to software infrastructure and QA in ATLAS
   Recent emphasis: improve QA, and make the US development and production environment as effective as possible
 Soft support from the project and base programs, while the emphasis on grids grows, is distorting our program in a troubling way