U.S. ATLAS Computing: DOE Review, Sept. 9, 1999
- Overview
- Status of ATLAS computing
- U.S. ATLAS
  - Project management organization
  - Status of efforts
    - Core software
    - Subsystems
    - Facilities
- Schedule
- FY 00 funding request
- Summary

Scale of Computing Effort
- Rough scaling factors of x5 to x1000 in the relevant parameters relative to the Tevatron experiments:
  - Manpower: x5
  - CPU/event: x1000 (event complexity)
  - Data volume: x10 to x100 (channel count)
  - Distribution of data: x10
- The U.S. effort is comparable in scale to a Tevatron experiment
- Effort: ~$15M/year
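As a quick sanity check on what these multipliers imply, the sketch below scales some hypothetical Tevatron-era baselines by the factors quoted above. The baseline numbers are invented for illustration and are not from this talk.

```cpp
// Illustrative back-of-envelope scaling only: multiplies hypothetical
// Tevatron-era baselines (invented numbers, not from this talk) by the
// factors quoted on the slide above.
#include <cstdio>

int main() {
    // Hypothetical Tevatron-era baselines.
    const double tevCpuPerEvent = 5.0;    // SpecInt95*s per event
    const double tevDataVolume  = 100.0;  // TB per year
    const double tevManpower    = 20.0;   // FTEs

    // Scaling factors from the slide.
    const double cpuScale      = 1e3;    // event complexity
    const double dataScaleLow  = 10.0;   // channel count, low end
    const double dataScaleHigh = 100.0;  // channel count, high end
    const double manpowerScale = 5.0;

    std::printf("CPU/event: %.0f SpecInt95*s\n", tevCpuPerEvent * cpuScale);
    std::printf("Data/year: %.0f to %.0f TB\n",
                tevDataVolume * dataScaleLow, tevDataVolume * dataScaleHigh);
    std::printf("Manpower : %.0f FTEs\n", tevManpower * manpowerScale);
    return 0;
}
```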

Scales from Experience

Goals for the Next Year
- Project organization
  - Management
  - Identify areas of responsibility
- Integration of efforts into ATLAS
- Inception/development of software
- U.S. support facilities
  - Planning/development of infrastructure
- Prepare for reviews

International ATLAS
- New Computing Coordinator: Norman McCubbin (RAL)
  - Available full time in November
  - Approval vote at the ATLAS CB, June 10th
  - Responsibility: core software
- New Physics Coordinator: Fabiola Gianotti (CERN)
  - Approval vote at the ATLAS CB, June 10th
- Detector-specific simulation/reconstruction is organized within the subsystems

Architecture Taskforce
- Software partitioned into work packages
- Members:
  - Katsuya Amako (KEK)
  - Laurent Chevalier (CEA)
  - Andrea Dell’Acqua (CERN)
  - Fabiola Gianotti (CERN)
  - Steve Haywood (RAL, chair)
  - Jurgen Knobloch (CERN)
  - Norman McCubbin (RAL)
  - David Quarrie (LBNL)
  - R.D. Schaffer (LAL)
  - Marjorie Shapiro (LBNL)
  - Valerio Vercesi (Pavia)

Architecture T.F. Status
- Three meetings so far
- Directions:
  - Language: C++ (allowing for migration to others, e.g. Java)
  - Examine the GAUDI (LHCb) architecture
  - Adoption of "use cases"
- Goals for October:
  - Outline of the architecture design
  - Appointment of a Chief Architect
  - Commission work on prototyping parts of the design
  - Create use cases and a requirements document
  - Define packages and their relations (package diagram)
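Since the taskforce is evaluating GAUDI, the central idea of that style of architecture is worth sketching: physics code is written as algorithms behind a fixed framework interface, and the framework owns the event loop. A minimal sketch follows; the names are illustrative, not the actual GAUDI API.

```cpp
// Minimal sketch of the algorithm/framework separation used by
// GAUDI-style architectures; names are illustrative, not GAUDI's.
#include <iostream>
#include <memory>
#include <vector>

// The framework sees every piece of physics code through one interface.
class IAlgorithm {
public:
    virtual ~IAlgorithm() = default;
    virtual bool initialize() = 0;
    virtual bool execute()    = 0;  // called once per event
    virtual bool finalize()   = 0;
};

// A user-written algorithm: no knowledge of the event loop, I/O, or scheduling.
class TrackFitDemo : public IAlgorithm {
public:
    bool initialize() override { std::cout << "book histograms\n"; return true; }
    bool execute()    override { std::cout << "fit tracks for one event\n"; return true; }
    bool finalize()   override { std::cout << "write summaries\n"; return true; }
};

// The event loop lives in the framework, not in user code.
int main() {
    std::vector<std::unique_ptr<IAlgorithm>> algs;
    algs.push_back(std::make_unique<TrackFitDemo>());
    for (auto& a : algs) a->initialize();
    for (int event = 0; event < 3; ++event)
        for (auto& a : algs) a->execute();
    for (auto& a : algs) a->finalize();
    return 0;
}
```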

Stages in Software Management

Quality Control
- Recommend software performance specifications and a review process
- Members:
  - Makoto Asai (Hiroshima)
  - Dario Barberis (Genoa)
  - Martine Bosman (Barcelona)
  - Bob Jones (CERN)
  - Jean-Francois LaPorte (CEA)
  - Helge Meinhard (CERN)
  - Maya Stavrianakou (CERN)

Action on Other Groups
- National Board
  - Supported platforms
  - Regional centers
- Training
  - Network of national contacts for training
  - C++, OO programming
  - GEANT4
  - ATLAS-specific topics

ATLAS/CERN Schedule '00
- Sept. '99: start of the Cashmore/Hoffmann review
- Oct. '99: report of the architecture T.F.; commissioning of prototyping code
- Jan. '00: start preparations for bilateral agreements
- Fall '00: report of the Cashmore/Hoffmann review; MoU preparations

U.S. Participation
- Frank Paige: co-convenor, SUSY working group
- David Malon: co-leader, database group
- Craig Tull: Architecture Group
- Ian Hinchliffe: leader, event generator group
- David Quarrie, Marjorie Shapiro: Architecture Task Force
- John Parsons: co-convenor, Top working group
- Misha Leltchouk: LAr simulation coordinator
- Michael Shupe: convenor, Background working group
- Fred Luehring: TRT software coordinator
- Steve Goldfarb: Muon database coordinator
- Tom LeCompte: Tilecal database coordinator
- Krzys Sliwa: chair, ATLAS world-wide computing group
- Frank Merritt: training contact; Tilecal reconstruction coordinator
- Bruce Gibbard: regional center contact
- John Huth: National Board contact

U.S. ATLAS Computing
- NSF, DOE: LHC computing activities are now "projectized"
- Implications for U.S. ATLAS:
  - Direct reporting lines through the Project Manager (Bill Willis) and the BNL Directorate (Tom Kirk)
  - Appointment of an Associate Project Manager for Computing and Physics (John Huth)
- Implications for the agencies:
  - Reporting lines and operations must be clarified

Proposed Management

Management Structure
- Reflects the flow of deliverables to and from ATLAS
- Appointments (2-year renewable terms):
  - Physics: Ian Hinchliffe (LBNL)
  - Facilities: Bruce Gibbard (BNL) + deputy
- Issues: software manager
  - Availability within U.S. ATLAS; hire?
  - Flatter structure for the time being?
  - Bring on soon? Most desirable!

Management Plan: Associate Project Manager
- Member of the E.C.
- Develop and execute the project plan
- Establish and maintain the project organization and tracking
- Develop annual budget requests
- Liaison to ATLAS computing management
- Appoint L2 managers
- Review and approve MoUs with CERN and the institutes
- Exercise change control authority
- Establish advisory committees where appropriate
- Provide reports and organize reviews

Implications for the APM
- The APM is a time-consuming job
- Actions for John Huth:
  - Relief as MDT electronics coordinator (Jay Chapman, U. Michigan)
  - Relief as U.S. ATLAS Institute Board chair (Jim Siegrist, LBNL)
  - Teaching relief for spring terms '00 and '01, granted by Harvard University

Level 2 Managers
- Appointed by the APM with the concurrence of the Executive Committee
- Members of the E.C. (+ APM, + deputy)
- Two-year renewable terms
- Generic responsibilities:
  - Develop definitions of milestones and deliverables
  - Define, with the APM, the organizational substructure of their Level 2 area
  - Develop, with the APM, annual budget proposals
  - Identify resource imbalances within subprojects and recommend adjustments
  - Deliver the scope of the subproject on time and within budget
  - Maintain cost and schedule
  - Provide reports to the APM and PM
  - Liaison with counterparts at CERN

Specific Responsibilities
- Physics manager: generators, physics objects, benchmark studies, mock data challenge
- Software: core; detector-specific sim/recon; training
- Facilities: Tier 1, Tier 2, networking, support

Project Engineer
- Same roles as the project engineers for the construction project:
  - Tracking
  - Reviews, oversight
  - Reporting
  - Technical input

Proposed Names
- Physics: Ian Hinchliffe (LBNL); contacted and accepted
  - Policy question: physicists on the project
- Software: search underway
- Facilities: Bruce Gibbard (+ deputy: Jim Shank)

Facilities Manager
- Bruce Gibbard (BNL)
- Proposal to add a U.S. ATLAS deputy
  - U.S. ATLAS and RHIC deputies
  - General agreement with Bruce, Tom Kirk
- Begin to fill in other areas: networking, Tier 1, remote sites, support

Detector Contacts
- LAr: Srini Rajagopalan (BNL)
- Tilecal: Frank Merritt (U. Chicago)
- ID: Laurent Vacavant (LBNL)
- Muon: Bing Zhou (U. Michigan)
- TRT: Keith Baker (Hampton)
- Trigger/DAQ: Andy Lankford (UCI)

Planning Activities
- Writing/preparation assignments for the review, due next week:
  - L2 managers where appropriate (i.e. facilities, physics)
  - Core
  - Sim/recon
  - Training
  - Management (PMP for computing)
  - MRE/IT "team"
- Review in October

WBS Structure
- Should stay flexible while project definition is underway (Level 3 and beyond)
- Level 2 entries should be fixed now
  - Commensurate with the management structure
- Adopt lead number "2"

High Levels of the WBS
- Draft WBS:
  - 2.1 Physics: generators, benchmarks, mock data challenges, physics objects
  - 2.2 Software
    - Core: control/framework, database, event model, analysis tools
    - Detector-specific simulation and reconstruction
    - Collaborative tools
    - Training
  - 2.3 Facilities: regional center, remote sites, networking, support

Near-Term Activities/Issues
- U.S. ATLAS web site
- Weekly video conferences
- Support role of BNL
- Gathering FY 00 requests
- Advisory group appointment
- Writing assignments for the proposal
- NSF MRE/IT proposal: Tier 2 centers
- Discussions of deliverables with ATLAS
- Interactions with the agencies: JOG, computing review

Software
- Core software:
  - Control/framework (Tull)
  - Database and Tilecal pilot project (Malon)
  - Event model (Rajagopalan)
- Detector-specific sim/reconstruction:
  - Representatives from the subsystems chosen
- Training (Merritt):
  - Establishment of OO courses (BNL, U. Chicago)

General Requirements
- Software must last over the lifetime of the experiment, yet track language changes
  - Well-defined interface layers
- Maintainability and engineering are critical
  - Large number of users; use of software professionals
- Adaptability to distributed environments
- Learn from experiments already working in OO (BaBar, D0, CDF, STAR)

Database
- David Malon (ANL)
- Tilecal pilot project:
  - Tilecal testbeam data in an object database
  - Testbed for ATLAS technologies and strategies
  - Early feedback to developers
  - To be generalized to other subsystems
- Database core software:
  - Transient and persistent object mapping
  - Definition of the database/control interface
  - Specifications
  - Examine alternatives to Objectivity
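One common way to realize the transient/persistent mapping named above is to hide storage behind a technology-neutral interface, so that Objectivity or an alternative can sit behind the same calls. The sketch below shows the shape of that layering; all class names are hypothetical and this is not the ATLAS database code.

```cpp
// Sketch of transient/persistent object mapping behind an interface, so
// the storage technology (Objectivity or an alternative) can be swapped
// without touching client code. All names are hypothetical.
#include <iostream>
#include <map>

// Transient object: what reconstruction code works with in memory.
struct CaloCell {
    int    id;
    double energy;  // GeV
};

// Technology-neutral persistency interface.
class ICellStore {
public:
    virtual ~ICellStore() = default;
    virtual void     write(const CaloCell& c) = 0;
    virtual CaloCell read(int id) const       = 0;
};

// One concrete mapping; an Objectivity-backed version would implement
// the same interface with persistent classes instead of a std::map.
class InMemoryCellStore : public ICellStore {
    std::map<int, CaloCell> store_;
public:
    void write(const CaloCell& c) override { store_[c.id] = c; }
    CaloCell read(int id) const override { return store_.at(id); }
};

int main() {
    InMemoryCellStore db;
    db.write({42, 1.7});
    std::cout << "cell 42: " << db.read(42).energy << " GeV\n";
    return 0;
}
```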

Tilecal Model

Database Schedule

Database Milestones
- Jan. '00: survey of mapping strategies
- Feb. '00: infrastructure for developers deployed
- April '00: validation of strategies in the testbed
- July '00: database management infrastructure defined
- Oct. '00: infrastructure for distributed access deployed
- Jan. '01: scalability test
- Oct. '01: beta release

Control/Framework
- Craig Tull (LBNL)
- Working on a user requirements document (with Hinchliffe, Shapiro, Vacavant)
- Market survey of framework systems:
  - Object component model
  - AC++
  - Compatibility with the ATLAS architecture
- A resource-loaded work plan exists
- Working with the A.T.F. on design requirements
- Prototype designs have already been tested

Framework Milestones

Framework Schedule

One Framework Model
- Software bus (e.g. CORBA)
- Component C++ classes/objects
- Component class adapters
- Scripting interface (e.g. Tcl, …)
- Command marshalling (e.g. SWIG, …)
- GUI interface (e.g. Tk, …)
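To make the adapter and command-marshalling layers concrete, here is a hand-rolled sketch of exposing one C++ component through a string-command (scripting-style) interface. In practice a tool such as SWIG generates this glue for Tcl or another scripting language; every name below is illustrative.

```cpp
// Hand-rolled illustration of the "component + adapter + command
// marshalling" layering in the framework model above. A tool like SWIG
// would normally generate this glue. All names are illustrative.
#include <functional>
#include <iostream>
#include <map>
#include <sstream>
#include <string>

// The plain C++ component, unaware of any scripting layer.
class EventSelector {
public:
    void setThreshold(double gev) { threshold_ = gev; }
    double threshold() const { return threshold_; }
private:
    double threshold_ = 0.0;
};

// Adapter: marshals string commands onto typed C++ calls.
class EventSelectorAdapter {
public:
    explicit EventSelectorAdapter(EventSelector& c) : comp_(c) {
        cmds_["setThreshold"] = [this](std::istream& args) {
            double gev; args >> gev; comp_.setThreshold(gev);
        };
    }
    bool dispatch(const std::string& line) {
        std::istringstream in(line);
        std::string name; in >> name;
        auto it = cmds_.find(name);
        if (it == cmds_.end()) return false;
        it->second(in);
        return true;
    }
private:
    EventSelector& comp_;
    std::map<std::string, std::function<void(std::istream&)>> cmds_;
};

int main() {
    EventSelector sel;
    EventSelectorAdapter bus(sel);
    bus.dispatch("setThreshold 25.0");  // a Tcl-like script line
    std::cout << "threshold = " << sel.threshold() << " GeV\n";
    return 0;
}
```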

Event Model
- Types of objects that are stored
- The client's view of the event
- Physical organization of the event
- The client's interface to the event
- Mechanism to navigate between objects
- Transient-to-persistent object mapping
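In code, these choices boil down to something like the sketch below: a transient event that hands clients typed collections and supports navigation from one object to another. This is a purely hypothetical illustration, not the actual ATLAS event model.

```cpp
// Hypothetical sketch of a transient event model: typed access for
// clients plus navigation between stored objects. Illustrative only.
#include <iostream>
#include <vector>

struct Cluster {
    double energy;  // GeV
};

struct Track {
    double pt;             // GeV
    const Cluster* match;  // navigation: track -> matched calo cluster
};

// The client's view of the event: typed collections, no storage details.
class Event {
public:
    std::vector<Cluster> clusters;
    std::vector<Track>   tracks;
};

int main() {
    Event evt;
    evt.clusters.push_back({42.0});
    evt.tracks.push_back({41.5, &evt.clusters[0]});

    // Client code navigates from a track to its matched cluster.
    for (const Track& t : evt.tracks)
        std::cout << "track pt " << t.pt << " GeV matched to cluster E "
                  << t.match->energy << " GeV\n";
    return 0;
}
```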

Event Model Milestones
- Nov. '99: review of models in other experiments; requirements documents
- Dec. '99: resource-loaded schedule
- Mar. '00: baseline design
- June '00: alpha release
- Aug. '00: review of experience
- FY 01: beta release

Framework Options

Some Detector Activities
- TRT/ID:
  - Put the full TRT simulation into GEANT4
- LAr:
  - Coil and cryostats in GEANT4 (Nevis)
  - Accordion structure in GEANT4 (BNL)
- Tilecal:
  - Pilot project
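For flavour, describing detector volumes to GEANT4 looks roughly like the sketch below: solids, logical volumes carrying materials, and physical placements. The dimensions and materials are made up; this is not the actual TRT or LAr geometry code.

```cpp
// Minimal flavour of a GEANT4 geometry description: one straw-like tube
// placed inside an air-filled world volume. Dimensions and materials are
// invented; this is not the actual TRT or LAr geometry code.
#include "G4Box.hh"
#include "G4LogicalVolume.hh"
#include "G4Material.hh"
#include "G4PVPlacement.hh"
#include "G4SystemOfUnits.hh"
#include "G4ThreeVector.hh"
#include "G4Tubs.hh"

G4VPhysicalVolume* BuildDemoGeometry() {
    // Simple single-element materials, built by hand.
    auto* al  = new G4Material("Aluminium", 13., 26.98*g/mole, 2.70*g/cm3);
    auto* air = new G4Material("Air", 7., 14.01*g/mole, 1.29*mg/cm3);

    // World: an air-filled box (arguments are half-lengths).
    auto* worldSolid = new G4Box("World", 1.*m, 1.*m, 1.*m);
    auto* worldLog   = new G4LogicalVolume(worldSolid, air, "World");
    auto* worldPhys  = new G4PVPlacement(nullptr, G4ThreeVector(), worldLog,
                                         "World", nullptr, false, 0);

    // One straw-like tube (half-length 70 cm in z), placed at the origin.
    auto* strawSolid = new G4Tubs("Straw", 0., 2.*mm, 70.*cm, 0., 360.*deg);
    auto* strawLog   = new G4LogicalVolume(strawSolid, al, "Straw");
    new G4PVPlacement(nullptr, G4ThreeVector(), strawLog, "Straw",
                      worldLog, false, 0);

    return worldPhys;
}
```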

Some Detector Activities (cont.)
- Muon:
  - Study of noise in Higgs -> 4 muons
  - Combined performance of the ID + muon system (A reconstruction)
  - CSC into the simulation
- Trigger/DAQ:
  - Comparison of switching architectures
- Background studies:
  - Optimization of shielding (100 MeV muon background)

Training
- New paradigm of OO programming
- Training courses (F. Merritt):
  - Course offered at BNL (near future)
  - Course offered at Chicago
- Successful programs seen at other experiments (CDF, D0, BaBar)
- Ongoing need for training throughout the course of the experiment
- Documentation
- ATLAS-specific material

Software Development
- Asymptotic level: estimated 10 software professionals
- Peak load (circa 2003): estimated 16 software professionals
- Extrapolations based on existing experiments, the proposed areas of responsibility, and the U.S. fraction of participation
- The choice of technology can strongly influence actual needs (e.g. BaBar and STAR in databases)

Planned Training
- All courses by Object Mentor (used by BaBar and others)
- Organized by Frank Merritt
- Courses approximately one week long:
  - Aug. 9, BNL: OO design, 13 people
  - Sept., U.C.: OO design, 15 people
  - Oct., ANL or BNL: advanced OO, 10 people
  - Nov. 8, FNAL: GEANT4

Facilities
- BNL is ramping up a support facility
  - Taps into the RHIC Computing Facility
- Major issue: Tier 1/2 facilities
  - Scale of the "Tier 2s"
  - Size of support staff and infrastructure
  - Computing model for the U.S. (e.g. grids)
  - Being addressed in the NSF MRE/IT proposal
- Developing a policy on usage and support of platforms at the institutions

Facilities
- Tier 1 (regional center): BNL
  - Leverages the RCF, with ATLAS-specific needs, however
  - Primary support function for the U.S.:
    - Code release and support
    - Major processing, event store
- Personnel scale estimate:
  - Roughly linear ramp from 2 FTEs (now) to 22 or more (depending on the computing model)

MONARC
- Models of Networked Analysis at Regional Centres (ATLAS + CMS)
- Alexander Nazarenko, Tufts hire
- Tasks:
  - Validate the simulation models
  - Perform first simulations of LHC architectures
  - After Dec. '99, focus on planning for the regional centers
- Model validation by the end of September
- Understanding of U.S. computing facilities
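The flavour of such architecture simulations can be conveyed by a toy model: jobs arriving at a farm and queueing for CPU. The sketch below is a stand-in illustration only; the real MONARC tool was a separate, far more detailed simulation.

```cpp
// Toy stand-in for a regional-centre simulation: jobs arrive at a single
// CPU server and we measure the mean queueing delay. Illustrative only;
// not the MONARC simulation tool.
#include <algorithm>
#include <iostream>
#include <random>

int main() {
    std::mt19937 rng(1999);
    std::exponential_distribution<double> interarrival(1.0 / 40.0); // mean 40 s
    std::exponential_distribution<double> service(1.0 / 35.0);      // mean 35 s
    const int nJobs = 100000;

    double clock = 0.0, busyUntil = 0.0, totalWait = 0.0;
    for (int i = 0; i < nJobs; ++i) {
        clock += interarrival(rng);                 // next job arrives
        double start = std::max(clock, busyUntil);  // waits if CPU is busy
        totalWait += start - clock;                 // time spent queued
        busyUntil = start + service(rng);           // CPU occupied until then
    }
    std::cout << "mean queueing delay: " << totalWait / nJobs << " s\n";
    return 0;
}
```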

NSF MRE/IT Proposal
- Tier 2 centers:
  - Approximately 5 in total
  - 256-node systems
  - 100 TB tape system
  - Low maintenance
  - Linked by a computing grid
- Computing professionals in a dual role: users/developers

Review Process
- Send proposals to the Advisory Group (Aug. 4th)
- Charge:
  - Identify areas of overlap/commonality
  - Suggest simplifications/savings
  - Coherency with the ATLAS effort
  - Prioritize
- Meet with the agencies to establish the scale of the FY 00 initial request (now)
- Confer with the Advisory Group on feedback
- Prepare and review documentation for the December review (mid-to-late Sept.)

Priorities
- Critical personnel: people who would otherwise be lost while fulfilling a critical role
- Core software effort:
  - A prerequisite to inclusion of sim/recon software
  - Yet we cannot commit to a major ramp (no MoUs)
- Support of U.S. efforts (facilities)
- Critical studies
- Transition to OO

Priorities (cont.)
- Coherency in the development of the plan
- Matching of facilities scope to usage (e.g. database effort, simulations)
- Contiguous/overlapping areas (e.g. event model, database, control/framework)

FY 00 Needs
- Starting Oct. 1st, we will need more than a simple continuation of the present level:
  - Support functions at BNL
  - Deputy facilities manager
  - Core support: database
  - Estimated increment of roughly 4 FTEs
- Remaining needs made part of the review process for December
  - Estimates for needs beyond December should still be given now

Facilities
- Support for U.S. users is a necessary precondition for effective U.S. participation (like training)
- Use of the RCF leverages existing facilities
- Requesting 3 FTEs on Oct. 1 (deputy + support)
- $50K of equipment on Oct. 1 (500 SpecInt95)
- $500K (CPU, disk, mass storage) after the review

Training
- A necessary precondition to effective U.S. participation
- Must be done now (to build a trained group of physicists)
- Substantial payback (experience in industry)
- NSF request of $75K to subsidize courses ($1K/student + setup)

Leveraging Funds
- Groups that have applied for internal funds:
  - BNL: already supporting 2 FTEs for FY 00 (core software)
  - LBNL: request for support on control/framework
  - U. Chicago: advance support for training; request for a software professional to start

FTE Request

Schedule
- July:
  - Propose the management structure to the E.C. and PM
  - Collaboration meeting
  - Tier 1/2 scoping
  - Plans for FY 00 reviewed
  - MRE "white paper"
- August:
  - Present FY 00 plans to the agencies
  - Outline and writing assignments for the proposal (Dec.)

Schedule (cont.)
- September:
  - First drafts of the proposal:
    - Management PMP
    - Software: core and recon/sim
    - Facilities
    - Training, collaborative tools
- October:
  - Revise the proposal; review
- November:
  - Meeting to prepare for the December review

Schedule (cont.)
- December (January?):
  - Agency review
- January:
  - Revise the funding plan for FY 00
  - Begin work on bilateral agreements
- Ongoing, through January and beyond:
  - Prototyping code
  - Progress toward baselining
  - Filling in management slots
  - Bilateral agreements

Summary
- Project organization:
  - Management
  - Identify areas of responsibility
- Integration of efforts into ATLAS
- Inception/development of software
- U.S. support facilities:
  - Planning/development of infrastructure
- Prepare for reviews

Summary
- Major points:
  - Oct. 1st FY 00 needs exceed a simple continuation of the present level (4 FTEs)
  - Hardware to augment the RCF
  - Training physicists in OO design
  - Continue to fill in the management structure