CMS Computing and Core-Software: Report to USCMS-AB (Building a Project Plan for CCS)
USCMS AB, Riverside, May 18, 2001
David Stickland, Princeton University, CMS Computing and Core-Software Deputy PM

Slide 2: The CPT Project

The CPT Project comprises three subprojects and their tasks:

CCS (Core Computing & Software):
  1. Computing Centres
  2. General CMS Computing Services
  3. Architecture, Frameworks / Toolkits
  4. Software Users and Developers Environment
  5. Software Process and Quality
  6. Production Processing & Data Management

TriDAS (Online Software):
  7. Online Filter Software Framework
  8. Online Farms

PRS (Physics Reconstruction and Selection):
  9. Tracker / b-tau
  10. E-gamma / ECAL
  11. Jets, Etmiss / HCAL
  12. Muons

Cross-project activities:
- SPROM (Simulation Project Management)
- RPROM (Reconstruction Project Management)
- CPROM (Calibration Project Management), to be created
- GPI (Group for Process Improvement), recently created
- Cafe (CMS Architectural Forum and Evaluation)

Slide 3: Developing a CCS Project Plan

- Build a common planning base for all CPT tasks
- Clarify responsibilities
- Coordinate milestones
- March 2001 planning: task breakdown, deliverables, cross-projects
- Next: milestone study (a toy sketch follows this slide)
  - Top down: starting from the major deliverables
  - Bottom up: starting from the current project understanding
  - External constraints: DAQ TDR, Physics TDR, CCS TDR, Data Challenges, LHC timetable, etc.

Without this it is impossible to measure performance, assign limited resources effectively, identify conflicting constraints, etc.
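To make the milestone study concrete, here is a minimal sketch in Python of reconciling bottom-up completion estimates with top-down external constraints. It is illustrative only: the task names echo the CCS breakdown of Slide 2, but the structure, dates and deadlines are invented placeholders, not figures from the talk.

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class Task:
        name: str
        done_by: date        # bottom-up estimate from the task group
        subtasks: list = field(default_factory=list)

        def finish(self):
            # A task is finished only when it and all of its subtasks are.
            return max([self.done_by] + [t.finish() for t in self.subtasks])

    # Bottom-up plan (placeholder dates).
    ccs = Task("CCS", date(2001, 12, 1), [
        Task("Architecture, Frameworks / Toolkits", date(2002, 9, 1)),
        Task("Production Processing & Data Management", date(2003, 3, 1)),
    ])

    # Top-down external constraints (placeholder deadlines).
    constraints = {"DAQ TDR": date(2002, 11, 1), "CCS TDR": date(2003, 1, 1)}

    # Flag conflicts between the bottom-up plan and the top-down deadlines.
    for name, deadline in constraints.items():
        if ccs.finish() > deadline:
            print(f"Conflict: CCS plan ends {ccs.finish()}, after {name} ({deadline})")

In practice such a study would live in a project-planning tool rather than in code; the sketch only shows why a common planning base is needed before conflicting constraints can even be detected.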

Slide 4: The most significant current risk to the project is insufficient SW manpower

We are making good use of the resources we have, and we are making progress:
- OO code is deployed and is the standard for CMS
- Worldwide productions are running
- Prototype facilities are in full use, leading to improved code and a better understanding of limitations
- A solid SW infrastructure base is in place

But there are many things we are unable to cover adequately:
- No calibration infrastructure
- No alignment infrastructure
- The Detector Description Database is only just getting underway
- The analysis infrastructure is not yet deployed
- Slow progress with our GEANT4 implementation
- Unable, for lack of time, to answer all the (good) questions the GRID projects are asking us
- "Spotty" user support: best effort, when time permits
- Most of the tasks in SW Quality Assurance and Control are unmanned
- Unacceptably high exposure to the loss of key people: no backups in any role
- And more.

Slide 5: Core SW, Requirements vs. the Current Situation

- Major commitments from the USA and CERN/CMS
- Recent additions from Italy, France and Russia

Slide 6: Core SW Manpower, Next Steps (already underway)

Slide 7: Where does CERN stand?

- With the CPT organization we have successfully delineated the on-project SW tasks from the base-program tasks, and we have a management structure to handle those two cases.
- In the US, reasonable plans exist to put the Computing and Software in place. They still need to be funded, but the plans, and the general will, exist.
- In Russia, the UK, Italy, France and Germany the wheels are in motion. They also clearly intend to make every effort to put the Computing in place.
- But:
  - CERN does not know how to pay for its share
  - Currently it is very difficult to discern a coherent direction in their planning:
    - LHC Computing GRID Project
    - PPDG, GriPhyN, DataGrid
    - IMoUs
    - Special contributions
    - ...

Slide 8: How to respond?

- We continue to build a project plan for CCS.
- We continue to put in place an IMoU for the SW manpower; in the meantime we focus action on actually getting the manpower.
- We clearly define our prototype requirements; we should be the ones to define them, not an external committee.
- Those prototypes may be supplied within an IMoU context, or within a broader context of collaboration towards LHC Computing.
- We try to work with CERN to ensure that the experiments and the Regional Centers are the driving partners in any new projects, and that our real needs are addressed.