Mu2e FY15 and FY16 Computing Needs
Rob Kutschke
Scientific Computing Portfolio Management Team (SC-PMT) Review
4 March 2015


Scientific Goals for FY15 and FY16
Obtain CD3c and begin construction
–Planning date for the DOE CD3c review is March 2016
–Implies computing must be done by November 15, 2015
Big questions for computing:
–Is the neutron flux on the CRV system low enough to allow the scintillator-based design to work?
–Are the weak spots in the CRV system acceptable?
One iteration of the full simulation, with the most recent design, to obtain resolutions, sensitivities and backgrounds, all with systematic errors
–At least the same size as the TDR sample
After CD3c, execute the plan: detector commissioning in FY20
–Computing: final optimization, calibration, algorithm improvement

Large-scale or out-of-the-ordinary computing needed to complete these goals - 1
Before Oct 1, 2015 we need about 14 million CPU hours (a rough sizing of this request is sketched after this slide)
–Need to get much of this opportunistically (including offsite)
  Already testing offsite running; thanks to the SCD team for working with us
–Much of the work is low I/O and is well suited for offsite running
We will need about 210 TB of tape to hold all of the files generated by the above
–We have an option to reduce this substantially, but it is not clear that we have the people to implement it in a timely fashion
–Many files will use small file aggregation
–Existing bluearc data disk (82 TB) is OK
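A minimal back-of-envelope sketch (not from the slides) of what the 14 million CPU-hour request implies in sustained concurrent job slots between the review date (4 March 2015) and the 1 October 2015 deadline; the flat-usage assumption and 100% slot efficiency are simplifications.

from datetime import date

# Back-of-envelope sizing of the 14 M CPU-hour request.
# Assumption: usage is spread evenly over the remaining calendar time.
cpu_hours_needed = 14e6                                      # from the slide
window_days = (date(2015, 10, 1) - date(2015, 3, 4)).days    # review date to deadline
wall_hours = window_days * 24
avg_slots = cpu_hours_needed / wall_hours
print(f"{window_days} days of calendar time -> ~{avg_slots:,.0f} cores running continuously")
# 211 days of calendar time -> roughly 2,800 sustained job slots

Real campaigns are burstier than this, which is why peak capacity matters as much as the integral (see the Grid and Cloud Computing slide below).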

Large-scale or out-of-the-ordinary computing needed to complete these goals - 2
Making premixed background events:
–To do complete events we need a CPU with 8 GB of memory per core
–Proof of principle done on FermiCloud; we found it awkward to use
–Advertised to be easier with jobsub_client, but not yet tested
Expect that jobs needing large memory per core will become more frequent as the experiment proceeds:
–It will remain a minor use case for a long time to come
–But convenience has value

Did you meet your FY14 Scientific Goals?
Yes
–In February Mu2e received a positive recommendation for CD2/3b
–Expect signatures on the CD2/3b document today!
Computing was NOT a rate-limiting factor
–100E9 events for the CRV deadtime study using G4beamline
–3E9 events with the full simulation to study physics sensitivity

Needs from Service Areas (1)
Grid and Cloud Computing
–To the end of FY15 we need 14 M-hours to be ready for CD3c
–Existing allocation integrates to 4.3 M-hours (the shortfall is quantified below)
  To achieve this we will need significant offsite running
  Peak vs. integral: if we can get it faster, that would be great
–Using jobsub_client; testing offsite OSG running
Networked Storage
–Existing allocation is OK for work leading to CD3c
–Would like finer control of quotas on bluearc
Physics and Detector Simulation
–Geant4 and CRY are both needed for CD3c
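A one-line arithmetic check (not on the slide) of how much of the FY15 CPU request falls outside the existing allocation, using only the two numbers above.

# Shortfall between the FY15 request and the existing allocation.
requested_mhours = 14.0      # from this slide
allocated_mhours = 4.3       # existing allocation, from this slide
shortfall = requested_mhours - allocated_mhours
print(f"shortfall: {shortfall:.1f} M-hours "
      f"({100 * shortfall / requested_mhours:.0f}% must come from opportunistic/offsite slots)")
# shortfall: 9.7 M-hours (69% must come from opportunistic/offsite slots)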

Needs from Service Areas (2)
Scientific Data Storage and Access
–Enstore: estimated incremental tape usage (totaled in the sketch after this slide):
  2015: 210 TB - for CD3c
  2016: 200 TB - mock data/alignment challenges
  2017: 300 TB - mock data/alignment challenges
–dCache: little operational experience, so this is just a guess. The following are integral:
  2015: 50 TB R/W and 50 TB scratch
  2016: same
  2017: 100 TB R/W and 50 TB scratch
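A small sketch (not on the slide) that simply totals the incremental Enstore numbers above to show the cumulative tape footprint they imply through FY17.

# Cumulative tape implied by the incremental Enstore estimates on this slide.
incremental_tb = {2015: 210, 2016: 200, 2017: 300}   # TB per year, from the slide
total = 0
for year in sorted(incremental_tb):
    total += incremental_tb[year]
    print(f"through {year}: ~{total} TB on tape")
# through 2017: ~710 TB on tape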

Needs from Service Areas (3)
CVMFS
–Just commissioned it. Soon to be a critical service.
Scientific Data Management
–SAM and SAM Web: critical
–FTS: just getting experience with it; have dumped 20 TB from bluearc
IFDH, GRIDFTP
–Heavy use of both (a minimal scripting sketch follows this slide)
IFDH_ART
–Will soon test running art using SAM for input and output
–We have use cases for which this will not work; we know how to script them using IFDH
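A minimal sketch (not from the slides) of the kind of IFDH scripting referred to above. It assumes the ifdh command-line tool from the ifdhc product is set up in the job environment; the file paths are hypothetical.

import subprocess

# Hypothetical paths; assumes the ifdh CLI (ifdhc product) is available in the environment.
src = "/pnfs/mu2e/scratch/users/someuser/sim_output.art"
dst = "./sim_output.art"

# "ifdh cp" hides the underlying dCache/gridftp transfer details from the script.
subprocess.check_call(["ifdh", "cp", src, dst])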

Needs from Service Areas (4)
Scientific Frameworks
–art: critical
Interactive GPCF
–5 nodes are enough for now and for the next few years
Experiment control room
–N/A for 2015/16; first commissioning data in 2020
Jenkins
–Recently commissioned; becoming critical
–Continuous integration / nightly builds / release builds
–Expect to increase testing in all three modes

Needs from Service Areas (5)
Central Web Hosting
–Mu2e homepage and web site: critical
DAQ and Engineering
–artdaq: it's in our baseline
–Kurt Biery, Mark Bowden and their groups are on project
Document Management
–DocDB: critical
redmine
–Critical for git repos, wiki, issue tracker
cdcvs
–Some of Mu2e still use cvs repositories hosted on cdcvs for their work
svn
–No lab-hosted repos, but we access an svn repo at LBL

Needs from Service Areas (6)
ECL
–In use for offsite test-beam work
–Expect its use to grow exponentially during construction
UPS/UPD
–Critical
Readytalk
–Critical
xrootd
–Not used yet, but we know it's there

Needs from Service Areas (7)
Construction databases
–Discussions started with Igor
–Mu2e point person is Kevin Lynch (CUNY)
–Plan is to build on the NOvA experience
–Mu2e has some in-house people for GUI development, but I don't yet know if we have enough
–Need to be ready for the start of construction in FY2016
Conditions database
–Guess we will want to start discussions with Igor in late 2016 or early 2017
–Very likely can start from work done for other experiments

Needs from Service Areas (8)
Consulting
–G4: critical, mostly Krzysztof Genser
–art: critical; concerned that the art team may be shrinking
Production Operators
–They have been valuable and we plan to use them for the simulation campaign this summer
Discussion Forum
–Hypernews is up and running in a sandbox
–Waiting for Shibboleth to be installed so that we can open it up to offsite people; then testing starts for real
–Thanks for setting this up

TSW/EOP Status and Plans
Are your TSWs signed and up to date?
–To the best of my knowledge we have no TSWs or EOPs
If not, do they need revision?
–According to Program Planning, these need to be in draft form by the CD3c review (est. March 2016)
–They do not need to be signed until CD4 (Q1 FY21)

Future Directions (Challenges and R&D)
Will your SOPs change significantly in the future (new phase of the experiment, new running conditions, etc.)?
–The focus of the computing effort will switch to calibrations, alignment, and refinement of reconstruction algorithms
–Some simulations will continue
Are future R&D projects in the pipeline?
–None that I know of
Are additional resources going to be required to meet them?
–No

Additional Projects/Comments (1)
In 2015 the Mu2e Project contributed 0.5 FTE of funding to CS
–Will be 0 for 2016 and beyond
–But we still need work done by CS and we do not yet have an operations budget; we need to learn how this works
Event display help
–We have several event displays
–Each does a good job for a narrow purpose
–It would be a great help to have access to a graphics consultant to integrate and improve these
We need a good general programmer/scripter
–Not available from the collaboration, except for people who are more valuable elsewhere (too far from data)

Additional Projects/Comments (2)
SCD documentation is aimed at experts and is generally very good
–We greatly appreciate it
–It would be much easier to onboard new people if there were more material aimed at beginners
–For example: What is a grid? How does it work?

Schedule
[Gantt chart spanning FY15-FY22, reproduced from the Introduction talk: milestones CD-2/3b, CD-3c, KPPs Satisfied and Project Complete, with activities including PS/DS final design, superconductor fabrication and QA, PS/TS/DS fabrication, QA and installation, detector hall construction, detector construction, accelerator and beamline construction, cosmic ray system test, solenoid power/cryo hookup and commissioning, accelerator commissioning (off project), and 24 months of schedule float on the critical path.]