Computing requirements for the experiments Akiya Miyamoto, KEK 31 August 2015 Mini-Workshop on ILC Infrastructure and CFS for Physics and KEK.


Based on a draft document.

ILC computing overview

[Diagram: the scope of ILC computing, spanning ILD/SiD data processing and analysis (the subject of this talk), detector operation, accelerator operation, and the computing infrastructure (network, security, mail, servers, operation, maintenance, ...), shared between the ILC Lab. and the collaborating institutes.]

Computing concepts (assumptions)

Three computing elements: at the IP, at the Main Campus, and GRID-based global computing.
- The ILC lab. provides the essential on-site computing resources for ILD and SiD data processing.
- Distributed computing (GRID) is used for reconstruction and MC production, with the resources provided by the collaborations.

IP computing
- A capacity to store raw data in case of a severe network outage.
- A computing resource for data monitoring.
- The required resources will be small and will be provided by the experiments.

Raw data size (estimated in the TDR)
- The data size is dominated by beam backgrounds (pairs). In the ILD case, VTX : Forward Cal : Others ~ 1 : 1 : 1; SiD is similar.
- The raw data rate at 500 GeV is ~1 GB/sec, giving ~7 PB/year for ILD with push-pull (see the sketch below).
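The annual volume follows directly from the rate. A minimal back-of-envelope sketch in Python, assuming ~7e6 s of effective live time per year with push-pull; the live time is an assumption chosen for consistency with the quoted figures, not a number from the slides:

```python
# Back-of-envelope check of the TDR raw-data estimate.
# ASSUMPTION: ~7e6 s of effective live time per year (push-pull sharing
# reduces the canonical 1e7-s HEP year); not a value from the slides.

raw_rate_gb_per_s = 1.0   # raw data rate at 500 GeV, from the TDR (~1 GB/sec)
live_time_s = 7.0e6       # assumed effective live seconds per year

annual_volume_pb = raw_rate_gb_per_s * live_time_s / 1.0e6  # 1 PB = 1e6 GB
print(f"Annual raw-data volume: ~{annual_volume_pb:.0f} PB/year")  # ~7 PB/year
```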

Data processing model

- Raw data: all data from the detector, kept in two copies: one at the Main Campus and one at a site other than the ILC Lab.
- Main Campus: prompt reconstruction for event filtering.
- GRID computing: detailed reconstruction and MC production.

A rough storage tally implied by this model is sketched below.
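To make the flow concrete, a hypothetical tally of the storage it implies; the 7 PB/year raw volume and the 2% filter fraction come from the surrounding slides, while treating the filtered sample as the only data shipped to the GRID is an illustrative simplification:

```python
# Hypothetical storage tally implied by the processing model.
# The 7 PB/year raw volume (TDR) and the 2% filter fraction are taken
# from the slides; everything else is an illustrative simplification.

annual_raw_pb = 7.0    # ~7 PB/year raw data for ILD (TDR estimate)
raw_copies = 2         # one copy at the Main Campus, one outside the ILC Lab.
filter_pass = 0.02     # assumed fraction of BXs passing the prompt filter

storage = {
    "raw data (2 copies)": annual_raw_pb * raw_copies,
    "filtered sample to GRID": annual_raw_pb * filter_pass,
}
for stage, size_pb in storage.items():
    print(f"{stage:>24s}: {size_pb:6.2f} PB/year")
```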

Reconstruction: Storage and CPU

Storage and CPU requirements were estimated from the samples produced for the DBD and Snowmass studies (500, 350 and 250 GeV).

Trigger-less readout is a distinctive ILC feature. Software-based event filtering and pulse-to-bunch decomposition are new challenges, and the filtering efficiency and purity are not yet known for certain.
- From the total cross sections used for the DBD/Snowmass studies, about 1% of bunch crossings (BXs) will contain meaningful events.
- MC CPU: Rec/Sim ~ 0.1; MC data sizes: Sim/Rec/DST ~ 1/1/0.02.

We assume (essentially):
- 2% of BXs will pass the filter.
- 20% of the MC simulation CPU is needed to process ALL BXs for prompt reconstruction.
- MC statistics of 10x the signal samples.
- These resources are needed during data taking (see the sketch after this list).
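A minimal sketch turning these ratios into a relative CPU budget. Only the ratios (Rec/Sim ~ 0.1, 2% filter pass, 20% of sim CPU for prompt reconstruction, 10x MC statistics) come from the slide; the normalization and bookkeeping are illustrative assumptions:

```python
# Relative CPU budget from the slide's ratios. Units are arbitrary:
# everything is expressed relative to full simulation of ALL BXs = 1.

sim_cpu_all_bx = 1.00   # full simulation of ALL BXs (reference unit)
rec_over_sim = 0.10     # Rec/Sim ~ 0.1 (slide)
filter_pass = 0.02      # assumed fraction of BXs passing the filter
mc_multiplier = 10      # MC statistics = 10x the signal sample

prompt_rec = 0.20 * sim_cpu_all_bx           # prompt rec of ALL BXs ~ 20% of sim CPU
detailed_rec = rec_over_sim * filter_pass    # detailed rec of filtered BXs only
mc_production = mc_multiplier * filter_pass * (1.0 + rec_over_sim)  # sim + rec of MC

for name, cpu in [("prompt reconstruction", prompt_rec),
                  ("detailed reconstruction", detailed_rec),
                  ("MC production (10x signal)", mc_production)]:
    print(f"{name:>28s}: {cpu:.3f} (units of sim CPU for all BXs)")
```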

Requirements depend on the run scenario

[Table: ILD data rate under the H-20 run scenario, by Energy (GeV), #Pulses/sec, and Data rate (GB/sec).]

Annual luminosity and CPU

[Figure: annual luminosity and the corresponding CPU requirement per year; for scale, KEKCC provides ~4k cores (prompt reconstruction).]

Accumulated luminosity and data size

[Figure: accumulated luminosity and the corresponding data size over the run plan.]

CPU and storage needs for ILD+SiD

[Figure: total CPU and storage needs for ILD+SiD.]

Comparison with other experiments

The requirements for ILC computing will not be as large as those of the HL-LHC, and will be below those of Belle II.

Refs.:
3) Takanori Hara, Computing for Belle II, 3 March 2014.
4) Ian Bird, WLCG Workshop, Copenhagen, 12th November.

Support by the ILC laboratory

Conclusion

An initial attempt has been made to estimate the CPU and storage requirements for ILD and SiD. Large CPU and storage resources will be necessary, but not as large as for Belle II and the LHC. The requirements for prompt reconstruction remain ambiguous, and a serious study is awaited.