Physics Data Processing at NIKHEF
Jeff Templon, WAR, 7 May 2004


Goals
1. Realize an LHC physics computing infrastructure optimized for use by NIKHEF physicists
2. Where possible, apply the expertise built up for goal 1 to other projects with NIKHEF participation
3. Capitalize on expertise and available funds by participating in closely related EU and NL projects
4. Use NIKHEF Grid-computing expertise and capacity as currency

NIKHEF-optimal LHC Computing Infrastructure
- Operation of an LCG core site
  - Build experience with site operation and uncover the "external" site issues traditionally ignored by CERN
  - Leverage the front-runner position earned by our EDG effort
- Strong participation in LHC/LCG/HEP Grid framework projects
  - "Meten is weten" (measuring is knowing); premature optimization is the root of all evil (Knuth)
  - Leverage the front-runner position earned by the EDG effort
- Leading role in the Architecture/Design arm of EGEE
  - The AA (authentication and authorization) model is the fulcrum of the balance between the "CERN-centric" and "really distributed" models
  - Use accumulated security expertise to gain a position in middleware design
- Preparation for Tier-1
  - Avoids having others determine NIKHEF computing priorities

LHC/LCG/HEP Projects
- Strong coupling to NIKHEF LHC experiment analysis
  - One grad student per experiment, working with the ARDA project; early influence, experience, and expertise with LHC analysis frameworks
  - Room for more participation in the medium term (postdocs, staff)
- Continuing work on D0 reprocessing
  - The D0 metadata model is far more advanced than the LHC model
  - Influence US computing via our (LHC) task-distribution expertise
- Investigations of a distributed ATLAS Level-3 trigger
  - Precursor for LOFAR/Km3NeT activities

Preparation for Tier-1
- Tier-1 role for the LHC
  - Archive ~1/7 of the raw data, all ESDs produced on site, all MC produced on site, plus full copies of the AODs and tags
  - Contribute ~1/7 of the twice-yearly reprocessing power
- End result: a major computing facility in the Watergraafsmeer
  - 1 petabyte each of disk cache and tape store added per year, starting in 2008
  - ~2000 CPUs in 2008
  - ~1.5 Gbit/s network to CERN
  - These numbers are per experiment (a rough arithmetic sketch follows below)
- NIKHEF contributes the research; SARA eventually takes the lion's share of the operation
- NCF must underwrite this effort (MoU with CERN)
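
A quick sanity check of these numbers, as a minimal Python sketch. Only the per-experiment figures quoted on the slide (1 PB of disk and 1 PB of tape per year, ~2000 CPUs, ~1.5 Gbit/s to CERN) are taken as input; the number of experiments served, the 70% link efficiency, the decimal-petabyte convention, and all helper names are illustrative assumptions, not slide content.

```python
# Back-of-envelope check of the Tier-1 figures quoted on the slide above.
# Only the per-experiment numbers come from the slide; the number of
# experiments served and the link efficiency are illustrative assumptions.

PB = 1e15          # bytes per petabyte (decimal convention, assumed)
GBIT = 1e9         # bits per second per Gbit/s

DISK_PER_EXP_PER_YEAR = 1 * PB    # disk cache growth per experiment per year
TAPE_PER_EXP_PER_YEAR = 1 * PB    # tape store growth per experiment per year
CPUS_PER_EXP = 2000               # CPUs per experiment in 2008
LINK_TO_CERN_BPS = 1.5 * GBIT     # quoted network capacity to CERN


def facility_totals(n_experiments):
    """Aggregate yearly growth for an assumed number of supported experiments."""
    return {
        "disk_PB_per_year": n_experiments * DISK_PER_EXP_PER_YEAR / PB,
        "tape_PB_per_year": n_experiments * TAPE_PER_EXP_PER_YEAR / PB,
        "cpus": n_experiments * CPUS_PER_EXP,
    }


def days_to_transfer(n_bytes, link_bps, efficiency=0.7):
    """Days needed to move n_bytes over a link running at the given efficiency.

    The 70% efficiency is an illustrative assumption, not a slide figure.
    """
    seconds = n_bytes * 8 / (link_bps * efficiency)
    return seconds / 86400


if __name__ == "__main__":
    for n in (1, 3):  # one experiment vs. an assumed three LHC experiments
        print(f"{n} experiment(s): {facility_totals(n)}")
    days = days_to_transfer(1 * PB, LINK_TO_CERN_BPS)
    print(f"Moving 1 PB over 1.5 Gbit/s at 70% efficiency takes about {days:.0f} days")
```

At roughly three months per petabyte on the quoted link, the raw-data archive has to flow in more or less continuously over the year rather than in short bursts, which is consistent with the per-year growth figures above.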

Overlap with other NIKHEF projects
- Other HEP experiments
  - D0 work since Q4 2003, continuing
  - BaBar project together with Miron Livny (Wisconsin)
- Astroparticle physics
  - The LOFAR SOC looks much like an LHC Tier-1
  - Km3NeT on-demand repointing looks much like the ATLAS Level-3 trigger

Jeff Templon – WAR, NIKHEF, EU & NL Projects u EGEE (EU FP6 project, 2+2 years, 30M) n Funding for site operation (together with SARA) n Funding for Grid Technology projects (together with UvA) n Funding for “generic applications” (read non-LHC) u BSIK/VL-E n Funding for Data-Intensive Science (everything we do) n Funding for Scaling and Validation (large-scale site operation) u Cooperation with other disciplines n Leverage multi-disciplinary use of our infrastructure into large NCF-funded facility (Tier-1)

Currency
- Advantages of Grid computing for external funding
- Grid computing (cycles & expertise) in exchange for membership fees

People
- LHC applications
  - Templon, Bos, "postdoc", 3 grad students
- Non-LHC applications
  - Van Leeuwen (CT), Grijpink (CT), Bos, Templon, Groep
- Grid Technology
  - Groep, Koeroo (CT), Venekamp (CT), Steenbakkers (UvA), Templon
- Site Operations
  - Salomoni (CT), Groep, Templon, other CT support

People / Funding
- EGEE
  - 1 FTE Generic Apps, 1 FTE Site Operations, 1 FTE AA
- BSIK/VL-E
  - 1 FTE Scaling & Validation, 1 FTE Data-Intensive Sciences
- Both projects require 1-to-1 local matching (50% cost model); see the sketch after this slide
- The two projects can overlap by roughly ±15%
- Possible additional money from a bio-range project
- Possible to replace some manpower with equivalent equipment
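
As a small illustration of the matching arithmetic on this slide, the sketch below totals the externally funded FTEs listed above and applies the 1-to-1 local matching of the 50% cost model. Reading the "±15%" overlap as a band on the total effort is an assumption (the slide does not define it), and the variable names are invented for the example.

```python
# Illustrative reading of the matching arithmetic on the slide above.
# The FTE counts come from the slide; the interpretation of the "+-15%"
# overlap as a band on the total is an assumption.

EGEE_FTE = {"Generic Apps": 1.0, "Site Operations": 1.0, "AA": 1.0}
VLE_FTE = {"Scaling & Validation": 1.0, "Data-Intensive Sciences": 1.0}

external = sum(EGEE_FTE.values()) + sum(VLE_FTE.values())

# 50% cost model: every externally funded FTE needs one locally funded
# matching FTE, so the total effort is twice the external contribution.
matching = external
total = external + matching

overlap = 0.15
low, high = total * (1 - overlap), total * (1 + overlap)

print(f"Externally funded effort: {external:.1f} FTE")
print(f"Required local matching:  {matching:.1f} FTE")
print(f"Total effort: {total:.1f} FTE "
      f"(about {low:.1f} to {high:.1f} with the +-15% overlap)")
```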

Possible "Funding Model"