Physics Data Processing at NIKHEF
Jeff Templon
WAR, 7 May 2004

Goals
1. Realize an LHC physics computing infrastructure optimized for use by NIKHEF physicists
2. Where possible, apply the expertise gained under goal 1 to other projects with NIKHEF participation
3. Capitalize on expertise & available funds by participating in closely related EU & NL projects
4. Use NIKHEF Grid-computing expertise and capacity as currency

NIKHEF-Optimal LHC Computing Infrastructure
- Operation of an LCG core site
  - Build experience with site operation and uncover “external” site issues traditionally ignored by CERN
  - Leverage the front-runner position earned by our EDG effort
- Strong participation in LHC/LCG/HEP Grid framework projects
  - Meten is weten (“to measure is to know”) – premature optimization is the root of all evil (Knuth)
  - Leverage the front-runner position earned by our EDG effort
- Leading role in the Architecture/Design arm of EGEE
  - The AA (authentication & authorization) model is the fulcrum of the balance between “CERN-centric” and “really distributed” models
  - Use our accumulated “security” expertise to gain position in middleware design
- Preparation for a Tier-1
  - Avoids having others determine NIKHEF computing priorities

LHC/LCG/HEP Projects
- Strong coupling to NIKHEF LHC experiment analysis
  - One grad student per experiment, working with the ARDA project: early influence, experience, and expertise with LHC analysis frameworks
  - Room for more participation in the medium term (postdocs, staff)
- Continuing work on D0 reprocessing
  - The D0 metadata model is far more advanced than the LHC model
  - Influence US computing via our (LHC) task-distribution expertise
- Investigation of a distributed ATLAS Level-3 trigger
  - Precursor for LOFAR/Km3NeT activities

Preparation for Tier-1
- Tier-1 for the LHC
  - Archive ~1/7 of the raw data, all ESDs produced on site, all MC produced on site, full copies of AODs and tags
  - Contribute ~1/7 of the twice-yearly reprocessing power
- End result: major computing facility in the Watergraafsmeer
  - 1 petabyte each of disk cache & tape store per year, starting 2008
  - ~2000 CPUs in 2008
  - ~1.5 Gbit/s network to CERN
  - These numbers are per experiment
- NIKHEF contributes research; SARA eventually takes the lion’s share of operation
- NCF must underwrite this effort (MoU with CERN)
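
As a sanity check on these figures, a minimal back-of-envelope sketch (Python) compares the sustained capacity of the 1.5 Gbit/s link with the average tape-ingest rate implied by 1 PB/year per experiment; the 60% usable-link-fraction figure is an assumption, not a number from the slides.

```python
# Back-of-envelope check of the Tier-1 numbers above (illustrative only).

LINK_GBIT_S = 1.5          # network to CERN, per experiment
EFFICIENCY = 0.6           # assumed usable fraction of the raw link rate
TAPE_PB_PER_YEAR = 1.0     # tape-store growth per experiment per year

SECONDS_PER_DAY = 86_400
DAYS_PER_YEAR = 365

# Sustained transfer capacity of the link, in terabytes per day.
tb_per_day = LINK_GBIT_S * EFFICIENCY / 8 * SECONDS_PER_DAY / 1e3   # ~9.7 TB/day

# Average data volume the tape store must absorb over the year.
tape_tb_per_day = TAPE_PB_PER_YEAR * 1e3 / DAYS_PER_YEAR            # ~2.7 TB/day

print(f"link capacity : {tb_per_day:5.1f} TB/day")
print(f"tape ingest   : {tape_tb_per_day:5.1f} TB/day")
# The link comfortably covers steady archiving, but leaves limited
# headroom for reprocessing bursts and MC/AOD exchange.
```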

Overlap with Other NIKHEF Projects
- Other HEP experiments
  - D0 work since Q4 2003, continuing
  - BaBar project together with Miron Livny (Wisconsin)
- Astroparticle physics
  - A LOFAR SOC is much like an LHC Tier-1
  - Km3NeT on-demand repointing is much like the ATLAS Level-3 trigger
Jeff Templon – WAR, NIKHEF, EU & NL Projects u EGEE (EU FP6 project, 2+2 years, 30M) n Funding for site operation (together with SARA) n Funding for Grid Technology projects (together with UvA) n Funding for “generic applications” (read non-LHC) u BSIK/VL-E n Funding for Data-Intensive Science (everything we do) n Funding for Scaling and Validation (large-scale site operation) u Cooperation with other disciplines n Leverage multi-disciplinary use of our infrastructure into large NCF-funded facility (Tier-1)

Currency
- Advantages of Grid computing for external funding
- Grid computing (cycles & expertise) in exchange for membership fees

People
- LHC applications
  - Templon, Bos, “postdoc”, 3 grad students
- Non-LHC applications
  - Van Leeuwen (CT), Grijpink (CT), Bos, Templon, Groep
- Grid Technology
  - Groep, Koeroo (CT), Venekamp (CT), Steenbakkers (UvA), Templon
- Site Operations
  - Salomoni (CT), Groep, Templon, other CT support

People / Funding
- EGEE
  - 1 FTE Generic Apps, 1 FTE Site Operations, 1 FTE AA
- BSIK/VL-E
  - 1 FTE Scaling & Validation, 1 FTE Data-Intensive Sciences
- Both projects require 1-1 local matching (50% cost model)
- The two efforts can overlap by ±15%
- Possible additional money from bio-range project
- Possible to replace some manpower with equivalent equipment
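
A minimal tally of what the matching requirement implies (Python); reading “1-1 matching (50% cost model)” as one locally funded FTE per externally funded FTE is an assumption:

```python
# Illustrative tally of the funded effort listed above.

egee_fte = {"Generic Apps": 1, "Site Operations": 1, "AA": 1}
vle_fte = {"Scaling & Validation": 1, "Data-Intensive Sciences": 1}

external = sum(egee_fte.values()) + sum(vle_fte.values())  # 5 FTE externally funded
local_match = external                                     # 1-1 matching (assumed reading)
total = external + local_match                             # 10 FTE total effort

OVERLAP = 0.15  # the two projects' tasks may overlap by +/- 15%
print(f"externally funded : {external} FTE")
print(f"local matching    : {local_match} FTE")
print(f"total effort      : {total} FTE (overlap allowed: ±{OVERLAP:.0%})")
```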

Possible “Funding Model”