Slide 1: Planning for a Western Analysis Facility
Richard P. Mount

Slide 2: Evaluating US ATLAS Needs
- Data production tasks: the BNL Tier 1 (plus the Tier 2s) meets the pledged requirements.
- Simulation: could absorb all Tier 2 capacity and more; the physics impact of scaling back is not clear.
- Analysis: the Computing Model allocates 20% to 50% of Tier 2 capacity to analysis. Is this realistic quantitatively? Is it realistic qualitatively (data-intensive analysis)? Could Tier 3s provide what appears to be missing?

Slide 3: US ATLAS T1 + T2 in 2010 (excluding T3s)

Slide 4: BaBar and CDF (excluding "T3s")
- BaBar Tier As in 2006 (plus simulation, done mainly at universities)
- CDF Fermilab site in 2008 (thanks to Rick Snider)

Slide 5: Data-Intensive Analysis
1. BaBar (and Run II) experience demonstrated a vital role for major "Tier-A" computer centers specializing in high-throughput, data-intensive work.
2. Physics demands chaotic, apparently random access to data; this is a hardware/software/management challenge (see the sketch after this slide).
3. Achieving high availability with a complex data-intensive workload is hard!
4. Running simulation at major data-intensive facilities fails to exploit them fully (BaBar ran most simulation on university facilities).
5. Significant university (Tier 3) facilities have always been very important for final-stage analysis.
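As an aside, not from the slides: a minimal ROOT/C++ sketch of what "chaotic, apparently random access" looks like at the code level. The file and tree names are hypothetical placeholders.

```cpp
// Sketch of random-order event access with ROOT (C++).
// Assumptions: ROOT is installed; file and tree names are hypothetical.
#include "TFile.h"
#include "TTree.h"
#include <algorithm>
#include <numeric>
#include <random>
#include <vector>

void chaotic_access() {
  TFile *f = TFile::Open("aod_sample.root");   // hypothetical file
  if (!f || f->IsZombie()) return;
  TTree *t = nullptr;
  f->GetObject("CollectionTree", t);           // hypothetical tree name
  if (!t) { f->Close(); return; }

  // Visit every event in a shuffled order, much as a physics selection
  // effectively does: each GetEntry lands on an unpredictable part of
  // the file, so sequential readahead and simple caching help little.
  std::vector<Long64_t> order(t->GetEntries());
  std::iota(order.begin(), order.end(), 0);
  std::shuffle(order.begin(), order.end(), std::mt19937_64(1));
  for (Long64_t i : order) t->GetEntry(i);
  f->Close();
}
```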

Slide 6: SLAC Strengths
1. Pushing the envelope of data-intensive computing, e.g. Scalla/xrootd (in use at the SLAC T2)
2. Design and implementation of efficient and scalable computing systems (1000s of boxes)
3. Strongly supportive interactions with the university community (and 10 Gbit/s to Internet2)
Plus a successful ongoing computing operation:
- Multi-tiered, multi-petabyte storage
- ~10,000 CPU cores
- Space/power/cooling continuously evolving
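A parenthetical illustration, not from the slides: with xrootd the remote protocol hides behind the URL scheme, so a ROOT client reads a remotely hosted file much as it would a local one. The server host, file path, and tree name below are hypothetical.

```cpp
// Sketch of client-side access through Scalla/xrootd via ROOT (C++).
// Assumptions: ROOT built with xrootd support; host, path, and tree
// name are hypothetical placeholders.
#include "TFile.h"
#include "TTree.h"

void read_via_xrootd() {
  // TFile::Open dispatches on the URL scheme; "root://" selects the
  // xrootd protocol, so remote reads use the same API as local ones.
  TFile *f = TFile::Open("root://xrootd.example.org//atlas/aod/sample.root");
  if (!f || f->IsZombie()) return;
  TTree *t = nullptr;
  f->GetObject("CollectionTree", t);   // hypothetical tree name
  if (t) t->Print();
  f->Close();
}
```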

Slide 7: SLAC Laboratory Goals
1. Maintain, strengthen, and exploit the core competency in Data Intensive Computing.
2. Collaborate with universities, exploiting the complementary strengths of universities and SLAC.

Slide 8: ATLAS Western Analysis Facility Concept
1. Focus on data-intensive analysis at the scale of a major HEP computing center.
2. Flexible and nimble, to meet the challenge of rapidly evolving analysis needs.
3. Flexible and nimble, to meet the challenge of evolving technologies: a particular focus on the most effective role for solid-state storage (together with enhancements to data-access software).
4. Close collaboration with US ATLAS university groups: make the best possible use of SLAC-based and university-based facilities.
5. Coordinate with the ATLAS Analysis Support Centers.

Slide 9: ATLAS Western Analysis Facility, Possible Timeline
1. Today: interest from "Western" ATLAS university groups in co-locating ARRA-funded equipment at SLAC (under discussion).
2. [date lost in transcript]: tests of various types of solid-state storage in various ATLAS roles (conditions DB, TAG access, PROOF-based analysis, xrootd access to AOD, ...); collaborate with BNL and ANL (an illustrative benchmark sketch follows this slide).
3. [date lost in transcript]: re-evaluation of ATLAS analysis needs after experience with real data.
4. [date lost in transcript] on: WAF implementation as part of the overall US ATLAS strategy.
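To make item 2 concrete, here is the flavor of micro-benchmark such storage tests start from: small random reads are exactly where solid-state storage should beat spinning disk. This is an illustrative sketch, not the actual test plan; the path, block size, and read count are arbitrary assumptions, and a serious test would also bypass the OS page cache (e.g. with O_DIRECT).

```cpp
// Illustrative random-read latency probe (plain C++17, no ROOT).
// Assumptions: the file path, block size, and read count are arbitrary.
#include <chrono>
#include <cstdio>
#include <random>
#include <vector>

int main(int argc, char **argv) {
  const char *path = (argc > 1) ? argv[1] : "/data/test.bin";  // placeholder
  const long block = 4096;   // 4 KiB reads: latency-dominated on disk
  const int reads = 10000;

  std::FILE *f = std::fopen(path, "rb");
  if (!f) { std::perror("open"); return 1; }
  std::fseek(f, 0, SEEK_END);
  const long size = std::ftell(f);
  if (size <= block) { std::fclose(f); return 1; }

  std::vector<char> buf(block);
  std::mt19937_64 rng(42);
  std::uniform_int_distribution<long> off(0, size - block);

  const auto t0 = std::chrono::steady_clock::now();
  for (int i = 0; i < reads; ++i) {
    std::fseek(f, off(rng), SEEK_SET);   // random offset defeats readahead
    std::fread(buf.data(), 1, block, f);
  }
  const auto t1 = std::chrono::steady_clock::now();

  const double us = std::chrono::duration<double, std::micro>(t1 - t0).count();
  std::printf("%d random %ld-byte reads: %.1f us/read\n",
              reads, block, us / reads);
  std::fclose(f);
  return 0;
}
```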