ATLAS Users and SLAC
Jason Nielsen
Santa Cruz Institute for Particle Physics, University of California, Santa Cruz
SLAC Users Organization Meeting, 7 February 2008

Main Topics for Discussion
Sizable number of regional ATLAS users, coordinated through the US ATLAS Collaboration
Users are all active in global ATLAS activities, but the computing model fosters regional collaboration on common physics topics
Computing support and leadership for the Tier-2 center, data handling, and an interactive computing pool
Physics analysis collaboration
ATLAS detector upgrade R&D

ATLAS Experiment at the LHC
We have no new official schedule, but expect to close up the detector in May to be ready for beam injection in June
Probe of "terascale" physics with 7+7 TeV proton beams; initial luminosity well below the design luminosity of 10^34 cm^-2 s^-1

Guess at New Physics Timeline

Regional US ATLAS Collaborators
University of Washington
University of Oregon
Lawrence Berkeley National Laboratory
Stanford Linear Accelerator Center
Santa Cruz Institute for Particle Physics
California State University, Fresno
University of California, Irvine
University of Arizona
University of Wisconsin - Madison
+ other users who are SLAC residents
Total: 250+ collaborators in the CERN database

ATLAS Computing Model
Uses the Grid framework for distributed computing: Tier 0 (CERN), Tier 1 (10 worldwide), Tier 2 (5 in the US), Tier 3
Data products move down the hierarchy from RAW/ESD/AOD to AOD to DPD, with tasks ranging from data processing with full calibration and MC production to dataset access and interactive analysis (fits, plots) (diagram after A. Farbin)
The Final Dress Rehearsal of the data chain is in progress this month
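As a rough illustration (not an official ATLAS configuration), the sketch below encodes the tier hierarchy and the data products and tasks named on the slide's diagram as a small Python structure; the exact assignment of tasks to tiers follows the usual reading of the model and is an assumption here.

```python
# Illustrative sketch of the ATLAS computing model tiers.
# The assignment of data products and tasks to tiers paraphrases the
# slide's diagram (after A. Farbin) and is an assumption, not an
# official ATLAS configuration.
TIERS = [
    ("Tier 0 (CERN)",         ["RAW", "ESD", "AOD"], "data processing with full calibration"),
    ("Tier 1 (10 worldwide)", ["RAW", "ESD", "AOD"], "reprocessing and custodial storage"),
    ("Tier 2 (5 in the US)",  ["AOD"],               "MC production and dataset access"),
    ("Tier 3 (institutes)",   ["DPD"],               "interactive analysis: fits, plots"),
]

for name, products, tasks in TIERS:
    print("%-22s %-14s %s" % (name, "/".join(products), tasks))
```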

Western Tier 2 Center
WT2 at SLAC is a "super Tier-2" with some nodes set aside for Tier-3-like interactive user analysis
Will eventually scale to O(1000) CPUs and O(PB) of disk
Currently plenty of interactive CPU for early users; we hope this remains true as more users arrive
xrootd, PROOF, and other data-handling expertise aids not only the WT2 but also other ATLAS installations
There is pressure to have the WT2 fully operational in 2009
– Useful to begin prototyping analysis frameworks in 2008
– Could run, assuming the ATLAS upgrade, until the 2020s(?)

Data Access
Network connectivity challenges point to national lab involvement
Leverage expertise gained on BaBar
– The computing model has evolved toward BaBar's!
xrootd provides distributed access to user ntuples
Some user-based work on PROOF for parallel analysis jobs
Users are working with the WT2 board to make sure analysis datasets are subscribed
Recently began user analysis meetings for FDR (Final Dress Rehearsal) work
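To give a flavor of what "xrootd provides distributed access to user ntuples" looks like in practice, here is a minimal PyROOT sketch; the redirector host, file paths, and tree name are hypothetical placeholders rather than actual WT2 endpoints.

```python
# Minimal PyROOT sketch: reading user ntuples served over xrootd.
# The redirector host, paths, and tree name below are hypothetical.
import ROOT

url = "root://xrootd.example.edu//atlas/user/ntuples/fdr_sample_0.root"
f = ROOT.TFile.Open(url)          # TFile::Open handles root:// URLs transparently
if not f or f.IsZombie():
    raise RuntimeError("could not open %s" % url)

tree = f.Get("CollectionTree")    # tree name is an assumption
if not tree:
    raise RuntimeError("tree not found in %s" % url)
print("entries in one file: %d" % tree.GetEntries())

# Several remote files can be combined into one analysis job with a TChain.
chain = ROOT.TChain("CollectionTree")
for i in range(3):
    chain.Add("root://xrootd.example.edu//atlas/user/ntuples/fdr_sample_%d.root" % i)
print("entries in chain: %d" % chain.GetEntries())
```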

User Computing at SLAC
Big advantage to users who have reliable access to all datasets at the Tier-2 centers
Even more important to have local interactive resources with access to those datasets
– Available CPU for reducing AOD datasets
– Disk space for storing Derived Physics Data (DPD) ntuples
Estimate O(10) CPUs per user
– One option: a Tier 3 on every campus -- currently none in our region
SLAC ATLAS users are excited about access to interactive nodes as part of the WT2 center
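A back-of-envelope sizing sketch along the lines of the estimate above; only the O(10) CPUs-per-user figure comes from the slide, while the number of active users and the per-user DPD disk are purely illustrative assumptions.

```python
# Rough interactive-pool sizing sketch.
# Only the ~10 CPUs/user figure comes from the slide; the user count and
# per-user disk numbers are illustrative assumptions.
cpus_per_user = 10           # O(10) CPUs per active user (from the slide)
n_active_users = 40          # assumed simultaneously active analyzers
dpd_disk_per_user_tb = 1.0   # assumed disk for DPD ntuples per user, in TB

total_cpus = cpus_per_user * n_active_users
total_disk_tb = dpd_disk_per_user_tb * n_active_users

print("interactive CPUs needed: ~%d" % total_cpus)
print("DPD disk needed: ~%.0f TB" % total_disk_tb)
```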

Pooling "Local" Tier-3 Resources
Tier 3 is not currently part of the ATLAS computing budget -- any contributions come from institutions
Must have an excellent data connection to the Tier 2
Must have infrastructure for the nodes
– These nodes are probably not used 100% of the time
Must have computing support for OS/software and data center operations
Any possibility of hosting future Tier-3 contributions at SLAC for economy of scale?
– Similar to the model of CDF at Fermilab and ATLAS at NERSC

Physics Analysis
The West Coast Analysis Support Center at LBNL provides analysis software support and training, and works with SLAC to host and train WT2 users
SLAC has been successful in forming small physics working groups with users (JetMET, hadronic final state)
– Reconstruction experts in US ATLAS are most often national lab staff or research physicists
Excellent partnership with the theory group on W/Z+jets, jet algorithms, SUSY, and generators
– The West Coast LHC Theory Network attracts SLAC users from university theory groups
SLAC ATLAS physics forum for presenting ongoing work in an informal atmosphere

Targeted Topical Workshops
Bring together parties interested in specific topics for 3-5 days of intense work
– Very good feedback from users who attend
Hadronic Final State Analysis Forum (Jan 2008)
Tier 2/3 Workshop (Nov 2007)
Physics Workshop of the Americas (Aug 2007)
Physics Analysis Retreat (Mar 2007)
Fast Shower Simulation Workshop (Nov 2006)
Also overlap with West Coast theory meetings

Future of Remote Users
During current commissioning, all available hands are at CERN; this will not necessarily be the case in the future
If budgets allow, postdocs and senior grad students are at CERN instead of on campus
Faculty and junior grad students can benefit from local meetings and collaborative workshops
– People with visa issues also fall into this camp
Possibility of remote monitoring shifts
– Looking into a remote monitoring station at SLAC
– Already done for CMS at Fermilab

Local Computing Support Experts
Not so much core software questions (covered by LBNL) as specific questions about tuning jobs and data access for the WT2
May be a different expert from the WT2 support team, but necessary to streamline operations and avoid wasting CPU
Could overlap with leadership in physics reconstruction objects (jets, tracking, etc.)
– Respond to queries like "What job options and data files do I need to get the most recent jet energy calibration?"

ATLAS Upgrade R&D
Many Western U.S. institutes have interests in R&D toward major upgrades in 2015
Silicon tracker (pixels & strips), trigger upgrades, forward detectors
SLAC is part of the LHC Accelerator Research Program (LARP)
Expertise at SLAC exists as an extension of the collider program

Engineering Resources
Often a challenge for university groups to maintain these resources over long time periods
SLAC engineers have mechanical and electrical experience from many recent projects (SLD, BaBar, GLAST, etc.)
Universities are already consulting with SLAC personnel on some projects (DAT, Pixels)

Additional Resources
Any microelectronics fabrication facilities would be put to good use for prototyping and testing components
Simulation experts are needed (GEANT4), along with experts on developing tracking algorithms in a high-multiplicity environment
Tracker production will involve high-throughput testing and construction
– Would require infrastructure investment: probing stations, assembly robotics

Summary
ATLAS users are preparing for first data and already pursuing detector upgrade R&D
The Western Tier-2 is a huge contribution from SLAC that is necessary to analyze data efficiently
Users are very keen on interactive access at the WT2 and on understanding how best to use the facilities
Physics analysis collaboration is clustering around seeds of interest -- the SLAC group plays a big role
Upgrade R&D will benefit greatly from the experience of SLAC personnel