LIGO-G020484-00-Z
NSF Review of LIGO Laboratory, 23 October 2002

The Penn State LIGO Data Analysis Center
Lee Samuel Finn, Penn State

Slide 2: Overview

"Greenfield" Tier 2 Center
  » Served by the LIGO Tier 1 Center
  » On par with LHO, LLO, MIT, UWM
  » Serves: LIGO
Goal: a "full service" analysis & development LIGO Data Analysis Center
  » Applications
    – Analysis, detector characterization, Monte Carlo, simulations
  » Software systems support
    – LDAS, DMT, other tools (e.g., Matlab, standalone C, C++, analysis tools)
Configuration: "clone" the LIGO Lab facility
  » Minimize resources, including sweat equity, spent adapting analysis tools to local customizations
  » Maximize inter-site operability and the resources available for supporting LIGO analysis activities
Status
  » Small (12-node) pathfinder system purchased to evaluate networking options and gain experience with LDAS configuration and operations
  » Approaching a decision point on hardware for DMT support

Slide 3: Resources

Goal: "clone" the LIGO Lab facility
  » Minimize resources and sweat equity spent on local customizations
  » Maximize inter-site operability and the resources available to support LIGO/LSC analysis activities
Hardware
  » iVDGL: $400K of hardware over 3 years
    – Proposal budget; the iVDGL Project Directorate is revisiting allocations and focus
  » PSU matching: +$150K
Personnel
  » iVDGL-funded (4 years)
    – 1 postdoc: hired (1 Nov start)
    – 50% sysadmin: center support (search under way)
  » PSU-funded to establish the Center (partnership with the HPC group)
    – 0.1 FTE Director, HPC group
    – 0.4 FTE Senior Research Programmer
    – 0.4 FTE Research Programmer
    – 1 FTE (CS/EE) graduate student (Globus/iVDGL focus)
    – 0.2 FTE web/administrative support

Slide 4: Reality: iVDGL funding insufficient for a full clone

Trade
  » Compatibility vs. cycles & storage vs. admin FTE cost, within the budget envelope
LDAS: developed and supported on SPARC/Solaris and Intel/Linux
  » Sun vs. Intel for LDAS software production (esp. database, diskcache API)
  » Single- vs. dual-processor nodes for the LDAS Beowulf cluster
DMT: open issue
  » Developed on SPARC/Solaris, runs on Intel/Linux; future platform support TBD
Pathfinder system for the trade study
  » Intel (procured)
    – Goal: evaluate networking and SMP options; gain experience with LDAS configuration and operations
      Network options: GigE vs. Fast Ethernet (see the probe sketch after this slide)
      Node options: uniprocessor vs. SMP
    – Configuration: 8 single-processor nodes, 4 dual-processor nodes
  » Sun (not procured)
    – DMT support?
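The GigE vs. Fast Ethernet question above comes down to measured point-to-point throughput between pathfinder nodes. A minimal sketch of the kind of probe such a trade study would run is given below in Python; the port number, transfer size, and two-role layout are illustrative assumptions, not details from the slides.

    # throughput_probe.py: crude TCP throughput test between two cluster nodes.
    # Run "python throughput_probe.py serve" on one node, then
    # "python throughput_probe.py <server-host>" on another.
    import socket
    import sys
    import time

    PORT = 5001        # arbitrary test port (assumption)
    CHUNK = 1 << 20    # 1 MiB per send
    TOTAL = 1 << 30    # move 1 GiB per trial

    def serve():
        # Accept one connection and drain whatever the client sends.
        with socket.create_server(("", PORT)) as srv:
            conn, _ = srv.accept()
            with conn:
                while conn.recv(CHUNK):
                    pass

    def probe(host):
        # Push TOTAL bytes and report the achieved rate (ignores final
        # kernel buffering, which is fine for a coarse comparison).
        buf = b"\0" * CHUNK
        sent = 0
        t0 = time.monotonic()
        with socket.create_connection((host, PORT)) as s:
            while sent < TOTAL:
                s.sendall(buf)
                sent += len(buf)
        dt = time.monotonic() - t0
        print(f"{sent / dt / 1e6:.1f} MB/s to {host}")

    if __name__ == "__main__":
        serve() if sys.argv[1] == "serve" else probe(sys.argv[1])

On full-duplex GigE such a probe should report on the order of 100 MB/s versus roughly 10 MB/s on Fast Ethernet; that order-of-magnitude gap, weighed against per-port cost, is what the trade study decides.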

Slide 5: Reality: iVDGL funding insufficient for Center support

Large-scale computing for production analysis requires dedicated, professional support.
What is required to support the Center's mission?
  » 1 FTE hardware systems administration
    – 70 nodes, 30 TB storage (6 months of the reduced data set; see the estimate below)
  » 1 FTE software systems administration
    – Maintain and support LDAS, DMT, the database, and other software systems and upgrades
    – Liaison with other Tier 2 and Tier 1 centers (data exchange, database federation, etc.)
  » 0.5-1 FTE at the Tier 1 center
    – User support / help desk / liaison / development
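The 30 TB figure is a buffer sized to hold 6 months of the reduced data set (RDS), which fixes the sustained ingest rate the hardware administrator must keep healthy. A back-of-envelope check, using only the numbers quoted above (the 30-day month is an assumption):

    # Sustained RDS ingest rate implied by "30 TB holds 6 months of RDS".
    TB = 1e12                   # decimal terabyte, in bytes
    capacity = 30 * TB          # quoted RDS buffer
    window = 6 * 30 * 86400     # ~6 months, in seconds (assumes 30-day months)
    rate = capacity / window
    print(f"~{rate / 1e6:.1f} MB/s sustained")  # prints ~1.9 MB/s

At roughly 2 MB/s sustained, the RDS stream itself is modest even by Fast Ethernet standards; the network trade on the previous slide is driven by intra-cluster analysis traffic rather than by raw data delivery.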

Slide 6: Summary

"Greenfield" Tier 2 Center
  » Goal: a "full service" analysis & development LIGO Data Analysis Center
Configuration: "clone" the LIGO Lab facility
  » Minimize resources, including sweat equity, spent adapting analysis tools to local customizations
  » Maximize inter-site operability and the resources available for supporting LIGO analysis activities
Status
  » Small (12-node) pathfinder system purchased to evaluate networking options and gain experience with LDAS configuration and operations
  » Approaching a decision point on hardware for DMT support
Reality: iVDGL funding is insufficient for Center support
  » Required: 2 FTE of IT professional support at PSU, plus 0.5-1 FTE at the Tier 1 center (LIGO/CIT)