PRP Astronomy & Astrophysics Panel

Slide 1: PRP Astronomy & Astrophysics Panel
Moderator: Mike Norman, SDSC
Panelists:
- Peter Nugent (LBNL): optical telescope surveys
- Brian Keating (UCSD): CMB polarization surveys
- Sharon Brunett (Caltech): Advanced LIGO

Slide 2: PRP Astronomy & Astrophysics Panel
Data:
- Data path
- Data volume and rates over time
- Data access and archiving over time
Processing:
- Workflow: what computers, and where?
- Size of data caches near processing
- Data products distribution
Collaboration:
- How will the network be used to enable collaboration?
- SDSC Comet: OSG, XRootD integration

Slide 3: POLARBEAR