Quality Assurance Benchmark Datasets and Processing
David Nidever, SQuaRE Science Lead


Science Quality Assurance
We need to verify that the LSST pipeline software (“the stack”) produces data products that are good enough to do the science that LSST has promised. This is one of the essential goals of the Science Quality and Reliability Engineering (SQuaRE) team in Tucson (led by Frossie Economou and myself).

QA Approaches
There are several approaches to, or aspects of, this:
– Run quick tests on small datasets and regularly produce metrics that track the performance of the stack (i.e. regression testing).
– Run large precursor datasets through the stack “end-to-end” and manually inspect the results, treating them as if we were going to do science on them.
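The regression-testing idea in the first bullet can be illustrated with a minimal sketch: freshly measured metrics are compared against stored baselines and flagged when they drift beyond a tolerance. The metric names, baseline values, and tolerance below are invented for illustration; this is not the actual SQuaRE tooling.

```python
# Hypothetical baseline metrics for a small, fixed test dataset.
BASELINES = {
    "photometric_rms_mmag": 15.0,   # repeatability on bright stars
    "astrometric_rms_mas": 20.0,    # median positional scatter
    "source_count": 12000,          # detections in the test field
}
TOLERANCE = 0.10  # flag any metric that drifts by more than 10%

def check_regression(measured: dict) -> list:
    """Return (name, baseline, value, drift) for metrics beyond tolerance."""
    failures = []
    for name, baseline in BASELINES.items():
        value = measured[name]
        drift = abs(value - baseline) / baseline
        if drift > TOLERANCE:
            failures.append((name, baseline, value, drift))
    return failures

# Example run: one metric has regressed, two are within tolerance.
measured = {"photometric_rms_mmag": 17.5,
            "astrometric_rms_mas": 21.0,
            "source_count": 11800}
bad = check_regression(measured)
```

Running such a check on every stack build gives an early warning when a code change degrades photometry or astrometry, without waiting for a large end-to-end run.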

“End-to-End” Processing
We want to go “end-to-end” as much as possible to fully test the software: starting from raw data and running all the way through to calibrated source catalogs and alerts. If end-to-end processing can't be done, then at least we'll be able to figure out which portions of the stack are currently missing or need work. In some ways this is a “data challenge” for ourselves.

Benchmark Datasets
Want datasets to test the stack in the regimes of the four primary science drivers:
– Dark Energy / Dark Matter
– Solar System
– Transients
– Milky Way
Want datasets to test the two main categories of data products:
– Level 1, Alert Production (difference imaging, transients)
– Level 2, Data Release Production (deep stacks, “static” sky)

Benchmark Datasets
Want datasets taken in different environments (e.g. crowded/sparse regions, good/bad seeing, etc.). We need multiple datasets since no existing survey spans all these regimes the way LSST will. A group of DM scientists sat down on Monday and came up with an initial list of benchmark datasets that satisfies most of these needs, focused on DECam, CFHT and HSC.

Benchmark Datasets: DECam
COSMOS field, PI: Dey
– Deep (~26 AB mag), 3 sq. deg., ugrizY
– ~1500 images
– All public now
– Lots of existing datasets and catalogs for reference
Bulge survey, PI: Saha
– ugriz, 6 fields
– ~3500 shallow frames (~700 per field), ~3500 deep frames (~7000 per field)
– Crowded fields, looking for variables
– All public now

Benchmark Datasets
Solar system objects, PI: Allen
– Covering a large area, looking for solar system objects
– ~ s r-band images public so far
HiTS survey, PI: Forster
– Difference imaging, transients, SNe
– ~2800 images of 40 fields going public in 2015B (almost all g) and more in 2016B
– ~ s exposures, most g (~7000) and r (~1000)
– ~50 fields, ~30-40 epochs per field

Benchmark Datasets
SMASH survey, PI: Nidever
– Deep, ~24.5 mag, ugriz
– 180 fields, ~30 frames per field
– Magellanic Clouds and periphery
– Crowded and sparse fields
– Stripe82 data, a useful reference field with good astrometry and photometry from SDSS
– Eventually want to run the full dataset through the stack
The stack currently runs on calibrated DECam images, not yet on raw images. We need to implement DECam Instrumental Signature Removal (ISR); Slater/Chiang/Nidever are working on this.

Benchmark Datasets
CFHT Lensing Survey
– Deep, ~25 mag, ugriz, 154 sq. deg.
– Good for weak lensing
– To be run by our French colleagues at IN2P3 (Dominique Boutigny et al.)
– The stack is already set up to run on CFHT data
HSC COSMOS data
– Very good seeing, very deep (deeper than 10-yr LSST)
– Telescope/camera similar to LSST
– Won't be publicly available until March 2016
– The stack is already set up to run on HSC data

Benchmark Datasets
Simulated LSST data
– Still need to figure out what we want/need
– But important to test that the stack is ready for LSST data
This is a first draft of the benchmark datasets and it will likely grow/change. We want to keep it to a relatively small number of datasets.
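One way to keep the list small while still spanning every regime is to track coverage explicitly. The toy registry below tags a few datasets with the science drivers they exercise and reports any driver left uncovered; the tags are illustrative only, not a statement about what the real datasets cover.

```python
# The four primary science drivers named earlier in the talk.
DRIVERS = {"dark_energy", "solar_system", "transients", "milky_way"}

# Toy coverage tags (illustrative; the real datasets cover more regimes).
DATASETS = {
    "DECam-COSMOS": {"dark_energy"},
    "DECam-Bulge": {"milky_way"},
    "HiTS": {"transients"},
}

def uncovered(datasets=DATASETS, drivers=DRIVERS):
    """Return the science drivers not exercised by any dataset."""
    covered = set().union(*datasets.values())
    return drivers - covered

missing = uncovered()  # drivers still needing a benchmark dataset
```

A check like this makes it obvious when trimming the dataset list would leave a science driver, product level, or observing environment untested.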

Software Cross-Comparison
Almost all of these datasets will have separate reductions by the PI teams using non-LSST software. This allows for comparison of the LSST stack with other software suites (e.g. DAOPHOT, DoPHOT, SExtractor, etc.).
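Such cross-comparisons typically start with a positional crossmatch between the two catalogs. The toy example below matches sources within a radius using a small-angle, flat-sky approximation and computes the mean magnitude offset between reductions. The coordinates and magnitudes are made up, and a real comparison would use proper spherical matching (e.g. a KD-tree on the sky) over full catalogs.

```python
import math

MATCH_RADIUS_ARCSEC = 1.0

def crossmatch(cat_a, cat_b, radius=MATCH_RADIUS_ARCSEC):
    """Match each source in cat_a to the nearest cat_b source within
    `radius` arcsec; entries are (ra_deg, dec_deg, mag). Returns the
    magnitude differences (mag_a - mag_b) of the matched pairs."""
    diffs = []
    for ra_a, dec_a, mag_a in cat_a:
        best = None
        for ra_b, dec_b, mag_b in cat_b:
            # Flat-sky separation in arcsec, scaling RA by cos(dec).
            dra = (ra_a - ra_b) * math.cos(math.radians(dec_a))
            ddec = dec_a - dec_b
            sep = math.hypot(dra, ddec) * 3600.0
            if sep <= radius and (best is None or sep < best[0]):
                best = (sep, mag_a - mag_b)
        if best is not None:
            diffs.append(best[1])
    return diffs

# Made-up catalogs: a stack reduction vs. a PI-team reduction.
stack_cat = [(150.001, 2.2000, 21.30), (150.010, 2.2100, 19.85)]
pi_cat = [(150.001, 2.2001, 21.25), (150.010, 2.2101, 19.90),
          (150.500, 2.3000, 18.00)]  # last source has no counterpart
dmag = crossmatch(stack_cat, pi_cat)
offset = sum(dmag) / len(dmag)  # mean photometric offset between reductions
```

Systematic offsets or trends in `dmag` versus magnitude, color, or crowding are exactly the kind of diagnostic these PI-team reductions make possible.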

Benchmark Data Processing
The goal is to run a "significant" fraction of these data through the stack in the near future. We still need to converge on the number of frames for each dataset, but these will be large runs.

Post-Processing Analysis
After processing, we will inspect the outputs manually and compute quality checks/metrics. We will look to turn the manual inspection and checks into automated ones, and would like to make the results of the runs available to the community.
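The move from manual inspection to automated checks might look like the sketch below: each per-frame quality statistic is tested against a pass/fail limit, producing a report instead of a by-eye review. The statistic names and limits are made up for illustration and are not the actual LSST quality criteria.

```python
# Hypothetical per-frame quality limits (illustrative values).
QUALITY_LIMITS = {
    "seeing_arcsec": lambda v: v < 2.0,         # reject very poor seeing
    "sky_mag_per_arcsec2": lambda v: v > 19.0,  # reject very bright sky
    "n_detections": lambda v: v > 100,          # sanity check on depth
}

def qa_report(frames):
    """Return {frame_id: [names of failed checks]} for failing frames."""
    report = {}
    for frame_id, stats in frames.items():
        failed = [name for name, passes in QUALITY_LIMITS.items()
                  if not passes(stats[name])]
        if failed:
            report[frame_id] = failed
    return report

# Made-up statistics for two frames; the second has poor seeing.
frames = {
    "frame_0001": {"seeing_arcsec": 0.9, "sky_mag_per_arcsec2": 21.2,
                   "n_detections": 4500},
    "frame_0002": {"seeing_arcsec": 2.6, "sky_mag_per_arcsec2": 21.0,
                   "n_detections": 3800},
}
report = qa_report(frames)
```

Once checks like these are encoded, the same report can be generated for every benchmark run and published alongside the outputs.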

The End
