Slide 1: ISR for LSST
Simon Krughoff, LSST Staff Scientist, University of Washington
Core Data Processing Software Plan Review, September 19-20, 2013

Slide 2: Addressing the Charge

1. Have the necessary astronomical data processing software components needed to produce the LSST data products been identified and planned for?
2. Do credible plans exist to achieve readiness of those components in time for start of survey operations?
6. Are the plans for image processing and source characterization on single visits in agreement with the data product requirements and technically feasible?

Slide 3: Outline

- Steps of instrument signature removal (ISR)
- Nightly vs. data release production (DRP) processing

Slide 4: Introduction

Instrument signature removal (ISR) is the first step in any analysis of the imagery, and it sets the systematic error floor for subsequent measurements. The data management (DM) subsystem has been allocated 3 mmag of systematic error in the ISR process (Docushare-8123). The application of the corrections is algorithmically very straightforward, so it will add essentially nothing to the error budget; measurement errors in the calibration products are the source of systematic error. Creation of calibration products was reviewed in the data products definition review.

Slide 5: General ISR steps

1. Mask bad pixels
2. Perform cross-talk correction (*)
3. Bias correction (+)
   - This can be in two steps: an overscan correction and subtraction of a master bias
4. Correct for non-linearity in the detectors (*)
5. Trim and assemble
6. Flat field correction (*)
7. Fringe correction (+)
8. Cosmic ray rejection (not strictly ISR)

Slide 6: General ISR steps

The full correction can be summarized as:

\[ S = \frac{g \, L\!\left(R - B - \sum_i a_i C_i\right)}{F} - Fr \]

where:
- S = corrected science frame
- g = gain
- R = raw science frame
- B = master bias
- a_i = crosstalk coefficient for the i-th offending segment
- C_i = bias-subtracted pixel data in the i-th offending segment
- L = linearity correction
- F = illumination-corrected flat
- Fr = fitted fringe contribution
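
As a concrete illustration, here is a minimal NumPy sketch of the combined correction above; the function and argument names are illustrative assumptions, not the LSST stack API, and the masking, trim/assemble, and CR-rejection steps are omitted.

```python
import numpy as np

def apply_isr(raw, bias, flat, fringe, gain, xtalk_coeffs, aggressors,
              linearity):
    """Apply the combined correction S = g * L(R - B - sum_i a_i C_i) / F - Fr.

    All image inputs are 2D NumPy arrays on the same pixel grid;
    `linearity` is a callable implementing the linearity correction L.
    """
    corrected = raw - bias
    # Subtract each aggressor segment's bias-subtracted data C_i, scaled by
    # its measured crosstalk coefficient a_i.
    for a_i, c_i in zip(xtalk_coeffs, aggressors):
        corrected = corrected - a_i * c_i
    # Linearize, convert to electrons, and flat-field.
    corrected = gain * linearity(corrected) / flat
    # Remove the fitted fringe contribution.
    return corrected - fringe
```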

Slide 7: Bias subtraction

Cause: The CCD electronics add a roughly constant pedestal to the counts in each pixel. This bias can have pixel-to-pixel structure, so it is handled in two parts.

Correction:
1. Overscan correction: the overscan columns are averaged, fit with a 1D function, and the fit is subtracted row by row. This accounts for the time-variable bias level, B(t).
2. A master bias frame, created from a set of overscan-corrected zero-length exposures, is subtracted from the image to correct for 2D structure in the bias level, b(x,y).
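
A minimal sketch of the two-part correction, assuming NumPy arrays; the polynomial overscan fit and the function names are illustrative choices, not the project's actual implementation.

```python
import numpy as np

def overscan_correct(image, overscan, deg=3):
    """Remove the time-variable bias level B(t), row by row.

    `image` is the science region and `overscan` the overscan region of the
    same segment; both have the same number of rows. The per-row mean of the
    overscan columns is fit with a 1D polynomial and subtracted.
    """
    rows = np.arange(image.shape[0])
    level = overscan.mean(axis=1)                     # average overscan columns
    coeffs = np.polynomial.polynomial.polyfit(rows, level, deg)
    fit = np.polynomial.polynomial.polyval(rows, coeffs)
    return image - fit[:, np.newaxis]                 # subtract row by row

def bias_correct(image, overscan, master_bias):
    """Overscan-correct, then subtract the 2D bias structure b(x, y)."""
    return overscan_correct(image, overscan) - master_bias
```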

Slide 8: Cross-talk correction

Cause: Cross talk is due to the interaction of fields produced by currents in physically proximate electronic components. The result is an imprint of other segments on the raw pixel data.

Correction: The correction will consist of subtracting from each victim segment the pixel data of each aggressor segment, scaled by a measured coefficient.
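
Since slide 20 shows a crosstalk matrix for a single chip, the correction can be expressed compactly as a matrix operation. This sketch assumes all segments of a chip are stacked into one array; the names are illustrative.

```python
import numpy as np

def crosstalk_correct(segments, matrix):
    """Correct every segment of a chip at once.

    `segments` has shape (nseg, ny, nx) and holds the bias-subtracted pixel
    data C_i; `matrix[v, a]` is the coefficient coupling aggressor `a` into
    victim `v` (the diagonal is zero). Each victim segment has the scaled
    aggressor signals subtracted from it.
    """
    return segments - np.tensordot(matrix, segments, axes=1)
```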

Slide 9: Correcting for non-linearity

Cause: CCDs do not have perfectly linear response. Near both empty and full well, the response can become non-linear.

Correction:
1. ISR is supplied with a measurement of the linearity of the CCD response, along with any temperature dependence.
2. Data values are replaced, using a lookup table or functional form, to correct the response to be linear.
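
A minimal sketch of the lookup-table variant, assuming the supplied calibration is a pair of matched arrays; the interface is an illustrative assumption.

```python
import numpy as np

def linearize(image, measured_adu, true_adu):
    """Lookup-table linearity correction.

    `measured_adu` and `true_adu` are matched 1D arrays from the supplied
    linearity calibration; pixel values between table entries are linearly
    interpolated. A temperature-dependent correction would select or adjust
    the table before this step.
    """
    return np.interp(image, measured_adu, true_adu)
```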

Slide 10: Flat field

Cause: Flat fielding puts the flux seen by all pixels on the same system. This accounts for pixel-to-pixel variation in quantum efficiency, as well as vignetting of the system and obscuration caused by dust on the optical elements.

Correction: ISR will be supplied with a broad-band flat that assumes a flat spectrum for all sources (corrections will be applied downstream on a source-by-source basis, assuming an SED for each source). The correction is simply a division by the normalized master flat.
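
The corresponding operation is a one-liner; this sketch normalizes the master flat to unit median, an illustrative choice of normalization.

```python
import numpy as np

def flat_correct(image, master_flat):
    """Divide by the master flat, normalized to unit median.

    Assumes the supplied broad-band flat; SED-dependent corrections are
    applied downstream on a per-source basis.
    """
    return image / (master_flat / np.median(master_flat))
```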

Slide 11: Fringing

Cause: Fringe patterns are an interference effect that results from the sharp emission lines in the night-sky spectrum. This effect is strongest in redder bands.

Correction:
1. Fringe patterns are measured at the wavelengths of important sky lines.
2. A linear combination of the measured fringe patterns is fit to the data, using a minimization technique, for each flat-fielded chip.
3. The fitted fringe pattern is subtracted from the image.
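
Linear least squares is one such minimization technique. This sketch fits the pattern amplitudes for one chip; as a simplification it fits all pixels directly, whereas in practice one would mask sources and work on background-subtracted data.

```python
import numpy as np

def fit_fringe(image, patterns):
    """Fit a linear combination of measured fringe patterns to a chip.

    `patterns` has shape (npat, ny, nx). Returns the fitted fringe
    contribution Fr, to be subtracted from the flat-fielded image.
    """
    npat = patterns.shape[0]
    design = patterns.reshape(npat, -1).T            # (npix, npat)
    amps, *_ = np.linalg.lstsq(design, image.ravel(), rcond=None)
    return np.tensordot(amps, patterns, axes=1)      # sum of scaled patterns
```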

Slide 12: CR rejection

We will take exposures in pairs separated by the readout time. Using a variant of the algorithm originally proposed by E. J. Groth in 1992, the two images and the expected statistics on those images are used to reject pixels that are significant outliers. Once cosmic rays are flagged, the two splits can be combined to produce an image with an effective exposure time of twice the individual splits.
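
A simplified outlier test in the spirit of that approach (this is not the Groth algorithm itself; the noise model and threshold are illustrative assumptions):

```python
import numpy as np

def flag_cosmic_rays(split1, split2, gain, read_noise, nsigma=5.0):
    """Flag cosmic-ray pixels by comparing two back-to-back exposure splits.

    A pixel whose difference between the splits exceeds `nsigma` times the
    expected noise (Poisson plus read noise, in electrons) is flagged in the
    brighter split; flagged pixels are then excluded when combining.
    """
    e1, e2 = gain * split1, gain * split2            # counts -> electrons
    diff = e1 - e2
    sigma = np.sqrt(np.clip(e1, 0, None) + np.clip(e2, 0, None)
                    + 2.0 * read_noise ** 2)
    cr1 = diff > nsigma * sigma                      # hit in split 1
    cr2 = -diff > nsigma * sigma                     # hit in split 2
    return cr1, cr2
```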

Slide 13: Data release vs. nightly processing: cross-talk correction

Data Release:
- Coefficients from data < 1 week removed
- Segments from multiple rafts considered (if necessary)

Nightly:
- Performed by camera before delivery to ISR
- Uses coefficients from data up to 2 weeks removed
- No inter-raft correction applied

Slide 14: Addressing the Charge

1. Have the necessary astronomical data processing software components needed to produce the LSST data products been identified and planned for?
- Yes; in fact, most of the correction steps exist in prototype form in the LSST stack.
- All the mentioned steps have been carried out on HSC data.
- All but cross-talk correction and fringe subtraction have been carried out on simulated LSST data (the simulations have not fully implemented a cross-talk model or fringing from sky lines).

Slide 15: Addressing the Charge

2. Do credible plans exist to achieve readiness of those components in time for start of survey operations?
- In terms of algorithmic development, the ISR process represents very little risk, since the algorithmic complexity is very low and well understood.
- Further, the necessity of these components for commissioning and other engineering and scientific validation drives the development schedule.

Slide 16: Addressing the Charge

6. Are the plans for image processing and source characterization on single visits in agreement with the data product requirements and technically feasible?
- There are no technical roadblocks in applying the calibration products, given that they are produced to the necessary specifications.

Slide 17: Questions?

Slide 18: The Stubbs-Tonry throughput machine

Stubbs and Tonry (2006; and later papers for results) have shown that, using a calibrated photodiode and a tunable laser, it is possible to measure the total system throughput for complicated telescope systems. The technique can be used to generate some of the necessary calibration products. The basic idea is to shine monochromatic light on an approximately Lambertian screen, measure the light coming from the screen with a photodiode, and compare that to the light received by the CCD.
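
In outline, the per-wavelength comparison looks like the following schematic sketch; the variable names, the use of diode charge as a flux proxy, and the normalization are illustrative assumptions, not the actual calibration pipeline.

```python
import numpy as np

def relative_throughput(ccd_counts, diode_charge, diode_qe):
    """Relative system throughput from a monochromatic wavelength scan.

    All inputs are 1D arrays over the scanned wavelengths: `ccd_counts` is
    the flux detected by the CCD, `diode_charge` the charge collected by the
    calibrated photodiode watching the screen, and `diode_qe` the diode's
    known quantum efficiency. The ratio traces the full-system throughput up
    to an overall normalization.
    """
    photons_incident = diode_charge / diode_qe   # proxy for light on the screen
    ratio = ccd_counts / photons_incident
    return ratio / ratio.max()                   # normalize to the peak
```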

Slide 19: Pupil ghost

This will be slowly varying across the frame. We expect that for most applications it can be modeled in the background estimation phase. It is relatively stable and depends only on the total flux through the system, so it should be possible to model and subtract if necessary.

Slide 20: Crosstalk matrix for a single chip

