IRAC Data Processing Pipeline Review / January 26, 2001 / Jason Surace
IRAC Pipeline Data Analysis and Pipeline Validation Plan


Slide 1: IRAC Pipeline Data Analysis and Pipeline Validation Plan
Jason Surace, January 26, 2001

Slide 2: Pipeline Module Testing

- Nearly all IRAC pipeline modules have been tested in some way with "real" data, i.e., data produced by the flight instrument.
- Testing with real data, however, has been somewhat haphazard and not very systematic; we have had to sift through tens of GB of test data.
- The instrument was not completed or in a stable environment, so it is not clear what the best characterization data are.
- Analysis has been lengthy: many problems have been turned up, both by us and by the SAO instrument team (IT).
- IT is still analyzing data and working on characterizing instrumental signatures. It is short on manpower, and IT pipeline work was pre-empted by instrument delivery.
- How do we reach a point where we can check off each module as "finished"?

Slide 3: Pipeline Validation Plan

1) The instrument team selects and provides a "validation" data set that they have pre-analyzed.
2) The SSC processes this raw data through the pipeline.
3) The output is then analyzed by SSC-IST and SAO-IT.
4) If the results match, we are done. If not, we go back to IT and iterate, asking for a new algorithm or a clarification of requirements.

In this way, the results of each module and thread will be validated and approved by IT. The pipeline currently has renewed visibility with IT as instrument construction ramps down.

Slide 4: Example - LINCAL

IRAC data must be corrected for the non-linear response of the detector. The science requirement is a very stringent 1% linearity over the usable data range of the detector. The previous LINCAL effort was based on a multivariate function delivered by SAO:

S' = S * (C + A / sqrt(B - S))

This was tested using the comprehensive performance test (CPT) data generated via ASIST at GSFC, which was intended to be the definitive test data set.
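The SAO-delivered LINCAL form above can be sketched in a few lines. This is an illustrative implementation only: the coefficient values below are made up, and the actual flight code's per-pixel coefficient handling is not shown here.

```python
import numpy as np

def lincal_correct(S, A, B, C):
    """Apply the SAO-delivered LINCAL form S' = S * (C + A / sqrt(B - S)).

    S is the raw count level (DN); A, B, C are fit coefficients.
    The expression is only defined for S < B, so B acts like a
    saturation level. Coefficients here are illustrative, not flight values.
    """
    S = np.asarray(S, dtype=float)
    if np.any(S >= B):
        raise ValueError("raw counts at or above the B coefficient (saturation)")
    return S * (C + A / np.sqrt(B - S))

# Illustrative use with made-up coefficients:
corrected = lincal_correct([1000.0, 20000.0], A=50.0, B=65536.0, C=0.9)
```

Note that the correction grows without bound as S approaches B, which is consistent with the poor behavior near full well reported later in this review.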

Slide 5: Early Analysis Problems

However, testing with the CPT data showed it was too unstable to demonstrate the requirement. The camera was turned on and off each day during the multi-day test, and there may have been other problems as well.

[Figure: normalized mean count levels for supposedly identical linearity frames taken on different days during CPT. Note that on a given day the stability is very high (drift < 0.5% per 6 hours).]
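A stability figure like the quoted "drift < 0.5% per 6 hours" can be derived from the mean count levels of repeated frames within one day. The helper below is a hypothetical sketch of that check, not the actual CPT analysis code:

```python
import numpy as np

def daily_drift(frame_means, hours):
    """Percent drift per 6 hours from mean count levels of nominally
    identical frames taken within one day.

    frame_means and hours are parallel sequences. A hypothetical helper;
    the real CPT analysis is not reproduced here.
    """
    frame_means = np.asarray(frame_means, dtype=float)
    hours = np.asarray(hours, dtype=float)
    # Normalize to the day's mean level, then fit a linear trend in time.
    norm = frame_means / frame_means.mean()
    slope_per_hour = np.polyfit(hours, norm, 1)[0]
    return abs(slope_per_hour) * 6.0 * 100.0  # percent per 6 hours

# Four frames over 6 hours with a slow upward trend:
drift = daily_drift([10000.0, 10005.0, 10010.0, 10015.0], [0.0, 2.0, 4.0, 6.0])
```

Frames meeting the within-day criterion would return a value below 0.5; the day-to-day jumps described above would fail a comparable check across days.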

Slide 6: Validation Data

At SSC request, Rochester provided a new linearity dataset taken under laboratory conditions, which they believe can demonstrate the 1% requirement. Jason wrote new software to analyze this dataset and prototype several different ways to linearize the data.

[Figure panels: input TCAL frames with increasing exposure times; the derived "linear" part (flat-field + lamp pattern); the derived "non-linearity" for the rate method.]
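The "rate method" implied by the figure panels — fit a linear count rate from the short, well-behaved exposures, then express each frame's departure from that linear extrapolation as the measured non-linearity — can be sketched as below. The function name, the single-pixel treatment, and the `linear_max` threshold are assumptions for illustration, not the actual SSC prototype code:

```python
import numpy as np

def derive_linearity(exptimes, counts, linear_max):
    """Rate-method sketch for one pixel of a TCAL ramp.

    Fit a count rate using only frames well below saturation
    (counts < linear_max); the flat-field + lamp pattern folds into
    this slope. The ratio of observed to extrapolated counts is the
    measured non-linearity (1.0 where the detector is linear).
    """
    exptimes = np.asarray(exptimes, dtype=float)
    counts = np.asarray(counts, dtype=float)
    low = counts < linear_max
    # Least-squares slope through the origin over the low-count frames.
    rate = np.sum(counts[low] * exptimes[low]) / np.sum(exptimes[low] ** 2)
    expected = rate * exptimes
    nonlinearity = counts / expected
    return rate, nonlinearity

# A perfectly linear ramp recovers rate = 100 and non-linearity = 1 everywhere:
rate, nonlinearity = derive_linearity([1.0, 2.0, 4.0, 8.0],
                                      [100.0, 200.0, 400.0, 800.0],
                                      linear_max=500.0)
```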

Slide 7: A Comparison of Methods

[Figure: % error as a function of full-well depth for individual pixels.]

LINCAL (blue) shows the behavior seen in previous tests: the scatter and offset are never below a few percent, so it fails the requirement — the functional form does not fit well. Red is a newly proposed quadratic solution. Both models break down completely within 10% of full-well capacity.
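The pass/fail judgment shown in the plot — percent error must stay under 1% for all points up to some fraction of full well — can be expressed as a simple check. This is an illustrative helper with assumed argument names; the actual comparison on the slide is the per-pixel scatter plot:

```python
import numpy as np

def meets_requirement(corrected, truth, fullwell_frac,
                      limit_pct=1.0, max_frac=0.9):
    """True if the percent error of a linearization stays within
    limit_pct for every point at or below max_frac of full well.
    Illustrative only; defaults mirror the 1% / 90% numbers in this review.
    """
    corrected = np.asarray(corrected, dtype=float)
    truth = np.asarray(truth, dtype=float)
    err_pct = 100.0 * np.abs(corrected - truth) / truth
    in_range = np.asarray(fullwell_frac, dtype=float) <= max_frac
    return bool(np.all(err_pct[in_range] <= limit_pct))

# 0.5% errors below 90% full well pass; the 2.5% point at 95% is ignored:
ok = meets_requirement([100.5, 199.0, 390.0], [100.0, 200.0, 400.0],
                       [0.1, 0.5, 0.95])
# A 2% error inside the range fails:
bad = meets_requirement([102.0, 200.0], [100.0, 200.0], [0.5, 0.8])
```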

Slide 8: So We Redo It

The new solution will be a quadratic function:

S' = (-A + sqrt(A^2 - 4*B*S)) / (2*B)

The quadratic solution is expected to meet the 1% requirement up to 90% of full-well capacity.
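The quadratic solution is the standard quadratic-formula inversion of a response of the form S = A·S' + B·S'^2 (with B negative for a compressive detector); the sign of the discriminant term depends on how the sign of B is folded into the fitted coefficients, so the sketch below is one consistent convention, not necessarily the flight LINCAL's:

```python
import math

def quadratic_linearize(S, A, B):
    """Invert S = A*S' + B*S'**2 for the true signal S'.

    For a compressive detector B < 0 and the discriminant becomes
    A**2 - 4*|B|*S, matching the form quoted in the review. The chosen
    root reduces to S' = S/A in the small-signal limit. Sign conventions
    are an assumption here; the flight coefficients may differ.
    """
    if B == 0.0:
        return S / A  # purely linear response
    disc = A * A + 4.0 * B * S
    if disc < 0.0:
        raise ValueError("signal beyond the turnover of the quadratic model")
    return (-A + math.sqrt(disc)) / (2.0 * B)

# A 1% compression at 1000 DN is exactly undone:
# measured S = 1000 - 1e-5 * 1000**2 = 990
recovered = quadratic_linearize(990.0, A=1.0, B=-1e-5)
```

Unlike the inverse-square-root form, this expression stays well behaved up to the turnover of the quadratic, which is consistent with the claimed validity out to 90% of full well.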

Slide 9: The Next Iteration

There is a plan to acquire new linearity test data during CTA testing. A more uniform illuminator is now available, and the data will be taken in a flight-like manner and environment. The new LINCAL will be ready to test as soon as the data arrive. Rec/Del for all IRAC algorithms and data from SAO-IT is 02/01/01.