Commissioning the NOAO Data Management System Howard H. Lanning, Rob Seaman, Chris Smith (National Optical Astronomy Observatory, Data Products Program)


The NOAO Data Management System comprises several large subsystems. Its Data Transport System annually conveys terabytes of data among six remote intercontinental sites. The NOAO Science Archive has safeguarded key NOAO data products for almost five years; NSA release 3.0 dramatically increased the data holdings and updated the entire suite of technologies. The NOAO High-Performance Pipeline System addresses the need for scientifically verified, pipeline-processed data products from major NOAO instrumentation. The NOAO Virtual Observatory Portal is the observatory's keystone VO project. This integrated, yet highly distributed, system is the result of a large software project known as the "NOAO End-to-End System." E2E involved the development of numerous interfaces and tools requiring careful and thorough review and testing. Extensive test plans were developed to ensure that the science and functional requirements of the entire E2E system were met. Integration tests were run by the developers before the individual subsystems were delivered to the Data Products Program Operations Group. Acceptance tests were then run by the Operations staff to ensure the delivered system was ready for commissioning and deployment. Performance tests and scientific verification were done concurrently to ensure that the quality of the processed data met the science requirements. Testing of infrastructure and user interfaces was invaluable not only in ensuring that functional requirements were met for the current version, but also in developing new requirements for future versions. In short, commissioning is an ongoing process, not a milestone.

INTRODUCTION: Commissioning the NOAO Archive
- Large system: six distributed sites (Tucson, La Serena, KPNO, CTIO, Cerro Pachon, NCSA)
- OPS: configuration and functional testing (automated scripts and manual verification of FITS files, etc.)
- TEST: personnel test all subsystems as well as the integrated (E2E) system using test plans, manual testing, automated testing, iterations with the Development Team, and acceptance testing to verify the product
- Customer Team: user and science evaluation by the Operations and Scientific personnel charged with determining the features that need to be built and verified
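As a minimal sketch of the kind of automated FITS header verification the OPS functional tests might perform: the required-keyword list below is an illustrative assumption, not the actual NOAO check list, and headers are represented as plain dicts (in practice a FITS library would supply them).

```python
# Hypothetical header-verification check, sketching the automated
# functional tests mentioned above. Headers are plain dicts here;
# a real script would read them with a FITS library.

REQUIRED_KEYWORDS = {          # illustrative subset, not the real NOAO list
    "TELESCOP": str,
    "INSTRUME": str,
    "DATE-OBS": str,
    "EXPTIME": (int, float),
}

def verify_header(header):
    """Return a list of problems found in one FITS header dict."""
    problems = []
    for key, types in REQUIRED_KEYWORDS.items():
        if key not in header:
            problems.append("missing keyword: %s" % key)
        elif not isinstance(header[key], types):
            problems.append("bad type for %s: %r" % (key, header[key]))
    return problems

good = {"TELESCOP": "KPNO 4.0m", "INSTRUME": "Mosaic",
        "DATE-OBS": "2005-10-05", "EXPTIME": 600.0}
bad = {"TELESCOP": "CTIO 4.0m", "EXPTIME": "600"}
```

A check like this catches both absent keywords and wrong value types before data enter the archive.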

Testing Benefits
- Independent view of software functionality: effective testing is careful analysis of the product as well as the creation of tests and procedures
- Results in improved software quality
- Value-added software testing: customer input required; improved user interface; not just finding problems but making the system more productive for the user/customer

ITERATIVE RELEASE TESTING
- Testers involved in iterative delivery/testing with the Development Teams
- Bugs, improvement suggestions, new features, clarifications, etc. filed using the JIRA bug-tracking system
- Issues addressed/fixed for the next delivery/test cycle
- Iterative testing has proved invaluable to Portal quality in preparation for the final release
- A similar process is being used for the NOAO Science Archive (with automated FitNesse tests)
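To illustrate the automated regression tests in such a delivery/test cycle, here is a hypothetical Python sketch: `query_archive` and its canned results are invented stand-ins for the real NSA query interfaces, not the actual FitNesse fixtures.

```python
import unittest

def query_archive(object_name, radius_arcmin):
    """Hypothetical stand-in for an NSA cone-search query.
    A real regression test would call the archive's actual interface."""
    canned = {"M31": [{"file": "obj1.fits", "sep": 0.8},
                      {"file": "obj2.fits", "sep": 4.2}]}
    results = canned.get(object_name, [])
    return [r for r in results if r["sep"] <= radius_arcmin]

class ArchiveRegressionTest(unittest.TestCase):
    def test_radius_filter(self):
        # Only matches inside the search radius should be returned.
        hits = query_archive("M31", 1.0)
        self.assertEqual([h["file"] for h in hits], ["obj1.fits"])

    def test_unknown_object(self):
        # Unknown targets yield an empty result, not an error.
        self.assertEqual(query_archive("NoSuchObj", 5.0), [])
```

Tests like these, rerun against every delivery (here via `python -m unittest`), make regressions visible to the testers before the next iteration with the Development Team.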

ACCEPTANCE TESTING
- Science and functional requirements identified (E2E, NSA, NVO Portal, Pipeline)
- Detailed test plans prepared and executed:
  - regression tests/procedures
  - science requirement test plans
  - functional requirement test plans
  - performance test plans
- Customer Team science verification
- Detailed test reports filed at completion

Science Verification and the Customer Team
- Customer Team science verification goals:
  - test astrometric and photometric accuracy to ensure the E2E specifications are met
  - comment on data quality issues
  - identify the nature of problems and ways to improve results
- Customer Team tests:
  - inspection of pipeline review pages (PNG graphics, etc.)
  - visual and quantitative inspection of processed FITS images
  - quantitative assessment of observed characteristics

Science Verification: Pipeline Processing of Observations
- Mosaic pipeline processing to remove instrumental and telescope signatures:
  - astrometric and photometric characterization
  - measurement of data quality parameters
- Goals:
  - relative astrometric calibration of 0.5 arcsec (RMS) 90% of the time
  - absolute astrometric accuracy of 0.5 arcsec 90% of the time
  - relative photometric accuracy of 5% (RMS) 90% of the time
  - 20% absolute photometric accuracy for the BVRI filters
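A sketch of how an "RMS below threshold 90% of the time" goal of this kind might be evaluated over many processed fields; the helper names and the sample residuals are illustrative assumptions, not taken from the actual verification scripts.

```python
import math

def rms(values):
    """Root-mean-square of a list of residuals."""
    return math.sqrt(sum(v * v for v in values) / len(values))

def meets_goal(per_field_residuals, threshold, fraction=0.9):
    """True if the RMS residual is at or below `threshold` for at least
    `fraction` of the fields (e.g. 0.5 arcsec, 90% of the time)."""
    ok = sum(1 for resids in per_field_residuals if rms(resids) <= threshold)
    return ok >= fraction * len(per_field_residuals)

# Illustrative astrometric residuals (arcsec) for ten fields;
# nine are well calibrated, one is poor:
fields = [[0.1, -0.2, 0.3]] * 9 + [[0.8, -0.9, 1.0]]
```

With these invented numbers, `meets_goal(fields, 0.5)` passes (9 of 10 fields are under 0.5 arcsec RMS), matching the 90% criterion stated above.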

Science Verification: Quality and Characteristics Evaluation
- Artifacts (pupil ghosts, fringes, bad pixels, etc.)
- Examination of resampled/reprojected images
- Handling of extended objects, crowded fields, poor observing conditions, etc.
- World coordinate systems (astrometric accuracy, internal residuals, 'reference reductions')
- Relative/absolute photometry evaluation (accuracy, zeropoint, photometric depth)
- Image noise
- PSF FWHM
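One common way to quantify the "image noise" item above is a robust background estimate from the median absolute deviation (MAD), which is insensitive to stars and cosmic rays; this pure-Python sketch is a generic illustration, not code drawn from the NOAO pipeline.

```python
import statistics

def robust_noise(pixels):
    """Estimate the Gaussian background noise of an image from the
    median absolute deviation: sigma ~= 1.4826 * MAD. Unlike a plain
    standard deviation, the MAD is barely affected by bright outliers
    such as stars or cosmic-ray hits."""
    med = statistics.median(pixels)
    mad = statistics.median(abs(p - med) for p in pixels)
    return 1.4826 * mad

# Background pixels near 10 counts, with one cosmic-ray hit at 1000:
sample = [10, 11, 9, 10, 12, 8, 10, 1000]
```

The factor 1.4826 converts the MAD to an equivalent Gaussian sigma; for the sample above the single 1000-count outlier leaves the estimate near the true background scatter.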

Scientific Verification: NVO Portal
- Verify that scientific capabilities and goals are met:
  - queries of the NOAO Science Archive
  - queries of external archives
  - extraction and download of data files, image display
  - VO Plotting, WESIX, XMatch
  - ease of use of the Portal
- Customer Team evaluation is critical to acceptance of the Portal and the NOAO Data Management System:
  - a valuable resource for verification and for future improvements to the user interface

SUMMARY Commissioning a system as extensive as the distributed NOAO Data Management System depends critically on good testing practices and Customer Team input. The process to date has demonstrated the value of such practices in isolating problems, improving the scientific usefulness and functional capabilities of the product, providing recommendations for enhancing tools and adding new features, and ultimately delivering a more user-friendly interface. While testing is never foolproof, serious problems and general issues are addressed and resolved early, resulting in a higher-quality, more efficient end product.