

California Institute of Technology
Formalized Pilot Study of Safety-Critical Software Anomalies
Dr. Robyn Lutz and Carmen Mikulski

This research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration. The work was sponsored by the NASA Office of Safety and Mission Assurance under the Software Assurance Research Program led by the NASA Software IV&V Facility. This activity is managed locally at JPL through the Assurance and Technology Program Office.

OSMA Software Assurance Symposium, September 4-6, 2002

Topics
Overview
Results
– Quantitative analysis
– Evolution of requirements
– Pattern identification & unexpected patterns
Work-in-progress
Benefits

Overview
Goal: To reduce the number of safety-critical software anomalies that occur during flight by providing a quantitative analysis of previous anomalies as a foundation for process improvement.
Approach: Analyzed anomaly data using an adaptation of the Orthogonal Defect Classification (ODC) method
– Developed at IBM; widely used by industry
– Quantitative approach
– Used here to detect patterns in anomaly data
– More information at
Evaluated ODC for NASA use using a Formalized Pilot Study [Glass, 97]

Overview: Status
Year 3 of a planned 3-year study: Plan, Design, Conduct, Evaluate, Use
FY'03 extension proposed to extend the ODC work to pre-launch and to transition it to projects (Deep Impact, contractor-developed software, Mars Exploration Rover testing)
Adapted ODC categories to operational spacecraft software at JPL:
– Activity: what was taking place when the anomaly occurred?
– Trigger: what was the catalyst?
– Target: what was fixed?
– Type: what kind of fix was done?
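The four adapted categories above can be pictured as a simple record per anomaly report. The following is an illustrative Python sketch, not JPL's actual tooling; the field comments restate the slide's definitions, and the example values are assumptions drawn from the solar-array example later in the deck.

```python
from dataclasses import dataclass

# Minimal sketch of one classified anomaly report under the adapted
# ODC scheme. Field meanings follow the four categories on this slide.
@dataclass(frozen=True)
class OdcRecord:
    activity: str  # what was taking place when the anomaly occurred
    trigger: str   # what was the catalyst
    target: str    # what was fixed
    type: str      # what kind of fix was done

# Hypothetical classification (values are illustrative, not official):
isa = OdcRecord(
    activity="Flight Operations",
    trigger="Hardware Failure",
    target="Flight Software",
    type="Function/Algorithm",
)
```

A frozen dataclass keeps each classified report immutable, so the same records can be tallied repeatedly without risk of accidental modification.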

Results: ODC Adaptation
Adapted the ODC classification to post-launch spacecraft Incident Surprise Anomalies (ISAs)

Results: Quantitative Analysis
Analyzed 189 Incident/Surprise/Anomaly reports (ISAs) of highest criticality from 7 spacecraft
– Cassini, Deep Space 1, Galileo, Mars Climate Orbiter, Mars Global Surveyor, Mars Polar Lander, Stardust
Institutional defect database → Access database of data of interest → Excel spreadsheet with ODC categories → Pivot tables with multiple views of data
Frequency counts of Activity, Trigger, Target, Type, Trigger within Activity, Type within Target, etc.
User-selectable representation of results
User-selectable sets of spacecraft for comparison
Provides rapid quantification of data
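The frequency counts described above (Activity, and Trigger within Activity) amount to grouped tallies. A minimal Python sketch, with made-up records standing in for the Access/Excel pivot-table pipeline; the category values are illustrative, not drawn from the actual ISA data:

```python
from collections import Counter

# Hypothetical classified ISAs (dicts stand in for database rows).
isas = [
    {"activity": "Flight Operations", "trigger": "Data Access/Delivery"},
    {"activity": "Flight Operations", "trigger": "Hardware Failure"},
    {"activity": "System Test", "trigger": "S/W Configuration"},
]

# Frequency count of Activity (one pivot-table axis).
activity_counts = Counter(r["activity"] for r in isas)

# Frequency count of Trigger within Activity (a two-axis pivot view).
trigger_within_activity = Counter(
    (r["activity"], r["trigger"]) for r in isas
)
```

Each `Counter` here plays the role of one pivot-table view; selecting a subset of spacecraft for comparison would simply filter `isas` before tallying.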

Results: Quantitative Analysis (charts)

Results: Quantitative Analysis (charts)
– Trigger vs. Target
– Ground/Flight S/W vs. Type within Activity

Results: Evolution of Requirements
Anomalies sometimes result in changes to software requirements
Finding:
– Change to handle a rare event or scenario (software adds fault tolerance)
– Change to compensate for hardware failure or limitations (software adds robustness)
Contradicts the assumption that "what breaks is what gets fixed"
Example: a damaged solar array panel cannot deploy as planned
– Activity = Flight Operations (occurred during flight)
– Trigger = Hardware Failure (solar array panel in incorrect position; a broken piece rotated and prevented latching)
– Target = Flight Software (fixed via changes to flight software)
– Type = Function/Algorithm (added a solar-array-powered hold capability to the software)

Results: Pattern Identification
Sample question: What is the typical signature of a post-launch critical software anomaly?
Finding:
– Activity = Flight Operations
– Trigger = Data Access/Delivery
– Target = Information Development
– Type = Procedures
Example: star scanner anomaly
– Activity = occurred during flight
– Trigger = star scanner telemetry froze
– Target = fix was a new description of star calibration
– Type = procedure written
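Finding the "typical signature" is, in effect, taking the most frequent (Activity, Trigger, Target, Type) tuple across the classified reports. A hedged sketch; the tuples below are illustrative data shaped to match the slide's finding, not the real 189-report dataset:

```python
from collections import Counter

# Hypothetical (Activity, Trigger, Target, Type) tuples per ISA.
signatures = [
    ("Flight Operations", "Data Access/Delivery",
     "Information Development", "Procedures"),
    ("Flight Operations", "Data Access/Delivery",
     "Information Development", "Procedures"),
    ("Flight Operations", "Hardware Failure",
     "Flight Software", "Function/Algorithm"),
]

# The most common tuple is the "typical signature".
typical, count = Counter(signatures).most_common(1)[0]
```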

Results: Unexpected Patterns
Examples of unexpected ISA patterns, each with a process recommendation and a spacecraft example:

Pattern: 22% of critical ISAs had ground software as Target (fix)
– Recommendation: Software QA for ground software
– Example: Unable to process multiple submissions; fixed code.

Pattern: 23% of critical ISAs had procedures as Type
– Recommendation: Assemble a checklist of needed procedures for future projects
– Example: Not in inertial mode during star calibration; additions made to the checklist to prevent recurrence.

Pattern: Of these, 41% had data access/delivery as Trigger
– Recommendation: Better communication of changes and updates to operations
– Example: Multiple queries for spacecraft engineering and monitor data failed; notification of problems to operators was streamlined.

Pattern: 34% of critical ISAs involving system test had software configuration as Trigger (cause); 24% had hardware configuration as Trigger
– Recommendation: Additional end-to-end configuration testing
– Example: OPS personnel did not have a green command system for the uplink of two trajectory-correction command files; the problems resulted from a firewall configuration change.
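The percentages above are shares of classified ISAs matching a given category value. A minimal Python sketch of that computation; the Target values below are hypothetical and chosen only to make the arithmetic visible:

```python
# Hypothetical Target (fix) values for a handful of critical ISAs.
targets = [
    "Ground Software",
    "Flight Software",
    "Information Development",
    "Ground Software",
    "Procedures",
]

# Share of ISAs whose fix was ground software, as a percentage.
pct_ground = 100 * targets.count("Ground Software") / len(targets)
```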

Work-In-Progress
Assembling process recommendations tied to specific findings and unexpected patterns; in review by projects
Working to incorporate the ODC classifications into the next-generation problem/failure reporting database (to support automation & visualization)
Disseminating results:
– Invited presentations to the JPL Software Quality Improvement task, the JPL Mission Assurance Managers, and MER; informal briefings to other flight projects
– Presented at the Assurance Technology Conference (B. Sigal); included in a talk at Metrics 2002 (A. Nikora); presented at the 2001 IFIP WG 2.9 Workshop on Requirements Engineering
– Papers in the 5th IEEE Int'l Symposium on Requirements Engineering and The Journal of Systems and Software (to appear)

Work-In-Progress
Collaborating with Mars Exploration Rover to experimentally extend the ODC approach to pre-launch software problem/failure testing reports
– Adjusted ODC classifications to testing phases (build, integration, acceptance)
– Delivered an experimental ODC analysis of 155 Problem/Failure Reports to MER
– Feedback from the project has been noteworthy
– Results can support tracking trends and progress: graphical summaries; comparisons of testing phases
– Results can provide a better understanding of typical problem signatures

Benefits
User selects the preferred representation (e.g., 3-D bar graphs) and the set of projects to view
Data-mines historical and current databases of anomaly and problem reports to feed forward into future projects
Uses metrics information to identify and focus on problem areas
Provides a quantitative foundation for process improvement
Equips us with a methodology to continue to learn as projects and processes evolve