California Institute of Technology — Formalized Pilot Study of Safety-Critical Software Anomalies. Dr. Robyn Lutz and Carmen Mikulski, Software Assurance Group.


California Institute of Technology
Formalized Pilot Study of Safety-Critical Software Anomalies
Dr. Robyn Lutz and Carmen Mikulski
Software Assurance Group, Jet Propulsion Laboratory, California Institute of Technology
NASA Code Q Software Program Center Initiative UPN; Kenneth McGill, Research Lead
OSMA Software Assurance Symposium, Sept 5-7, 2001

Topics
–Overview
–Preliminary Results: quantitative analysis; evolution of requirements; visualization tools
–Work-in-progress
–Benefits

Overview: Goal
To reduce the number of safety-critical software anomalies that occur during flight by providing a quantitative analysis of previous anomalies as a foundation for process improvement.

Overview: Approach
Analyzed anomaly data using the Orthogonal Defect Classification (ODC) method
–Developed at IBM; widely used in industry
–Quantitative approach
–Used here to detect patterns in anomaly data
Evaluated ODC using a formalized pilot study
–R. Glass ['97] detailed a rigorous process for obtaining valid results
–35 steps divided into 5 phases
–Used here to evaluate ODC for NASA use

Overview: Status
Year 2 of a planned 3-year study
Phases: Plan, Design, Conduct, Evaluate, Use
Adapted ODC categories to operational spacecraft software at JPL:
–Activity: what was taking place when the anomaly occurred?
–Trigger: what was the catalyst?
–Target: what was fixed?
–Type: what kind of fix was done?
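The four adapted ODC attributes above can be pictured as a simple record per anomaly report. The sketch below is illustrative only: the class name, field names, and category values are hypothetical, not the study's actual schema or taxonomy.

```python
from dataclasses import dataclass

# Hypothetical record for one classified anomaly report (ISA), carrying the
# four ODC attributes adapted in the study. Category strings are invented
# examples, not the study's real taxonomy values.
@dataclass(frozen=True)
class ClassifiedAnomaly:
    isa_id: str    # report identifier
    activity: str  # what was taking place when the anomaly occurred
    trigger: str   # what was the catalyst
    target: str    # what was fixed
    fix_type: str  # what kind of fix was done

isa = ClassifiedAnomaly("ISA-0042", "Normal Activity", "H/W Failure",
                        "Flight Software", "Function/Algorithm")
print(isa.target)  # Flight Software
```

Keeping one record per ISA with these four fields is what makes the 1-D and 2-D frequency counts on the next slides straightforward to compute.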

Preliminary Results: Quantitative Analysis
Analyzed 189 Incident/Surprise/Anomaly reports (ISAs) of highest criticality
–7 spacecraft: Cassini, Deep Space 1, Galileo, Mars Climate Orbiter, Mars Global Surveyor, Mars Polar Lander, Stardust
Data pipeline: institutional defect database → Access database of data of interest → Excel spreadsheet with ODC categories → pivot tables with multiple views of the data
1-D and 2-D frequency counts of Activity, Trigger, Target, Type, Trigger within Activity, Type within Target, etc.
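The 1-D and 2-D frequency counts described above amount to tallying classified records along one or two ODC attributes. A minimal sketch, using a few invented (Activity, Trigger) pairs rather than real ISA data:

```python
from collections import Counter

# Invented (activity, trigger) pairs standing in for classified ISAs.
records = [
    ("Recovery", "H/W Failure"),
    ("Normal Activity", "Data Access/Delivery"),
    ("Normal Activity", "H/W Failure"),
    ("Cmd seq test", "S/W Config."),
]

# 1-D count: how often each Activity appears.
by_activity = Counter(activity for activity, _ in records)

# 2-D count: Trigger within Activity, one of the pivot-table views
# mentioned on the slide.
by_activity_trigger = Counter(records)

print(by_activity["Normal Activity"])                    # 2
print(by_activity_trigger[("Recovery", "H/W Failure")])  # 1
```

In the study this tallying was done with Excel pivot tables over the exported spreadsheet; the computation is the same either way.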

Preliminary Results: Quantitative Analysis
–User-selectable representation of analysis results: tables, pie charts, bar graphs
–User-selectable sets of spacecraft for comparisons
–Provides rapid quantification of data
–Assists in detecting unexpected patterns and confirming expected ones

Preliminary Results: Quantitative Analysis
[Three chart slides; graphics not captured in the transcript.]

Preliminary Results: Evolution of Safety-Critical Requirements Post-Launch
Anomalies sometimes result in changes to software requirements
–Looked at 86 critical ISAs from 3 spacecraft (MGS, DS-1, Cassini)
–17 of the 86 had Target (what was fixed) = Flight Software
–Of those 17: 8 changed code only; 1 was an incorrect patch; 1 used a contingency command
–Focused on the remaining 7 with new software requirements resulting from a critical anomaly
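The narrowing above (86 critical ISAs → 17 flight-software targets → 7 with new requirements) is a simple filter chain. The sketch below mirrors the slide's counts with invented placeholder records; the labels are hypothetical, not the study's data.

```python
# Invented records whose counts match the slide: 7 ISAs that produced new
# flight-software requirements, 8 code-only fixes, 1 incorrect patch,
# 1 contingency command, and 69 ISAs with other targets (7+8+1+1+69 = 86).
isas = (["FSW/new-reqt"] * 7 + ["FSW/code-only"] * 8 +
        ["FSW/bad-patch"] + ["FSW/contingency"] + ["other"] * 69)

# Step 1: keep ISAs whose Target was flight software.
fsw = [i for i in isas if i.startswith("FSW")]

# Step 2: isolate those that led to new software requirements.
new_reqts = [i for i in fsw if i == "FSW/new-reqt"]

print(len(isas), len(fsw), len(new_reqts))  # 86 17 7
```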

Preliminary Results: Evolution of Safety-Critical Requirements Post-Launch
Found that the requirements changes were not due to earlier requirements errors
Instead, the requirements changes were due to:
–Need to handle a rare event or scenario (4; software adds fault tolerance)
–Need to compensate for hardware failure or limitations (3; software adds robustness)

Preliminary Results: Evolution of Safety-Critical Requirements Post-Launch
–Confirms the value of requirements completeness for fault tolerance
–Confirms the value of contingency planning to speed change
–Contradicts the assumption that "what breaks is what gets fixed"
–Suggests the need for better requirements engineering during maintenance
Results presented at:
–IFIP WG 2.9 Workshop on Requirements Engineering, Feb. 2001
–5th IEEE International Symposium on Requirements Engineering, Aug. 2001

Preliminary Results: Evolution of Safety-Critical Requirements Post-Launch
[Timeline diagram with labels "Launch," "Requirements Evolution," "Requirements Engineering," and "Maintenance"; graphic not captured in the transcript.]

Preliminary Results: Web-based Visualization Tool
–Results of Peter Neubauer (ASU), Caltech/JPL Summer Undergraduate Research Fellow, 2001
–Developed alternate visualizations of the data results to support users' analyses
–Web-based tool assists distributed users
–Sophisticated tool architecture builds on existing freeware
–Demo at QA Section Manager's meeting (FAQ: Would this work for our project?)
–Demo to D. Potter's JPL group developing the next-generation Failure Anomaly Management System

Preliminary Results: Web-based Visualization Tool
Objective: Investigate and characterize the common causes of safety-critical, in-flight software anomalies on spacecraft. The work uses a defect-analysis technology called Orthogonal Defect Classification, developed at IBM. A rigorous pilot study approach using the Glass criteria is currently underway.
[Chart: a 2-way view of "Trigger" (what was happening that caused the defect to be noticed: Calibration, Cmd seq test, Data Access/Delivery, H/W Config., H/W Failure, Inspection/Review, Normal Activity, Recovery, S/W Config., Special Procedure, Start/Restart/Shutdown, Unknown) against "Target" (what was changed in response: Operations, None/?, Flight Software, Ground Software, Hardware, Ground Resources, Build Package); graphic not captured in the transcript.]
A large number of defects were seen while sending commands to or receiving data from the spacecraft. Of these, many were responded to by changing operational procedures or software on the ground. For other defects, changes to flight software were more prevalent. Across the 7 space missions, 189 defects were classified; the chart shows one of the 6 possible 2-way views into this information. Goal: discover and present useful information.

Work-in-progress
–Several patterns noted but not yet quantified (e.g., procedures are often implicated)
–Profile by mission phase (e.g., cruise, orbit insertion, entry, landing)
–Better way to disseminate "mini-LLs" (lessons learned)? (e.g., a corrective action sometimes notes the need for a similar action on a future mission)
–Incorporate standardized ODC classifications in the next-generation database to support automation and visualization

Benefits
–Data mining of historical and current databases of incidents/surprises/anomalies
–Uses metrics information to identify and focus on problem areas
–Provides a quantitative foundation for process improvement
–Equips us with a methodology to continue to learn as projects and processes evolve