Program Model Checking: Case Studies and Practitioner's Guide

John Penix, ARC
Owen O'Malley, QSS
Lawrence Markosian, QSS
Peter Mehlitz, CSC
Masoud Mansouri-Semani, CSC
Howard Hu, JSC
Tanya Lippencott, GDDS
Mark Coats, GDDS
Objectives
► Overall objectives
▀ Assemble the emerging best practices in program model checking
▀ Demonstrate and validate their use in several case studies
▀ Document the results in a Practitioner's Guide for Program Model Checking
► Year 1 - Analysis & Baseline
▀ Determine critical requirements and verification coverage requirements for the initial case study
▀ Set up a test environment for the selected application at Ames
▀ Apply and document basic model checking techniques and determine coverage of verification goals
Initial Case Study: Shuttle Abort Flight Manager (SAFM)
► SAFM provides situational awareness and decision support for shuttle pilots in case of an abort prior to orbit:
▀ abort performance assessment during powered flight
▀ landing site evaluation and monitoring during glided flight
► Developed by NASA Johnson Space Center and General Dynamics Decision Systems
► Runs on board the shuttle and on the ground
► About 38K SLOC of C++
► Scheduled to fly in 2006
Model Checking as a Best Practice
► Model checking best practices must address:
▀ Identifying critical components and properties
▀ Constructing test drivers and environment models
▀ Developing "models" or abstractions of the system
▀ Tuning model checking algorithms/tools
▀ Assessing verification results - coverage and error reports
► Apply model checking as part of integrated best practices:
▀ Manual and automated software inspections
▀ Testing
▀ Model checking
Accomplishments
► Evaluated SAFM source code and requirements for applicability to model checking and identified critical issues
▀ Identified the Sequencer as a critical subsystem where model checking can be applied
▀ Performed manual and automated code inspections
► Hosted the SAFM test lead at ARC for a week to elicit requirements and design properties that are currently unchecked
▀ Identified critical properties
► Set up the SAFM build & test environment at ARC
▀ Gathered data on existing test coverage
Identifying the Critical Subsystem - Sequencer
► Focus on the Executive Controller and Sequencer
▀ Top-level control logic for SAFM, including error handling
▀ Manages evaluation of the various potential abort scenarios
▀ Complex scenario interactions that are difficult to test
[Diagram: Scenario, Input Manager, Output Manager, Exec Controller, Sequencer, System Software Calls]
Model Checking - Sequencer
► Use non-determinism to simplify the input and state space
▀ "Stub out" most of the numeric computation and replace it with non-deterministic choice
▀ Abstract data values by collapsing multiple floating point values into a single boolean
► Example properties:
▀ Child scenarios don't use data from unavailable or invalid parents
▀ How many scenarios can run in the same cycle?
Critical Properties
► Properties: "low level" requirements that characterize valid sequences of program states ("executions")
► Many application-specific properties:
▀ All dynamic memory (de)allocation must happen during initialization and shutdown
▀ No scenario uses data from a parent scenario that was not applicable or valid
► General properties:
▀ No memory leaks
▀ No array bounds overflows
Manual and Automated Code Inspection
► Manual inspections discovered:
▀ Duplicated code and data
▀ Inconsistencies between the description of functions and their implementations - a maintainability issue
► Automated inspections discovered:
▀ Incorrect overloading of binary & instead of unary &
▀ Single-argument constructors that are not explicit
▀ Missing assignment operators (e.g. += when + is defined)
▀ Missing inverse operators (e.g. >= when < is defined)
► Similar issues found in the System Software
Approach - Testing
► Ported the subsystem (L2) test driver to run under Unix (e.g. Linux and Mac OS) and added additional testing flexibility
► Applied testing tools:
▀ Memcheck (array bounds and memory leaks)
▀ Gcov (statement coverage and counts)
▀ Kcachegrind (performance measurements)
► Used test data from General Dynamics that was generated by a shuttle simulator
Testing - Code Coverage
► Used GNU's gcov tool to determine code coverage of the current set of test cases
► Statement coverage after running all test cases was 83%
► Some surprising hotspots involved getting mnemonic names (~2 billion calls) and string comparison on mnemonic names (~1.5 billion calls)
Testing - Verifying Properties
► All dynamic memory allocations happen during SafmInitialize and all deallocations happen during ~SafmExecutiveController
▀ Overloaded operator new and operator delete
▀ Used control flags to ensure they weren't called at the wrong time
► No memory leaks
▀ Used valgrind and found only 1 leak, in the testing stubs
Approaches - Model Checking
► Model Checking (MC): systematically check all transitions of an automaton for property violations
► Program Model Checking (PMC): execute all potential execution sequences (paths) of a program
► PMC is the method of choice when not all execution paths can be tested, e.g. under concurrency (the scheduler is not controllable in test)
► BUT: SAFM has no internal concurrency (threads)...
► PMC is also good for checking program responses to non-deterministic input → very suitable for SAFM (GNC input data variations)
► BUT: model checkers usually do this by enumerating all possible input values → not feasible for infinite value sets like float intervals
► JPF solution for SAFM: Heuristic Choice Generators
Lessons Encountered to Date
► C++ is a new platform for shuttle software and has many pitfalls for even experienced programmers
► Testing and model checking effort is significantly eased by good design
▀ Coding standards can encourage good design, but need to be flexible enough to handle special cases (e.g. a friend class to enhance data visibility during testing)
► Difficulty in testing and sustaining engineering arises from:
▀ An impoverished development environment for flight software
▀ "Distance" between the development environment and the runtime environment
► Development team support is required for obtaining the required domain expertise
► Goals for model checking must be carefully selected
► Understand where model checking fits into the overall verification program