Software Excellence via Program Verification at Microsoft
Manuvir Das, Center for Software Excellence, Microsoft Corporation

Talking the talk …
Program analysis technology can make a huge impact on how software is engineered.
The trick is to properly balance research on new techniques with a focus on deployment.
The Center for Software Excellence (CSE) at Microsoft is doing this (well?) today.

… walking the walk
The Program Analysis group in June 2005 (one future version of one product):
– Filed bugs
– Automatically added 10,000+ specifications
– Answered hundreds of emails
We are program analysis researchers – but we live and breathe deployment & adoption, and we feel the pain of the customer.

Context
The Nail (Windows) – manual processes do not scale to “real” software.
The Hammer (Program Analysis) – automated methods for “searching” programs.
The Carpenter (CSE) – a systematic, heavily automated approach to improving the “quality” of software.

What is program analysis?
grep == program analysis … but program analysis != grep.
It spans syntax trees, CFGs, instrumentation, alias analysis, dataflow analysis, dependency analysis, binary analysis, automated debugging, fault isolation, testing, symbolic evaluation, model checking, specifications, …

Roadmap
– (part of) The engineering process today
– (some of) The tools that enable the process
– (a few) Program analyses behind the tools
– (too many) Lessons learned along the way
– (too few) Suggestions for future research

Engineering process

Methodology
[Diagram: Root Cause Analysis, Measurement, Analysis Technology, and Resource Constraints all feed into the Engineering Process]

Root cause analysis
Understand important failures in a deep way:
– Every MSRC bulletin
– Beta release feedback
– Watson crash reports
– Self host
– Bug databases
Design and adjust the engineering process to ensure that these failures are prevented.

Measurement
Measure everything about the process:
– Code quality
– Code velocity
– Tools effectiveness
– Developer productivity
Tweak the process accordingly.

Process – Build Architecture
[Diagram: code flows from developer Desktops into Team Branches, and from multiple Team Branches into the Main Branch]

Process – Quality Gates
Lightweight tools:
– run on the developer desktop & team-level branches
– issues tracked within the program artifacts
Enforced by rejection at the gate.
[Diagram: the same branch hierarchy, with gates on each promotion path]

Process – Automated Bug Filing
Heavyweight tools:
– run on the main branch
– issues tracked through a central bug database
Enforced by a bug cap.
[Diagram: the same branch hierarchy, with the heavyweight tools running on the Main Branch]

Tools

QG – Code Coverage via Testing
Reject code that is not adequately tested:
– Maintain a minimum bar for code coverage.
Code coverage tool – Magellan.
Based on binary analysis – Vulcan.

Magellan
BBCover – low-overhead instrumentation & collection, down to the basic-block level.
Sleuth – coverage visualization, reporting & analysis.
Blender – coverage migration.
Scout – test prioritization.
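
To make the basic-block granularity concrete, here is a minimal sketch of the kind of bookkeeping BBCover-style instrumentation adds, shown at the source level for clarity; the real tool rewrites binaries via Vulcan, and the counter array and function names here are hypothetical:

    #include <stdio.h>

    /* One hit flag per basic block of abs_val (hypothetical layout). */
    static unsigned char bb_hit[3];

    int abs_val(int x) {
        bb_hit[0] = 1;            /* block 0: function entry */
        if (x < 0) {
            bb_hit[1] = 1;        /* block 1: negative case */
            return -x;
        }
        bb_hit[2] = 1;            /* block 2: non-negative case */
        return x;
    }

    int main(void) {
        abs_val(5);               /* exercises blocks 0 and 2 only */
        printf("coverage: %d/3 blocks\n", bb_hit[0] + bb_hit[1] + bb_hit[2]);
        return 0;
    }

A test suite that never passes a negative argument leaves bb_hit[1] unset – exactly the kind of gap a minimum coverage bar at the gate would reject.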

QG – Component Integrity
Reject code that breaks the componentized architecture of the product:
– Control all dependencies across components.
Dependency analysis tool – MaX.
Based on binary analysis – Vulcan.

MaX
Constructs a graph of dependencies between binaries (DLLs) in the system:
– Obvious: call graph
– Subtle: registry, RPC, …
Compares the policy graph and the actual graph; some discrepancies are treated as errors.
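
As a rough illustration of the policy-versus-actual comparison – the DLL names and flat edge lists here are hypothetical, and MaX extracts the actual graph from binaries, including the subtle registry and RPC edges:

    #include <stdio.h>
    #include <string.h>

    typedef struct { const char *from, *to; } Dep;

    /* Architect-approved dependencies (the policy graph). */
    static const Dep policy[] = {
        { "shell.dll", "core.dll" },
        { "ui.dll",    "core.dll" },
    };

    /* Dependencies observed in the built binaries (the actual graph). */
    static const Dep actual[] = {
        { "shell.dll", "core.dll" },
        { "core.dll",  "ui.dll"   },   /* a lower layer calling up */
    };

    static int in_policy(const Dep *d) {
        for (size_t i = 0; i < sizeof policy / sizeof policy[0]; i++)
            if (strcmp(d->from, policy[i].from) == 0 &&
                strcmp(d->to, policy[i].to) == 0)
                return 1;
        return 0;
    }

    int main(void) {
        for (size_t i = 0; i < sizeof actual / sizeof actual[0]; i++)
            if (!in_policy(&actual[i]))
                printf("dependency violation: %s -> %s\n",
                       actual[i].from, actual[i].to);
        return 0;
    }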

Vulcan
Input – binary code. Output – program abstractions.
Adapts to the level of debug information available.
Makes code instrumentation easy – think ATOM.
Makes code modification easy – at link time, post-link time, install time, or run time.

QG – Formal Specifications
Reject code with poorly designed and/or insufficiently specified interfaces.
Lightweight specification language – SAL, with an initial focus on memory usage.
All functions must be SAL-annotated.
Fully supported in Visual Studio (see MSDN).

SAL
A language of contracts between functions.
Preconditions – statements that hold at entry to the callee: what does a callee expect from its callers?
Postconditions – statements that hold at exit from the callee: what does a callee promise its callers?
Usage example: a0 RT func(a1 … an T par) – annotations attach to the return type and to each parameter type.
Expressible: buffer sizes, null pointers, memory usage, …

SAL Example
wcsncpy – precondition: the destination must have enough allocated space.

Unannotated prototype:

    wchar_t *wcsncpy(wchar_t *dest, wchar_t *src, size_t num);

Annotated prototype:

    wchar_t *wcsncpy(
        __pre __writableTo(elementCount(num)) wchar_t *dest,
        wchar_t *src,
        size_t num);
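
With that annotation in place, a checker can flag a call site whose destination is too small – a hypothetical snippet, not from the talk:

    #include <wchar.h>

    int main(void) {
        wchar_t small[10];
        wchar_t src[] = L"hello";

        wcsncpy(small, src, 10);   /* OK: small is writable to 10 elements */
        wcsncpy(small, src, 20);   /* flagged: the annotation requires 20
                                      writable elements, but small has 10 */
        return 0;
    }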

SAL Principle
Control the power of the specifications:
– Impractical solution: rewrite the code in a different language that is amenable to automated analysis.
– Practical solution: formalize the invariants that are implicit in the code, in intuitive notations.
These invariants often already appear in comments.
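
A typical before/after of that principle, assuming the old-style SAL notation used in this talk (FillBuffer is a hypothetical function):

    /* Before: the invariant lives only in a comment. */
    /* buf must point to at least cch writable wide characters */
    void FillBuffer(wchar_t *buf, size_t cch);

    /* After: the same invariant, formalized where tools can check it
       at every call site and inside the implementation. */
    void FillBuffer(__pre __writableTo(elementCount(cch)) wchar_t *buf,
                    size_t cch);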

Defect Detection Process – 1

QG – Integer Overflow
Reject code with potential security holes due to unchecked integer arithmetic.
Range specifications + range checker – IO.
Local (intra-procedural) analysis.
Runs on the developer desktop as part of the regular compilation process.

IO
Enforces correct arithmetic for allocations.
Constructs an expression tree for every interesting expression in the code and ensures that every node in the tree is checked.

Example – the sum can wrap around before MyAlloc ever sees it:

    size1 = …
    size2 = …
    data = MyAlloc(size1 + size2);
    for (i = 0; i < size1; i++)
        data[i] = …
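
A minimal sketch of the checked shape the gate looks for, using a plain SIZE_MAX guard (copy_both is hypothetical; real Windows code often routes such sums through intsafe.h helpers such as SizeTAdd, and MyAlloc is carried over from the slide's example):

    #include <stdint.h>
    #include <stddef.h>

    void *MyAlloc(size_t n);   /* the slide's hypothetical allocator */

    int copy_both(size_t size1, size_t size2, char **out) {
        if (size1 > SIZE_MAX - size2)      /* the '+' node is now checked */
            return 0;                      /* sum would wrap around */
        char *data = MyAlloc(size1 + size2);
        if (data == NULL)
            return 0;
        *out = data;
        return 1;
    }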

QG – Buffer Overruns
Reject code with potential security holes due to out-of-bounds buffer accesses.
Buffer size specifications + buffer overrun checker – espX.
Local (intra-procedural) analysis.
Runs on the developer desktop as part of the regular compilation process.

Bootstrap the process
Combine global and local analysis:
– Weak global analysis to infer (potentially inaccurate) interface annotations – SALinfer.
– Strong local analysis to identify incorrect code and/or annotations – espX.

Defect Detection Process – 2

SALinfer – tracks the flow of values through the code.

Starting code:

    void work() {
        int elements[200];
        wrap(elements, 200);
    }

    void wrap(int *buf, int len) {
        int *buf2 = buf;
        int len2 = len;
        zero(buf2, len2);
    }

    void zero(int *buf, int len) {
        int i;
        for (i = 0; i <= len; i++)
            buf[i] = 0;
    }

1. Finds the stack buffer elements in work.
2. Adds an annotation to wrap:
       void wrap(pre elementCount(len) int *buf, int len)
3. Finds the assignments buf2 = buf and len2 = len.
4. Adds an annotation to zero:
       void zero(pre elementCount(len) int *buf, int len)

espX – builds and solves constraints over the annotated code:

    void work() {
        int elements[200];
        wrap(elements, 200);
    }

    void wrap(pre elementCount(len) int *buf, int len) {
        int *buf2 = buf;
        int len2 = len;
        zero(buf2, len2);
    }

    void zero(pre elementCount(len) int *buf, int len) {
        int i;
        for (i = 0; i <= len; i++)
            buf[i] = 0;
    }

1. Builds constraints in work (elements has 200 elements).
2. Verifies the contract at each call site.
3. Builds constraints in zero: len = length(buf), i ≤ len.
4. Finds the overrun: does i < length(buf) hold at buf[i] = 0? NO – the loop allows i == len.

QG – Code Correctness
Reject code with potential crashes due to improper usage of memory.
Pointer usage specifications + memory usage checker – PREfast (for managed code – PREsharp).
Local (intra-procedural) analysis.
Runs on the developer desktop as part of the regular compilation process.

ABF – Code Correctness
Tease out hard-to-find inter-component bugs that lead to crashes:
– null dereferences, uninitialized memory, leaks, …
– difficult to find accurately on the desktop
Inter-procedural symbolic evaluation – PREfix.

PREfix
Bottom-up process on the call graph.
Symbolic evaluation of a fixed number of distinct paths through each function:
– uses symbolic state to remove infeasible paths
– reports defects
– builds function models for use by callers
Solved all the difficult engineering problems for the inter-procedural tools that followed.
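
To make "function models" concrete, here is the flavor of inter-procedural defect this enables (hypothetical code): a bottom-up model of MyAlloc records that its result may be NULL, and evaluating a path through the caller with that model exposes the dereference.

    #include <stdlib.h>

    char *MyAlloc(size_t n) {
        return malloc(n);        /* model: result may be NULL */
    }

    void use(void) {
        char *p = MyAlloc(16);
        p[0] = 'x';              /* defect path: p == NULL reaches this write */
    }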

ABF – Security
For every new security issue, map it to a coding defect and root out all other instances:
– Each coding defect is a different pattern, but most can be viewed as finite-state properties.
Heavyweight, thorough, property-based inter-procedural analysis – ESP.

Property-based analysis

Concrete code:

    void main() {
        if (dump)
            fil = fopen(dumpFile, "w");
        if (p)
            x = 0;
        else
            x = 1;
        if (dump)
            fclose(fil);
    }

The same code, abstracted against the property:

    void main() {
        if (dump) Open;
        if (p) x = 0; else x = 1;
        if (dump) Close;
    }

Property FSA (states Closed, Opened, Error):
– Closed –Open→ Opened
– Opened –Print→ Opened
– Opened –Close→ Closed
– Any other event (Print/Close when Closed, Open when Opened) → Error, which absorbs all further events (*).
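
The property FSA can be written down as a tiny explicit state machine; a minimal runnable sketch (the names are invented, and ESP evaluates this statically over program paths rather than executing it):

    #include <stdio.h>

    typedef enum { CLOSED, OPENED, ERROR } State;

    static State on_open(State s)  { return s == CLOSED ? OPENED : ERROR; }
    static State on_close(State s) { return s == OPENED ? CLOSED : ERROR; }
    static State on_print(State s) { return s == OPENED ? OPENED : ERROR; }

    int main(void) {
        State s = CLOSED;
        s = on_open(s);          /* Closed -> Opened */
        s = on_print(s);         /* Opened -> Opened */
        s = on_close(s);         /* Opened -> Closed */
        printf("%s\n", s == CLOSED ? "ok" : "property violated");
        return 0;
    }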

ESP
Symbolically evaluates the program, tracking both the FSA state and the execution state.
At control-flow branch points – does the execution state imply the branch direction?
– Yes: process only the appropriate branch.
– No: split the state and process both branches.
At control-flow merge points – do the states agree on the property FSA state?
– Yes: merge the states.
– No: process the states separately.

Example (ESP states, written [FSA state | execution state], along the CFG of the code above):
– entry: [Closed]
– branch on dump: true → [Closed|dump=T], false → [Closed|dump=F]
– Open on the true branch: [Opened|dump=T]
– branch on p: [Opened|dump=T,p=T,x=0] and [Opened|dump=T,p=F,x=1]
– merge after x is set: both agree on the FSA state, so they merge back to [Opened|dump=T]; the false side remains [Closed|dump=F]
– second branch on dump: each execution state implies the direction, so Close is executed only from [Opened|dump=T]
– exit: [Closed]

Lessons

Forcing functions for change
Gen 1: Manual review – too many paths.
Gen 2: Massive testing – inefficient detection of common patterns.
Gen 3: Global program analysis – stale results.
Gen 4: Local program analysis – lack of context.
Gen 5: Specifications.

Don’t bother doing this without –
No-brainer must-haves: a defect viewer, docs, champions, partners.
A mechanism for developers to teach the tool: suppression, assertion, assumption.
A willingness to support the tool.
A positive attitude.

Myth 1 – Soundness matters
Sound == find only real bugs.
The real measure is fix rate:
– Centralized: >50%
– Desktop: >75%
Specification inference – is it much worse than manual insertion?

Myth 2 – Completeness matters
Complete == find all the bugs.
There will never be a complete analysis:
– Partial specifications
– Missing code
Developers want consistent analysis:
– Tools should be stable w.r.t. minor code changes.
– Systematic, thorough, tunable program analysis.

Myth 3 – Developers only fix real bugs
Key attributes of a “fixable” bug:
– Easy to fix
– Easy to verify
– Unlikely to introduce a regression
Simple tools can be very effective.

Myth 4 – Developers hate specifications
Control the power of the specifications.
This will work: formalize invariants that are implicit in the code.
This will not work: rewrite code in a different language that is amenable to automated analysis.
Think like a developer.

Myth 5 – Whiteboards are useful
Whiteboards have held back defect detection.
The most useful analyses and tools mimic the thinking of the developer:
– e.g., do developers consider every possible interleaving when writing threaded code? No!

Myth 6 – Theory is useless
Fundamental ideas have been crucial:
– Hoare logic
– Abstract interpretation
– Context-sensitive analysis with summaries
– Alias analysis

Don’t break the shipping code
__invariant() is an annotation macro – it generates code in the tools build and is a noop in the real build.

Before:

    b = a + 16;
    Use(b);

After (correct code):

    __invariant(a);
    b = a + 16;
    Use(b);

After (incorrect code):

    b = __invariant(a) + 16;
    Use(b);

Incorrect usage silently breaks the code!
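
One plausible shape for such a macro, as a sketch – the flag ANALYSIS_BUILD and the hook __annotation_hook are invented, not Microsoft's actual implementation:

    #ifdef ANALYSIS_BUILD
    /* Tools build: emit a call the analysis can see. */
    void __annotation_hook(int expr);
    #define __invariant(e) __annotation_hook((int)(e))
    #else
    /* Shipping build: the annotation vanishes entirely. */
    #define __invariant(e)
    #endif

    /* Used as a standalone statement, the noop expansion leaves
       "b = a + 16;" intact.  Used inside an expression, the shipping
       build turns "b = __invariant(a) + 16;" into "b = + 16;", which
       still compiles (unary plus) and silently changes b. */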

Research directions

Concurrency tools
Developers working on large projects follow sequential locking disciplines:
– Sequential analysis to mimic the developer
– Language constructs to help the developer
Indirect defects reported on a single thread are much easier to fix.
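
An example of the kind of single-thread lock-discipline defect a sequential analysis can report (hypothetical code; no thread interleavings are needed to see the bug):

    #include <windows.h>

    static CRITICAL_SECTION g_lock;   /* assume InitializeCriticalSection ran at startup */
    static int g_error;

    void update(void) {
        EnterCriticalSection(&g_lock);
        if (g_error)
            return;                    /* defect: returns while still holding g_lock */
        /* ... update shared state ... */
        LeaveCriticalSection(&g_lock);
    }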

Static & dynamic analysis
Static followed by dynamic:
– Instrument problem areas using static analysis.
– Gather dynamic traces to diagnose defects.
Dynamic followed by static:
– Use dynamic analysis to create a signature for good execution traces.
– Use static analysis to find execution traces that do not match the signature.
Common intermediate information:
– Code coverage, …

Users as automated testers
Huge opportunity to improve code quality:
– Find out what’s failing, where, and how often.
– Diagnose the failures.
– Early-warning data.
Avoid falling into the trap of the long-awaited “code review editor”:
– Need to find limited, concrete scenarios.

Evolutionary tools
Specification-based tools evolve a language:
– Introduce a programming discipline.
– Increase the portability of legacy code.
We have tackled memory usage – rinse and repeat.

Summary
Program analysis technology can make a huge impact on how software is developed.
The trick is to properly balance research on new techniques with a focus on deployment.
The Center for Software Excellence (CSE) at Microsoft is doing this (well?) today.

© 2005 Microsoft Corporation. All rights reserved. This presentation is for informational purposes only. MICROSOFT MAKES NO WARRANTIES, EXPRESS OR IMPLIED, IN THIS SUMMARY.