Tao Xie (North Carolina State University); Nikolai Tillmann, Peli de Halleux, Wolfram Schulte (Microsoft Research)

- Unit Testing
- Parameterized Unit Testing (PUT)
- Mutation Analysis for PUT

public class IntStack {
  public IntStack() { … }
  public void Push(int value) {
    if (value < 0) return;
    …
  }
  public int Pop() { … }
  public bool IsEmpty() { … }
  public bool Equals(Object other) { … }
}

void TestPushPop() {
  IntStack s = new IntStack();
  s.Push(3);
  s.Push(5);
  Assert.IsTrue(s.Pop() == 5);
}

- A unit test is a small program with assertions
- Tests a single (small) unit of code
- Happy path only
- New code with old tests
- Redundant tests

void TestPushPop() {
  int item1 = 3, item2 = 5;
  IntStack s = new IntStack();
  s.Push(item1);
  s.Push(item2);
  Assert.IsTrue(s.Pop() == item2);
}

- Three ingredients:
  - Data
  - Method Sequence
  - Assertions

- Which value matters? (e.g., the 5 in s.Push(5);)
- Redundant, incomplete test suites
- Does not evolve with the code under test

void TestPushPopPUT4(IntStack s, int i) {
  PexAssume.IsTrue(s != null);
  PexAssume.IsTrue(i >= 0);
  s.Push(i);
  PexAssert.IsTrue(s.Pop() == i);
}

- Parameterized Unit Test = Unit Test with Parameters
- Separation of concerns:
  - Data is generated by a tool
  - Human takes care of the functional specification
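Concretely, a test generation tool instantiates such a PUT by picking argument values that satisfy its assumptions and emitting them as a traditional unit test. A minimal sketch of what one generated test could look like (the method name and the chosen values are illustrative, not actual Pex output):

[TestMethod]
public void TestPushPopPUT4_Generated0() {
  IntStack s = new IntStack();  // generated receiver, satisfies s != null
  TestPushPopPUT4(s, 0);        // generated value, satisfies i >= 0
}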

- A Parameterized Unit Test can be read as a universally quantified, conditional axiom.

void TestPushPopPUT4(IntStack s, int i) {
  PexAssume.IsTrue(s != null);
  PexAssume.IsTrue(i >= 0);
  s.Push(i);
  PexAssert.IsTrue(s.Pop() == i);
}

∀ IntStack s, ∀ int i:
  s ≠ null ⋀ i >= 0 ⇒ equals(Pop(Push(s, i)), i)

[Diagram: the Test Project contains the Parameterized Unit Tests and the tests Pex generates from them; the Test Project references the Code Under Test Project.]

- Pex is a test input generator
- Pex starts from parameterized unit tests
- Generated tests are emitted as traditional unit tests
- Pex analyzes execution paths
  - Analysis at the level of .NET instructions (MSIL)
  - Dynamic symbolic execution (i.e., directed random testing in DART, concolic testing in CUTE, …)

Pex is being used both inside and outside of Microsoft:
- Publicly available with both commercial and academic licenses
- Being integrated into Visual Studio
- Being taught at the NCSU graduate testing course
- ICSE 2009 Tutorial on PUT
- …

- Assume, Arrange, Act, Assert

void TestPushPopPUT3(int i) {
  PexAssume.IsTrue(i >= 0);        // assume
  IntStack s = new IntStack();     // arrange
  s.Push(i);                       // act
  PexAssert.IsTrue(s.Pop() == i);  // assert
}

- Stronger assumptions (not good)
- Weaker assertions (not good)

void TestPushPopPUT3(int i) {
  PexAssume.IsTrue(i >= 0);
  IntStack s = new IntStack();
  s.Push(i);
  PexAssert.IsTrue(s.Pop() == i);
}

void TestPushPopPUT4(IntStack s, int i) {
  PexAssume.IsTrue(s != null);
  PexAssume.IsTrue(i >= 0);
  s.Push(i);
  PexAssert.IsTrue(s.Pop() == i);
}

Detecting them is challenging, too.
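To make "weaker assertion" concrete, consider this hypothetical variant of TestPushPopPUT3 (an illustration, not from the slides) whose assertion permits more behaviors than intended:

void TestPushPopPUT3_Weak(int i) {
  PexAssume.IsTrue(i >= 0);
  IntStack s = new IntStack();
  s.Push(i);
  // Weaker than "s.Pop() == i": this also passes for a buggy Pop that
  // returns any non-negative value, so real defects can slip through.
  PexAssert.IsTrue(s.Pop() >= 0);
}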

- Key idea for detecting stronger assumptions: weakening assumptions while violating no assertions in the PUT
- Key idea for detecting weaker assertions: strengthening assertions while still being satisfied by the generated test inputs

- Key idea for detecting stronger assumptions: weakening assumptions (producing a mutant PUT) while violating no assertions in the PUT (the mutant remains live, i.e., is not killed)
- Key idea for detecting weaker assertions: strengthening assertions (producing a mutant PUT) while still being satisfied by the generated test inputs (the mutant remains live, i.e., is not killed)

- A mutant PUT is live if no test inputs can be generated (by a test generation tool) that
  - violate the specified assertions and
  - satisfy the specified assumptions
- A live mutant PUT indicates a likely PUT improvement:
  - generalization of assumptions
  - specialization of assertions
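A minimal sketch of this killing criterion, modeling a PUT over a single int parameter as an assumption predicate plus an assertion predicate (the class, names, and types here are illustrative, not the tool's implementation):

using System;
using System.Collections.Generic;
using System.Linq;

static class MutantPutAnalysis
{
    // A mutant PUT is killed when some generated input satisfies its
    // (possibly weakened) assumptions yet violates its (possibly
    // strengthened) assertions; otherwise the mutant stays live.
    public static bool IsLive(
        Func<int, bool> assume,
        Func<int, bool> assertHolds,
        IEnumerable<int> generatedInputs)
    {
        return !generatedInputs.Where(assume).Any(i => !assertHolds(i));
    }
}

For example, weakening the assumption of TestPushPopPUT2 from i > 0 to i >= 0 yields a live mutant, because the newly admitted input i == 0 still satisfies the assertion; the live mutant flags the original assumption as stronger than necessary.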

- Assumption Weakening: weaken constraints specified in assumptions
- Assertion Strengthening: strengthen constraints specified in assertions
- Primitive-Value Generalization: replace a primitive value with an additional parameter (related to assumption weakening)
- Method-Invocation Deletion: delete a method invocation (related to assumption weakening)

Deleting an assumption from the PUT:

void TestPushPopPUT1(int j) {
  PexAssume.IsTrue(j >= 0);  // mutant: this assumption is deleted
  IntStack s = new IntStack();
  s.Push(j);
  s.Push(5);
  PexAssert.IsTrue(s.Pop() == 5);
}

Weakening a clause in an assumption (P > Q → P >= Q):

void TestPushPopPUT2(int i) {
  PexAssume.IsTrue(i > 0);  // mutant: weakened to PexAssume.IsTrue(i >= 0);
  IntStack s = new IntStack();
  s.Push(i);
  PexAssert.IsTrue(s.Pop() == i);
}

Strengthening a clause in an assertion:
- strengthen P > Q to P > (Q + const)
- strengthen P > Q to P == (Q + const)

void TestPushPopPUT1(int j) {
  IntStack s = new IntStack();
  s.Push(j);
  s.Push(5);
  // successively stronger candidate assertions:
  PexAssert.IsTrue(s.Pop() > -1);
  PexAssert.IsTrue(s.Pop() > 0);
  PexAssert.IsTrue(s.Pop() == 5);
}

void TestPushPopPUT() {
  IntStack s = new IntStack();
  s.Push(3);
  s.Push(5);
  PexAssert.IsTrue(s.Pop() == 5);
}

void TestPushPopPUT1(int j) {
  IntStack s = new IntStack();
  s.Push(j);  // primitive value 3 generalized into parameter j
  s.Push(5);
  PexAssert.IsTrue(s.Pop() == 5);
}

void TestPushPopPUT3(int i) {
  PexAssume.IsTrue(i >= 0);
  IntStack s = new IntStack();
  s.Push(i);
  PexAssert.IsTrue(s.Pop() == i);
}

void TestPushPopPUT4(IntStack s, int i) {
  PexAssume.IsTrue(s != null);  // the fixed new IntStack() generalized into parameter s
  PexAssume.IsTrue(i >= 0);
  s.Push(i);
  PexAssert.IsTrue(s.Pop() == i);
}
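The remaining operator, method-invocation deletion, can be illustrated on TestPushPopPUT1 (a hypothetical mutant for illustration; the slides do not show this one):

void TestPushPopPUT1_MID(int j) {
  IntStack s = new IntStack();
  // mutant: the invocation "s.Push(j);" has been deleted
  s.Push(5);
  PexAssert.IsTrue(s.Pop() == 5);
}

If this mutant stays live, the deleted call apparently does not constrain the asserted behavior, hinting that the PUT's setup can be generalized or that a stronger assertion is missing.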

- Writing good PUTs is challenging:
  - stronger assumptions
  - weaker assertions
- Mutation analysis of PUTs:
  - mutation killing
  - mutation operators:
    - assumption weakening
    - assertion strengthening
    - primitive-value generalization
    - method-invocation deletion

Leveraging the Visual Studio integration

- Code to generate inputs for:

void CoverMe(int[] a) {
  if (a == null) return;
  if (a.Length > 0)
    if (a[0] == 123…)  // comparison constant truncated in the transcript
      throw new Exception("bug");
}

The exploration loop: execute & monitor, solve the negated branch condition, choose the next path; done when there is no path left.

Run | Data   | Observed constraints                 | Constraint to solve next
1   | null   | a==null                              | a!=null
2   | {}     | a!=null && !(a.Length>0)             | a!=null && a.Length>0
3   | {0}    | a!=null && a.Length>0 && a[0]!=123…  | a!=null && a.Length>0 && a[0]==123…
4   | {123…} | a!=null && a.Length>0 && a[0]==123…  | done: there is no path left
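Written out as the concrete tests this exploration would emit (a sketch assuming CoverMe is visible to the test class; the names are illustrative, and the fourth input depends on the constant truncated on the slide, so it is left as a comment):

using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class CoverMeGeneratedTests
{
    [TestMethod] public void Run1() { CoverMe(null); }         // path: a == null
    [TestMethod] public void Run2() { CoverMe(new int[0]); }   // path: a != null, a.Length == 0
    [TestMethod] public void Run3() { CoverMe(new[] { 0 }); }  // path: a[0] != constant
    // Run 4 would pass the solver-found constant as a[0],
    // driving execution into the "bug" exception.
}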

- Right-click on the method name
- Select "Run Pex Explorations Here!"

- Exploration Status
- Current Parameterized Unit Test
- Issue bar: important messages appear here!
- Input/Output table: row = generated test; column = parameterized test input or output

- Event filtering
- Select and see details
- Apply Pex suggested fix

- Attributes

Test outcome filtering:
- Passing test
- Failing test
- New test
- Fix available

- Exception stack trace
- Review test
- Allow exception