Equivalence Class Testing, Decision Table Testing UC Santa Cruz CMPS 171 – Game Design Studio II courses.soe.ucsc.edu/courses/cmps171/Winter12/01

Presentation transcript:

Equivalence Class Testing, Decision Table Testing
UC Santa Cruz CMPS 171 – Game Design Studio II
courses.soe.ucsc.edu/courses/cmps171/Winter12/01
23 February 2012

Upcoming Events
- February 28: Visit by Microsoft Studios
  - Game crits by staff from Microsoft Studios
  - Discussion of career opportunities
- YetiZen: Everything you thought you knew about start-up building & fundraising, told by Storm8 & Charles Hudson
  - In San Francisco, Thursday, Feb. 23, 6pm
  - See me/email me for a free registration code
- Prom Week released last week
  - Please consider voting for it in the IGF Audience Choice awards

Upcoming deadlines
- Thursday (February 23)
  - Sprint 2 ends
  - Game playtesting plan
  - Web site framework (make sure the website URL is in your Sprint 2 report)
- Friday (February 24)
  - Sprint 2 report due
  - Team status reporting due
  - Sprint 3 begins: Friday, February 24
- Monday (February 27)
  - Sprint 3 plan due

Game Playtesting Plan
- 1-2 page document, now due Tuesday
- Answer the following questions:
  - Who will conduct tests and write the playtest report?
    - Should have 1-2 people on your team for whom this is a major task
  - How will you recruit testers?
    - Just sending out emails isn't likely to get a lot of responses
    - Need to personally recruit friends, friends of friends, put up flyers, etc.
    - Whole team effort
  - When/where will you conduct playtests?
    - What day of the week will you conduct playtests? If in the lab, where, exactly?
  - What specific gameplay issues do you want to focus on in the first few weeks of playtesting?
  - Do you have any special equipment needs for conducting playtests?

Lab cleanup schedule
- This week: Hello World
- Next week (and through the end of the quarter): Puzzle Defense
- Team duties:
  - Ensure overflowing trash cans are emptied to the bin outside in the 3rd floor courtyard (anytime during the week)
  - By 5pm Monday (Tuesday this week, due to the holiday) and 5pm Friday (unless things get out of control, then more often):
    - Pick up food containers, bottles, etc.
    - Pick up stray craft materials, pens, etc. and return them to the drawers
    - Clean off tables in conference rooms and the big circular table
    - Report any major soda/food spills to me, so we can call cleanup crews
    - Put controllers/game boxes/etc. away (tidy up the game area)
    - Report any cleaning materials needed

Class for Artists
- Next quarter all team artists should take Art 105 (Special Topics in Drawing)
  - The instructor is listed as “Staff” online, but is actually Richard Wohlfeiler
- Team artists who are not in 172 and who are non-Art majors should also take the class
- The class is closed, and each artist will need a permission code to get in
  - I need to send the Art Dept. a list of the names of students who will receive permission codes

Ultra top secret info about classes next quarter
- I'll be sending this out soon, but you heard it here first
- Lots of new classes next quarter:
  - CS 26 (Intro. Computer Animation)
    - 3D animation with Blender (Yonge), CS 25 as prereq.
  - CS 119 (Software for Society)
    - Applying computing to social issues (Davis)
  - CS 121 (Mobile Applications) – CGE
    - Android programming (de Alfaro)
  - CS 162 (Advanced Computer Graphics / Animation Lab) – CGE
    - Continuation of CS 160, focused on computer animation (Pang)
  - CS 179 (Game Design Practicum) – CGE
    - Make 3 games using Kinect (Whitehead)

Slots for game critiques on Tuesday
- Microsoft visit is next Tuesday
- Game demos in the game lab
  - ~5 min presentation, ~5 min playing the game, ~10 min Q&A and crit
- 2:00: Firewall
- 2:20: Sonar
- 2:40: Hello World
- 3:00: MicroVentures
- 3:20: Puzzle Defense
- 3:40: Chroma
- 4:00: Devil's Bargain

Functional vs Structural Testing
- There are two fundamental approaches to testing software:
  - Functional (Black Box)
    - No detailed knowledge of internal implementation
    - Treats the system as a function mapping inputs to outputs
  - Structural (White Box or Clear Box)
    - Can use detailed knowledge of the internal implementation
    - Can "see" inside the implementation
- (Images: Flickr: rileyporter, Flickr: cjkershner)

Types of Functional Tests
- Boundary Value Testing
  - Test values just before, on, and after some input boundary
  - E.g., -1, 0, 1
- Equivalence Class Testing
  - Provide representative samples from within a large space of similar ("equivalent") inputs
- Decision-Table Testing
  - Create a table that lists all possible outcomes for a set of inputs
  - Use this table to drive test case selection

Equivalence Class Testing
- Motivating idea:
  - Many inputs to a program will exercise the same path through its source code
  - It is usually the case that, for a given path through the code, one set of inputs is just as good as any other set of inputs for a test case
    - Not always true – recall that for the triangle problem, for some implementation approaches, input values near Maxint would lead to integer overflow
- So, if it is possible to partition the input space (or output space), then the only tests necessary are one per equivalence class
- "The idea of equivalence class testing is to identify test cases by using one element from each equivalence class"
  - Software Testing: A Craftsman's Approach, 3rd ed., Paul C. Jorgensen

Example
- Consider collision detection for the hypothetical game "Lego Gradius"
- The Vic Viper needs to support parts of the ship being knocked off
  - A hit to the side wings by a point bullet only knocks off a wing
  - A hit to the main body destroys the entire ship
  - The tail area and front wings are immune to hits
- (Figure: the ship divided into regions by x-axis boundaries a, b, c, d and y-axis boundaries e, f, g, h; "Lots more Lego Gradius goodness…")
- Two input variables: bul_x and bul_y (the bullet location)
  - a ≤ bul_x ≤ d, with intervals [a, b), [b, c), [c, d]
  - e ≤ bul_y ≤ h, with intervals [e, f), [f, g], (g, h]
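To make the partition concrete, here is a minimal sketch in Python. The boundary numbers A–H and the per-class representative values are placeholders I invented; only the variable names bul_x/bul_y and the interval structure come from the slide.

```python
# Hypothetical boundary coordinates for the regions sketched above
# (placeholder numbers; the real values depend on the ship sprite).
A, B, C, D = 0, 10, 30, 40    # a <= bul_x <= d
E, F, G, H = 0, 15, 25, 40    # e <= bul_y <= h

# One representative value per valid equivalence class of each input variable.
BUL_X_CLASSES = {"[a,b)": 5, "[b,c)": 20, "[c,d]": 35}
BUL_Y_CLASSES = {"[e,f)": 7, "[f,g]": 20, "(g,h]": 30}
```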

Weak Normal Equivalence Testing
- Pick one value from each equivalence class for each input variable (using the bul_x and bul_y intervals from the previous slide)
  - Test case 1: {[a,b), [e,f)}
  - Test case 2: {[b,c), [f,g]}
  - Test case 3: {[c,d], (g,h]}
- Problem: misses some combinations of equivalence classes
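A sketch of weak normal selection, reusing the hypothetical class representatives defined after the Example slide: the classes are simply paired up positionally, one test per class.

```python
# Weak normal: one value per class, paired positionally across the two
# variables, so the number of tests equals the largest number of classes (3).
weak_normal = list(zip(BUL_X_CLASSES.values(), BUL_Y_CLASSES.values()))
print(weak_normal)   # [(5, 7), (20, 20), (35, 30)]
```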

Strong Normal Equivalence Testing
- Pick one value from the combination of each equivalence interval for one variable with every equivalence interval for the second variable
  - Test case 1: {[a,b), [e,f)}
  - Test case 2: {[a,b), [f,g]}
  - Test case 3: {[a,b), (g,h]}
  - Test case 4: {[b,c), [e,f)}
  - …
- Problem: doesn't test values outside the valid intervals
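With the same hypothetical class dictionaries, strong normal is just the Cartesian product of the valid classes:

```python
from itertools import product

# Strong normal: every valid class of bul_x combined with every valid class
# of bul_y, giving 3 x 3 = 9 test cases.
strong_normal = list(product(BUL_X_CLASSES.values(), BUL_Y_CLASSES.values()))
print(len(strong_normal))   # 9
```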

Weak Robust Equivalence Testing
- Pick one value from each equivalence class for each input variable, and include invalid values as equivalence classes
  - Test case 1: {< a, [e,f)}
  - Test case 2: {[a,b), [e,f)}
  - Test case 3: {[b,c), [f,g]}
  - Test case 4: {[c,d], (g,h]}
  - Test case 5: {> d, > h}
  - Test case 6: {[b,c), < e}
- Problem: misses some combinations of equivalence classes
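A sketch of the weak robust variant, again building on the earlier hypothetical definitions; the invalid representative values are arbitrary choices outside the valid ranges.

```python
# Invalid classes for each variable (representative values are hypothetical).
BUL_X_INVALID = {"< a": A - 1, "> d": D + 1}
BUL_Y_INVALID = {"< e": E - 1, "> h": H + 1}

# Weak robust: the weak normal cases plus one case per invalid class, where
# each invalid value is paired with some valid value of the other variable.
weak_robust = (weak_normal
               + [(x, BUL_Y_CLASSES["[e,f)"]) for x in BUL_X_INVALID.values()]
               + [(BUL_X_CLASSES["[b,c)"], y) for y in BUL_Y_INVALID.values()])
print(len(weak_robust))   # 3 valid cases + 4 invalid-value cases = 7
```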

Strong Robust Equivalence Testing
- Pick one value from the combination of each equivalence interval for one variable with every equivalence interval for the second variable, including invalid values as equivalence classes
  - Invalid classes: bul_x < a, bul_x > d, bul_y < e, bul_y > h (in addition to the valid intervals)
  - Test case 1: {[a,b), [e,f)}
  - Test case 2: {[a,b), [f,g]}
  - Test case 3: {[a,b), (g,h]}
  - Test case 4: {[b,c), [e,f)}
  - …
- Problem: none, but it results in lots of test cases (expensive)
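The strong robust set, continuing the same hypothetical sketch, is the Cartesian product over the valid and invalid classes together:

```python
from itertools import product

# Strong robust: Cartesian product over valid AND invalid classes,
# (3 + 2) x (3 + 2) = 25 test cases for this two-variable example.
all_x = list(BUL_X_CLASSES.values()) + list(BUL_X_INVALID.values())
all_y = list(BUL_Y_CLASSES.values()) + list(BUL_Y_INVALID.values())
strong_robust = list(product(all_x, all_y))
print(len(strong_robust))   # 25
```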

Triangle problem (revisited)
- Triangle problem:
  - A program reads three integer values from the console. The three values are interpreted as representing the lengths of the sides of a triangle. The program prints a message that states whether the triangle is scalene, isosceles, or equilateral.
- Recall:
  - Equilateral triangle: three equal sides, three equal angles (all 60 degrees)
  - Isosceles triangle: two equal sides, two equal angles
  - Scalene triangle: no equal sides, no equal angles
- Use output classes
  - In this case, it is best to develop test cases based on output equivalence classes
  - What are they?
  - What are examples of weak normal, strong normal, weak robust, and strong robust test cases? (Assume a maximum side length of 200.)
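One way to answer "what are they?": the output equivalence classes are equilateral, isosceles, scalene, and not-a-triangle. A hedged sketch with one hypothetical representative input per output class (side lengths chosen by me, within the assumed 1–200 range):

```python
# Weak-normal-style test set driven by OUTPUT equivalence classes:
# one representative (a, b, c) input per possible program output.
OUTPUT_CLASS_TESTS = {
    "equilateral":    (5, 5, 5),
    "isosceles":      (5, 5, 8),
    "scalene":        (3, 4, 5),
    "not a triangle": (1, 2, 200),   # 1 + 2 < 200 violates the triangle inequality
}
```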

Decision Table Testing
- Core idea:
  - Use a table format to capture the relationship between input conditions and output actions
  - Then, use combinations of conditions and actions to develop test cases
- Benefit: tables provide a rigorous way of specifying conditions and actions, since omissions are very clear (an empty part of the table)
- Decision table format: a stub portion (the condition and action labels) and an entry portion (the rule columns)

Decision Table Format
- Conditions describe logical states the software can be in
  - Often describe ranges of inputs
- Actions describe possible things the software can do
  - Often these are the outputs
- Rules describe what actions (if any) occur for a given set of conditions
- (Figure: a table with the condition stub and action stub on the left, and columns Rule 1, Rule 2, …, Rule n on the right)

Triangle Problem as Decision Table

Rule                       | R1 | R2 | R3 | R4 | R5 | R6 | R7 | R8 | R9 |
C1: a,b,c form a triangle? |  F |  T |  T |  T |  T |  T |  T |  T |  T |
C2: a=b?                   |  - |  T |  T |  T |  T |  F |  F |  F |  F |
C3: a=c?                   |  - |  T |  T |  F |  F |  T |  T |  F |  F |
C4: b=c?                   |  - |  T |  F |  T |  F |  T |  F |  T |  F |
A1: Not a triangle         |  X |    |    |    |    |    |    |    |    |
A2: Scalene                |    |    |    |    |    |    |    |    |    |
A3: Isosceles              |    |    |    |    |    |    |    |    |    |
A4: Equilateral            |    |  X |    |    |    |    |    |    |    |
A5: Impossible             |    |    |  X |  X |    |    |    |    |    |

From: Software Testing: A Craftsman's Approach, 3rd ed.
In class: fill in the last 5 columns

Triangle Problem as Decision Table (completed)

Rule                       | R1 | R2 | R3 | R4 | R5 | R6 | R7 | R8 | R9 |
C1: a,b,c form a triangle? |  F |  T |  T |  T |  T |  T |  T |  T |  T |
C2: a=b?                   |  - |  T |  T |  T |  T |  F |  F |  F |  F |
C3: a=c?                   |  - |  T |  T |  F |  F |  T |  T |  F |  F |
C4: b=c?                   |  - |  T |  F |  T |  F |  T |  F |  T |  F |
A1: Not a triangle         |  X |    |    |    |    |    |    |    |    |
A2: Scalene                |    |    |    |    |    |    |    |    |  X |
A3: Isosceles              |    |    |    |    |  X |    |  X |  X |    |
A4: Equilateral            |    |  X |    |    |    |    |    |    |    |
A5: Impossible             |    |    |  X |  X |    |  X |    |    |    |

From: Software Testing: A Craftsman's Approach, 3rd ed.
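The completed table can also be written down directly as data. The rule contents come from the table above; encoding it as a Python list is my own sketch.

```python
# Each rule: the (C1, C2, C3, C4) entries and the action that gets an X.
# "-" marks a don't-care entry.
TRIANGLE_DECISION_TABLE = [
    (("F", "-", "-", "-"), "Not a triangle"),   # R1
    (("T", "T", "T", "T"), "Equilateral"),      # R2
    (("T", "T", "T", "F"), "Impossible"),       # R3
    (("T", "T", "F", "T"), "Impossible"),       # R4
    (("T", "T", "F", "F"), "Isosceles"),        # R5
    (("T", "F", "T", "T"), "Impossible"),       # R6
    (("T", "F", "T", "F"), "Isosceles"),        # R7
    (("T", "F", "F", "T"), "Isosceles"),        # R8
    (("T", "F", "F", "F"), "Scalene"),          # R9
]
```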

Generating Test Cases from a Decision Table
- For each rule (column) in the table, pick actual input values that satisfy the conditions, and then verify that the output state(s) match the actions
- In class:
  - Find example test cases from the complete decision table for the triangle problem
  - What are the pros and cons of these test cases?
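One possible answer to the in-class exercise, with hypothetical concrete side lengths chosen by me for each satisfiable rule. Note that rules R3, R4, and R6 are logically impossible (their condition combinations cannot occur), so no concrete inputs exist for them; that is one of the "cons" worth discussing.

```python
# One concrete (a, b, c) input per satisfiable rule, plus the expected output.
DECISION_TABLE_TESTS = [
    ((1, 2, 5),  "Not a triangle"),   # R1: 1 + 2 < 5
    ((4, 4, 4),  "Equilateral"),      # R2: a = b = c
    ((6, 6, 10), "Isosceles"),        # R5: a = b, b != c
    ((6, 10, 6), "Isosceles"),        # R7: a = c, a != b
    ((10, 6, 6), "Isosceles"),        # R8: b = c, a != b
    ((3, 4, 5),  "Scalene"),          # R9: all sides different
]
```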

Random Testing
- One final black box testing technique is to use a random number generator to create inputs
- Requires a stopping criterion
  - Could generate random numbers forever – how many do you want?
- Pros:
  - Can make a lot of tests, cheaply
  - Explores a larger, more varied part of the input space
- Cons:
  - Need to control the seed to ensure test cases are repeatable (otherwise they vary with each run)
  - Tests are more computationally expensive to run
  - Tests may require more work to create the first time
- Example: random testing the triangle problem
  - Randomly pick triangle side values
  - Stopping criterion: continue until every possible output has been seen at least once
  - Example (from the book): 1289 test cases, of which 663 are non-triangles, 593 are scalene, 32 are isosceles, and 1 is equilateral
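A sketch of the random-testing loop with a fixed seed and the "every output seen at least once" stopping criterion. It assumes a classify_triangle oracle function (a version is sketched under the next slide); the seed and the max_side default are arbitrary choices of mine.

```python
import random

def random_triangle_tests(seed=2012, max_side=200):
    """Generate random (a, b, c) test cases until every output class of the
    triangle problem has been observed at least once."""
    rng = random.Random(seed)   # fixed seed -> the same test suite on every run
    targets = {"equilateral", "isosceles", "scalene", "not a triangle"}
    seen, cases = set(), []
    while seen != targets:
        a, b, c = (rng.randint(1, max_side) for _ in range(3))
        expected = classify_triangle(a, b, c)   # computational oracle (next slide)
        cases.append(((a, b, c), expected))
        seen.add(expected)
    return cases
```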

Oracles
- No, not the database company
- Think Oracle of Delphi
  - Tells the truth about your future
- A test oracle is a function that determines whether the program's output for a given set of inputs is correct or incorrect
- A human being often acts as the test oracle
  - Either manually testing code, or determining what the outputs should be for an automatically run regression test with static inputs
- For random testing, the test oracle must be a computational function
  - It can often be as difficult to make the test oracle as it is to create the test
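For the triangle problem, the computational oracle is simple enough to write by hand; a minimal sketch (the returned strings are my own labels, matching the output classes above):

```python
def classify_triangle(a, b, c):
    """Test oracle: return the output the triangle program *should* produce
    for integer side lengths a, b, c."""
    if a + b <= c or a + c <= b or b + c <= a:   # triangle inequality fails
        return "not a triangle"
    if a == b == c:
        return "equilateral"
    if a == b or a == c or b == c:
        return "isosceles"
    return "scalene"
```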
