Integrating Testing into the Curriculum – Arsenic in Small Doses
Edward L. Jones, CIS Department, Florida A&M University, Tallahassee, FL USA
SIGCSE 2001, 21-25 Feb 2001

Lessons from ACE2000
- Practice (10 slides)
- SPRAE framework, carefully explained
- Arsenic pills:
  - 7 things to do
  - Automated grading
  - Student mentoring
- Tester certification in the experience factory
- On-going work

Motivation
- Industry need
- Testing experience adds value
- Avoid the isolationist view – testing confined to a single course
- Salt courses with test-related experiences
- Provide opportunity for advanced study
- Impact our curriculum

Possible Approaches
- Teach a course in software testing
- Expose student work to rigorous testing
- Train selected students
- Opportunistic insertion of testing into existing courses
- Formal training environment

The Holistic Approach
- Software Test Lab
- Core Curriculum
- Elective Testing Course
- Testing in Action (Automated Grading)
- SPRAE Testing Framework

What Is Meant by Holistic
- Unifying framework – what's important
- Experience-based
- Experiences aligned to the framework
- Multiple experiences in different contexts
- Multiple means of delivery

The SPRAE Framework
- S: Specification – the basis for testing
- P: Premeditation – a systematic process
- R: Repeatability – tester independence
- A: Accountability – documented process, results
- E: Economy – cost-effective practices

Test Life Cycle
- Phases: Analysis, Design, Implementation, Execution, Evaluation
- Artifacts: Specification, Test Plan, Test Cases, Test Script, Data, Driver, Test Results, Log, Defect Data, Problem Reports

The Software Testing Course
- 80% practice, 20% theory
- 2/3 fine-grained testing (functions), 1/3 object and application testing
- Test cases, drivers, and scripts
- Decision tables – the "formalism" of choice
- Functional, boundary, and white-box testing
- Evaluation via coverage & error seeding

The Course – Lessons Learned
- Advantages:
  - In-depth, continuous concept treatment
  - Complement to other software skills
  - Basis for future advanced work
- Deficiencies:
  - Why Johnny can't test: weak programming skills
  - Not a mainstream course available to all
  - Students compartmentalize course content

Testing in Action – Automated Program Grading
- Instructor workflow: Prepare Assignment → Implement Grader → Grade Programs
- Student workflow: Write Program from the Assignment Specification; submit the Student Program
- Automated grader overhead: Assignment Specification, Test Plan, Test Cases, Test Driver, Checker Script (checker idea sketched below)
- Grader outputs: Grading Log, Grade Report
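The checker idea can be illustrated with a minimal C++ sketch that compares each student output file against the instructor's expected output and writes the grading log and a grade summary. The file layout (tests/caseN.expected, student/caseN.out), the number of test cases, and the scoring scheme are assumptions for illustration, not the actual FAMU grader.

    // checker.cpp – illustrative sketch of an automated-grading checker
    // (hypothetical layout: tests/caseN.expected vs. student/caseN.out).
    #include <fstream>
    #include <iostream>
    #include <sstream>
    #include <string>

    // Compare two output files line by line; any difference fails the case.
    static bool outputsMatch(const std::string& expectedFile, const std::string& actualFile) {
        std::ifstream exp(expectedFile), act(actualFile);
        if (!exp || !act) return false;              // a missing file counts as a failure
        std::string e, a;
        while (std::getline(exp, e)) {
            if (!std::getline(act, a) || e != a) return false;
        }
        return !std::getline(act, a);                // actual output must not have extra lines
    }

    int main() {
        const int numCases = 5;                      // hypothetical test-plan size
        int passed = 0;
        std::ofstream log("grading.log");            // the grading log of the slide
        for (int i = 1; i <= numCases; ++i) {
            std::ostringstream expName, actName;
            expName << "tests/case" << i << ".expected";
            actName << "student/case" << i << ".out";
            bool ok = outputsMatch(expName.str(), actName.str());
            if (ok) ++passed;
            log << "case " << i << ": " << (ok ? "PASS" : "FAIL") << '\n';
        }
        log << "score: " << passed << '/' << numCases << '\n';
        std::cout << "Grade report: " << passed << " of " << numCases << " test cases passed\n";
        return 0;
    }

The exactness that students react to comes from precisely this kind of literal comparison; any tolerance (whitespace, number formatting) has to be designed in deliberately.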

Results – Automated Program Grading
- Student shock & outrage at exactness
- Behavior modification – the tester mindset
- Extra work for the teacher:
  - Specification must be better than average!
  - Practice what you preach (test process)
  - Cost amortized via similar assignment styles
- Is the grader too strict for CS1/CS2?
- Selling colleagues on the idea!

Opportunistic Insertion into Existing Courses
- Risk of diluting course content
- Hard to transfer to colleagues (comfort zone with subject matter)
- Value-added approach:
  - Testing strengthens other skills
  - Testing brings objectivity to the student
- Some examples

Simple Things to Try
- Grade another student's program and justify the grade – in writing.
- (Group) Develop test cases before writing the program.
- Treasure hunt: find seeded errors, document the process followed, give evidence of the fix.
- Develop and sell certified components to be used in subsequent assignments.
- Blind testing: write a specification from an executable.

Mentoring Students – Experiences
- Hand-picked students – paid
- Series of testing projects
- Skills taught as needed
- Student access to examples from courses and other students
- MANAGEMENT NIGHTMARE!
- A structured environment and process are needed!

The Software TestLab
- Environment for discovery & learning
- An evolving laboratory: tools & tutorials, staff (students, faculty), test problem/solution testbed
- Students participate in the evolution: feedback on lab resources, create/refine resources
- Technology insertion into the classroom
(Vision)

TestLab – The Big Picture
- The Software TestLab connects the curriculum and its students with corporate sponsors
- It produces research, publications, marketing, interns, and new hires, and receives support in return

Student Mentorship Model
- Manage skill development
- Set clear achievement goals: Key Practices × Competency Levels
- Certify levels of progression
- Enable student-student mentoring
- Establish a recognition program
(Mentorship)

Key Practices
- Practitioner – performs a defined test.
- Builder – constructs test "machinery".
- Designer – designs test cases.
- Analyst – determines test needs and strategy.
- Inspector – verifies correct process and results.
- Environmentalist – establishes & maintains the test environment.
- Specialist – performs the entire test life cycle.
(Mentorship)

Competency Levels
- Test Practitioner
- Test Builder
- Test Designer
- Test Analyst
- Test Inspector
- Test Environmentalist
- Test Specialist 1
(Mentorship)

Test Specialist I
- Practitioner I – run a function test script & document test results.
- Builder I – develop a function test driver.
- Designer I – develop functional and boundary test cases from a decision table.
- Analyst I – develop a decision table from a functional specification.
- Inspector I – verify adherence to the function test process and standards.
(Mentorship)

Certification Testbed
- Repository of testing problems
- Repository of students' test artifacts
- Best-in-class solutions promoted to the solutions testbed
- Deficient solutions used for future tester certification
(Infrastructure)

Conclusions
- Testing must compete for air time with existing subject matter
- Opportunities to insert testing exist
- Testing can bring added value to a course
- Need rigorous study of the value-added hypothesis
- The biggest job may be selling colleagues

On-Going & Future Work
- Evolve the TestLab Mentorship Model:
  - Experience ladder & certification
  - Evolving problem/solution artifacts
- Careful study of the value-added hypothesis
- Exploit automated grading – the student mindset
- Disseminate results

Questions?

Thank You

Training Sequence (1): TestLab Environment Basic Training
- Unix basics
- C++/language refresher
- Encapsulation of the function under test
- Repositories
(Infrastructure)

Training Sequence (2): Black-Box Function (Unit) Testing
- Specification
- Stimulus-response test cases
- Function test driver, 5 styles (one style sketched below)
- Test driver input/results files
- Test script (set-up + perform)
- Test log
- Decision tables
- Functional (partition) test case design
- Boundary test case design
(Infrastructure)
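One plausible shape for a black-box function test driver with input/results files is sketched below. The function under test (max3), the file names, and the one-case-per-line file format are illustrative assumptions; the course's five driver styles are not reproduced here.

    // driver.cpp – illustrative black-box function test driver.
    // Assumed input format, one test case per line:  id a b c expected
    #include <fstream>
    #include <iostream>

    // Hypothetical function under test.
    int max3(int a, int b, int c) {
        int m = (a > b) ? a : b;
        return (m > c) ? m : c;
    }

    int main() {
        std::ifstream cases("max3.cases");           // test driver input file
        std::ofstream results("max3.results");       // test results file / test log
        int id, a, b, c, expected, failures = 0;
        while (cases >> id >> a >> b >> c >> expected) {
            int actual = max3(a, b, c);              // apply the stimulus
            bool pass = (actual == expected);        // compare to the expected response
            if (!pass) ++failures;
            results << "case " << id << ": expected " << expected
                    << ", got " << actual << (pass ? "  PASS" : "  FAIL") << '\n';
        }
        std::cout << (failures == 0 ? "all cases passed" : "some cases failed") << '\n';
        return failures;
    }

Because the stimuli and expected responses live in a data file, the same driver is rerun unchanged as the test set grows – the repeatability and economy principles in miniature.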

Training Sequence (3): Black-Box Object Testing
- Specification of the object's methods
- Analysis of each method's stimulus-response
- Test planning
- Test cases = method + stimulus + response
- Object test driver (sketched below)
- Object test driver input/results files
- Operational test scenarios
(Infrastructure)
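A rough sketch of the object test driver idea, where each test case names a method plus its stimulus and expected response, and the cases together form an operational scenario against one object. The Counter class, the file format, and all names are assumptions for illustration only.

    // objdriver.cpp – illustrative black-box object test driver.
    // Assumed driver input format, one test case per line:
    //   <method> <argument> <expected-result>     e.g.  add 5 5
    #include <fstream>
    #include <iostream>
    #include <string>

    class Counter {                                  // hypothetical object under test
        int value = 0;
    public:
        int add(int n) { value += n; return value; }
        int reset()    { value = 0;  return value; }
        int current() const { return value; }
    };

    int main() {
        std::ifstream cases("counter.cases");        // object test driver input file
        std::ofstream results("counter.results");    // object test driver results file
        Counter c;                                   // one object carries the whole scenario
        std::string method;
        int arg, expected, failures = 0, id = 0;
        while (cases >> method >> arg >> expected) {
            ++id;
            int actual;
            if (method == "add")        actual = c.add(arg);
            else if (method == "reset") actual = c.reset();
            else                        actual = c.current();   // "current" ignores its argument
            bool pass = (actual == expected);
            if (!pass) ++failures;
            results << "case " << id << " (" << method << "): expected " << expected
                    << ", got " << actual << (pass ? "  PASS" : "  FAIL") << '\n';
        }
        return failures;
    }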

Training Sequence (4): White-Box Function Testing
- Control flow graph basics
- Coverage criteria/measures
- Instrumentation for data collection (sketched below)
- Use of in-house coverage tools
- Coverage analysis
- White-box coverage during black-box testing
- Supplemental white-box test cases
(Infrastructure)
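A minimal sketch of instrumentation for data collection: branch counters added by hand to the pay computation used as the running example later in this deck, so that a black-box test run also reports which branches it exercised. The counter names and the report format are assumptions; the in-house coverage tools mentioned on the slide are not shown.

    // coverage.cpp – hand instrumentation of the overtime branch in computePay.
    #include <iostream>

    // Branch execution counters (the collected coverage data).
    static long hitOvertime = 0;      // times the Hours > 40 branch executed
    static long hitRegular  = 0;      // times the Hours <= 40 branch executed

    double computePay(double hours, double rate) {
        if (hours > 40.0) {
            ++hitOvertime;                                     // instrumentation probe
            return 40.0 * rate + (hours - 40.0) * rate * 1.5;  // overtime at 1.5 x Rate
        } else {
            ++hitRegular;                                      // instrumentation probe
            return hours * rate;
        }
    }

    int main() {
        // Black-box test cases; coverage is observed as a side effect.
        computePay(30.0, 10.0);
        computePay(45.0, 10.0);
        int branches = 2, covered = (hitOvertime > 0) + (hitRegular > 0);
        std::cout << "overtime branch: " << hitOvertime << " hits\n"
                  << "regular branch:  " << hitRegular  << " hits\n"
                  << "branch coverage: " << covered << "/" << branches << '\n';
        return 0;
    }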

Training Sequence (5): Clear-Box Object Testing
- Goal: overcome information hiding
- Test windows into internal object state: set_state( … ), get_state( … ) (sketched below)
- Test case:
  - Stimulus: precondition + method-stimulus
  - Response: method-response + postcondition
- Clear-box object test driver
- Clear-box test oracles
(Infrastructure)
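A sketch of the test-window idea using a hypothetical BankAccount class: set_state/get_state expose internal state so a clear-box test can establish the precondition, apply the method stimulus, and check both the method response and the postcondition. The class, its methods, and the concrete signatures are illustrative assumptions, not course artifacts.

    // clearbox.cpp – clear-box object test using set_state / get_state test windows.
    #include <cassert>
    #include <iostream>

    class BankAccount {                      // hypothetical class under test
        double balance;
    public:
        BankAccount() : balance(0.0) {}
        bool withdraw(double amount) {       // method under test
            if (amount <= 0.0 || amount > balance) return false;
            balance -= amount;
            return true;
        }
        // Test windows into internal object state (compiled in for clear-box testing only).
        void set_state(double b) { balance = b; }
        double get_state() const { return balance; }
    };

    int main() {
        BankAccount acct;

        // Stimulus = precondition + method-stimulus
        acct.set_state(100.0);               // precondition: balance is 100
        bool ok = acct.withdraw(40.0);       // method-stimulus

        // Response = method-response + postcondition (checked by the test oracle)
        assert(ok);                          // method-response: withdrawal accepted
        assert(acct.get_state() == 60.0);    // postcondition: balance reduced to 60
        std::cout << "clear-box test passed\n";
        return 0;
    }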

Training Sequence
- TestLab Environment Basic Training
- Black-Box Function (Unit) Testing
- Black-Box Object Testing
- White-Box Function Testing
- Clear-Box Object Testing
(Infrastructure)

TestLab Infrastructure
- SPRAE Framework / Lifecycle
- Software Testing Course
- Training Sequence
- Standards & Techniques
- Student Mentorship Model
- Problem & Solution Testbeds
(Status)

Standard Products
- Specification (narrative, semiformal)
- Decision Table
- Test Plan
- Test Script
- Test Driver
- Test Driver Input File
- Test Results File
- Test Log
(Infrastructure)

Techniques
- Functional Equivalence Partitioning
- Boundary Value Analysis
- Function Encapsulation
- Control Flow Analysis
- Error-seeding for tester certification
(Infrastructure)

What Makes It Holistic?
- Testing is an integral part of the curriculum
- The goal is multiple test experiences
- At least one experience in each course
- Repetition and reinforcement
- Accumulation of experiences
- Coverage of the test lifecycle

Course Outline
1. Course Overview
2. Software Quality Assurance
3. The Practice of Testing
4. Specification-Driven Testing
5. Boundary Testing
6. Measuring Test Effectiveness
7. Testing Object-Oriented Software
8. Application Testing
9. Course Review & Wrap-Up

Automation Issues
- Does the teacher have the time?
- Is the grader too strict for CS1/CS2?
- Additional automation to lower cost
- The trap: "just a little more automation"
- Selling colleagues on the idea!

Outline
- The Holistic Approach
- The SPRAE Testing Framework
- The Software Testing Course
- Automated Program Grading
- Opportunistic Insertion
- The Software Test Lab
- Conclusions / Future Work

Example – Pay
- (S) Specification: Compute pay for an employee, given Hours worked and hourly pay Rate; overtime is paid at 1.5 times the hourly Rate for Hours above 40.
- Black box: inputs Hours and Rate; Compute Pay; output Pay
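A direct C++ rendering of the specification above, encapsulated as a single function in the way the course has students package a unit for testing; the function name and signature are an assumption.

    // pay.cpp – the Compute Pay function from the specification above.
    #include <iostream>

    // Pay = Hours * Rate, with Hours above 40 paid at 1.5 times the hourly Rate.
    double computePay(double hours, double rate) {
        if (hours <= 40.0)
            return hours * rate;
        return 40.0 * rate + (hours - 40.0) * rate * 1.5;
    }

    int main() {
        std::cout << computePay(30.0, 10.0) << '\n';   // 300: no overtime
        std::cout << computePay(45.0, 10.0) << '\n';   // 475: 400 regular + 75 overtime
        return 0;
    }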

Principle P
- Premeditation: use a systematic process to devise test cases based on the specification.
- Our technique: decision analysis
  - Identify behaviors
  - One test case per behavior
  - Determine the expected result

Example – Pay: Test Case Design
- Decision table: columns identify the behaviors to test.
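The decision table itself appears on the original slide as a figure and is not in the transcript. A plausible reconstruction for the Pay specification, with one column per behavior, might look like the following (illustrative, not the original table):

    Condition                                 | B1 (regular) | B2 (overtime)
    ------------------------------------------+--------------+--------------
    Hours <= 40                               |      Y       |      N
    Hours > 40                                |      N       |      Y
    ------------------------------------------+--------------+--------------
    Action                                    |              |
    Pay = Hours * Rate                        |      X       |
    Pay = 40*Rate + (Hours - 40)*1.5*Rate     |              |      X
    ------------------------------------------+--------------+--------------
    Sample test case (Rate = 10)              | Hours = 30,  | Hours = 45,
                                              | Pay = 300    | Pay = 475

Boundary test case design would then add cases at and just above the Hours = 40 boundary.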

Principle R
- Repeatability: processes for test case creation and test execution must yield equivalent results, independently of the tester.

Principle A
- Accountability: keep records that document the test process and artifacts.
- Documentation answers:
  - What tests were planned?
  - Which tests were conducted?
  - Who did what testing, and when?
  - What were the results?
  - How were the results interpreted?

Example – Pay: Repeatability/Accountability

Principle E
- Economy: test activities must not require excessive time or effort.
- Automation: test drivers (the classical tool)
- Simplified processes for test case generation and data collection