Debriefing, Recommendations
CSSE 376, Software Quality Assurance
Rose-Hulman Institute of Technology
May 3, 2007
Outline
- Post-test Questionnaire
- Debriefing
- Final Report
  - Findings
  - Analysis
  - Recommendations
Post-test Questionnaire
- Purpose: collect preference information from the participant
- May also be used to collect background information
Likert Scales
"Overall, I found the widget easy to use."
a. strongly agree
b. agree
c. neither agree nor disagree
d. disagree
e. strongly disagree
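As a minimal sketch (not part of the slides), letter responses can be coded numerically before they are summarized; the conventional 1-5 coding is assumed here.

```python
# Map Likert responses (a-e) to numbers so they can be averaged later.
# The 1-5 coding is the conventional choice; the slide does not prescribe one.
LIKERT_SCORES = {
    "a": 5,  # strongly agree
    "b": 4,  # agree
    "c": 3,  # neither agree nor disagree
    "d": 2,  # disagree
    "e": 1,  # strongly disagree
}

def score_likert(responses):
    """Convert letter responses into numeric scores."""
    return [LIKERT_SCORES[r.lower()] for r in responses]

# Five hypothetical participants answering "Overall, I found the widget easy to use"
print(score_likert(["a", "b", "b", "c", "e"]))  # [5, 4, 4, 3, 1]
```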
Semantic Differentials
Circle the number closest to your feelings about the product:

Simple        3  2  1  0  1  2  3   Complex
Easy to use   3  2  1  0  1  2  3   Hard to use
Familiar      3  2  1  0  1  2  3   Unfamiliar
Reliable      3  2  1  0  1  2  3   Unreliable
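A semantic-differential answer can be coded the same way. The sketch below is an assumption, not something the slide specifies: it treats the left-hand adjective as positive, so a circle at 3 next to "Simple" scores +3 and a circle at 3 next to "Complex" scores -3.

```python
# Code a circled position on the 3..2..1..0..1..2..3 line as a signed score.
# Sign convention (left adjective = positive) is an assumption for illustration.
def differential_score(position, side):
    """position: 0-3 as circled; side: 'left' or 'right' of the 0 midpoint."""
    return 0 if position == 0 else (position if side == "left" else -position)

print(differential_score(2, "left"))   # 2  -> leans toward "Simple"
print(differential_score(1, "right"))  # -1 -> leans slightly toward "Complex"
```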
Free-form Questions
"I found the following aspects of the product particularly easy to use: ________________________________"
First Cartoon of the Day
Debriefing
- Purpose: find out why the participant behaved the way they did
- Method: interview
- May focus on specific behaviors observed during the test
Debriefing Guidelines
1. Review the participant's behaviors and post-test answers.
2. Let the participant say whatever is on their mind.
3. Start with high-level issues and move on to specific issues.
4. Focus on understanding problems, not on problem-solving.
Debriefing Techniques
- "What did you remember?" For example: "When you finished inserting an appointment, did you notice any change in the information display?"
- Devil's advocate. For example: "Gee, other people we've brought in have responded in quite the opposite way."
Findings
Summarize what you have learned:
- Performance
- Preferences
Performance Findings
- Mean time to complete
- Median time to complete
- Mean number of errors
- Median number of errors
- Percentage of participants performing successfully
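A minimal sketch of how these summaries could be computed, assuming each participant's result is recorded as (seconds to complete, error count, completed successfully); the data and layout are illustrative, not from the slides.

```python
import statistics

# Hypothetical per-participant results: (seconds to complete, errors, success).
results = [
    (95, 0, True),
    (140, 2, True),
    (210, 5, False),
    (120, 1, True),
]

times = [t for t, _, _ in results]
errors = [e for _, e, _ in results]
successes = [ok for _, _, ok in results]

print("mean time to complete:", statistics.mean(times))
print("median time to complete:", statistics.median(times))
print("mean number of errors:", statistics.mean(errors))
print("median number of errors:", statistics.median(errors))
print("percent successful:", 100 * sum(successes) / len(successes))
```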
Preference Findings
- Limited-choice questions: sum each answer; compute averages to compare questions
- Free-form questions: group similar answers
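The sketch below shows both summaries on made-up data: limited-choice answers (already coded 1-5 as above) are counted and averaged per question, and free-form answers are grouped under hand-assigned labels. The question names, answers, and labels are illustrative only.

```python
from collections import Counter, defaultdict

# Limited-choice questions: count each answer and compute a per-question average.
limited_choice = {
    "easy to use": [5, 4, 4, 3, 1],
    "liked the layout": [4, 4, 5, 5, 3],
}
for question, scores in limited_choice.items():
    print(question, Counter(scores), "average:", sum(scores) / len(scores))

# Free-form questions: group similar answers under a shared (hand-assigned) label.
free_form = [
    ("search was confusing", "search"),
    ("couldn't find the search box", "search"),
    ("liked the calendar view", "calendar"),
]
groups = defaultdict(list)
for answer, label in free_form:
    groups[label].append(answer)
print(dict(groups))
```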
Second Cartoon of the Day
Analysis
- Focus on problematic tasks, e.g. 70% of participants failed to complete the task successfully
- Conduct a source-of-error analysis: look for multiple causes; look at multiple participants
- Prioritize problems by criticality: Criticality = Severity + Probability
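A small sketch of the prioritization rule Criticality = Severity + Probability; the 1-4 rating scales and the example problems are assumptions, since the slide gives only the formula.

```python
# Rank observed problems by criticality = severity + probability.
# The 1-4 scales and example problems are illustrative assumptions.
problems = [
    {"problem": "task 3 never completed", "severity": 4, "probability": 3},
    {"problem": "confusing error message", "severity": 2, "probability": 4},
    {"problem": "minor label typo", "severity": 1, "probability": 2},
]

for p in problems:
    p["criticality"] = p["severity"] + p["probability"]

# Highest-criticality problems first: fix these before the rest.
for p in sorted(problems, key=lambda p: p["criticality"], reverse=True):
    print(p["criticality"], p["problem"])
```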
Recommendations (1/2)
- Get some perspective:
  - wait until a couple of days after testing
  - collect thoughts from the group of testers
  - get buy-in from developers
- Focus on the highest-impact areas
Recommendations (2/2)
- Ignore "political considerations" for the first draft
- Provide short-term and long-term solutions:
  - short-term: will not significantly affect the development schedule
  - long-term: needed for the ultimate success of the product