Testing—tuning and monitoring © 2013 by Larson Technical Services

Presentation transcript:

1 Testing—tuning and monitoring © 2013 by Larson Technical Services

2 Reasons for Testing
During development, developers focus on the system, not on the person who will use the system.
Developers believe they are typical users and fail to understand why other users have trouble using the speech application.
The design of a usable system is difficult and unpredictable, yet managers believe that usability is just "common sense."

3 Questions Answered Only by Testing
Is the current application usable?
Is the current application ready to deploy, or does it need more work?

4 Consequences of Not Testing
User problems
–People will not use the application if it is difficult to use or fails. The help line may be flooded with calls for help.
Revenues will drop
–No matter what the business model, revenues will drop.
Bad reputation
–The company developing the speech application will gain a bad reputation, which may linger for years after the problem itself disappears.

5 When to Test?
Test Early
Test Often

6 In class exercise
Sketch 5 usage scenarios for your application
Usage scenario
–Detailed example of what the user does when using your application
–Includes example dialogs between user and application
Hint: consider using a state transition system to describe each dialog, with states containing output to the user and transitions describing user input (see the sketch below)
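A minimal sketch of the hint above: one dialog modeled as a state transition system, where each state holds the prompt spoken to the user and transitions are keyed by recognized user input. Python is assumed (the exercise names no language), and the state names, prompts, and user phrases are invented examples.

# A dialog as a state transition system: each state carries the system prompt
# (output to the user); transitions are keyed by recognized user input.
# All names, prompts, and phrases below are hypothetical examples.
DIALOG = {
    "start":       {"prompt": "Welcome. Say 'balance' or 'transfer'.",
                    "next": {"balance": "say_balance", "transfer": "ask_amount"}},
    "say_balance": {"prompt": "Your balance is 500 dollars. Anything else?",
                    "next": {"yes": "start", "no": "goodbye"}},
    "ask_amount":  {"prompt": "How much would you like to transfer?",
                    "next": {"fifty dollars": "confirm"}},
    "confirm":     {"prompt": "Transfer fifty dollars? Say yes or no.",
                    "next": {"yes": "goodbye", "no": "start"}},
    "goodbye":     {"prompt": "Goodbye.", "next": {}},
}

def run_scenario(user_turns):
    """Play one usage scenario: a list of user utterances, in order."""
    state = "start"
    for utterance in user_turns:
        print("SYSTEM:", DIALOG[state]["prompt"])
        print("USER:  ", utterance)
        # Stay in the same state if the input is not recognized.
        state = DIALOG[state]["next"].get(utterance, state)
    print("SYSTEM:", DIALOG[state]["prompt"])

run_scenario(["transfer", "fifty dollars", "yes"])

Writing each of the five required scenarios as a list of user utterances, as in the last line above, keeps the exercise concrete and makes the expected system output explicit.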

7 Testing
Performance: Is the application useful?
–Measure what users actually accomplished
–Validate that users achieved success
Preference: Is the application enjoyable?
–Measure users' likes and dislikes
–Validate that users enjoyed the application and will use it again
Section 11.2

8 Performance—General Approach
Ask users to perform specific scenarios
Measure users' successes/failures
Example tasks and measurements
–User speaks a command—word error rate (see the sketch below)
–User hears a prompt—user performs an appropriate action
–User requests a bank transfer—time/turns to complete the transaction successfully
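To make the first measurement above concrete, here is a small sketch of word error rate computed as the word-level edit distance between the reference command and the recognizer's transcript, divided by the reference length. Python is assumed, and the example sentences are invented.

# Word error rate: (substitutions + deletions + insertions) / number of reference words,
# computed with a word-level Levenshtein (edit) distance.
def word_error_rate(reference, hypothesis):
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between the first i reference words and first j hypothesis words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution or match
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

# Recognizer heard "transfer fifty dollars" as "transfer fifteen dollars": WER = 1/3
print(word_error_rate("transfer fifty dollars", "transfer fifteen dollars"))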

9 Preference—General Approach
Ask users specific questions about their likes and dislikes
Ask users open-ended questions
Examples
–On a scale from –5 to +5, rate the help facility
–Do you prefer listening to the male or the female voice?
–What would you change about the application?
–What do you like best about the application?
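If the rating-scale answers are recorded numerically, a short summary like the following can be produced before the debriefing report. This is only a sketch: Python is assumed, and the questions and scores are invented.

# Summarize –5 to +5 rating-scale preference responses, one list of scores per question.
from statistics import mean, median

responses = {
    "Rate the help facility":        [3, 4, -1, 2, 5, 0],
    "Rate the quality of the voice": [1, 2, 2, -3, 4, 3],
}

for question, scores in responses.items():
    print(f"{question}: mean={mean(scores):+.1f}, median={median(scores):+.1f}, "
          f"min={min(scores):+d}, max={max(scores):+d}")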

10 In class exercise
Write 10 preference questions

11 When to Test?
Investigation Stage: Identify Application
–Conduct ethnography studies
–Identify candidate applications
–Conduct focus groups
–Select the application
Design Stage: Specify Application
–Construct content model
–Construct scenarios
–Specify performance & preference requirements
–Develop business model
Development Stage: Develop Application
–Specify the persona
–Specify dialog structure
–Specify dialog script
–Validate initial design
–Validate application functions
Development Stage: Choose Technology
–Test components
–Integrate components
Testing Stage: Test Application
–Usability test
–Qualification test
–Stress test
–Field test
Sustaining Stage: Deploy and Monitor Application
–Monitor application

12 Some Types of Tests
Conceptual Design
Qualification
Stress
Field
Continual Monitoring
Component
Test at every stage of development
Section 9.4

13 Test Plan
1. Purpose
2. Problem statement/test objectives
3. Subject profile
4. Test design
5. Monitor
6. Evaluation measures (data to be collected)
7. Report contents and presentation

14 1. Purpose
Which type of test?
–Comparison test—which design is "better"?
–Pilot test—test the test procedure itself
–Field test—can real users actually benefit from using the system?
–Beta test—users help debug the system
–Acceptance test—validate that the system satisfies the requirements
What do we hope to learn?

15 2. Test objectives
Avoid unfocused and vague problem statements
Examples of poor objectives
–Is the current product usable?
–Is the product ready for release, or does it need more work?
Examples of good objectives
–Do the screens reflect the end content model?
–Is help easier to access via a "hot key" or via a mouse selection?

16 3. Subject profile
General computer experience
–Range: none to two years
Education
–Range: 10% high school, 60% college, 20% master's, 10% PhD
Age
–Range: 85% ages 20-50, 15% other
Gender
Education major
–0% CS, 100% other
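When recruiting to a profile like this one, it helps to compare the recruited pool against the target quotas. The sketch below does that for the education breakdown; Python is assumed, and the participant records are invented (the target percentages simply restate the profile above).

# Compare the recruited pool against the target education quotas from the subject profile.
from collections import Counter

TARGET_EDUCATION = {"high school": 0.10, "college": 0.60, "masters": 0.20, "phd": 0.10}

participants = [
    {"name": "P1", "education": "college",     "age": 34},
    {"name": "P2", "education": "college",     "age": 45},
    {"name": "P3", "education": "high school", "age": 22},
    {"name": "P4", "education": "masters",     "age": 51},
]

counts = Counter(p["education"] for p in participants)
total = len(participants)
for level, target in TARGET_EDUCATION.items():
    actual = counts.get(level, 0) / total
    print(f"{level:11s} target {target:4.0%}   recruited {actual:4.0%}")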

17 4. Test design
Detailed plan for conducting the test
–Groups of subjects, e.g.,
Group A: Mary, Fred, Sam, Jose
Group B: Sue, Ron, Bob, Sally
–Tests
Group A does test 1
Group B does test 2
–Subjects read from a printed instruction script*
–Conduct a pilot test
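If subjects are assigned to groups at random rather than by hand, a few lines like these document the assignment and keep it reproducible. Python is assumed; the subject names are the ones on the slide, and the fixed random seed is an added convenience, not something the test plan requires.

# Randomly split the subjects into two equal groups and assign each group a test.
import random

subjects = ["Mary", "Fred", "Sam", "Jose", "Sue", "Ron", "Bob", "Sally"]
random.seed(42)        # fixed seed so the same assignment can be reproduced later
random.shuffle(subjects)

group_a, group_b = subjects[:4], subjects[4:]
print("Group A (does test 1):", ", ".join(group_a))
print("Group B (does test 2):", ", ".join(group_b))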

18 Instruction script
Orientation
–Introduce yourself
–Offer refreshments
–Explain why they are here
–Describe the equipment
–Explain what is expected of the participant
–Assure the participant that they are not being tested
–Ask for questions

19 Instruction Script (continued)
Nondisclosure form
Scenarios the user is to perform
–Describe the scenario, but not how to do it. For example, "Turn on the bedroom lights," not "Click or speak the desired widget, then click or speak the desired command."
Written debriefing questionnaire
–Preference questions
–Open-ended questions

20 In class exercise
Write a test instruction script
Note that the tester may NOT speak with the user during the test.

21 5. Monitor
Try to be objective
Enable rather than lead the subject
Avoid acting too knowledgeable
Don't jump to conclusions
Let the subject struggle
Inform the subject that we are testing the application, not the user

22 6. Evaluation measures
Performance
–Data collected during the test
Preference
–Opinions collected during debriefing
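One way to keep the two kinds of measures organized is a small record per participant and task for performance, plus a separate table of debriefing opinions. This is only a sketch; Python is assumed and every field name and value is invented.

# Performance: one record per (participant, task), logged during the test.
# Preference: opinions collected afterward, during debriefing.
from dataclasses import dataclass

@dataclass
class TaskResult:
    participant: str
    task: str
    completed: bool
    seconds: float   # time to finish the task
    turns: int       # dialog turns used
    errors: int      # recognition or navigation errors observed

performance = [
    TaskResult("P1", "bank transfer", True, 48.2, 6, 1),
    TaskResult("P2", "bank transfer", False, 120.0, 14, 5),
]

preference = {   # debriefing answers, e.g. the –5 to +5 help-facility rating
    "P1": {"help facility": 3},
    "P2": {"help facility": -2},
}
print(performance[0], preference["P1"])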

23 7. Report contents and presentation
Collect data during the test
Collect data during debriefing
Summarize data
Analyze data
–Focus on tasks that did not meet the criterion (see the sketch below)
–Analyze the source of each error
–Prioritize problems by criticality
Develop recommendations
–Focus on solutions that will have the widest impact
–Ignore "political considerations"
–Provide short-term and long-term recommendations
–Identify areas of further research
Prepare final report
–Executive summary section
–Method section
–Results section
–Findings and recommendations discussion
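As one way to mechanize the summarize-and-analyze steps above, the sketch below computes the success rate per task, compares it with the task's criterion, and lists the failing tasks with the largest shortfall first. Python is assumed, and the tasks, criteria, and results are invented.

# Summarize success per task and flag tasks that miss their criterion.
results = [   # (participant, task, completed successfully?)
    ("P1", "bank transfer", True),  ("P2", "bank transfer", False),
    ("P3", "bank transfer", False), ("P1", "check balance", True),
    ("P2", "check balance", True),  ("P3", "check balance", True),
]
criterion = {"bank transfer": 0.80, "check balance": 0.80}   # required success rate

summary = {}
for _, task, ok in results:
    done, total = summary.get(task, (0, 0))
    summary[task] = (done + (1 if ok else 0), total + 1)

shortfalls = []
for task, (done, total) in summary.items():
    rate = done / total
    print(f"{task}: {done}/{total} succeeded ({rate:.0%}); criterion {criterion[task]:.0%}")
    if rate < criterion[task]:
        shortfalls.append((criterion[task] - rate, task))

for gap, task in sorted(shortfalls, reverse=True):   # biggest shortfall first
    print(f"Prioritize: {task} misses its criterion by {gap:.0%}")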