Deciding How to Measure Usability / How to Conduct a Successful User Requirements Activity

Deciding How to Measure Usability
- Understanding what you can measure
- Matching measures to your goals and concerns
- Matching measures to the product's stage of development
- Setting quantitative criteria for each measure and each task

Understanding What You Can Measure
In a usability test you collect both performance measures and subjective measures.

Understanding What You Can Measure
Performance measures: counts of behaviors or actions you can observe. Always quantitative, e.g., how many errors people make and how many times they repeat the same error.

Understanding What You Can Measure
Subjective measures: people's perceptions, opinions, and judgments. Either quantitative or qualitative, e.g., give people a 5- or 7-point scale and ask them to rate how difficult the product was to use.
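Quantitative subjective data like the rating scale above is usually summarized across participants. A minimal sketch, with invented ratings for illustration:

```python
# Hypothetical post-task ratings on a 5-point difficulty scale
# (1 = very easy, 5 = very difficult); the values are invented.
from statistics import mean, median

ratings = [2, 3, 2, 4, 1, 2, 3]

print(f"mean difficulty:   {mean(ratings):.2f}")   # 2.43
print(f"median difficulty: {median(ratings)}")     # 2

# Proportion of participants who rated the task easy (1 or 2):
easy = sum(1 for r in ratings if r <= 2) / len(ratings)
print(f"rated easy: {easy:.0%}")                   # 57%
```

The median is often preferred over the mean for ordinal scale data, since the distance between scale points is not necessarily uniform.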

Deciding How to Measure Usability
- Understanding what you can measure
- Matching measures to your goals and concerns
- Matching measures to the product's stage of development
- Setting quantitative criteria for each measure and each task

Matching Measures to Your Goals and Concerns
The performance measures you choose should relate directly to your quantitative usability goals and to the concerns behind the usability test.
How do you collect performance data? With data-logging software.

Example event codes for data logging:
  S  Stop the test             B  Take a break
  T  Stop the task             A  Assist
  M  Menu error                H  Help desk call
  S  Select-from-list error    F  Frustration
  E  Other error               N  Observation
Example log entry: Event: O  06:21:33  Comment: Looking for online help for To: field

Points to consider in building a data-logging program:
- Include preset codes for events that happen in every test
- Keep codes short
- Pair each code with a short description
- Timestamp every event
- Allow easy backup and movement of data into a file
Also consider: What if you don't have data-logging software? Should you measure positive behaviors as well?

Deciding How to Measure Usability
- Understanding what you can measure
- Matching measures to your goals and concerns
- Matching measures to the product's stage of development
- Setting quantitative criteria for each measure and each task

Matching Measures to the Product's Stage of Development
It is very important to consider where the product is in the development cycle when planning performance measures, e.g., testing a prototype of a manual that does not yet have an index.

Deciding How to Measure Usability
- Understanding what you can measure
- Matching measures to your goals and concerns
- Matching measures to the product's stage of development
- Setting quantitative criteria for each measure and each task
  - Choose the performance measures you want to count

Selecting Performance Measures
General concerns:
- Ease of use for people who have never used the product
- Ease of use for people who have used other, similar products
- Will the online help be useful?
- Will new users be able to select items from screens quickly and easily?

Selecting Performance Measures
Specific concerns:
- Will users be able to read a specific piece of mail and skip over mail they don't want to read?
- Will new users be able to find and select people's addresses to send them mail?
- Will users be able to find the right menu path to read, write, and send a message?

Setting Quantitative Criteria for Each Measure and Task
How do you select criteria for performance measures? Typical criteria levels are:
- Excellent
- Acceptable (OK)
- Unacceptable
Base your criteria on users.

Setting Quantitative Criteria for Each Measure and Each Task

Measure                    Excellent    Acceptable    Unacceptable
Task 1: Read message
  Time for task            < 3 min      3-5 min       > 5 min
  Time in online help      < 1 min      1-2 min       > 2 min
Task 2: Write and send
  Time for task            < 10 min     10-15 min     > 15 min
  Time in online help      < 2 min      2-4 min       > 4 min
  E = Other errors         0            1-2           more than 2
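Applying criteria like these during analysis is mechanical: each measured value is classified against the two upper bounds for its row. A sketch, using the "read message" thresholds from the table (the function name is an assumption):

```python
# Classify a measured value as Excellent / Acceptable / Unacceptable
# given the upper bounds for the first two bands.
def rate(value, excellent_max, acceptable_max):
    if value < excellent_max:
        return "Excellent"
    if value <= acceptable_max:
        return "Acceptable"
    return "Unacceptable"

# Task 1 (read message): Excellent < 3 min, Acceptable 3-5 min.
print(rate(2.5, 3, 5))   # Excellent
print(rate(4.0, 3, 5))   # Acceptable
print(rate(6.0, 3, 5))   # Unacceptable
```

Note the boundary convention assumed here: a time of exactly 3 minutes falls into the Acceptable band, matching "Excellent < 3 min, Acceptable 3-5 min" in the table.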

Q: Are the measures the same for all tasks in a given test? A: No.
Q: Are the performance criteria the same for all tasks? A: No.

Q: Do you take the test situation into account in selecting criteria? A: Yes.
Q: Should you count system response time in setting the criteria? A: Yes.

What We Have Discussed So Far
The tasks that participants will do during tests, and how to measure participants' performance with the product.

How to Conduct a Successful User Requirements Activity

 Welcoming your participants
 Dealing with late and absent participants
 Warm-up exercises
 Inviting observers
 Introducing your think-aloud protocol
 Moderating your activity
 Recording and note taking
 Dealing with awkward situations
 Conclusion

Welcoming Your Participants
- Ask participants to come about 15 minutes early
- Make introductions
- Put up welcome signs
- Play music (e.g., CDs)
- Don't leave participants alone

Dealing with Late and Absent Participants
Despite your best efforts, some participants will be late. Situations to plan for:
- The late participant
- When you can't wait any longer
- Including late participants
- The no-show

Warm-Up Exercises
- Start with light conversation
- Introduce yourself
- Provide nametags
- Don't spend much time on the warm-up

Inviting Observers
- Observers from the development team get to know user requirements better
- Tell observers to come early and remain quiet
- Don't allow managers to observe

Introducing Your Think-Aloud Protocol
Ask participants to verbalize:
- Their thought process while doing the task
- The steps in the task
- Expectations and evaluation statements
Provide some examples to participants (e.g., thinking aloud while using a stapler). Monitor how loudly participants speak.

Moderating Your Activity
- Have personality
- Ask questions
- Stay focused
- Remember you are not a participant
- Keep the activity moving
- No critiquing
- Everyone should participate
- No one should dominate
- Practice makes perfect

Taking Notes
Benefits:
- You get data immediately and can start analysis
- Participants feel they are saying important things
Problems:
- You can get wrapped up in being a stenographer
- If the discussion moves quickly, you cannot capture everything

Video and Audio Recording
Benefits:
- Captures nuances and participants' body language that notes can't
- You can listen to the recording later and take notes
- Video is better than audio recording
Problems:
- Can make participants uncomfortable

What is the best approach? Combine video/audio recording with note taking.

Dealing with Awkward Situations
Some uncomfortable situations and ways to deal with them:
- Participant issues
- Observer issues

Participant Issues
- A participant is called away in the middle of the test
- A participant's cell phone rings continuously
- The wrong participant was recruited
- A participant thinks he is on a job interview
- A participant refuses to be videotaped and wants to leave
(continued)

Participant Issues (continued)
- A participant is confrontational with other participants
- A participant dominates the group
- A participant is not truthful about his identity
- A participant refuses to sign the consent form
- The fire alarm sounds in the middle of the test and the participant still wants to continue

Product Team / Observer Issues
- The team changes the product mid-test
- An observer turns on the light in the control room
- Observers talk loudly during an activity

Conclusion
To conduct any user requirements activity effectively:
- Handle participant arrivals, including late and absent participants
- Help participants think creatively
- Moderate the activity
- Instruct your participants to think aloud
- Deal with any awkward situations that may arise in testing