The user as wizard: A method for early involvement in the design and evaluation of adaptive systems Judith Masthoff University of Aberdeen, Scotland.


Introduction
–User interface design principle: involve users early in the design process, not just in a final evaluation phase.
–This also holds for adaptive systems.
–However, user involvement in the design of adaptive systems is rarely reported.
–This talk presents a method for doing so (the method is not new).

Early User Involvement in Design
Many methods are available and may be applicable to adaptive systems:
–Interviews and questionnaires
–Focus groups
–Contextual design
–Cultural probes
–Creative brainstorming sessions
–…

Contextual Design
An ethnographic method: users are observed in their workplace –
–how they go about their work
–in what environment
–using which artefacts
Users are the experts in their tasks; observation will provide more detail.

Early User Involvement in Evaluation
Wizard-of-Oz studies: testing a non-existent system.
[Figure: what the user sees – the user dictates a letter ("Dear Henry…") and the speech appears to be transcribed by the computer. From Gould, Conti & Hovanyecz, Comm. ACM 26(4), 1983.]

[Figure: what the user sees, and the wizard – behind the scenes, a hidden human "wizard" transcribes the user's speech. From Gould, Conti & Hovanyecz, Comm. ACM 26(4), 1983.]

Early User Involvement to Inspire Adaptation Algorithms
–A method inspired by Contextual Design and Wizard-of-Oz studies.
–Humans tend to be good at adaptation, so observing them in the role of the wizard may help in designing the adaptation.
–This differs from observing experts (e.g. teachers) in the field.

Users-as-Wizards
–Participants take the role of the wizard, without a script.
–They are given the same information as the system would have.
–Fictional users are used rather than real ones.
Two stages:
–Exploration stage: participants take the role of the adaptive system (or of a part of it: layered design).
–Consolidation stage: participants judge the performance of others.

Case Study: Group Recommender System
System purpose: to select a sequence of items (e.g. music clips) adapted to a group of users.
Problem: how should ratings by individual users be aggregated into ratings for the group as a whole?
Many different aggregation strategies existed, and we needed to know how suitable each might be and how to judge them. There were far too many strategies for an ordinary user test, and the domain was too complicated for normal user testing anyway.
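The aggregation problem can be made concrete with a small sketch. The ratings below are invented for illustration, and the two strategies shown (average and least misery) are just common examples of aggregation strategies, not the full set studied:

```python
# Hypothetical ratings (1 = really hate, 10 = really like) by three
# users for three clip topics.
ratings = {
    "news":   {"John": 10, "Mary": 1, "Adam": 8},
    "sport":  {"John": 4,  "Mary": 9, "Adam": 6},
    "comedy": {"John": 8,  "Mary": 8, "Adam": 7},
}

def average(item):
    """Average strategy: the group rating is the mean of individual ratings."""
    scores = ratings[item].values()
    return sum(scores) / len(scores)

def least_misery(item):
    """Least-misery strategy: the group rating is the lowest individual rating."""
    return min(ratings[item].values())

for item in ratings:
    print(item, round(average(item), 1), least_misery(item))
```

Even this tiny example shows why the choice of strategy matters: under the average, "news" and "sport" tie, while least misery heavily penalises "news" because Mary hates it. Which behaviour is "right" is exactly the kind of question the users-as-wizards method probes.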

Exploration Stage (Step 1)
Give participants a scenario describing a fictional user (or group of users) and the user's intentions:
"John, Mary, and Adam are going to watch video clips together. We know how interested they are in the topics of the clips. Each clip is rated from 1 (really hate this topic) to 10 (really like this topic). [A table shows each individual's liking for each of the clips.] They only have time to watch five clips."

Exploration Stage (Step 2)
Give participants the task the adaptive system is supposed to perform:
"Which clips should they watch?"

Exploration Stage (Step 3)
Find out participants' reasons for their decisions and actions: "Why?"
Alternative: thinking aloud or co-discovery.

Exploration Stage (Step 4)
Steps 1 to 3 can be repeated for a number of scenarios.

Consolidation Stage (Step 1)
Give participants (a) a scenario involving a fictional user and the user's intentions, and (b) an associated task. These are normally the same as in the exploration stage.

Consolidation Stage (Step 2)
Show the participants a performance on this task for this scenario (by participants in the exploration stage, or by the system):
"The TV decides to show them the following sequence of clips [a sequence is given]."
(In the case study, the performances were those of various group aggregation strategies.)
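In the case study, the performances shown to participants were the clip sequences produced by different aggregation strategies. A sketch of how such candidate sequences might be generated, with an invented ratings table (tuples are John, Mary, Adam) and three standard strategies:

```python
# Hypothetical per-user ratings (John, Mary, Adam) for seven clips, 1-10.
ratings = {
    "c1": (10, 1, 8), "c2": (4, 9, 6), "c3": (8, 8, 7), "c4": (2, 3, 9),
    "c5": (7, 7, 2),  "c6": (9, 5, 5), "c7": (3, 6, 6),
}

# Three example aggregation strategies: each maps individual ratings
# to a single group rating for a clip.
strategies = {
    "average":       lambda rs: sum(rs) / len(rs),
    "least_misery":  lambda rs: min(rs),
    "most_pleasure": lambda rs: max(rs),
}

def top5(score):
    """Rank clips by group rating under a strategy; keep the best five."""
    ranked = sorted(ratings, key=lambda c: score(ratings[c]), reverse=True)
    return ranked[:5]

# Each strategy yields a different candidate sequence for participants to judge.
for name, score in strategies.items():
    print(name, top5(score))
```

Presenting the outputs of several strategies side by side, and asking participants how satisfied each group member would be with each sequence, is what the consolidation stage does.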

Consolidation Stage (Step 3)
Ask participants to judge the quality of the task performance:
"How satisfied do you believe John would be? How satisfied do you believe Adam would be? How satisfied do you believe Mary would be?"
(A seven-point Likert scale was used.)

Consolidation Stage (Step 4)
Find out participants' reasons for their judgements: "Why?" for each judgement.

Consolidation Stage (Steps 5–6)
–Normally, steps 2 to 4 are repeated for a number of task performances (in the case study, participants were given three performances to judge).
–Steps 1 to 5 can be repeated for a number of scenarios.

Conclusions
The method can help in designing adaptation algorithms.
Limitations:
–Participants' judgements may not correspond with what would be best for users.
–The method is not suitable for tasks that are inherently harder for humans than for computers.
The method is only an initial step: user testing is still needed.
Making methods explicit may help their uptake.