Evaluation Paradigms & Techniques
IS 588, Spring 2008
Dr. D. Bilal

Overview
Evaluation is performed to determine how well a product design meets user needs.
First decide what to evaluate
–Guided by goals, theory, models, etc.
What to evaluate determines how the evaluation is done

Evaluation Paradigms
Quick & dirty
Usability testing
Field studies
Predictive evaluation

Quick & Dirty
Informal: designers or evaluators meet informally with users
–Gather feedback on the product design
–Gather suggestions for design improvements
Inexpensive and not time consuming

Usability Testing
Formal assessment
Measures user performance on predefined tasks
–Tasks are structured around the purpose of the evaluation
Controlled by the evaluator
Performance is observed and/or captured
–EXAMPLES?
–Measures are based on the questions guiding the usability test (i.e., what the evaluator wants to find out)

Usability Testing
Typically quantitative
Interviews and questionnaires can yield qualitative data
–User comments, quotes about likes/dislikes, etc.
A mixed-method approach is ideal
–WHY?

Usability Testing
Not performed in a naturalistic setting
Activities can be captured using screen-recording software (e.g., Morae, HyperCam, Camtasia) or videotape
The evaluator may take observational notes while activities are being captured
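
To make the quantitative side concrete, here is a minimal sketch of how logged task data might be summarized into the usual measures (success rate, time on task); the log format, field names, and numbers are hypothetical illustrations, not the output of any particular tool:

```python
# Minimal sketch: summarizing hypothetical usability-test logs.
# Each record is (participant, task, completed?, seconds on task).
from statistics import mean

logs = [
    ("P1", "search_catalog", True, 42.0),
    ("P2", "search_catalog", True, 58.5),
    ("P3", "search_catalog", False, 120.0),  # gave up / timed out
]

task = "search_catalog"
records = [r for r in logs if r[1] == task]
success_rate = sum(r[2] for r in records) / len(records)
mean_time_ok = mean(r[3] for r in records if r[2])  # successful trials only

print(f"{task}: {success_rate:.0%} success, "
      f"{mean_time_ok:.1f}s mean time (successful trials)")
```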

Field Studies
Naturalistic setting
–Users interact with the system as part of their daily routine
–No tasks are given by the evaluator
–The evaluator observes and records activities, OR uses software to capture activities, OR…
Can be both qualitative and quantitative
–HOW?
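
As one illustration of "using software to capture activities", here is a minimal sketch of a timestamped event logger; the event names and log-file path are invented for illustration, and real field studies typically rely on dedicated logging tools:

```python
# Minimal sketch: appending timestamped interaction events to a log file.
# Event names and the log path are hypothetical.
import json
import time

def log_event(event: str, details: dict, path: str = "field_log.jsonl") -> None:
    record = {"ts": time.time(), "event": event, **details}
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

log_event("page_view", {"page": "catalog"})
log_event("query", {"terms": "interlibrary loan"})
```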

Predictive Evaluation
Experts place themselves in the users’ shoes to predict usability problems
Guided by heuristics
–Quick and inexpensive
–Limitations: WHAT ARE THEY?
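
In a heuristic evaluation, several experts independently report problems against a heuristic list and rate their severity; a minimal sketch of aggregating such ratings follows, where the heuristic names, the 0–4 severity scale, and the data are all hypothetical examples:

```python
# Minimal sketch: aggregating hypothetical expert severity ratings (0-4)
# per heuristic, in the style of a heuristic evaluation.
from collections import defaultdict
from statistics import mean

# (expert, heuristic violated, severity 0-4)
reports = [
    ("E1", "visibility of system status", 3),
    ("E2", "visibility of system status", 4),
    ("E1", "error prevention", 2),
]

by_heuristic = defaultdict(list)
for _, heuristic, severity in reports:
    by_heuristic[heuristic].append(severity)

for heuristic, scores in sorted(by_heuristic.items()):
    print(f"{heuristic}: n={len(scores)}, mean severity={mean(scores):.1f}")
```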

Evaluation Techniques
Observe users
Gather user opinion
Gather expert opinion
Test user performance
Model user performance
Mixed methods (two or more techniques)

DECIDE Framework
Determine the goals
Explore the questions to be answered
Choose suitable evaluation paradigms and techniques
Identify practical issues (e.g., how to recruit participants)

DECIDE Framework
Decide how to tackle ethical concerns (e.g., use of human subjects, privacy)
Evaluate, interpret, and present the data
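
To show how DECIDE might serve as a planning checklist, here is a minimal sketch that records one evaluation plan per letter of the framework; the class name, fields, and example plan are illustrative assumptions, not part of the framework itself:

```python
# Minimal sketch: recording a DECIDE evaluation plan as a checklist.
# Field contents are hypothetical examples.
from dataclasses import dataclass

@dataclass
class DecidePlan:
    goals: list[str]             # D: determine the goals
    questions: list[str]         # E: explore the questions
    approach: str                # C: choose paradigms and techniques
    practical_issues: list[str]  # I: identify practical issues
    ethical_issues: list[str]    # D: decide how to tackle ethical concerns
    reporting: str               # E: evaluate, interpret, present data

plan = DecidePlan(
    goals=["Assess whether novices can renew a book unaided"],
    questions=["Where do users stall in the renewal flow?"],
    approach="usability testing with think-aloud",
    practical_issues=["recruit 8 participants", "book the lab"],
    ethical_issues=["informed consent", "anonymize recordings"],
    reporting="summary report with task metrics and user quotes",
)
print(plan.approach)
```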