Analyzing and Presenting Results
Establishing a User Orientation
Alfred Kobsa, University of California, Irvine

Tabulating and analyzing data

Tabulate data in spreadsheet(s) per user and per task
- Both quantitative and qualitative data (e.g., comments)
Compute totals per user and averages per task
Find outlier values in the raw data
- Try to explain them:
  - Go back to the original data source to check for transcription errors
  - Look at the time sheet / protocol and the video recording
- Outliers may point to infrequent usability problems, or they may derive from "accidental" characteristics of the respective test user. In the latter case:
  - Disregard outlier values if this can be justified, or use the median instead of the average
  - [Remove subjects with many outlier values entirely if this can be justified (very few subjects only!)]
Look at means/medians and possibly standard deviations to
- determine whether usability concerns are confirmed by the data
- discover surprises in the data, and determine whether they point to usability problems
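To make the tabulation and outlier steps concrete, here is a minimal sketch in Python with pandas. The column names (user, task, time_sec), the timing values, and the 1.5*IQR outlier rule are illustrative assumptions; the slides do not prescribe any particular tool or rule.

```python
# A minimal sketch, assuming task-completion times were already transcribed
# into long format: one row per (user, task) observation. All values invented.
import pandas as pd

data = pd.DataFrame({
    "user":     ["u1", "u2", "u3", "u4", "u5"] * 2,
    "task":     ["t1"] * 5 + ["t2"] * 5,
    "time_sec": [42, 38, 41, 45, 40,      # task t1
                 95, 88, 92, 90, 310],    # task t2; u5's value looks suspect
})

# Totals per user, and mean/median/SD per task
print(data.groupby("user")["time_sec"].sum())
print(data.groupby("task")["time_sec"].agg(["mean", "median", "std"]))

# Flag outliers per task with a 1.5*IQR rule; check each flagged value against
# the time sheet / recordings before deciding whether to disregard it.
q1 = data.groupby("task")["time_sec"].transform(lambda s: s.quantile(0.25))
q3 = data.groupby("task")["time_sec"].transform(lambda s: s.quantile(0.75))
iqr = q3 - q1
outliers = data[(data["time_sec"] < q1 - 1.5 * iqr) |
                (data["time_sec"] > q3 + 1.5 * iqr)]
print(outliers)   # flags the 310 s observation for task t2
```

In this invented data, the t2 mean (135 s) is dominated by the single 310 s value while the median (92 s) is barely affected, which illustrates why the slide recommends the median when disregarding an outlier cannot be justified.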

Analyzing video and audio recordings

Unless subjects were asked to "think aloud", it is generally easier to analyze video data with concrete questions in mind rather than merely "watching out for usability problems"
- This applies less to audio, since subjects often verbalize the problems they encounter
Observations should be noted down (with time stamps)
Categories for observations may already exist, or can be created during the observation process
It is often advisable to use two independent observers who afterwards compare their notes (and go back to the recordings to resolve disputes)
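When two observers independently categorize the same events, their agreement can be quantified. A common choice for this is Cohen's kappa; note that kappa is standard practice rather than something these slides prescribe, and the category labels below are invented for illustration.

```python
# A minimal sketch: two observers independently assigned a category code to
# the same timestamped events; Cohen's kappa measures their agreement
# beyond what chance alone would produce.
from sklearn.metrics import cohen_kappa_score

observer_a = ["navigation", "terminology", "navigation", "layout", "terminology"]
observer_b = ["navigation", "terminology", "layout",     "layout", "terminology"]

kappa = cohen_kappa_score(observer_a, observer_b)
print(f"Cohen's kappa: {kappa:.2f}")   # ~0.71 here; the disputed event (#3)
                                       # is resolved by re-watching the recording
```

A kappa well below 1 suggests the category definitions need tightening, or that the disputed events should be re-examined on the recordings, as the slide recommends.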

Statistical presentation and analysis

Results of usability tests are usually presented using
- tabulated raw values
- descriptive statistics (means, medians, standard deviations)
- visualizations of the raw values and the statistics
In rare cases, inferential statistics can be used
- Specifically for comparing two competing prototypes, or the "old" and the "new" system
- This should be done with extreme caution, since:
  - Preconditions for the applicability of statistical tests are often not met (random sampling of subjects and assignment to conditions, normal distribution of the data)
  - Sample sizes are often very small
  - Statistical significance of a difference does not mean that the difference is important
  - Decision makers often do not know how to interpret the results of a statistical test (and are not familiar with the preconditions and limits of such tests)
  - Testers are often not well trained in statistics and may not know which test is appropriate
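As a hedged illustration of the "extreme caution" point: the sketch below compares two prototypes' task times with SciPy. The data are invented, and the choice of a nonparametric Mann-Whitney U test (rather than a t-test) is an assumption motivated by the caveats above; it sidesteps the normality precondition but not the small-sample or practical-significance problems.

```python
# A minimal sketch for comparing task times on two prototypes.
from scipy import stats

old_ui = [95, 88, 92, 90, 101, 97]   # seconds per task, invented data
new_ui = [74, 80, 69, 85, 77, 72]

# Check the normality precondition before even considering a t-test
print("Shapiro-Wilk p (old):", stats.shapiro(old_ui).pvalue)
print("Shapiro-Wilk p (new):", stats.shapiro(new_ui).pvalue)

# Mann-Whitney U does not assume normal data, only independent samples
u, p = stats.mannwhitneyu(old_ui, new_ui, alternative="two-sided")
print(f"Mann-Whitney U = {u}, p = {p:.3f}")
# A small p only says the difference is unlikely to be chance; whether a
# ~20-second saving matters to users is a separate, practical judgment.
```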

Identifying usability problems

Involve the designers / programmers (particularly if they are going to perform the revisions)
Focus on global problems, since they often affect many aspects of an interface
- Global problems are more difficult to pinpoint and to correct
Rank problems by level of severity:
- Level 1: problem may prevent the successful completion of a task
- Level 2: problem may create significant delay and frustration
- Level 3: problem has a minor effect on usability
- Level 4: possible enhancement that can be added in the future
Recommend changes (and test those changes later)
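One lightweight way to keep such a problem log is sketched below. The record fields, the "global"/"local" scope flag, and the example entries are invented for illustration; only the four severity levels come from the slide.

```python
# A minimal sketch of a usability-problem log using the slide's severity levels.
from dataclasses import dataclass

@dataclass
class Problem:
    description: str
    severity: int          # 1 = blocks task completion ... 4 = future enhancement
    scope: str             # "global" (affects many screens) or "local"
    affected_tasks: list

problems = [
    Problem("Save button mislabeled 'Commit'", 2, "local",  ["t3"]),
    Problem("Inconsistent navigation model",   1, "global", ["t1", "t2", "t3"]),
    Problem("Low-contrast help text",          3, "local",  ["t2"]),
]

# Report global problems first, then by severity, so designers see the issues
# that affect many aspects of the interface at the top of the list.
for p in sorted(problems, key=lambda p: (p.scope != "global", p.severity)):
    print(f"[L{p.severity}] ({p.scope}) {p.description} -> {p.affected_tasks}")
```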

Communicating the results

Preparing a report / reports
- See Dumas and Redish, Chapter 22, and Courage and Baxter, Chapter 14
Preparing a PowerPoint presentation
Preparing a stand-alone video/multimedia presentation
- See Dumas and Redish, Chapter 23

Changing the product and process

Collaborate with designers/developers throughout the evaluation process (and possibly with management)
Prioritize and motivate your recommendations for re-design
Collaborate on finding feasible ways to fix the problems
Make suggestions to improve the design process, such as:
- earlier involvement of users
- earlier testing of designs and prototypes
- hiring HCI staff
- developing design guidelines