CS 321: Human-Computer Interaction Design
November 12, 2015

Questionnaires:
- Usability Questionnaires
- Post-Study & Post-Task Questionnaires
- Triangulation

Usability Questionnaires

Using standardized questionnaires for usability studies offers several advantages:

- Objectivity: usability practitioners are able to independently verify the measurement statements of others.
- Replicability: studies can easily be replicated, improving their reliability.
- Quantification: results can be reported in finer detail and with greater objectivity.
- Communication: standardized measures facilitate communication between practitioners.
- Economy: developing standardized measures takes work, but reusing them is inexpensive.

CS 321, November 12, 2015, Questionnaires, Page 272

Post-Study Usability Questionnaires

The PSSUQ (Post-Study System Usability Questionnaire, Version 3) is a 16-item survey that measures users' perceived satisfaction with a product or system. Each item is rated on a 7-point scale from Strongly Agree (1) to Strongly Disagree (7), with an NA option.

1. Overall, I am satisfied with how easy it is to use this system.
2. It was simple to use this system.
3. I was able to complete the tasks and scenarios quickly using this system.
4. I felt comfortable using this system.
5. It was easy to learn to use this system.
6. I believe I could become productive quickly using this system.
7. The system gave error messages that clearly told me how to fix problems.
8. Whenever I made a mistake using the system, I could recover easily and quickly.
9. The information (such as on-line help, on-screen messages, and other documentation) provided with this system was clear.
10. It was easy to find the information I needed.
11. The information was effective in helping me complete the tasks and scenarios.
12. The organization of information on the system screens was clear.
13. The interface of this system was pleasant.
14. I liked using the interface of this system.
15. This system has all the functions and capabilities I expect it to have.
16. Overall, I am satisfied with this system.

An overall satisfaction score is obtained by averaging the sub-scales of System Quality (items 1-6), Information Quality (items 7-12), and Interface Quality (items 13-16). The PSSUQ is susceptible to "acquiescence bias": people are more likely to agree with a statement than to disagree with it.
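The sub-scale averaging described above can be sketched in a few lines of Python. The item groupings follow the slide; the function names and the handling of NA answers (treating them as missing rather than as a rating) are assumptions for illustration, not part of the published instrument.

```python
def item_mean(values):
    """Average of the non-missing responses (None marks an NA answer)."""
    present = [v for v in values if v is not None]
    return sum(present) / len(present)

def pssuq_scores(responses):
    """Score one 16-item PSSUQ sheet (1 = Strongly Agree ... 7 = Strongly Disagree).

    `responses` is a list of 16 ratings in item order; None marks NA.
    Sub-scale groupings follow the slide: items 1-6, 7-12, 13-16.
    """
    system = item_mean(responses[0:6])         # System Quality, items 1-6
    information = item_mean(responses[6:12])   # Information Quality, items 7-12
    interface = item_mean(responses[12:16])    # Interface Quality, items 13-16
    overall = item_mean([system, information, interface])
    return {"system": system, "information": information,
            "interface": interface, "overall": overall}

# Example: a fairly satisfied participant (lower scores = better on the PSSUQ).
sheet = [2, 2, 3, 2, 1, 2, 4, 3, 2, 3, 2, 2, 2, 2, 3, 2]
print(pssuq_scores(sheet))
```

Note that the overall score here is the mean of the three sub-scale means, as the slide describes, rather than the mean of all sixteen raw items; the two differ slightly because the sub-scales contain different numbers of items.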

Interpreting Questionnaire Results

Psychometric analysis of usability questionnaires is conducted to determine their reliability, validity, and sensitivity. For example, the PSSUQ-3 norms below show that most items have means falling below the scale midpoint of 4, indicating that the midpoint should not be used as the sole reference point for judging participants' perceptions of usability.

PSSUQ-3 Norms (Lower Limit / Mean / Upper Limit of 99% Confidence Intervals):

1. Overall, I am satisfied with how easy it is to use this system: 2.60 / 2.85 / 3.09
2. It was simple to use this system: 2.45 / 2.69 / 2.93
3. I was able to complete the tasks and scenarios quickly using this system: 2.86 / 3.16 / 3.45
4. I felt comfortable using this system: 2.40 / 2.66 / 2.91
5. It was easy to learn to use this system: 2.07 / 2.27 / 2.48
6. I believe I could become productive quickly using this system: 2.54 / 3.17
7. The system gave error messages that clearly told me how to fix problems: 3.36 / 3.70 / 4.05
8. Whenever I made a mistake using the system, I could recover easily and quickly: 3.21 / 3.49
9. The information (such as on-line help, on-screen messages, and other documentation) provided with this system was clear: 2.65 / 2.96 / 3.27
10. It was easy to find the information I needed: 2.79 / 3.38
11. The information was effective in helping me complete the tasks and scenarios: 2.46 / 2.74 / 3.01
12. The organization of information on the system screens was clear: 2.41 / 2.92
13. The interface of this system was pleasant: 2.06 / 2.28 / 2.49
14. I liked using the interface of this system: 2.18 / 2.42
15. This system has all the functions and capabilities I expect it to have: 2.51 / 3.07
16. Overall, I am satisfied with this system: 2.55 / 2.82

Also note the relatively poor ratings for Item 7, which reflect both the difficulty of providing usable error messages in a software product and the overall dissatisfaction that such errors cause in users.

CS 321, November 12, 2015, Questionnaires, Page 274
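The norms above pair each item mean with a 99% confidence interval. A sketch of how such an interval is computed from a sample of ratings is below; it uses the large-sample normal approximation (z ≈ 2.576), which is reasonable for the big datasets behind published norms, while a small usability sample would call for a t critical value instead. The function name and the example ratings are hypothetical.

```python
import math
from statistics import NormalDist, mean, stdev

def ci99(ratings):
    """Return (lower, mean, upper) for a 99% confidence interval on the mean.

    Uses the large-sample normal approximation; for the small n typical
    of usability tests, substitute the t critical value for z.
    """
    m = mean(ratings)
    se = stdev(ratings) / math.sqrt(len(ratings))  # standard error of the mean
    z = NormalDist().inv_cdf(0.995)                # two-sided 99% interval
    return (m - z * se, m, m + z * se)

# Hypothetical ratings for one PSSUQ item across twelve participants.
lo, m, hi = ci99([2, 3, 2, 4, 3, 2, 3, 3, 2, 4, 3, 2])
print(f"{lo:.2f} / {m:.2f} / {hi:.2f}")
```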

Post-Task Usability Questionnaires

While post-study surveys provide information regarding the general satisfaction of users with an interface, brief mini-surveys of user reaction to specific tasks in specific scenarios are often more useful when attempting to diagnose more focused problems.

The After-Scenario Questionnaire (Version 1), rated from Strongly Agree (1) to Strongly Disagree (7), with an NA option:

1. Overall, I am satisfied with the ease of completing the tasks in this scenario.
2. Overall, I am satisfied with the amount of time it took to complete the tasks in this scenario.
3. Overall, I am satisfied with the support information (online help, messages, documentation) when completing the tasks.

Example scenarios and tasks for office software systems:

- Mail Scenario #1: open a note; send reply; delete note.
- Mail Scenario #2: open a note; forward w/reply; save response; delete original.
- Address Scenario: create new listing; modify old listing; delete unmodified listing.
- File Scenario: rename file; copy file; delete file.
- Editor Scenario: locate document; edit document; open note; copy note's text into document; save document; print document.

CS 321, November 12, 2015, Questionnaires, Page 275
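Because the ASQ is administered once per scenario, a simple way to use it diagnostically is to average its three items per scenario and rank scenarios by score. The sketch below assumes hypothetical scenario names and ratings; only the three-item averaging follows the questionnaire itself.

```python
def asq_score(item_ratings):
    """Mean of the three ASQ items (1 = Strongly Agree ... 7 = Strongly Disagree)."""
    return sum(item_ratings) / len(item_ratings)

# Hypothetical per-scenario ASQ responses from one test session.
sessions = {
    "Mail Scenario #1": [2, 2, 3],
    "Mail Scenario #2": [5, 6, 4],   # a likely trouble spot
    "File Scenario":    [2, 1, 2],
}

# Higher means indicate lower satisfaction, flagging scenarios to diagnose first.
for name, ratings in sorted(sessions.items(), key=lambda kv: -asq_score(kv[1])):
    print(f"{asq_score(ratings):.2f}  {name}")
```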

Triangulation

Any given research method has advantages and limitations:

- Lab experiments are abstract and obtrusive, and may not be representative of the real world.
- Field studies cannot be controlled, so it is hard to make strong, precise claims regarding comparative usability.
- Self-reporting (via questionnaires) is often biased by reactivity (e.g., subjects try to be polite or to say what they think they should say, instead of the truth).

One way to deal with this problem is triangulation: using multiple methods to tackle the same research question. If they all support your claim, then you have much stronger evidence, without as many shared biases.

CS 321, November 12, 2015, Questionnaires, Page 276