IS214 Recap

IS214
Understanding Users and Their Work
–User and task analysis
–Ethnographic methods
–Site visits: observation, interviews
–Contextual inquiry and design
–Universal usability
Evaluating
–Usability inspection methods, including heuristics and guidelines
–Surveys, interviews, focus groups
–Usability testing
–Server log analysis
Organizational and Managerial Issues
–Ethics; managing usability

Methods: assessing needs, evaluating

Method                                          Needs   Evaluation
User and task analysis                            x
Ethnographic methods                              x
Observation, interviews                           x         x
Contextual inquiry & design                       x
Universal usability                               x         x
Usability inspection (heuristics, guidelines)     x         x
Surveys, interviews, focus groups                 x         x
Usability testing                                           x
Server log analysis                                         x

Intro to usability and UCD
Usability concepts
–Usability as more than interface
–Functionality, content, and design
User-Centered Design
–Usability begins with design
–At every stage in the design process, usability means using appropriate methods to perform user-based evaluation
–Placing users (not cool technology or…) at the center of design
–Iterative design

Understanding Users and Their Work
To inform design & evaluation

User and Task Analysis
Can’t ask “how good is this?” without asking “for whom and for what purpose?”
Users
–Selecting users: whom do you need to include? How many?
–Categorizing users
–Getting people’s cooperation; trust
Tasks
–Identifying & describing the tasks they (currently) perform
–Technology design is work re-design
User-task matrix
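A user-task matrix is just a cross-tabulation of user groups against the tasks they perform. A minimal Python sketch of one; the user groups, tasks, and frequencies are made up for illustration:

# Minimal user-task matrix: rows are user groups, columns are tasks,
# values are how often each group performs the task (all data hypothetical).
user_task_matrix = {
    "librarian": {"catalog search": "daily",  "record editing": "daily",  "report export": "monthly"},
    "patron":    {"catalog search": "weekly", "record editing": "never",  "report export": "never"},
    "admin":     {"catalog search": "rarely", "record editing": "weekly", "report export": "weekly"},
}

def tasks_for(group):
    """Tasks a user group actually performs, useful for scoping site visits."""
    return [task for task, freq in user_task_matrix[group].items() if freq != "never"]

print(tasks_for("patron"))  # ['catalog search']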

Ethnographic methods
Methods and principles of social science research are fundamental to collecting, analyzing, and interpreting data for needs and usability assessment
–Reliability
–Validity
One set of methods: ethnographic
–Studying “users in the wild”
–Learning their understanding of their work: purposes and practices
–Seeing how they actually do their work (as opposed to formal work processes)

Site Visits
Observing
–Seeing people doing what they do, how they do it, under the conditions that they do it
–Asking questions as they work
–Tacit knowledge: people may not be able to articulate what they do
–Recollection: people may not think to mention things, or not think them important
Interviewing
–Getting users’ understandings and interpretations
–Ability to probe
–Interviewing skills!

Contextual Inquiry and Design
A systematic, ethnographically based method for:
–Collecting, interpreting, and summarizing information about work practices and organizational factors
–Incorporating findings into design
Structured approach to data collection, recording, and interpretation
Complex; requires that the entire team be trained in it

Evaluating
A design, prototype, or working system
Not a clean distinction between design and evaluation

Usability inspection methods
A variety of methods in which experts (not users) inspect (not use) a design, prototype, or system
Including:
–Competitive evaluation
–Heuristic evaluation
  Commonly used method
  Easy
  Lots of information with not much investment
  Reflects short-term use; limited depth
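The slides do not prescribe how to consolidate inspection results, but a common approach is to pool each evaluator’s findings and rank problems by average severity. A minimal sketch, with hypothetical findings and an assumed 0-4 severity scale:

# Pool heuristic-evaluation findings from several evaluators and rank
# problems by mean severity (0 = not a problem ... 4 = catastrophe).
# All findings below are hypothetical.
from collections import defaultdict
from statistics import mean

findings = [
    ("no feedback after save",   "visibility of system status",         3),
    ("no feedback after save",   "visibility of system status",         4),
    ("jargon in error messages", "match between system and real world", 2),
]

by_problem = defaultdict(list)
for problem, heuristic, severity in findings:
    by_problem[(problem, heuristic)].append(severity)

for (problem, heuristic), scores in sorted(
        by_problem.items(), key=lambda kv: mean(kv[1]), reverse=True):
    print(f"{mean(scores):.1f}  {problem}  [{heuristic}]")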

Surveys
Useful for collecting data directly from users at various stages of design and development
Can reach a large number of users
Standardized questions and answer formats are easy to analyze
Issues of sample composition, sample size, and validity
Only get answers to the questions you think to ask
Question (and answer) wording affects results
Lack of depth and follow-up
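Standardized answer formats are what make survey data easy to tabulate. A minimal sketch summarizing a single 5-point Likert item; the responses are made up for illustration:

# Summarize a 5-point Likert item (1 = strongly disagree ... 5 = strongly agree).
from collections import Counter
from statistics import mean, median

responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]   # hypothetical data

print("n =", len(responses))
print("mean =", round(mean(responses), 2), " median =", median(responses))
for value, count in sorted(Counter(responses).items()):
    print(f"{value}: {'#' * count}")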

Usability testing
Lab-based tests
Usually standardized tasks observed under controlled conditions
Good for getting performance data unsullied by variations in use conditions
Bad for getting performance data under real conditions of use (ecological validity)
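Because the tasks are standardized, performance measures can be compared across participants. A minimal sketch computing task success rate and time on task; the session records and field names are hypothetical:

# Summarize standardized-task results across participants (data hypothetical).
from statistics import mean

sessions = [
    {"participant": "P1", "task": "find hours", "completed": True,  "seconds": 42},
    {"participant": "P2", "task": "find hours", "completed": True,  "seconds": 67},
    {"participant": "P3", "task": "find hours", "completed": False, "seconds": 180},
]

success_rate = sum(s["completed"] for s in sessions) / len(sessions)
times = [s["seconds"] for s in sessions if s["completed"]]
print(f"success rate: {success_rate:.0%}")
print(f"mean time on task (successful runs only): {mean(times):.0f}s")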

Focus groups
Again, useful at many stages in the process
In-depth information from users
Interaction among users helpful (or sometimes not)
Limits:
–small numbers
–limited time period
–effects of strong personalities or a sidetrack in the conversation
Skilled facilitator!
Hard to do well, easy to mess up

Server log analysis
Analyzes data collected automatically
Large numbers
Unobtrusive
Does not rely on user cooperation or memory
Limits to the data available
Inferences must be justified by the data
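To make the “data collected automatically” concrete, here is a minimal sketch that counts the most-requested pages in a Common Log Format access log. The field layout and the file name access.log are assumptions; real server logs vary:

# Count the most-requested pages in a Common Log Format access log.
# Assumed layout: host ident user [time] "METHOD path HTTP/x" status bytes
import re
from collections import Counter

REQUEST = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP/[\d.]+" (\d{3})')

hits = Counter()
with open("access.log") as log:               # file name is hypothetical
    for line in log:
        m = REQUEST.search(line)
        if m and m.group(2).startswith("2"):  # count successful (2xx) responses only
            hits[m.group(1)] += 1

for path, count in hits.most_common(10):
    print(f"{count:6d}  {path}")

Note that a request count shows only what was fetched, not what was read, by whom, or why; that is the kind of limit the last two bullets warn about.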

Organizational and Managerial Issues

Analyzing and presenting results
Lots of data that has to be summarized in useful form
What is the purpose of your study? What do you know? What do you need to know?
What recommendations can you develop from your data?
How do you present your findings succinctly and clearly, in a way that your audience will understand and use?

Ethics
Do no harm to the people you are studying
Choices of projects?

Managing usability
How usability fits into organizations
“We don’t get no respect”

Universal usability
International usability
Accessibility
–Removing unnecessary barriers
–Being aware of and designing for the variety of people’s capabilities
–Incorporating multimodal information presentation and functionality

Topic we might have covered: credibility
Larger issue: when presenting content, not (just) functionality, you need to understand how people use and evaluate information
Factors that affect web site credibility:
–Source: institutional or personal; expertise; bias or interest
–Currency (how up to date the info is)
–Observable factors used as indicators of unobservable ones:
  Language, (absence of) mistakes
  Links, imprimaturs

Some final questions
How do we understand users’ activities, needs, interpretations, & preferences?
–Especially for things that don’t yet exist
–Users and uses are varied
–People can’t always articulate what we would like to know from them
–The observer is not a perfectly objective “tool”
How do we translate these understandings into recommendations and designs?
How do we decide what trade-offs to make?
–Among users (including organization vs. individuals)
–Between cost of design and priority of needs