Presentation transcript:

Research and Analysis Methods October 5, 2006

Surveys
Electronic vs. paper surveys
–Electronic: very efficient, but requires users willing to take them; possibility of technical problems; concerns about privacy
–Paper: fewer concerns about confidentiality, but often low response rates
Issue of length > how do you motivate people to fill in your survey?
–Rewards > introduce the possibility of bias
–Observer bias: the act of measuring changes the thing being measured
Versatile: many types (user satisfaction, paired with a usability test, etc.)

Interviews vs. Focus Groups
Interview: one person at a time; captures individual differences (individual subjectivity)
–Problems: hard to generalize, hard to compare with other interviews (need a schedule of questions)
Focus groups: many people at one time; people stimulate responses within the group; can come to some group consensus
–Problems: self-censorship; lack of privacy and confidentiality

Observation
Go into the work site and watch people using a website; you may see
–Problems (and how people solve them)
–Use of secondary information (e.g. people need to look up words > add a glossary function)
–Frequency of use of different parts of the site
Generally, observations are not directed
–Usability tests use directed scenarios; info for those scenarios often comes from user observation
Problems: getting access to work sites; little value if the site is only used occasionally

Market Research
Much research is already available on the general characteristics of some user groups:
–Students
–Yuppies
–Men vs. women
Good for demographic information (info about the larger population)
–Can help identify characteristics of the sample to recruit for surveys, interviews, usability tests
Problems: often little guidance for usability decisions (navigation, choice of info on the site, etc.)

Site Usage Statistics
Possible to get information from the server
–Who is using the website (IP address of the computer)
–What pages are being accessed
Problems:
–Can't just count the number of times a page is requested from the server > could just be someone moving back and forth within the site
–IP addresses help you identify different users but say nothing about their demographics or needs
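To make this kind of data concrete, here is a minimal sketch (assuming the server writes Common Log Format and a hypothetical access.log file) that counts how often each page is requested and how many distinct IP addresses appear. As the slide notes, these raw numbers say nothing about who the visitors are or what they needed.

```python
import re
from collections import Counter

# Minimal sketch: count page requests and unique visitor IPs from a server
# access log. Assumes Common Log Format and a hypothetical "access.log";
# adjust the pattern for your server's actual format.
LOG_LINE = re.compile(r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (?P<path>\S+)')

page_hits = Counter()
visitors = set()

with open("access.log") as log:
    for line in log:
        match = LOG_LINE.match(line)
        if match:
            visitors.add(match.group("ip"))
            page_hits[match.group("path")] += 1

print(f"Unique IP addresses seen: {len(visitors)}")
for path, hits in page_hits.most_common(10):
    print(f"{hits:6d}  {path}")
```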

Comparative Evaluation
Ask target users what sites (and what features of other sites) they like
–Identify the characteristics of those sites and compare them with your own
Look at other sites in your segment
–Assumption: you are all trying to get the attention of the same target audience
–Need to be able to match the functions, text, and images that they use

Usability Tests
For sites that already exist
Identify specific problems through task scenario testing
–Pick a typical task (perhaps from observation)
–Ask the user to complete the task and talk their way through the steps (think-aloud protocol)
–Observe; may sometimes need to prompt for thoughts and responses
A final survey identifies general likes/dislikes
–Colours, navigation, images, etc. of this site
–Accuracy, completeness, consistency
–Easy to understand, emotional involvement

Additional Methods
Participatory Design
–Include users in the (re)design of the website
–Requires an organizational commitment to actually listen to them (sometimes a problem in management vs. labour situations)
Paper prototyping
–Use paper rather than online prototypes because they are quick, flexible, easy to change, and not too finished
–Tangible (touch) methods often elicit more emotional/subjective info from users

Card Sorts
Many different versions
–See Lazar
One use: decide what to put in or take out of a website (content analysis)
–Identify content; put one item on each index card
–Ask people to sort the cards into 5 piles (must have, nice to have, neutral, little use, would never use)
–Size of piles: specific size (forced choice) or free choice; forced choice requires people to evaluate/make decisions
Fun for users; if you have many decks of cards, you can test lots of people quickly
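A hedged sketch of how the results of such a closed card sort might be tallied afterwards: the item names and participant responses below are hypothetical placeholders, but the five piles match the ones listed above.

```python
from collections import Counter, defaultdict

# Minimal sketch for tallying a closed card sort: each participant assigns
# every content item (one per index card) to one of five piles.
PILES = ["must have", "nice to have", "neutral", "little use", "would never use"]

# Hypothetical responses from three participants.
responses = [
    {"course calendar": "must have", "staff photos": "little use"},
    {"course calendar": "must have", "staff photos": "neutral"},
    {"course calendar": "nice to have", "staff photos": "would never use"},
]

tallies = defaultdict(Counter)
for participant in responses:
    for item, pile in participant.items():
        tallies[item][pile] += 1

for item, counts in tallies.items():
    breakdown = ", ".join(f"{pile}: {counts[pile]}" for pile in PILES if counts[pile])
    print(f"{item} -> {breakdown}")
```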

Error Analysis (Critical Incidents)
Useful for finding what does not work
Error logs, messages to the webmaster, phone calls for help, etc.
Sites vary in terms of the importance of errors:
–What happens if someone doesn't find info on the computer science department website?
–What happens if someone can't find info on emergency contraception?
May need to classify some user scenarios/tasks as critical (must be able to complete them successfully)
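One way to start, sketched below under loose assumptions: a hypothetical errors.log in which each line holds a status code and the requested URL. The sketch counts which URLs fail most often and flags any that belong to tasks the team has classified as critical.

```python
from collections import Counter

# Minimal sketch: find what does not work by counting 404 failures per URL
# in a hypothetical "errors.log" ("<status> <url>" per line), flagging pages
# that belong to critical tasks.
CRITICAL_PAGES = {"/emergency-contraception"}  # hypothetical critical-task page

failures = Counter()
with open("errors.log") as log:
    for line in log:
        parts = line.split(maxsplit=1)
        if len(parts) == 2 and parts[0] == "404":
            failures[parts[1].strip()] += 1

for url, count in failures.most_common():
    flag = "  <-- critical task" if url in CRITICAL_PAGES else ""
    print(f"{count:5d}  {url}{flag}")
```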

Heuristic (Expert) Evaluation
Many problems can be found by educated usability professionals
–Design, navigation, site hierarchy (too deep), performance, etc.
–Can't identify subjective likes/dislikes, etc.
Usability principles (from Lazar) and design principles (from Williams' Non-Designer's guides) can be applied to improve sites
One method for the class project can be your group's own heuristic evaluation of problems
–Just need to be able to explain/classify these
–Why do you decide something is a problem…

More Things to Consider
Population vs. Sample
–The sample needs to represent the population
Convenience vs. Random samples
–Need to identify the potential for bias in your sampling practices
Do students in the ASU represent all Acadia students?
Do students in the Wong Centre represent all groups the Wong wants to attract?
How many people to survey or test
–Surveys: 30+
–Usability tests: 5 (Nielsen) to 20 (statistical validity) → 8-9
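A small sketch of the convenience vs. random distinction, using a hypothetical pool of 500 student IDs as the population; the point is only to show the difference in how participants are chosen, not to remove the need to think about who the population actually is.

```python
import random

# Minimal sketch contrasting a convenience sample with a simple random
# sample, drawn from a hypothetical population of 500 student IDs.
population = [f"student_{i:03d}" for i in range(1, 501)]

# Convenience sample: whoever is easiest to reach (here, simply the first 30).
# Cheap, but may systematically miss parts of the population.
convenience_sample = population[:30]

# Simple random sample: every member has an equal chance of being chosen,
# which reduces (but does not eliminate) sampling bias.
random_sample = random.sample(population, 30)

print(convenience_sample[:5])
print(random_sample[:5])
```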

Types of Information
Demographic info
–Need to gather info about the people in your surveys, interviews, focus groups, observations, and usability tests to be sure they match the target users
Content Questions
–Create consistency by developing a schedule (set) of questions before you start your research
–Watch out for leading questions (they imply the answer you want to hear)
"We worked hard on Welcome Week; how successful was it?"
vs. "You had the opportunity to attend Welcome Week; how successful was it?"
–Probes encourage responses ("and why do you believe that?") but don't add info

Types of Scales
Open questions: free answer
Closed questions: fixed set of answers (e.g., multiple choice)
Semantic differential (good for emotion and subjectivity):
–Warm 1 2 3 4 5 Cold
–Exciting 1 2 3 4 5 Boring
Major distinction:
–Qualitative research (open questions, free observation, unstructured inquiry) vs.
–Quantitative research (closed questions, able to apply statistical analysis)
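On the quantitative side, closed-scale responses can be summarised with simple descriptive statistics. The sketch below uses hypothetical 1-5 semantic differential ratings matching the two scales above (1 = Warm/Exciting, 5 = Cold/Boring).

```python
from statistics import mean

# Minimal sketch: summarise hypothetical semantic-differential ratings
# (1 = Warm / Exciting, 5 = Cold / Boring) with a mean per scale.
ratings = {
    "warm-cold": [1, 2, 2, 3, 1],
    "exciting-boring": [4, 3, 5, 4, 4],
}

for scale, scores in ratings.items():
    print(f"{scale}: mean = {mean(scores):.1f}, n = {len(scores)}")
```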

Final Thoughts
Anonymity vs. confidentiality: when you do research with human subjects, you need to protect them from harm
–Anonymous responses: their identities are protected
–Confidential responses: the information itself is not revealed except in statistical averages, etc.
Reliability vs. validity
–Research can be reliable (always gets the same kind of data) but not valid (the data does not reflect the target population)
–Rare to be valid but not reliable!