
CSC 341 Human-Computer Interaction


1 CSC 341 Human-Computer Interaction
Lecture Slides by Dr. Mai Elshehaly

2 Today's Class
From tasks to requirements (recap)
Predictive models: Fitts' Law (recap)
Evaluation: formative versus summative
Paradigms
Techniques
Heuristic Evaluation
Gestalt Theory
DECIDE Framework

3 From User Tasks to Requirements (a recap)
“A functional requirement relates directly to a process the system has to perform as a part of supporting a user task and/or information it needs to provide as the user is performing a task.” Dennis, A., Wixom, B. H., & Tegarden, D. (2015). Systems analysis and design: An object-oriented approach with UML. John Wiley & Sons.

4 From User Tasks to Requirements (a recap)
The International Institute of Business Analysis (IIBA) defines functional requirements as: “the product capabilities, or things that a product must do for its users.” Functional requirements define how the system will support the user in completing a task. Dennis, A., Wixom, B. H., & Tegarden, D. (2015). Systems analysis and design: An object-oriented approach with UML. John Wiley & Sons.

5 From User Tasks to Requirements (a recap)
For example, assume the user task is:
T1: Schedule a client appointment.
The functional requirements associated with that task include:
Determine client availability
Find available openings matching client availability
Select desired appointment
Record appointment
Confirm appointment
Notice how these functional requirements expand upon the user's task to describe the capabilities and functions that the system will need to include, allowing the user to complete the task.
Dennis, A., Wixom, B. H., & Tegarden, D. (2015). Systems analysis and design: An object-oriented approach with UML. John Wiley & Sons.
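To make the task-to-requirements mapping concrete, here is a minimal Python sketch (not from Dennis et al.; the class and method names, such as AppointmentService and Slot, are invented for illustration) in which each functional requirement of T1 becomes one operation of a hypothetical scheduling interface:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List


@dataclass
class Slot:
    """A candidate appointment time window."""
    start: datetime
    end: datetime


class AppointmentService:
    """Hypothetical interface: one method per functional requirement of T1."""

    def determine_client_availability(self, client_id: str) -> List[Slot]:
        """FR1: Determine client availability."""
        raise NotImplementedError

    def find_matching_openings(self, availability: List[Slot]) -> List[Slot]:
        """FR2: Find available openings matching client availability."""
        raise NotImplementedError

    def select_appointment(self, openings: List[Slot], choice: int) -> Slot:
        """FR3: Select desired appointment."""
        return openings[choice]

    def record_appointment(self, client_id: str, slot: Slot) -> None:
        """FR4: Record appointment."""
        raise NotImplementedError

    def confirm_appointment(self, client_id: str, slot: Slot) -> bool:
        """FR5: Confirm appointment with the client."""
        raise NotImplementedError
```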

6 What is formative evaluation?

7 What is formative evaluation?

8 What is evaluation?
"The process of systematically collecting data that informs us about what it is like for a particular user or group of users to use a product for a particular task in a certain type of environment." [Chapter 10]
Notice some interesting keywords?

9 What is evaluation? “The process of systematically collecting data that informs us about what it is like for a particular user or group of users to use a product for a particular task in a certain type of environment.” [Chapter 10]

10 Why evaluate? The goal of evaluation is to assess how well a design fulfills users' needs and whether users like it.

11 When to evaluate?
Formative: evaluations done during design to check that the product continues to meet users' needs
Summative: evaluations done to assess the success of a finished product, such as those done to satisfy a sponsoring agency or to check that a standard is being upheld

12 Why formatively evaluate?
The goal of formative evaluation is to assess how well a design fulfills users' needs and whether users like it during the design and implementation phases, not after.
Developers: identify what to focus on at different stages of development
Designers: understand requirements; this tends to happen through a process of negotiation between designers and users
Users: understand design alternatives and are able to give better feedback

13 How to Evaluate?
Evaluation Paradigms
Evaluation Techniques

14 Evaluation Paradigms
How people from the discipline think about evaluation.
Evaluation is guided by a set of beliefs that may also be underpinned by theory. These beliefs, and the evaluation techniques associated with them, are known as an evaluation paradigm.

15 Evaluation Paradigms
Quick and dirty: informal meetings; solicit feedback from users or consultants to confirm that ideas are in line with users' needs and are liked. Data: descriptive sketches, notes, etc.
Usability testing: measure users' performance on controlled (but typical) tasks; users are recorded and logged. Performance: errors and time (see the sketch below). Opinions: questionnaires and interviews.
Field studies: conducted in a natural setting through observation, interviews, and ethnography. Data: audio, video, notes. Analysis: content, discourse, and conversational. Evaluator: insider or outsider.
Predictive evaluation: experts apply their knowledge of typical users, using heuristics, and make use of theoretical models (e.g., usability goals, interaction models).
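As a minimal sketch of how usability-testing logs can be summarized into the performance measures mentioned above (errors and time), the following Python snippet uses entirely hypothetical trial data and invented field names:

```python
from statistics import mean

# Hypothetical log of usability-test trials: one record per user per task.
trials = [
    {"user": "P1", "task": "T1", "seconds": 42.0, "errors": 1, "completed": True},
    {"user": "P2", "task": "T1", "seconds": 63.5, "errors": 3, "completed": True},
    {"user": "P3", "task": "T1", "seconds": 55.0, "errors": 0, "completed": False},
]

completed = [t for t in trials if t["completed"]]

print("Completion rate:", len(completed) / len(trials))              # 2 of 3 trials completed
print("Mean time on task (s):", mean(t["seconds"] for t in completed))
print("Mean errors per trial:", mean(t["errors"] for t in trials))
```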

16 Heuristic Evaluation
A group of experts performs a walkthrough:
Each inspector browses through each part of the interaction design, asking (to self) the heuristic questions
Assess compliance, noting where heuristics are supported and where they are violated, along with context (e.g., screenshots)
The inspectors then get together as a team:
Discuss, compare, and merge problem lists (see the sketch below)
Brainstorm suggested solutions
Decide on recommendations
Write a report
The most widely used HCI heuristics are Nielsen's heuristics.
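A minimal sketch of the "compare and merge problem lists" step, assuming each inspector records findings as (heuristic, screen, note) entries; the data and field layout are invented for illustration:

```python
from collections import defaultdict

# Hypothetical findings from two inspectors: (heuristic, screen, note).
inspector_a = [
    ("Visibility of system status", "checkout", "no progress indicator"),
    ("Error prevention", "signup", "date field accepts 31 Feb"),
]
inspector_b = [
    ("Visibility of system status", "checkout", "no progress indicator"),
    ("Consistency and standards", "settings", "two different 'Save' labels"),
]

# Merge: group duplicate findings and count how many inspectors reported each.
merged = defaultdict(int)
for finding in inspector_a + inspector_b:
    merged[finding] += 1

for (heuristic, screen, note), count in sorted(merged.items()):
    print(f"[{heuristic}] {screen}: {note} (reported by {count} inspector(s))")
```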

17 Nielsen’s 10 usability heuristics

18 1. Visibility of system status
What mode am I in now? What did I select? How is the system interpreting my actions?

19 2. Match between system and the real world
Speak the user's language
Information shown in a logical order
Avoid technical jargon

20 3. User Control and Freedom
Users will make mistakes and will want to fix them or back out of them

21 4. Consistency and Standards

22 5. Error Prevention
Only legal data and legal commands are allowed
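One way to read "only legal data and legal commands are allowed" is to reject illegal input before it reaches the system. The sketch below is a hypothetical illustration (the command set is invented), not part of the lecture material:

```python
ALLOWED_COMMANDS = {"open", "save", "close"}  # hypothetical legal command set

def run_command(command: str) -> str:
    """Reject illegal commands up front instead of failing later (error prevention)."""
    if command not in ALLOWED_COMMANDS:
        # Offer the legal options rather than a bare error message.
        return f"Unknown command '{command}'. Try one of: {', '.join(sorted(ALLOWED_COMMANDS))}"
    return f"Running '{command}'..."

print(run_command("save"))
print(run_command("delete"))
```

In a GUI, the same principle is often enforced by constrained widgets (drop-down lists, date pickers) rather than by after-the-fact validation.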

23 Bonus Activity (+2 points)
Explain in your Project Phase II deliverable which of Nielsen's heuristics are upheld and which are violated (you will be rewarded for acknowledging violations). Refer to heuristic-evaluation-for-usability-in-hci-and-information-visualization.

24 Gestalt Theory
Gestalt is a German word meaning "configuration."
According to Gestalt psychologists, there are five main laws of grouping that we can apply in heuristic evaluations:
Proximity
Similarity
Continuity
Closure
Common Fate

25 Proximity
Slides 25–34: image examples. Credit: Dr. Ayman Ezzat, modified version of slides by Dr. Frank Kriwaczek.
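To show how the proximity principle might translate into an interface layout, here is a minimal tkinter sketch (not from the slides; the form groups and labels are invented): related fields sit close together inside a group, with extra space between groups, so users perceive each group as a unit.

```python
import tkinter as tk
from tkinter import ttk

# Proximity: items placed close together are perceived as belonging together,
# so "shipping" and "billing" read as two separate units.
root = tk.Tk()
root.title("Proximity example (hypothetical form)")

for group_name, fields in [("Shipping address", ["Name", "Street", "City"]),
                           ("Billing details", ["Card number", "Expiry"])]:
    frame = ttk.LabelFrame(root, text=group_name, padding=10)
    frame.pack(fill="x", padx=10, pady=10)  # extra space *between* groups
    for row, field in enumerate(fields):
        ttk.Label(frame, text=field).grid(row=row, column=0, sticky="w", padx=(0, 8), pady=2)
        ttk.Entry(frame).grid(row=row, column=1, pady=2)

root.mainloop()
```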

35 Example Workflow


38 Break

39 DECIDE: A framework for evaluation
Determine the goals; Explore the questions; Choose the evaluation paradigm and techniques; Identify the practical issues; Decide how to deal with the ethical issues; Evaluate, interpret, and present the data.

40 Evaluation Goals?

41 Evaluation Goals
G1: Check that the evaluators have understood the users' needs.
Suggest a technique and paradigm.


43 Evaluation Goals
G2: Identify the metaphor on which to base the design.
Suggest a technique and paradigm.


45 Evaluation Goals
G3: Check to ensure that the final interface is consistent.
Suggest a technique and paradigm.


47 Evaluation Goals
G4: Investigate the degree to which technology influences working practices.
Suggest a technique and paradigm.


49 Evaluation Goals
G5: Identify how the interface of an existing product could be engineered to improve its usability.
Suggest a technique and paradigm.


51 DECIDE: A framework for evaluation

52 Evaluation Questions
Break goals into questions, and questions into sub-questions.
Example goal: Assess whether an online reservation system influences people's dining-out patterns.

53 Evaluation Questions: Example
Goal: Assess whether an online reservation system influences people's dining-out patterns.
Questions:
Is the user interface good or poor?
Is the system easy or difficult to navigate?
Is the search tool confusing?
Is the interface design inconsistent?
Is the response time too slow?
Is the feedback confusing or insufficient?
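One common way (not prescribed in the slides) to answer such questions quantitatively is to phrase each as a Likert-scale item and summarize the responses; everything below, including the sample ratings, is hypothetical:

```python
from statistics import mean

# Hypothetical 5-point Likert responses (1 = strongly disagree, 5 = strongly agree)
# to statements derived from the evaluation questions above.
responses = {
    "The system is easy to navigate":       [4, 5, 3, 4, 2],
    "The search tool is confusing":         [2, 1, 4, 2, 3],
    "The response time feels too slow":     [3, 3, 2, 4, 3],
    "The feedback messages are sufficient": [4, 4, 5, 3, 4],
}

for statement, ratings in responses.items():
    print(f"{statement}: mean = {mean(ratings):.1f} (n = {len(ratings)})")
```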

54 DECIDE: A framework for evaluation

55 DECIDE: A framework for evaluation

56 Practical Issues for Evaluation
Recruiting users
Facilities and equipment
Budget and schedule
Expertise

57 DECIDE: A framework for evaluation

58 Ethical Issues
Inform of procedures
Explain confidentiality
Reassure (participants can stop at any time)
Pay users
Maintain confidentiality
Ask for permission to quote users if needed
Consent form
Activities to familiarize users

59 References Chapter 10 Sections 11.1, 11.2, 11.3, 13.4

60 Quiz
Why can a Macintosh pull-down menu be accessed at least five times faster than a typical Windows pull-down menu? Suggest at least two reasons why Microsoft made such an apparently stupid decision.

61 Quiz
Explain why a Macintosh pull-down menu can be accessed at least five times faster than a typical Windows pull-down menu.
Microsoft, Sun, and others made the decision to mount the menu bar on the window, rather than at the top of the display as Apple did. They made this decision for at least two reasons:
Apple claimed copyright and patent rights on the Apple menu bar
Everyone else assumed that moving the menu bar closer to the user, by putting it at the top of the window, would speed things up.
More questions: Test yourself:

62 Fitts' Law (a recap)
Fitts' Law: the time to acquire a target is a function of the distance A to the target and the size W of the target.
Homework Assignment 2
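Fitts' Law is commonly written in the Shannon formulation MT = a + b log2(A/W + 1), where A is the distance to the target and W is its size along the axis of motion. The sketch below plugs in illustrative (made-up) constants a and b and compares two hypothetical targets; it also hints at the quiz above: a menu bar at the screen edge behaves like a much larger target because the cursor cannot overshoot the display edge.

```python
import math

def fitts_mt(a: float, b: float, distance: float, width: float) -> float:
    """Predicted movement time (Shannon formulation): MT = a + b * log2(D/W + 1)."""
    return a + b * math.log2(distance / width + 1)

# Illustrative constants only; real a and b come from fitting experimental data.
a, b = 0.1, 0.15  # seconds, seconds per bit

# Hypothetical targets at the same distance. The screen-edge menu acts as a much
# "deeper" target because the cursor stops at the display edge instead of overshooting.
print("In-window menu item :", round(fitts_mt(a, b, distance=400, width=20), 3), "s")
print("Screen-edge menu bar:", round(fitts_mt(a, b, distance=400, width=200), 3), "s")
```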

