1
Design and Evaluation Methods Chap 3
2
► Technology-oriented vs. User- or Customer-oriented ► Understanding customer needs and desires
3
Design and Evaluation Methods ► Overview of Design and Evaluation ► Front-end Analysis ► Iterative Design and Testing ► Final Test and Evaluation ► Conclusion
4
Overview of Design and Evaluation ► Cost/Benefit Analysis of Human Factors Contributions ► Human Factors in the Product Design Lifecycle ► User-centered Design ► Sources for Design Work
5
Cost/Benefit Analysis of Human Factors Contributions (1/2) ► Costs: Personnel, Materials (Tab 3.1, Tab 3.2) ► Benefits: Mayhew (1992) Increased sales Decreased cost of providing training Decreased customer support costs Decreased development costs Decreased maintenance costs Increased user productivity Decreased user errors Improved quality of service Decreased training time Decreased user turnover
6
Cost/Benefit Analysis of Human Factors Contributions (2/2) ► Benefits: health- or safety-related ► Total Benefits Estimate the relevant variables without human factors intervention (A) Estimate the same variables with HF intervention (B) Total benefit = (B) - (A)
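As a concrete illustration, a minimal Python sketch of this (B) - (A) arithmetic; the variable names and dollar figures are hypothetical, not taken from the chapter's tables:

```python
# Hypothetical sketch of the total-benefit arithmetic for a human factors (HF)
# intervention.  Variable names and dollar figures are illustrative assumptions.

without_hf = {          # (A) estimates of the relevant variables without HF
    "annual_sales": 500_000,
    "user_productivity_value": 300_000,
}
with_hf = {             # (B) estimates of the same variables with HF
    "annual_sales": 540_000,
    "user_productivity_value": 330_000,
}
hf_cost = 40_000        # personnel + materials for the HF effort

# Total benefit = (B) - (A), summed over the estimated variables.
total_benefit = sum(with_hf[k] - without_hf[k] for k in without_hf)
print(f"Total benefit: ${total_benefit:,}")                    # $70,000
print(f"Benefit/cost ratio: {total_benefit / hf_cost:.2f}")    # 1.75
```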
7
Human Factors in the Product Design Lifecycle ► Must be involved as early as possible Multidisciplinary design team members ► Product life cycle (six major stages) Front end analysis Iterative design and test System production Implementation and evaluation System operation and maintenance System disposal
8
User-Centered Design (1/2) ► User-centered design: center the design process around the user ► How Adequately determining user needs Involving the user at all stages of the design process ► Subfield: usability engineering (in software design)
9
User-Centered Design (2/2) ► Four general approaches Early focus on the user and tasks Empirical measurement: focus on quantitative performance data Iterative design using prototypes (rapid changes) Participatory design: users as part of the design team
10
Sources for Design Work ► Data Compendiums (condensed summaries) Condensed and categorized databases ► Human Factors Design Standards Precise recommendations related to specific areas or topics ► Human Factors Principles and Guidelines Cover a wide range of topics Guides rather than hard-and-fast rules Require careful consideration and application
11
Front-End Analysis ► User Analysis ► Environment Analysis ► Function and Task Analysis ► How to Perform a Task Analysis ► Collect Task Data ► Summarize Task Data ► Analyze Task Data ► Identify User Preferences and Requirements
12
User Analysis (1/2) ► User population Most important: regular users or operators Potential users (wider range of users) ► Complete description Age, gender, education level Reading ability, physical size, physical ability Familiarity with the product, task-relevant skills etc.
13
User Analysis (2/2) ► Personas (vs. list of characteristics) Persona: hypothetical person Not real people, but represent key characteristics of the user population Specific (even have a name) For most applications: 3 or 4 personas
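One possible way to record a persona as structured data is sketched below; the fields and the example person are illustrative assumptions, not prescribed by the text:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Persona:
    """A hypothetical person representing key characteristics of the user population."""
    name: str                   # personas are specific -- they even have a name
    age: int
    education: str
    reading_ability: str
    physical_ability: str
    product_familiarity: str    # e.g. "novice", "occasional", "expert"
    goals: List[str] = field(default_factory=list)

# For most applications, 3 or 4 personas are enough; one illustrative example:
maria = Persona(
    name="Maria", age=34, education="college degree",
    reading_ability="fluent", physical_ability="no limitations",
    product_familiarity="occasional",
    goals=["complete routine transactions quickly"],
)
```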
14
Environment Analysis ► Depends on the specific environment User characteristics Activities, basic tasks
15
Function and Task Analysis (1/3) ► Goal, Function, Task Goal: end condition or reason for performing the tasks Function: general transformations (of information and system state) to achieve the goal Task: specific activities to carry out a function
16
Function and Task Analysis (2/3) ► Function analysis Analysis of the basic functions performed by the “system” System: human-machine, human-software, human-equipment- environment, etc. ► Task analysis Systematically describing human interaction with a system to understand how to match the demands of the system to human capabilities
17
Function and Task Analysis (3/3) ► Task analysis (cont.) Preliminary task analysis: activity analysis At the beginning of the design process: preliminary task analysis As design progresses: more extensive task analysis
18
How to Perform a Task Analysis (1/2) ► Four steps of a task analysis Define the analysis purpose and identify the type of data required Collect task data Summarize task data Analyze task data
19
How to Perform a Task Analysis (2/2) ► Define Purpose and Required Data Define purpose: focus the analysis on the end use of the data Information gathered depends on: purpose, type of the task (physical task, cognitive task) Type of information: 1. hierarchical relationships 2. information flow 3. task sequence 4. location and environmental conditions
20
Collect Task Data (1/7) ► Observation ► Think-Aloud Verbal Protocol ► Task Performance with Questioning ► Unstructured and Structured Interviews ► Surveys and Questionnaires
21
Collect Task Data (2/7) ► Observation Perform under typical scenarios Identify different methods for accomplishing a goal Not sufficient for primarily cognitive tasks
22
Collect Task Data (3/7) ► Think-Aloud Verbal Protocol Underlying goals, strategies, decisions, other cognitive components Verbal protocol: verbalizations regarding task performance Verbal protocol analysis Type of verbal protocol − Concurrent (difficult, procedural information) − Retrospective (useful, explanations) − Prospective: imagine performing the task
23
Collect Task Data (4/7) ► Task Performance with Questioning Advantage: may cue users to verbalize goals, … Disadvantage: disruptive Retrospective analysis of videotapes: effective − combines think-aloud verbalization − information the user fails to provide can be requested − can pause and ask questions
24
Collect Task Data (5/7) ► Unstructured and Structured Interviews Begin with short unstructured interviews − How users go about their activities − Preferences, strategies − Where they fail to achieve their goals, make errors, … Question probes: When, How, Why (and why not) Focus group − 6-10 users led by a facilitator − Facilitator: familiar with the task & system, neutral
25
Collect Task Data (6/7) ► Surveys and Questionnaires After obtaining preliminary descriptions of activities or basic tasks Affirm the accuracy of the information Determine how frequently tasks are performed Identify preferences or biases
26
Collect Task Data (7/7) ► Beyond the Limitations Should focus on the basic user goals and needs Not on how they are carried out using the existing products Evaluate: the underlying characteristics of the environment the control requirements of the system
27
Summarize Task Data ► Lists, Outlines, and Matrices (Tab 3.3) ► Hierarchies Hierarchical task analysis (HTA) (Fig 3.1) GOMS: goals, operators, methods, and selection rules ► Flow Charts, Timelines, and Maps Operational sequence diagram (OSD) (Fig 3.2)
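A minimal sketch of how an HTA hierarchy might be captured as nested data; the task, its plan, and the subtasks are hypothetical stand-ins for whatever the analysis produces:

```python
# Hierarchical task analysis (HTA) captured as a nested structure.
# Task names and plans are hypothetical, for illustration only.
hta = {
    "goal": "0. Withdraw cash from ATM",
    "plan": "do 1-4 in order",
    "subtasks": [
        {"goal": "1. Insert card"},
        {"goal": "2. Enter PIN",
         "plan": "repeat 2.1 until accepted",
         "subtasks": [{"goal": "2.1 Type digit"}]},
        {"goal": "3. Select amount"},
        {"goal": "4. Take cash and card"},
    ],
}

def print_hta(node, depth=0):
    """Print the hierarchy with indentation reflecting task decomposition."""
    print("  " * depth + node["goal"])
    if "plan" in node:
        print("  " * depth + f"  plan: {node['plan']}")
    for sub in node.get("subtasks", []):
        print_hta(sub, depth + 1)

print_hta(hta)
```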
28
Analyze Task Data (1/2) ► Network Analysis (Fig 3.3) Matrix representation of information flows between functions Identify clusters of related functions ► Workload Analysis ► Simulation and Modeling ► Safety Analysis
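To make the matrix representation concrete, here is a small sketch with hypothetical functions and made-up flow counts; the threshold rule for grouping functions is a deliberately crude stand-in for a real clustering method:

```python
# Network analysis sketch: matrix of information flows between functions.
# Function names and flow counts are illustrative assumptions.
functions = ["enter order", "check stock", "schedule delivery", "bill customer"]

# flows[i][j] = how often information passes from functions[i] to functions[j]
flows = [
    [0, 8, 1, 2],
    [6, 0, 1, 0],
    [1, 1, 0, 5],
    [2, 0, 4, 0],
]

# Group functions whose two-way traffic exceeds a threshold -- a crude
# stand-in for identifying clusters of closely related functions.
THRESHOLD = 5
clusters = []
for i in range(len(functions)):
    for j in range(i + 1, len(functions)):
        if flows[i][j] + flows[j][i] >= THRESHOLD:
            clusters.append((functions[i], functions[j]))

print(clusters)
# [('enter order', 'check stock'), ('schedule delivery', 'bill customer')]
```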
29
Analyze Task Data (2/2) ► Scenario Specification Scenario: a situation and a specific set of tasks that represent an important use of the system or product Create a scenario: only tasks that directly serve users’ goals are retained Daily use scenarios: common sets of tasks that occur daily Necessary use scenarios: infrequent but critical sets of tasks that must be performed
30
Identify User Preferences and Requirements ► Can be quite extensive
31
Iterative Design and Testing ► Providing Input for System Specifications ► Organization Design ► Prototypes ► Heuristic Evaluation ► Usability Testing
32
Providing Input for System Specifications (1/7) ► System Specifications The overall objectives the system supports − What must be done to achieve the user’s goals, not how to do it − Reflect the user’s goals, not the technology Performance requirements and features − Determine the means to help the users achieve their goals − What the system must be able to do & under what conditions Design constraints
33
Providing Input for System Specifications (2/7) ► System Specifications The overall objectives the system supports Performance requirements and features Design constraints − Various solutions are possible − Constraints limit the possible design alternatives ► Human factors Take a systems design approach: analyzing the entire human-machine system
34
Providing Input for System Specifications (3/7) ► Quality Function Deployment: relative importance of potential system features “House of quality” decision matrix (Fig 3.4) Weighting: importance of the objectives − 9: very important; 3: somewhat important; 1: marginally important Rating: how well each feature serves each objective
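A minimal sketch of the weighting arithmetic behind the decision matrix; the 9/3/1 weights follow the slide, while the objectives, features, and ratings are invented for illustration:

```python
# Quality Function Deployment (QFD) sketch: score candidate features against
# weighted user objectives.  Objectives, features, and ratings are illustrative.
objective_weights = {        # 9 = very important, 3 = somewhat, 1 = marginal
    "fast task completion": 9,
    "low learning effort": 3,
    "portability": 1,
}

# ratings[feature][objective] = how well the feature serves the objective (0-9)
ratings = {
    "touch screen":      {"fast task completion": 7, "low learning effort": 8, "portability": 3},
    "physical keyboard": {"fast task completion": 8, "low learning effort": 4, "portability": 1},
}

# Feature priority = sum over objectives of (objective weight x rating).
for feature, r in ratings.items():
    score = sum(objective_weights[obj] * r[obj] for obj in objective_weights)
    print(f"{feature}: {score}")
# touch screen: 9*7 + 3*8 + 1*3 = 90
# physical keyboard: 9*8 + 3*4 + 1*1 = 85
```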
35
Providing Input for System Specifications (4/7) ► Cost/Benefit Analysis (Fig 3.4) Rows: features Columns: design alternatives Weight: the result of the QFD Rating: how well each design alternative addresses the feature Benefit: weighted sum Cost/Benefit ratio Lowest cost/benefit ratio: most valuable
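Continuing the same illustration, a small sketch of ranking design alternatives by cost/benefit ratio; the feature weights are taken as the QFD result above, and the ratings and costs are hypothetical:

```python
# Cost/benefit sketch for design alternatives.  Feature weights come from the
# QFD step; the ratings and costs here are illustrative assumptions.
feature_weights = {"touch screen": 90, "physical keyboard": 85}   # from QFD

# ratings[feature] = how well the alternative addresses that feature (0-9)
alternatives = {
    "tablet design": {"ratings": {"touch screen": 9, "physical keyboard": 2},
                      "cost": 120},
    "laptop design": {"ratings": {"touch screen": 3, "physical keyboard": 9},
                      "cost": 150},
}

for name, alt in alternatives.items():
    benefit = sum(feature_weights[f] * r for f, r in alt["ratings"].items())
    ratio = alt["cost"] / benefit
    print(f"{name}: benefit={benefit}, cost/benefit={ratio:.3f}")
# The alternative with the lowest cost/benefit ratio gives the most value per unit cost.
```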
36
Providing Input for System Specifications (5/7) ► Tradeoff Analysis Small-scale study: which design alternative results in the best performance (trade studies) Modeling or performance estimates (by the designer) Multiple factors: advantages, disadvantages Decision matrix − Rows: features (factors) − Columns: different means of implementation − Fails to consider global issues: how the features interact as a group (a product is more than the sum of its features) − Complemented with, e.g., scenario specification
37
Providing Input for System Specifications (6/7) ► Human Factors Criteria Identification Usability requirements Specify characteristics relevant to human performance and safety ► Functional Allocation System (automatic), person (manual), or combination Assign a function to the more “capable” system component (Fig 3.5) The person should be left with a coherent set of tasks that can be understood
38
Providing Input for System Specifications (7/7) ► Support Materials Development Should begin with the front-end analysis Manuals, assembly instructions, owner’s manuals, training programs, etc. Materials: compatible with the characteristics and limitations of the human users − Critical information: maximize the likelihood that users read it, understand it, and comply with it
39
Prototypes (1/2) ► Mock-up vs. prototype Mock-up: very crude approximation of the final product (e.g. foam, cardboard) Prototype: more of the look and feel of the final product, but not yet fully functioning
40
Prototypes (2/2) ► Advantages of using prototypes Confirming insights gathered during the front-end analysis Support of the design team in making ideas concrete Support of the design team by providing a communication medium Support for heuristic evaluation Support for usability testing (something to react to and use)
41
Heuristic Evaluation ► Analytically considering: Do the characteristics of the product/system meet human factors criteria? ► What to evaluate against: Human factors criteria (requirement specifications) Other human factors standards and guidelines ► Evaluated by whom: usability experts Multiple evaluators: at least 3, preferably 5 Each evaluator inspects in isolation, then they communicate and aggregate their findings
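One plausible way to pool the independent findings is sketched below; the evaluators, the problems they report, and the "how many evaluators flagged it" tally are all illustrative assumptions:

```python
# Heuristic evaluation sketch: each evaluator inspects the design in isolation,
# then findings are pooled.  Evaluator names and findings are hypothetical.
from collections import Counter

findings = {
    "evaluator_1": ["no undo on delete", "inconsistent button labels"],
    "evaluator_2": ["no undo on delete", "error messages lack guidance"],
    "evaluator_3": ["inconsistent button labels", "no undo on delete"],
}

# Aggregate: count how many evaluators independently flagged each problem.
pooled = Counter(problem for problems in findings.values() for problem in problems)
for problem, count in pooled.most_common():
    print(f"{count}/{len(findings)} evaluators: {problem}")
```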
42
Usability Testing ► Usability: The degree to which the system is easy to use “User friendly” ► Variables relevant to usability: Learnability: easy to learn, users can rapidly start working Efficiency: high level of productivity (once learned) Memorability: easy to remember Errors: low error rate, easy to recover from errors Satisfaction: pleasant to use, users are subjectively satisfied and like it
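A small sketch of how these variables might be summarized from test sessions; the per-participant measurements and the choice of metrics are illustrative assumptions (memorability, for instance, would need a separate retest session):

```python
# Usability test summary sketch.  Participant data are hypothetical; each row is
# (time to complete first task in min, tasks/hour once learned, error count,
#  satisfaction rating on a 1-7 scale).
sessions = [
    (12.0, 20, 3, 5),
    (15.5, 18, 5, 4),
    (10.0, 24, 2, 6),
]

n = len(sessions)
learnability = sum(s[0] for s in sessions) / n   # lower first-task time = easier to learn
efficiency   = sum(s[1] for s in sessions) / n   # throughput once the system is learned
errors       = sum(s[2] for s in sessions) / n   # average error count per session
satisfaction = sum(s[3] for s in sessions) / n   # subjective rating

print(f"learnability {learnability:.1f} min, efficiency {efficiency:.1f} tasks/hr, "
      f"errors {errors:.1f}, satisfaction {satisfaction:.1f}/7")
```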
43
Final Test and Evaluation ► Involving users ► Data collected: Acceptability Usability Performance of the user or human-machine system
44
Conclusion ► Techniques for creating user-centered systems To understand user needs To design systems to meet those needs ► Critical step Provide human factors criteria for design