Chapter 3. Information Input and Processing, Part I*


King Saud University, College of Engineering
IE – 341: "Human Factors", Fall 2014 (1st Sem. 1435-6 H)
Chapter 3. Information Input and Processing, Part I*
Prepared by: Ahmed M. El-Sherbeeny, PhD
*(Adapted from slides by Dr. Khaled Al-Saleh)

Chapter Overview
Information:
- How it can be measured (Part I)
- How it can be displayed (Part II)
- How it can be coded (Part II)

Information Theory
Information processing is also known as: cognitive psychology, cognitive engineering, and engineering psychology.
Objectives of information theory:
- finding an operational definition of information
- finding a method for measuring information
Note: most concepts of information theory are descriptive (i.e., qualitative rather than quantitative).
Information (definition): "reduction of uncertainty". The emphasis is on highly unlikely events.
Example (information in a car):
- "Fasten seat belt": likely event ⇒ not important in information theory
- "Temperature warning": unlikely event ⇒ important
(Vocabulary: cognitive = المعرفي)

Unit of Measure of Information
Case 1: N ≥ 1 equally likely alternative events:
H = log₂(N)
where H = amount of information [bits], and N = number of equally likely alternatives.
e.g., 2 equally likely alternatives ⇒ H = log₂(2) = 1 bit
Bit (definition): "the amount of information needed to decide between two equally likely (i.e., 50%–50%) alternatives".
e.g., 4 equally likely alternatives ⇒ H = log₂(4) = 2 bits
e.g., 10 equally likely digits (0–9) ⇒ H = log₂(10) ≈ 3.32 bits
e.g., 26 equally likely letters (a–z) ⇒ H = log₂(26) ≈ 4.70 bits
Note: for each of the above, the unit [bit] must be stated.
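The Case 1 relation, H = log₂(N) for N equally likely alternatives, can be checked numerically; a minimal Python sketch:

```python
import math

def info_bits(n_alternatives):
    """Amount of information H = log2(N) for N equally likely alternatives."""
    return math.log2(n_alternatives)

print(info_bits(2))             # 2 alternatives -> 1.0 bit
print(info_bits(4))             # 4 alternatives -> 2.0 bits
print(round(info_bits(10), 2))  # digits 0-9     -> 3.32 bits
print(round(info_bits(26), 2))  # letters a-z    -> 4.7 bits
```

Note that doubling the number of alternatives always adds exactly one bit, since log₂(2N) = log₂(N) + 1.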

Cont. Unit of Measure of Information
Case 2: non-equally likely alternatives:
hᵢ = log₂(1/pᵢ)
where hᵢ = amount of information [bits] conveyed by single event i, and pᵢ = probability of occurrence of event i.
Note: this is not usually significant (i.e., on an individual-event basis).
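As a sketch of the Case 2 formula, hᵢ = log₂(1/pᵢ) in Python, showing that a rare event carries more information than a likely one:

```python
import math

def event_info_bits(p):
    """Information h_i = log2(1/p_i) conveyed by a single event with probability p."""
    return math.log2(1.0 / p)

print(round(event_info_bits(0.9), 2))  # likely event   -> 0.15 bits
print(round(event_info_bits(0.1), 2))  # unlikely event -> 3.32 bits
```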

Cont. Unit of Measure of Information
Case 3: average information of a series of non-equally likely events:
H_avg = Σᵢ pᵢ · log₂(1/pᵢ)
where H_avg = average information [bits] from all events, pᵢ = probability of occurrence of single event i, and N = number of non-equally likely alternatives/events.
e.g., 2 alternatives (N = 2):
- enemy attacks by land, p₁ = 0.9
- enemy attacks by sea, p₂ = 0.1
⇒ H_avg = 0.9 · log₂(1/0.9) + 0.1 · log₂(1/0.1) ≈ 0.14 + 0.33 = 0.47 bits
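The Case 3 average for the land/sea attack example above can be computed with a short Python sketch:

```python
import math

def avg_info_bits(probs):
    """Average information H_avg = sum(p_i * log2(1/p_i)) over all events."""
    return sum(p * math.log2(1.0 / p) for p in probs)

# Enemy attacks by land (p1 = 0.9) or by sea (p2 = 0.1):
print(round(avg_info_bits([0.9, 0.1]), 2))  # -> 0.47 bits
```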

Cont. Unit of Measure of Information
Case 4: Redundancy:
%Redundancy = (1 − H_avg / H_max) × 100
If 2 occurrences are equally likely ⇒ p₁ = p₂ = 0.5 (i.e., 50% each) ⇒ H = H_max = 1 bit.
In the example on the last slide, the departure from maximum information = 1 − 0.47 = 0.53 ⇒ 53% redundancy.
Note: as the departure from equal probability increases, %redundancy increases.
e.g., not all English letters are equally likely (consider "th", "qu") ⇒ %redundancy of the English language ≈ 68%.
PS: how about the Arabic language?
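Percent redundancy follows directly from H_avg and H_max = log₂(N); a minimal Python sketch, reproducing the 53% figure for the land/sea example:

```python
import math

def redundancy_pct(probs):
    """%Redundancy = (1 - H_avg / H_max) * 100, with H_max = log2(N)."""
    h_avg = sum(p * math.log2(1.0 / p) for p in probs)
    h_max = math.log2(len(probs))
    return (1.0 - h_avg / h_max) * 100.0

print(round(redundancy_pct([0.5, 0.5])))  # equal probabilities -> 0 (% redundancy)
print(round(redundancy_pct([0.9, 0.1])))  # land/sea example    -> 53 (%)
```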

Choice Reaction Time Experiments
Subjects are exposed to different stimuli and their response time is measured (e.g., 4 lights and 4 buttons).
Hick (1952):
- varied the number of stimuli (equally likely alternatives)
- found that as the number of equally likely alternatives increases, the reaction time to the stimulus increases
- reaction time vs. stimulus information (in bits) is a linear function
Hyman (1953):
- kept the number of stimuli (alternatives) fixed
- varied the probability of occurrence of events ⇒ the information varies
- found, AGAIN, that reaction time vs. stimulus information (in bits) is a linear function
This relationship is known as the "Hick–Hyman law".
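The Hick–Hyman law states that reaction time is a linear function of stimulus information: RT = a + b·H. A Python sketch with hypothetical coefficients (a = 0.2 s intercept and b = 0.15 s/bit slope are illustrative values, not taken from the source):

```python
import math

def reaction_time(n_alternatives, a=0.2, b=0.15):
    """Hick-Hyman law: RT = a + b * H, with H = log2(N) for equally likely stimuli.

    a (intercept, seconds) and b (slope, seconds/bit) are hypothetical
    illustrative values, not empirical constants from the source.
    """
    h = math.log2(n_alternatives)
    return a + b * h

# Reaction time grows linearly with information (bits), not with N itself:
for n in (2, 4, 8):
    print(n, round(reaction_time(n), 2))
```

Note the design of the model: doubling the number of alternatives adds one bit of information and hence a constant increment b to the predicted reaction time.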