Chapter 3. Information Input and Processing
Prepared by: Ahmed M. El-Sherbeeny, PhD

• Information:
  ◦ How it can be measured
  ◦ How it can be coded
  ◦ How it can be displayed
• Model of human information processing
  ◦ Measuring mental workload
• Implications for human factors (HF) of the information revolution

• Information processing is also known as:
  ◦ Cognitive psychology
  ◦ Cognitive engineering
  ◦ Engineering psychology
• Objectives of information theory:
  ◦ Finding an operational definition of information
  ◦ Finding a method for measuring information
  ◦ Note: most concepts of information theory are descriptive (qualitative rather than quantitative)
• Information (definition):
  ◦ "Reduction of uncertainty"
  ◦ The emphasis is on highly unlikely events
  ◦ Example (information in a car):
    "Fasten seat belt": likely event ⇒ not important in information theory
    "Temperature warning": unlikely event ⇒ important

• Case 1: N ≥ 1 equally likely alternative events:

  H = log₂ N

  ◦ H: amount of information [bits]
  ◦ N: number of equally likely alternatives
  ◦ e.g., 2 equally likely alternatives ⇒ H = log₂ 2 = 1 bit
    ⇒ Bit (definition): "amount of information needed to decide between two equally likely (i.e. 50%-50%) alternatives"
  ◦ e.g., 4 equally likely alternatives ⇒ H = log₂ 4 = 2 bits
  ◦ e.g., 10 equally likely digits (0-9) ⇒ H = log₂ 10 = 3.32 bits
  ◦ e.g., 26 equally likely letters (a-z) ⇒ H = log₂ 26 = 4.70 bits
  ◦ Note: for each of the above, the unit [bit] must be stated.
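A minimal sketch in Python (not part of the original slides; the function name is illustrative) that reproduces the Case 1 numbers:

```python
import math

# Case 1: information for N equally likely alternatives, H = log2(N)
def info_equally_likely(n: int) -> float:
    return math.log2(n)

for n, label in [(2, "2 alternatives"), (4, "4 alternatives"),
                 (10, "digits 0-9"), (26, "letters a-z")]:
    print(f"{label}: H = {info_equally_likely(n):.2f} bits")
# 2 alternatives: H = 1.00 bits
# 4 alternatives: H = 2.00 bits
# digits 0-9: H = 3.32 bits
# letters a-z: H = 4.70 bits
```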

• Case 2: N ≥ 1 non-equally likely alternatives:

  hᵢ = log₂(1/pᵢ)

  ◦ hᵢ: amount of information [bits] for a single event i
  ◦ pᵢ: probability of occurrence of single event i
  ◦ Note: this per-event measure is not usually significant on its own (it describes an individual event, not the source as a whole)
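A companion sketch for Case 2 (again Python, names illustrative), showing that rare events carry more information:

```python
import math

# Case 2: information carried by a single event i with probability p_i,
# h_i = log2(1 / p_i): the rarer the event, the more information it carries
def info_single_event(p: float) -> float:
    return math.log2(1.0 / p)

print(f"{info_single_event(0.9):.3f} bits")  # likely event   -> ~0.152 bits
print(f"{info_single_event(0.1):.3f} bits")  # unlikely event -> ~3.322 bits
```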

• Case 3: average information of a series of non-equally likely events:

  H_avg = Σᵢ pᵢ log₂(1/pᵢ)

  ◦ H_avg: average information [bits] from all events
  ◦ pᵢ: probability of occurrence of single event i
  ◦ N: number of alternatives/events
  ◦ e.g., 2 alternatives (N = 2):
    Enemy attacks by land, p₁ = 0.9
    Enemy attacks by sea, p₂ = 0.1
    ⇒ H_avg = 0.9 log₂(1/0.9) + 0.1 log₂(1/0.1) = 0.9(0.152) + 0.1(3.322) ≈ 0.47 bits
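A short Python sketch (not from the slides) that verifies the land/sea example and the equal-probability case:

```python
import math

# Case 3: average information of a series of events,
# H_avg = sum over i of p_i * log2(1 / p_i)
def avg_information(probs) -> float:
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

print(f"{avg_information([0.9, 0.1]):.2f} bits")  # land/sea example -> ~0.47
print(f"{avg_information([0.5, 0.5]):.2f} bits")  # equal case       -> 1.00
```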

• Case 4: redundancy:
  ◦ If 2 occurrences are equally likely ⇒
    p₁ = p₂ = 0.5 (i.e. 50% each)
    ⇒ H = H_max = log₂ 2 = 1 bit
  ◦ In the example on the last slide, the departure from maximum information is
    1 − 0.47 = 0.53 = 53%
  ◦ %Redundancy = (1 − H_avg / H_max) × 100
  ◦ Note: as the departure from equal probabilities increases, %redundancy increases
  ◦ e.g., not all English letters and letter pairs ("th", "qu") are equally likely
    ⇒ %redundancy of the English language ≈ 68%
    PS: How about the Arabic language?
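A Python sketch of Case 4 (illustrative, not from the slides), combining the Case 3 formula with the equal-probability maximum:

```python
import math

# Case 4: percent redundancy, %Red = (1 - H_avg / H_max) * 100,
# where H_max = log2(N) is the equal-probability maximum
def redundancy_percent(probs) -> float:
    h_avg = sum(p * math.log2(1.0 / p) for p in probs if p > 0)
    h_max = math.log2(len(probs))
    return (1.0 - h_avg / h_max) * 100.0

print(f"{redundancy_percent([0.9, 0.1]):.0f}%")  # land/sea example -> ~53%
print(f"{redundancy_percent([0.5, 0.5]):.0f}%")  # equal case      -> 0%
```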

• Experiments (choice reaction time):
  ◦ Subjects are exposed to different stimuli
  ◦ Response time is measured
  ◦ e.g., 4 lights - 4 buttons
• Hick (1952):
  ◦ Varied the number of stimuli (equally likely alternatives)
  ◦ He found:
    As the number of equally likely alternatives increases, reaction time to the stimulus increases
    Reaction time vs. stimulus information (in bits) is a linear function
• Hyman (1953):
  ◦ Kept the number of stimuli (alternatives) fixed
  ◦ Varied the probability of occurrence of events ⇒ information varies
  ◦ He found the same result, now known as the "Hick-Hyman Law":
    AGAIN: reaction time vs. stimulus information (in bits) is a linear function! (See the sketch below.)
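A minimal Python sketch of the Hick-Hyman relationship. The coefficients a and b below are illustrative placeholders (in practice they are fitted to a subject's measured reaction times), and the function name is an assumption, not from the slides:

```python
import math

# Hick-Hyman law: reaction time is a linear function of stimulus
# information, RT = a + b * H, with H = log2(N) for N equally likely
# alternatives. a and b here are made-up placeholder values.
def reaction_time(n_alternatives: int, a: float = 0.20, b: float = 0.15) -> float:
    h = math.log2(n_alternatives)
    return a + b * h  # seconds

for n in (2, 4, 8):
    print(f"N = {n}: RT ≈ {reaction_time(n):.2f} s")
```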