IE341: Human Factors Engineering
Prof. Mohamed Zaki Ramadan
Ahmed M. El-Sherbeeny, PhD

Information Theory
Information processing is also known as cognitive psychology, cognitive engineering, or engineering psychology.
Objectives of information theory:
– Finding an operational definition of information
– Finding a method for measuring information
– Note: most concepts of information theory are descriptive (i.e. qualitative rather than quantitative)
Information (definition): "reduction of uncertainty"
– Emphasis is on "highly unlikely" events
– Example (information in a car): "Fasten seat belt" is a likely event ⇒ conveys little information (not important in information theory); "Temperature warning" is an unlikely event ⇒ conveys much information (important)

Unit of Measure of Information
Case 1: N equally likely alternative events (N ≥ 1):
H = log₂ N
– H: amount of information [bits]
– N: number of equally likely alternatives
– e.g.: 2 equally likely alternatives ⇒ H = log₂ 2 = 1 bit
Bit (definition): "the amount of information needed to decide between two equally likely (i.e. 50%-50%) alternatives"
– e.g.: 4 equally likely alternatives ⇒ H = log₂ 4 = 2 bits
– e.g.: 10 equally likely digits (0-9) ⇒ H = log₂ 10 ≈ 3.32 bits
– e.g.: 26 equally likely letters (a-z) ⇒ H = log₂ 26 ≈ 4.7 bits
Note: for each of the above, the unit [bit] must be stated.
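A minimal Python sketch (added here, not part of the original slides) of the Case 1 relationship H = log₂ N:

```python
import math

def info_equally_likely(n: int) -> float:
    """Amount of information H (in bits) for n equally likely alternatives: H = log2(n)."""
    return math.log2(n)

print(info_equally_likely(2))   # 1.0 bit
print(info_equally_likely(4))   # 2.0 bits
print(info_equally_likely(10))  # ~3.32 bits (digits 0-9)
print(info_equally_likely(26))  # ~4.70 bits (letters a-z)
```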

Cont. Unit of Measure of Information
Case 2: N ≥ 1 non-equally likely alternatives:
h_i = log₂(1/p_i)
– h_i: amount of information [bits] for a single event i
– p_i: probability of occurrence of single event i
– Note: this is not usually significant (i.e. on an individual-event basis)

Cont. Unit of Measure of Information
Case 3: Average information of a series of non-equally likely events:
H_av = Σ p_i · log₂(1/p_i),  i = 1 … N
– H_av: average information [bits] from all events
– p_i: probability of occurrence of single event i
– N: number of non-equally likely alternatives/events
– e.g.: 2 alternatives (N = 2):
Enemy attacks by land, p_1 = 0.9
Enemy attacks by sea, p_2 = 0.1
⇒ H_av = 0.9·log₂(1/0.9) + 0.1·log₂(1/0.1) ≈ 0.9(0.152) + 0.1(3.32) ≈ 0.47 bits
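A small Python sketch (not from the slides) of the Case 2 and Case 3 formulas, applied to the land/sea example:

```python
import math

def info_single_event(p: float) -> float:
    """Information (bits) conveyed by one event of probability p: h_i = log2(1/p_i)."""
    return math.log2(1 / p)

def average_info(probs) -> float:
    """Average information H_av = sum over i of p_i * log2(1/p_i)."""
    return sum(p * math.log2(1 / p) for p in probs)

print(info_single_event(0.1))    # ~3.32 bits for the rare "attack by sea" event
print(average_info([0.9, 0.1]))  # ~0.47 bits on average
```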

Cont. Unit of Measure of Information
Case 4: Redundancy:
– If the 2 occurrences are equally likely ⇒ p_1 = p_2 = 0.5 (i.e. 50% each) ⇒ H = H_max = 1 bit
– In the example on the last slide, the departure from maximum information = 1 − 0.47 = 0.53 = 53%
– %Redundancy = (1 − H_av / H_max) × 100%
– Note: as the departure from equal probabilities ↑ ⇒ %Redundancy ↑
– e.g.: not all English letters (or letter pairs such as "th", "qu") are equally likely ⇒ %Redundancy of the English language ≈ 68%
PS. How about the Arabic language?
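A short Python sketch of the redundancy computation (added for illustration), using the land/sea example:

```python
import math

def percent_redundancy(probs) -> float:
    """%Redundancy = (1 - H_av / H_max) * 100, where H_max = log2(N)."""
    h_av = sum(p * math.log2(1 / p) for p in probs)
    h_max = math.log2(len(probs))
    return (1 - h_av / h_max) * 100

print(percent_redundancy([0.9, 0.1]))  # ~53% for the land/sea example
```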

Choice Reaction Time Experiments
Experiments:
– Subjects are exposed to different stimuli and their response time is measured
– e.g. 4 lights, 4 buttons
Hick (1952):
– Varied the number of stimuli (equally likely alternatives)
– Found: as the number of equally likely alternatives increases ⇒ reaction time to the stimulus increases; reaction time vs. stimulus information (in bits) is a linear function
Hyman (1953):
– Kept the number of stimuli (alternatives) fixed and varied the probability of occurrence of the events ⇒ the information varies
– Found the same result: reaction time vs. stimulus information (in bits) is again a linear function ("Hick-Hyman Law")
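To make the linear relation concrete, here is a hedged Python sketch; the intercept (200 ms) and slope (150 ms/bit) are illustrative assumptions, not values from the slides:

```python
def hick_hyman_rt(h_bits: float, intercept_ms: float = 200.0, slope_ms_per_bit: float = 150.0) -> float:
    """Hick-Hyman law: reaction time = a + b * H, where H is the stimulus information in bits.
    The default a and b used here are hypothetical, for illustration only."""
    return intercept_ms + slope_ms_per_bit * h_bits

# 4 equally likely lights carry H = log2(4) = 2 bits, so RT = 200 + 150 * 2 = 500 ms
print(hick_hyman_rt(2.0))
```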

Solved Problems

1. The Hick-Hyman law provides one measure of information-processing ability. Assume that an air traffic controller has a channel-capacity (bandwidth) limit of 2.8 bits/second in decision making.
a. Assuming equally likely alternatives, how many choices can this person make per second?
b. As the controller gains expertise, he/she develops expectations of which routes different planes will fly. Explain how this will increase the controller's channel capacity on this task.
c. Describe at least three different general methods for improving the controller's information processing in this task. (Use methods we have studied in this course; don't just say "automate".)

1. a. H = 2.8 bits/sec = log₂(N); N = 2^2.8 ≈ 6.96, or about 7 choices per second.
1. b. Expertise gives the controller more prior knowledge, i.e. less uncertainty about each event, so there is less potential information gain per decision. Since less information can be processed faster and more accurately than more information, the controller's effective capacity on this task increases.
1. c. A few ideas:
· Allow more time and look-ahead in the system
· Earlier training on the various target probabilities
· Fewer targets per controller
· Allow errors; there may be redundancy in the system
· Multiple channels or modalities in presentation
· Better compatibility in the system
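A one-line Python check of part (a), added for illustration:

```python
bandwidth_bits_per_sec = 2.8
choices_per_sec = 2 ** bandwidth_bits_per_sec  # invert H = log2(N)
print(choices_per_sec)                         # ~6.96, i.e. about 7 choices per second
```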

2. After watching a dice-rolling game, you notice that one side of a die appears twice as often as it should. All other sides of the die appear with equal probability.
a. Compute the information that is present in the unfair die.
b. Determine the redundancy present in the unfair die.
c. In your own words, concisely state the meaning of the term "redundancy" in (b) above.

2. a. A die has 6 sides; since one side is twice as likely as any of the others, we have 2x + 5x = 1, so x = 1/7. (Note: simply using 2/6 for the favored side and 1/6 for each other side would not sum to one, so the probabilities must be renormalized.)
H_av = (5/7)·log₂(7) + (2/7)·log₂(7/2) ≈ 2.005 + 0.516 ≈ 2.52 bits
2. b. H_max = log₂(6) ≈ 2.585 bits
%Redundancy = (1 − H_av/H_max) × 100% ≈ 2.4%
2. c. Redundancy is the reduction in uncertainty (below the maximum possible) that results from the events being unequally likely.
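A Python check of these numbers (added for illustration, not part of the handout):

```python
import math

probs = [2/7] + [1/7] * 5                      # one side twice as likely as each of the other five
h_av = sum(p * math.log2(1/p) for p in probs)  # average information of the unfair die
h_max = math.log2(6)                           # a fair six-sided die
print(h_av)                                    # ~2.52 bits
print((1 - h_av / h_max) * 100)                # ~2.4% redundancy
```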

3. How much Information (H) is contained in a fair roll of a 6-sided die? If an individual realizes that the die is unfair, with 20% chance of each of 4 sides appearing, and 10% on each of the other two, how does H change? What does this mean?

3. A fair 6-sided die should land on any of its sides with equal probability. Thus, the information is H = log₂(6) = log₁₀(6)/0.301 ≈ 2.58 bits. If the die is unfair (and the gambler realizes it), there is less potential information gain, i.e. less information in the die. In other words, the gambler already has some pre-existing knowledge, so his/her potential information gain from the die is smaller. If a rapid decision were required based on the outcome of the die, it should be faster because of this pre-existing bias. To quantify:
H_av = 4·(0.2)·log₂(1/0.2) + 2·(0.1)·log₂(1/0.1) = 4(0.464) + 2(0.332) ≈ 2.52 bits.
This is indeed lower than the maximum-information (fair) case above, which means the gambler already has the equivalent of about 0.06 bit of pre-existing information. Redundancy is a well-accepted measure of this pre-existing information; here, %Redundancy = (1 − H_av/H_max) × 100% ≈ 2.4%.
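The same check in Python (added for illustration):

```python
import math

probs = [0.2] * 4 + [0.1] * 2                  # the biased die the gambler has noticed
h_av = sum(p * math.log2(1/p) for p in probs)
print(h_av)                                    # ~2.52 bits
print(math.log2(6) - h_av)                     # ~0.06 bit of pre-existing information
```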

4. A display can show one numerical digit (0-9) per second, with equally-likely digits, in a choice reaction time task. If an observer accurately processes the information from the display, what is the observer's channel capacity, in bits/second?

4. The observer must decide among 10 equally likely choices each second. This is log₂(10) = 3.32 bits/sec, which is known as the bandwidth, or channel capacity, for this rate of processing.

5. What would your answer to Question 4 be if equally likely numbers from 0 to 99 (100 alternatives) could be presented? Does this make sense to you? Why or why not?

5. This is now log₂(100) ≈ 6.64 bits/sec, twice the answer for 10 digits. This makes sense if you consider that twice as many binary decisions are required: the first splits the 100 numbers into groups of 50 and 50, the second splits one of these into two groups of 25, and so on. Note that decision-making load is not linearly related to the number of alternatives; it grows with the logarithm.

6. Consider Question 4 again, with unequally-likely digits. The probabilities of the digits appearing are shown below. Determine both the channel capacity and the redundancy. Digit Prob

6. H_max = log₂(10) per second = 3.32 bits/sec
H_av = Σ p_i·log₂(1/p_i) over the ten digit probabilities ≈ 2.81 bits/sec = the new channel capacity
So, %Redundancy = (1 − 2.81/3.32) × 100% ≈ 15%

7. Based on his experience at the company, Ali knows that 50% of the chips are routed to Line 1, 30% to Line 2, and 20% to Line 3. Given a choice-RT intercept of 250 msec and a processing bandwidth of 7.5 bits/second, how much time does Ali require to make each routing decision? How much faster or slower is this compared to the condition in which all three routes are equally likely?

7. The information (H_av) associated with each routing decision is:
H_av = 0.5·log₂(1/0.5) + 0.3·log₂(1/0.3) + 0.2·log₂(1/0.2) ≈ 0.500 + 0.521 + 0.464 ≈ 1.485 bits
A bandwidth of 7.5 bits/second equates to 1000 msec/sec ÷ 7.5 bits/sec ≈ 133.3 msec/bit.
The expected choice RT is then: RT = 250 + 133.3 × 1.485 ≈ 448 msec.
If all three routes are equally likely, H = log₂(3) ≈ 1.585 bits and RT = 250 + 133.3 × 1.585 ≈ 461 msec.
So, if Ali has internalized the stated probabilities, his expected decisions will be about (461 − 448) = 13 msec faster than if all routes were equally likely.
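A Python check of this calculation (added for illustration):

```python
import math

probs = [0.5, 0.3, 0.2]                              # routing probabilities for Lines 1-3
h_av = sum(p * math.log2(1/p) for p in probs)        # ~1.485 bits per routing decision

intercept_ms = 250.0
ms_per_bit = 1000.0 / 7.5                            # ~133.3 msec per bit at 7.5 bits/sec

rt_learned = intercept_ms + ms_per_bit * h_av        # ~448 msec with the learned probabilities
rt_equal = intercept_ms + ms_per_bit * math.log2(3)  # ~461 msec with equally likely routes
print(rt_learned, rt_equal, rt_equal - rt_learned)   # about 13 msec saved per decision
```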

Solved Problems

Piano players must rapidly move their fingers between various keys on a piano keyboard. The keys are either black (0.5 inch wide) or white (1 inch wide). For this pianist, a Fitts' Law relationship was measured, giving Movement Time (in msec) as a linear function of the index of difficulty (ID).

1. How long will it take this pianist to move his finger from a 150 Hertz white key to a 2400 Hertz white key? Assume 7 keys per octave and that the time to press the key itself is negligible.

The pianist must make a single movement over a 4-octave range. Starting at 150 Hz, the 4 octaves end at 300, 600, 1200, and 2400 Hz. At 7 keys/octave, this spans 28 keys, with a center-to-center distance of 27 inches. The target key width is 1 inch. So, ID = log₂[2(27)/1] ≈ 5.75 bits. Substituting ID ≈ 5.75 into the given Fitts' Law equation gives MT = 266 msec.
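The fitted intercept and slope of the pianist's Fitts' Law equation are not reproduced in this transcript, so the following Python sketch (added for illustration) computes only the index of difficulty for the two cases discussed in these problems:

```python
import math

def fitts_id(distance: float, width: float) -> float:
    """Fitts' index of difficulty, ID = log2(2D / W), with D and W in the same units."""
    return math.log2(2 * distance / width)

print(fitts_id(27.0, 1.0))   # ~5.75 bits: 27-inch move to a 1-inch white key
print(fitts_id(26.75, 0.5))  # ~6.74 bits: slightly shorter move to a 0.5-inch black key
```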

2. By how much will a single movement time increase or decrease if the pianist moves to a black key instead of a white key?

Now the pianist must move to a black key, which decreases the distance by 0.25 inch (to 26.75 inches) and decreases the target width to 0.5 inch. So, ID = log₂[2(26.75)/0.5] ≈ 6.74 bits, which is a harder task. Substituting this into the same equation gives MT ≈ 321 msec, or about 55 msec slower for the movement.

3. By how much will a single movement time increase or decrease if the pianist starts at a black key instead of a white key?

Two answers were accepted here. Since the target width does not change, you could say there will be no difference. However, if the starting key is black instead of white, the overall distance decreases by 0.25 inch, which decreases the movement time by less than 1 msec.

5. How else could the time for the pianist's finger movements between keys be predicted?

As explained both in class and in a handout, the MTM-2 predetermined-time system could be used instead of Fitts' Law, using its tolerance PUT motions. Explanations involving choice reaction time were not correct: choice-RT models do not try to predict movement, only decision making. (Did your fingers ever leave the keys in the choice reaction time lab? They shouldn't have.)

Solved Problems

1. Compare the inspection capability of Inspector A and Inspector B, determining d' and the decision criterion for each. Inspector A located 26 of 28 defective parts, but also called 2 of 15 good parts defective. Inspector B found 29 of 30 defects, but called 6 of 20 good parts defective. In this case, the value of a 'hit' was greater than the cost of a 'false alarm'.

Inspector A: HR = 26/28 ≈ 0.93, Z = 1.48; FAR = 2/15 ≈ 0.13, Z = 1.13.
We draw a picture of the model and determine that the decision criterion is located between the means of the two distributions; therefore, we add the Z-scores from each distribution to compute d'. Note that this inspector is fairly neutral (neither liberal nor conservative).
d' = 1.48 + 1.13 = 2.61
Inspector B: HR = 29/30 ≈ 0.97, Z = 1.88; FAR = 6/20 = 0.30, Z = 0.52.
Similarly, we add the two Z-scores to compute d'. Note that if FAR had been greater than 0.5, we would have had to subtract one of the Z-values. This inspector was somewhat more liberal than Inspector A.
d' = 1.88 + 0.52 = 2.40
Thus, Inspector A was a slightly better discriminator. If the value of a hit is greater than the cost of a false alarm, then we seek a fairly liberal inspector. We might decide to go with Inspector B if the cost of a false alarm is small enough and the value of a hit is great enough.
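A quick Python check of d' using the inverse of the standard normal distribution (added for illustration; it uses the unrounded rates, so the results differ slightly in the second decimal from the table-lookup values above):

```python
from statistics import NormalDist

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Signal-detection sensitivity: d' = z(HR) - z(FAR)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

print(d_prime(26/28, 2/15))  # ~2.6 for Inspector A
print(d_prime(29/30, 6/20))  # ~2.4 for Inspector B
```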

Ali is a rain forecaster at 'Predicta-Weather'. Over a 3-month period, he forecast that no rain would fall on 60 of the 67 days on which no rain actually fell. He also (incorrectly) forecast that no rain would fall on 2 of the 22 days on which rain actually fell. For the following analyses, assume that a 'signal' is a rainy day.
a. Determine Ali's d', stating whether he is a liberal or a conservative forecaster.
b. Management is very concerned that Ali is making too many false alarms, and would like to see these reduced to a probability of 0.001. Determine Ali's resultant hit rate with this reduced false alarm rate, assuming the same d' as found in (a).

a. We (arbitrarily) define a signal as a rainy day. A simple response matrix follows:

Ali said   | Actual rainy days | Actual non-rainy days
"Rain"     | 20                | 7
"No rain"  | 2                 | 60
Totals     | 22                | 67

From the table, the Hit Rate (HR) = 20/22 = 0.91; likewise, the False Alarm Rate (FAR) = 7/67 = 0.10. Drawing a picture of the noise and signal distributions, and looking up the Z values for tail areas of 0.10 and 0.09, we find Zn = 1.28 and Zsn = 1.34, so d' = 1.28 + 1.34 = 2.62.

Since Ali's decision criterion lies just to the left of the crossing point between the distributions, he is essentially a neutrally biased responder, with possibly a very slight tendency towards being a liberal responder (accepting more false alarms in order to increase his hit rate).
b. Looking up Zn for a tail area of 0.001, we find Zn = 3.1. Redrawing the criterion, we see that it has moved towards the right, or conservative, side (fewer false alarms, but also fewer hits). Because the FAR is now so small, the criterion is actually to the right of the mode of the 'Rain' (signal) distribution. We know that d' = 2.62, so Zsn = 3.1 − 2.62 ≈ 0.48. Looking up the area beyond Z ≈ 0.48 gives approximately 0.31-0.32; thus the area to the right of the criterion, i.e. the resulting Hit Rate, drops to roughly 0.31.
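A Python sketch of both parts (added for illustration; it carries the unrounded hit and false-alarm rates, so the numbers differ slightly from the table-lookup values above):

```python
from statistics import NormalDist

z = NormalDist().inv_cdf
hr, far = 20/22, 7/67

# (a) sensitivity
d_prime = z(hr) - z(far)               # ~2.6

# (b) push the false-alarm probability down to 0.001 while keeping the same d'
z_noise = z(1 - 0.001)                 # criterion ~3.09 SD above the noise mean
z_signal = z_noise - d_prime           # ~0.5 SD above the signal mean
new_hit_rate = 1 - NormalDist().cdf(z_signal)
print(d_prime, new_hit_rate)           # ~2.6 and a hit rate of ~0.31
```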

Try to solve this