1
Automated Reading Assistance System Using Point-of-Gaze Estimation
M.A.Sc. Thesis Presentation
Jeffrey J. Kang
Supervisor: Dr. Moshe Eizenman
Department of Electrical and Computer Engineering
Institute of Biomaterials and Biomedical Engineering
January 24, 2006
2
Introduction
Reading: visual examination of text; words are converted to sounds to activate word recognition
We learn the appropriate conversions through repetitive exposure to word-to-sound mappings
Insufficient reader skill or irregular spelling can lead to a failed conversion: assistance is required
Objective: develop an automated reading assistance system that vocalizes unknown words in real time on the reader's behalf, operating within a natural reading setting
3
What We Need To Do — Step 1
1. Identify the word being read, in real time
2. Detect when the word being read is an unknown word
3. Vocalize the unknown word
4
Identifying the Word Being Read
Identify the viewed word using point-of-gaze estimation
The point-of-gaze is:
where we are looking with the highest-acuity region of the retina (the fovea)
the intersection of the visual axes of the two eyes within the 3D scene
the intersection of the visual axis of one eye with a 2D plane
5
Point-of-Gaze Estimation Methodologies 1. Head-mounted 2. Remote (no head-worn components)
6
Head-Mounted Point-of-Gaze Estimation
Based on the principle of tracking the pupil centre and corneal reflections to measure eye position
Point-of-gaze is estimated with respect to a coordinate system attached to the head
[Figure: head-mounted eye tracker — scene camera, eye camera, IR LEDs, hot mirror; eye image shows the corneal reflections and pupil centre]
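As an illustration only (not necessarily the estimator used in this thesis), one common way to turn pupil-centre and corneal-reflection measurements into a point-of-gaze is to fit a low-order polynomial mapping from the pupil-to-reflection vector to calibration targets. The sketch below assumes such a calibration; all variable names and the polynomial form are assumptions.

    import numpy as np

    def fit_gaze_mapping(pupil_cr_vectors, target_points):
        """Fit a second-order polynomial mapping from pupil-centre-to-corneal-
        reflection vectors (dx, dy) to known calibration target positions (x, y)."""
        dx, dy = pupil_cr_vectors[:, 0], pupil_cr_vectors[:, 1]
        # Design matrix of polynomial terms: 1, dx, dy, dx*dy, dx^2, dy^2
        A = np.column_stack([np.ones_like(dx), dx, dy, dx * dy, dx**2, dy**2])
        coeffs, *_ = np.linalg.lstsq(A, target_points, rcond=None)
        return coeffs  # shape (6, 2): one column per gaze coordinate

    def estimate_gaze(coeffs, pupil_cr_vector):
        """Apply the fitted mapping to a single pupil-CR vector."""
        dx, dy = pupil_cr_vector
        a = np.array([1.0, dx, dy, dx * dy, dx**2, dy**2])
        return a @ coeffs  # (x, y) point-of-gaze in the calibration plane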
7
Point-of-Gaze in Head Coordinate System Point-of-gaze is measured in the head coordinate system, and placed on the scene camera image
8
Locating the Reading Object The position of the reading object is determined by tracking markers
9
Mapping the Point-of-Gaze
Establish point correspondences between:
the estimated positions of the markers in the scene camera image
the known positions of the markers on the reading object
Homographic mapping of the point-of-gaze from the scene camera image to the reading object coordinate system (see the sketch below)
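A minimal sketch of this step, assuming four or more marker correspondences: estimate the 3x3 homography with a direct linear transform, then push the scene-image point-of-gaze through it. The marker coordinates below are hypothetical; the thesis implementation is not reproduced here.

    import numpy as np

    def fit_homography(scene_pts, object_pts):
        """Estimate the homography H mapping scene-image points to reading-object
        points from >= 4 correspondences (direct linear transform)."""
        rows = []
        for (u, v), (x, y) in zip(scene_pts, object_pts):
            rows.append([-u, -v, -1, 0, 0, 0, u * x, v * x, x])
            rows.append([0, 0, 0, -u, -v, -1, u * y, v * y, y])
        A = np.asarray(rows, dtype=float)
        # Solution is the right singular vector with the smallest singular value.
        _, _, vt = np.linalg.svd(A)
        H = vt[-1].reshape(3, 3)
        return H / H[2, 2]

    def map_point(H, point):
        """Map a 2D point (e.g. the point-of-gaze in the scene image) through H."""
        u, v = point
        x, y, w = H @ np.array([u, v, 1.0])
        return x / w, y / w

    # Hypothetical example: four page-corner markers seen in the scene image
    scene_markers  = [(102, 80), (530, 95), (515, 700), (90, 690)]   # pixels
    object_markers = [(0, 0), (180, 0), (180, 250), (0, 250)]        # mm on the page
    H = fit_homography(scene_markers, object_markers)
    gaze_on_page = map_point(H, (310, 390))   # point-of-gaze in page coordinates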
10
Identify the Reading Object
Extract the barcode from the scene camera image to identify the reading object (e.g. page number)
Match the barcode against a database of reading objects to determine what text is being read
11
Identifying the Word Being Read Using the mapped point-of-gaze, identify the word being read by table lookup
12
Sample Reading Video
14
Mapping Accuracy
15
Point-of-Gaze Estimation Methodologies 1. Head-mounted 2. Remote (no head-worn components)
16
Remote Point-of-Gaze Estimation
Point-of-gaze is estimated with respect to a fixed coordinate system (X, Y, Z with origin O)
C – centre of corneal curvature
P – point-of-gaze: intersection of the visual axis with the 2D scene object
[Figure: computer screen with IR LEDs and eye camera; the visual axis from C intersects the 2D scene object at P]
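A minimal sketch of the geometry only (hypothetical numbers, not the thesis estimator): once C and the visual-axis direction are reconstructed, the point-of-gaze P is the intersection of that ray with the plane of the 2D scene object.

    import numpy as np

    def intersect_visual_axis_with_plane(c, d, plane_point, plane_normal):
        """Intersect the visual axis (a ray from C along direction d) with the plane
        of the 2D scene object; returns P in the fixed world coordinate system."""
        d = d / np.linalg.norm(d)
        denom = plane_normal @ d
        if abs(denom) < 1e-9:
            raise ValueError("Visual axis is parallel to the scene plane")
        s = plane_normal @ (plane_point - c) / denom
        return c + s * d

    # Hypothetical values: eye ~60 cm in front of a screen lying in the Z = 0 plane
    C = np.array([20.0, 110.0, 600.0])     # centre of corneal curvature (mm)
    d = np.array([0.05, -0.12, -1.0])      # reconstructed visual-axis direction
    P = intersect_visual_axis_with_plane(
        C, d, np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]))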
17
Moving Reading Card
How can the point-of-gaze be estimated with respect to a coordinate system attached to a moving reading object?
[Figure: the visual axis from C intersects the assumed position of the 2D scene object at P and its true position at P']
18
Estimate Motion
Estimate the motion (R, T) of the 2D scene object between times t0 and t1
[Figure: the 2D scene object at its assumed position (t0) and true position (t1); P and P' are the corresponding intersections with the visual axis]
19
Use a Scene Camera and Targets
A scene camera observes targets on the 2D scene object at times t0 and t1
[Figure: scene camera viewing the assumed (t0) and true (t1) positions of the 2D scene object]
20
Calculate Two Homographies
H0: homography between the scene camera image and the 2D scene object at time t0
H1: homography between the scene camera image and the 2D scene object at time t1
[Figure: scene camera related to the assumed (t0) and true (t1) positions of the scene object by H0 and H1]
21
Decompose Homography Matrices
Decompose H0 and H1 into the poses of the scene object relative to the scene camera: (R0, T0) at time t0 and (R1, T1) at time t1
[Figure: scene camera with scene-object poses (R0, T0) and (R1, T1)]
22
Calculate Motion of the 2D Scene Object
Combine the two poses (R0, T0) and (R1, T1) to obtain the motion (R, T) of the 2D scene object between t0 and t1
[Figure: motion (R, T) of the scene object computed from the camera-relative poses (R0, T0) and (R1, T1)]
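A minimal sketch of the composition step only, under the assumption that each decomposed pose maps object coordinates into scene-camera coordinates (X_cam = Ri·X_obj + Ti). It is not the thesis' decomposition algorithm; with OpenCV, for example, cv2.decomposeHomographyMat could supply candidate (R, T) solutions from Hi and the camera intrinsics.

    import numpy as np

    def object_motion(R0, T0, R1, T1):
        """Motion (R, T) of the 2D scene object between t0 and t1, assuming each
        pose maps object coordinates into camera coordinates: X_cam = Ri @ X_obj + Ti.
        A point fixed to the object then satisfies
            X_cam(t1) = R1 @ R0.T @ (X_cam(t0) - T0) + T1,
        so the object's motion in camera coordinates is:"""
        R = R1 @ R0.T
        T = T1 - R @ T0
        return R, T

    # Hypothetical example: the page is translated by 30 mm along the camera X axis
    R0, T0 = np.eye(3), np.array([0.0, 0.0, 500.0])
    R1, T1 = np.eye(3), np.array([30.0, 0.0, 500.0])
    R, T = object_motion(R0, T0, R1, T1)   # R ~ identity, T ~ [30, 0, 0]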
23
Point-of-Gaze Accuracy
24
What We Need To Do — Step 2
1. Identify the word being read, in real time
2. Detect when the word being read is an unknown word
3. Vocalize the unknown word
25
Dual Route Reading Model (Coltheart, M. et al., 2001)
26
Dual Route Reading Model
Each word's graphemes are processed in parallel
27
Each word’s graphemes are individually converted into phonemes based on mapping rules
28
Detecting Unknown Words
For unknown words, the lexical route fails and the slower non-lexical route is used
Hypothesis: we can differentiate between known and unknown words by the duration of the processing time
29
Processing Time
30
Setting a Threshold Curve
31
Setting the Threshold
The threshold curve is a function of word length
Model the processing time for known words of length k as a Gaussian random variable with mean μ_k and variance σ_k²
Estimate μ_k and σ_k² from a short training set for each subject
Each point on the threshold curve is given by T_k = μ_k + z_(1−α)·σ_k, where z_(1−α) is the (1−α) quantile of the standard normal distribution and α is the constrained probability of false alarm (see the sketch below)
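A minimal sketch, assuming the training set provides per-word-length samples of known-word processing times (variable names are hypothetical), of estimating μ_k and σ_k and setting the threshold for a constrained false-alarm probability α.

    from statistics import NormalDist, mean, stdev
    from typing import Dict, List

    def threshold_curve(training_times: Dict[int, List[float]], alpha: float) -> Dict[int, float]:
        """For each word length k, fit a Gaussian to the known-word processing times
        and return the threshold T_k with P(time > T_k | known word) = alpha."""
        z = NormalDist().inv_cdf(1.0 - alpha)   # (1 - alpha) quantile of N(0, 1)
        thresholds = {}
        for k, times in training_times.items():
            mu_k, sigma_k = mean(times), stdev(times)
            thresholds[k] = mu_k + z * sigma_k
        return thresholds

    # Hypothetical training data: processing times in ms, keyed by word length
    training = {4: [310, 285, 330, 295, 320], 7: [450, 480, 430, 470, 440]}
    T = threshold_curve(training, alpha=0.10)
    is_unknown = lambda word, time_ms: time_ms > T[len(word)]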
32
Experiment: Detecting Unknown Words
Remote point-of-gaze estimation system
Reading material presented on a computer screen
Head position stabilized using a chinrest
Four subjects read 40 passages of text: 20 passages aloud and 20 passages silently
A training subset was used to "learn" μ_k and σ_k² and set the detection threshold curves
False alarm probability set to α = 0.10
Detection performance was then evaluated
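A minimal sketch (hypothetical data layout, not the thesis analysis code) of how detection rate and false alarm rate could be tabulated once each read word is labelled known/unknown and its processing time is compared against the threshold curve.

    def detection_performance(events, thresholds):
        """events: iterable of (word, processing_time_ms, is_unknown) tuples.
        Returns (detection_rate, false_alarm_rate) for the given threshold curve."""
        hits = misses = false_alarms = correct_rejections = 0
        for word, time_ms, is_unknown in events:
            flagged = time_ms > thresholds[len(word)]
            if is_unknown:
                hits += flagged
                misses += not flagged
            else:
                false_alarms += flagged
                correct_rejections += not flagged
        detection_rate = hits / max(hits + misses, 1)
        false_alarm_rate = false_alarms / max(false_alarms + correct_rejections, 1)
        return detection_rate, false_alarm_rate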
33
Experiment: Detecting Unknown Words
34
Experiment: Natural-Setting Reading Assistance
Natural reading pose: unrestricted head movement, hand-held reading material
Head-mounted eye tracker: identify the viewed word in real time and measure per-word processing time
Detecting unknown words: processing-time threshold curves established in the previous experiment
Assistance: detection of an unknown word activates vocalization
35
Experiment: Natural-Setting Reading Assistance — Results
The point-of-gaze mapping method accommodated head and reading-material movement without reducing detection performance

Subject   Detection Rate   False Alarm Rate
M.E.      0.94             0.10
P.L.      0.95             0.09
36
Conclusions
Developed methods to map point-of-gaze estimates to an object coordinate system attached to a moving 2D scene object (e.g. a reading card), for both the head-mounted and the remote system
Developed a method to detect when a reader encounters an unknown word
Demonstrated the principle of operation for an automated reading assistance system
37
Future Work
Implement the reading assistant using the remote point-of-gaze estimation methodology
Validate the efficacy of the system as a teaching tool for unskilled English readers, in collaboration with an audiologist
Evaluate other forms of assistive intervention, e.g. translation, definition
38
Questions?