From: Shared motion signals for human perceptual decisions and oculomotor actions Journal of Vision. 2003;3(11):7. doi:10.1167/3.11.7 Figure Legend: Converting pursuit responses into oculometric data. A. Two example pursuit responses to rightward plus 1° upward (R+1) stimulus motion illustrate the range of response variability. The upper two traces show the eye-speed time courses, while the lower two traces show the corresponding eye-direction time courses. The apparent oscillations are simply 60-Hz noise that tends to be synchronized by our latency algorithm and amplified by the ratio taken to compute direction. To minimize this artifact, we chose a 100-ms analysis interval (indicated by the vertical dashed lines) that lies after the initial transient, but before the end-of-trial pursuit slowing (see purple eye-speed trace). B. Boxcar-filtered direction traces for all 18 R+1 trials within a single session illustrate the robustness of our analyses to changes in analysis interval. The boxcar filter converts each time point of the raw response into the mean value over the 101-ms interval centered on that point. Thus, our analysis interval in A corresponds to the single point at t = 300 ms in B. Oculometric performance (% upward) is computed by dividing the number of traces above threshold (indicated by the horizontal dotted line) by the total number of traces. The upper row of nonparenthetical numbers above the vertical dashed lines shows the measured % upward pursuit decisions for 5 different candidate analysis intervals, ranging from 150–250 ms to 350–450 ms after pursuit onset. Covariation is evident from the fact that, for identical stimuli, trials associated with upward perceptual decisions (red traces) tend to be associated with upward pursuit decisions (traces above threshold). Conversely, trials associated with downward perceptual decisions (blue traces) tend to fall below threshold.
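The boxcar filtering and oculometric computation described above can be sketched as follows. This is a minimal illustration, not the authors' analysis code: the function names, the 1-kHz sampling assumption, and the zero direction threshold are all assumptions for the example.

```python
import numpy as np

def boxcar_filter(trace, width_ms=101, dt_ms=1.0):
    """Replace each sample with the mean over a centered window
    (101 ms in the figure, assuming 1-kHz sampling)."""
    n = int(round(width_ms / dt_ms))
    kernel = np.ones(n) / n
    # mode="same" keeps the trace length; the zero-padded edges are
    # attenuated, so only interior time points should be analyzed
    return np.convolve(trace, kernel, mode="same")

def percent_upward(direction_traces, t_index, threshold_deg=0.0):
    """Oculometric performance: percentage of filtered direction traces
    above the threshold at the chosen analysis time point."""
    values = np.array([boxcar_filter(tr)[t_index] for tr in direction_traces])
    return 100.0 * np.mean(values > threshold_deg)
```

For example, the single point at t = 300 ms in panel B would correspond to `t_index` chosen 300 ms after pursuit onset, and `threshold_deg` to the horizontal dotted line.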
The %Same is computed by dividing the sum of the number of red traces above threshold and the number of blue traces below threshold by the total number of traces. The lower row of parenthetical numbers above each vertical dashed line represents the measured %Same for the same 5 candidate analysis intervals. Gaps in the traces are due to saccades. Copyright © 2017 The Association for Research in Vision and Ophthalmology. All rights reserved.
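The %Same measure can be expressed in the same way. This is a sketch under the assumption that each trial carries one filtered direction value at the analysis point and a boolean perceptual decision; the names are illustrative, not from the paper.

```python
import numpy as np

def percent_same(direction_values, perceived_upward, threshold_deg=0.0):
    """%Same: percentage of trials where the pursuit decision (direction
    value above threshold) agrees with the perceptual decision
    (red traces = upward, blue traces = downward)."""
    pursuit_upward = np.asarray(direction_values) > threshold_deg
    perceived_upward = np.asarray(perceived_upward, dtype=bool)
    # agreement = red trace above threshold OR blue trace below threshold
    return 100.0 * np.mean(pursuit_upward == perceived_upward)
```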