"Dynamic Adjustment of Stereo Display Parameters"
Colin Ware, Cyril Gobrecht, and Mark Andrew Paton
IEEE Transactions on Systems, Man and Cybernetics—Part A: Systems and Humans, vol. 28, no. 1, January 1998.
Presentation: Revision 1.0, 3/23/2005, © Dr. Zachary Wartell

Introduction
- Reviewed depth cues:
  - retinal disparity, diplopia, Panum's fusional area
  - occlusion
  - structure from motion (SFM)
- Need to reduce/control screen parallax
- Terminology warning:
  - [Ware] "virtual eye separation" = [Class] model eye separation
  - the paper does not distinguish measurements made in scaled space
- Accommodation/vergence conflict
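Since the rest of the deck is about controlling screen parallax, a minimal sketch of the underlying geometry may help. This is standard perspective stereo for an on-axis point, not code from the paper, and all names are illustrative:

```python
def screen_parallax(eye_sep, screen_dist, point_dist):
    """Screen parallax of an on-axis point at distance point_dist from
    the viewer, for a screen at screen_dist (same length units).
    Positive = uncrossed disparity (point behind the screen),
    negative = crossed (in front), zero = on the screen plane.
    Parallax approaches eye_sep as point_dist goes to infinity."""
    return eye_sep * (point_dist - screen_dist) / point_dist

# Example: a point twice as far as the screen, with 6.4 cm eye
# separation, has 3.2 cm of uncrossed parallax.
print(screen_parallax(6.4, 50.0, 100.0))  # 3.2
```

Large parallax values are what drive diplopia and the accommodation/vergence conflict, which is why the algorithm later in the deck manipulates modeled eye separation.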

SFM: Motion Parallax
(figure: optic flow)

SFM: Kinetic Depth Effect
(figure: backlit wireframe cube projected onto a screen)

Modeled versus True Eye Separation

Experiment 1: Rate of Change of Eye Separation
Apparatus: Cyberscope

Experiment 1: Rate of Change of Eye Separation
- stimulus: a plane bearing an infinite field of truncated pyramids, moving towards the viewer
- plane tilted 70° from the screen

Varying Eye Separation
- IV: amplitude of variation: 10%, 20%, 30%
- IV: base separations: 6.3, 4.2, 2.1 cm (e.g., 6.3 cm at 10% oscillates over [5.67, 6.3] cm)
- 3 × 3 conditions
- DV: frequency of the sinusoidal variation, increased until subjects noticed it
- within subjects, fully randomized
- 9 subjects; each subject sees all conditions
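A sketch of the stimulus manipulation as described: the separation oscillates sinusoidally within [base·(1 − amplitude), base], matching the slide's example of 6.3 cm at 10% spanning [5.67, 6.3] cm. The sine waveform and zero phase are assumptions; the slide specifies only the range and frequency.

```python
import math

def oscillating_eye_sep(base_sep, amplitude_frac, freq_hz, t):
    """Model eye separation at time t (seconds), oscillating
    sinusoidally between base_sep * (1 - amplitude_frac) and base_sep
    at frequency freq_hz."""
    half_range = base_sep * amplitude_frac / 2.0
    center = base_sep - half_range
    return center + half_range * math.sin(2.0 * math.pi * freq_hz * t)

# Example: base 6.3 cm at 10% amplitude stays within [5.67, 6.3] cm.
```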

Results
- detectable frequency varies inversely with the amplitude of oscillation
- no significant interaction between amplitude and model eye separation
- worst case: average detectable frequency of 0.3 Hz, occurring at maximum amplitude and maximum eye separation

Examine Peak Velocity
- significant effects of both amplitude and model eye separation on peak velocity V_p, so peak velocity is not the primary determinant of response thresholds
- recommendation: keep the rate of change of model eye separation below 0.2 cm/s

Experiment 2
- subjects adjust model eye separation until diplopia occurs, then back off to a comfortable value: find the "maximum comfortable value"
- Hypothesis: subjects will vary maximum model eye separation based on scene depth

Experiment 2: Method
- scene depth varied via the angle of a "moving carpet": 10° through 80°
- subjects make two eye separation settings per angle (diplopia, then back to a comfortable value)
- 12 subjects, randomized, within subjects

Results
- considerable subject variation
- high negative correlation (r² = 0.99) between maximum comfortable separation E_v and carpet angle θ: E_v = 18.5 − …·θ
- Implications: drives an algorithm to adjust eye separation; expect subject variation

Algorithm for Dynamic Disparity Adjustment
- two stages:
  - cyclopean scale
  - modeled eye separation adjustment
- purpose: reduce accommodation/vergence conflict and optimize stereo depth effectiveness
- uses the experimental results to control the stereo adjustment

Cyclopean Scale
- determine the near point
- scale about the eye center to bring the near point to the screen, as sketched below (equivalent to changing virtual eye separation, in our terminology)
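A minimal sketch of this step, assuming scene points in eye space with distances measured from the cyclopean eye along the view axis; function and parameter names are illustrative, not the paper's:

```python
import numpy as np

def cyclopean_scale(points, cyclopean_eye, near_dist, screen_dist):
    """Uniformly scale the scene about the cyclopean eye so the
    nearest scene point lands on the screen plane. The view from the
    cyclopean eye itself is unchanged by this scale; only the screen
    disparities seen by the left and right eyes change.

    points: (N, 3) array of scene points
    cyclopean_eye: (3,) midpoint between the two eye positions
    near_dist: distance from the cyclopean eye to the nearest point
    screen_dist: distance from the cyclopean eye to the screen plane
    """
    s = screen_dist / near_dist  # s < 1 pushes a too-near scene back
    return cyclopean_eye + s * (points - cyclopean_eye)
```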

Dynamic Implementation

Dynamic Implementation
(figure: a point P, its scaled image P′, and the left/right eye projections P_l and P_r)

Adjustment of Eye Separation
- optional addition to the cyclopean scale
- model eye separation function taken from the lower 95th percentile of Experiment 2 (model eye separation varies from 14 cm down to 4 cm):
  E_v = 14.0 − 0.111·θ, where θ = arctan(dz/dh) in degrees
  E_v = 14.0 − 0.111·arctan(dz/dh)

Adjustment of Eye Separation
- How to transform the results from the "magic carpet" to a general scene?
  E_v = 14.0 − 0.111·arctan(Z_max/S_h)
- Z_max: z-buffer maximum (depth behind the screen)
- S_h: screen height
(figure: the angle θ formed by Z_max against screen height S_h)
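A sketch of this generalized mapping, using the coefficients reconstructed above (14 cm at θ = 0° down to 4 cm at θ = 90°, i.e., a slope of 10/90 ≈ 0.111 cm per degree); treat the exact constants as this transcript's reconstruction from the stated endpoints rather than verified values from the paper:

```python
import math

def modeled_eye_separation(z_max, screen_height, max_sep=14.0, min_sep=4.0):
    """Map the deepest visible point (z_max: depth behind the screen,
    e.g., the z-buffer maximum after the cyclopean scale) and the
    screen height (same units) to a modeled eye separation in cm."""
    theta_deg = math.degrees(math.atan(z_max / screen_height))
    return max_sep - (max_sep - min_sep) * theta_deg / 90.0
```

As the next slide notes, z_max = 0 gives the full 14 cm and z_max → ∞ ramps down to 4 cm.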

Properties of Algorithm
- Z_max from 0 to ∞ → modeled eye separation from 14 cm down to 4 cm

Properties of Algorithm
- relate the maximum displayed depth after the modeled eye separation adjustment to the maximum depth before it (Z_max)
- assume modeled depth is infinite:
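The equation itself is missing from this transcript; a reconstruction from standard stereo geometry: a modeled point at infinite depth rendered with modeled eye separation E_v produces screen parallax of exactly E_v, and a viewer with true eye separation E_t at distance V from the screen perceives a point with uncrossed parallax p at depth V·p/(E_t − p) behind the screen. Substituting p = E_v:

  z_displayed = V · E_v / (E_t − E_v)

For example, with the minimum E_v = 4 cm and a true separation of 6.4 cm, infinite modeled depth is displayed at about 1.67·V behind the screen, so the displayed depth range stays bounded.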

Individual Differences
- customize with a user-settable MaxSep, replacing the 14.0 in E_v = 14.0 − 0.111·arctan(Z_max/S_h)

Controlling Rate of Change
- Options (see the sketch below):
  - leave modeled eye separation uncontrolled
  - use a running average of the last 5 model eye separation values to smooth changes
  - apply the 0.2 cm/s threshold to modeled eye separation changes
- problem: sudden changes in scene depth then cause diplopia for some period
- Tradeoff: anecdotally, it is better to allow abrupt modeled eye separation changes than to allow diplopia

Experiment 3: Distortion with Changes in Modeled Eye Separation
- investigate perceived distortion: How much do users notice distortion? How do they rate it compared to diplopia?
- Method:
  - terrain map: 80 × 80 grid height field, colored by height, shaded, with shadows
  - pre-set flyby path: looking down while flying in, rotating to a horizontal view, then flying out
  - Goal: large changes in view distance and relative depths

6 Conditions
- no algorithm, modeled eye separation = 6.4 cm
- cyclopean scale only
- dynamic adjustment, max modeled eye separation 6.4 cm
- dynamic adjustment, max modeled eye separation … cm
- dynamic adjustment, max modeled eye separation … cm
- dynamic adjustment, max modeled eye separation … cm

Measurements
- subjects report perception of double images and note whether the "overall shape of surface appeared to change"
- rate distortion (0-4)
- rate eye strain (0-4)
- 7 subjects, within subjects, randomized presentation of all conditions

Results: Diplopia
- "no algorithm": all subjects reported double images when the scene was distant
- "dynamic adjustment with max 25 cm": 2 of 7 reported double images
- indicates support for the algorithm's ability to control diplopia

Results: Distortion
- see chart
- one report of eye strain, but the experiment only lasted 20 minutes

Conclusions
- dynamic disparity adjustment was used in a terrain visualization application
- the algorithm provides strong depth cues while minimizing diplopia
- compared with Williams & Parrish:
  - the old algorithm always maps modeled space to a fixed depth range in display space
  - the new algorithm limits the exaggeration of shallow scenes (to a factor of 2)
- SFM cues still dominate in giving the sense of "flying"

Future Work
- investigate the penalty from distortion when depth judgements are important
- improve depth buffer sampling; with a small number of sample points it (see the sketch below):
  - works best for large extended surfaces
  - misses small/narrow features
- alternative: use bounding boxes, but that is a more complex implementation
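A sketch of the sparse z-buffer sampling idea and its failure mode; the grid size and names are illustrative assumptions:

```python
import numpy as np

def sample_depth_range(zbuffer, grid=8):
    """Estimate the nearest/farthest visible depths by reading the
    z-buffer at a sparse grid of points. Cheap, and adequate for
    large extended surfaces, but small or narrow features that fall
    between sample points are missed (the limitation noted above)."""
    h, w = zbuffer.shape
    rows = np.linspace(0, h - 1, grid).astype(int)
    cols = np.linspace(0, w - 1, grid).astype(int)
    samples = zbuffer[np.ix_(rows, cols)]
    return samples.min(), samples.max()
```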

Questions, Comments
- a long-term (> 20 min.) study of eye strain effects (subjective questionnaire versus an A/V ratio test)
- a user study comparing against the Williams and Parrish technique
- why test the angle of the magic carpet rather than depth directly?
- a better way to rate distortion (tease apart the diplopia effect in condition #1)
- use the near point from the previous frame and scale about the cyclopean eye
- a battery of tests for the best way to get z-buffer info; a GPU solution?