Slide 1: 3D Displays Overview, Revision 1.3. Copyright 2006 Zachary Wartell, University of North Carolina Charlotte. (Presented 3/23/2005.)
Slide 2: 3D Displays: Basic Properties
(1) Retinal disparity (also called stereo parallax): the display presents a left-eye perspective view of a virtual scene to the left eye and a different right-eye perspective view to the right eye. The retinal disparities between the left and right retinal images induce a perceived 3D image of the scene. Implicitly, the user experiences ocular vergence as she fixates on objects at different stereo depths.
(2) Multi-viewpoint (also called motion parallax): the image pair presented to the user's eyes depends on the user's head position; by moving or walking around the display, the user perceives the virtual objects from different vantage points. "Multi-viewpoint" means continuous and correct changes to the perceived images as the head moves in any direction.
Slide 3: Taxonomy
Slide 4: Ideal Surface Display
Slide 5: The physical world creates natural blur.
Slide 6: An ideal surface display creates natural blur.
Slide 7: A stereoscopic surface display cannot create the full wavefronts of the synthetic 3D scene. To provide the multi-viewpoint property, a stereoscopic display must:
–determine the user's head position;
–render left- and right-eye images specifically computed for that head position;
–channel each of the two images to the appropriate eye.
(While at any given moment a full holographic display outputs an image for every possible eye position, a typical stereoscopic display outputs images for only the two current eye positions.)
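The per-frame steps above can be sketched as follows. The function name and tuple math are illustrative, not from the slides; a fixed interpupillary distance of 6.5 cm (a common average) is assumed:

```python
def eye_positions(head_pos, right_dir, eye_separation=0.065):
    """Step 1 output -> step 2 input: derive left/right eye positions
    from the tracked head position.

    head_pos: (x, y, z) midpoint between the eyes, in meters.
    right_dir: unit vector pointing toward the user's right.
    eye_separation: interpupillary distance in meters (assumed 6.5 cm).
    """
    half = eye_separation / 2.0
    left = tuple(h - half * r for h, r in zip(head_pos, right_dir))
    right = tuple(h + half * r for h, r in zip(head_pos, right_dir))
    return left, right

# Per frame: track the head, render one view per eye position,
# then channel each image to the matching eye (shutter glasses, etc.).
left, right = eye_positions((0.0, 1.6, 0.5), (1.0, 0.0, 0.0))
```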
Slide 8: Stereoscopic Surface Display Wavefronts
Slides 9–12: Accommodation and Vergence Link [figure sequence: eyes viewing a physical box; fovea and fixation point; in the physical world, accommodation depth = vergence depth]
Slides 13–15: Stereoscopic Display versus Physical World [figure sequence: fixating on progressively farther objects]
Slide 16: Fusion Metrics – screen parallax, HVA, vergence difference
Slide 17: Screen Parallax
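By similar triangles, a point at distance z from the eyes, displayed on a screen at distance d, produces an on-screen parallax p = e·(z − d)/z for eye separation e. A minimal sketch of this relation (the symbol names are assumptions, since the slide's figure is not reproduced here):

```python
def screen_parallax(eye_sep, screen_dist, point_dist):
    """Signed separation on the screen between a point's left- and
    right-eye projections (same units as eye_sep).

    Positive (uncrossed) behind the screen, zero at the screen,
    negative (crossed) in front of it; approaches eye_sep at infinity.
    """
    return eye_sep * (point_dist - screen_dist) / point_dist

screen_parallax(0.065, 0.7, 0.7)   # point on the screen: zero parallax
screen_parallax(0.065, 0.7, 1.4)   # behind the screen: uncrossed (positive)
```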
Slide 18: Vergence Difference (vergence angle at the screen minus vergence angle at the displayed point)
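For symmetric fixation, the vergence angle at distance z is 2·atan(e / 2z); the vergence difference compares the angle at the screen with the angle at the displayed point. A sketch under those assumptions (symmetric viewing geometry is my simplification, not stated on the slide):

```python
import math

def vergence_angle(eye_sep, dist):
    """Angle (radians) between the two lines of sight when fixating a
    point at the given distance, assuming symmetric fixation."""
    return 2.0 * math.atan(eye_sep / (2.0 * dist))

def vergence_difference(eye_sep, screen_dist, point_dist):
    """Vergence at the screen minus vergence at the displayed point:
    positive for points behind the screen, negative in front."""
    return (vergence_angle(eye_sep, screen_dist)
            - vergence_angle(eye_sep, point_dist))
```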
Slide 19: Image Cross-talk/Ghosting [figure: display, filters, left and right image channels]
Slide 20: Experimental Limits in Stereo Display
Valyus [Valy66] gives a vergence difference range of ±1.6 degrees.
Yeh and Silverstein [Yeh90]:
–find a fusible HVA range of −4.93 to 1.57 degrees for viewing durations that allow ocular vergence (2 s);
–find an HVA range of −27 to 24 arc minutes for viewing durations that do not allow ocular vergence (200 ms).
They recommend keeping applications within the smaller of these ranges.
Williams and Parrish's experiments suggest a viewing volume of −25% through +60% of the head-to-screen distance.
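Valyus's ±1.6° limit can be turned into a far-depth limit: find the depth whose vergence is 1.6° less than the screen's. A sketch assuming symmetric viewing geometry (the function and its closed form are my derivation, not code from the slides):

```python
import math

def far_fusible_depth(eye_sep, screen_dist, limit_deg=1.6):
    """Farthest depth whose vergence differs from the screen's by at
    most limit_deg degrees (Valyus's +/-1.6 deg). Returns infinity when
    even parallel lines of sight stay within the limit."""
    screen_vergence = 2.0 * math.atan(eye_sep / (2.0 * screen_dist))
    min_vergence = screen_vergence - math.radians(limit_deg)
    if min_vergence <= 0.0:
        return math.inf  # limit never reached, even at infinite depth
    return eye_sep / (2.0 * math.tan(min_vergence / 2.0))
```

Note how quickly the limit degenerates: at larger head-to-screen distances the screen's own vergence falls below 1.6°, so the far limit becomes infinite, which is consistent with the distance-dependent curves on slide 22.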
Slide 21: Comparing Far Fusible Depth [figure: screen, maximum-depth planes, maximum parallaxes p_max1 and p_max2, far depth limits d_maxA1, d_maxA2, d_maxB2 for configurations A and B]
Slide 22: Far Depth Limits [plot: far depth limit versus head-to-screen distance (m), cases A and B; solid line: Valyus's +1.6° vergence difference; dashed line: Yeh's +1.57° HVA; dash-dot: Valyus's max-parallax approximation; circles: Williams and Parrish limits]
Slide 23: Nearest Portrayable Depth [figure: screen, maximum-depth planes, maximum parallaxes p_max1 and p_max2, depth limits d_maxA1, d_maxA2, d_maxB2 for configurations A and B]
Slide 24: Limitations of Fusibility Limits
–vary with display technology
–vary with the contrast of a given image
–scenes have an arbitrary distribution of depth, while experiments usually use a single stimulus
–individual differences, etc.
Slide 25: View Frustums in Stereo [figure: frustum axes u, v, n]
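In stereo, each eye needs an asymmetric (off-axis) frustum anchored to the fixed screen rectangle, rather than a symmetric frustum centered on a view direction. A sketch with the screen placed in the z = 0 plane; the parameter names are mine:

```python
def stereo_frustum(eye_x, eye_y, eye_z, scr_l, scr_r, scr_b, scr_t, near):
    """Off-axis frustum for one eye, relative to a screen rectangle
    lying in the z = 0 plane (eye at eye_z > 0 in front of it).

    Returns (left, right, bottom, top) edges at the near plane, the
    same four values glFrustum-style projection setup expects.
    """
    scale = near / eye_z  # similar triangles: near plane vs. screen plane
    return ((scr_l - eye_x) * scale, (scr_r - eye_x) * scale,
            (scr_b - eye_y) * scale, (scr_t - eye_y) * scale)
```

Calling this twice, once per eye position, yields the two skewed frustums that share the same screen rectangle: shifting the eye to the right shrinks the right edge and extends the left edge, which is exactly the asymmetry the figure depicts.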
Slide 26: View Frustum ≠ Eye Pupil and Retina. Recall the ideal surface display (slide 6) versus the stereoscopic display (slide 8). [figure: left and right pupils, retina]
Slide 27: View Frustum = Eye Pupil and Physical Pixels. The graphics pipeline (frustum) knows nothing of the retina's shape; that is the brain's problem. [figure: left and right pupils, physical pixels, retina]
Slide 28: View Frustum = Eye Pupil and Physical Pixels. Recall the ideal surface display (slide 6) versus the stereoscopic display (slide 8). [figure: pupils; projection plane = physical pixels; near and far clipping planes]
Slide 29: Headset Displays (Internal): Head-Mounted Display (HMD)
Slide 30: HTD (Head-Tracked Display) [figure: desktop stereo HTD with 3D glasses, head tracker, and stationary display]
Slide 31: View Coordinate Hierarchy (Simplified)
Slide 32: View Coordinate Hierarchy
–platform-to-world – position/orientation/scale; manipulated by the UI
–tracker-to-platform – fixed by the physical display arrangement
–headSensor-to-tracker – dynamically determined by the tracker hardware
–eyes-to-headSensor – fixed by the physical display arrangement
–projectionPlane-to-(headSensor/platform) – fixed by the physical display arrangement
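The hierarchy composes into a single eye-to-world transform, and only the head-sensor stage changes every frame. A minimal sketch using plain row-major 4×4 lists (the helper names are illustrative, not API from the slides):

```python
def matmul(a, b):
    """Multiply two 4x4 row-major matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translate(x, y, z):
    """4x4 translation matrix."""
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

def eye_to_world(platform_to_world, tracker_to_platform,
                 head_to_tracker, eye_to_head):
    """Compose the view hierarchy top-down. Per frame, only
    head_to_tracker is updated (from the tracker hardware); the other
    stages are fixed by calibration or driven by the UI."""
    m = matmul(platform_to_world, tracker_to_platform)
    m = matmul(m, head_to_tracker)
    return matmul(m, eye_to_head)
```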
Slide 33: Small object close by (=/≠) large object far away [figure: center of projection (COP) and projection plane; a small near object and a large far object can yield the same projection]
Slide 34: Platform Scale DOF
Slide 35: Four Eye Separations
Slide 36: HTD Example [diagram: platform scale S_W←Plat = S_Parent(Plat)←Plat = 2400; projection window 5 m × 5 m physical, 12 km virtual; platform height 1 m physical, 2400 m virtual; modeled eye separation 3.0 cm physical, 72 m virtual; true eye separation 6.5 cm physical, 156 m virtual]
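The slide's numbers all follow from one rule: under a platform-to-world scale S, physical lengths map to virtual lengths multiplied by S. A trivial check with S = 2400 as in the example (the function name is mine):

```python
def virtual_length(physical_m, platform_scale):
    """Map a physical length (meters) through the platform-to-world
    scale S to its virtual-world length."""
    return physical_m * platform_scale

virtual_length(0.030, 2400)  # modeled eye separation: 3.0 cm -> 72 m
virtual_length(0.065, 2400)  # true eye separation: 6.5 cm -> 156 m
virtual_length(5.0, 2400)    # projection window: 5 m -> 12 km
```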
Slide 37: Goals for 3D Display
–generating fusible stereoscopic imagery
–generating accurate stereoscopic imagery
–maximizing the added value of stereoscopic depth images
–minimizing frame cancellation
–bringing manipulated stereoscopic imagery within arm's reach to improve direct manipulation
Slide 38: Generate Fusible Imagery
hardware:
–needs an ideal surface display (hologram)
–dynamic depth image plane
–multi-planar display
software: geometric scene manipulation to reduce screen parallax
Slides 39–41: Fusion Control [figure sequence: heuristic limits on the portrayed range of stereoscopic depth, showing the screen between the comfortable near and far fusion limits]
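One way to realize the fusion control pictured above is to scale the scene about the screen plane until its depth range fits between the comfortable near and far limits. A sketch with depths as signed distances from the screen (negative = in front); the uniform-scale choice is my assumption, one of several possible geometric manipulations:

```python
def fusion_scale(scene_near, scene_far, limit_near, limit_far):
    """Uniform scale factor about the screen plane (depth 0) that
    brings the scene's stereoscopic depth range inside the comfortable
    fusion limits. Depths are signed: negative = in front of screen.
    Returns 1.0 when the scene already fits."""
    s = 1.0
    if scene_far > 0.0:                       # content behind the screen
        s = min(s, limit_far / scene_far)
    if scene_near < 0.0:                      # content in front of it
        s = min(s, limit_near / scene_near)   # both negative: ratio > 0
    return s
```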
Slides 42–43: Frame Cancellation/Window Violation
Slide 44: Spatial Distortion [diagram: Modeled → Displayed → Adapted → Perceived pipeline; transition labels: view scale factor, calibration errors, fusion control technique, individual differences & procedure effect and error; methods: analytic/geometric analysis, experimental analysis (perceptual matching, magnitude estimation, category estimation, mapping, registration experiments, subjective magnitude)]
Slide 45: Displayed Distortion
Assume no modeled→adapted distortion: Image_Displayed = T · Image_Modeled
sources:
–stereo with no tracking
–stereo with eye-pair offset error
–stereo with tracking latency error
–stereo with tracking and calibration error
Slide 46: Stereo with No Tracking: Induced Stereo Motion. Assume the projection window is correctly measured and the modeled eye separation equals the true eye separation. [figure, x–z view: modeled = true = displayed at the calibrated head position; as the true head moves, displayed geometry in near and far space deviates from the true geometry]
Slide 47: Induced Stereo Motion: Lateral, Near Space [figure, x–z view: modeled = true eyes; displayed versus true geometry. Near space moves with the head motion; far space moves opposite the head motion.]
Slide 48: Induced Stereo Motion: Perpendicular, Far Space [figure, x–z view: modeled = true = displayed at the calibrated position; displayed versus true geometry in far space]
Slide 49: Induced Stereo Motion: Perpendicular, Near Space [figure, x–z view. Head forward: near space moves forward, far space moves back. Head back: near space moves back, far space moves forward.]
Slide 50: Induced Stereo Motion: Shape [figure: shape distortion in near and far space]
Slide 51: Tracking ideally removes induced stereo motion.
Slide 52: Eye Pair Offset Error
The modeled eye location pair is at a constant translational offset from the true eye location pair:
–constant translational tracking error
–translational mis-calibration between coordinate systems in the view hierarchy
Slide 53: Tracking Latency Error
Slide 54: Calibration Error: mis-measured eye separation or screen size
Slide 55: Fusion Adaptation Distortions
Assume no modeled→displayed distortion: Image_Adapted = T · Image_Modeled