Wk 9: AR/VR – Immersion, Stereo, Head motion, Sensors, Sensor fusion, Hands-on VR demo


1 Wk 9: AR/VR – Immersion, Stereo, Head motion, Sensors, Sensor fusion, Hands-on VR demo

2 Immersion
The sense of 'being there': how much of the user's overall experience is due to the game/app/artefact.
–Physical: sensory immersion + control
–Emotional: story + relatedness
–Cognitive: believability + challenge
Any one channel is enough to create immersion, but breaking one breaks them all.

3 Reminder: 3D reconstruction
Viewer task: given a 2D image of the world, reconstruct the 3D scene that is out there: the shape, size and relative positioning of objects.
Made easier by:
–Two eyes
–Consistent changes over time
–Other internal information: eye position, balance etc.
–Previous knowledge: laws of physics, typical object sizes etc.
More 3D means more physical immersion.

4 Computer Graphics
Fool the eye: enable the viewer to reconstruct a 3D scene that is not there.
'Depth cues' are the features of the input that allow this reconstruction.

5 Depth cues: common
–Occlusion: Z-buffer
–Size: perspective projection
–Shading: diffuse lighting
–Perspective: perspective projection
–Coherent motion: camera and animation
–Fade to blue and fog: shaders
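The "size" cue above follows directly from perspective projection: on-screen size falls off as 1/depth. A minimal sketch (the `focal_length` parameter and function name are illustrative, not from the slides):

```python
def projected_size(object_size, depth, focal_length=1.0):
    """Return the apparent on-screen size of an object at the given depth
    under a simple pinhole perspective projection."""
    return focal_length * object_size / depth

# An object twice as far away appears half as large:
near = projected_size(2.0, depth=5.0)   # 0.4
far = projected_size(2.0, depth=10.0)   # 0.2
```

This 1/depth relationship is what lets the brain read relative size as relative distance.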

6 Depth cues: rare and epic
–Stereo disparity: various stereo hardware
–Self-motion: motion tracking
–Convergence: eye tracking, light-field displays
–Accommodation: holodecks

7 Cue conflicts
If cues hint at conflicting 3D interpretations of the scene, the conflict is:
Either resolved
–Some depth cues are stronger than others
Or not
–Annoying
–Breaks immersion
–Worst case scenario: cybersickness

8 Cybersickness
https://www.nsf.gov/news/special_reports/science_nation/cybersickness.jsp
Aka simulation sickness
–First researched in USAF pilot training
Caused by a discrepancy between the expected (from proprioception) and the perceived motion of the scene.
The body assumes it has been poisoned and instinctively tries to rid itself of the poison.
To be avoided.

9 Depth cue I: Stereo
To provide stereo depth cues, the renderer has to present two different images, one to each eye.
–Relatively easy in software: just render the scene twice, once per eye.
–Hardware: must deliver the two images to the eyes separately.
(Images mainly from mtbs3d.com)
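The "render the scene twice" step can be sketched as computing two camera positions, offset by half the interpupillary distance (IPD) along the camera's right vector. The 0.064 m IPD and all names here are illustrative assumptions, not from any particular API:

```python
IPD = 0.064  # typical adult interpupillary distance in metres (assumption)

def eye_positions(camera_pos, right_vec, ipd=IPD):
    """Return (left_eye, right_eye) world positions for stereo rendering,
    offset by +/- ipd/2 along the camera's right vector."""
    half = ipd / 2.0
    left = [c - half * r for c, r in zip(camera_pos, right_vec)]
    right = [c + half * r for c, r in zip(camera_pos, right_vec)]
    return left, right

# Camera at head height, looking down -z, right vector +x:
left, right = eye_positions([0.0, 1.6, 0.0], [1.0, 0.0, 0.0])
```

The scene is then rendered once from each position, and the hardware below takes care of routing each image to the correct eye.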

10 Physical separation I

11 Physical separation II

12 Time separation
Shutter glasses, synchronised with the display refresh rate
–Halves the effective refresh rate per eye

13 Wavelength separation
–Anaglyph: simple and cheap, but loses colour
–Dolby 3D: two different RGB wavelength sets; no loss of colour, but requires complicated and expensive glasses
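A red/cyan anaglyph can be sketched per pixel: the red channel comes from the left eye's image, green and blue from the right eye's, so the tinted glasses route each image to the correct eye. A minimal sketch with illustrative names:

```python
def anaglyph_pixel(left_rgb, right_rgb):
    """Combine one pixel from each eye's image into a red/cyan anaglyph
    pixel: red from the left eye, green and blue from the right."""
    return (left_rgb[0], right_rgb[1], right_rgb[2])

# e.g. a reddish pixel from the left view, a cyan-ish one from the right:
pixel = anaglyph_pixel((200, 10, 10), (10, 100, 150))  # (200, 100, 150)
```

The colour loss mentioned in the slide is visible here: each eye only ever receives part of the original colour information.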

14 Polarisation separation
Electromagnetic waves oscillate in a plane perpendicular to the direction of travel; that plane can be constrained and separated.
–Linear (horizontal/vertical): sensitive to head orientation
–Circular (clockwise/anti-clockwise): head orientation does not matter

15 Polarised (passive) stereo projection
–Two projectors
–Per-pixel polarisation: interlaced, checkerboard, etc.
–Per-frame polarisation: IMAX, RealD, etc.

16 Depth cue II: Head tracking
Given the position of the head in the real world, adjust the position of the camera.
Provides a strong depth cue: change of the scene due to self-motion ("fishtank" VR).
https://www.youtube.com/watch?v=Jd3-eiid-Uw

17 Implementation
Recalculate the view and projection matrices each frame.
–Static screens: the eye moves and the image plane stays in place https://www.youtube.com/watch?v=CG__PZzNfkw
–Helmets: the eye and the image plane move together
Most likely to cause cybersickness due to:
–Slow or inaccurate tracking
–Lag between tracker update and scene refresh
Especially pronounced during fast head movements.
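The per-frame recalculation can be sketched for the static-screen case: read the tracked head position and rebuild a view matrix that shifts the world the opposite way, so the scene appears fixed in space. This is a minimal sketch with an illustrative function name; a real fishtank-VR setup would also need an off-axis projection matrix:

```python
def view_translation(head_pos):
    """Build a 4x4 view matrix (row-major) that translates the world by
    -head_pos, so the virtual scene stays put as the head moves."""
    x, y, z = head_pos
    return [
        [1.0, 0.0, 0.0, -x],
        [0.0, 1.0, 0.0, -y],
        [0.0, 0.0, 1.0, -z],
        [0.0, 0.0, 0.0, 1.0],
    ]

# Each frame: read the tracker, rebuild the view matrix, redraw.
view = view_translation((0.1, 1.6, 0.5))
```

Any lag between the tracker read and the redraw shows up directly as the scene "swimming", which is the cybersickness trigger the slide describes.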

18 Sensor fusion
Given:
–Readings from various sensors
–A noise estimate for each sensor type
Calculate cleaner data:
–Smoothing
–Weighting by sensor reliability (1/noise)
–Kalman filter
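The "weighting by sensor reliability (1/noise)" bullet can be sketched as an inverse-variance weighted average: each sensor's reading counts in proportion to 1/variance, so noisier sensors contribute less. Names here are illustrative:

```python
def fuse(readings, variances):
    """Fuse noisy scalar readings of the same quantity, weighting each
    reading by the inverse of its sensor's noise variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    return sum(w * r for w, r in zip(weights, readings)) / total

# Two equally noisy sensors: plain average.
a = fuse([10.0, 14.0], [1.0, 1.0])   # 12.0
# Second sensor is 4x noisier: its reading is pulled toward the first.
b = fuse([10.0, 20.0], [1.0, 4.0])   # 12.0
```

This is the simplest member of the family; the Kalman filter below does the same weighting but also carries the uncertainty forward from step to step.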

19 Hardware: trackers
Internal (accelerometers, compasses, gyros)
–Nintendo Wii, mobile phones, etc.
–Problem: drift
External (active or passive)
–Optical, infra-red, magnetic, ultrasound, etc.
–Problems: occlusion, interference from other sources
Integrated: internal + external
–Vicon (+ passive optical), Wii plus (+ active IR), Intersense (+ passive ultrasound)
–Self-correction from two sources: basic sensor fusion

20 Hardware: trackerless
PS Eye
–Video capture and analysis
Kinect
–Projects an IR grid onto the environment
FaceAPI: face capture only
http://www.youtube.com/watch?v=qWkpdtFZoBE

21 Aside: Kalman filter
A way to calculate the next state of a system based on:
–Measurements (noisy)
–The previous state (uncertain)
–System dynamics (a limited model)
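The three ingredients above can be sketched in one dimension, assuming a static system model (the state is not expected to change between steps); all variable names are illustrative:

```python
def kalman_step(x, p, z, q, r):
    """One predict+update cycle of a 1-D Kalman filter.
    x, p: previous state estimate and its variance (uncertain prior state)
    z, r: new measurement and its noise variance (noisy measurement)
    q:    process noise variance (how far the limited model may drift)
    """
    # Predict: static model leaves x unchanged, but uncertainty grows.
    p = p + q
    # Update: the Kalman gain k blends prediction and measurement
    # in proportion to their reliabilities.
    k = p / (p + r)
    x = x + k * (z - x)
    p = (1.0 - k) * p
    return x, p

# Prior estimate 0.0 (variance 1.0), measurement 1.0 (variance 1.0):
# equally reliable, so the estimate lands halfway.
x, p = kalman_step(0.0, 1.0, z=1.0, q=0.0, r=1.0)  # x = 0.5, p = 0.5
```

Note that the variance p shrinks after every update: each measurement leaves the filter more certain than before, which is exactly the "cleaner data" goal of the previous slide.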

22 'Aware' devices
Position in the world
–Local, global, relative to something
The world around them
–Scene, objects, agents
User/users
–Position, identity, preferences

23 Oculus Rift
Fast: tracking data is sent directly to the GPU
–Orientation only in DK1; position + orientation in DK2
Lightweight
–Single screen
–Minimal optics: rendering is distorted, then corrected in a shader
DK2: low-persistence display
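The "distorted rendering, corrected in shader" step can be sketched as a radial barrel pre-distortion: the image is pushed outward more the further a point is from the lens centre, so the lens's opposite (pincushion) distortion cancels it. The coefficient k1 below is an illustrative value, not the Rift's actual lens parameter:

```python
def barrel_distort(x, y, k1=0.22):
    """Radially scale a point (in lens-centred normalised coordinates)
    outward by 1 + k1*r^2, approximating the pre-distortion a Rift-style
    shader applies before the lens un-distorts it."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2
    return x * scale, y * scale

# The lens centre is untouched; points further out move proportionally more:
centre = barrel_distort(0.0, 0.0)              # (0.0, 0.0)
edge = barrel_distort(1.0, 0.0, k1=0.25)       # (1.25, 0.0)
```

Doing this in a fragment shader is cheap, which is what lets the headset get away with the "minimal optics" the slide mentions.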

24 VR/AR
–As a spectrum
–As a dichotomy

