
1 Optical Tracking for VR Bertus Labuschagne, Christopher Parker, Russell Joffe

2 Introduction

3 Project Motivation – Inexpensive – Variable-light conditions – Use of low-resolution devices – Did we mention inexpensive?

4 Project Breakdown – Layer 1: Russell – Layer 2: Bertus – Layer 3: Christopher & Bertus – Layer 4: Christopher

5 Layer 1 Low-level image processing

6 Overview – Camera (distortion example, calibration) – “Outside-in” model – Marker-based tracking (thresholding, sub-pixel accuracy, search space reduction)

7 Camera Fundamental constraint of project: low cost. Camera choice: Logitech webcam (< R150). Camera may be prone to distortion → need to calibrate.

8 Camera Distortion Example Source: VRVis Zentrum für Virtual Reality und Visualisierung Forschungs-GmbH, http://www.vrvis.at/2d3d/technology/cameracalibration/cameracalibration.html

9 Camera Calibration WHY? – Important for calculating accurate metric data HOW? – Camera calibration toolkit.
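The slides name a camera calibration toolkit without saying which one. Purely as an illustration, here is a minimal calibration sketch using OpenCV's chessboard routines; OpenCV, the pattern size, and the image folder are assumptions, not details from the project.

```python
# Minimal camera calibration sketch (assumed toolkit: OpenCV). Estimates the
# intrinsic matrix and lens distortion from several chessboard images.
import glob
import cv2
import numpy as np

PATTERN = (9, 6)        # inner chessboard corners per row/column (assumption)
SQUARE_SIZE = 25.0      # chessboard square edge in mm (assumption)

# 3D coordinates of the chessboard corners in the board's own frame
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE

obj_points, img_points = [], []
for fname in glob.glob("calib_images/*.png"):   # hypothetical image folder
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Recover the camera matrix and distortion coefficients, then undistort a frame
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
undistorted = cv2.undistort(gray, K, dist)
```

The recovered camera matrix K is what makes the later metric calculations (slide 9's "accurate metric data") possible.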

10 “Outside-in” model Markers are placed on the user Cameras are fixed in position Inside-out model: Cameras placed on users

11 Marker-based tracking Tasks: – Find position of markers in environment – Match corresponding markers from cameras – Extract marker centres

12 Marker-based tracking Thresholding (1/4) PURPOSE: Find regions in which markers are most likely to be. METHOD: Partition the image into background and foreground based on an intensity threshold. Problems?

13 Marker-based tracking Thresholding (2/4) Threshold too high: localisation of only one marker.

14 Marker-based tracking Thresholding (3/4) Threshold too low: localisation of all markers, but extra background noise in the foreground.

15 Marker-based tracking Thresholding (4/4) Threshold just about right: localisation of all three markers, with only minor noise in the image.
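A minimal sketch of the thresholding step described in slides 12–15, assuming a greyscale frame held in a NumPy array and a hand-picked intensity level; the frame dimensions and the level are illustrative.

```python
import numpy as np

def threshold_image(gray, level):
    """Partition a greyscale frame into foreground (pixels bright enough to
    belong to a marker) and background, using a fixed intensity threshold."""
    return gray >= level                      # boolean foreground mask

# Illustrative use on a stand-in frame: too high a level misses markers,
# too low a level lets background noise into the foreground.
frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
mask = threshold_image(frame, 200)
```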

16 Marker-based tracking Sub-pixel accuracy After thresholding, a large blob remains; we would like to find the centre of the light source. Naïve method: take the brightest pixel in the area → accurate to one pixel. Binary centroid: take the average position of all points in the region above the threshold. Weighted centroid: treat the positions of above-threshold intensities as a mask and weight the points according to their original intensities.
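A sketch of the binary and weighted centroid methods listed on slide 16, assuming the greyscale frame and boolean foreground mask from the thresholding step, with the mask taken to cover a single marker blob.

```python
import numpy as np

def binary_centroid(mask):
    """Sub-pixel estimate: average position of all above-threshold pixels."""
    ys, xs = np.nonzero(mask)
    return xs.mean(), ys.mean()

def weighted_centroid(gray, mask):
    """Treat the above-threshold pixels as a mask and weight each position by
    its original intensity, so brighter pixels pull the centre towards them."""
    ys, xs = np.nonzero(mask)
    w = gray[ys, xs].astype(float)
    return (xs * w).sum() / w.sum(), (ys * w).sum() / w.sum()
```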

17 Marker-based tracking Search space reduction Likely 3D position
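Slide 17 only names the idea of using a likely 3D position. One possible reading, sketched under assumed names and a pinhole camera model, is to project the predicted 3D marker position into each image and search only a small window around it.

```python
import numpy as np

def search_window(X_pred, K, R, t, radius=20):
    """Project a predicted 3D marker position (world coordinates) into the
    image using calibrated intrinsics K and camera pose (R, t), and return a
    small pixel window around it; the radius is an illustrative guess at the
    prediction uncertainty."""
    x_cam = R @ X_pred + t                  # world -> camera frame
    u, v, w = K @ x_cam                     # camera frame -> homogeneous pixels
    u, v = u / w, v / w
    return (int(u - radius), int(u + radius)), (int(v - radius), int(v + radius))
```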

18 Layer 2 Motion prediction & Model Generation

19 Overview Tracking the current location and rotation of the user Reducing latency in the system by using motion prediction Ensuring the prediction coincides with the actual motion Passing the information on to the environment

20 User Tracking Common problems with user tracking – Latency: end-to-end delay from capturing data to updating the screen – Efficiency: of the tracking algorithm – Accuracy: of detecting changes in position and rotation

21 Motion Prediction I Motivation – Reduce the effects of latency – Allow smooth transitions between frames Different inputs – For 2D input devices – For 3D input devices Types of algorithms – Polynomial predictor – Kalman-based predictor
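The slides list polynomial and Kalman-based predictors. Below is a minimal sketch of a polynomial predictor that fits the last few samples of one coordinate and extrapolates a short horizon ahead; the window length, degree, and timings are illustrative.

```python
import numpy as np

def polynomial_predict(times, positions, t_future, degree=2):
    """Fit a low-degree polynomial to recent (time, position) samples of one
    axis and extrapolate to a future time, to hide end-to-end latency."""
    coeffs = np.polyfit(times, positions, degree)
    return np.polyval(coeffs, t_future)

# Illustrative use: predict 50 ms ahead from the last five x-coordinates.
t = np.array([0.00, 0.02, 0.04, 0.06, 0.08])
x = np.array([10.0, 10.4, 11.0, 11.8, 12.8])
x_pred = polynomial_predict(t, x, 0.08 + 0.05)
```

A Kalman-based predictor would instead keep a state (position, velocity) with its covariance and update it each frame; OpenCV's cv2.KalmanFilter is one off-the-shelf option.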

22 Motion Prediction II Existing vs. new algorithm – Existing algorithms might not be suited to our problem and may require modifications – A new algorithm may be required. Testing the efficiency and accuracy of the implemented algorithms.
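A sketch of how the accuracy and per-frame cost of an implemented predictor could be compared against a recorded track; the sample layout and window length are assumptions.

```python
import time
import numpy as np

def evaluate_predictor(predict, samples, horizon=1, window=5):
    """samples is an (N, 2) array of (time, position) pairs for one axis.
    Accuracy is reported as RMSE between prediction and the later observation;
    efficiency as the mean wall-clock time per prediction call."""
    errors, costs = [], []
    for i in range(window, len(samples) - horizon):
        t_hist = samples[i - window:i, 0]
        p_hist = samples[i - window:i, 1]
        start = time.perf_counter()
        p_hat = predict(t_hist, p_hist, samples[i + horizon, 0])
        costs.append(time.perf_counter() - start)
        errors.append(p_hat - samples[i + horizon, 1])
    return float(np.sqrt(np.mean(np.square(errors)))), float(np.mean(costs))
```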

23 Layer 3 Movement Processing

24 Layer 4 Virtual Environment

25 Overview Movement data is mapped to VE screen updates. Tracker vs. standard input (keyboard & mouse). Hypothesis: “An optical tracking system works better for navigating through a virtual environment than conventional means.”

26 Performance goals High Accuracy Low Latency Speed + Usability

27 2D / 3D Environments OpenGL – 2D (non-walking): Pacman-type game – 3D (with walking): landscape / game (undecided). CAVEAT

28 Layer 4 User Testing

29 User testing techniques – Questionnaires (hypothesis test) – Continuous assessment (performance statistics) – Interviews – Ethnographic observation – Postural response

30 Conclusion

31 Conclusions The project consists of four sections, one section each; Layer 3 joins Layer 2 and Layer 4. Final outcome, and lastly a look at our deliverables.

32 Questions?

33 Deliverables

34 Deliverables timeline:
20th June 2006 – Obtain cameras
30th June 2006 – Get images from cameras
20th September – LED system built
20th September – Test centroid-finding algorithms
20th September – Test images for algorithms captured
22nd September – System design complete
25th September – VE design / user test design complete
27th September – 1st implementation of stand-alone algorithms on images
2nd October – 2nd test of algorithms
6th October – All modules completed
10th October – 1st system integrated and running
13th October – Preliminary tests
16th October – Design for 2nd version

