Optical Tracking for VR: Bertus Labuschagne, Christopher Parker, Russell Joffe
Optical Tracking for VR
Bertus Labuschagne
Christopher Parker
Russell Joffe
Introduction
Project Motivation
Inexpensive
Variable-light conditions
Use low-resolution devices
Did we mention inexpensive?
Project Breakdown
Layer 1: Low-level image processing (Russell)
Layer 2: Motion prediction & model generation (Bertus)
Layer 3: Movement processing (Christopher & Bertus)
Layer 4: Virtual environment (Christopher)
Layer 1
Low-level image processing
Overview
Camera
– Distortion example
– Calibration
"Outside-in" model
Marker-based tracking
– Thresholding
– Sub-pixel accuracy
– Search space reduction
Camera
Fundamental constraint of the project: low cost.
Camera choice: Logitech webcam (< R150).
The camera may be prone to distortion, so we need to calibrate it.
Camera: Distortion Example
VRVis Zentrum für Virtual Reality und Visualisierung Forschungs-GmbH
http://www.vrvis.at/2d3d/technology/cameracalibration/cameracalibration.html
Camera: Calibration
Why? Important for calculating accurate metric data.
How? A camera calibration toolkit.
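The toolkit handles the full estimation, but the core idea of correcting lens distortion can be sketched. The snippet below is a hypothetical illustration, not the toolkit's API: it inverts a one-parameter radial distortion model by fixed-point iteration, and the names `undistort_point` and `k1` are illustrative.

```python
def undistort_point(xd, yd, k1, iterations=5):
    """Invert the one-parameter radial distortion model
    (xd, yd) = (x, y) * (1 + k1 * r^2), where r^2 = x^2 + y^2,
    by fixed-point iteration. Coordinates are normalised, with the
    principal point at the origin. Illustrative sketch only."""
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        x = xd / (1 + k1 * r2)
        y = yd / (1 + k1 * r2)
    return x, y

# A point at (0.3, 0.4) seen through a lens with k1 = 0.1 appears
# displaced outward by the factor (1 + k1 * 0.25):
xd = 0.3 * (1 + 0.1 * 0.25)
yd = 0.4 * (1 + 0.1 * 0.25)
xu, yu = undistort_point(xd, yd, 0.1)  # recovers roughly (0.3, 0.4)
```

Real toolkits estimate several distortion coefficients plus the intrinsic matrix from images of a known pattern; this only shows why calibration matters for metric accuracy.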
“Outside-in” model
Markers are placed on the user; cameras are fixed in position.
(In the alternative inside-out model, cameras are placed on the user.)
Marker-based tracking
Tasks:
– Find the position of markers in the environment
– Match corresponding markers across cameras
– Extract marker centres
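The first task, locating candidate marker regions, can be sketched as a connected-component search over a thresholded binary image. This is an illustrative pure-Python sketch rather than our implementation, and `find_marker_regions` is a hypothetical name.

```python
def find_marker_regions(mask):
    """Group foreground pixels of a binary mask into connected regions
    (4-connectivity). Returns a list of regions, each a list of (x, y)
    pixel coordinates; each region is one candidate marker."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                stack, region = [(x, y)], []
                seen[y][x] = True
                while stack:
                    cx, cy = stack.pop()
                    region.append((cx, cy))
                    # Visit the four edge-adjacent neighbours.
                    for nx, ny in ((cx + 1, cy), (cx - 1, cy),
                                   (cx, cy + 1), (cx, cy - 1)):
                        if 0 <= nx < w and 0 <= ny < h \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((nx, ny))
                regions.append(region)
    return regions

# Two separate two-pixel blobs:
mask = [
    [1, 0, 0],
    [1, 0, 1],
    [0, 0, 1],
]
regions = find_marker_regions(mask)  # two regions found
```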
Marker-based tracking: Thresholding (1/4)
PURPOSE: Find regions in which markers are most likely to be
METHOD: Partition the image into background and foreground based on intensity threshold.
Problems?
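The method is a straightforward per-pixel comparison. As a sketch (pure Python, with the image as a nested list of intensities; names are illustrative):

```python
def threshold_image(gray, t):
    """Partition a grayscale image into foreground (1) and background (0).
    `gray` is a 2D list of intensities in [0, 255]; `t` is the cut-off."""
    return [[1 if px >= t else 0 for px in row] for row in gray]

# Tiny example: one bright marker on a dark background.
frame = [
    [10,  12,  11, 13],
    [ 9, 240, 250, 12],
    [11, 235, 245, 10],
    [12,  11,  13,  9],
]
mask = threshold_image(frame, 128)  # only the bright blob survives
```

The choice of `t` is exactly the problem the next slides illustrate.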
Marker-based tracking: Thresholding (2/4)
Threshold too high
Localisation of only one marker
Marker-based tracking: Thresholding (3/4)
Threshold too low
Localisation of all markers
Extra background noise in foreground
Marker-based tracking: Thresholding (4/4)
Threshold just about right
Localisation of all three markers
Minor noise in image
Marker-based tracking: Sub-pixel accuracy
After thresholding, a large blob remains
We would like to find the centre of the light source
Naïve method: take the brightest pixel in the area; accurate only to one pixel
Binary centroid: Take the average position of all points in the region, above the threshold
Weighted centroid: Treat positions of intensities above threshold as a mask and weight the points according to their original intensities
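The binary and weighted centroids follow directly from those definitions. This is an illustrative pure-Python version, not our implementation:

```python
def binary_centroid(gray, t):
    """Average (x, y) position of all pixels at or above threshold t."""
    pts = [(x, y)
           for y, row in enumerate(gray)
           for x, px in enumerate(row) if px >= t]
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

def weighted_centroid(gray, t):
    """Centroid of above-threshold pixels, weighted by their original
    intensities, giving a sub-pixel estimate of the light source."""
    total = sx = sy = 0.0
    for y, row in enumerate(gray):
        for x, px in enumerate(row):
            if px >= t:
                total += px
                sx += x * px
                sy += y * px
    return (sx / total, sy / total)

# A small blob whose right side is brighter than its left:
blob = [
    [  0, 100,   0],
    [100, 255, 200],
    [  0, 100,   0],
]
bc = binary_centroid(blob, 50)    # (1.0, 1.0): geometric centre
wc = weighted_centroid(blob, 50)  # x shifted toward the brighter side
```

The weighted centroid lands between pixel centres, which is exactly the sub-pixel accuracy the slide refers to.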
Marker-based tracking: Search space reduction
Likely 3D position
Layer 2
Motion prediction & Model Generation
Overview
Tracking the current location and rotation of the user
Reducing latency in the system by using motion prediction
Ensuring the prediction coincides with the actual motion
Passing the information on to the environment
User Tracking
Common problems with user tracking:
– Latency: end-to-end delay from capturing data to updating the screen
– Efficiency: of the tracking algorithm
– Accuracy: of detecting changes in position and rotation
Motion Prediction I
Motivation
– Reduce the effects of latency
– Allow smooth transitions between frames
Different inputs
– For 2D input devices
– For 3D input devices
Types of algorithms
– Polynomial predictor
– Kalman-based predictor
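As an illustration of the polynomial family, a second-order predictor can extrapolate the next sample by fitting a quadratic exactly through the last three. This is a generic textbook sketch, not our chosen algorithm; `predict_next` is an illustrative name.

```python
def predict_next(p0, p1, p2):
    """One-step quadratic extrapolation from the last three samples
    (oldest first). Equivalent to p2 + velocity + acceleration, with
    velocity = p2 - p1 and acceleration = (p2 - p1) - (p1 - p0)."""
    return 3 * p2 - 3 * p1 + p0

# Positions sampled from uniformly accelerating motion (0, 1, 4):
predicted = predict_next(0, 1, 4)  # extrapolates to 9
```

Applied per coordinate, the same formula covers 2D or 3D input; a Kalman-based predictor would additionally model measurement noise, at the cost of more tuning.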
Motion Prediction II
Existing vs. new algorithms
– Existing algorithms might not be suited to our problem and may require modifications
– A new algorithm may be required
Testing the efficiency and accuracy of implemented algorithms
Layer 3
Movement Processing
Layer 4
Virtual Environment
Overview
Movement data mapped to VE screen updates
Tracker vs. standard input (keyboard & mouse)
Hypothesis:
– "An optical tracking system works better for navigating through a virtual environment than conventional means"
Performance goals
High Accuracy
Low Latency
Speed + Usability
2D / 3D Environments
OpenGL
– 2D (non-walking): Pac-Man-type game
– 3D (with walking): landscape / game (undecided)
CAVEAT
Layer 4
User Testing
User testing techniques
Questionnaires
– Hypothesis test
Continuous assessment
– Performance statistics
Interviews
Ethnographic Observation
Postural Response
Conclusion
Conclusions
The project consists of four sections
One section each; Layer 3 joins Layer 2 and Layer 4
Final Outcome
Lastly, a look at our deliverables
Questions?
Deliverables
Deliverables
20th June 2006: Obtain cameras
30th June 2006: Get images from cameras
20th September: LED system built
20th September: Test centroid-finding algorithms
20th September: Test images for algorithms captured
22nd September: System design complete
25th September: VE design / user test design complete
27th September: 1st implementation of stand-alone algorithms on images
2nd October: 2nd test of algorithms
6th October: All modules completed
10th October: 1st system integrated and running
13th October: Preliminary tests
16th October: Design for 2nd version