COMP 417 – Jan 12th, 2006
Transcript of COMP 417 – Jan 12th, 2006
COMP 417 – Jan 12th, 2006
Guest Lecturer: David Meger
Topic: Camera Networks for Robot Localization
Introduction
- Who am I?
- Overview of Camera Networks for Robot Localization: What, Where, Why, How (the technical stuff)
Introduction - Hardware
Intro - What
Previously: Localization is a key task for a robot. It’s typically achieved using the robot’s sensors and a map.
Can “the environment” help with this?
Typical Robot Localization
Sensor Networks
Sensor Networks
Intro - Where
In cases where there is sensing already in the environment, we can invert the direction of sensing.
Where is this true?
- Buildings with security systems
- Public transportation areas (metro)
- More and more large cities (scary but true)
Intro – Why
Advantages:
- In many cases, sensors already exist
- Many robots operating in the same place can all share the same sensors
- Computation can be done at a powerful central computer, saving robot computation
- Interesting research problem
Intro – How
As the robot appears in images, we can use 3-D vision techniques to determine its position relative to the cameras.
What do we need to know about the cameras to make this work?
- Can we assume we know where the cameras are?
- Can we assume we know the camera properties?
Problem
Can we use images from arbitrary cameras placed in unknown positions in the environment to help a robot navigate?
Proposed Method
1. Detect the robot
2. Measure the relative positions
3. Place the camera in the map
4. Move robot to the next camera
5. Repeat
Detection – An algorithm to detect these robots?
Detection (cont’d)
Computer Vision techniques attempt detection of (moving) objects:
- Background subtraction or image differencing
- Image templates
- Color matching
- Feature matching
A robust algorithm for arbitrary robots is likely beyond current methods.
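The slides list image differencing without detail; a minimal sketch of the idea (not from the lecture, using NumPy and two synthetic grayscale frames) is to threshold per-pixel intensity changes and report the bounding box of the changed region:

```python
import numpy as np

def detect_motion(prev_frame, curr_frame, threshold=30):
    """Flag pixels whose intensity changed by more than `threshold`,
    and return the bounding box of the changed region (or None)."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    mask = diff > threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return (int(rows.min()), int(rows.max()), int(cols.min()), int(cols.max()))

# Synthetic example: a bright 10x10 "robot" appears in the second frame.
prev = np.zeros((100, 100), dtype=np.uint8)
curr = prev.copy()
curr[40:50, 60:70] = 255
print(detect_motion(prev, curr))  # (40, 49, 60, 69)
```

Real scenes need a maintained background model and morphological cleanup, which is part of why a robust detector for arbitrary robots is hard.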
Detection – Our Method
ARTag Markers
Proposed Method
1. Detect the robot
2. Measure the relative positions
3. Place the camera in the map
4. Move robot to the next camera
5. Repeat
Position Measurement
Question: Can we determine the 3-D position of an object relative to the camera from examining 2-D images?
Hint: start from the introduction to Computer Vision from last time
Pinhole Camera Model
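The slide's diagram is not in the transcript; the standard pinhole projection it illustrates maps a camera-frame point $(X, Y, Z)$ to image coordinates using the focal length $f$:

```latex
x = f\,\frac{X}{Z}, \qquad y = f\,\frac{Y}{Z}
```

Note the division by depth $Z$: this is why zooming (changing $f$) and moving the object closer or farther have essentially the same effect on the image.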
Camera Calibration
An image depends on BOTH scene geometry and camera properties.
For example, zooming in and out and moving the object closer and farther have essentially the same effect
Calibration means determining relevant camera properties (e.g. focal length f)
Projective Calibration Equations
Coordinate Transformation
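The transformation on this slide is not in the transcript; the standard rigid-body form maps a world point into the camera frame with a rotation $R$ and translation $t$:

```latex
M_{\mathrm{cam}} = R\,M_{\mathrm{world}} + t,
\qquad
T = \left[\, R \;\middle|\; t \,\right] \in \mathbb{R}^{3\times 4}
```

Multiplying the $3\times 3$ intrinsic matrix $A$ by this extrinsic matrix $T$ gives the $3\times 4$ matrix $AT$ used on the next slide.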
Calibration Equations
The matrix AT is 3×4 and fully describes the geometry of image formation.
Given known object points M, and image points m, it is possible to solve for both A and T
How many points are needed?
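The slide leaves the point count as a question; the standard Direct Linear Transform (DLT) answer, sketched below with NumPy (my own example, not the lecture's code), is that the 3×4 matrix P = AT has 11 degrees of freedom up to scale and each correspondence gives 2 linear equations, so at least 6 points in general position are needed:

```python
import numpy as np

def dlt_calibrate(M, m):
    """Solve for the 3x4 projection matrix P from 3-D points M (Nx3)
    and their 2-D images m (Nx2), via the Direct Linear Transform.
    Each point gives 2 equations; P has 11 DOF, so N >= 6 is required."""
    rows = []
    for (X, Y, Z), (u, v) in zip(M, m):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return Vt[-1].reshape(3, 4)   # null vector of the system = P up to scale

def project(P, M):
    """Project 3-D points with P, dividing out the homogeneous scale."""
    Mh = np.hstack([M, np.ones((len(M), 1))])
    m = (P @ Mh.T).T
    return m[:, :2] / m[:, 2:3]

# Synthetic camera: f = 500, principal point (320, 240), identity pose.
P_true = np.array([[500., 0., 320., 0.],
                   [0., 500., 240., 0.],
                   [0., 0., 1., 0.]])
# Six non-coplanar points in front of the camera.
M = np.array([[0, 0, 5], [1, 0, 6], [0, 1, 7],
              [1, 1, 5], [-1, 0, 6], [0, -1, 8]], dtype=float)
m = project(P_true, M)
P_est = dlt_calibrate(M, m)
print(np.allclose(project(P_est, M), m))  # True
```

With noisy measurements, many more than 6 points would be used and the same SVD gives the least-squares solution.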
Calibration Targets
3-Plane ARTag Target
Position Measurement Conclusion
With enough image points whose 3-D locations are known, measurement of the coordinate transformation T is possible.
The process is more complicated than traditional sensing, but luckily, we only need to do it once per camera
Proposed Method
1. Detect the robot
2. Measure the relative positions
3. Place the camera in the map
4. Move robot to the next camera
5. Repeat
Mapping Camera Locations
Given the robot’s position, a measurement of the relative position of the camera allows us to place it in our map
Question: What affects the accuracy of this type of relative measurement?
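A sketch of the placement step (my own notation, not the lecture's): compose the robot's map pose with the camera measurement expressed in the robot's frame. Any error in the robot's pose, especially its heading, propagates directly into the camera's mapped position:

```python
import math

def compose(pose, rel):
    """Compose a robot pose (x, y, theta) in the map with a relative
    measurement (dx, dy, dtheta) expressed in the robot's frame."""
    x, y, th = pose
    dx, dy, dth = rel
    return (x + dx * math.cos(th) - dy * math.sin(th),
            y + dx * math.sin(th) + dy * math.cos(th),
            th + dth)

# Robot at (2, 3) facing +y (theta = pi/2); camera measured 1 m ahead.
cam = compose((2.0, 3.0, math.pi / 2), (1.0, 0.0, 0.0))
print(cam)  # approximately (2.0, 4.0, pi/2)
```

Note the heading sensitivity: a small error in theta rotates the entire relative measurement, so the camera position error grows with measurement distance.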
Proposed Method
1. Detect the robot
2. Measure the relative positions
3. Place the camera in the map
4. Move robot to the next camera
5. Repeat
Robot Motion
A robot moves by using electric motors to turn its wheels. There are numerous strategies here in each of the important aspects:
- Physical design
- Control algorithms
- Programming interface
- High-level software architecture
Nomad Scout
Differential Drive Kinematics
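The slide's equations are not in the transcript; the standard differential-drive model (with wheel radius r, wheel separation L, and left/right wheel angular velocities) gives the forward speed and turn rate, which can be integrated into a pose:

```python
import math

def step(pose, w_l, w_r, r, L, dt):
    """Advance a differential-drive pose (x, y, theta) by dt seconds,
    given left/right wheel angular velocities (rad/s)."""
    v = r * (w_r + w_l) / 2.0     # forward speed
    omega = r * (w_r - w_l) / L   # turn rate
    x, y, th = pose
    return (x + v * math.cos(th) * dt,
            y + v * math.sin(th) * dt,
            th + omega * dt)

# Equal wheel speeds -> straight-line motion along the heading.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = step(pose, 2.0, 2.0, r=0.1, L=0.3, dt=0.01)
print(pose)  # x ≈ 0.2 m after 1 s at 0.2 m/s; y and theta stay 0
```

Odometry readings like those on the next slide come from exactly this kind of integration, using encoder counts in place of commanded wheel velocities.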
Odometry Position Readings
Robot Motion - Specifics
Robot control is accomplished using an in-house application, Robodaemon
Allows “point and shoot” motion, not continuous control
Graphical and programmatic interface to query robot odometry, send motion commands, collect sensor data
Proposed Method
1. Detect the robot
2. Measure the relative positions
3. Place the camera in the map
4. Move robot to the next camera
5. Repeat
Are we done?
Challenges
- In general, it's impossible to know the robot or camera positions exactly; all measurements have error
- What should the robot do if the cameras can't see the whole environment?
- I didn't say anything about how the robot should decide where to go next
- More?
Mapping with Uncertainty
Given exact knowledge of the robot’s position, mapping is possible
Given a pre-built map, localization is possible
What if neither is present? Is it realistic to assume they will be? If so, when?
Uncertainty in Robot Position
In general, kinematics equations do not exactly predict robot locations. Sources of error:
- Wheel slippage
- Encoder quantization
- Manufacturing artifacts
- Uneven terrain
- Rough/slippery/wet terrain
Typical Odometry Error
Simultaneous Localization and Mapping (SLAM)
When both the robot and map features are uncertain, both must be estimated
Progress can be made by viewing measurements as probability densities instead of precise quantities
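A minimal illustration of "measurements as probability densities" (a 1-D Gaussian fusion step, not the lecture's algorithm): two uncertain estimates of the same position combine into one that is more certain than either, by inverse-variance weighting:

```python
def fuse(mu1, var1, mu2, var2):
    """Fuse two Gaussian estimates of the same quantity (equivalent to a
    1-D Kalman measurement update): an inverse-variance weighted average."""
    mu = (mu1 * var2 + mu2 * var1) / (var1 + var2)
    var = var1 * var2 / (var1 + var2)
    return mu, var

# Odometry says x = 5.0 (variance 4.0); a camera says x = 6.0 (variance 1.0).
mu, var = fuse(5.0, 4.0, 6.0, 1.0)
print(mu, var)  # 5.8 0.8
```

The fused mean sits closer to the more certain estimate, and the fused variance is smaller than both inputs; SLAM generalizes this idea to the joint distribution over robot and map features.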
SLAM Progress
SLAM (cont’d)
A substantial amount of the work in robotics over the last 5-10 years has involved localization and SLAM; results are now quite impressive indoors with good sensing.
These methods apply to our system
More on this later in the course, or after class today if you’re interested
Motion Planning
The mapping framework described is dependent on the robot's motion:
- The robot must pass in front of a camera in order to collect any images
- Numerous points are needed for each camera to perform calibration
- SLAM accuracy is affected by the order of camera visitation
Local and Global Planning
Local: how should the robot move while in front of one camera, to collect the set of calibration images?
Global: in which order should the cameras be visited?
Local Planning
Modern calibration algorithms are quite good at estimating from noisy data, but there are some geometric considerations:
- Field of view
- Detection accuracy
- Singularities in the calibration equations
Local Planning
We must avoid configurations where all points collected lie in a linear sub-space of R^3.
For example, a set of images of a single plane moved only through translation gives all co-planar points.
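This degeneracy can be checked numerically (a sketch of mine, not from the lecture): center the collected 3-D points and look at the rank of the resulting matrix. Rank below 3 means the points lie in a plane or on a line, and calibration is ill-posed:

```python
import numpy as np

def span_dimension(points, tol=1e-9):
    """Dimension of the affine subspace spanned by a set of 3-D points:
    3 = general position, 2 = coplanar, 1 = collinear."""
    P = np.asarray(points, dtype=float)
    centered = P - P.mean(axis=0)         # remove the centroid
    return int(np.linalg.matrix_rank(centered, tol=tol))

# A planar target moved only by translation keeps all points in one plane...
coplanar = [(0, 0, 1), (1, 0, 1), (0, 1, 1), (2, 3, 1)]
# ...whereas rotating it out of the plane breaks the degeneracy.
general = [(0, 0, 1), (1, 0, 1), (0, 1, 1), (0, 0, 2)]
print(span_dimension(coplanar), span_dimension(general))  # 2 3
```

This is one motivation for the 3-plane target: its points span all of R^3 in a single view.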
Projective Calibration Equations
Global Planning
Camera positions estimated by relative measurements from the robot
This information is only as accurate as our knowledge about the robot
“Re-localizing” is our only way to reduce error
Distance / Accuracy Tradeoff
Returning to well-known cameras helps our position estimates, but causes the robot to travel farther than necessary.
An intelligent strategy is needed to manage this tradeoff.
Some partial results so far; this is work in progress.
Review
Using sensors in the environment, we can localize a robot
In order to use previously uncalibrated and unmapped cameras, a robot can carry out exploration and SLAM
This must only be done once, and then accurate localization is possible
Future Work
- Better global motion planning strategies
- Integrate other sensing (especially if the cameras have blind spots)
- Lose the targets?
- Other types of ubiquitous sensing (wireless, motion detection, etc.)