Pose Estimation (Active Grasp)
2010. 3. 16. Tue.
Kim Kyungkoo
Slide 2
Contents
- Introduction
- Pose Estimation
- Object Modeling with Features
- Real-Time Pose Estimation
- Demo
- Future Works
Slide 3
Introduction
- Importance of object recognition and pose estimation
Slide 4
Pose Estimation: Problem Definition
The robot knows:
- The target object to grasp
- The corresponding 3D model
- The grasp point on the 3D model
But it does not know:
- The grasp point in the real environment
Therefore, orientation matching between the object and its 3D model is needed.
Slide 5
Pose Estimation: System Overview
- Object modeling: stereo camera → live video → tracking / reconstruction → partial model → model features → 3D model of the object
- Automatic pose estimation: stereo camera → live video → feature matching → transformation → pose estimation
Slide 6
Object Modeling with Features
Object modeling process (pipeline diagram):
- Captured image set: 2D images and disparity images from the stereo camera
- Depth image reconstruction: disparity image → 3D depth image
- Bi-layer segmentation, then object segmentation → object depth image
- SURF feature tracking over the accumulated image set
- Homogeneous matrix calculation between frames
- Merging: merged object / foreground depth image from the merged image set
Slide 7
Object Modeling with Features
Object feature list creation during the modeling process:
- Features are extracted with the SURF algorithm
- Each feature consists of a 3D coordinate and a descriptor
- Features extracted from the object region of each frame are stored
- All features of the first image in the stream are stored directly
- As the system extracts features from each subsequent image, it accumulates them into the previous feature list:
  - Matched against an existing 3D feature? YES: add the feature descriptor under the same ID
  - NO: create a new ID for the corresponding point
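The accumulation loop above can be sketched as follows. This is a minimal illustration, not the authors' implementation; the brute-force descriptor search and the distance threshold `MATCH_THRESH` are assumptions not stated in the slides.

```python
import numpy as np

MATCH_THRESH = 0.3  # hypothetical descriptor-distance threshold


class FeatureList:
    """Accumulated 3D feature list built up frame by frame."""

    def __init__(self):
        self.next_id = 0
        self.points = {}       # feature ID -> 3D coordinate
        self.descriptors = {}  # feature ID -> descriptors seen so far

    def add_frame(self, features):
        """features: list of (3D point, descriptor) pairs from one frame."""
        for xyz, desc in features:
            match_id = self._match(desc)
            if match_id is not None:
                # Matched: add the descriptor under the same ID.
                self.descriptors[match_id].append(desc)
            else:
                # Not matched: create a new ID for this point.
                self.points[self.next_id] = np.asarray(xyz, float)
                self.descriptors[self.next_id] = [desc]
                self.next_id += 1

    def _match(self, desc):
        """Return the ID of the closest stored feature, or None."""
        best_id, best_dist = None, MATCH_THRESH
        for fid, descs in self.descriptors.items():
            d = min(np.linalg.norm(desc - d0) for d0 in descs)
            if d < best_dist:
                best_id, best_dist = fid, d
        return best_id
```

A feature seen again in a later frame thus keeps its ID and gains an extra descriptor, which is what lets the list tolerate viewpoint changes.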
Slide 8
Feature List Creation on an Object
- Example of a feature list
Slide 9
Real-Time Pose Estimation
- Feature matching between the feature list of an object and the features of the current image
  - Uses the SURF feature extraction and matching algorithm
  - Each feature consists of a 3D coordinate and a descriptor
  - Result: a set of 3D corresponding points
- Transformation
  - The 3D model of the object is transformed to fit the current image using the 3D corresponding points
  - Method?
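One way to obtain the 3D corresponding points from descriptor matching is a nearest-neighbour search with a Lowe-style ratio test. The slides do not specify the acceptance criterion, so the ratio test below is an assumption:

```python
import numpy as np


def match_features(model_feats, image_feats, ratio=0.8):
    """model_feats / image_feats: lists of (3D point, descriptor) pairs.
    Returns (model point, image point) pairs for accepted matches."""
    pairs = []
    for m_xyz, m_desc in model_feats:
        # Distance from this model descriptor to every image descriptor.
        dists = [np.linalg.norm(m_desc - i_desc) for _, i_desc in image_feats]
        order = np.argsort(dists)
        best = order[0]
        # Ratio test: accept only if clearly better than the runner-up.
        if len(order) < 2 or dists[best] < ratio * dists[order[1]]:
            pairs.append((m_xyz, image_feats[best][0]))
    return pairs
```

Each accepted pair couples a 3D point on the model with a 3D point reconstructed from the current stereo view, which is exactly the input the transformation step needs.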
Slide 10
Pose Estimation of the Current View
Transformation of the 3D model for pose estimation:
- Uses three corresponding points
- The best transformation matrix is calculated from three corresponding points using the RANSAC algorithm
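The slides do not spell out how a transformation matrix is computed from three correspondences. A standard choice is the SVD-based (Kabsch/Procrustes) rigid-transform solution, sketched here as an assumption:

```python
import numpy as np


def rigid_transform_3pts(P, Q):
    """P, Q: (3, 3) arrays of corresponding 3D points (one per row).
    Returns a 4x4 homogeneous matrix H with Q ≈ H @ P (homogeneous)."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    # Covariance of the centred point sets.
    C = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(C)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cq - R @ cp
    H = np.eye(4)
    H[:3, :3], H[:3, 3] = R, t
    return H
```

Three non-collinear points are the minimum needed to fix a rigid transform in 3D, which is why the RANSAC hypothesis uses exactly three.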
Slide 11
Pose Estimation of the Current View
Transformation of the 3D model for pose estimation (figure: homogeneous transformation matrix H between the model and the current view)
Slide 12
Pose Estimation of the Current View
Transformation of the 3D model for pose estimation (figure: three randomly chosen corresponding points; translations T1 and T2 bring each point set to the origin (0, 0, 0))
Slide 13
Pose Estimation of the Current View
Transformation of the 3D model for pose estimation (figure: rotation R1 aligns two of the corresponding points)
Slide 14
Pose Estimation of the Current View
Transformation of the 3D model for pose estimation (figure: scaling at a corresponding point)
Slide 15
Pose Estimation of the Current View
Transformation of the 3D model for pose estimation (figure: alignment of the remaining corresponding point)
Slide 16
Pose Estimation of the Current View
Transformation of the 3D model for pose estimation:
1. Choose three corresponding points at random
2. Calculate a transformation matrix from them
3. Transform all corresponding points of the model using the transformation matrix
4. Sum the distances between each pair of corresponding points
5. Repeat steps 1 to 4
6. Select the transformation matrix with the minimum distance sum
7. Transform all points of the object model using the inverse of the transformation matrix selected in step 6
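The selection loop above can be sketched as a RANSAC-style search: sample random triples, fit a rigid transform to each, and keep the hypothesis with the smallest summed residual. The SVD-based estimator and the iteration count are illustrative assumptions, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)


def estimate_rigid(P, Q):
    """SVD-based rigid transform fitting corresponding points (rows)."""
    cp, cq = P.mean(0), Q.mean(0)
    U, _, Vt = np.linalg.svd((P - cp).T @ (Q - cq))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    H = np.eye(4)
    H[:3, :3], H[:3, 3] = R, cq - R @ cp
    return H


def best_transform(model_pts, image_pts, iters=100):
    """Steps 1-6 above: keep the hypothesis with minimum distance sum."""
    n = len(model_pts)
    best_H, best_err = None, np.inf
    Mh = np.hstack([model_pts, np.ones((n, 1))])  # homogeneous model points
    for _ in range(iters):
        idx = rng.choice(n, 3, replace=False)          # 1. random triple
        H = estimate_rigid(model_pts[idx], image_pts[idx])  # 2. fit matrix
        moved = (H @ Mh.T).T[:, :3]                    # 3. transform all
        err = np.linalg.norm(moved - image_pts, axis=1).sum()  # 4. sum
        if err < best_err:                             # 5-6. keep minimum
            best_H, best_err = H, err
    return best_H, best_err
```

Step 7 then applies the inverse of the selected matrix to the object model's points, mapping the model into the current view.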
Slide 17
Demo
- Modeling process
Slide 18
Demo
- Pose estimation
Slide 19
Future Works
- Accuracy
- Transformation
- Feature list