
Page 1:

Pointing Based Object Localization
CS223b Final Project

Stanford University Bio-Robotics Lab
Paul Nangeroni & Ashley Wellman

March 17, 2008

Page 2:

( Motivation )

• Present robotic object detection relies on dense stereo mapping of 3D environments

• Pointing-based object localization is an intuitive interface for improving the accuracy of object detectors

• Project represents several advances over prior art
  – Uses the actual human line of sight (eye through fingertip)
  – Works against cluttered backgrounds
  – Detects objects in free space


Page 3:

( Approach: Face Detection )

Page 4:

( Approach: Stereopsis )

Step 1: Warp the images along the epipolar lines of the eye and fingertip found in the left image

Step 2: Use normalized cross-correlation (NCC) along the epipolar lines to find the matching eye and fingertip in the right image

Step 3: Project the eye and fingertip locations into 3D

Step 4: Resolve projection errors via least squares

Step 5: Create the line-of-sight vector; the target object is known to lie on that line (see the sketch below)
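A minimal sketch of Steps 2-5, assuming rectified grayscale images (horizontal epipolar lines), calibrated 3x4 projection matrices P_left and P_right, and an OpenCV/NumPy pipeline. The function names, patch size, and disparity range are illustrative assumptions, not the authors' original implementation.

```python
import cv2
import numpy as np

def ncc_match_along_row(left_img, right_img, pt_left, patch=11, max_disp=120):
    """Step 2: best NCC match for a left-image point, searched along the
    corresponding row of the rectified right image."""
    x, y = int(pt_left[0]), int(pt_left[1])
    r = patch // 2
    template = left_img[y - r:y + r + 1, x - r:x + r + 1]
    x0 = max(r, x - max_disp)                 # search only positive disparities
    strip = right_img[y - r:y + r + 1, x0 - r:x + r + 1]
    scores = cv2.matchTemplate(strip, template, cv2.TM_CCOEFF_NORMED)
    best = int(np.argmax(scores))
    return np.array([x0 + best, y], dtype=np.float32)

def triangulate(P_left, P_right, pt_left, pt_right):
    """Steps 3-4: least-squares triangulation of one correspondence into 3D."""
    X_h = cv2.triangulatePoints(P_left, P_right,
                                pt_left.reshape(2, 1), pt_right.reshape(2, 1))
    return (X_h[:3] / X_h[3]).ravel()

def line_of_sight(eye_3d, finger_3d):
    """Step 5: unit ray from the eye through the fingertip; the target object
    lies somewhere on this ray."""
    d = finger_3d - eye_3d
    return eye_3d, d / np.linalg.norm(d)
```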


Page 5:

( Approach: Stereopsis )

Step 6: Reproject the actual eye and fingertip positions back into 2D

Step 7: Rotate the images along the line of sight and create a slice from the fingertip to the edge of the image

Step 8: Apply SIFT and RANSAC to the slice

Step 9: Locate the target object by selecting the match point closest to the centerline of the slice

Step 10: Project that point into 3D and find the closest point along the known line of sight; this point is the location of the target object (see the sketch below)
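A compact sketch of Steps 8-10, again assuming an OpenCV/NumPy pipeline. The helper names, ratio test, and RANSAC threshold are illustrative assumptions rather than the authors' code; the slice images are assumed to be oriented so the line of sight is the horizontal centerline.

```python
import cv2
import numpy as np

def ransac_matches(slice_left, slice_right, ratio=0.75):
    """Step 8: SIFT matches between the two slices, kept only if they are
    inliers of a RANSAC-estimated homography."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(slice_left, None)
    kp2, des2 = sift.detectAndCompute(slice_right, None)
    raw = cv2.BFMatcher().knnMatch(des1, des2, k=2)
    good = [m for m, n in raw if m.distance < ratio * n.distance]
    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
    _, mask = cv2.findHomography(pts1, pts2, cv2.RANSAC, 5.0)
    inliers = mask.ravel().astype(bool)
    return pts1[inliers], pts2[inliers]

def nearest_to_centerline(pts, centerline_y):
    """Step 9: keep the inlier closest to the slice centerline (a horizontal
    line at centerline_y in slice coordinates)."""
    return pts[np.argmin(np.abs(pts[:, 1] - centerline_y))]

def closest_point_on_ray(origin, direction, point_3d):
    """Step 10: minimum-norm projection of the triangulated point onto the
    eye-fingertip ray; taken as the target object's location."""
    t = np.dot(point_3d - origin, direction)   # direction assumed unit-length
    return origin + t * direction
```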

[Figure: stereo views annotated with the NCC points, reprojected points, SIFT matches, RANSAC matches, the selected RANSAC point, the target object, and the minimum-norm point from the line of sight]


Page 6:

( Results + Future Work )

• Conclusions
  – World coordinates output from stereo are accurate to within 3 cm at a range of 2.5 m
  – Face and finger detection needs more training
  – Object localization is sensitive to background clutter
  – Object location often falls on an edge or corner rather than the centroid of the object itself

• Future Work
  – Object location will be used to center a high-resolution close-up for improved accuracy and efficiency
  – A laser will highlight the target object before the robotic arm attempts to grasp it


Page 7:

( Breakdown of work )

• Paul (60%) – Stereo Calibration, Stereopsis, Object Localization

• Ashley (40%) – Eye Detection, Fingertip Detection
