Page 1

Hardware and Software Systems for Control of Assistive Robotic Devices Using Point-of-Gaze Estimation

Christopher McMurrough
May 1, 2013

Supervising Committee:
Fillia Makedon, CSE
Frank Lewis, EE
Gian-Luca Mariottini, CSE
Vassilis Athitsos, CSE

Page 2

Overview

• Introduction

• Towards Mobile 3D Point-of-Gaze Estimation

– A Dataset for Head-Mounted Eye Tracker Evaluation and Benchmarking

– Low-Cost 3D Head Pose Tracking

– 3D Field-of-View Scanning

• Assistive Robotic Platforms

– An Assistive UGV Platform for Visually Impaired Users

– Omniwheel-based Holonomic Assistive Robotic Platforms

– A Development Platform for Non-Tactile Wheelchair Controls

• Applications

– 3D Mapping of Visual Saliency

– Intuitive Non-Tactile Wheelchair Controls

– Object of Interest Detection and Recognition

– Assisted Object Manipulation

• Discussion

– Open Problems

– Future Work

– Expected Outcomes

Page 3

Introduction

• Eye tracking has been shown to be very useful for people with severe physical disabilities (ALS, spinal injury, CP, etc.)

• Commercial devices are expensive and limited to 2D interaction with fixed displays

• Head-mounted eye trackers are more mobile, but are still limited to 2D interaction with monitors when used for real-time control

• This work focuses on three areas:

– Development of a mobile, 3D Point of Gaze (PoG) tracker

– Development of assistive robotic platforms able to utilize the control capabilities that 3D PoG tracking provides

– Demonstration of 3D PoG utility through real-world applications

Page 4

Eye Tracking for Communication and Control

Communication
– Going beyond “eye typing” and text-to-speech
– Eye gaze → user intent → dialog

Wheelchair control
– Look where you want to move

Co-Robot control
– Look at what you want to move

Page 5

Towards Mobile 3D Point-of-Gaze Estimation

Research tools, hardware, and software leading to 3D PoG detection

Devices intended to be used for HCI / control

Page 6

A Dataset for Head-Mounted Eye Tracker Evaluation and Benchmarking

The relatively new field of eye tracking lacks standard datasets and evaluation methods, so it is currently difficult to evaluate new algorithms. The published dataset contains 20 test subjects viewing different eye stimuli scenes. For each test subject, 6 videos of the eye are provided along with the 3D positions of the head, monitor, and visual targets during the sessions (acquired with motion capture equipment).

heracleia.uta.edu/eyetracking

C. D. McMurrough, V. Metsis, J. Rich, and F. Makedon, “An eye tracking dataset for point of gaze detection,” in Proceedings of the Symposium on Eye Tracking Research and Applications - ETRA ’12, 2012

C. D. McMurrough, V. Metsis, D. Kosmopoulos, I. Maglogiannis, and F. Makedon, “A dataset for point of gaze detection using head poses and eye images,” Journal on Multimodal User Interfaces, Apr. 2013
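The ground-truth head, monitor, and target positions in the dataset allow a tracker to be scored frame by frame. Below is a minimal benchmarking sketch in Python assuming a hypothetical session loader and gaze estimator (neither is part of the published dataset tools); the angular error metric is a common choice, not necessarily the one used in the papers above.

```python
import numpy as np

def angular_error_deg(gaze_dir, eye_pos, target_pos):
    """Angle (degrees) between a predicted gaze direction and the true
    eye-to-target direction given by the motion-capture ground truth."""
    true_dir = np.asarray(target_pos, dtype=float) - np.asarray(eye_pos, dtype=float)
    true_dir /= np.linalg.norm(true_dir)
    gaze_dir = np.asarray(gaze_dir, dtype=float)
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    cos_angle = np.clip(np.dot(gaze_dir, true_dir), -1.0, 1.0)
    return np.degrees(np.arccos(cos_angle))

# Hypothetical usage with placeholder helpers (not part of the dataset tools):
# errors = [angular_error_deg(estimate_gaze(f.eye_image), f.eye_pos, f.target_pos)
#           for f in load_session("subject01_video1")]
# print("mean angular error: %.2f deg" % np.mean(errors))
```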

Page 7

Low-Cost 3D Head Pose Tracking

C. D. McMurrough, J. Rich, V. Metsis, A. Nguyen, and F. Makedon, “Low-cost Head Position Tracking for Gaze Point Estimation,” in Proceedings of the 5th International Conference on PErvasive Technologies Related to Assistive Environments, 2012.

● Head-mounted eye trackers are generally calibrated relative to a fixed surface (monitor or head-mounted camera)
● If the transformation between the pupil camera and the calibration reference is not constant, calibration fails (the system is not tolerant to head motion relative to the world frame)
● Head pose tracking is necessary in both head-mounted and remote trackers, but is generally expensive
● Wiimote IR sensors were used to create a stereo camera and track position relative to fixed world LEDs
● The sensor bar costs ~$50 and can be attached to existing eye trackers
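A minimal sketch of the triangulation step such a two-camera IR rig requires, using OpenCV. The projection matrices and image coordinates below are illustrative placeholders; real values come from calibrating the Wiimote stereo pair, and the exact pipeline in the paper may differ.

```python
import numpy as np
import cv2

# Illustrative 3x4 projection matrices (P = K [R | t]) for the two IR cameras;
# real values come from stereo calibration of the Wiimote sensor pair.
P_LEFT = np.hstack([np.eye(3), np.zeros((3, 1))])
P_RIGHT = np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])  # ~10 cm baseline

def triangulate_led(px_left, px_right):
    """Recover the 3D position of one IR LED from its image coordinates in the
    left and right cameras (coordinates must match the projection matrices)."""
    pts_l = np.asarray(px_left, dtype=float).reshape(2, 1)
    pts_r = np.asarray(px_right, dtype=float).reshape(2, 1)
    point_h = cv2.triangulatePoints(P_LEFT, P_RIGHT, pts_l, pts_r)
    return (point_h[:3] / point_h[3]).ravel()  # homogeneous -> Euclidean

# With 3 or more fixed world LEDs triangulated this way in the head-mounted
# camera frame, the head pose follows from rigidly aligning those points to the
# known LED layout in the world frame (e.g., a least-squares / Kabsch fit).
```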

Page 8

3D Field-of-View Scanning

● Commercial eye trackers are limited to interaction with fixed displays (monitors), do not facilitate head movement, and cost $20K and up
● Pupil center / head pose aware eye glasses were developed in the lab using inexpensive sensors, at a cost of ~$200
● The glasses are able to track the user’s visual point-of-gaze (PoG) in 3D space
● 3D point cloud analysis allows identification of objects / points of interest

C. McMurrough, J. Rich, and C. Conly, “Multi-Modal Object of Interest Detection Using Eye Gaze and RGB-D Cameras,” in 4th Workshop on Eye Gaze in Intelligent Human Machine Interaction, 2012.

C. McMurrough, C. Conly, V. Athitsos, and F. Makedon, “3D Point of Gaze Estimation Using Head-Mounted RGB-D Cameras,” in ASSETS, 2012.

C. McMurrough, J. Rich, V. Metsis, A. Nguyen, and F. Makedon, “Low-cost Head Position Tracking for Gaze Point Estimation,” in Proceedings of the 5th International Conference on PErvasive Technologies Related to Assistive Environments - PETRA ’12, 2012.
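One way to obtain a 3D PoG from a calibrated gaze ray and the head-mounted RGB-D point cloud is to intersect the ray with the cloud and keep the nearest point along it. The sketch below is an illustrative, assumption-laden version of that idea; names and thresholds are not taken from the papers above.

```python
import numpy as np

def point_of_gaze_3d(cloud, ray_origin, ray_dir, max_ray_dist=0.05):
    """Return the cloud point nearest to the gaze ray, or None on a miss.

    cloud:      (N, 3) array of RGB-D points, same frame as the gaze ray
    ray_origin: eye position (3-vector)
    ray_dir:    gaze direction from the eye tracker calibration (3-vector)
    """
    pts = np.asarray(cloud, dtype=float)
    ray_dir = np.asarray(ray_dir, dtype=float)
    ray_dir = ray_dir / np.linalg.norm(ray_dir)
    rel = pts - np.asarray(ray_origin, dtype=float)
    t = rel @ ray_dir                               # signed distance along the ray
    t = np.clip(t, 0.0, None)                       # ignore points behind the eye
    perp = np.linalg.norm(rel - np.outer(t, ray_dir), axis=1)
    hits = np.where(perp < max_ray_dist)[0]         # points close enough to the ray
    if hits.size == 0:
        return None
    return pts[hits[np.argmin(t[hits])]]            # nearest hit along the ray
```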

Page 9

Robot Platforms

Robotic platforms developed for assistive scenarios

Platforms used to implement future applications

Page 10

An Assistive UGV Platform for Visually Impaired Users

Guide dogs (“seeing eye dogs”) cost roughly $30K after breeding and training. UGV technology can extend guide dog capability with location awareness, network connectivity, activity monitoring, etc. The robotic guide dog platform uses computer vision to detect paths in the environment and LIDAR scanners to avoid obstacles. GPS sensors provide location awareness and high-level guidance.

G. Galatas, C. McMurrough, G. L. Mariottini, and F. Makedon, “eyeDog: An Assistive-Guide Robot for the Visually Impaired,” in Proceedings of the 4th International Conference on PErvasive Technologies Related to Assistive Environments - PETRA ’11, 2011.
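As a rough illustration of the obstacle-avoidance side, the sketch below computes the closest LIDAR return inside a forward cone so the platform can stop before reaching an obstacle on the detected path. The field names mirror a typical planar laser scan message; the cone width and stopping distance are assumptions, and the actual eyeDog logic may differ.

```python
import math

def clearance_ahead(ranges, angle_min, angle_increment, half_width_rad=0.35):
    """Smallest valid LIDAR range inside a forward cone of +/- half_width_rad,
    used to decide whether the platform can keep following the detected path."""
    closest = float("inf")
    for i, r in enumerate(ranges):
        angle = angle_min + i * angle_increment
        if abs(angle) <= half_width_rad and r > 0.0 and not math.isinf(r):
            closest = min(closest, r)
    return closest

# Illustrative policy: stop if anything is within 0.5 m directly ahead.
# if clearance_ahead(scan.ranges, scan.angle_min, scan.angle_increment) < 0.5:
#     stop_platform()
```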

Page 11

Omniwheel-based Holonomic Assistive Robotic Platforms

C. D. McMurrough, H. Enotiades, S. Phan, S. Savoie, and F. Makedon, “Development of An Omniwheel-based Holonomic Robot Platform for Rough Terrain,” in Proceedings of the 6th International Conference on PErvasive Technologies Related to Assistive Environments - PETRA ’13, 2013.

● Holonomic robot platforms are able to move in any direction with any orientation
● These platforms are valuable for assistive applications because of their maneuverability
● Wheelchairs and mobile manipulators stand to benefit, but such platforms generally do not work well on non-smooth surfaces
● A novel, omniwheel-based design uses vertical suspension and low-cost subwheels to maintain contact with the ground on rough terrain
● Powerful wheelchair motors drive the 4 omniwheels; the system could be mounted with a chair or robot arm for assistive scenarios
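For a four-omniwheel holonomic base, the mapping from a body velocity command (vx, vy, ω) to individual wheel speeds is a standard projection onto each wheel's drive direction. The sketch below assumes wheels mounted symmetrically at 45°, 135°, 225°, and 315° around the body center; the geometry constants are illustrative and may not match the platform described in the paper.

```python
import numpy as np

WHEEL_ANGLES = np.radians([45.0, 135.0, 225.0, 315.0])  # mounting angles (illustrative)
L = 0.25  # distance from body center to each wheel [m] (illustrative)
R = 0.10  # wheel radius [m] (illustrative)

def body_to_wheel_speeds(vx, vy, omega):
    """Map a body velocity command (vx, vy in m/s, omega in rad/s) to the four
    wheel angular velocities (rad/s).

    Each omniwheel is assumed to drive tangentially to the circle of radius L,
    so its required ground speed is the body velocity projected onto that
    tangential direction plus the contribution of the rotation rate."""
    speeds = []
    for theta in WHEEL_ANGLES:
        tangential = -np.sin(theta) * vx + np.cos(theta) * vy + L * omega
        speeds.append(tangential / R)
    return np.array(speeds)

# Example: translate sideways at 0.3 m/s without rotating (holonomic motion).
# print(body_to_wheel_speeds(0.0, 0.3, 0.0))
```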

Page 12

A Development Platform for Non-Tactile Wheelchair Controls

● Intelligent wheelchairs are a combination of navigation and HCI
● A commercial wheelchair was modified to support rapid prototyping of advanced HCI modalities
● Continuous data streams (pupil tracking) are combined with discrete events (brain-computer interface and/or dialog)
● The wheelchair platform supports modalities in a “plug and play” fashion using ROS
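A minimal sketch of what one “plug and play” modality could look like under ROS: a node that converts the continuous gaze stream into velocity commands, gated by a discrete enable event from another interface such as a BCI. The topic names, message types, and gaze-to-velocity mapping are all assumptions, not the platform's actual interface.

```python
#!/usr/bin/env python
# Sketch of a "plug and play" gaze modality node (topic names are assumptions).
import rospy
from geometry_msgs.msg import PointStamped, Twist
from std_msgs.msg import Bool

class GazeDriveNode(object):
    def __init__(self):
        self.enabled = False
        self.cmd_pub = rospy.Publisher("cmd_vel", Twist, queue_size=1)
        rospy.Subscriber("gaze_point", PointStamped, self.on_gaze)  # continuous stream
        rospy.Subscriber("drive_enable", Bool, self.on_enable)      # discrete event (e.g., BCI)

    def on_enable(self, msg):
        self.enabled = msg.data

    def on_gaze(self, msg):
        if not self.enabled:
            return
        # Illustrative mapping, gaze point assumed in the chair's base frame
        # (x forward, y left): speed from distance ahead, turn rate from offset.
        cmd = Twist()
        cmd.linear.x = max(0.0, min(0.5, 0.2 * msg.point.x))
        cmd.angular.z = 0.5 * msg.point.y
        self.cmd_pub.publish(cmd)

if __name__ == "__main__":
    rospy.init_node("gaze_drive")
    GazeDriveNode()
    rospy.spin()
```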

Page 13

Applications

Real-world applications made possible by the preceding hardware and software contributions

Each application is currently under development.

Page 14

3D Mapping of Visual Saliency

● Salience describes how much a particular image region stands out from its neighbors (used in psychology and computer vision)
● Computer vision salience detection methods based on natural behavior exist
● 2D eye tracking “heat maps” have been used for marketing studies
● 3D salience mapping has not been practical due to the lack of mobile 3D localization and mapping of the PoG
● Our 3D PoG headset can provide this capability
● 3D saliency mapping will support work on automatic eye tracker calibration adjustments, as inherent “calibration points” in the environment can be detected (the eyes should hit these points more than others over time)
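With a stream of 3D PoG estimates available, a 3D saliency map can be accumulated much like a 2D heat map, by counting fixations per voxel. Below is a minimal sketch with an assumed voxel size; the representation actually used in this work may differ.

```python
import numpy as np
from collections import defaultdict

def accumulate_saliency(pog_points, voxel_size=0.05):
    """Count how often the 3D point of gaze falls inside each voxel.

    pog_points: iterable of 3D PoG estimates (one per frame) in the world frame
    voxel_size: edge length of the counting voxels in meters (assumed value)
    Returns a dict mapping integer voxel indices -> hit counts; the most-hit
    voxels mark salient regions / candidate 'calibration points'."""
    counts = defaultdict(int)
    for p in pog_points:
        idx = tuple(np.floor(np.asarray(p, dtype=float) / voxel_size).astype(int))
        counts[idx] += 1
    return counts

# Over a long recording, the top-ranked voxels can be compared against the
# mapped environment to detect drift and adjust the eye tracker calibration.
```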

Page 15

Intuitive Non-Tactile Wheelchair Controls

● Continued development of the wheelchair platform will include intelligent navigation and obstacle avoidance
● Using the 3D PoG from the scene point cloud, a position command can be extracted
● Direct position commands do not require “steering” using joysticks, eyes, etc.
● The command structure would be much more intuitive and require less cognitive load from the user
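A minimal sketch of the “look where you want to go” idea, assuming a ROS navigation stack with a move_base action server and a PoG already expressed in the map frame; neither assumption is confirmed by the slides.

```python
#!/usr/bin/env python
# Sketch: send the gazed-at position as a navigation goal instead of steering.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def send_gaze_goal(x, y, frame="map"):
    """Send the fixated floor position (x, y) in the map frame to move_base."""
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = frame
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0  # identity orientation, kept simple
    client.send_goal(goal)

# rospy.init_node("gaze_goal_sender")
# send_gaze_goal(2.0, 1.5)  # drive to the point the user fixated on
```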

Page 16

Object of Interest Detection and Recognition / Assisted Object Manipulation

Using the headset, we can identify an object of interest. Once the object is identified, we can reason about its size, shape, and location in 3D using point clouds. The identification pipeline can be used as a communication aid. Using Iterative Closest Point (ICP), we can identify the position of the object in another frame of reference (the robot’s). Using the PR2 manipulation pipeline, we can pick up the object and bring it to the user.

C. McMurrough, J. Rich, and C. Conly, “Multi-Modal Object of Interest Detection Using Eye Gaze and RGB-D Cameras,” in 4th Workshop on Eye Gaze in Intelligent Human Machine Interaction, 2012.
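Below is a minimal sketch of the ICP registration step, using Open3D as a stand-in point cloud library (the actual pipeline may use PCL or another implementation); the correspondence distance and inputs are illustrative.

```python
import numpy as np
import open3d as o3d  # stand-in point cloud library (recent versions with o3d.pipelines)

def locate_object_in_robot_frame(object_points_headset, object_points_robot,
                                 max_corr_dist=0.02):
    """Estimate the rigid transform aligning the object segment seen from the
    gaze headset with the same object seen by the robot's RGB-D camera,
    yielding the object pose in the robot's frame of reference."""
    src = o3d.geometry.PointCloud()
    src.points = o3d.utility.Vector3dVector(np.asarray(object_points_headset))
    dst = o3d.geometry.PointCloud()
    dst.points = o3d.utility.Vector3dVector(np.asarray(object_points_robot))
    result = o3d.pipelines.registration.registration_icp(
        src, dst, max_corr_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation  # 4x4 homogeneous transform

# The resulting pose can then be passed to the manipulation pipeline as the
# grasp target for pick-and-place.
```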

Page 17

Discussion

• Open Problems

– PoG headset is cumbersome

● New sensors, processors, and lens configuration will remedy this

– Automatic calibration

– Perception of user state (nystagmus, saccades, vestibulo-ocular motion)

• Future Work

– Formulation of true 3D calibration

● Working on a few different untested approaches

– 3D localization and mapping of visual salience

– 3D PoG wheelchair control and object manipulation demo

– Each remaining task will be published

● Goal 1: Submit 2 more papers to eye-tracking-related journals
● Goal 2: Submit at least 1 more paper to a top-tier robotics conference

• Expected Outcomes

– Commercialization of 3D PoG headset

– Currently investigating patent options

Page 18

Publications (M.S. 2009 - 2010)

● [1] F. L. Lewis, G. R. Hudas, C. K. Pang, M. B. Middleton, and C. McMurrough, “Discrete event command and control for networked teams with multiple missions,” in Proceedings of SPIE, 2009, vol. 7332, p. 73320V–73320V–15.

● [2] P. Ballal, A. Ramani, M. Middleton, C. McMurrough, A. Athamneh, W. Lee, C. Kwan, and F. Lewis, “Mechanical fault diagnosis using wireless sensor networks and a two-stage neural network classifier,” in 2009 IEEE Aerospace conference, 2009, pp. 1–10.

● [3] C. McMurrough, K. French, and D. B. Doman, “Developing a Real-Time MAV Flight Control System Test Bed Using NI LabVIEW and PXI,” in NI Week, 2009. *Winner, Intel Multicore Award

● [4] C. McMurrough, “Real Time Hardware And Software Systems For Micro Air Vehicle Flight Control Testing,” The University of Texas at Arlington, 2010.

Page 19

Publications (Ph.D. 2010 - 2013)

● [1] G. Galatas, G. Potamianos, D. Kosmopoulos, C. McMurrough, and F. Makedon, “Bilingual Corpus for AVASR using Multiple Sensors and Depth Information,” in Auditory-Visual Speech Processing 2011, 2011.

● [2] P. Doliotis, A. Stefan, C. McMurrough, D. Eckhard, and V. Athitsos, “Comparing gesture recognition accuracy using color and depth information,” in Proceedings of the 4th International Conference on PErvasive Technologies Related to Assistive Environments - PETRA ’11, 2011, p. 1.

● [3] G. Galatas, C. McMurrough, G. L. Mariottini, and F. Makedon, “eyeDog: An Assistive-Guide Robot for the Visually Impaired,” in Proceedings of the 4th International Conference on PErvasive Technologies Related to Assistive Environments - PETRA ’11, 2011, p. 1.

● [4] C. D. McMurrough, “Multi-modal interfaces for control of assistive robotic devices,” in Proceedings of the 14th ACM international conference on Multimodal interaction - ICMI ’12, 2012, p. 329.

● [5] C. McMurrough, J. Rich, C. Conly, V. Athitsos, and F. Makedon, “Multi-modal object of interest detection using eye gaze and RGB-D cameras,” in Proceedings of the 4th Workshop on Eye Gaze in Intelligent Human Machine Interaction - Gaze-In ’12, 2012, pp. 1–6.

● [6] C. D. McMurrough, V. Metsis, J. Rich, and F. Makedon, “An eye tracking dataset for point of gaze detection,” in Proceedings of the Symposium on Eye Tracking Research and Applications - ETRA ’12, 2012, p. 305.

● [7] C. D. McMurrough, V. Metsis, D. Kosmopoulos, I. Maglogiannis, and F. Makedon, “A dataset for point of gaze detection using head poses and eye images,” Journal on Multimodal User Interfaces, Apr. 2013.

● [8] C. McMurrough, C. Conly, V. Athitsos, and F. Makedon, “3D point of gaze estimation using head-mounted RGB-D cameras,” in Proceedings of the 14th international ACM SIGACCESS conference on Computers and accessibility - ASSETS ’12, 2012, p. 283.

● [9] C. McMurrough, A. Papangelis, and A. Boisselle, “A Survey of Assistive Devices for Cerebral Palsy Patients,” in Proceedings of the 5th International Conference on PErvasive Technologies Related to Assistive Environments - PETRA ’12, 2012.

● [10] C. McMurrough, J. Rich, V. Metsis, A. Nguyen, and F. Makedon, “Low-cost Head Position Tracking for Gaze Point Estimation,” in Proceedings of the 5th International Conference on PErvasive Technologies Related to Assistive Environments - PETRA ’12, 2012.

Page 20

Publications (accepted, to appear)

● [1] C. McMurrough, C. Conly, V. Athitsos, and F. Makedon, “A Mobile, Low-Cost Headset for 3D Point of Gaze Estimation,” International Journal of Advanced Computer Science, vol. 1, no. 1, pp. 2–5, 2013.

● [2] C. D. McMurrough, I. Ranatunga, A. Papangelis, D. O. Popa, and F. Makedon, “A Development and Evaluation Platform for Non-Tactile Power Wheelchair Controls,” in Proceedings of the 6th International Conference on PErvasive Technologies Related to Assistive Environments - PETRA ’13, 2013.

● [3] A. Papangelis and C. McMurrough, “An Assistive Object Manipulation System,” in Proceedings of the 6th International Conference on PErvasive Technologies Related to Assistive Environments - PETRA ’13, 2013.

● [4] C. D. McMurrough, H. Enotiades, S. Phan, S. Savoie, and F. Makedon, “Development of An Omniwheel-based Holonomic Robot Platform for Rough Terrain,” in Proceedings of the 6th International Conference on PErvasive Technologies Related to Assistive Environments - PETRA ’13, 2013.

Page 21

Thank you!

Christopher McMurrough

[email protected]

heracleia.uta.edu/~mcmurrough

Questions?