

Real-time gaze-tracking for freely-moving observers

Framework: libGaze

Equipment

Motivation

MPI FOR BIOLOGICAL CYBERNETICS

Experiment 2

Evaluation

Experiment 1

Sebastian Herholz, Heinrich H. Bülthoff


MAX-PLANCK-GESELLSCHAFT

Thomas G. Tanner, Roland W. Fleming

Summary

[System diagram: a libGaze application drives the display and communicates over the network with the eye-tracker (Eyelink2) and the head-tracker (Vicon). The libGaze API exposes the functions calibrate, validate, getCurrentGaze, waitForFixation, loadEyeTracker and loadHeadTracker.]

Many eye-tracking systems require the observer’s head to be static during tracking, which considerably reduces the range of potential applications, and may also lead to unnatural eye-movement behaviours if the field of view is large. To overcome these limitations, we have developed a system for tracking the gaze of freely-moving observers in real-time as they walk around, gesture and interact with large (wall-sized) display screens.

Goal: to see how accurate the combined system is when the subject is instructed to make head movements.

Task: as experiment 1, except that after each series of grids the subject was asked to rotate their head to a new position. (overall: 5 different head positions arranged in a vertical cross of 40x30deg)

Analysis: as experiment 1

Range: The range of the system describes the area in which the gaze of the observer can be tracked.

functions: offers various functions to calibrate and control the system

language: C / Python interface

modularity: completely wraps the underlying hardware components, allowing the user to switch between different hardware components without any major changes to the code of the application/experiment.
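The modularity idea above can be sketched as a thin interface layer: the experiment code talks only to a common tracker interface, so vendor backends can be swapped without touching it. This is an illustrative Python sketch; the class and method names (HeadTracker, get_pose, etc.) are our own, not the actual libGaze API.

```python
# Illustrative sketch of hardware-wrapping modularity (names are hypothetical,
# not the real libGaze API).
from abc import ABC, abstractmethod

class HeadTracker(ABC):
    """Common interface every head-tracker backend must implement."""
    @abstractmethod
    def get_pose(self):
        """Return (position_xyz, orientation) of the head in world coordinates."""

class ViconTracker(HeadTracker):
    def get_pose(self):
        # A real backend would query the Vicon SDK here; we return dummy data.
        return ((0.0, 1.6, 0.5), (0.0, 0.0, 0.0))

class ARTTracker(HeadTracker):
    def get_pose(self):
        # An ART-Track backend would parse the ART data stream instead.
        return ((0.1, 1.6, 0.4), (0.0, 0.0, 0.0))

def run_experiment(tracker: HeadTracker):
    # The experiment relies only on the interface, so swapping
    # Vicon for ART-Track requires no change in this function.
    position, orientation = tracker.get_pose()
    return position

print(run_experiment(ViconTracker()))  # swap in ARTTracker() without code changes
```

Because the experiment depends only on the abstract interface, replacing the Vicon with the ART-Track (as done between the MPI and Konstanz set-ups) is a one-line change at the call site.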

Inputs:
- position of the pupil on the camera images
- position and orientation of the head, from tracked markers

Output:
- gaze position on the screen
- gaze direction in the 3D environment
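At its geometric core, mapping the inputs above to the on-screen output is a ray-plane intersection: the tracked head pose anchors a gaze ray whose direction comes from the calibrated eye data, and that ray is intersected with the display plane. A minimal sketch of this step, assuming the gaze direction has already been transformed into world coordinates (the function and its signature are illustrative, not part of libGaze):

```python
# Hypothetical sketch: intersect a world-space gaze ray with the screen plane.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def gaze_on_screen(head_pos, gaze_dir, screen_point, screen_normal):
    """Intersect the gaze ray head_pos + t * gaze_dir with the screen plane
    defined by a point on the screen and the screen's normal vector."""
    denom = dot(gaze_dir, screen_normal)
    if abs(denom) < 1e-9:
        return None  # gaze ray is parallel to the screen
    t = dot([p - h for p, h in zip(screen_point, head_pos)], screen_normal) / denom
    if t < 0:
        return None  # screen lies behind the observer
    return tuple(h + t * d for h, d in zip(head_pos, gaze_dir))

# Observer 2 m in front of a screen lying in the z=0 plane, looking straight ahead:
hit = gaze_on_screen((0.5, 1.6, 2.0), (0.0, 0.0, -1.0), (0, 0, 0), (0, 0, 1))
```

The returned 3D point can then be converted to screen pixels from the known screen geometry; the gaze direction output is simply the ray direction itself.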

Eyetracker:
- Eyelink2 (SR-Research): 250-500Hz

Headtracker:
- Vicon: 120-180Hz
- ART-Track: 60Hz

Displays:
- Backprojection screen (MPI): 2.20x1.80m, 1280x1024px
- Display wall at University of Konstanz: 5.20x2.15m, 4640x1920px

Related work
- “Calibration algorithm for eyetracking with unrestricted head movement”, Jeffery S. Johnson, Li Liu, Geb Thomas, John P. Spencer, University of Iowa
- “Computation of gaze orientation under unrestrained head movements”, Renaud Ronsse, Olivier White, Philippe Lefèvre, Université de Liège

Overall latency (worst case): 10 + 1 <= 11ms

libGaze: a software framework which combines data from eye- and head-tracking systems to calculate the current gaze position in real-time.

Goal: compare the accuracy of our combined system with standard eye-tracking, which uses a chin-rest to suppress head movements.

Task: The subject was instructed to fixate a series of targets. These targets were arranged in 50x40deg rectangular grids.

Conditions:
- fixed: the subject sat on a chair with the head stabilized using a chin-rest
- free: the subject sat on a chair in a comfortable position and was asked to hold their head stationary

Analysis: To measure the accuracy under each condition, the angular error between the true gaze position and our estimated gaze position was calculated. Only one calibration, performed at the beginning, was used to process all data of the experiment. A session lasted approx. 30 mins.
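The angular-error measure used in the analysis can be computed directly from the true and estimated gaze directions as the angle between the two vectors. A small illustrative helper (our own, not a libGaze function):

```python
# Illustrative computation of angular error between two gaze directions.
import math

def angular_error_deg(v_true, v_est):
    """Angle in degrees between the true and estimated gaze direction vectors."""
    dot = sum(a * b for a, b in zip(v_true, v_est))
    norm = math.sqrt(sum(a * a for a in v_true)) * math.sqrt(sum(b * b for b in v_est))
    # Clamp to guard against floating-point values slightly outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Example: an estimated direction tilted 1 degree away from the true one.
one_deg = math.radians(1.0)
err = angular_error_deg((0.0, 0.0, 1.0),
                        (math.sin(one_deg), 0.0, math.cos(one_deg)))  # err ≈ 1.0 deg
```

The median of this error over all fixation targets gives the per-condition accuracy figures reported in the evaluation.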

Luiz H. Canto-Pereira

eye-tracker range: the area on the horizontal and vertical axes of the observer's field of view.
- Eyelink2: 50x40deg

head-tracker range: the volume in which the observer can move around and still be captured by the tracking system.
- Konstanz (ART): 6dof at 6x3x4m
- MPI (Vicon): 6dof at 1.5x1.5x1.5m

References

The system flexibly combines an off-the-shelf video-based eyetracker with an infra-red motion-capture system. Our primary aims were to produce a system with:

- low latency: to enable real-time interaction (e.g. gaze-contingent display)

- high accuracy: for use in psychophysical experiments or as input device

- modular design: each hardware component should be easily replaceable by an alternative technology

We presented a system which combines video-based eyetracking with a motion-capture system to enable real-time gaze tracking in front of wall-sized displays. The system gives the observer the opportunity to move in a free and natural manner. We evaluated the system and found it to be fast and accurate enough to support gaze-contingent applications. When the system reaches a stable state, we intend to make it available to the scientific community.

[Results figure: median angular errors of 0.74, 0.85, 1.09 and 1.12 deg across the experimental conditions.]

network latency (<=1ms) included

Latency: the total latency of the combined system depends on the latency of each component. For the set-up we used, latency was estimated as follows:
- Eyelink2: 2ms (optimal conditions), 4ms (normal conditions)
- Vicon: <=10ms
- libGaze: <=1ms
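The worst-case figure of 11ms is consistent with treating the two trackers as streaming in parallel, so only the slower one counts, followed by the serial libGaze processing. A sketch under that assumption (the parallel-vs-serial breakdown is our reading of the numbers, not stated explicitly on the poster):

```python
# Assumption (ours): eye- and head-tracker stream concurrently, so the
# worst-case input latency is the slower of the two, plus the serial
# libGaze processing time. The network latency (<= 1ms) is counted
# within the head-tracker figure here.
eyelink_ms = 4   # normal conditions
vicon_ms = 10    # includes <= 1ms network latency
libgaze_ms = 1

worst_case_ms = max(eyelink_ms, vicon_ms) + libgaze_ms
print(worst_case_ms)  # 11
```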

• D.M. Stampe: Heuristic filtering and reliable calibration methods for video-based pupil-tracking systems (1993)
• Jeffery S. Johnson, Li Liu, Geb Thomas, John P. Spencer: Calibration algorithm for eyetracking with unrestricted head movement. Behavior Research Methods 39 (1), p. 123-132 (2007)
• Renaud Ronsse, Olivier White, Philippe Lefèvre: Computation of gaze orientation under unrestrained head movements. Journal of Neuroscience Methods 159, p. 158-169 (2007)

Max Planck Institute for Biological Cybernetics, Tübingen
contact: [email protected]