
Behavior Research Methods
https://doi.org/10.3758/s13428-020-01414-3

Is apparent fixational drift in eye-tracking data due to filters or eyeball rotation?

Diederick C. Niehorster1 · Raimondas Zemblys2 · Kenneth Holmqvist3,4,5

© The Author(s) 2020

Abstract
Eye trackers are sometimes used to study the miniature eye movements such as drift that occur while observers fixate a static location on a screen. Specifically, analysis of such eye-tracking data can be performed by examining the temporal spectrum composition of the recorded gaze position signal, allowing one to assess its color. However, not only rotations of the eyeball but also filters in the eye tracker may affect the signal's spectral color. Here, we therefore ask whether colored, as opposed to white, signal dynamics in eye-tracking recordings reflect fixational eye movements, or whether they are instead largely due to filters. We recorded gaze position data with five eye trackers from four pairs of human eyes performing fixation sequences, and also from artificial eyes. We examined the spectral color of the gaze position signals produced by the eye trackers, both with their filters switched on, and for unfiltered data. We found that while filtered data recorded from both human and artificial eyes were colored for all eye trackers, for most eye trackers the signal was white when examining both unfiltered human and unfiltered artificial eye data. These results suggest that color in the eye-movement recordings was due to filters for all eye trackers except the most precise eye tracker, where it may partly reflect fixational eye movements. As such, researchers studying fixational eye movements should be careful to examine the properties of the filters in their eye tracker to ensure they are studying eyeball rotation and not filter properties.

Keywords Eye tracking · Fixational eye movements · Drift · Power spectrum · Signal color

Introduction

When humans fixate a static object to stabilize its retinal image (see Hessels et al., 2018), their eyes still make small

Diederick C. Niehorster and Kenneth Holmqvist contributed equally to this work.

Diederick C. Niehorster
diederick [email protected]

Kenneth [email protected]

1 Lund University Humanities Lab and Department of Psychology, Lund University, Lund, Sweden

2 Siauliai University, Siauliai, Lithuania

3 Institute of Psychology, Nicolaus Copernicus University in Torun, Torun, Poland

4 Department of Psychology, Regensburg University, Regensburg, Germany

5 Department of Computer Science and Informatics, University of the Free State, Bloemfontein, South Africa

movements (Ratliff & Riggs, 1950; Ditchburn & Ginsborg, 1953; Collewijn & Kowler, 2008)—termed fixational eye movements. These consist of microsaccades, fixational drift, and tremor (Martinez-Conde et al., 2004; Rolfs, 2009; Rucci & Poletti, 2015). In recent years, the study of these fixational eye movements has elucidated their myriad functional and perceptual consequences (e.g., Ditchburn et al., 1959; Kuang et al., 2012; Rucci et al., 2018; Engbert, 2006; Martinez-Conde et al., 2013).

Figure 1 shows two example segments of eye-tracking data during fixations. In this figure, the recorded gaze position signal during the fixations appears unstable despite the participant attempting to keep their gaze stable at a certain location on the screen. In eye-movement data, these fluctuations are thought to arise from at least two sources: 1) the measurement device, and 2) rotations of the eyeball itself—the fixational eye movements. In this paper, we will refer to these two signal components as measurement noise and fixational eye movements, respectively. While the fixational eye movements are of interest to some researchers, the measurement noise originating from the eye tracker is a potential problem, as it may obscure eye movements of interest.



Fig. 1 Gaze position data example. Example segments showing gaze position signals recorded at 1000 Hz during two fixations from two different participants with an SR EyeLink 1000 Plus, with its filters either turned on or turned off. The sudden changes in gaze position at 1250 ms and 1500 ms in both signals are likely microsaccades, whereas drift is visible throughout most of the rest of the signals.

Researchers interested in fixational eye movements such as microsaccades and drift should ensure that the magnitude of measurement noise in their eye-tracker's output is sufficiently low so as to not obscure these eye movements (Ko et al., 2016). For this reason, such research is usually conducted with the video-based eye trackers that provide the lowest noise levels in their class, such as the various SR Research EyeLinks (e.g., Engbert & Mergenthaler, 2006; Scholes et al., 2015; Nystrom et al., 2017) and recently the Tobii Spectrum (Nystrom et al., in press). Note also that it has recently been questioned whether video-based eye trackers are suitable for microsaccade research (Holmqvist & Blignaut, 2020). Other researchers interested in fixational eye movements use different measurement techniques that provide even better data quality (lower measurement noise levels), such as Dual-Purkinje eye trackers (Kuang et al., 2012; Horowitz et al., 2007), scleral search coils (McCamy et al., 2015; Ko et al., 2016) and various scanning laser ophthalmoscopes (Stevenson et al., 2010; Sheehy et al., 2012; Bowers et al., 2019).

Even with these best-available video-based eye trackers, it is not straightforward to determine what the source of the fluctuations in the eye movement signal is, because the eye movements of interest to fixational eye-movement researchers can have magnitudes close to, or even below, the noise floor of the eye tracker. One may therefore resort to knowledge about the dynamics of physiological movements and measurement noise to characterize the content and possible origin of a gaze position signal (e.g., Findlay, 1971; Eizenman et al., 1985; Coey et al., 2012; Bowers et al., 2019). While formal analyses of the dynamics of the gaze position signal usually examine its spectral composition (ibid), we think there is value in training the scientist's visual pattern recognizer to discriminate between the different types of signals. Here we will therefore first examine a set of example gaze position signals collected from human and artificial eyes, before introducing the analysis techniques that will be used in this paper.

Oculomotor drift

Different signal dynamics are readily seen when examining gaze position data recorded from humans (Fig. 2a) or from artificial eyes (Fig. 2b). While some eye trackers produce data that show large sample-to-sample steps and look essentially randomly distributed around a central point (Tobii TX300), data from other eye trackers (SR EyeLink 1000 Plus and SMI RED250) show smoother trends that appear more similar to a random walk (see also Blignaut & Beelders, 2012, who provide the visual diagnosis that these smooth gaze position signals look like "ant trails"). Since oculomotor drift looks like the smoother signals in Fig. 2 (e.g., Ko et al., 2016; and Engbert et al., 2011), when seeing such signals in the eye tracker with the lowest noise magnitude among the three shown in the plot (the SR EyeLink 1000 Plus), it is tempting to conclude that this eye tracker has a low enough noise magnitude to render oculomotor drift visible in the gaze position signal. That these smooth signals are not visible in an eye tracker with higher noise magnitude, the Tobii TX300, may be thought to strengthen this conclusion.

However, as Blignaut and Beelders (2012; see also Bedell & Stevenson, 2013) already warned, caution is warranted in drawing such conclusions, as smooth gaze position signals could also be produced by noise suppression systems such as filters in the eye tracker's software or hardware. This call for cautious interpretation receives further weight from finding such smooth signals also in another tracker than the EyeLink, viz. the SMI RED250. As the SMI RED250's noise magnitude is at least as large as that of the Tobii TX300, it should be wondered whether the smooth gaze position signals produced by this eye tracker reflect fixational eye movements.

Moreover, as seen in Fig. 2b, these smooth signals also exist in data recorded with static artificial eyes on both the low-noise EyeLink and the higher-noise SMI RED250. During these recordings, we took care to ensure that any physical movement of these artificial eyes relative to the eye tracker (e.g., due to vibrations of the table on which the eye tracker was placed) was likely well below the noise floor of the eye trackers and thus undetectable in the eye trackers' output. As such, smooth gaze position signals in these recordings could not be due to any physical movement of the tracked artificial eyes. That smooth gaze position signals which appear similar to those recorded from human eyes (if at smaller magnitude) are nonetheless seen in these



Fig. 2 Gaze position data examples. For three trackers, example 200-ms segments from five fixations recorded from humans (left) and example 200-ms segments of data recorded with an artificial eye (right). Compared to the 300-Hz Tobii data in the middle row, the 1000-Hz EyeLink and 250-Hz SMI data in the top and bottom rows look smoother. Note that the scale at which data is visualized differs for each eye tracker and panel to make differences between the signals easier to see. The scale of the signals is therefore indicated for each eye tracker's data. Note also that the scale of the data recorded from human eyes is about three times larger than that for the data recorded from artificial eyes.

recordings therefore adds further support to the idea that finding smooth gaze position signals in an eye tracker's output does not necessarily entail that oculomotor drift is being measured. Instead, the smoothness may be due to filters in the eye tracker. These inconsistencies between eye trackers in whether recorded gaze position signals are smooth or not highlight that it is important to uncover whether the smooth signals reflect true rotations of the eye or whether these signals are artifactually generated by filters in the eye tracker.

Spectral analyses

As discussed, gaze position signals can range from smooth to spiky in appearance. These smooth and spiky gaze position signal types in fact lie along a continuum that can be described by a single parameter, the spectral color of the signal. The spectral color of a signal is a description of the power in the signal at different temporal frequencies. There is a long history of applying frequency analyses to eye-tracking data in general (e.g., Stark et al., 1961; Campbell et al., 1959; Bahill et al., 1981) and for fixational eye movements in specific (e.g., Findlay, 1971; Eizenman et al., 1985; Stevenson et al., 2010; Coey et al., 2012; Sheehy et al., 2012; Ko et al., 2016; Bowers et al., 2019). Here, we will provide a brief discussion of how the spectral color of gaze position signals is interpreted; for a more detailed discussion of spectral analyses of eye-tracking data, see Niehorster et al. (2020c).

As an example, the temporal spectral decomposition ofthe gaze position signals of Fig. 1 is shown in Fig. 3.

Specifically, Fig. 3 shows amplitude spectra, allowing one to directly read off the amplitude of gaze movement at a given frequency. As can be seen, for the fixation recorded with the EyeLink's filter switched off (yellow line), the spectrum appears to consist of two distinct segments, i.e., a segment at the lower frequencies that slopes downward and, after approximately 100 Hz, a flattened-out segment. Both the slopes in the amplitude spectrum and the location of the flattening-out are consistent with the above-cited literature.

In the literature, the downward-sloping segment is often interpreted as due to fixational eye movements. A signal with a straight-line downward slope as spectrum is said to exhibit 1/f dynamics, because the shape of the spectrum is

Fig. 3 Amplitude spectrum example. Amplitude spectra for the fixational eye movement traces shown in Fig. 1, computed using the multitaper method (Thomson, 1982; Babadi & Brown, 2014). These data were recorded with an SR EyeLink 1000 Plus at 1000 Hz, with its filters either turned on or turned off.



characterized by the equation 1/f^α. Here, f is frequency and α is a scaling exponent reflecting the slope of the line, which characterizes the scaling of the signal's power with frequency. In this case, the power spectral slope of this segment is close to 6 dB/octave (i.e., 1/f^2), and reflects a signal that looks smooth. Such scaling is what would be expected for the random walk-like nature of ocular drift (Cornsweet, 1956; Findlay, 1971; Burak et al., 2010; Engbert et al., 2011; Nystrom et al., in press).
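
The contrast between white (α ≈ 0) and random walk (α ≈ 2) scaling can be illustrated numerically. The sketch below (in Python with NumPy/SciPy rather than the MATLAB used in this paper; names are our own illustration, not the authors' code) estimates α by fitting a line to the periodogram in log-log space, restricted to the band below 100 Hz as is done for the fits reported later in the paper.

```python
import numpy as np
from scipy import signal

def spectral_slope(x, fs, fmax=100):
    """Estimate alpha in P(f) ~ 1/f^alpha from a log-log line fit
    over frequencies up to fmax (skipping the DC bin)."""
    f, p = signal.periodogram(x, fs)
    keep = (f > 0) & (f <= fmax)
    slope, _ = np.polyfit(np.log10(f[keep]), np.log10(p[keep]), 1)
    return -slope

rng = np.random.default_rng(0)
fs = 1000                                # Hz, as for the EyeLink recordings
white = rng.standard_normal(10 * fs)     # white noise: flat spectrum, alpha ~ 0
walk = np.cumsum(white)                  # random walk: 1/f^2 spectrum, alpha ~ 2

print(spectral_slope(white, fs))
print(spectral_slope(walk, fs))
```

Integrating white noise (the cumulative sum) is a minimal model of drift's random-walk dynamics; its fitted slope comes out near 2, while the white input stays near 0.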

Such signals with non-zero spectral slope are referred to as "colored". In contrast to this colored segment of the power spectrum stands the flat segment observed at higher frequencies (Fig. 3). Such flat power spectra, where signal power is constant over frequency, are called white signals, and appear random and more spiky. For eye-tracking signals, such signal dynamics are often attributed to measurement noise. Since fixational eye movements have a bandwidth of up to about 100 Hz (e.g., Findlay, 1971; Ko et al., 2016; Bowers et al., 2019), it is expected that the spectrum at higher frequencies only reflects such measurement noise. Note that if the measurement noise of an eye tracker is too large, it will drown out the 1/f component of the gaze position signal that is due to fixational eye movements, indicating that the eye tracker is not sensitive enough to measure the eye movements of interest.

An understanding of the noise characteristics of the measurement device is critical when studying gaze dynamics by means of examinations of signal color, because it must be ascertained that the source of the 1/f dynamics in the gaze position signals is the participant's eye movements, and not the measurement noise produced by the eye tracker or a filter in the eye tracker. As can be seen by contrasting the two spectra in Fig. 3, the filters of the EyeLink strongly change the shape of the measurement noise part of the signal's spectrum (i.e., beyond 100 Hz). Indeed, while measurement noise in an eye tracker is likely white if each output sample is processed independently, 1/f power law behavior in the signal can be introduced not only by the dynamics of a rotating human eye but also by applying temporal filters to the recorded gaze signals. This was, for instance, shown by Coey et al. (2012), who recorded an artificial eye (thus exhibiting no fixational eye movements) with an eye tracker and examined the color of the resulting gaze position signal. They reported that the signal in their ASL eye tracker was indeed white, as would be expected for measurement noise, when the eye tracker's averaging filter was switched off, while switching on this noise suppression system yielded colored gaze position signals.
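
The effect Coey et al. (2012) observed is easy to reproduce synthetically: temporally filtering a white signal gives it color. Below is a hedged Python sketch (our own illustration; the filters in commercial eye trackers are proprietary and will differ) in which a simple one-pole recursive smoother stands in for a noise suppression system.

```python
import numpy as np
from scipy import signal

def spectral_slope(x, fs, fmax=100):
    """Fit alpha in P(f) ~ 1/f^alpha over frequencies up to fmax."""
    f, p = signal.periodogram(x, fs)
    keep = (f > 0) & (f <= fmax)
    slope, _ = np.polyfit(np.log10(f[keep]), np.log10(p[keep]), 1)
    return -slope

rng = np.random.default_rng(1)
fs = 1000
raw = rng.standard_normal(20 * fs)   # stand-in for unfiltered tracker noise
# One-pole smoother y[n] = 0.1*x[n] + 0.9*y[n-1], a stand-in for a
# noise suppression filter in the eye tracker:
smoothed = signal.lfilter([0.1], [1.0, -0.9], raw)

print(spectral_slope(raw, fs))       # near 0: white
print(spectral_slope(smoothed, fs))  # clearly positive: colored
```

No eye movement was added anywhere in this simulation; the color in the smoothed signal is produced entirely by the filter.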

Wang et al. (2016) have extended Coey et al.'s (2012) results to 12 further eye trackers, and report white signals reflecting measurement noise when recording from artificial eyes for each of the systems they examined. These results of Wang et al. (2016), however, appear inconsistent with a report by Blignaut and Beelders (2012), who examined two eye trackers that were also reported on by Wang et al. (2016). Blignaut and Beelders (2012) found that one of the two eye trackers exhibited the kind of smooth traces that characterize colored signals, even when recording from an artificial eye. Although Blignaut and Beelders (2012) did not perform frequency analyses, their findings suggest that eye trackers may produce gaze position signals that exhibit 1/f dynamics even in the absence of any physical eye movement, which is at odds with the findings of Wang et al. (2016), but consistent with those of Coey et al. (2012).

Aims of this paper

Here we examine whether color (visually identified as smooth gaze position signals) in the output of an eye tracker reflects fixational eye movements or whether these signals are instead due to filters in the eye tracker. For this purpose, the spectral composition of gaze position signals obtained from human and artificial eye recordings made with five video-based eye trackers is analyzed.

We posit the following two models for the origin of colored signal dynamics in eye tracker data. First, Wang et al. (2016) reported that data recorded with artificial eyes are always white, while data recorded from humans are always colored. Based on this observation, they speculated that the color in human eye-tracking data originates from fixational eye movements. We will refer to this statement as the oculomotor hypothesis.

The second hypothesis, the filter hypothesis, states that the color in eye-tracker data is due to temporal filters in the eye tracker hardware or software. The filter hypothesis offers an explanation for why data recorded from artificial eyes can also appear smooth (cf. Blignaut and Beelders, 2012 and Fig. 2 above) and exhibit color (cf. Coey et al., 2012).

Under these hypotheses, we may predict the following outcomes for our measurements. To generate these predictions, we assume that unfiltered measurement noise is white (see, e.g., Coey et al., 2012; Wang et al., 2016), which is also borne out by the results reported below. Table 1 summarizes the predicted results under the two hypotheses. First, under the oculomotor hypothesis, as reported above, we would predict that all signals recorded from human eyes are colored, while all signals recorded from artificial eyes are white. In contrast, under the filter hypothesis, data recorded with artificial eyes are white at the early processing stages (e.g., determining the location of the pupil and corneal reflection features, as well as gaze estimation), but become colored in later stages of gaze signal processing through the application of filters. In the case of the filter hypothesis, a similarly colored signal is thus expected for data recorded from human and artificial eyes.

Table 1 Expected results under the oculomotor and filter hypotheses in terms of signal color as denoted by the signal's power spectrum slope α (white or colored)

                         Human data         Artificial eye data
Oculomotor hypothesis
  Filtered               colored            white (α = 0)
  Unfiltered             colored*           white (α = 0)
Filter hypothesis
  Filtered               colored            colored
  Unfiltered             white† (α = 0)     white (α = 0)

*If the measurement noise magnitude of the eye tracker is large, white signals may be found in human unfiltered data. †If the measurement noise magnitude of the eye tracker is low enough, some colored signals may be found in human unfiltered data.

Note, however, that the outcomes of our measurements may not conform strictly to one of these hypotheses. Specifically, if the magnitude of measurement noise of an eye tracker is much larger than that of fixational eye movements, we may expect to find that human unfiltered eye-tracker data is white also under the oculomotor hypothesis. Conversely, under the filter hypothesis, if the measurement noise magnitude of an eye tracker is sufficiently small so as to render some fixational eye movements detectable in its signal, the unfiltered signal can also be expected to exhibit color when recording from a human. Signal color in this case may however reflect not only fixational eye movements but could also arise due to, for instance, incomplete compensation for head movement, or deviations in the gaze position signal caused by changes in pupil size (Wyatt, 2010; Drewes et al., 2012; Drewes et al., 2014; Choe et al., 2016; Hooge et al., 2019).

Some of this material has previously been presented in Holmqvist and Andersson (2017, pp. 179–182). Furthermore, the data analyzed in this paper are also used for parallel analyses in Niehorster et al. (2020c), which provides an overview of various measures for characterizing eye-tracking signals, and investigates how these measures relate to the slope of the signal's power spectrum α used in this paper.

Method

Participants and artificial eye

Human data were acquired from three volunteers and author DN, yielding data from a total of eight eyes. One participant wore glasses, three did not. The participants provided informed consent.

We also recorded data from a set of artificial eyes provided by SMI GmbH. The same set was previously used by Wang et al. (2016).

Apparatus

Gaze position signals were recorded on five eye trackers: the SR Research EyeLink 1000 Plus in desktop mount and head-stabilized mode at 1000 Hz, the SMI RED250 at 250 Hz, the SMI RED-m at 120 Hz, the Tobii TX300 at 300 Hz, and the Tobii X2-60 at 60 Hz. Recordings were performed with the default settings of the eye trackers, i.e., any filters were left on if they would be on by default. Recordings with the EyeLink were controlled by the EyeLink Toolbox (Cornelissen, Peters & Palmer, 2002), the SMIs with an early version of SMITE (Niehorster & Nystrom, 2020b) and the Tobiis with an early version of Titta (Niehorster, Andersson & Nystrom, 2020a). For the EyeLink 1000 Plus, we made additional recordings with its heuristic filter switched off. For the SMI RED-m, the default setting to average the gaze position data for the two eyes was switched off to yield separate signals for each eye. Viewing distance was approximately 65 cm. Participants were unconstrained, except for the EyeLink, where chin and forehead rests were used.

Stimuli and procedure

After a default calibration (i.e., the default number of calibration points for the system in their default configuration) as was appropriate for the specific eye tracker and setup, we had participants look at a further series of points on the monitor. These points (a 1.6-cm-wide black circle overlaid with a white cross and a 0.3-cm-wide centered black circle, following Thaler et al., 2013) were distributed in a 4 x 8 rectangular grid. Further points were placed on a 3 x 7 rectangular grid such that they were at the centers of the cells defined by the 4 x 8 grid. This leads to a total of 53 points (see Fig. 4). To avoid differences between eye trackers due to the different screen sizes used, for all eye trackers this grid spanned (horizontally and vertically) 45.5 x 26.8 cm. After fixating the center of the screen, the fixation points were presented for 1500 ms each in a random sequence containing each location four times, yielding a total of 213 presented points in a session.
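
As an illustration, the target layout described above can be generated as follows (a Python sketch under the assumption that both grids are spaced evenly over the 45.5 x 26.8 cm extent; the coordinate origin is arbitrary here):

```python
import numpy as np

width_cm, height_cm = 45.5, 26.8
xs = np.linspace(0, width_cm, 8)      # 8 columns of the outer grid
ys = np.linspace(0, height_cm, 4)     # 4 rows of the outer grid
outer = [(x, y) for y in ys for x in xs]          # 4 x 8 = 32 points

cx = (xs[:-1] + xs[1:]) / 2           # 7 cell-center columns
cy = (ys[:-1] + ys[1:]) / 2           # 3 cell-center rows
inner = [(x, y) for y in cy for x in cx]          # 3 x 7 = 21 points

targets = outer + inner
print(len(targets))  # 53
```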

For recordings with artificial eyes, an experimenter first calibrated the eye tracker. The pair of artificial eyes, mounted on a heavy tripod at 6-cm separation from each other, was then positioned at the location where the experimenter's eyes were during calibration. After a brief resting period, data recording was then started and ran for 19 s while the experimenter left the room. This procedure followed established practice in the field (e.g., Coey et al.,



Fig. 4 Fixation target locations. The 53 fixation targets were laid out on a 4 x 8 rectangular grid, with another 3 x 7 rectangular grid placed on top such that its points coincided with the centers of the cells formed by the 4 x 8 grid.

2012; Wang et al., 2016; Holmqvist and Blignaut, 2020) and was required because not all the eye trackers would deliver data without a prior calibration, and we did not have a way to perform a calibration using the artificial eyes themselves.

Analysis

Window selection

To test our hypothesis, we had to compute the slope of the power spectrum of the gaze position signal during each fixation. To be able to compute this for each fixation point, we first had to select a time window of data points to analyze. To do so, we developed a window selection method that aims to place the window we take data from as close to the presented fixation point as possible. Ideally, the method should not rely on any measure of fixational stability so as not to bias the signal's spectrum. It should also not rely on a fixation classification algorithm, as we could not guarantee that any of the known algorithms is sufficiently reliable in producing comparable windows across the large range of sampling frequencies and noise magnitudes of the eye trackers employed in this examination. The procedure to select a data window for each presented fixation point was as follows. A 200-ms window (Hooge et al., 2017) slid over a section of data ranging from 200 ms after fixation point onset until fixation point offset. To exclude windows that likely contained a (micro-)saccade, we then performed the following procedure. For each possible window position, the dispersion of the samples in the window,

sqrt((max(x) − min(x))^2 + (max(y) − min(y))^2)    (1)

was calculated. For each fixation, we then excluded half of the candidate window positions, i.e., those that yielded the 50% largest dispersion values, from further consideration. For the remaining window positions, the average gaze position during the window was calculated. The window for which the average gaze position was closest in space to the fixation point was selected as the window for which the spectrum was analyzed. For the data recorded from the artificial eyes, measures were calculated for a 200-ms window that was moved across the entire recording in 50-ms steps.
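
The selection procedure above can be sketched as follows (Python rather than the authors' MATLAB; the sample-by-sample step size for the sliding window is our assumption, as the paper does not state it):

```python
import numpy as np

def dispersion(x, y):
    """Dispersion of a window of gaze samples, as in Eq. 1."""
    return np.hypot(x.max() - x.min(), y.max() - y.min())

def select_window(x, y, target, fs, win_ms=200):
    """Return (start, end) sample indices of the 200-ms window whose mean
    gaze position lies closest to the target, considering only the 50% of
    candidate windows with the smallest dispersion."""
    n = int(round(win_ms / 1000 * fs))
    starts = np.arange(len(x) - n + 1)
    disp = np.array([dispersion(x[s:s + n], y[s:s + n]) for s in starts])
    keep = starts[disp <= np.median(disp)]   # drop the 50% largest dispersions
    dist = [np.hypot(x[s:s + n].mean() - target[0],
                     y[s:s + n].mean() - target[1]) for s in keep]
    start = keep[int(np.argmin(dist))]
    return start, start + n
```

In the paper the candidate section starts 200 ms after fixation point onset; this sketch assumes the input arrays are already trimmed to that section.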

As we are not interested in the eye movements of the participant but in the gaze position signals, we treated each eye independently. The analyses below thus report results for eight eyes. "Eye" in the text below refers to one of these unique eyes. The above window selection method was executed separately for the data from each eye. No differences between data from the participants' dominant and non-dominant eyes were found (all p values of dependent-samples t tests > 0.91).

Amplitude spectra

Amplitude spectra for the gaze position signals of each eye were separately computed for the horizontal and vertical channels using the function periodogram from the MATLAB (Natick, MA, USA) Signal Processing Toolbox, with the default rectangular window. The output of this function is a power spectrum. To create amplitude spectra, the square root of the power spectra was computed.
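
In Python, an equivalent of this computation might look as follows (a sketch, not the authors' code; scipy.signal.periodogram with a boxcar window mirrors MATLAB's periodogram default):

```python
import numpy as np
from scipy import signal

def amplitude_spectrum(pos, fs):
    """Rectangular-window periodogram (a power spectrum), square-rooted
    to give amplitude per frequency."""
    f, pxx = signal.periodogram(pos, fs=fs, window="boxcar")
    return f, np.sqrt(pxx)

# A 50-Hz sinusoid shows up as a peak at 50 Hz in the amplitude spectrum:
fs = 1000
t = np.arange(fs) / fs
f, amp = amplitude_spectrum(np.sin(2 * np.pi * 50 * t), fs)
print(f[np.argmax(amp)])  # 50.0
```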

The slope of the power spectrum (scaling exponent α) was determined by fitting a line in log-log space to the power spectrum using the MATLAB function polyfit. Note that although slightly uneven inter-sample intervals were reported in the data for some of the eye trackers, possibly due to variations in camera framerate or jitter in software timestamps (standard deviations of the human data's inter-sample interval [ISI] as reported by the eye trackers were 0, 0.74, 9.98, 1.15, and 1.89% of the nominal ISI of, respectively, the SR EyeLink 1000 Plus, SMI RED250, SMI RED-m, Tobii TX300, and Tobii X2-60), we found the same results when using the Lomb–Scargle periodogram method that Wang et al. (2016) recommended be used for unevenly sampled data. Estimates of α calculated from these two periodogram methods correlated very highly, R^2 > 0.99.
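
A comparison of this kind between the two periodogram methods can be sketched in Python (our illustration, using a synthetic random walk in place of recorded gaze data; scipy.signal.lombscargle takes angular frequencies):

```python
import numpy as np
from scipy import signal

def fit_alpha(f, p):
    """Slope of the power spectrum in log-log space (scaling exponent alpha)."""
    slope, _ = np.polyfit(np.log10(f), np.log10(p), 1)
    return -slope

rng = np.random.default_rng(3)
fs = 250
t = np.arange(20 * fs) / fs
walk = np.cumsum(rng.standard_normal(t.size))   # synthetic colored signal

f, p = signal.periodogram(walk, fs)
f, p = f[1:-1], p[1:-1]                         # drop the DC and Nyquist bins
alpha_fft = fit_alpha(f, p)

# Lomb-Scargle estimate over the same frequencies (angular frequency input):
p_ls = signal.lombscargle(t, walk - walk.mean(), 2 * np.pi * f)
alpha_ls = fit_alpha(f, p_ls)
```

For evenly sampled data the two estimates agree closely; the Lomb-Scargle method additionally tolerates the uneven inter-sample intervals noted above.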

Unfiltered data

Where possible, the analyses in this paper were done both for gaze position signals recorded with the eye tracker’s default filters applied, and for data that were recorded with any configurable filters switched off. For the EyeLink, besides the set of recordings with its heuristic filter set to its default level (2, “extra”), we also made recordings with the heuristic filter switched off for a subset of participants.

The SMI eye trackers tested do not offer the option to switch off the filters applied to the gaze position signal that is provided in screen coordinates. Both SMI eye trackers, however, also provide a gaze vector in SMI’s headbox coordinate system in their data files, and we suspect that no temporal filters are applied to these data during gaze estimation (our analyses below corroborate this assumption). To enable analyses of unfiltered SMI data, we therefore decomposed the gaze vectors for each eye into Fick angles (Fick, 1854; Haslwanter, 1995), and then applied the methods described above to calculate the periodogram and power spectrum slope from the resulting eye orientation data.
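A decomposition along these lines might look as follows; the axis convention (z pointing from the eye toward the screen) is our assumption for illustration and may differ from SMI's actual headbox coordinate system:

```python
import math

def gaze_vector_to_fick(gx, gy, gz):
    """Decompose a gaze direction vector into Fick angles, in degrees:
    first a rotation about the vertical axis (horizontal angle theta),
    then about the horizontal axis (vertical angle phi). Illustrative
    sketch only; cf. Haslwanter (1995) for the full treatment."""
    theta = math.degrees(math.atan2(gx, gz))                # horizontal
    phi = math.degrees(math.atan2(gy, math.hypot(gx, gz)))  # vertical
    return theta, phi
```

The resulting per-sample angle traces can then be fed to the same periodogram and slope-fitting steps as the screen-coordinate data.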

Tobii claims in their product documentation that the TX300 and X2-60 do not apply any temporal filter to the recorded gaze position signals and that these machines thus always deliver unfiltered data. Our analyses below indeed appear to confirm this claim. As such, for recordings made with the Tobii eye trackers, we only present analyses of unfiltered data.

Results

In this results section, we examine plots of the amplitude spectra of the recorded data. Along with these figures, we have also listed the corresponding slopes of the power spectra (scaling exponent α) for the five eye trackers in Table 2. Scaling exponents for both human data and data recorded from artificial eyes are listed both with the eye tracker’s filters switched on where possible, and with the filters switched off. Furthermore, Table 2 lists both scaling exponents derived by fitting lines to the entire frequency range of the power spectrum, and scaling exponents for fits to only the first 100 Hz of the power spectrum. The latter fits indicate gaze position signal dynamics in the frequency range of fixational eye movements (see above).

Data at default settings

Figure 5 shows amplitude spectra derived from gaze position data from the five eye trackers, with human data presented in the left column and data recorded from artificial eyes in the right column. First we examine the two Tobiis, the TX300 and the X2-60. Note that both eye trackers only provided unfiltered data, and as such amplitude spectra for unfiltered data are plotted in Fig. 5. As can be immediately seen, data for both eye trackers showed amplitude spectra that are close to flat, corresponding to scaling exponents that were close to zero (Table 2). This indicates that their signals are close to white, for both artificial eyes and human data. Importantly, the spectral slope is the same for both human and artificial eyes for both eye trackers, even though the magnitude of variability in the signal (the height of the amplitude spectrum in the plots) is larger for human data than for artificial eye data. This finding does not have a bearing on the oculomotor vs. filter hypothesis discussion, because we cannot exclude the possibility that these two eye trackers produced a white measurement noise component that was sufficiently large when recording from human eyes to drown out possible small fixational eye movements that may otherwise have been recorded.

At their default setting of providing filtered data, the SMI eye trackers exhibited a different type of signal than the two Tobiis. The amplitude spectra for data from the RED-m, and especially the RED250, showed a clear downward slope. This corresponded to scaling exponents for filtered data that were around 1 and 2, for the RED-m and RED250, respectively, for both human data and data recorded from artificial eyes. This indicates that both eye trackers provide gaze position signals with significant color and replicates the report of Blignaut and Beelders (2012) that an SMI RED250 produces smooth data when recording from an artificial eye

Table 2 Power spectrum scaling exponents (α) for filtered and unfiltered human and artificial eye data determined by fitting a line to (left columns) the entire power spectrum, and (right columns) the first 100 Hz of the power spectrum

Eye tracker Fit to entire power spectrum Fit to first 100 Hz

Filtered Unfiltered Filtered Unfiltered

Human Artificial eye Human Artificial eye Human Artificial eye Human Artificial eye

EyeLink 2.281 2.328 0.544 0.081 1.307 0.466 1.179 0.219

RED250 2.280 2.456 0.903 –0.001 2.353 2.482 1.004 0.006

RED-m 0.886 1.364 0.330 0.210 0.886 1.364 0.330 0.210

TX300 – – 0.280 0.182 – – 0.351 0.224

X2-60 – – 0.382 0.308 – – 0.382 0.308

The power spectrum scaling exponents presented in this table were derived by averaging the exponents for horizontal and vertical gaze position data. Note that for the Tobii eye trackers, only unfiltered data were available. Note also that as the SMI RED-m and Tobii X2-60 had sampling frequencies below 200 Hz, fits to their entire spectrum were used for all entries in the table.


Fig. 5 Amplitude spectrum plots for data recorded at each eye tracker’s default settings from human eyes (left column) and artificial eyes (right column). Different color lines denote different participants. Solid lines show amplitude spectra derived from horizontal gaze position data, and dotted lines from vertical gaze position data


(see also Fig. 2), but is inconsistent with the report of Wang et al. (2016) that both these SMIs produced white signals in this case. These data offer a strong contradiction to the oculomotor hypothesis, because they are not consistent with the expectation derived from this hypothesis that recording from artificial eyes should yield white signals.

Finally, the EyeLink 1000 Plus, like the two SMIs, exhibits a downward slope in the amplitude spectrum, although at a much lower magnitude than the SMIs. As was the case for the other four eye trackers, for the EyeLink the human data were noisier than the artificial eye data, but not qualitatively different. When its filters were switched on, the EyeLink’s data yielded colored signals both when recording from human eyes and when recording from artificial eyes. As was the case for the data of the SMI eye trackers, this finding is in contradiction to the oculomotor hypothesis, and is inconsistent with the white signals reported by Wang et al. (2016) for an EyeLink when recording from artificial eyes.

However, while the scaling exponent for the EyeLink data was very similar for human and artificial eye data when it was determined from the entire power spectrum (see Table 2, left columns), the scaling exponent was much larger for the human data than for the artificial eye data when it was computed for data only up to 100 Hz (see Table 2, right columns), the frequency range of fixational eye movements. This indicates that the EyeLink’s filter has a larger effect in the frequency range beyond 100 Hz than in the first 100 Hz, and interestingly also suggests that at least part of the observed color in the human data is likely due to fixational eye movements or artifacts such as the slow-varying deviations in recorded gaze position caused by fluctuations in pupil size (Wyatt, 2010; Hooge et al., 2019), and not only due to the EyeLink’s filter. This finding may therefore suggest that the noise magnitude in the EyeLink recordings was low enough to enable recording some fixational eye movements. Note that the scaling exponents were very similar for the two frequency ranges for all the other eye trackers.

In summary, the above analysis shows that for each eye tracker, the color of the signal varied little between data recorded from human and artificial eyes. This pattern of findings across the five eye trackers is inconsistent with the hypothesis that fixational eye movements are the origin of colored signals in eye-tracker recordings. If this were the case, we would instead have expected to see different signal colors in human and artificial eye data for most eye trackers and, importantly, that data recorded from artificial eyes would consistently have exhibited white signals.

Unfiltered data

The observations reported above are consistent with the filter hypothesis, which states that the color of the signal, and hence the smoothness of its visual appearance, derives from filters applied by the eye tracker. If such filters are the predominant cause of the colored dynamics observed in video-based eye-tracker data, it would be expected that the signal color is similar for both artificial eyes and human data for each eye tracker, which is what we observed. Importantly, it would furthermore be expected that the signal would appear white when such filters are switched off, or when data from an unfiltered stage of the eye tracker’s gaze estimation pipeline are examined. To provide this further test of the filter hypothesis, we next report on unfiltered data acquired with the EyeLink and SMI systems.

For the EyeLink 1000 Plus, new recordings were made with an artificial eye and for two of the participants, using a setup and procedure identical to the previous recordings, but with its heuristic filter switched off.

Figure 6 presents amplitude spectra of these data, which can be compared to the top row of Fig. 5, where recordings made with the filter switched on are presented. As Fig. 6b clearly shows, unfiltered EyeLink data recorded with an artificial eye yielded a white signal, in stark contrast to the colored signal observed when recording with the filter switched on. Furthermore, Fig. 6a shows that the unfiltered human data were also much whiter (flatter slope) than when recording with the filter switched on. This shows that the color of the signal recorded from human eyes with the EyeLink was for an important part due to its heuristic filter, as expected under the filter hypothesis. Closer examination of Fig. 6a and comparison with Fig. 5, however, allows us to add some nuance to this conclusion. In this comparison, it is seen that the scaling exponents for filtered and unfiltered human data were nearly identical when determined over the first 100 Hz of the signal (cf. Table 2). This suggests that the EyeLink’s heuristic filter essentially acts as a low-pass filter: it has only minimal effect in the frequency range that contains information about fixational eye movements, but sharply colors the frequency range beyond it.

Fig. 6 Amplitude spectra for unfiltered EyeLink 1000 Plus data. Recordings were made with the SR EyeLink 1000 Plus with its heuristic filter turned off, recording from human eyes (left column) and artificial eyes (right column). Artificial eye data now exhibit a flat amplitude spectrum (white signal). The slope for data recorded from humans has become much shallower than when recording with the filter switched on (as shown in Fig. 5), but still exhibits some color. Different color lines denote different participants. Solid lines show amplitude spectra derived from horizontal gaze position data, and dotted lines from vertical gaze position data

The slight color that remains in Fig. 6a in the human gaze position signals recorded with the EyeLink after turning off the heuristic filter is not indicative of further, hidden filters, because such filters would also have colored the signal recorded with artificial eyes. We cannot rule out that the colored signal dynamics reflect deviations in the gaze position data due to uncompensated head movements, but consider this unlikely since the recordings with the EyeLink were made on experienced participants whose heads were stabilized on chin and forehead rests, which would minimize head movements. Instead, the remaining signal color indicates that the EyeLink’s measurement noise magnitude is low enough that it may in fact pick up some fixational eye movements. If so, these constitute mainly drift, as our analysis-window selection procedure should have excluded most segments that contained microsaccades. The significant color in unfiltered human EyeLink data seen in Fig. 6a when examining the first 100 Hz supports this interpretation. Alternatively, however, the remaining color may reflect the slow-varying deviations that artefactually occur in the recorded gaze position of video-based eye trackers due to fluctuations in pupil size (e.g., Wyatt, 2010; Hooge et al., 2019).

For the SMI eye trackers, it was not possible to turn off all filters, but the data files include gaze vector information that is an intermediate representation of gaze used to determine gaze position on the screen (see “Unfiltered data” in the “Method” section above). We posited that these gaze vectors may have undergone less filtering than the gaze position data. Figure 7 plots amplitude spectra based on the data derived from these gaze vectors from the same SMI RED-m and RED250 recordings as presented in Fig. 5. Indeed, for all data except the human SMI RED250 data, the signals are now practically white, consistent with the expectation that the gaze vectors in these SMI systems are unfiltered and, importantly, that the color and smooth appearance of data from these systems is due to filtering of the gaze position signal provided by these eye trackers. The human SMI RED250 gaze vector data still exhibit signals that are clearly more white than the corresponding filtered human and artificial eye gaze position data. That human gaze data on the SMI RED250 retain some color when bypassing its filters suggests that imperfect head-movement compensation may have caused deviations in the data (participants were not stabilized on a chin rest in this setup, which may cause such deviations, cf. Niehorster et al., 2018), may be due to the pupil-size artifact in gaze data, or, similar to the EyeLink above, indicates that the SMI RED250 can measure human oculomotor drift. We consider the latter explanation to be very unlikely given that the RED250’s noise magnitude is almost an order of magnitude larger than that of the EyeLink, which would likely render any fixational eye movements undetectable.

In summary, these further analyses confirm that the smooth, colored signal characteristics seen in artificial eye data recorded with the EyeLink and SMI systems (cf. Fig. 5) are due to filters in these eye trackers. For human data, the results furthermore strongly suggest that filters, and not fixational eye movements, are the main cause of the color observed in the gaze position signals provided by these eye trackers.

Discussion

In this paper, we have reported analyses of the spectral color observed in gaze position signals recorded with video-based eye trackers during human fixation episodes and in data from artificial eyes. Using these data, we examined whether color in the gaze position signal is likely due to fixational eye movements of the human participants, or is instead mostly caused by filters in the eye tracker. Below we discuss the findings of this investigation, and their implications for research into fixational eye movements.

Eye-tracking data signal properties and their origin

The findings of this paper support the hypothesis that filters, and not fixational eye movements, are the main cause of the color that is frequently observed in the gaze position data of many video-based eye trackers. The oculomotor hypothesis holds that colored (smooth-looking) signals would be observed only in data recorded from human eyes, while data recorded from artificial eyes would be exclusively white (random). Subscribing to this theory, previous work (Wang et al., 2016) has interpreted the pink color they observed in the gaze position data they recorded from human subjects to be of an oculomotor origin, i.e., microsaccades and drift. In contrast, our recordings show that for each eye tracker except the EyeLink, data recorded from human eyes exhibit the same signal color as data from artificial eyes. Even for the EyeLink, whether its heuristic filter was enabled or not caused a much larger change in the color of the recorded gaze position signals than the difference in color between recordings made from human and artificial eyes (contrast the top panels of Fig. 5 with Fig. 6).

Fig. 7 Amplitude spectra for gaze vector data from the SMI eye trackers. As not all filters could be turned off in the SMI RED systems, we instead analyzed the gaze vector data provided by these eye trackers for both human eyes (left column) and artificial eyes (right column). Except for the human data recorded on the SMI RED250, all gaze vector data exhibit white signal dynamics. The human gaze vector data on the SMI RED250 nonetheless exhibit amplitude spectra that are much less colored than the filtered gaze position data shown in Fig. 5. Different color lines denote different participants. Solid lines show amplitude spectra derived from horizontal gaze position data, and dotted lines from vertical gaze position data

As such, across the eye trackers examined in this study, the main contributor to whether the gaze position signal exhibited color was found to be whether filters were applied. This result is consistent with a finding reported by Coey et al. (2012) that data recorded with an artificial eye were white when their eye tracker’s averaging filter was switched off, and colored when it was switched on. Our results are, however, inconsistent with those of Wang et al. (2016), who reported that recording from artificial eyes yielded white signals for all of the eye trackers they examined. As we have discussed and shown in this paper, the filters found in the eye trackers we examined (the SMIs and the EyeLink) necessarily introduce color in the recorded signal. Since it is, to the best of our knowledge, impossible to switch off the filters applied to the gaze position output of the SMI RED250 and RED-m, we find it remarkable that Wang et al. (2016) report that these SMI eye trackers produce white signals when recording from an artificial eye¹. Our finding of colored signals in the SMI RED250 is furthermore consistent with Blignaut and Beelders (2012), who have previously reported that the SMI RED250 produces smooth-looking (colored) gaze position signals when recording from an artificial eye.

Implications for fixational eye-movement research

¹ For the EyeLink the filter can be switched off, but it is unclear from the method reported by Wang et al. (2016) whether they have done so.

Our results may have important implications for fixational eye movement research. First, when viewing visualizations of recorded eye-tracking data like those presented in the top and bottom panels of Fig. 2a, it is tempting to interpret the smoothly changing signal as being indicative of oculomotor drift, since it looks like, and is statistically similar to, a random walk. Our results, however, should lead to caution in doing so, since we found that the same kinds of smoothly changing signals can be created by filtering a white signal (see also Niehorster et al., 2020c). Just like a cautious eye-movement researcher takes care not to infer why participants, for instance, look in a certain order at specific points on a stimulus display unless their research design enables them to do so (e.g., Ballard et al., 1995), care should also be taken not to read too much into the smoothly changing eye movement signal.

Second, for the SMI RED250 and RED-m and the Tobii TX300 and X2-60 eye trackers used in this paper, the authors deem it very unlikely that their data contain a recoverable trace of fixational eye movements. Even though some of these systems provide colored gaze position signals suggestive of ocular drift, when examining unfiltered data from these eye trackers, a white signal with a power spectral density slope close to 0 was found. This suggests that the noise level in these systems is too high to be able to resolve fixational eye movements. Although to the best of our knowledge these four systems have not been used for fixational eye movement research, our results do provide a cautionary tale by highlighting that when doing such studies it is imperative to ensure that the smooth-looking colored gaze position signals output by the eye tracker are not created by a filter that is applied to the eye-tracker data. We further discuss the EyeLink in the Section “The EyeLink” below.

Third, important open questions regarding the use of filters in the recording of eye movements remain, especially as influential work using video-based eye trackers to study fixational eye movements (e.g., Roberts et al., 2013; Engbert and Kliegl, 2004; Engbert et al., 2011; Liang et al., 2005) was performed with the EyeLink’s heuristic filter enabled (Engbert, pers. comm.; Roberts, pers. comm.).² To be able to fully evaluate the work in this field, it is important to establish to what extent filters such as the EyeLink’s heuristic filter lift the signal of interest from the background noise, and to what extent the applied filters instead alter or even create the signal dynamics of interest. Such an effort would especially be of interest since others have noted that the displacements and eye velocities characterizing fixational eye movements are of similar magnitude as the noise in video-based eye trackers such as the EyeLink (Collewijn & Kowler, 2008), which led them to question their suitability for research into fixational eye movements. We furthermore call on authors to explicitly state in their papers which filters were enabled during the gaze-data recordings, and not only report the filters that were used during data analysis.

² It should be noted that in these works from both the Engbert and the Roberts labs, heuristic filter level 1 was used, which is a lower level than the default level 2 that was used in the recordings for the current study and may thus be expected to influence the recorded signal less. Furthermore, most studies from the Engbert lab used an EyeLink 2, which may produce signals with different noise levels and dynamics than the EyeLink 1000 Plus used for the current study.

Filters in eye trackers

Since filters have an important impact on the output of an eye tracker, is it possible to recognize their presence from an eye tracker’s data? Luckily, this is a non-issue for the systems of the three eye-tracker manufacturers we examined, as they state in their data exports or communication with customers whether the gaze position signals provided by their system are filtered or not, even if they do not make available the exact implementation of their filters.

It should be noted that the signal type (i.e., whether a signal is white or colored) can be assessed with both the scaling exponent α, as done in this paper, or even more straightforwardly using new measures introduced in the companion paper (Niehorster et al., 2020c).³ However, these values by themselves do not provide sufficient information to determine whether the recorded data from the eye tracker are filtered. This is because multiple factors can lead to colored signals: it can (1) be due to the application of filters; (2) be due to pupil-size or participant-movement artefacts; and (3) arise because the eye tracker’s noise magnitude is low enough that fixational eye movements are represented in the gaze position data. In this study, one of the latter two possibilities was likely the cause of some of the coloring in the gaze position signal recorded with the EyeLink system.

³ An analysis of the data in this paper that uses these new measures can be generated using the data and analysis scripts that we have made public at https://github.com/dcnieho/FixationalNoise data. This alternative analysis supports the same conclusion as the analysis presented in this paper.

When using systems that are too noisy to record fixational eye movements, or when recording from perfectly stabilized eyes, can examinations of the signal’s dynamics detect the presence of all types of filters? No, such analysis only reflects the presence of temporal (anti-)correlations in the assessed signal, and as such is not sensitive to filters that do not affect the temporal correlation structure of a signal. This means that, for instance, the downsampling operation performed by the EyeLink 1000 and 1000 Plus when outputting gaze position signals at 500 Hz or 250 Hz (i.e., recording at 1000 Hz internally, splitting the signal up in chunks of two or four samples and then averaging each chunk independently)⁴ cannot be detected, because this operation does not introduce temporal dependencies between adjacent samples into the output signal. Furthermore, any other operation on the signal that does not take the signal’s history into account cannot be detected by this analysis technique.
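The chunk-based averaging described above can be sketched as follows (a minimal illustration, not SR Research's actual implementation); each output sample is a function of its own chunk only, which is why no temporal correlation between adjacent output samples is introduced:

```python
import numpy as np

def chunk_average(x, factor):
    """Downsample a signal by averaging non-overlapping chunks of
    `factor` samples (e.g. factor=2 for 1000 Hz -> 500 Hz output).
    Trailing samples that do not fill a whole chunk are dropped."""
    n = len(x) // factor * factor
    return np.asarray(x)[:n].reshape(-1, factor).mean(axis=1)
```

Because the chunks do not overlap, white input noise remains white after this operation, and the spectral-slope analysis cannot reveal that it took place.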

The EyeLink

For the EyeLink, our results have consistently suggested that its noise magnitude is low enough that it may be possible to record fixational eye movements (drift) with this device. Most telling is that the power spectral density of unfiltered EyeLink data has a significant slope up to 100 Hz (Fig. 6a, Table 2), which suggests that the recorded signal may be of biological origin (e.g., Findlay, 1971). However, caution in making this interpretation is required, because it is possible that the 1/f^α characteristics of the signal output by the eye tracker originate from sources other than physical eyeball rotation. A possible alternative cause is the artefactual change of the recorded gaze direction due to continuous changes in pupil size (Wyatt, 2010; Drewes et al., 2012; Drewes et al., 2014; Choe et al., 2016; Hooge et al., 2019). This artifact causes deviations in the gaze position signal that can be up to several degrees in size, which is an order of magnitude larger than ocular drift is thought to be (e.g., Ko et al., 2016), and may thus be resolvable above the system noise ceiling much more easily. The amplitude spectra of the EyeLink’s pupil size data (available by running the analyses placed online at https://github.com/dcnieho/FixationalNoise data) were qualitatively similar to those reported in Figs. 5 and 6, lending some support to this idea. It is, however, possible to use calibration procedures to reduce the effect of pupil size changes on the gaze position signal (Merchant et al., 1974; Drewes et al., 2012). Future research could employ such techniques to resolve whether artefacts due to changing pupil size are an important driver of the intrafixational drift movements recorded with the EyeLink.

Acknowledgements We thank Tobii AB and SMI GmbH for lending us equipment, and Jeff Mulligan and two anonymous reviewers for helpful comments on a previous version of this manuscript.

The data and analysis code are available at https://github.com/dcnieho/FixationalNoise data, and the experiment was not preregistered.

Funding Information Open access funding provided by Lund University.

⁴ The Tobii Spectrum, at the time of writing, also performs such chunk-based averaging to achieve output sample rates of 300 Hz or 150 Hz while running at 600 Hz internally.

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

References

Babadi, B., & Brown, E. N. (2014). A review of multitaper spectralanalysis. IEEE Transactions on Biomedical Engineering, 61(5),1555–1564.

Bahill, A. T., Brockenbrough, A., & Troost, B. T. (1981). Variabilityand development of a normative data base for saccadic eyemovements. Investigative Ophthalmology & Visual Science, 21(1),116.

Ballard, D. H., Hayhoe, M. M., & Pelz, J. B. (1995). Memory rep-resentations in natural tasks. Journal of Cognitive Neuroscience,7(1), 66–80.

Bedell, H. E., & Stevenson, S. B. (2013). Eye movement testing inclinical examination. Vision Research, 90, 32–37.

Blignaut, P., & Beelders, T. (2012). The precision of eye-trackers: acase for a new measure. In Spencer, S. N. (Ed.) Proceedings ofthe symposium on eye tracking research and applications, SantaBarbara, CA, (pp. 289–292). ACM: New York.

Bowers, N. R., Boehm, A. E., & Roorda, A. (2019). The effects of fix-ational tremor on the retinal image. Journal of Vision, 19(11), 8–8.

Burak, Y., Rokni, U., Meister, M., & Sompolinsky, H. (2010).Bayesian model of dynamic image stabilization in the visualsystem. Proceedings of the National Academy of Sciences,107(45), 19525–19530.

Campbell, F. W., Robson, J. G., & Westheimer, G. (1959). Fluctuationsof accommodation under steady viewing conditions. The Journalof Physiology, 145(3), 579–594.

Choe, K. W., Blake, R., & Lee, S.-H. (2016). Pupil size dynamicsduring fixation impact the accuracy and precision of video-basedgaze estimation. Vision Research, 118, 48–59. Fixational eyemovements and perception.

Coey, C., Wallot, S., Richardson, M., & Van Orden, G. (2012). Onthe structure of measurement noise in eye-tracking. Journal of EyeMovement Research, 5(4), 1–10.

Collewijn, H., & Kowler, E. (2008). The significance of microsaccadesfor vision and oculomotor control. Journal of Vision, 8(14), 20.

Cornelissen, F. W., Peters, E. M., & Palmer, J. (2002). The EyelinkToolbox: Eye tracking with MATLAB and the PsychophysicsToolbox. Behavior Research Methods, Instruments, & Computers,34, 613-617. https://doi.org/10.3758/BF03195489.

Cornsweet, T. N. (1956). Determination of the stimuli for involuntarydrifts and saccadic eye movements. Journal of the Optical Societyof America, 46(11), 987–993.

Ditchburn, R. W., Fender, D. H., & Mayne, S. (1959). Visionwith controlled movements of the retinal image. The Journal ofPhysiology, 145, 98–107.

Ditchburn, R. W., & Ginsborg, B. L. (1953). Involuntary eye move-ments during fixation. The Journal of Physiology, 119(1), 1–17.

Drewes, J., Masson, G. S., & Montagnini, A. (2012). Shifts in reportedgaze position due to changes in pupil size: Ground truth andcompensation. In Proceedings of the symposium on eye trackingresearch and applications, (pp. 209–212). New York: ACM.

Drewes, J., Zhu, W., Hu, Y., & Hu, X. (2014). Smaller is better: Drift in gaze measurements due to pupil dynamics. PLOS ONE, 9(10), 1–6.

Eizenman, M., Hallett, P., & Frecker, R. (1985). Power spectra for ocular drift and tremor. Vision Research, 25(11), 1635–1640.

Engbert, R. (2006). Microsaccades: A microcosm for research on oculomotor control, attention, and visual perception. In Martinez-Conde, S., Macknik, S., Martinez, L., Alonso, J.-M., & Tse, P. (Eds.), Visual perception, volume 154 of Progress in brain research (pp. 177–192). Amsterdam: Elsevier.

Engbert, R., & Kliegl, R. (2004). Microsaccades keep the eyes' balance during fixation. Psychological Science, 15(6), 431–436.

Engbert, R., & Mergenthaler, K. (2006). Microsaccades are triggered by low retinal image slip. Proceedings of the National Academy of Sciences, 103(18), 7192–7197.

Engbert, R., Mergenthaler, K., Sinn, P., & Pikovsky, A. (2011). An integrated model of fixational eye movements and microsaccades. Proceedings of the National Academy of Sciences, 108(39), 16149–16150.

Fick, A. (1854). Die Bewegungen des menschlichen Augapfels. Zeitschrift für rationelle Medicin, 4, 101–128.

Findlay, J. M. (1971). Frequency analysis of human involuntary eye movement. Kybernetik, 8(6), 207–214.

Haslwanter, T. (1995). Mathematics of three-dimensional eye rotations. Vision Research, 35(12), 1727–1739.

Hessels, R. S., Niehorster, D. C., Nyström, M., Andersson, R., & Hooge, I. T. C. (2018). Is the eye-movement field confused about fixations and saccades? A survey among 124 researchers. Royal Society Open Science, 5(8), 180502.

Holmqvist, K., & Andersson, R. (2017). Eye tracking: A comprehensive guide to methods, paradigms, and measures. Lund: Lund Eye-tracking Research Institute.

Holmqvist, K., & Blignaut, P. (2020). Small eye movements cannot be reliably measured by video-based P-CR eye-trackers. Behavior Research Methods.

Hooge, I. T., Hessels, R. S., & Nyström, M. (2019). Do pupil-based binocular video eye trackers reliably measure vergence? Vision Research, 156, 1–9.

Hooge, I. T. C., Niehorster, D. C., Nyström, M., Andersson, R., & Hessels, R. S. (2017). Is human classification by experienced untrained observers a gold standard in fixation detection? Behavior Research Methods.

Horowitz, T. S., Fine, E. M., Fencsik, D. E., Yurgenson, S., & Wolfe, J. M. (2007). Fixational eye movements are not an index of covert attention. Psychological Science, 18(4), 356–363.

Ko, H.-K., Snodderly, D. M., & Poletti, M. (2016). Eye movements between saccades: Measuring ocular drift and tremor. Vision Research, 122, 93–104.

Kuang, X., Poletti, M., Victor, J., & Rucci, M. (2012). Temporal encoding of spatial information during active visual fixation. Current Biology, 22(6), 510–514.

Liang, J.-R., Moshel, S., Zivotofsky, A. Z., Caspi, A., Engbert, R., Kliegl, R., & Havlin, S. (2005). Scaling of horizontal and vertical fixational eye movements. Physical Review E, 71, 031909.

Martinez-Conde, S., Macknik, S. L., & Hubel, D. H. (2004). The role of fixational eye movements in visual perception. Nature Reviews Neuroscience, 5(3), 229–240.

Martinez-Conde, S., Otero-Millan, J., & Macknik, S. L. (2013). The impact of microsaccades on vision: Towards a unified theory of saccadic function. Nature Reviews Neuroscience, 14(2), 83–96.

McCamy, M. B., Otero-Millan, J., Leigh, R. J., King, S. A., Schneider, R. M., Macknik, S. L., & Martinez-Conde, S. (2015). Simultaneous recordings of human microsaccades and drifts with a contemporary video eye tracker and the search coil technique. PLOS ONE, 10(6), 1–20.

Merchant, J., Morrissette, R., & Porterfield, J. L. (1974). Remote measurement of eye direction allowing subject motion over one cubic foot of space. IEEE Transactions on Biomedical Engineering, BME-21(4), 309–317.

Niehorster, D. C., & Nyström, M. (2020b). SMITE: A toolbox for creating Psychophysics Toolbox and PsychoPy experiments with SMI eye trackers. Behavior Research Methods, 52, 295–304. https://doi.org/10.3758/s13428-019-01226-0

Niehorster, D. C., Andersson, R., & Nyström, M. (2020a). Titta: A toolbox for creating PsychToolbox and PsychoPy experiments with Tobii eye trackers. Behavior Research Methods.

Niehorster, D. C., Cornelissen, T. H. W., Holmqvist, K., Hooge, I. T. C., & Hessels, R. S. (2018). What to expect from your remote eye-tracker when participants are unrestrained. Behavior Research Methods, 50(1), 213–227.

Niehorster, D. C., Zemblys, R., Beelders, T., & Holmqvist, K. (2020c). Characterizing gaze position signals and synthesizing noise during fixations in eye-tracking data. Behavior Research Methods.

Nyström, M., Andersson, R., Niehorster, D. C., & Hooge, I. (2017). Searching for monocular microsaccades—a red herring of modern eye trackers? Vision Research, 140, 44–54.

Nyström, M., Niehorster, D. C., Andersson, R., & Hooge, I. (in press). The Tobii Pro Spectrum: A useful tool for studying microsaccades? Behavior Research Methods.

Ratliff, F., & Riggs, L. A. (1950). Involuntary motions of the eye during monocular fixation. Journal of Experimental Psychology, 40(6), 687–701.

Roberts, J., Wallis, G., & Breakspear, M. (2013). Fixational eye movements during viewing of dynamic natural scenes. Frontiers in Psychology, 4, 797.

Rolfs, M. (2009). Microsaccades: Small steps on a long way. Vision Research, 49(20), 2415–2441.

Rucci, M., Ahissar, E., & Burr, D. (2018). Temporal coding of visual space. Trends in Cognitive Sciences, 22(10), 883–895. Special issue: Time in the Brain.

Rucci, M., & Poletti, M. (2015). Control and functions of fixational eye movements. Annual Review of Vision Science, 1(1), 499–518.

Scholes, C., McGraw, P. V., Nyström, M., & Roach, N. W. (2015). Fixational eye movements predict visual sensitivity. Proceedings of the Royal Society B: Biological Sciences, 282(1817), 20151568.

Sheehy, C. K., Yang, Q., Arathorn, D. W., Tiruveedhula, P., de Boer, J. F., & Roorda, A. (2012). High-speed, image-based eye tracking with a scanning laser ophthalmoscope. Biomedical Optics Express, 3(10), 2611–2622.

Stark, L., Iida, M., & Willis, P. A. (1961). Dynamic characteristics of the motor coordination system in man. Biophysical Journal, 1(4), 279–300.

Stevenson, S. B., Roorda, A., & Kumar, G. (2010). Eye tracking with the adaptive optics scanning laser ophthalmoscope. In Proceedings of the 2010 symposium on eye-tracking research & applications, ETRA '10 (pp. 195–198). New York: ACM.

Thaler, L., Schütz, A., Goodale, M., & Gegenfurtner, K. (2013). What is the best fixation target? The effect of target shape on stability of fixational eye movements. Vision Research, 76, 31–42.

Thomson, D. J. (1982). Spectrum estimation and harmonic analysis. Proceedings of the IEEE, 70(9), 1055–1096.

Wang, D., Mulvey, F., Pelz, J. B., & Holmqvist, K. (2016). A study of artificial eyes for the measurement of precision in eye-trackers. Behavior Research Methods.

Wyatt, H. J. (2010). The human pupil and the use of video-based eyetrackers. Vision Research, 50(19), 1982–1988.

Publisher's note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.