
Eye-Tracking Technology in Human-Computer Interaction

Mengsen Huang
School of Computer Science

University of Birmingham
Email: [email protected]

Abstract—Eye-Tracking technology has long been a topic of interest in Human-Computer Interaction (HCI). It identifies the point at which a user is looking and tracks the user's eye movements. Eye-Tracking has been built into many applications as an input device and is also used by researchers as a tool for analyzing human behaviour. This paper introduces some technical issues in Eye-Tracking, including the principles on which eye trackers operate, and reviews current work on Eye-Tracking both as an input device and as a research tool. Finally, gaps in existing studies are identified and future trends are discussed.

Index Terms—Eye-Tracking, input devices, research tool, human behaviour, HCI.

I. INTRODUCTION

With the development of Human-Computer Interaction (HCI) technology, the way people live has changed considerably and everyday life has become much more convenient. Eye-Tracking is one such HCI technique: it determines where a user is looking and tracks their eye movements, and it plays a significant role in people's lives. Eye-Tracking has been built into many applications as an input device, meeting the needs of users who have difficulty operating conventional devices and who benefit from issuing commands with their eyes. In addition, Eye-Tracking serves as a research tool with which HCI researchers can analyze many aspects of human behaviour by collecting eye-movement data. Researchers also seek to optimize systems according to the characteristics of eye movements and to evaluate system usability. In short, Eye-Tracking is becoming an essential technique in HCI.

The paper is organized as follows: Section II describes some technical issues and the main measurements in Eye-Tracking technology; Section III describes applications in which Eye-Tracking is used as an input device; Section IV introduces several studies of human behaviour that use Eye-Tracking as a research tool; Section V presents conclusions and future trends.

II. TECHNICAL ISSUES IN EYE-TRACKING

A. Operating principle of eye trackers

Although there are a number of approaches to tracking eye movements, such as electronic, mechanical and optical/video methods[1], most commercial eye-tracker systems measure point-of-regard (where the user is looking) using the corneal-reflection/pupil-centre approach[2]. The method involves an ordinary computer running image-processing software, which tracks eye movements by identifying eye features and determining where the user is looking, together with an infrared camera mounted under the display monitor. During tracking, infrared light from the camera enters the eye and, after reaching the retina, produces a pupil reflection (the bright pupil). At the same time, the infrared light also produces a corneal reflection (as shown in Figure 1). It is worth noting that, because the light is infrared, it is nearly invisible to the subject.

Fig. 1. Bright pupil and corneal reflection as seen in the infrared camera image[2].

The pupil reflection is needed in order to dissociate eye movements from head movements, in case the user moves their head during operation[3][4]. Once the software has detected the pupil reflection and the corneal reflection, the vector between the two reflections can be measured and, after further trigonometric calculation, the point-of-regard can be computed. Once the relationship between these eye features and the content on the computer screen has been established through calibration, the content the user is looking at can be determined (Figure 2).
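In practice, the mapping from the measured pupil-corneal vector to a screen position is learned during a short calibration in which the user fixates a set of known targets. The following is a minimal sketch of that idea using a simple polynomial regression; the nine-point layout, feature expansion and all names are illustrative assumptions rather than the procedure of any particular tracker.

import numpy as np

def fit_gaze_mapping(vectors, screen_points):
    """Fit a quadratic mapping from pupil-corneal reflection vectors (dx, dy)
    to screen coordinates (x, y) using calibration samples."""
    def features(v):
        dx, dy = v
        return [1.0, dx, dy, dx * dy, dx ** 2, dy ** 2]      # polynomial terms

    A = np.array([features(v) for v in vectors])              # design matrix
    xs = np.array([p[0] for p in screen_points])
    ys = np.array([p[1] for p in screen_points])
    bx = np.linalg.lstsq(A, xs, rcond=None)[0]                # least-squares fit
    by = np.linalg.lstsq(A, ys, rcond=None)[0]

    def predict(v):
        f = np.array(features(v))
        return float(f @ bx), float(f @ by)                   # estimated point-of-regard
    return predict

# Calibration: the user fixates nine known on-screen targets while the tracker
# records the pupil-corneal vector at each one (all values here are made up).
calib_vectors = [(-0.8, -0.6), (0.0, -0.6), (0.8, -0.6),
                 (-0.8,  0.0), (0.0,  0.0), (0.8,  0.0),
                 (-0.8,  0.6), (0.0,  0.6), (0.8,  0.6)]
calib_points  = [(100, 100), (960, 100), (1820, 100),
                 (100, 540), (960, 540), (1820, 540),
                 (100, 980), (960, 980), (1820, 980)]

gaze = fit_gaze_mapping(calib_vectors, calib_points)
print(gaze((0.4, -0.3)))   # approximate point-of-regard for a new sample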

Fig. 2. Schematic showing how an eye-gaze interface operates[5].

With the improvement of Eye-Tracking technology and the growing range of HCI research that relies on it, Eye-Tracking systems have become more portable and flexible. Two innovative solutions were described by Mele and Federici in 2012[6]: the SMI RED-M, a portable remote eye tracker that adapts to different settings while maintaining high performance, and the SMI Eye-Tracking Glasses, a wearable mobile device used for research in virtual environments and in the real world (Figure 3).

Fig. 3. The SMI head-mounted eye tracker and sample views of both cameras[7].

B. Types of eye movements

Saccades: a saccade is a sudden movement in which the user shifts the fovea to a different part of the visual scene; it typically lasts around 30–120 milliseconds and covers roughly 1–40 degrees of visual angle[1]. More saccades indicate more searching[8], while regressions (backward saccades) suggest that the cues encountered were less meaningful[9].

Fixations: a fixation is a period of relative stability during which the user views an object[1]. A larger number of fixations on a particular area suggests that the area is important or noticeable to the user[10], and longer fixation durations indicate either that the user finds it difficult to extract information or that the object is engaging[11].
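Fixations and saccades are usually separated from the raw gaze samples algorithmically. Below is a minimal sketch of a dispersion-threshold fixation detector in the spirit of the widely used I-DT approach; the thresholds and the (t, x, y) sample format are illustrative assumptions.

def detect_fixations(samples, max_dispersion=30.0, min_duration=0.1):
    """Group consecutive gaze samples into fixations.

    samples: list of (t, x, y) tuples in seconds and screen pixels.
    A window counts as a fixation when it lasts at least min_duration
    seconds and its x/y spread stays within max_dispersion pixels."""
    fixations = []
    i = 0
    while i < len(samples):
        j = i
        # Grow the window while the samples stay tightly clustered.
        while j + 1 < len(samples):
            window = samples[i:j + 2]
            xs = [x for _, x, _ in window]
            ys = [y for _, _, y in window]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        duration = samples[j][0] - samples[i][0]
        if duration >= min_duration:
            xs = [x for _, x, _ in samples[i:j + 1]]
            ys = [y for _, _, y in samples[i:j + 1]]
            fixations.append({
                "start": samples[i][0],
                "duration": duration,
                "x": sum(xs) / len(xs),   # fixation centroid
                "y": sum(ys) / len(ys),
            })
            i = j + 1                     # continue after the fixation
        else:
            i += 1                        # too short: treat as a saccade sample
    return fixations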

III. EYE-TRACKING AS INPUT DEVICES

As the technology has matured, Eye-Tracking has been developed into applications that serve as input devices, helping special populations with their work and easing everyday tasks. At an early stage, a workstation with a user interface called Erica (Figure 4) was developed to let disabled people execute commands through eye movements[5]. Menu commands are shown at different positions on the computer screen, and users select an option simply by fixating on the corresponding position for a short period of time. This invention substantially improved the working efficiency of disabled people who have difficulty moving their hands and arms, replacing the keyboard and mouse. Nevertheless, the Erica workstation restricts the user's head movements: users have to keep their heads in a nearly fixed position so that the camera can keep the eye images in focus. Consequently, this application is not suitable for people who have difficulty controlling their head movements. Subsequently, as techniques for measuring eye movements improved, interaction techniques were realized that incorporate eye movements into the human-computer dialogue in a more convenient and natural way, rather than requiring users to perform specific eye movements to control the system[12]; these include eye-only object selection, moving an object, and choosing a command from a menu. Experiments showed, however, that they are all far from perfect: the system is better operated in combination with a keypad and mouse. Building on these techniques, Hyrskykari et al.[13] described a translation aid based on Eye-Tracking: while the user reads web pages in a non-native language, the system recognizes their difficulties by detecting long fixations and regressions, and the corresponding translation pops up automatically to assist the reader.
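The selection mechanism described above is essentially dwell-time activation: an on-screen option is triggered once gaze has rested on it for long enough. A minimal sketch of that loop follows; the menu layout, dwell threshold and sample stream are illustrative assumptions, not the actual Erica implementation.

from dataclasses import dataclass

@dataclass
class MenuRegion:
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x, y):
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def dwell_select(gaze_samples, regions, dwell_time=0.7):
    """Return the name of the first menu region the user dwells on.

    gaze_samples: iterable of (t, x, y); a region is selected once gaze
    stays inside it continuously for dwell_time seconds."""
    current, entered_at = None, None
    for t, x, y in gaze_samples:
        region = next((r for r in regions if r.contains(x, y)), None)
        if region is not current:
            current, entered_at = region, t      # gaze moved to a new region
        elif current is not None and t - entered_at >= dwell_time:
            return current.name                  # dwell threshold reached
    return None

# Hypothetical two-option menu and a stream of gaze samples at about 60 Hz.
menu = [MenuRegion("open", 0, 0, 400, 300), MenuRegion("close", 500, 0, 900, 300)]
stream = [(i / 60, 200, 150) for i in range(60)]   # one second on "open"
print(dwell_select(stream, menu))                  # -> "open"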

Fig. 4. Laboratory configuration of Erica hardware system[5].

Moreover, Eye-Tracking technology has also gradually moved into the mobile field. Selker et al.[14] developed a glasses-mounted eye tracker for monitoring users' eye fixations on targets in diverse environments. When two people wearing the device want to exchange business cards at a meeting, all they need to do is look at each other for a few seconds. Similarly, Vertegaal et al.[15] reported a smartphone equipped with EyeContact sensors for detecting face-to-face conversation: when such a conversation occurs, the phone switches to silent mode automatically. The same sensor was also used in an attentive mobile video player that pauses by itself when the user is no longer watching, and in a reading application that advances the text only while the user is looking at it[16]. Another mobile device was designed for museums: when a visitor looks at an exhibit, the system detects it and the corresponding audio introduction is played automatically[7].

It is clear that applications based on Eye-Tracking technology have made daily life easier and have, to some extent, assisted special populations. However, the techniques for operating a system with eye movements alone are far from perfect. In most cases, users' intentions are difficult to interpret and predict, and devices driven only by eye movements are often unfriendly to users. Under these circumstances, a better solution is to combine Eye-Tracking applications with other input devices in order to make intentions clear[2]. The mobile Eye-Tracking devices that exist today also point a way forward for future HCI research, although there is still room for improvement in mobile eye-based interaction.

IV. EYE-TRACKING AS A RESEARCH TOOL IN STUDYING HUMAN BEHAVIOUR

Compared with its use in applications, Eye-Tracking is even more widely used as a research tool in HCI to investigate human behaviour. During an Eye-Tracking experiment, distractions such as objects moving around the screen or brightly coloured elements should be eliminated so that the eye-movement data remain clean[17]. The tasks in an Eye-Tracking test should also be well defined, with clear instructions given to participants, so that the collected data are targeted and purposeful. Given the large differences in eye-movement characteristics between participants, a within-participants design is usually a better choice for making comparisons[17]. It is also worth noting that filtering the data after collection is an important step, both to save time and to avoid errors during later processing[2].
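As a concrete illustration of such filtering, the sketch below drops blink and off-screen samples and lightly smooths the rest before any fixation metrics are computed; the (t, x, y) sample format and screen size are illustrative assumptions.

import math

def clean_gaze_data(samples, width=1920, height=1080, window=3):
    """Drop blink/off-screen samples, then apply a small moving-average
    smoother to reduce measurement noise before fixation analysis."""
    valid = [(t, x, y) for t, x, y in samples
             if not (math.isnan(x) or math.isnan(y))        # blinks recorded as NaN
             and 0 <= x < width and 0 <= y < height]        # off-screen points

    smoothed = []
    for i in range(len(valid)):
        lo, hi = max(0, i - window // 2), min(len(valid), i + window // 2 + 1)
        xs = [x for _, x, _ in valid[lo:hi]]
        ys = [y for _, _, y in valid[lo:hi]]
        smoothed.append((valid[i][0], sum(xs) / len(xs), sum(ys) / len(ys)))
    return smoothed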

One line of research focused on the early detection of autism in children by measuring gaze[18]. One finding was that 55 percent of autistic children performed hypometric saccades, that is, saccades that fell short of the target, compared with typically developing children[19]. Researchers then tried to determine exactly where autistic children look while communicating; they found that children with autistic symptoms preferred to look at mouths rather than eyes[20]. To observe the eye movements of autistic children more effectively, in particular in long-term studies, lightweight mobile trackers are needed so that the children can be studied in real-world settings.

Similarly, another study investigated differences in eye-movement performance between 18-month-old late talkers and their typically developing peers[21]. A late-talking group and an age-matched typical group of infants took part in a word-learning task probing their cognitive ability. As shown in Figure 5, infants were first trained by hearing a word from the speaker while the corresponding picture moved back and forth across the screen. Each test trial began with two pictures presented on the screen for several seconds. A small picture then appeared at the midpoint between the two pictures, accompanied by the word "look", to direct the participant's attention to the central point. Once the infant had focused on that point for 100 ms, the small picture disappeared and the target word was spoken. Finally, eye movements were recorded and the accuracy of performance in each group was calculated.
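In looking-while-listening tasks of this kind, accuracy is commonly computed as the proportion of gaze samples falling on the target picture within a window after the target word onset. A minimal sketch of that calculation follows; the window bounds and data format are illustrative assumptions, not the exact analysis reported in [21].

def trial_accuracy(samples, target_region, onset, window=(0.3, 1.8)):
    """Proportion of gaze samples on the target picture in a time window
    after word onset; samples are (t, x, y), target_region is (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = target_region
    start, end = onset + window[0], onset + window[1]
    in_window = [(x, y) for t, x, y in samples if start <= t <= end]
    if not in_window:
        return None                              # no usable data for this trial
    on_target = sum(1 for x, y in in_window if x0 <= x <= x1 and y0 <= y <= y1)
    return on_target / len(in_window)

# Group accuracy is then just the mean over trials and participants, e.g.:
# group_mean = sum(a for a in accuracies if a is not None) / n_valid_trials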

Fig. 5. Example of Word Test Trial[21].

The results are shown in Figure 6. The typically developing group performed better in the test: late talkers had more trouble recognizing the picture corresponding to a spoken familiar word than their typical peers[22], and were more likely to fail to identify known words in real time.

Fig. 6. Accuracy by group[21].

Further issues could be addressed on the basis of the experiment above. For instance, the reaction time for recognizing the correct picture of a spoken word could be compared between late talkers and typical peers. The full track of eye movements during the test could also be recorded and compared with that of typical peers, so that researchers can understand at which point late talkers' eye movements diverge and look for methods of intervention.
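Such a reaction-time measure could be computed as the latency from word onset to the first gaze sample landing on the target picture, as in this illustrative sketch (data format assumed as in the accuracy sketch above):

def reaction_time(samples, target_region, onset):
    """Latency from word onset to the first gaze sample on the target picture."""
    x0, y0, x1, y1 = target_region
    for t, x, y in samples:
        if t >= onset and x0 <= x <= x1 and y0 <= y <= y1:
            return t - onset
    return None   # target never fixated on this trial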

Moreover, Eye-Tracking technology has contributed a great deal to psychological research. A study conducted in 2012 examined whether faces capture the attention of 3-month-olds, 6-month-olds and adults[23]. Participants were shown one display consisting of a face target and 3 distracter stimuli and another consisting of a face target and 5 distracter stimuli, and their eye fixations were measured. The objects in each display were the same size and were placed at equal distances on a circular grid (Figure 7).

Fig. 7. Example of the stimuli[23].

As for the results, adults placed more of their first fixations on the faces than 3-month-olds and 6-month-olds did, and there was little difference in this respect between the 3-month-olds and the 6-month-olds (Figure 8).

Fig. 8. Proportion of first fixations[23].

However, the picture was different for total fixations on faces. Both adults and 6-month-olds showed more fixations on faces, whereas 3-month-olds differed from adults and 6-month-olds in this respect (Figure 9).

To summarize, all three groups made more fixations on faces in the 4-item displays than in the 6-item displays. Although faces did not capture the 6-month-olds' first fixations, the 6-month-olds behaved similarly to adults in terms of total fixations: both looked longer at faces than 3-month-olds did, who did not direct more fixations to the target faces. Future work could examine fixations on different types of faces, for example varying in emotion or gender. For 3-month-olds, one could investigate whether they are sensitive to different orientations of faces, for instance whether they pay more attention to a vertically oriented face among horizontally oriented faces.
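Both measures reported for this study, the proportion of first fixations on faces (Figure 8) and the share of total fixation time on faces (Figure 9), can be derived directly from detected fixations and a face area of interest. A minimal per-trial sketch follows; the fixation dictionaries and bounding-box format are illustrative assumptions, not the study's actual analysis.

def face_fixation_measures(fixations, face_box):
    """Per-trial measures: whether the first fixation landed on the face AOI,
    and the share of total fixation time spent on it.

    fixations: output of a fixation detector (dicts with "x", "y", "duration");
    face_box: (x0, y0, x1, y1) bounding box of the face in the display."""
    x0, y0, x1, y1 = face_box

    def on_face(f):
        return x0 <= f["x"] <= x1 and y0 <= f["y"] <= y1

    if not fixations:
        return None, None                         # no fixations on this trial
    first_on_face = on_face(fixations[0])
    total = sum(f["duration"] for f in fixations)
    face_time = sum(f["duration"] for f in fixations if on_face(f))
    return first_on_face, (face_time / total) if total else None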

Fig. 9. Proportion of total fixations[23].

Another psychological experiment examined the attention paid to different types of faces by anxious and non-anxious youth[24]. Eighteen anxious and fifteen non-anxious participants (aged 8-17) viewed happy-neutral and angry-neutral face pairs for 10 seconds each while their eye movements were tracked. The results showed that anxious youth paid more attention to angry faces than non-anxious youth did. They also tended to place their first fixations on angry faces, revealing a bias toward threat-related stimuli.

Beyond psychology, Eye-Tracking technology has been used to investigate human behaviour in everyday activities. One study focused on how students solve an image-based multiple-choice science question[25]. Eye movements were measured while the task was carried out (Figure 10). The instructions asked participants to "select the image(s) inferring a landslide would occur".

Fig. 10. The Hot Zone image for a participant[25].

The results indicated that students focused more on the options they chose than on the others. More specifically, they were more likely to fixate on relevant factors rather than irrelevant ones, and successful solvers paid more attention to the relevant factors than unsuccessful solvers, who often had difficulty identifying the key factors. These findings can help researchers understand students' problem-solving strategies. In further work, Eye-Tracking techniques could be used in online assessment systems; the judgement would then rest not only on the answers given but would also take eye-movement behaviour into account.

Another study investigated the relationship between nutrition-label viewing and food decision making[26]. Adults from a range of backgrounds (age, sex, income, educational attainment, etc.) were recruited and asked to view 64 food products on a computer while their eye movements were measured. Having viewed each item, they made a simulated decision about whether they would purchase it (the screen is shown in Figure 11).

Fig. 11. Sample screen[26].

The results showed that adults looked at labels more for meals, soups and yoghurt than for other items (fruits and vegetables), and that they paid more attention to foods they decided to buy than to foods they did not. Participants also took longer to view a label when the food's healthiness was ambiguous. Another finding was that there was no significant difference in label viewing between adults from different backgrounds. These results can be used to recognize users' habits when viewing food labels and to identify which label information customers prioritize. On this basis, a user-friendly and well-structured food label can be designed for a food item.

Advertising can also benefit greatly from Eye-Tracking technology. Figure 12 shows an example of poster design informed by an Eye-Tracking experiment. In the first pair of pictures, where the person in the poster looks directly at the reader, the measured hot zones show that viewers pay more attention to the text and to the character's face than to the product. In contrast, the second pair of pictures, in which the person in the poster looks at the product, shows that viewers fixate more on the product itself, which is clearly the better solution for the designer.
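Hot zones such as those in Figure 12 are typically produced by accumulating fixation durations into a grid over the image and rendering the result as a heat map. A minimal sketch of the accumulation step follows; the grid size and fixation format are illustrative assumptions.

import numpy as np

def fixation_heatmap(fixations, width, height, cell=40):
    """Accumulate fixation durations into a coarse grid over the stimulus.

    fixations: list of dicts with keys "x", "y" (pixels) and "duration" (s),
    e.g. the output of a fixation detector. Returns a 2-D array whose peaks
    correspond to the 'hot zones' of the advertisement."""
    rows, cols = height // cell + 1, width // cell + 1
    grid = np.zeros((rows, cols))
    for f in fixations:
        r, c = int(f["y"]) // cell, int(f["x"]) // cell
        if 0 <= r < rows and 0 <= c < cols:
            grid[r, c] += f["duration"]          # weight cells by dwell time
    return grid

# The grid can then be blurred and overlaid on the poster image, for example
# with matplotlib's imshow(grid, alpha=0.6) drawn on top of the advertisement.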

Fig. 12. Eye-Tracking in advertisement.

V. CONCLUSION AND FUTURE TRENDS

As described above, Eye-Tracking has gradually become part of everyday life as the technology has developed over time. Applications based on Eye-Tracking have progressed from immovable, relatively inaccurate machines with many constraints to smart, portable, user-friendly glasses with high accuracy. Nevertheless, both fixed camera-based machines and portable glasses have their own domains. Although too large to carry, the sophisticated fixed systems have the advantage of accuracy, which strongly helps disabled people execute commands with their eyes; they can also be used in situations that demand highly accurate eye-tracking patterns or measurements. Portable devices, in turn, bring great convenience to everyday life, even though their accuracy is lower than that of the sophisticated systems; intentions can be recognized from eye movements wherever the user is and acted upon on the spot. In short, although still immature, such applications will no doubt continue to evolve, and in the future ordinary glasses may be replaced by superior ones offering a range of Eye-Tracking-based functions.

As a research tool, Eye-Tracking has contributed greatly to the study of human behaviour, helping researchers learn about special populations and pointing to effective ways of solving problems. Advertising also benefits from Eye-Tracking tests: designers can find better solutions based on where customers fixate when looking at an advertisement, which contributes considerably to revenue. Moreover, by collecting fixation data while users browse a web page or an article, UX engineers can improve layout design and place the elements they wish to emphasize where users focus most. Having discussed gaps in the studies reviewed here, we find that several of them could be extended with further Eye-Tracking investigations. More broadly, research using Eye-Tracking should go deeper rather than only broader, that is, it should investigate the reasons behind, and solutions to, the different phenomena observed within a single study. Looking ahead, as the techniques improve, more fields beyond psychology, advertising and design are expected to be explored with Eye-Tracking, for instance detecting whether a person is cheating by gathering eye-movement data and improving the algorithms, which could be applied across many industries. In addition, many products target different user groups, such as men and women; studies of eye fixations while men and women view advertisements should therefore be conducted separately, so that posters can be designed for each user group to better effect.

ACKNOWLEDGMENT

The author would like to thank the teacher of Research Topics in HCI, Robert Hendley, for offering advice on this report.

REFERENCES

[1] R. J. Jacob, "Eye movement-based human-computer interaction techniques: Toward non-command interfaces," Advances in Human-Computer Interaction, vol. 4, pp. 151–190, 1993.

[2] A. Poole and L. J. Ball, "Eye tracking in HCI and usability research," Encyclopedia of Human Computer Interaction, vol. 1, pp. 211–219, 2006.

[3] A. T. Duchowski, "Diversity and types of eye tracking applications," in Eye Tracking Methodology: Theory and Practice. Springer, 2003, pp. 131–132.

[4] R. Jacob and K. S. Karn, "Eye tracking in human-computer interaction and usability research: Ready to deliver the promises," Mind, vol. 2, no. 3, p. 4, 2003.

[5] T. E. Hutchinson, K. P. White Jr, W. N. Martin, K. C. Reichert, and L. A. Frey, "Human-computer interaction using eye-gaze input," IEEE Transactions on Systems, Man and Cybernetics, vol. 19, no. 6, pp. 1527–1534, 1989.

[6] M. L. Mele and S. Federici, "Gaze and eye-tracking solutions for psychological research," Cognitive Processing, vol. 13, no. 1, pp. 261–265, 2012.

[7] T. Toyama, T. Kieninger, F. Shafait, and A. Dengel, "Gaze guided object recognition using a head-mounted eye tracker," in Proceedings of the Symposium on Eye Tracking Research and Applications. ACM, 2012, pp. 91–98.

[8] J. H. Goldberg and X. P. Kotval, "Computer interface evaluation using eye movements: methods and constructs," International Journal of Industrial Ergonomics, vol. 24, no. 6, pp. 631–645, 1999.

[9] J. L. Sibert, M. Gokturk, and R. A. Lavine, "The reading assistant: eye gaze triggered auditory prompting for reading remediation," in Proceedings of the 13th Annual ACM Symposium on User Interface Software and Technology. ACM, 2000, pp. 101–107.

[10] A. Poole, L. J. Ball, and P. Phillips, "In search of salience: A response-time and eye-movement analysis of bookmark recognition," in People and Computers XVIII – Design for Life. Springer, 2005, pp. 363–378.

[11] M. A. Just and P. A. Carpenter, "Eye fixations and cognitive processes," Cognitive Psychology, vol. 8, no. 4, pp. 441–480, 1976.

[12] R. J. Jacob, "The use of eye movements in human-computer interaction techniques: what you look at is what you get," ACM Transactions on Information Systems (TOIS), vol. 9, no. 2, pp. 152–169, 1991.

[13] A. Hyrskykari, P. Majaranta, A. Aaltonen, and K.-J. Raiha, "Design issues of iDict: a gaze-assisted translation aid," in Proceedings of the 2000 Symposium on Eye Tracking Research & Applications. ACM, 2000, pp. 9–14.

[14] T. Selker, A. Lockerd, and J. Martinez, "Eye-R, a glasses-mounted eye motion detection interface," in CHI '01 Extended Abstracts on Human Factors in Computing Systems. ACM, 2001, pp. 179–180.

[15] R. Vertegaal, C. Dickie, C. Sohn, and M. Flickner, "Designing attentive cell phone using wearable eyecontact sensors," in CHI '02 Extended Abstracts on Human Factors in Computing Systems. ACM, 2002, pp. 646–647.

[16] C. Dickie, R. Vertegaal, C. Sohn, and D. Cheng, "eyeLook: using attention to facilitate mobile media consumption," in Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology. ACM, 2005, pp. 103–106.

[17] J. H. Goldberg and A. M. Wichansky, "Eye tracking in usability evaluation: A practitioner's guide," to appear in: Hyona, 2002.

[18] A. Navab, K. Gillespie-Lynch, S. P. Johnson, M. Sigman, and T. Hutman, "Eye-tracking as a measure of responsiveness to joint attention in infants at risk for autism," Infancy, vol. 17, no. 4, pp. 416–431, 2012.

[19] U. Rosenhall, E. Johansson, and C. Gillberg, "Oculomotor findings in autistic children," The Journal of Laryngology & Otology, vol. 102, no. 5, pp. 435–439, 1988.

[20] Z. Boraston and S.-J. Blakemore, "The application of eye-tracking technology in the study of autism," The Journal of Physiology, vol. 581, no. 3, pp. 893–898, 2007.

[21] E. M. Ellis, A. Borovsky, J. L. Elman, and J. L. Evans, "Novel word learning: An eye-tracking study. Are 18-month-old late talkers really different from their typical peers?" Journal of Communication Disorders, vol. 58, pp. 143–157, 2015.

[22] A. Fernald and V. A. Marchman, "Individual differences in lexical processing at 18 months predict vocabulary growth in typically developing and late-talking toddlers," Child Development, vol. 83, no. 1, pp. 203–222, 2012.

[23] E. Di Giorgio, C. Turati, G. Altoe, and F. Simion, "Face detection in complex visual displays: an eye-tracking study with 3- and 6-month-old infants and adults," Journal of Experimental Child Psychology, vol. 113, no. 1, pp. 66–77, 2012.

[24] T. Shechner, J. M. Jarcho, J. C. Britton, E. Leibenluft, D. S. Pine, and E. E. Nelson, "Attention bias of anxious youth during extended exposure of emotional face pairs: An eye-tracking study," Depression and Anxiety, vol. 30, no. 1, pp. 14–21, 2013.

[25] M.-J. Tsai, H.-T. Hou, M.-L. Lai, W.-Y. Liu, and F.-Y. Yang, "Visual attention for solving multiple-choice science problem: An eye-tracking analysis," Computers & Education, vol. 58, no. 1, pp. 375–385, 2012.

[26] D. J. Graham and R. W. Jeffery, "Predictors of nutrition label viewing during food purchase decision making: An eye tracking investigation," Public Health Nutrition, vol. 15, no. 2, pp. 189–197, 2012.