Scott MacKenzie at BayCHI: Evaluating Eye Tracking Systems for Computer Data Entry
Transcript of Scott MacKenzie at BayCHI: Evaluating Eye Tracking Systems for Computer Data Entry
York University – Department of Computer Science and Engineering
Evaluating Eye Tracking Systems for Computer Data Entry
I. Scott MacKenzie
York University
http://www.yorku.ca/mack/
Overview
• Eye tracking challenges
• Looking
• Selecting
• Evaluating
• Case studies
Eye Tracking Challenges
• “It’s not about the bike” (Lance Armstrong)
• “It’s not about the tracker”
• Evaluate the interface and the interaction, not the eye tracker
• Easier said than done (as with speech or handwriting recognition)
• Human-machine interaction
Human-Computer Interface
Using the eye as the input control
“Double Duty”
An Eye Tracking Session at CHI
[Figure: gaze alternates between input control and visual search]
Two-step Input Control
• Mouse: point-select
• Eye: look-select
  • Where is the user looking?
  • Select it!
Overview
• Eye tracking challenges
• Looking
• Selecting
• Evaluating
• Case studies
The Eye
Where Is The User Looking? (Pointing)
3 ft at 1000 ft: arctan(3 / 1000) ≈ 0.17°
Where Is The User Looking?
0.25 in at 30 in: arctan(0.25 / 30) ≈ 0.48°
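The ratios on these slides are standard visual-angle computations; a quick sketch that reproduces both figures (the function name is mine, not from the talk):

```python
import math

def visual_angle_deg(size, distance):
    """Visual angle (degrees) subtended by an object of a given
    size viewed at a given distance (same units for both)."""
    return math.degrees(math.atan(size / distance))

# 3 ft target at 1000 ft -> ~0.17 degrees
print(round(visual_angle_deg(3, 1000), 2))   # 0.17
# 0.25 in on screen at a 30 in viewing distance -> ~0.48 degrees
print(round(visual_angle_deg(0.25, 30), 2))  # 0.48
```

For angles this small the arctangent is nearly linear, which is why the slides can treat the size/distance ratio itself as the angle.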
Where Is My Thumb?
0.6 in at 25 in ≈ 1.2°
Double Trouble!
1.2 in at 25 in ≈ 2.3°
Fovea
• Area of the eye responsible for sharp central vision
• Necessary for reading, driving, sewing, etc.
• 1% of retina, but 50% of visual cortex in brain
• The fovea sees “twice the width of a thumbnail at arm’s length” (about 2-3 degrees)
Eye tracking relevance: The fovea image represents a “region of uncertainty”
Where is the User Looking?
Demo
Two-step Eye Input Control
• Mouse: point-select
• Eye: look-select
  • Where is the user looking?
  • Select it!
Overview
• Eye tracking challenges
• Looking
• Selecting
• Evaluating
• Case studies
“Smart” Target Acquisition
• Target acquisition complicated by
  • The fovea’s “region of uncertainty”
  • Eye jitter
  • Identifying a fixation from raw data
  • No natural selection method for eye tracking
• Selection methods
  • Dwell time (250 ms to 1000 ms)
  • Separate key press
  • Blink, wink, frown, etc.
  • E.g., “smart” target acquisition techniques
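Dwell-time selection reduces to a small accumulator: gaze samples on a target add to its dwell clock, leaving the target resets it, and the target fires once the threshold is crossed. A minimal sketch under my own assumptions (in particular the reset-on-exit policy; real systems usually tolerate brief track losses):

```python
class DwellSelector:
    """Minimal dwell-time selector: a target is selected after the
    gaze has remained on it for `dwell_ms` of consecutive samples."""

    def __init__(self, dwell_ms=500):
        self.dwell_ms = dwell_ms
        self.current = None   # target currently under the gaze
        self.elapsed = 0.0    # ms accumulated on that target

    def update(self, target, dt_ms):
        """Feed one gaze sample: the target under the gaze (or None)
        and the time since the previous sample, in ms."""
        if target != self.current:    # gaze moved: restart the clock
            self.current = target
            self.elapsed = 0.0
        if target is not None:
            self.elapsed += dt_ms
            if self.elapsed >= self.dwell_ms:
                self.elapsed = 0.0    # fire once, then re-arm
                return target         # selection event
        return None

# ~30 Hz tracker -> ~34 ms per sample; a 500 ms dwell fires on the 15th sample
sel = DwellSelector(dwell_ms=500)
events = [sel.update("E", 34) for _ in range(20)]
```

The dwell-time progress indicators on the following slides are essentially visualizations of this accumulator while it fills.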
Dwell-time Progress Indicators1
1 Hansen, J. P., Hansen, D. W., & Johansen, A. S. (2001). Bringing gaze-based interaction back to basics. Proceedings of HCI International 2001, 325-328. Mahwah, NJ: Erlbaum.
Shrinking Letter Progress Indicator1
1 Majaranta, P., MacKenzie, I. S., Aula, A., & Räihä, K.-J. (2003). Auditory and visual feedback during eye typing. Extended Abstracts of the ACM Conference on Human Factors in Computing Systems - CHI 2003, 766-767. New York: ACM.
Grab And Hold Algorithm1
1 Miniotas, D., Špakov, O., & MacKenzie, I. S. (2004). Eye gaze interaction with expanding targets. Extended Abstracts of the ACM Conference on Human Factors in Computing Systems - CHI 2004, 1255-1258. New York: ACM.
Next-letter Prediction1
1 MacKenzie, I. S., & Zhang, X. (2008). Eye typing using word and letter prediction and a fixation algorithm. Proceedings of the ACM Symposium on Eye Tracking Research and Applications -- ETRA 2008, 55-58. New York: ACM.
Speech-assisted Selection1
1 Zhang, Q., Imamiya, A., Go, K., & Mao, X. (2004). Resolving ambiguities of a gaze and speech interface. Proceedings of the ACM Symposium on Eye Tracking Research and Applications - ETRA 2004, 85-92. New York: ACM.
Speech-assisted Selection1
1 Miniotas, D., Spakov, O., & MacKenzie, I. S. (2006). Speech-augmented eye gaze interaction with small closely-spaced targets. Proceedings of the Symposium on Eye Tracking Research and Applications - ETRA 2006, 67-72. New York: ACM.
Snap-on Selection1
1 Tien, G., & Atkins, M. S. (2008). Improving hands-free menu selection using eye gaze glances and fixations. Proceedings of the ACM Symposium on Eye Tracking Research and Applications - ETRA 2008, 47-50. New York: ACM.
Snap-clutch Selection1
1 Istance, H., Bates, R., Hyrskykari, A., & Vickers, S. (2008). Snap clutch: A moded approach to solving the Midas touch problem. Proceedings of the ACM Symposium on Eye Tracking Research and Applications - ETRA 2008, 221-228. New York: ACM.
Fisheye Lens Selection1
1 Ashmore, M., Duchowski, A. T., & Shoemaker, G. (2005). Efficient eye pointing with a fisheye lens. Proceedings of Graphics Interface 2005, 203-210. Toronto: Canadian Information Processing Society.
Cursor Warping1
1 Zhai, S., Morimoto, C., & Ihde, S. (1999). Manual gaze input cascaded (MAGIC) pointing. Proceedings of the ACM Conference on Human Factors in Computing Systems - CHI '99, 248-253. New York: ACM.
Early-raw-smoothed-late Selection1
1 Kumar, M., Klingner, J., Winograd, T., & Paepcke, A. (2008). Improving the accuracy of gaze input for interaction. Proceedings of the ACM Symposium on Eye Tracking Research and Applications - ETRA 2008, 65-68. New York: ACM.
Gesture-based Selection1
1 Mollenbach, E., Hansen, J. P., Lillholm, M., & Gale, A. G. (2009). Single stroke gaze gestures. Extended Abstracts of the ACM Conference on Human Factors in Computing Systems - CHI 2009, 4555-4560. New York: ACM.
Nearest Neighbor Selection1
1 Sibert, L. E., & Jacob, R. J. K. (2000). Evaluation of eye gaze interaction. Proceedings of the ACM Conference on Human Factors in Computing Systems - CHI 2000, 281-288. New York: ACM.
Big Targets1
1 Hansen, J. P., Johansen, A. S., Hansen, D. W., Itoh, K., & Mashino, S. (2003). Command without a click: Dwell time typing by mouse and gaze selection. Proceedings of INTERACT 2003, 121-128. Berlin: Springer.
Demo
Overview
• Eye tracking challenges
• Looking
• Selecting
• Evaluating
• Case studies
Focus on the Interaction
• Eye tracking research tends to focus on the technology (hardware, software, algorithms)
• Focus on the interaction
  • With eye tracking, easier said than done
  • Eye tracking technology is finicky (perhaps you have a better word)
  • Remember the human-computer interface (5th slide)
Evaluation Paradigm
• Experimental in nature
• Follow “the Scientific Method”
  • Human participants
  • Independent variables (factors and levels)
  • Dependent variables (measured behaviours)
  • Counterbalancing
  • Hypothesis testing (analysis of variance)
  • Etc.
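Counterbalancing here means varying the order of conditions across participants so practice and fatigue effects average out; the usual device is a balanced Latin square. A minimal sketch of the standard construction (the helper and its name are mine, not from the talk):

```python
def balanced_latin_square(conditions):
    """Rows of condition orders, one per participant (or group).
    Each condition appears once per row and once per column; with an
    even number of conditions, each condition also immediately
    precedes every other condition equally often."""
    n = len(conditions)
    # first-row index pattern: 0, 1, n-1, 2, n-2, ...
    seq, lo, hi = [0, 1], 2, n - 1
    while len(seq) < n:
        seq.append(hi); hi -= 1
        if len(seq) < n:
            seq.append(lo); lo += 1
    # each later row shifts every index by the row number, mod n
    return [[conditions[(s + r) % n] for s in seq] for r in range(n)]

rows = balanced_latin_square(["A", "B", "C", "D"])
# rows[0] is ['A', 'B', 'D', 'C']
```

Assigning participants to rows in rotation gives every condition equal exposure to every ordinal position.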
Independent Variables (eye tracking)
Dependent Variables (eye tracking)
Read Text Events (RTE)
Eye Tracking Path Characteristics
Are there observable, measurable behaviours here that can be used as dependent variables?
Case Studies
• Case Study #1 – Eye typing
• Case Study #2 – BlinkWrite
Based on…
Background
• The problem
  • Eye typing – using the eye to enter text into an application by fixating on keys on an on-screen keyboard
• The challenge
  • Improve the speed and accuracy of input
• Ideas
  • Word prediction, letter prediction
  • Use linguistic information to correct for fixation error
Fixation Error
Q W E R T Y U I O P
A S D F G H J K L
Z X C V B N M

Desired input: EYE TRACKIN_
[Figure: two markers on the keyboard – where the user actually fixates vs. the computed fixation point]
Fixation Algorithm
• Use linguistic knowledge to choose the “correct” button
Q W E R T Y U I O P
A S D F G H J K L
Z X C V B N M

Desired input: EYE TRACKIN_
[Figure: two markers on the keyboard – where the user actually fixates vs. the computed fixation point]
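One way to picture the fixation algorithm: score each candidate key both by its distance from the computed fixation point and by how probable its letter is given the text entered so far, then take the best-scoring key. Everything below (the scoring rule, the weight, the toy probabilities) is illustrative only, not the published algorithm:

```python
import math

def correct_fixation(fix, key_centers, letter_prob, dist_weight=0.02):
    """Choose the intended key by trading off each key's distance
    from the computed fixation point (pixels) against the probability
    of that key's letter given the preceding text."""
    def score(key):
        d = math.dist(fix, key_centers[key])          # pixel distance
        return letter_prob.get(key, 1e-6) * math.exp(-dist_weight * d)
    return max(key_centers, key=score)

# Toy example: fixation lands between W and E while typing "EYE ..."
centers = {"Q": (0, 0), "W": (50, 0), "E": (100, 0)}
probs = {"E": 0.4, "W": 0.05, "Q": 0.01}   # made-up next-letter probabilities
best = correct_fixation((70, 0), centers, probs)
```

Even though the fixation point is nearer W, the high probability of E after "EYE TRACKIN" pulls the choice to E, which is the intuition the slide conveys.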
Method
• Participants
  • 10 (8 male, 2 female; 19-34 years of age)
  • All with normal vision
• Apparatus
  • Arrington Research ViewPoint
  • 30 Hz sampling
  • 0.25°-1.0° visual arc accuracy (~10-40 pixels)
  • Camera focused on a participant’s dominant eye
  • 19" 1280×1024 LCD at a distance of ~60 cm
  • Selection via key press
Setup
Independent Variables (IV)
Button Size, Word Prediction
Letter Prediction
Button Feedback
[Figure: button feedback in Normal and LetterPredict modes – eye-over and select states, with an audible “click” on selection]
Task
• Phrases presented for input
• Example (figure labels): next character to enter; error (“beep”); timeout (after six seconds)
Dependent Variables (DV)
• Speed and accuracy
• Others (not mentioned here)
Data Collection
• 10 participants ×
• 2 prediction modes ×
• 2 fixation algorithms ×
• 2 button sizes ×
• 10 sentences per condition
= 800 phrases
Results
• Entry speed
  • Grand mean: 11.7 wpm
• Error rate
  • Grand mean: 11.8%

Comparison with other results from ETRA 2008:

Method | Entry speed
EyeWrite | 4.87 wpm
Dasher (1st session) | 2.49 wpm
Dasher (10th session) | 17.26 wpm
Eye-S | 6.8 wpm (2 experts)
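The wpm figures follow the usual text-entry convention: one “word” is five characters, spaces included. A quick sketch of the computation, stated as the common convention rather than taken from the talk:

```python
def words_per_minute(characters, seconds):
    """Standard text-entry speed: 5 characters = 1 word."""
    return (characters / 5) / (seconds / 60)

# e.g. a 30-character phrase entered in 31 seconds
print(round(words_per_minute(30, 31), 1))  # 11.6 wpm
```

At the study's grand mean of 11.7 wpm, a typical 30-character test phrase takes roughly half a minute to enter.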
Entry Speed by Button Size
Difference NOT statistically significant (F(1,15) = 1.162, p > .05)
Entry Speed by Prediction Mode
Difference (marginally) significant (F(1,15) = 5.531, p < .05)
Entry Speed by Fixation Algorithm
Difference NOT statistically significant (F(1,15) = 0.878, ns)
Conclusions
• Letter prediction as good as word prediction (in our implementation)
• Fixation algorithm
  • Slight improvement
  • Sensitive to the correctness of the first letters in a word
  • Reduced error rates, but only in letter prediction mode
• Experiment was too big (two IVs max!)
  • Too much variability in the data (eye tracker issues)
Case Studies
• Case Study #1 – Eye typing
• Case Study #2 – BlinkWrite
Based on…
Background
• The problem
  • Create and evaluate a single-switch text entry method
  • Switch input using eye blinks
• The application
  • Severely disabled users (e.g., locked-in syndrome)
• The general idea
  • SAK: Scanning Ambiguous Keyboard1

1 MacKenzie, I. S. (2009). The one-key challenge: Searching for an efficient one-key text entry method. Proceedings of the ACM Conference on Computers and Accessibility - ASSETS 2009, 91-98. New York: ACM.
BlinkWrite Application
Demo
Three Types of Blinks
• Involuntary (0-200 ms)
• Key select (200-500 ms)
• Delete (>500 ms)
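The three blink classes reduce to thresholding the blink duration; a minimal sketch using the boundaries on this slide (the exact handling of the boundary values is my assumption):

```python
def classify_blink(duration_ms):
    """Map a blink duration to the BlinkWrite action classes above."""
    if duration_ms <= 200:
        return "involuntary"   # 0-200 ms: ignored
    if duration_ms <= 500:
        return "select"        # 200-500 ms: key select
    return "delete"            # > 500 ms: delete
```

Separating involuntary blinks from deliberate ones is what makes a blink usable as a switch at all; the long-blink class then doubles as the error-correction channel.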
Method
• Participants
  • 12 (8 male, 4 female; 23-31 years of age)
  • All with normal vision
• Apparatus
  • EyeTech TM3
  • Windows XP notebook
  • 15” LCD
  • (next slide)
• Task
  • Enter phrases of text
  • Corrections allowed using “long blink”
Setup
Independent Variable
• Scanning interval: 700 ms, 850 ms, 1000 ms
• Five phrases each
• Counterbalanced
Dependent Variables
• Entry speed (wpm)
• Error rate (%)
• Deletes per phrase
Results – Entry Speed
Differences NOT statistically significant (F(2,22) = 2.05, p > .05)
Results – Error Rate
Differences NOT statistically significant (F(2,22) = 1.89, p > .05)
Results – Deletes Per Phrase
Video Demo
The End (almost)
but first…
Cheap Advertising
• I’m giving a course at CHI 2010 (Atlanta in April): Empirical Research Methods for Human-Computer Interaction
• Please consider attending, or
• Invite me to give the course (or other talks) at your company
• I’m also available tomorrow to visit, meet, present, demo, discuss, etc. (departing Friday a.m.)
Thank You – Questions?