LC Technologies Eyegaze System (left), Card, Moran, and Newell's Model Human Processor (center), and Kieras and Meyer's EPIC (right)


CIS 607 - Tracking visual cognition in human-computer interaction

A research seminar by Prof. Anthony J. Hornof

Spring, 2003 - 2 Credits - CRN 35088
Friday, 1:00 PM-2:20 PM
200 Deschutes

Overview

This research seminar will examine how cognitive psychology, the measurement of eye movements, and computer programming can be integrated to build and refine psychological theory, predict aspects of human performance, contribute to the design and analysis of useful and usable computer systems, and improve accessibility for people with physical impairments.

Cognitive modeling is an important component of this endeavor. Cognitive modeling refers to the construction, analysis, and empirical validation of computer programs that simulate the perception, memory, cognition, and action that people use to accomplish tasks.
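As a toy illustration of what "a computer program that simulates perception, cognition, and action" can mean, the sketch below steps through perceive-decide-act cycles to predict how long it takes to find and click an item in a menu. This is not EPIC, ACT-R, or any real architecture, and all of the cycle durations are illustrative assumptions (loosely in the spirit of the Model Human Processor's roughly 100 ms processor cycles), not validated parameters.

```python
# Toy cognitive model: predict the time to find and click a target in a
# menu by simulating a serial, top-to-bottom perceive-decide-act search.
# All durations are illustrative assumptions, not established parameters.

PERCEPTUAL_CYCLE_MS = 100   # encode one fixated item (assumed)
COGNITIVE_CYCLE_MS = 50     # decide whether it is the target (assumed)
SACCADE_MS = 30             # move the eyes to the next item (assumed)
POINT_AND_CLICK_MS = 1100   # move the mouse to the item and click (assumed)

def predict_search_time(menu_items, target):
    """Simulate a serial top-to-bottom search of menu_items and return
    the predicted task time in milliseconds."""
    total_ms = 0
    for item in menu_items:
        # Perceive the fixated item, then decide whether it is the target.
        total_ms += PERCEPTUAL_CYCLE_MS + COGNITIVE_CYCLE_MS
        if item == target:
            return total_ms + POINT_AND_CLICK_MS
        total_ms += SACCADE_MS  # saccade to the next item and continue
    raise ValueError("target not in menu")
```

For example, predict_search_time(["File", "Edit", "View", "Help"], "View") predicts a longer time than searching for "File", because the model "looks at" each item in turn. Empirical validation would then compare such predictions, and the simulated fixation sequence, against observed task times and eye movement data.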

Eye tracking, in which a device monitors the position of the user's gaze on the computer screen, is another important component of this research endeavor. Eye tracking is useful as a post hoc analysis technique for figuring out how a person did a task, and can be used to evaluate, validate, and refine cognitive models.

Eye tracking is also useful as an input device, in which a user's gaze position is fed to the computer in real time. This is useful for automatically gauging where a user is putting his or her attention, and for people who can move their eyes but otherwise have severe mobility impairments. This seminar will examine some of the uses of eye tracking for "attentive user interfaces," and for providing increased computer accessibility for people with physical impairments.

This seminar should be of particular interest to students who are interested in human-computer interaction, eye tracking, applied cognitive psychology, cognitive architectures, and accessibility.

The five important things to learn in this seminar

1. Cognitive models are computer programs that behave in the way that humans behave. Cognitive modeling can inform the design of useful and usable human-computer interfaces by (1) providing accurate post hoc explanations of how people accomplish tasks on a computer and (2) providing a foundation for building accurate a priori predictive models.

2. Predictive cognitive models will form the core of predictive visual interface analysis tools, which will predict human-computer visual interaction based on a description of the task, user profile, and the visual layout. A major challenge in building a predictive visual layout analysis tool is determining which aspects of human visual cognition must be included in the simulation at the core of the tool. A great deal is known about vision. A difficult but essential challenge will be to determine how much of the visual system must be included for a model to make accurate predictions, and how to represent those details in a computer program.

3. Cognitive strategies will be a crucial, central component in any predictive model of visual search.

4. Eye tracking is used in human-computer interaction in two modes: post hoc analysis and real-time input. In post hoc mode, eye movement data are used like any other dependent variable in a psychological experiment: to form theories about how people behave and to generalize across people and tasks. In real-time mode, the eye tracker monitors where a user is looking on a computer screen, and the computer responds based on where the user is looking. This can be used for a wide range of applications. One is "attentive user interfaces," which monitor where a user is putting his or her focus and accommodate his or her needs, such as by pausing a video when the user stops watching to answer the phone. Another set of applications gives mobility-impaired users the ability to communicate, such as by typing with their eyes.
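The real-time mode can be sketched as a simple gaze-contingent loop. The sketch below assumes a stream of gaze samples as (x, y) screen coordinates, or None when no gaze is detected; real eye trackers such as the Eyegaze system expose their own vendor-specific APIs, and nothing here is taken from an actual SDK.

```python
# Sketch of a gaze-contingent "attentive" media player: pause playback
# when the user's gaze leaves the video region, resume when it returns.
# Gaze samples are hypothetical (x, y) tuples in screen pixels, or None
# when the tracker detects no gaze (e.g., eyes closed or off-screen).

def process_gaze_stream(samples, video_region):
    """Given an iterable of gaze samples and a video region
    (left, top, right, bottom), return the playback state
    ('playing' or 'paused') after each sample."""
    left, top, right, bottom = video_region
    states = []
    for sample in samples:
        on_video = (sample is not None
                    and left <= sample[0] <= right
                    and top <= sample[1] <= bottom)
        # Play only while the gaze is on the video region.
        states.append("playing" if on_video else "paused")
    return states
```

A deployed system would need to smooth over blinks and brief tracking losses before pausing; this sketch reacts to every sample to keep the gaze-contingent idea visible.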

5. Collecting eye movement data for any randomly chosen human-computer interaction task is not likely to lead to a very interesting result. It is difficult to identify a task for which eye movement data will clearly contribute to a better understanding of how people do that task. The following is not a very interesting question: How do people move their eyes when they do task X? The following is much more interesting: A theory has been established pertaining to how people do task X. It has been established by collecting and analyzing the standard psychological measures of speed and accuracy, but not based on eye movement data. The theory can be supported or refuted with eye movement data, as follows. If eye movement Y is observed, the theory is supported. If eye movement Z is observed, the theory is refuted. It is difficult to identify such ideal circumstances.

What will be expected of students who attend this seminar

The course is open to all UofO grad students who are interested in learning more about the topics listed above. You will be expected to attend all ten sessions, read the papers assigned for each class (roughly one journal article per week), provide a thoughtful written response to the assignment posed each week, participate in discussions in an appropriate and constructive manner, and successfully complete an occasional quiz. Let me know during the first week of class if you will have any difficulty fulfilling these requirements.

Readings

We will usually read one or two articles per week. The following is a preliminary reading list:

Salvucci, D. D., & Lee, F. J. (2003). Simple cognitive modeling in a complex cognitive architecture. To appear in Human Factors in Computing Systems: CHI 2003 Conference Proceedings. New York: ACM Press. Downloadable at http://www.mcs.drexel.edu/~salvucci/CHI03/CHI03.pdf

Hornof, A. (in review). Cognitive strategies for the visual search of hierarchical computer displays. Submitted to Human-Computer Interaction. Downloadable at http://www.cs.uoregon.edu/~hornof/downloads/Hierarchical.pdf

Kieras, D. E. (in press). Model-based Evaluation. In J. Jacko & A. Sears (Eds.), The Human-Computer Interaction Handbook. Lawrence Erlbaum Associates.

Jacob, R. J. K., & Karn, K. S. (2003). Eye tracking in human-computer interaction and usability research: Ready to deliver the promises (Section commentary). In J. Hyona, R. Radach, & H. Deubel (Eds.), The Mind's Eyes: Cognitive and Applied Aspects of Eye Movements. Oxford: Elsevier Science. Downloadable from http://www.cs.tufts.edu/~jacob/papers/ecem.pdf

Findlay, J. M., & Gilchrist, I. D. (1998). Eye guidance and visual search. In G. Underwood (Ed.), Eye Guidance in Reading and Scene Perception (pp. 295-312). Amsterdam: Elsevier.

There will likely be another two or three cognitive psychology or visual search papers here in the schedule.

LC Technologies (2003). The Eyegaze Development System: A Tool for Eyetracking Applications. Fairfax, VA: LC Technologies, Inc.

Hornof, A. J. (2002). Simulating the Human Visual "Find" Command. A grant proposal submitted to the National Science Foundation.

Shell, J., Selker, T., & Vertegaal, R. (2003). Interacting with groups of computers. In Special Issue on Attentive User Interfaces, Communications of the ACM, 46(3). Downloadable at http://www.hml.queensu.ca/papers/shellcacm0303.pdf

Zhai, S. (2003). What's in the eyes for attentive input. In Special Issue on Attentive User Interfaces, Communications of the ACM, 46(3). Downloadable via http://www.acm.org/dl

Vertegaal, R. (2003). Attentive user interfaces (Editorial). In Special Issue on Attentive User Interfaces, Communications of the ACM, 46(3). Downloadable at http://www.hml.queensu.ca/papers/vertegaalcacm0303.pdf

LC Technologies (2003). The Eyegaze Communication System. Online document available at http://eyegaze.com/indexdis.htm