CIS 607 - Cognitive Modeling of Multimodal Multitasking User Interfaces

A research seminar by Prof. Anthony J. Hornof

Winter, 2008 - 2 Credits - CRN 26202
Mondays, 2 PM - 3:20 PM
200 Deschutes

Prof. Hornof's office hours for the term will be Tuesdays and Thursdays 1:30 - 2:30 PM, or by appointment.

Overview

This research seminar will advance an understanding of (a) human performance when engaged in multitasking behavior that requires interaction with complex multimodal auditory and visual displays and (b) how to simulate and ultimately predict human performance in such situations by means of computational cognitive modeling, in order to inform the design of multimodal watchstations. Computational cognitive models are computer programs that behave in some way like humans. Scientific and technical objectives include developing these computational cognitive models and advancing a framework for the future development of models that simulate parallel visual-perceptual and auditory-perceptual processing in multitasking situations.
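For students new to this kind of modeling, the toy sketch below illustrates the general idea only; it is not EPIC or any model used in the course, and every timing parameter is an invented placeholder assumed purely for illustration. A simulated auditory processor detects a localized sound cue, a simulated ocular-motor processor then shifts gaze to the cued display, and each step accrues simulated time that could, in a real model, be compared against human data.

# Toy illustration of a computational cognitive model (NOT EPIC; all timing
# parameters below are invented placeholders, assumed only for illustration).
AUDITORY_DETECTION_MS = 100   # assumed time to detect and localize the 3D sound cue
SACCADE_PREP_MS = 150         # assumed time to prepare the eye movement
SACCADE_EXECUTION_MS = 30     # assumed time to execute the eye movement

def simulate_cued_glance(cue_location: str) -> int:
    """Return total simulated time (ms) to look at the display cued by a 3D sound."""
    clock_ms = 0
    clock_ms += AUDITORY_DETECTION_MS   # auditory processor localizes the cue
    clock_ms += SACCADE_PREP_MS         # cognition prepares a saccade to the cued location
    clock_ms += SACCADE_EXECUTION_MS    # eyes move to the cued display
    print(f"Gaze arrives at {cue_location} after {clock_ms} ms of simulated time.")
    return clock_ms

if __name__ == "__main__":
    simulate_cued_glance("secondary display, upper left")

A real architecture such as EPIC would run perceptual, cognitive, and motor processors in parallel and derive its timing parameters from the psychological literature rather than from placeholders like these.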

Two fundamental questions will be explored in this seminar: (a) How can people use 3D auditory displays to direct their visual attention to appropriate areas of a visual display when trying to do multiple tasks in parallel, such as actively engaging one visual display while monitoring visual activity on a second display? How can 3D audio help a person to "listen" to that visual display even when not looking at it? (b) How can this multimodal dual-task behavior be modeled by a computer program that could ultimately be used to predict aspects of human performance in such situations, and so assist user interface designers in specifying systems that maximize human-perceptual throughput and effective multitasking behavior?

This class relates to Dr. Hornof's research on Multimodal Multitasking.

Evaluation

Each student will pass this course by attending the sessions and submitting each week's homework at the start of class. A student who misses four classes or homework assignments will not pass the course. Medical and approved-in-advance absences are not penalized.

Weekly Homework

Unless otherwise specified, your weekly homework is to actively read each week's paper(s) or chapter(s), taking notes on and responding to what you are reading. This is a great practice to develop as a researcher. You can take your notes in whatever structure and format feels best to you, and your notes can be handwritten or typed, but each summary should include:

Here are two good examples: Carswell (1992) and Dourish (2006). Note that they are each very different, but that both satisfy all of the above criteria.

Readings
 
Week One (1/7/08):
No reading assignment.
 
Week Two (1/14/08):
Postponed because instructor presenting research off campus.
 
Week Three (1/21/08):
No class. MLK holiday.
 
Week Four (1/28/08):
Kieras, D. E. and D. E. Meyer (1997). "An overview of the EPIC architecture for cognition and performance with application to human-computer interaction." Human-Computer Interaction 12(4): 391-438. PDF
 
Week Five (2/4/08):
Kieras and Meyer (1997) continued.
 
Week Six (2/11/08):
Kieras, D. E., Ballas, J., & Meyer, D. E. (2001). Computational Models for the Effects of Localized Sound Cuing in a Complex Dual Task (EPIC Report No. 13). Ann Arbor, Michigan: University of Michigan, Department of Electrical Engineering and Computer Science. PDF
 
Week Seven (2/18/08):
Chapter 1 from: Begault, D. R. (2000). 3-D Sound for Virtual Reality and Multimedia. Ames Research Center, Moffett Field, California, National Aeronautics and Space Administration. PDF
 
Week Eight (2/25/08):
Chapters 2 and 3 from: Begault, D. R. (2000). 3-D Sound for Virtual Reality and Multimedia. Ames Research Center, Moffett Field, California, National Aeronautics and Space Administration. PDF
 
Week Nine (3/3/08):
Pierno, A. C., A. Caria, et al. (2004). "Comparing effects of 2-D and 3-D visual cues during aurally aided target acquisition." Human Factors 46(4): 728-737. PDF
 
Week Ten (3/10/08):
Bregman, A. S. (1990). Auditory Scene Analysis. MIT Press.
Preface: Nonprintable PDF and Printable PDF
Chapter 1: Nonprintable PDF (6.5 MB) and Printable PDF (13 MB)
If you are interested, the entire book (nonprintable) is available online through the UO libraries.

Other Possible Readings

Perrott, D. R., T. Sadralodabai, et al. (1991). "Aurally aided visual search in the central visual field: Effects of visual load and visual enhancement of the target." Human Factors 33(4): 389-400.

Perrott, D. R., K. Saberi, et al. (1990). "Auditory psychomotor coordination and visual search performance." Perception and Psychophysics 48(3): 214-226.