Taming HAL

 Asaf Degani, NASA Ames Research Center

Seminar on People, Computers, and Design
Stanford University, February 18, 2005

Machines are an integral part of our lives, from the alarm clocks that wake us in the morning to computers, information systems, and automated control systems. Most of our interactions with these machines are problem-free, but more often than we would like, they can be irritating and confusing. This is frequently harmless, such as a VCR recording the wrong show, but when it involves critical systems like medical devices, navigation systems, and autopilots, it can be a matter of life or death. In this presentation, which is based on my new book, Taming HAL, I will explain the kinds of miscommunication that frequently occur between humans and machines by delineating the differences between models of machine behavior and models of (user) tasks and processes. I will present several examples, from consumer electronics and medical devices to ship navigation and modern autopilot design, in order to illustrate fundamental interface and interaction problems that can plague systems. Using examples from 2001: A Space Odyssey, I will show parallels between HAL's dangerous behavior and current automation design.

Asaf Degani is a research scientist at Code IC (NASA Ames Research Center). Working at Ames since 1989, he has conducted observational and simulation experiments on human-factors aspects of cockpit design, procedure usage and development, and automated decision aids. His current research is on the development of interface design methodologies for controlling automated systems (e.g., the next-generation Mars rovers) and mathematical approaches to the analysis and design of emergency/abnormal procedures (used by pilots of commercial aircraft). He holds a maritime master's ticket, a private pilot license, and a Ph.D. in Industrial and Systems Engineering from Georgia Tech.


View this talk online at CS547 on Stanford OnLine

Titles and abstracts for previous years are available by year and by speaker.