Backtracking Events as Indicators of Software Usability Problems

Mira Dontcheva; David Akers, Stanford CS

Seminar on People, Computers, and Design
Stanford University, October 2, 2009, 12:50-2:05pm, Gates B01

Creation-oriented software applications such as photo editors and word processors are often difficult to test with traditional laboratory usability testing methods. A diversity of creation goals and strategies results in a diversity of usability problems encountered by users. This diversity in problems translates into the need for a large pool of participants in order to identify a high percentage of the problems. However, recruiting a large pool of participants can be prohibitively expensive, due to the high costs of traditional, expert-moderated think-aloud usability testing.

To address this problem, this talk describes a new usability evaluation method called backtracking analysis, designed to automate the process of detecting and characterizing usability problems in creation-oriented applications. The key insight is that interaction breakdowns in creation-oriented applications often manifest themselves in simple backtracking operations that can be automatically logged (e.g., undo operations, erase operations, and abort operations). Backtracking analysis synchronizes these events with contextual data such as screen capture video, helping the evaluator characterize specific usability problems.
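The kind of instrumentation this relies on can be sketched in a few lines. The sketch below is purely illustrative (the class and field names are assumptions, not the authors' actual tooling): each backtracking operation is logged with its offset into the session, which is what lets an evaluator later jump to the matching moment in the screen capture video.

```python
import json
import time

# Illustrative sketch only -- not the authors' API. Backtracking
# operations (undo, erase, abort) are timestamped relative to the
# session start so they can be aligned with screen capture video.
BACKTRACKING_EVENTS = {"undo", "erase", "abort"}

class BacktrackingLogger:
    def __init__(self, session_start):
        self.session_start = session_start
        self.events = []

    def record(self, operation):
        """Log the operation only if it is a backtracking event."""
        if operation in BACKTRACKING_EVENTS:
            self.events.append({
                "operation": operation,
                "video_offset_s": round(time.time() - self.session_start, 3),
            })

    def export(self):
        """Serialize the event log for synchronization with the video."""
        return json.dumps(self.events)

logger = BacktrackingLogger(session_start=time.time())
logger.record("undo")   # logged: a backtracking event
logger.record("draw")   # ignored: forward-progress action
logger.record("abort")  # logged
print(len(logger.events))  # 2
```

In practice the event stream would come from application hooks (e.g., the undo manager) rather than explicit calls, but the essential data is just this pairing of operation type and video timestamp.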

The claim of this talk is that backtracking events are effective indicators of usability problems in creation-oriented applications, and can yield a scalable alternative to traditional laboratory usability testing. The investigation of this claim consists of five parts. First, a set of experiments demonstrates that it is possible to extract usability problem descriptions from backtracking events without the aid of a human test moderator, by pairing participants during an automated retrospective interview. Second, a within-subjects experiment with the Google SketchUp 3D modeling application shows that backtracking analysis is comparable in effectiveness (number and severity of usability problems identified, and percentage of false alarms) to the user-reported critical incident technique, a cost-effective usability evaluation method that relies on participants to report their own difficulties. Third, another experiment generalizes this result to the Adobe Photoshop application. Fourth, to situate backtracking analysis within usability evaluation practice, a between-subjects experiment explores the strengths and weaknesses of backtracking analysis compared to traditional think-aloud usability testing. Finally, this work contributes a theory to help explain the influence of task design on the effectiveness of backtracking analysis.


David Akers is a PhD student in computer science at Stanford, working with Professor Terry Winograd. His research interests include usability testing methodologies, interactive data visualization, and the design of educational software. After completing his PhD this fall, David will begin working as an Assistant Professor of Computer Science at the University of Puget Sound in Tacoma, WA.

The talks are open to the public. They are in the Gates Building, Room B01 in the basement. The nearest public parking is in the structure at Campus Drive and Roth Way.

View this talk online at CS547 on Stanford OnLine or via this video link.

Titles and abstracts for previous years are available by year and by speaker.