Crowd-scale Interactive Formal Reasoning and Analytics

Ethan Fast, Colleen Lee, Alex Aiken, Michael Bernstein, Daphne Koller, Eric Smith
UIST: ACM Symposium on User Interface Software and Technology, 2013
Large online courses often assign problems that are gradable by simple checks such as multiple choice, but these checks are inappropriate for domains in which students may produce an infinite number of correct solutions. One such domain is derivations: sequences of logical steps commonly used in assignments for technical, mathematical, and scientific subjects. We present DeduceIt, a system for creating, grading, and analyzing derivation assignments across arbitrary formal domains. DeduceIt supports assignments in any logical formalism, provides students with incremental feedback, and aggregates student paths through each proof to produce instructor analytics. DeduceIt benefits from checking thousands of derivations on the web: it introduces a proof cache, a novel data structure that leverages a crowd of students to decrease the cost of checking derivations and providing real-time, constructive feedback. We evaluate DeduceIt with 990 students in an online compilers course, finding that students take advantage of its incremental feedback and that instructors benefit from its structured insights into confusing course topics. Our work suggests that automated reasoning can extend online assignments and large-scale education to many new domains.
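To make the proof-cache idea concrete, the sketch below memoizes the validity of individual derivation steps, so a step already verified for one student is not re-checked for another. This is an illustrative assumption, not DeduceIt's actual implementation; the `ProofCache` and `toy_check` names, and the toy modus ponens checker, are hypothetical.

```python
# Hypothetical sketch of a proof cache: memoize the result of checking each
# derivation step so repeated steps across a crowd of students are verified
# only once. Not DeduceIt's actual implementation.
from typing import Callable, Dict, Tuple

Step = Tuple[str, str, str]  # (premise, rule name, conclusion)

class ProofCache:
    def __init__(self, check: Callable[[Step], bool]):
        self._check = check              # expensive underlying proof checker
        self._cache: Dict[Step, bool] = {}
        self.hits = 0                    # steps answered from the cache
        self.misses = 0                  # steps that required a full check

    def is_valid(self, step: Step) -> bool:
        if step in self._cache:
            self.hits += 1
        else:
            self.misses += 1
            self._cache[step] = self._check(step)
        return self._cache[step]

# Toy checker for a single rule, modus ponens over "A->B"-style strings.
def toy_check(step: Step) -> bool:
    premise, rule, conclusion = step
    if rule == "modus_ponens":
        antecedent, arrow, consequent = premise.partition("->")
        return arrow == "->" and antecedent != "" and consequent == conclusion
    return False

cache = ProofCache(toy_check)
# Two students submit the same step; the checker runs only once.
print(cache.is_valid(("P->Q", "modus_ponens", "Q")))  # True (full check)
print(cache.is_valid(("P->Q", "modus_ponens", "Q")))  # True (from cache)
print(cache.misses, cache.hits)                       # 1 1
```

Because each cached entry records whether a step is valid, the same structure can also serve constructive feedback: an invalid step that has been seen before can be rejected immediately, without re-running the checker.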