Pario: the Next Step Beyond Audio and Video
Todd C. Mowry, Computer Science Department Carnegie Mellon University
Seminar on People, Computers, and Design
Stanford University January 9, 2009
While audio and video technologies have had a profound impact on our lives, it is important to remember that we do not live in a world of just sounds and moving images: we live in a physical, 3D world. To enable new classes of applications that are currently unthinkable, we would like to create a new technology (which we call “pario”) that will allow us to physically render moving 3D objects as real artifacts. Just as audio and video technologies allow us to capture and reproduce sound and moving images, respectively, with “pario” we could capture and reproduce the shape, motion, and appearance of arbitrary 3D objects. At Carnegie Mellon University and Intel Research Pittsburgh, we are exploring hardware and software techniques to make this vision a reality through something that we call “claytronics”. Claytronics is analogous to modeling clay that can control its own shape. It is composed of very large numbers of very tiny robots that can collectively morph into arbitrary shapes under software control. In this talk, I will describe the technical progress that we have made so far on claytronics, and suggest what this technology might enable.
Todd C. Mowry is a professor in the Computer Science Department at Carnegie Mellon University. He received his PhD from Stanford University in 1994, and he was an assistant professor at the University of Toronto from 1994 through 1997. Professor Mowry’s research interests span a broad set of systems areas, as well as database performance and modular robotics. He currently co-leads the Claytronics project and the Log-Based Architectures project. From 2004 through 2007, Professor Mowry served a rotation as the Director of Intel’s research lab in Pittsburgh. He is currently on sabbatical at Stanford during the 2008-2009 academic year.
View this talk online at CS547 on Stanford OnLine or using this video link.