Bennett Wilburn, Neel Joshi, Katherine Chou, Marc Levoy, Mark Horowitz
Stanford University
Submitted to SIGGRAPH 2004
Figure: The 96-camera array used for this work.
Abstract
We explore the application of dense camera arrays to view interpolation across space and time for dynamic scenes. Large video camera arrays are typically synchronized, but we show that staggering camera triggers provides a much richer set of samples on which to base the interpolation. We do not increase the total number of samples; we merely distribute them more effectively in time. We use optical flow to interpolate new views. Within this framework, we find that the dense space-time sampling provided by staggered timing improves the robustness of the interpolation. We present a novel optical flow method that combines a plane plus parallax framework with knowledge of camera spatial and temporal offsets to generate flow fields for virtual images at new space-time locations. We present results interpolating video from a 96-camera array using this method.
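To illustrate the staggered-trigger idea from the abstract, the sketch below computes per-camera trigger delays that spread exposures evenly across one frame period, so the array's combined samples cover time uniformly without any camera capturing more frames. This is a minimal illustration, not the paper's implementation; the function name and parameters are hypothetical.

```python
# Hypothetical sketch of staggered camera triggering: each camera still
# runs at the same frame rate, but its trigger is delayed so the array's
# samples are distributed uniformly in time.

def stagger_offsets(num_cameras: int, frame_period_ms: float) -> list[float]:
    """Return a trigger delay in milliseconds for each camera, spacing
    the cameras evenly across a single frame period."""
    step = frame_period_ms / num_cameras
    return [i * step for i in range(num_cameras)]

# Example: 4 cameras running at 10 fps (100 ms frame period) yield an
# effective temporal sampling of 40 samples per second across the array.
offsets = stagger_offsets(4, 100.0)
```

With all triggers synchronized instead, the same four cameras would sample only 10 distinct instants per second; staggering raises the effective temporal sampling rate by a factor equal to the number of cameras.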