The Semantic Paintbrush: Interactive 3D Mapping and Recognition in Large Outdoor Spaces
Ondrej Miksik1     Vibhav Vineet2     Morten Lidegaard1     Ram Prasaath1     Matthias Nießner3     Stuart Golodetz1     Stephen L. Hicks1     Patrick Pérez4     Shahram Izadi5     Philip H. S. Torr1    
    1University of Oxford     2Stanford University     3Technical University of Munich     4Technicolor R&I     5Microsoft Research
SIGCHI 2015
Abstract

We present an augmented reality system for large-scale 3D reconstruction and recognition in outdoor scenes. Unlike prior work, which reconstructs scenes using active depth cameras, we use a purely passive stereo setup, allowing for outdoor use and an extended sensing range. Our system not only produces a map of the 3D environment in real time, but also allows the user to draw (or ‘paint’) with a laser pointer directly onto the reconstruction to segment the model into objects. Given these examples, our system then learns to segment other parts of the 3D map during online acquisition. Unlike typical object recognition systems, ours therefore places the user ‘in the loop’ to segment particular objects of interest, rather than learning from predefined databases. The laser pointer additionally helps to interactively ‘clean up’ the stereo reconstruction and the final 3D map. Using our system, a user can, within minutes, capture a full 3D map, segment it into objects of interest, and refine parts of the model during capture. We provide full technical details of our system to aid replication, as well as a quantitative evaluation of its components. We demonstrate the possibility of using our system to help the visually impaired navigate through spaces. Beyond this use, our system can be used to play large-scale augmented reality games, to share maps online and augment street-view data, and for more detailed car and pedestrian navigation.