We are rapidly moving toward a world where personal networked video
cameras are ubiquitous. Already, camera-equipped cell phones are becoming
commonplace. Imagine being able to tap into all of these live video feeds to
remotely explore the world in real time. We introduce RealityFlythrough, a
telepresence system that makes this vision possible. By situating live 2D video
feeds in a 3D model of the world, RealityFlythrough allows any space to be
explored remotely. No special cameras, tripods, rigs, scaffolding, or lighting
are required to create the model, and no lengthy preprocessing of images is
necessary. Rather than trying to achieve photorealism at every point in space,
we focus on providing the user with a sense of how the video streams
relate to one another spatially. By providing cues in the form of dynamic
transitions, we can approximate photorealistic telepresence while harnessing
cameras "in the wild." This paper describes the RealityFlythrough system and
reports on a live flythrough experience. We find that telepresence can work in
the wild using only commodity hardware and off-the-shelf software, and that
imperfect transitions are sensible and provide a compelling user experience.
Pre-2018 CSE ID: CS2005-0814