Tele-Reality for the Rest of Us

Abstract

We are rapidly moving toward a world where personal networked video cameras will be ubiquitous. Already, camera-equipped cell phones are becoming commonplace. Imagine being able to tap into all of these real-time video feeds to remotely explore the world live. We introduce RealityFlythrough, a tele-reality/telepresence system that makes this vision possible. By situating live 2D images in a 3D model of the world, RealityFlythrough allows any space to be explored remotely. No special cameras, tripods, rigs, scaffolding, or lighting are required to create the model, and no lengthy preprocessing of images is necessary. Rather than trying to achieve photorealism at every point in space, we instead focus on providing the user with a sense of how the images spatially relate to one another. By providing spatial cues in the form of dynamic transitions, we can approximate tele-reality and harness cameras in the wild. This paper focuses on the sensibility of these imperfect dynamic transitions from camera to camera. We present early experimental results that suggest that imperfect transitions are more sensible and provide a more pleasant user experience than no transitions at all.
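
To make the notion of a dynamic transition concrete, the sketch below (not the paper's implementation; the pose representation, units, and names are assumptions for illustration) shows one simple way such a transition could be realized: each live feed is tagged with an estimated camera pose, and a virtual camera is interpolated from the source pose to the destination pose while the two images cross-fade.

```python
# Minimal sketch of a camera-to-camera transition: interpolate a virtual
# camera pose between two situated feeds while cross-fading their images.
# All names, units, and the simple linear interpolation are illustrative
# assumptions, not RealityFlythrough's actual rendering pipeline.

from dataclasses import dataclass
import math


@dataclass
class CameraPose:
    x: float      # position in the world model (hypothetical units)
    y: float
    z: float
    yaw: float    # heading in radians


def lerp(a: float, b: float, t: float) -> float:
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t


def lerp_angle(a: float, b: float, t: float) -> float:
    """Interpolate an angle along the shortest arc."""
    diff = (b - a + math.pi) % (2.0 * math.pi) - math.pi
    return a + diff * t


def transition_pose(src: CameraPose, dst: CameraPose, t: float) -> CameraPose:
    """Virtual camera pose at fraction t of the src -> dst transition."""
    return CameraPose(
        x=lerp(src.x, dst.x, t),
        y=lerp(src.y, dst.y, t),
        z=lerp(src.z, dst.z, t),
        yaw=lerp_angle(src.yaw, dst.yaw, t),
    )


def transition_blend(t: float) -> tuple:
    """Cross-fade weights (source image, destination image) at fraction t."""
    return 1.0 - t, t


if __name__ == "__main__":
    src = CameraPose(0.0, 0.0, 1.6, 0.0)
    dst = CameraPose(4.0, 2.0, 1.6, math.pi / 2)
    for step in range(5):
        t = step / 4
        pose = transition_pose(src, dst, t)
        w_src, w_dst = transition_blend(t)
        print(f"t={t:.2f} pose=({pose.x:.1f}, {pose.y:.1f}, {pose.z:.1f}, "
              f"yaw={pose.yaw:.2f}) blend=({w_src:.2f}, {w_dst:.2f})")
```

Even this crude interpolation conveys the spatial relationship between the two viewpoints, which is the kind of cue the paper argues makes imperfect transitions preferable to abrupt cuts.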

Pre-2018 CSE ID: CS2004-0778
