
Real-Time View Correction for Mobile Devices


Thomas Schöps, Martin R. Oswald, Pablo Speciale, Shuoran Yang, and Marc Pollefeys
Department of Computer Science, ETH Zürich, Switzerland

TVCG / ISMAR 2017

Abstract

We present a real-time method for rendering novel virtual camera views from RGB-D (color and depth) input captured from a different viewpoint. Color and depth information missing due to incomplete input or disocclusions is efficiently inpainted in a temporally consistent way; the inpainting treats strong image gradients as likely depth discontinuities. We present our method in the context of a view correction system for mobile devices, and discuss how to obtain a screen-camera calibration as well as options for acquiring depth input. Our method has use cases in both augmented and virtual reality applications. We demonstrate the speed of our system and the visual quality of its results in multiple experiments in the paper and in the supplementary video.
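The core step the abstract describes — rendering a captured RGB-D frame from a new virtual viewpoint — can be illustrated with a generic depth-image-based forward warp. The sketch below is not the paper's GPU implementation; it is a minimal CPU version under assumed pinhole intrinsics `K` and a hypothetical 4x4 rigid transform `T_new_from_src`. Each source pixel is back-projected with its depth, transformed into the novel camera frame, and splatted with a z-buffer; pixels the warp leaves uncovered are exactly the disocclusions that the paper's inpainting step would fill.

```python
import numpy as np

def warp_rgbd_to_novel_view(color, depth, K, T_new_from_src):
    """Forward-warp an RGB-D frame (h, w, 3) / (h, w) into a novel view.

    Generic depth-image-based rendering sketch (an assumption for
    illustration, not the paper's implementation). Returns the warped
    color and depth images; output depth 0 marks disoccluded holes.
    """
    h, w = depth.shape
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]

    # Back-project every pixel to a 3D point in the source camera frame.
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pts = np.stack([(u - cx) * depth / fx,
                    (v - cy) * depth / fy,
                    depth], axis=-1)

    # Rigid transform into the novel camera frame (4x4 homogeneous matrix).
    R, t = T_new_from_src[:3, :3], T_new_from_src[:3, 3]
    pts_new = pts @ R.T + t
    z_new = pts_new[..., 2]

    # Keep pixels with valid source depth that land in front of the camera.
    valid = (depth > 0) & (z_new > 1e-6)
    xs, ys, zs = (pts_new[..., 0][valid], pts_new[..., 1][valid],
                  z_new[valid])
    src_cols = color[valid]

    # Project into the novel view and keep in-bounds target pixels.
    un = np.round(fx * xs / zs + cx).astype(np.int64)
    vn = np.round(fy * ys / zs + cy).astype(np.int64)
    inb = (un >= 0) & (un < w) & (vn >= 0) & (vn < h)
    un, vn, zs, src_cols = un[inb], vn[inb], zs[inb], src_cols[inb]

    # Painter's splatting: write far points first so the nearest surface
    # wins (later fancy-index assignments overwrite earlier ones).
    order = np.argsort(-zs)
    out_color = np.zeros_like(color)
    out_depth = np.zeros((h, w), dtype=depth.dtype)
    out_color[vn[order], un[order]] = src_cols[order]
    out_depth[vn[order], un[order]] = zs[order]
    return out_color, out_depth
```

With an identity transform the warp reproduces the input exactly; for a genuinely different viewpoint, zero-depth pixels in the output mark the disocclusions that the temporally consistent, gradient-guided inpainting described above must fill.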

Publication

  • Real-Time View Correction for Mobile Devices.
    Thomas Schöps, Martin R. Oswald, Pablo Speciale, Shuoran Yang, and Marc Pollefeys.
    Special issue of IEEE Transactions on Visualization and Computer Graphics (TVCG), 2017. Presented at IEEE International Symposium on Mixed and Augmented Reality (ISMAR) 2017.

Video


© CVG, ETH Zürich · lm@inf.ethz.ch