InDepth: Real-time Depth Inpainting for Mobile Augmented Reality

Publication: Journal Article
Zhang, Y; Scargill, T; Vaishnav, A; Premsankar, G; Di Francesco, M; Gorlatova, M
Published in: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
March 1, 2022

Mobile Augmented Reality (AR) demands realistic rendering of virtual content that seamlessly blends into the physical environment. For this reason, AR headsets and recent smartphones are increasingly equipped with Time-of-Flight (ToF) cameras to acquire depth maps of a scene in real time. ToF cameras are cheap and fast; however, they suffer from several issues that affect the quality of depth data, ultimately hampering their use for mobile AR. Among them, scale errors of virtual objects, which appear much bigger or smaller than they should, are particularly noticeable and unpleasant. This article addresses these challenges by proposing InDepth, a real-time depth inpainting system based on edge computing. InDepth employs a novel deep neural network (DNN) architecture to improve the accuracy of depth maps obtained from ToF cameras. The DNN fills holes and corrects artifacts in the depth maps with high accuracy and eight times lower inference time than the state of the art. An extensive performance evaluation in real settings shows that InDepth reduces the mean absolute error by a factor of four with respect to ARCore DepthLab. Finally, a user study reveals that InDepth is effective in rendering correctly scaled virtual objects, outperforming DepthLab.
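The paper's DNN-based inpainting is far more sophisticated, but the underlying task it solves, filling zero-valued holes in a ToF depth map, can be illustrated with a naive diffusion-style fill. This is a hypothetical sketch for intuition only (function name, parameters, and the mean-of-neighbors rule are all assumptions, not the paper's method):

```python
import numpy as np

def fill_depth_holes(depth, max_iters=50):
    """Naively fill holes (pixels equal to 0) in a depth map by
    repeatedly assigning each hole the mean of its valid 4-neighbors.
    A DNN-based inpainter like InDepth's instead learns scene structure,
    but the input/output contract is the same: depth map in, dense depth map out."""
    d = depth.astype(float).copy()
    for _ in range(max_iters):
        holes = d == 0
        if not holes.any():
            break
        # Shifted copies of the map: each pixel sees its four cardinal neighbors.
        up    = np.pad(d, ((1, 0), (0, 0)))[:-1, :]
        down  = np.pad(d, ((0, 1), (0, 0)))[1:, :]
        left  = np.pad(d, ((0, 0), (1, 0)))[:, :-1]
        right = np.pad(d, ((0, 0), (0, 1)))[:, 1:]
        stack = np.stack([up, down, left, right])
        valid = stack > 0                       # neighbors that carry real depth
        counts = valid.sum(axis=0)
        sums = np.where(valid, stack, 0.0).sum(axis=0)
        fill = np.divide(sums, counts,
                         out=np.zeros_like(sums), where=counts > 0)
        # Only fill holes that have at least one valid neighbor this pass.
        mask = holes & (counts > 0)
        d[mask] = fill[mask]
    return d
```

Such local averaging blurs depth discontinuities at object edges, which is exactly where scale errors in AR rendering arise; this is the gap a learned inpainting network is designed to close.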

Published In

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies

DOI

10.1145/3517260

EISSN

2474-9567

Publication Date

March 1, 2022

Volume

6

Issue

1

Related Subject Headings

  • 4608 Human-centred computing
  • 4606 Distributed computing and systems software
  • 4602 Artificial intelligence

Citation

APA: Zhang, Y., Scargill, T., Vaishnav, A., Premsankar, G., Di Francesco, M., & Gorlatova, M. (2022). InDepth: Real-time Depth Inpainting for Mobile Augmented Reality. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 6(1). https://doi.org/10.1145/3517260

Chicago: Zhang, Y., T. Scargill, A. Vaishnav, G. Premsankar, M. Di Francesco, and M. Gorlatova. “InDepth: Real-time Depth Inpainting for Mobile Augmented Reality.” Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 6, no. 1 (March 1, 2022). https://doi.org/10.1145/3517260.

ICMJE/NLM: Zhang Y, Scargill T, Vaishnav A, Premsankar G, Di Francesco M, Gorlatova M. InDepth: Real-time Depth Inpainting for Mobile Augmented Reality. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. 2022 Mar 1;6(1).

MLA: Zhang, Y., et al. “InDepth: Real-time Depth Inpainting for Mobile Augmented Reality.” Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, vol. 6, no. 1, Mar. 2022. Scopus, doi:10.1145/3517260.
