Where To Place: A Real-Time Visual Saliency Based Label Placement for Augmented Reality Applications

Abstract

Textual overlays/labels add contextual information in Augmented Reality (AR) applications. The spatial placement of labels is a challenging task because labels (i) must not occlude the object/scene of interest, and (ii) must be placed optimally for better interpretation of the scene. To this end, we present a novel method for optimal placement of labels in AR. We formulate the method as an objective function that minimizes both occlusion of visually salient regions in the scene of interest and temporal jitter, facilitating coherence in real-time AR applications. The main focus of the proposed algorithm is real-time label placement on low-end Android phones/tablets. Sophisticated state-of-the-art algorithms for optimal positioning of textual labels work only on static images and are often too inefficient for real-time performance on such devices. We demonstrate the efficiency of our method by porting the algorithm to a smartphone/tablet. Further, we capture objective and subjective metrics to determine the efficacy of the method: objective metrics include the computation time taken to determine the label location and the Label Occlusion over Saliency (LOS) score over salient regions in the scene; subjective metrics include position, temporal coherence of the overlay, color, and responsiveness.
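
To make the idea concrete, here is a minimal sketch of how a saliency-aware label placement cost of this kind could look. It assumes a precomputed saliency map, a set of candidate label rectangles, and a simple weighted sum of a saliency-occlusion term (akin to the LOS score) and a temporal-jitter term; the weights `alpha`/`beta`, the candidate grid, and the exact form of the cost are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def label_occlusion_over_saliency(saliency, rect):
    """Mean saliency covered by a label rectangle (x, y, w, h).

    'saliency' is a 2-D array normalized to [0, 1]; higher values mark
    visually salient pixels that a label should avoid occluding.
    """
    x, y, w, h = rect
    patch = saliency[y:y + h, x:x + w]
    return float(patch.mean()) if patch.size else 1.0

def place_label(saliency, candidates, prev_pos=None, alpha=1.0, beta=0.5):
    """Pick the candidate rectangle minimizing a weighted sum of
    saliency occlusion and temporal jitter (distance from the
    previous label position). Weights are illustrative only."""
    h_img, w_img = saliency.shape
    diag = np.hypot(h_img, w_img)  # normalizer for the jitter term
    best_rect, best_cost = None, np.inf
    for rect in candidates:
        occlusion = label_occlusion_over_saliency(saliency, rect)
        if prev_pos is not None:
            jitter = np.hypot(rect[0] - prev_pos[0],
                              rect[1] - prev_pos[1]) / diag
        else:
            jitter = 0.0
        cost = alpha * occlusion + beta * jitter
        if cost < best_cost:
            best_rect, best_cost = rect, cost
    return best_rect, best_cost

# Example: a random saliency map and a coarse grid of candidate positions.
saliency = np.random.rand(240, 320)
candidates = [(x, y, 80, 24) for y in range(0, 200, 40)
                             for x in range(0, 240, 40)]
rect, cost = place_label(saliency, candidates, prev_pos=(120, 80))
print(rect, cost)
```

In a per-frame loop, the chosen rectangle would feed back in as `prev_pos` for the next frame, so the jitter term keeps the label from jumping between nearby minima.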

Publication
IEEE International Conference on Image Processing, 2018.