Partners from NTUA participated in the physical XXIV ISPRS Congress 2022, held 5–9 June, presenting the peer-reviewed paper entitled “Pose Estimation through Mask-R CNN and vSLAM in Large-Scale Outdoors Augmented Reality”.
The work presented concerns Deep Learning (DL) embedded in Mobile Augmented Reality (MAR), which enables a new information-delivery paradigm. In the context of 6-DoF pose estimation, powerful DL networks could provide a direct solution for AR systems. However, their concurrent operation requires a significant number of computations per frame and leads to both misclassifications and localization errors. The paper presents a hybrid, lightweight solution for 3D tracking of arbitrary geometry in outdoor MAR scenarios. The camera pose obtained from the ARCore SDK and a vSLAM algorithm is combined with the semantic and geometric output of a CNN object detector to validate and improve tracking performance in large-scale, uncontrolled outdoor environments.

The methodology involves three main steps: i) training a Mask R-CNN model to extract class, bounding-box and mask predictions; ii) real-time detection, segmentation and localization of the region of interest (ROI) in camera frames; and iii) computation of 2D-3D correspondences to enhance the pose estimation of a 3D overlay. The dataset holds 30 images of the rock of St. Modestos – Modi in Meteora, Greece, in which the ROI is an area with characteristic geological features. A comparative evaluation of the prototype system against the original one, as well as against R-CNN and Fast R-CNN detectors, demonstrates higher precision and stable visualization at a distance of half a kilometre, while tracking time decreased by 42% during far-field AR sessions.
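The third step, recovering camera pose from 2D-3D correspondences, can be illustrated with a minimal, self-contained sketch. This is not the authors' implementation (which fuses ARCore/vSLAM poses with detector output): it uses synthetic data and a plain Direct Linear Transform (DLT) resection, one standard way to estimate a projection matrix from at least six 2D-3D point pairs.

```python
import numpy as np

def dlt_pose(points_3d, points_2d):
    """Estimate the 3x4 camera projection matrix P from >= 6
    2D-3D correspondences via the Direct Linear Transform."""
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        # Each correspondence contributes two linear equations in P.
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # Least-squares solution: right singular vector of the smallest
    # singular value, reshaped into a 3x4 matrix (defined up to scale).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

# Synthetic check: project known (hypothetical) 3D scene points with a
# known camera, then recover the projection from the correspondences.
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=float)
Rt = np.hstack([np.eye(3), np.array([[0.5], [-0.2], [5.0]])])
P_true = K @ Rt
pts3d = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1],
                  [1, 1, 0], [1, 0, 1], [0.3, 0.7, 0.2]], dtype=float)
proj = (P_true @ np.hstack([pts3d, np.ones((7, 1))]).T).T
pts2d = proj[:, :2] / proj[:, 2:]          # perspective divide

P_est = dlt_pose(pts3d, pts2d)
reproj = (P_est @ np.hstack([pts3d, np.ones((7, 1))]).T).T
reproj = reproj[:, :2] / reproj[:, 2:]     # invariant to P's scale/sign
err = np.abs(reproj - pts2d).max()
print(f"max reprojection error: {err:.2e} px")
```

In a real pipeline the 2D points would come from the detector's mask or bounding box and the 3D points from the scene model, with a robust solver (e.g. RANSAC-wrapped PnP) replacing the bare DLT to handle noisy correspondences.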
The authors of the paper are Argyro Maria Boutsi, Nikos Bakalos and Charalabos Ioannidis, members of the Laboratory of Photogrammetry of NTUA.
The full paper is available here.