
MARViN: Mobile AR Dataset with Visual-Inertial Data

Authors :
Liu, Changkun
Zhao, Yukun
Braud, Tristan Camille
Publication Year :
2024

Abstract

Accurate camera relocalisation is a fundamental technology for extended reality (XR), facilitating the seamless integration and persistence of digital content within the real world. Benchmark datasets that measure camera pose accuracy have driven progress in visual relocalisation research. Despite notable progress in this field, few datasets incorporate Visual-Inertial Odometry (VIO) data from typical mobile AR frameworks such as ARKit or ARCore. This paper presents a new dataset, MARViN, comprising diverse indoor and outdoor scenes captured using heterogeneous mobile consumer devices. The dataset includes camera images, ARCore or ARKit VIO data, and raw sensor data for several mobile devices, together with the corresponding ground-truth poses. MARViN allows us to demonstrate the capability of ARKit and ARCore to provide relative pose estimates that closely approximate ground truth within a short timeframe. Subsequently, we evaluate the performance of mobile VIO data in enhancing absolute pose estimation in both a desktop simulation and a user study. MARViN is available at https://github.com/XRIM-Lab/MarViN.
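The abstract describes comparing VIO relative pose estimates against ground-truth poses. The standard way to score such estimates is to form the relative transform between two timestamps for both trajectories and measure the translation and rotation discrepancy between them. The sketch below illustrates this comparison with NumPy on 4x4 homogeneous pose matrices; the function names and evaluation details are illustrative assumptions, not the paper's actual evaluation code.

```python
import numpy as np

def relative_pose(T_a, T_b):
    """Relative transform from pose T_a to pose T_b.

    Both arguments are 4x4 homogeneous camera-to-world matrices.
    """
    return np.linalg.inv(T_a) @ T_b

def relative_pose_error(T_est_rel, T_gt_rel):
    """Translation (same units as the poses) and rotation (degrees)
    error between an estimated and a ground-truth relative pose."""
    E = np.linalg.inv(T_gt_rel) @ T_est_rel
    t_err = np.linalg.norm(E[:3, 3])
    # Rotation angle of the residual rotation matrix, clipped for
    # numerical safety before arccos.
    cos_angle = np.clip((np.trace(E[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    r_err = np.degrees(np.arccos(cos_angle))
    return t_err, r_err

# Example: ground truth moves 1 m along x; the VIO estimate
# (hypothetical numbers) overshoots by 0.1 m.
T0 = np.eye(4)
T1_gt = np.eye(4);  T1_gt[0, 3] = 1.0
T1_est = np.eye(4); T1_est[0, 3] = 1.1

t_err, r_err = relative_pose_error(
    relative_pose(T0, T1_est),
    relative_pose(T0, T1_gt),
)
print(round(t_err, 3), round(r_err, 3))  # prints "0.1 0.0"
```

Measuring error on relative poses, rather than absolute ones, removes the effect of the arbitrary starting frame of a VIO session, which is why it suits the short-timeframe claim made in the abstract.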

Details

Database :
OAIster
Notes :
English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1440207197
Document Type :
Electronic Resource