The UMA-VI dataset: Visual–inertial odometry in low-textured and dynamic illumination environments

Publisher

SAGE Journals

Abstract

This article presents a visual–inertial dataset gathered in indoor and outdoor scenarios with a handheld custom sensor rig, totaling over 80 minutes of data. The dataset contains hardware-synchronized data from a commercial stereo camera (Bumblebee®2), a custom stereo rig, and an inertial measurement unit. Its most distinctive feature is the strong presence of low-textured environments and scenes with dynamic illumination, which are recurrent corner cases for visual odometry and simultaneous localization and mapping (SLAM) methods. The dataset comprises 32 sequences and provides ground-truth poses at the beginning and end of each sequence, allowing the accumulated drift to be measured in each case. We provide a trial evaluation of five existing state-of-the-art visual and visual–inertial methods on a subset of the dataset. We also make available open-source evaluation tools, as well as the intrinsic and extrinsic calibration parameters of all sensors in the rig. The dataset is available for download at http://mapir.uma.es/work/uma-visual-inertial-dataset
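Since ground truth is only available at the start and end of each sequence, accumulated drift can be measured as the discrepancy between the estimated and ground-truth relative motion over the whole sequence. The following is a minimal sketch of that computation, assuming 4×4 homogeneous pose matrices; the function name and interface are illustrative, not the dataset's own evaluation tooling:

```python
import numpy as np

def accumulated_drift(T_gt_start, T_gt_end, T_est_start, T_est_end):
    """Drift between estimated and ground-truth end-to-end motion.

    All arguments are 4x4 homogeneous transforms (world-from-body poses).
    Returns (translation error in the pose units, rotation error in degrees).
    """
    # Relative motion from start to end, per ground truth and per estimate
    rel_gt = np.linalg.inv(T_gt_start) @ T_gt_end
    rel_est = np.linalg.inv(T_est_start) @ T_est_end
    # Error transform: residual motion left over after removing ground truth
    err = np.linalg.inv(rel_gt) @ rel_est
    trans_err = np.linalg.norm(err[:3, 3])
    # Rotation angle recovered from the trace of the rotation block
    cos_a = (np.trace(err[:3, :3]) - 1.0) / 2.0
    rot_err = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return trans_err, rot_err
```

Expressing drift via the relative motion makes the measure independent of the (arbitrary) frame in which the estimated trajectory is reported.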

Description

Dataset article. Available at http://mapir.uma.es/work/uma-visual-inertial-dataset

Bibliographic citation

zuniga2020umavi


Creative Commons license

Except where otherwise noted, this item's license is described as Attribution-NonCommercial-NoDerivatives 4.0 International.