This article presents a visual–inertial dataset gathered in indoor and outdoor scenarios with a custom handheld sensor rig, totaling more than 80 min of recordings. The dataset contains hardware-synchronized data from a commercial stereo camera (Bumblebee®2), a custom stereo rig, and an inertial measurement unit. The most distinctive feature of this dataset is the strong presence of low-textured environments and scenes with dynamic illumination, which are recurrent corner cases for visual odometry and simultaneous localization and mapping (SLAM) methods. The dataset comprises 32 sequences and is provided with ground-truth poses at the beginning and end of each sequence, thus allowing the accumulated drift to be measured in each case. We provide a baseline evaluation of five existing state-of-the-art visual and visual–inertial methods on a subset of the dataset. We also make available open-source tools for evaluation purposes, as well as the intrinsic and extrinsic calibration parameters of all sensors in the rig. The dataset is available for download at http://mapir.uma.es/work/uma-visual-inertial-dataset
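To illustrate the endpoint-based evaluation described above, the following is a minimal sketch of how accumulated drift can be computed when ground truth is available only at the start and end of a sequence. The function names, the SE(3) representation as 4×4 homogeneous matrices, and the convention of aligning the estimate to the ground truth at the first pose are assumptions for illustration; they do not describe the dataset's own evaluation tools.

```python
import numpy as np

def se3_inverse(T):
    """Invert a 4x4 homogeneous transform (assumed SE(3) convention)."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def endpoint_drift(T_gt_start, T_gt_end, T_est_start, T_est_end):
    """Accumulated drift from ground-truth poses at sequence start and end.

    Compares the relative motion over the whole sequence in the estimated
    and ground-truth frames (equivalent to aligning both trajectories at
    the first pose), returning translational drift and rotational drift
    in degrees.
    """
    # Relative motion from the first to the last pose, in each frame.
    rel_gt = se3_inverse(T_gt_start) @ T_gt_end
    rel_est = se3_inverse(T_est_start) @ T_est_end
    # Error transform between the two relative motions.
    err = se3_inverse(rel_gt) @ rel_est
    t_drift = np.linalg.norm(err[:3, 3])
    cos_angle = np.clip((np.trace(err[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    r_drift = np.degrees(np.arccos(cos_angle))
    return t_drift, r_drift
```

Because intermediate ground truth is unavailable, this metric captures only the drift accumulated over the full sequence, which is precisely the quantity the start/end ground-truth poses are intended to measure.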