RT Journal Article
T1 Automatically annotated dataset of a ground mobile robot in natural environments via Gazebo simulations
A1 Sánchez-Montero, Manuel
A1 Morales-Rodríguez, Jesús
A1 Martínez-Rodríguez, Jorge Luis
A1 Fernández-Lozano, Juan Jesús
A1 García-Cerezo, Alfonso José
K1 Robotics
AB This paper presents a new synthetic dataset obtained from Gazebo simulations of an Unmanned Ground Vehicle (UGV) moving in different natural environments. To this end, a Husky mobile robot equipped with a three-dimensional (3D) Light Detection and Ranging (LiDAR) sensor, a stereo camera, a Global Navigation Satellite System (GNSS) receiver, an Inertial Measurement Unit (IMU) and wheel tachometers has followed several paths using the Robot Operating System (ROS). Both points from LiDAR scans and pixels from camera images have been automatically labeled into their corresponding object class. For this purpose, unique reflectivity values and flat colors have been assigned to each object present in the modeled environments. As a result, a public dataset, which also includes 3D pose ground truth, is provided as ROS bag files and as human-readable data. Potential applications include supervised learning and benchmarking for UGV navigation in natural environments. Moreover, to allow researchers to easily modify the dataset or to directly use the simulations, the required code has also been released.
PB MDPI
YR 2022
FD 2022
LK https://hdl.handle.net/10630/29495
UL https://hdl.handle.net/10630/29495
LA eng
NO Sánchez, M.; Morales, J.; Martínez, J.L.; Fernández-Lozano, J.J.; García-Cerezo, A. Automatically Annotated Dataset of a Ground Mobile Robot in Natural Environments via Gazebo Simulations. Sensors 2022, 22, 5599. https://doi.org/10.3390/s22155599
NO Andalusian project UMA18-FEDERJA-090 and Spanish project RTI2018-093421-B-I00
DS RIUMA. Repositorio Institucional de la Universidad de Málaga
RD 20 Jan 2026