Show simple item record

dc.contributor: Robotics and Mechatronics Group (TEP-119)
dc.contributor.author: Sánchez-Montero, Manuel
dc.contributor.author: Morales-Rodríguez, Jesús
dc.contributor.author: Martínez-Rodríguez, Jorge Luis
dc.contributor.author: Fernández-Lozano, Juan Jesús
dc.contributor.author: García-Cerezo, Alfonso José
dc.contributor.other: Ingeniería de Sistemas y Automática
dc.date.accessioned: 2023-02-21T11:15:24Z
dc.date.available: 2023-02-21T11:15:24Z
dc.date.issued: 2022-07-26
dc.identifier.citation: Sánchez, M.; Morales, J.; Martínez, J.L.; Fernández-Lozano, J.J.; García-Cerezo, A. Automatically Annotated Dataset of a Ground Mobile Robot in Natural Environments via Gazebo Simulations. Sensors 2022, 22, 5599. https://doi.org/10.3390/s22155599
dc.identifier.uri: https://hdl.handle.net/10630/26015
dc.description.abstract: This paper presents a new synthetic dataset, openly available at https://www.uma.es/robotics-and-mechatronics/info/132852/negs-ugv-dataset, obtained from Gazebo simulations of an Unmanned Ground Vehicle (UGV) moving in different natural environments. To this end, a Husky mobile robot equipped with a tridimensional (3D) Light Detection and Ranging (LiDAR) sensor, a stereo camera, a Global Navigation Satellite System (GNSS) receiver, an Inertial Measurement Unit (IMU) and wheel tachometers has followed several paths using the Robot Operating System (ROS). Both points from LiDAR scans and pixels from camera images have been automatically labeled with their corresponding object class. For this purpose, unique reflectivity values and flat colors have been assigned to each object present in the modeled environments. As a result, a public dataset, which also includes 3D pose ground truth, is provided as ROS bag files and as human-readable data. Potential applications include supervised learning and benchmarking for UGV navigation in natural environments. Moreover, to allow researchers to easily modify the dataset or to directly use the simulations, the required code has also been released.
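The abstract above describes labeling camera pixels automatically by rendering each object class with its own flat color. A minimal sketch of that idea follows, in Python with NumPy; the palette below is purely illustrative (the dataset's actual color-to-class assignment is not given in this record), and `label_image` is a hypothetical helper, not part of the released code.

```python
import numpy as np

# Hypothetical flat-color palette: each object class in the simulated
# world is rendered with exactly one RGB color. These colors are
# illustrative assumptions, not the dataset's real palette.
PALETTE = {
    (0, 128, 0): "tree",
    (128, 128, 128): "rock",
    (0, 0, 255): "water",
    (139, 69, 19): "ground",
}

def label_image(rgb: np.ndarray) -> np.ndarray:
    """Map an H x W x 3 flat-color image to an H x W array of class ids.

    Pixels whose color is not in the palette are marked -1 (unknown).
    """
    class_ids = {color: i for i, color in enumerate(PALETTE)}
    labels = np.full(rgb.shape[:2], -1, dtype=np.int32)
    for color, cid in class_ids.items():
        # Boolean mask of pixels that exactly match this class color.
        mask = np.all(rgb == np.array(color, dtype=rgb.dtype), axis=-1)
        labels[mask] = cid
    return labels
```

Because the simulator renders unmixed colors, an exact per-pixel match suffices; real camera images would need anti-aliasing disabled or a nearest-color lookup instead.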
dc.description.sponsorship: This work was partially supported by the Andalusian project UMA18-FEDERJA-090 and by the Spanish project RTI2018-093421-B-I00.
dc.language.iso: eng
dc.relation.isreferencedby: Sánchez M, Morales J, Martínez JL, Fernández-Lozano JJ, García-Cerezo A. Automatically Annotated Dataset of a Ground Mobile Robot in Natural Environments via Gazebo Simulations. Sensors. 2022; 22(15):5599. https://doi.org/10.3390/s22155599
dc.rights: info:eu-repo/semantics/openAccess
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/4.0/
dc.subject: Robotics
dc.subject.other: Synthetic dataset
dc.subject.other: Gazebo simulator
dc.subject.other: UGV navigation
dc.subject.other: Natural environments
dc.subject.other: Automatic data labeling
dc.subject.other: 3D LiDAR
dc.subject.other: Stereo camera
dc.title: Automatically Annotated Dataset of a Ground Mobile Robot in Natural Environments via Gazebo Simulations
dc.type: info:eu-repo/semantics/dataset
dc.centro: Escuela de Ingenierías Industriales
dc.identifier.doi: 10.24310/riuma.26015
dc.rights.cc: Attribution-NonCommercial-ShareAlike 4.0 International
dc.publication.year: 2022
dc.version: 1.0
dc.identifier.url: http://u.uma.es/dr3/datasetcode/
dc.identifier.url: http://u.uma.es/dr4/datasetforest_1/
dc.identifier.url: http://u.uma.es/dr5/datasetforest_2/
dc.identifier.url: http://u.uma.es/dr6/datasethill_1/
dc.identifier.url: http://u.uma.es/dr7/datasethill_2/
dc.identifier.url: http://u.uma.es/dr8/datasetlake_1/
dc.identifier.url: http://u.uma.es/dr9/datasetlake_2/
dc.identifier.url: http://u.uma.es/dsa/datasetpark_1/
dc.identifier.url: http://u.uma.es/dsb/datasetpark_2/





Except where otherwise noted, this item's license is described as Attribution-NonCommercial-ShareAlike 4.0 International.