Vision-Based Autonomous Following of a Moving Platform and Landing for an Unmanned Aerial Vehicle

dc.centro: Escuela de Ingenierías Industriales
dc.contributor.author: Morales-Rodríguez, Jesús
dc.contributor.author: Castelo, Isabel
dc.contributor.author: Serra, Rodrigo
dc.contributor.author: Lima, Pedro U.
dc.contributor.author: Basiri, Meysam
dc.date.accessioned: 2023-02-20T18:31:15Z
dc.date.available: 2023-02-20T18:31:15Z
dc.date.issued: 2023-01-11
dc.departamento: Ingeniería de Sistemas y Automática
dc.description.abstract: Interest in Unmanned Aerial Vehicles (UAVs) has increased due to their versatility and variety of applications; however, their limited battery life constrains those applications. Heterogeneous multi-robot systems can offer a solution to this limitation by allowing an Unmanned Ground Vehicle (UGV) to serve as a recharging station for the aerial one. Moreover, cooperation between aerial and terrestrial robots allows them to overcome other individual limitations, such as communication link coverage or accessibility, and to solve highly complex tasks, e.g., environment exploration, infrastructure inspection or search and rescue. This work proposes a vision-based approach that enables an aerial robot to autonomously detect, follow, and land on a mobile ground platform. For this purpose, ArUco fiducial markers are used to estimate the relative pose between the UAV and UGV by processing RGB images provided by a monocular camera on board the UAV. The pose estimate is fed to a trajectory planner and four decoupled controllers that generate speed set-points relative to the UAV. Following a cascade loop strategy, these set-points are then sent to the UAV autopilot for inner-loop control. The proposed solution has been tested both in simulation, with a digital twin of a solar farm using ROS, Gazebo and ArduPilot Software-in-the-Loop (SITL), and in the real world at IST Lisbon's outdoor facilities, with a UAV built on a DJI F550 hexacopter frame and a modified Jackal ground robot, from DJI and Clearpath Robotics, respectively. Pose estimation, trajectory planning and speed set-point computation run on board the UAV on a Single Board Computer (SBC) with Ubuntu and ROS, without the need for external infrastructure.
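To make the pipeline described in the abstract concrete, the sketch below illustrates the marker-based relative pose step and a deliberately simplified set-point stage. It is not the authors' implementation: it assumes opencv-contrib-python 4.7 or later (ArucoDetector API), and the camera intrinsics, ArUco dictionary (DICT_4X4_50), 30 cm marker size, proportional gains and descent logic are illustrative placeholders rather than values taken from the paper.

# Hedged sketch of the vision pipeline summarised in the abstract: ArUco detection,
# relative pose estimation, and a toy set-point stage. All numeric values are
# placeholders; assumes opencv-contrib-python >= 4.7.
import cv2
import numpy as np

# Hypothetical intrinsics of the UAV's monocular RGB camera.
K = np.array([[920.0,   0.0, 640.0],
              [  0.0, 920.0, 360.0],
              [  0.0,   0.0,   1.0]])
DIST = np.zeros(5)            # assume negligible lens distortion
MARKER_SIDE = 0.30            # assumed side length of the marker on the UGV [m]

s = MARKER_SIDE / 2.0
# Marker corners in the marker frame, in the order the detector returns them
# (top-left, top-right, bottom-right, bottom-left), as required by IPPE_SQUARE.
OBJ_PTS = np.array([[-s,  s, 0.0], [ s,  s, 0.0],
                    [ s, -s, 0.0], [-s, -s, 0.0]], dtype=np.float32)

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
    cv2.aruco.DetectorParameters())

def relative_pose(bgr_frame):
    """Pose of the first detected marker in the camera frame, or None if not seen."""
    corners, ids, _ = detector.detectMarkers(bgr_frame)
    if ids is None:
        return None
    img_pts = corners[0].reshape(4, 2).astype(np.float32)
    ok, rvec, tvec = cv2.solvePnP(OBJ_PTS, img_pts, K, DIST,
                                  flags=cv2.SOLVEPNP_IPPE_SQUARE)
    return (rvec, tvec.ravel()) if ok else None

def speed_setpoints(rvec, tvec, kp=0.8, kp_yaw=0.5, descend_radius=0.3):
    """Toy decoupled proportional set-points (vx, vy, vz, yaw_rate), standing in for
    the paper's trajectory planner and four decoupled controllers."""
    x, y, _z = tvec                                  # marker position in camera frame [m]
    R, _ = cv2.Rodrigues(rvec)
    yaw_err = float(np.arctan2(R[1, 0], R[0, 0]))    # marker yaw as seen by the camera
    vz = -0.4 if np.hypot(x, y) < descend_radius else 0.0   # descend only once centred
    return kp * x, kp * y, vz, kp_yaw * yaw_err

In the cascade structure from the abstract, such set-points would act as the outer loop, with the UAV autopilot (ArduPilot) closing the inner attitude and velocity loops; how the commands reach the autopilot (e.g., over MAVLink from the onboard SBC) is not shown here.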
dc.description.sponsorship: This research was funded by the ISR/LARSyS Strategic Funding through the FCT project UIDB/50009/2020, the DURABLE project, under the Interreg Atlantic Area Programme through the European Regional Development Fund (ERDF), the Andalusian project UMA18-FEDERJA-090 and the University of Málaga Research Plan. Partial funding for open access charge: Universidad de Málaga.
dc.identifier.citation: Morales J, Castelo I, Serra R, Lima PU, Basiri M. Vision-Based Autonomous Following of a Moving Platform and Landing for an Unmanned Aerial Vehicle. Sensors. 2023; 23(2):829. https://doi.org/10.3390/s23020829
dc.identifier.doi: https://doi.org/10.3390/s23020829
dc.identifier.uri: https://hdl.handle.net/10630/26011
dc.language.iso: eng
dc.publisher: IOAP-MDPI
dc.rights: Attribution 4.0 International
dc.rights.accessRights: open access
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: Vehicles
dc.subject.other: Unmanned aerial vehicle
dc.subject.other: Unmanned ground vehicle
dc.subject.other: Autonomous landing
dc.subject.other: Target following
dc.subject.other: Pose estimation
dc.subject.other: Artificial fiducial markers
dc.subject.other: Cascade loop
dc.title: Vision-Based Autonomous Following of a Moving Platform and Landing for an Unmanned Aerial Vehicle
dc.type: journal article
dc.type.hasVersion: VoR
dspace.entity.type: Publication
relation.isAuthorOfPublication: 14fa0e60-c422-48ee-8093-600fb95e788c
relation.isAuthorOfPublication.latestForDiscovery: 14fa0e60-c422-48ee-8093-600fb95e788c

Files

Original bundle

Name: sensors-23-00829-v2.pdf
Size: 12.65 MB
Format: Adobe Portable Document Format
