Show simple item record

dc.contributor.advisor: González-Jiménez, Antonio Javier
dc.contributor.author: Gómez-Ojeda, Rubén
dc.contributor.other: Ingeniería de Sistemas y Automática
dc.date.accessioned: 2020-04-20T11:35:08Z
dc.date.available: 2020-04-20T11:35:08Z
dc.date.created: 2020
dc.date.issued: 2020
dc.identifier.uri: https://hdl.handle.net/10630/19479
dc.description: - Robustness to dynamic illumination conditions, e.g. high dynamic range (HDR) environments, is also one of the main open challenges in visual odometry and SLAM. The main difficulties in these situations come both from the limitations of the sensors, for instance the automatic settings of a camera may not react fast enough to properly record dynamic illumination changes, and from limitations in the algorithms, e.g. the tracking of interest points is typically based on brightness constancy. The work of this thesis contributes to mitigating these phenomena from two different perspectives. The first addresses the problem from a deep learning perspective, enhancing images into invariant and richer representations for VO and SLAM and benefiting from the generalization properties of deep neural networks. This work also demonstrates how inserting long short-term memory (LSTM) units yields temporally consistent sequences, since the estimation depends on previous states. Secondly, a more traditional perspective is exploited to contribute a purely geometric tracking of line segments in challenging stereo streams with complex or varying illumination, since line segments are intrinsically more informative (an illustrative sketch of stereo line-segment matching follows this record). Thesis defense date: 26 February 2020
dc.description.abstract: In recent years, visual Simultaneous Localization and Mapping (SLAM) has played a role of capital importance in rapid technological advances, e.g. mobile robotics and applications such as virtual, augmented, or mixed reality (VR/AR/MR), as a vital part of their processing pipelines. As its name indicates, it comprises the estimation of the state of a robot (typically the pose) while, simultaneously and incrementally, building and refining a consistent representation of the environment, i.e. the so-called map, based on the equipped sensors. Despite the maturity reached by state-of-the-art visual SLAM techniques in controlled environments, there are still many open challenges to address before reaching a SLAM system robust to long-term operation in uncontrolled scenarios, where classical assumptions, such as static environments, no longer hold. This thesis contributes to improving the robustness of visual SLAM in harsh or difficult environments, in particular: - Low-textured environments, where traditional approaches suffer from an accuracy impoverishment and, occasionally, the absolute failure of the system. Fortunately, many such low-textured environments contain planar elements that are rich in linear shapes, so an alternative feature choice such as line segments can exploit information from structured parts of the scene. This set of contributions exploits both types of features, i.e. points and line segments, to produce visual odometry and SLAM algorithms that are robust in a broader variety of environments, leveraging them at all stages of the related processes: monocular depth estimation, visual odometry, keyframe selection, bundle adjustment, loop closing, etc. (a sketch of joint point and line-segment extraction follows this record). Additionally, an open-source C++ implementation of the proposed algorithms has been released along with the published articles and some extra multimedia material for the benefit of the community.
dc.language.iso: eng
dc.publisher: UMA Editorial
dc.rights: info:eu-repo/semantics/openAccess
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Robótica
dc.subject: Visión artificial (Robótica)
dc.subject.other: Robótica
dc.subject.other: Visión Artificial
dc.subject.other: Ingeniería de Control
dc.title: Robust Visual SLAM in Challenging Environments with Low-texture and Dynamic Illumination
dc.type: info:eu-repo/semantics/doctoralThesis
dc.centro: E.T.S.I. Informática
dc.rights.cc: Attribution-NonCommercial-NoDerivatives 4.0 International
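
The description above mentions a purely geometric tracking of line segments in challenging stereo streams. As a rough, hedged illustration of the general idea only, and not the thesis's released implementation, the sketch below detects line segments in a rectified stereo pair and matches them with the LBD binary descriptor from OpenCV's opencv_contrib line_descriptor module; this uses an appearance descriptor as a simpler stand-in for the thesis's purely geometric criterion, and the input file names and match threshold are assumptions.

// Sketch only: detecting and matching line segments across a rectified stereo
// pair with the LBD descriptor (opencv_contrib line_descriptor module).
// Illustrative of the general idea, not the thesis implementation; input
// file names and the match threshold are assumptions.
#include <opencv2/opencv.hpp>
#include <opencv2/line_descriptor.hpp>
#include <iostream>
#include <vector>

int main() {
    using namespace cv;
    using namespace cv::line_descriptor;

    // Hypothetical input images; replace with a real rectified stereo pair.
    Mat left  = imread("left.png",  IMREAD_GRAYSCALE);
    Mat right = imread("right.png", IMREAD_GRAYSCALE);
    if (left.empty() || right.empty()) return 1;

    // Detect line segments and compute LBD binary descriptors in both views.
    Ptr<BinaryDescriptor> lbd = BinaryDescriptor::createBinaryDescriptor();
    std::vector<KeyLine> linesL, linesR;
    Mat descL, descR;
    lbd->detect(left,  linesL);
    lbd->detect(right, linesR);
    lbd->compute(left,  linesL, descL);
    lbd->compute(right, linesR, descR);

    // Match the binary descriptors and keep only the closest candidates.
    Ptr<BinaryDescriptorMatcher> matcher =
        BinaryDescriptorMatcher::createBinaryDescriptorMatcher();
    std::vector<DMatch> matches, good;
    matcher->match(descL, descR, matches);
    for (const DMatch& m : matches)
        if (m.distance < 25.0f)          // arbitrary example threshold
            good.push_back(m);

    std::cout << "stereo line matches kept: " << good.size() << std::endl;
    return 0;
}

In a stereo VO front-end along the lines the description sketches, the retained matches would then be triangulated and tracked over time; the thesis itself constrains the matching geometrically (e.g. via the epipolar/rectified geometry) rather than relying on brightness constancy.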
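
The abstract's central idea of combining point and line-segment features can be pictured with a second minimal sketch, again only an illustration under assumed detector choices (ORB keypoints plus OpenCV's FastLineDetector from opencv_contrib) rather than the released thesis code; in structured but low-textured scenes the line detector still returns usable features where keypoints become scarce.

// Sketch: extract both point features (ORB) and line segments (FastLineDetector)
// from a single frame, as a point+line front-end might. Illustrative only;
// detector choices and parameters are assumptions, not the thesis code.
#include <opencv2/opencv.hpp>
#include <opencv2/ximgproc.hpp>   // createFastLineDetector (opencv_contrib)
#include <iostream>
#include <vector>

int main() {
    cv::Mat img = cv::imread("frame.png", cv::IMREAD_GRAYSCALE);  // hypothetical input
    if (img.empty()) return 1;

    // Point features: ORB keypoints and binary descriptors.
    cv::Ptr<cv::ORB> orb = cv::ORB::create(1000);
    std::vector<cv::KeyPoint> keypoints;
    cv::Mat descriptors;
    orb->detectAndCompute(img, cv::noArray(), keypoints, descriptors);

    // Line-segment features: each detection is an endpoint pair (x1, y1, x2, y2).
    cv::Ptr<cv::ximgproc::FastLineDetector> fld =
        cv::ximgproc::createFastLineDetector(30 /* min segment length, example value */);
    std::vector<cv::Vec4f> lines;
    fld->detect(img, lines);

    // A point+line VO/SLAM pipeline would match both feature sets across frames
    // and feed them to pose estimation and bundle adjustment.
    std::cout << keypoints.size() << " keypoints, "
              << lines.size() << " line segments" << std::endl;
    return 0;
}

In OpenCV builds that ship the LSD implementation, cv::createLineSegmentDetector() would be an alternative to the FastLineDetector used here.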

