High-fidelity 3D reconstruction for planetary exploration
| Field | Value | Lang |
| --- | --- | --- |
| dc.contributor.author | Martínez-Petersen, Alfonso | |
| dc.contributor.author | Gerdes, Levin | |
| dc.contributor.author | Rodríguez-Martínez, David | |
| dc.contributor.author | Pérez-del-Pulgar-Mancebo, Carlos Jesús | |
| dc.date.accessioned | 2026-02-18T09:16:53Z | |
| dc.date.issued | 2026-02-11 | |
| dc.departamento | Instituto Universitario de Investigación en Ingeniería Mecatrónica y Sistemas Ciberfísicos | |
| dc.departamento | Ingeniería de Sistemas y Automática | |
| dc.description.abstract | Planetary exploration increasingly relies on autonomous robotic systems capable of perceiving, interpreting, and reconstructing their surroundings in the absence of global positioning or real-time communication with Earth. Rovers operating on planetary surfaces must navigate under severe environmental constraints, limited visual redundancy, and communication delays, making onboard spatial awareness and visual localization key components for mission success. Traditional techniques based on Structure-from-Motion (SfM) or Simultaneous Localization and Mapping (SLAM) provide geometric consistency but struggle to capture radiometric detail or to scale efficiently in the unstructured, low-texture terrains typical of extraterrestrial environments. This work explores the integration of radiance field-based methods, specifically Neural Radiance Fields (NeRF) and Gaussian Splatting, into a unified, automated environment reconstruction pipeline for planetary robotics. Our system combines the Nerfstudio and COLMAP frameworks with a ROS2-compatible workflow capable of processing raw rover data directly from rosbag recordings. This approach enables the generation of dense, photorealistic, and metrically consistent 3D representations from minimal visual input, supporting improved perception and planning for autonomous systems operating in planetary-like conditions. The resulting pipeline establishes a foundation for future research in radiance field-based mapping, bridging the gap between geometric and neural representations in planetary exploration. | |
| dc.identifier.uri | https://hdl.handle.net/10630/45531 | |
| dc.language.iso | eng | |
| dc.publisher | IEEE | |
| dc.relation.eventdate | 2026 | |
| dc.relation.eventplace | Granada, Spain | |
| dc.relation.eventtitle | IEEE Conference on Artificial Intelligence (CAI) | |
| dc.relation.projectID | PID2024-160373OB-C21 | |
| dc.relation.projectID | 4000140043/22/NL/GLC/ces | |
| dc.relation.projectID | PPRO-IUI-2023-02 | |
| dc.rights | Attribution 4.0 International | en |
| dc.rights.accessRights | open access | |
| dc.rights.uri | http://creativecommons.org/licenses/by/4.0/ | |
| dc.subject | Inteligencia artificial | |
| dc.subject.other | Planetary robotics | |
| dc.subject.other | Exploration | |
| dc.subject.other | 3D reconstruction | |
| dc.subject.other | Situational awareness | |
| dc.subject.other | Artificial intelligence | |
| dc.subject.other | NeRF | |
| dc.subject.other | Gaussian splatting | |
| dc.subject.other | ROS | |
| dc.title | High-fidelity 3D reconstruction for planetary exploration | |
| dc.type | conference output | |
| dspace.entity.type | Publication | |
| relation.isAuthorOfPublication | fdab044e-453f-40cc-bc3a-4c884f9e63b0 | |
| relation.isAuthorOfPublication.latestForDiscovery | fdab044e-453f-40cc-bc3a-4c884f9e63b0 |
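The abstract describes an automated workflow that feeds rover imagery from rosbag recordings through COLMAP and Nerfstudio to produce radiance-field reconstructions. As a rough illustration only (the paper's actual scripts are not part of this record, and the extraction step below is a hypothetical placeholder), such a pipeline could be driven from the command line roughly as follows:

```shell
# 1. Extract camera frames from a ROS2 bag. `extract_images.py`, the bag
#    name, and the topic are hypothetical; this record does not include
#    the authors' actual extraction tool.
python extract_images.py --bag rover_run.db3 --topic /camera/image_raw --out frames/

# 2. Estimate camera poses with COLMAP via Nerfstudio's preprocessing wrapper.
ns-process-data images --data frames/ --output-dir dataset/

# 3. Train a radiance field: a NeRF-style model (nerfacto) or
#    Gaussian Splatting (splatfacto).
ns-train nerfacto --data dataset/
ns-train splatfacto --data dataset/
```

Here the extraction script, topic name, and directory layout are assumptions made for illustration; `ns-process-data` and `ns-train` are standard Nerfstudio entry points, with `ns-process-data` invoking COLMAP for Structure-from-Motion pose estimation.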
Files
Original bundle
- Name: IEEE_CAI_2026___High_fidelity_3D_reconstruction_for_planetary_exploration.pdf
- Size: 8.78 MB
- Format: Adobe Portable Document Format

