Exploiting semantic knowledge for robot object recognition

Abstract

This paper presents a novel approach that exploits semantic knowledge to enhance the object recognition capability of autonomous robots. Semantic knowledge is a rich source of information, naturally gathered from humans (elicitation), which can encode both objects’ geometrical/appearance properties and contextual relations. This kind of information can be exploited in a variety of robotic skills, especially for robots operating in human environments. In this paper we propose the use of semantic knowledge to eliminate the need of collecting large datasets for the training stages required in typical recognition approaches. Concretely, semantic knowledge encoded in an ontology is used to synthetically and effortlessly generate an arbitrary number of training samples for tuning Probabilistic Graphical Models (PGMs). We then employ these PGMs to classify patches extracted from 3D point clouds gathered from office environments within the UMA-offices dataset, achieving a recognition success of ∼90%, and from office and home scenes within the NYU2 dataset, yielding success rates of ∼81% and ∼69.5%, respectively. Additionally, a comparison with state-of-the-art recognition methods also based on graphical models has been carried out, revealing that our semantic-based training approach can compete with, and even outperform, those trained with a considerable number of real samples.
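The core idea of the abstract — drawing synthetic training samples from human-elicited property ranges in an ontology instead of collecting real data — can be illustrated with a minimal sketch. The ontology fragment, property names, and numeric ranges below are hypothetical, and a naive per-class Gaussian classifier stands in for the paper's actual PGMs:

```python
import math
import random

# Hypothetical ontology fragment: geometric properties (metres) of two
# office object classes, each encoded as an elicited (mean, std dev) pair.
ONTOLOGY = {
    "monitor": {"width": (0.50, 0.05), "height": (0.35, 0.05)},
    "mug":     {"width": (0.08, 0.01), "height": (0.10, 0.02)},
}

def synthesize_samples(ontology, n_per_class, seed=0):
    """Draw synthetic training samples from the ontology's property
    distributions, avoiding any collection of real sensor data."""
    rng = random.Random(seed)
    samples = []
    for label, props in ontology.items():
        for _ in range(n_per_class):
            feats = {p: rng.gauss(mu, sd) for p, (mu, sd) in props.items()}
            samples.append((feats, label))
    return samples

def train_gaussian_model(samples):
    """Fit per-class Gaussian parameters per property (a simple stand-in
    for tuning a Probabilistic Graphical Model)."""
    by_label = {}
    for feats, label in samples:
        per_prop = by_label.setdefault(label, {p: [] for p in feats})
        for p, v in feats.items():
            per_prop[p].append(v)
    model = {}
    for label, per_prop in by_label.items():
        model[label] = {}
        for p, vals in per_prop.items():
            mu = sum(vals) / len(vals)
            var = sum((v - mu) ** 2 for v in vals) / len(vals)
            model[label][p] = (mu, max(var, 1e-9))
    return model

def classify(model, feats):
    """Assign a patch's feature vector to the class maximizing the
    (naive, independence-assuming) Gaussian log-likelihood."""
    best_label, best_ll = None, float("-inf")
    for label, per_prop in model.items():
        ll = 0.0
        for p, v in feats.items():
            mu, var = per_prop[p]
            ll += -0.5 * (math.log(2 * math.pi * var) + (v - mu) ** 2 / var)
        if ll > best_ll:
            best_label, best_ll = label, ll
    return best_label
```

For example, a patch measuring roughly 0.5 m × 0.35 m would be assigned to "monitor" despite the model never having seen a real sample; the paper's contribution is doing this with richer contextual relations and full PGM inference rather than this independence assumption.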

Description

We are very grateful to our colleague E. Fernandez-Moral for providing us with the implementation of the plane-based mapping algorithm, as well as for his support during the collection of the office dataset used to evaluate our method.

Bibliographic citation

José-Raúl Ruiz-Sarmiento, Cipriano Galindo, Javier Gonzalez-Jimenez, Exploiting semantic knowledge for robot object recognition, Knowledge-Based Systems, Volume 86, 2015, Pages 131-142, ISSN 0950-7051, https://doi.org/10.1016/j.knosys.2015.05.032
