Dataset with Tactile and Kinesthetic Information from a Human Forearm and Its Application to Deep Learning

dc.centro: Escuela de Ingenierías Industriales
dc.contributor.author: Pastor-Martín, Francisco
dc.contributor.author: Lin-Yang, Da-hui
dc.contributor.author: Gómez-de-Gabriel, Jesús Manuel
dc.contributor.author: García-Cerezo, Alfonso José
dc.date.accessioned: 2023-02-06T08:25:53Z
dc.date.available: 2023-02-06T08:25:53Z
dc.date.issued: 2022-11-12
dc.departamento: Ingeniería de Sistemas y Automática
dc.description.abstract: There are physical Human–Robot Interaction (pHRI) applications where the robot has to grab the human body, such as rescue or assistive robotics. Precisely estimating the grasping location when grabbing a human limb is crucial for safe manipulation of the human. Computer vision methods provide pre-grasp information, but with strong constraints imposed by field environments. Force-based compliant control, after grasping, limits the amount of applied strength. On the other hand, valuable tactile and proprioceptive information can be obtained from the pHRI gripper, which can be used to better characterize the human and the contact state between the human and the robot. This paper presents a novel dataset of tactile and kinesthetic data obtained from a robot gripper that grabs a human forearm. The dataset is collected with a three-fingered gripper that has two underactuated fingers and a fixed finger with a high-resolution tactile sensor. A palpation procedure is performed to record the shape of the forearm and to recognize the bones and muscles in different sections. Moreover, an application of the dataset is included: a fusion approach that estimates the actual grasped forearm section from both kinesthetic and tactile information with a regression deep-learning neural network (see the sketch below). First, tactile and kinesthetic data are trained separately with Long Short-Term Memory (LSTM) neural networks, since the data are sequential. Then, the outputs are fed to a fusion neural network to enhance the estimation. The experiments show good results when training on each source separately, and superior performance when the fusion approach is used.
dc.description.sponsorship: This research was funded by the University of Málaga; the Ministerio de Ciencia, Innovación y Universidades, Gobierno de España, grant number RTI2018-093421-B-I00; and the European Commission, grant number BES-2016-078237. Partial funding for open access charge: Universidad de Málaga.
dc.identifier.citation: Pastor, F.; Lin-Yang, D.-h.; Gómez-de-Gabriel, J.M.; García-Cerezo, A.J. Dataset with Tactile and Kinesthetic Information from a Human Forearm and Its Application to Deep Learning. Sensors 2022, 22, 8752. https://doi.org/10.3390/s22228752
dc.identifier.doi: 10.3390/s22228752
dc.identifier.uri: https://hdl.handle.net/10630/25906
dc.language.iso: eng
dc.publisher: MDPI
dc.relation.references: https://hdl.handle.net/10630/39824
dc.rights: Attribution 4.0 International
dc.rights.accessRights: open access
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: Automata
dc.subject.other: Physical human–robot interaction
dc.subject.other: Grippers for physical human–robot interaction
dc.subject.other: ConvLSTM
dc.subject.other: Haptic perception
dc.title: Dataset with Tactile and Kinesthetic Information from a Human Forearm and Its Application to Deep Learning
dc.type: journal article
dc.type.hasVersion: VoR
dspace.entity.type: Publication
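
Deep-learning sketch

The abstract describes a two-stage approach: one LSTM per data source (tactile and kinesthetic), whose outputs are combined by a fusion network that regresses the grasped forearm section. Below is a minimal sketch of that idea in PyTorch. The feature sizes, layer widths, and class/parameter names are illustrative assumptions, not taken from the paper; the paper's tactile branch reportedly uses ConvLSTM on tactile images, which is simplified here to flattened per-timestep features.

import torch
import torch.nn as nn

class SectionRegressor(nn.Module):
    """Tactile LSTM + kinesthetic LSTM -> fusion MLP -> section estimate.
    A sketch under assumed input shapes, not the paper's implementation."""

    def __init__(self, tactile_dim=784, kin_dim=7, hidden=64):
        # tactile_dim and kin_dim are assumed per-timestep feature sizes.
        super().__init__()
        self.tactile_lstm = nn.LSTM(tactile_dim, hidden, batch_first=True)
        self.kin_lstm = nn.LSTM(kin_dim, hidden, batch_first=True)
        # Fusion head: concatenated final hidden states -> scalar regression.
        self.fusion = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, tactile_seq, kin_seq):
        # Inputs are (batch, time, features); keep each LSTM's final hidden state.
        _, (h_tac, _) = self.tactile_lstm(tactile_seq)
        _, (h_kin, _) = self.kin_lstm(kin_seq)
        return self.fusion(torch.cat([h_tac[-1], h_kin[-1]], dim=-1))

# Usage with random stand-in sequences of 20 palpation steps:
model = SectionRegressor()
estimate = model(torch.randn(8, 20, 784), torch.randn(8, 20, 7))
print(estimate.shape)  # torch.Size([8, 1])

Note that this sketch trains the whole model jointly; the abstract instead describes training the two LSTMs separately first and then feeding their outputs to the fusion network, which would correspond to pre-training each branch before fitting the fusion head.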

Files

Original bundle

Name: sensors-22-08752-v2 (2).pdf
Size: 1.55 MB
Format: Adobe Portable Document Format