Instrument Detection and Descriptive Gesture Segmentation on a Robotic Surgical Maneuvers Dataset

dc.centro: Escuela de Ingenierías Industriales
dc.contributor.author: Rivas-Blanco, Irene
dc.contributor.author: López-Casado, Carmen
dc.contributor.author: Herrera-López, Juan María
dc.contributor.author: Cabrera-Villa, José
dc.contributor.author: Pérez del Pulgar, Carlos J.
dc.date.accessioned: 2024-09-20T12:19:38Z
dc.date.available: 2024-09-20T12:19:38Z
dc.date.created: 2024
dc.date.issued: 2024
dc.departamento: Ingeniería de Sistemas y Automática
dc.description.abstract: Large datasets play a crucial role in the progression of surgical robotics, facilitating advancements in surgical task recognition and automation. Moreover, public datasets enable the comparative analysis of various algorithms and methodologies, thereby assessing their effectiveness and performance. The ROSMA (Robotic Surgical Maneuvers) dataset provides 206 trials of common surgical training tasks performed with the da Vinci Research Kit (dVRK). In this work, we extend the ROSMA dataset with two annotated subsets: ROSMAT24, which contains bounding box annotations for instrument detection, and ROSMAG40, which contains high- and low-level gesture annotations. We propose an annotation method that provides independent labels for the right-handed and left-handed tools. For instrument detection, we validate our proposal with a YOLOv4 model in two experimental scenarios and demonstrate the network's ability to generalize to unseen scenarios. For gesture segmentation, we propose two label categories: high-level annotations that describe gestures at the maneuver level, and low-level annotations that describe gestures at a fine-grained level. To validate this proposal, we designed a recurrent neural network based on a bidirectional long short-term memory layer. We present results for four cross-validation experimental setups, reaching a mAP of up to 77.35%.
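The gesture segmentation network described in the abstract is built around a bidirectional long short-term memory (LSTM) layer that labels every time step of a trial. The following is a minimal PyTorch sketch of such a sequence-to-sequence model; the feature dimension, hidden size, and number of gesture classes are illustrative placeholders, not values taken from the paper.

import torch
import torch.nn as nn

class GestureSegmenter(nn.Module):
    # Bidirectional LSTM that assigns a gesture label to every time step
    # of a kinematic sequence (sequence-to-sequence segmentation).
    def __init__(self, input_dim=16, hidden_dim=128, num_gestures=10):
        super().__init__()
        self.bilstm = nn.LSTM(input_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # 2 * hidden_dim: forward and backward outputs are concatenated.
        self.classifier = nn.Linear(2 * hidden_dim, num_gestures)

    def forward(self, x):
        # x: (batch, time, input_dim) kinematic features, e.g. from the dVRK
        out, _ = self.bilstm(x)        # (batch, time, 2 * hidden_dim)
        return self.classifier(out)    # per-time-step gesture logits

# Usage: a batch of 4 sequences, 200 time steps, 16 features each
model = GestureSegmenter()
logits = model(torch.randn(4, 200, 16))
predictions = logits.argmax(dim=-1)    # (4, 200) gesture labels

Training such a model against per-time-step gesture annotations like those in ROSMAG40 would typically use a cross-entropy loss summed over time steps.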
dc.description.sponsorship: PID2021-125050OA-I00
dc.identifier.citation: Rivas-Blanco, I.; López-Casado, C.; Herrera-López, J.M.; Cabrera-Villa, J.; Pérez del Pulgar, C.J. Instrument Detection and Descriptive Gesture Segmentation on a Robotic Surgical Maneuvers Dataset. Appl. Sci. 2024, 14, 3701. https://doi.org/10.3390/app14093701
dc.identifier.doi: https://doi.org/10.3390/app14093701
dc.identifier.uri: https://hdl.handle.net/10630/32775
dc.language.iso: eng
dc.publisher: MDPI
dc.rights.accessRights: open access
dc.subject: Surgery - Apparatus and instruments
dc.subject: Robotics
dc.subject.other: Robotic dataset
dc.subject.other: Instrument detection
dc.subject.other: Gesture segmentation
dc.subject.other: Surgical robotics
dc.title: Instrument Detection and Descriptive Gesture Segmentation on a Robotic Surgical Maneuvers Dataset
dc.type: journal article
dc.type.hasVersion: VoR
dspace.entity.type: Publication

Files

Original bundle

Name: applsci-14-03701-v2.pdf
Size: 4.56 MB
Format: Adobe Portable Document Format
