Automatic detection of surgical instruments' state in laparoscopic video images using neural networks

Martín Vicario, Celia and Oropesa García, Ignacio and Sánchez Margallo, Juan Antonio and Sánchez Margallo, Francisco Miguel and Gómez Aguilera, Enrique J. and Sánchez González, Patricia (2017). Automatic detection of surgical instruments' state in laparoscopic video images using neural networks. In: "XXXV Congreso de la Sociedad Española de Ingeniería Biomédica (CASEIB 2017)", 29/11/2017 - 01/12/2017, Bilbao, España. ISBN 978-84-9082-797-0. pp. 293-296.

Description

Title: Automatic detection of surgical instruments' state in laparoscopic video images using neural networks
Author/s:
  • Martín Vicario, Celia
  • Oropesa García, Ignacio
  • Sánchez Margallo, Juan Antonio
  • Sánchez Margallo, Francisco Miguel
  • Gómez Aguilera, Enrique J.
  • Sánchez González, Patricia
Item Type: Presentation at Congress or Conference (Article)
Event Title: XXXV Congreso de la Sociedad Española de Ingeniería Biomédica (CASEIB 2017)
Event Dates: 29/11/2017 - 01/12/2017
Event Location: Bilbao, España
Title of Book: Libro de Actas del XXXV Congreso de la Sociedad Española de Ingeniería Biomédica (CASEIB 2017)
Date: December 2017
ISBN: 978-84-9082-797-0
Subjects:
Faculty: E.T.S.I. Telecomunicación (UPM)
Department: Tecnología Fotónica y Bioingeniería
Creative Commons Licenses: Attribution - Non commercial - No derivative works


Abstract

Software-based solutions such as virtual reality simulators and serious games can be useful assets for training minimally invasive surgery technical skills. However, their high cost and lack of realism/fidelity can sometimes be a drawback for their incorporation in training facilities. In this sense, the hardware interface plays an important role as the physical connection between the learner and the virtual world. The EVA Tracking System provides computer vision-based information about the position and orientation of the instruments in an inexpensive and unobtrusive manner, but lacks information about the aperture state of the clamps, which limits the system's functionalities. This article presents a new solution for detecting the instrument's aperture state using artificial vision and machine learning techniques. To this end, videos recorded in a laparoscopic training box are used to build a data set. In each frame, the instrument clamp is segmented within a region of interest by means of color markers. The classifier is modeled using an Artificial Neural Network. The trained prediction model achieves an accuracy of 94% on the validation dataset and an error of 6% on independent evaluation video sequences. Results show that the model provides a competent solution for detecting the clamp's aperture state. Future work will address the integration of the model into EVA and into a virtual environment, the KTS serious game.
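The pipeline outlined in the abstract (colour-marker segmentation of the clamp, a region of interest around it, and a neural-network classifier of the aperture state) can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, colour thresholds, ROI size, features, and network weights are all assumptions.

```python
import numpy as np

def segment_marker(frame, lower, upper):
    """Binary mask of pixels whose RGB values fall inside the marker colour range.

    Assumed stand-in for the paper's colour-marker segmentation step.
    """
    return np.all((frame >= lower) & (frame <= upper), axis=-1)

def roi_around_marker(frame, mask, half=8):
    """Crop a square region of interest centred on the detected marker pixels."""
    ys, xs = np.nonzero(mask)
    cy, cx = int(ys.mean()), int(xs.mean())
    y0, y1 = max(cy - half, 0), min(cy + half, frame.shape[0])
    x0, x1 = max(cx - half, 0), min(cx + half, frame.shape[1])
    return frame[y0:y1, x0:x1]

def mlp_forward(x, W1, b1, W2, b2):
    """One hidden layer with ReLU and a sigmoid output: P(clamp open)."""
    h = np.maximum(0.0, x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

# Toy 32x32 RGB frame with a synthetic green marker patch.
frame = np.zeros((32, 32, 3), dtype=np.uint8)
frame[10:14, 10:14] = (0, 200, 0)

mask = segment_marker(frame, lower=(0, 150, 0), upper=(50, 255, 50))
roi = roi_around_marker(frame, mask)

# Untrained toy weights; in the paper the network is trained on labelled frames.
rng = np.random.default_rng(0)
features = roi.mean(axis=(0, 1)) / 255.0   # mean RGB of the ROI as a trivial feature
W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)
W2, b2 = rng.normal(size=4), 0.0
p_open = mlp_forward(features, W1, b1, W2, b2)  # probability in (0, 1)
```

In the actual system the classifier would be trained on labelled video frames and the segmentation tuned to the real marker colours; the sketch only shows how the stages described in the abstract fit together.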

More information

Item ID: 49858
DC Identifier: http://oa.upm.es/49858/
OAI Identifier: oai:oa.upm.es:49858
Deposited by: Memoria Investigacion
Deposited on: 24 Mar 2018 10:07
Last Modified: 24 Mar 2018 10:07