Automatic detection of surgical instruments' state in laparoscopic video images using neural networks

Martín Vicario, Celia; Oropesa García, Ignacio; Sánchez Margallo, Juan Antonio; Sánchez Margallo, Francisco Miguel; Gómez Aguilera, Enrique J. and Sánchez González, Patricia (2017). Automatic detection of surgical instruments' state in laparoscopic video images using neural networks. In: "XXXV Congreso de la Sociedad Española de Ingeniería Biomédica (CASEIB 2017)", 29/11/2017 - 01/12/2017, Bilbao, Spain. ISBN 978-84-9082-797-0. pp. 293-296.

Description

Title: Automatic detection of surgical instruments' state in laparoscopic video images using neural networks
Author(s):
  • Martín Vicario, Celia
  • Oropesa García, Ignacio
  • Sánchez Margallo, Juan Antonio
  • Sánchez Margallo, Francisco Miguel
  • Gómez Aguilera, Enrique J.
  • Sánchez González, Patricia
Document Type: Conference or Workshop Paper (Article)
Event Title: XXXV Congreso de la Sociedad Española de Ingeniería Biomédica (CASEIB 2017)
Event Dates: 29/11/2017 - 01/12/2017
Event Venue: Bilbao, Spain
Book Title: Libro de Actas del XXXV Congreso de la Sociedad Española de Ingeniería Biomédica (CASEIB 2017)
Date: December 2017
ISBN: 978-84-9082-797-0
Subjects:
School: E.T.S.I. Telecomunicación (UPM)
Department: Tecnología Fotónica y Bioingeniería
Creative Commons License: Attribution - NoDerivatives - NonCommercial

Full Text

PDF - Download (291kB)

Abstract

Software-based solutions such as virtual reality simulators and serious games can be useful assets for training minimally invasive surgery technical skills. However, their high cost and lack of realism/fidelity can sometimes be a drawback for their incorporation in training facilities. In this sense, the hardware interface plays an important role as the physical connection between the learner and the virtual world. The EVA Tracking System provides computer vision-based information about the position and orientation of the instruments in an inexpensive and unobtrusive manner, but lacks information about the aperture state of the clamps, which limits the system's functionalities. This article presents a new solution for detecting the instrument's aperture state using artificial vision and machine learning techniques. To achieve this goal, videos in a laparoscopic training box are recorded to obtain a data set. In each frame, the instrument clamp is segmented within a region of interest by means of color markers. The classifier is modeled using an Artificial Neural Network. The trained prediction model achieves an accuracy of 94% on the validation dataset and an error of 6% on independent evaluation video sequences. Results show that the model provides a competent solution for detecting the clamp's aperture state. Future work will address the integration of the model into the EVA and a virtual environment, the KTS serious game.
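The per-frame pipeline described in the abstract (color-marker segmentation of the clamp region, followed by an Artificial Neural Network that classifies the aperture state) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the color thresholds, the shape features, and the network weights are hypothetical placeholders.

```python
import numpy as np

def segment_marker(hsv_frame, hue_range=(0.9, 1.0), sat_min=0.5):
    """Boolean mask of pixels matching the instrument's color marker.

    hsv_frame: H x W x 3 array with hue, saturation, value in [0, 1].
    The thresholds here are illustrative, not the paper's values.
    """
    hue, sat = hsv_frame[..., 0], hsv_frame[..., 1]
    return (hue >= hue_range[0]) & (hue <= hue_range[1]) & (sat >= sat_min)

def roi_features(mask):
    """Toy shape features of the segmented region: area fraction and
    bounding-box aspect ratio (height / width)."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return np.zeros(2)  # no marker visible in this frame
    area_fraction = xs.size / mask.size
    aspect = (ys.ptp() + 1) / (xs.ptp() + 1)
    return np.array([area_fraction, aspect])

class TinyANN:
    """One-hidden-layer feed-forward network with a sigmoid output,
    standing in for the trained classifier (weights are placeholders)."""
    def __init__(self, w1, b1, w2, b2):
        self.w1, self.b1, self.w2, self.b2 = w1, b1, w2, b2

    def predict_open_proba(self, x):
        hidden = np.tanh(x @ self.w1 + self.b1)
        logit = hidden @ self.w2 + self.b2
        return 1.0 / (1.0 + np.exp(-logit))  # probability clamp is open
```

In a real system the feature vector would be richer (e.g. marker geometry over the full region of interest) and the weights would come from training on the labeled video frames, but the control flow per frame — threshold, extract features, classify — is the one the abstract outlines.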

More Information

Record ID: 49858
DC Identifier: http://oa.upm.es/49858/
OAI Identifier: oai:oa.upm.es:49858
Deposited by: Memoria Investigacion
Deposited on: 24 Mar 2018 10:07
Last Modified: 24 Mar 2018 10:07