eprintid: 49858
rev_number: 19
eprint_status: archive
userid: 1903
dir: disk0/00/04/98/58
datestamp: 2018-03-24 10:07:28
lastmod: 2018-03-24 10:07:28
status_changed: 2018-03-24 10:07:28
type: conference_item
metadata_visibility: show
creators_name: Martín Vicario, Celia
creators_name: Oropesa García, Ignacio
creators_name: Sánchez Margallo, Juan Antonio
creators_name: Sánchez Margallo, Francisco Miguel
creators_name: Gómez Aguilera, Enrique J.
creators_name: Sánchez González, Patricia
creators_id: ioropesa@gbt.tfo.upm.es
creators_id: egomez@gbt.tfo.upm.es
creators_id: psanchez@gbt.tfo.upm.es
title: Automatic detection of surgical instruments' state in laparoscopic video images using neural networks
publisher: Universidad del País Vasco
rights: by-nc-nd
ispublished: pub
subjects: electronica
subjects: medicina
subjects: telecomunicaciones
full_text_status: public
pres_type: paper
abstract: Software-based solutions such as virtual reality simulators and serious games can be useful assets for training technical skills in minimally invasive surgery. However, their high cost and lack of realism/fidelity can sometimes be a drawback for their incorporation in training facilities. In this sense, the hardware interface plays an important role as the physical connection between the learner and the virtual world. The EVA Tracking System provides computer vision-based information about the position and orientation of the instruments in an inexpensive and unobtrusive manner, but lacks information about the aperture state of the clamps, which limits the system's functionalities. This article presents a new solution for detecting the instrument's aperture state using artificial vision and machine learning techniques. To achieve this goal, videos are recorded in a laparoscopic training box to obtain a dataset. In each frame, the instrument clamp is segmented within a region of interest by means of color markers. The classifier is modeled using an Artificial Neural Network. The trained prediction model achieves 94% accuracy on the validation dataset and a 6% error on independent evaluation video sequences. Results show that the model provides a competent solution for clamp aperture state detection. Future work will address the integration of the model into EVA and into a virtual environment, the KTS serious game.
date_type: published
date: 2017-12
place_of_pub: Bilbao
pagerange: 293-296
event_title: XXXV Congreso de la Sociedad Española de Ingeniería Biomédica (CASEIB 2017)
event_location: Bilbao, España
event_dates: 29/11/2017 - 01/12/2017
event_type: conference
institution: Telecomunicacion
department: Tecnologia_fotonica_2014
refereed: TRUE
isbn: 978-84-9082-797-0
book_title: Libro de Actas del XXXV Congreso de la Sociedad Española de Ingeniería Biomédica (CASEIB 2017)
citation: Martín Vicario, Celia and Oropesa García, Ignacio and Sánchez Margallo, Juan Antonio and Sánchez Margallo, Francisco Miguel and Gómez Aguilera, Enrique J. and Sánchez González, Patricia (2017). Automatic detection of surgical instruments' state in laparoscopic video images using neural networks. In: "XXXV Congreso de la Sociedad Española de Ingeniería Biomédica (CASEIB 2017)", 29/11/2017 - 01/12/2017, Bilbao, España. ISBN 978-84-9082-797-0. pp. 293-296.
document_url: https://oa.upm.es/49858/1/INVE_MEM_2017_270648.pdf
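
note: The abstract describes a pipeline of color-marker segmentation followed by an Artificial Neural Network classifier for clamp aperture state. The following is a minimal sketch of that kind of pipeline, not the paper's implementation: the marker color range, ROI feature representation, file names, and network size are illustrative assumptions, as the record does not specify them.

```python
# Minimal sketch: color-marker ROI segmentation + small neural-network classifier.
# Assumptions (not from the paper): green markers, 16x16 binary-ROI features,
# placeholder feature/label files, and a single 32-unit hidden layer.
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def clamp_roi_features(frame_bgr):
    """Segment the color-marked clamp region and return a fixed-length feature vector."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Hypothetical HSV range for the marker color.
    mask = cv2.inRange(hsv, (40, 60, 60), (80, 255, 255))
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None  # marker not visible in this frame
    roi = mask[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    # Resize the binary ROI to a fixed grid so every frame yields the same-length vector.
    return cv2.resize(roi, (16, 16)).flatten() / 255.0

# X: per-frame feature vectors, y: 0 = closed clamp, 1 = open clamp (placeholder files).
X = np.load("features.npy")
y = np.load("labels.npy")
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("validation accuracy:", clf.score(X_val, y_val))
```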