Mixed-reality for quadruped-robotic guidance in SAR tasks

Cruz Ulloa, Christyan ORCID: https://orcid.org/0000-0003-2824-6611, Cerro Giner, Jaime del ORCID: https://orcid.org/0000-0003-4893-2571 and Barrientos Cruz, Antonio ORCID: https://orcid.org/0000-0003-1691-3907 (2023). Mixed-reality for quadruped-robotic guidance in SAR tasks. "Journal of Computational Design and Engineering", v. 10 (n. 4); pp. 1479-1489. ISSN 2288-5048. https://doi.org/10.1093/jcde/qwad061.

Description

Title: Mixed-reality for quadruped-robotic guidance in SAR tasks
Author(s):
Document Type: Article
Journal/Publication Title: Journal of Computational Design and Engineering
Date: 26 June 2023
ISSN: 2288-5048
Volume: 10
Issue: 4
Subjects:
SDGs:
Informal Keywords: robotics vision, quadruped robot, mixed reality, HoloLens, search and rescue
School: E.T.S.I. Industriales (UPM)
Department: Automática, Ingeniería Eléctrica y Electrónica e Informática Industrial
UPM Research Group: Robótica y Cibernética (RobCib)
Creative Commons License: Attribution - NonCommercial - NoDerivatives

Full text

92639.pdf - PDF, 2 MB (download available)

Abstract

In recent years, exploration of disaster environments, victim localization, and primary assistance have been the main focuses of Search and Rescue (SAR) robotics. New developments in Mixed Reality (MR) and legged robotics have enabled a major step toward robust field applications. This article presents MR-RAS (Mixed-Reality for Robotic Assistance), which aims to assist rescuers and protect their integrity when exploring post-disaster areas (against collapse, electrical, and toxic risks) by enabling gesture-based guidance of the robot and letting rescuers manage visual information of interest from the environment. To validate this proof of concept, the ARTU-R (A1 Rescue Tasks UPM Robot) quadruped robot has been equipped with a sensory system (lidar, thermal, and RGB-D cameras). Human-robot interaction is carried out through HoloLens glasses. The main contribution of this work is the implementation and evaluation of a Mixed-Reality system based on a ROS-Unity solution, capable of high-level guidance of a complex legged robot through different zones of interest (defined by a neural network and a vision system) in a post-disaster environment (PDE). The robot's main tasks at each visited point are detecting victims through thermal and RGB imaging with neural networks and assisting victims with medical equipment. Tests have been carried out in scenarios that recreate PDE conditions (debris, simulated victims, etc.). Using the immersive interface yielded an average efficiency improvement of 48% and a time optimization of 21.4% compared with conventional interfaces. The proposed method has been shown to improve rescuers' immersive experience when controlling a complex robotic system.
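The record does not include the paper's implementation, but ROS-Unity integrations of the kind the abstract describes commonly exchange JSON messages over the rosbridge protocol. As an illustrative sketch only (the topic name, frame, and message layout below are assumptions, not details taken from the paper), a HoloLens/Unity client could encode a guidance waypoint as a rosbridge `publish` operation carrying a `geometry_msgs/PoseStamped`:

```python
import json

def make_goal_message(x, y, frame="map", topic="/move_base_simple/goal"):
    """Build a rosbridge-protocol 'publish' message carrying a
    geometry_msgs/PoseStamped waypoint (hypothetical topic and frame).
    A Unity client would send this JSON string over the rosbridge
    WebSocket to steer the robot toward a zone of interest."""
    return json.dumps({
        "op": "publish",          # rosbridge operation type
        "topic": topic,           # target ROS topic (assumed name)
        "msg": {
            "header": {"frame_id": frame},
            "pose": {
                "position": {"x": x, "y": y, "z": 0.0},
                # identity quaternion: no commanded heading change
                "orientation": {"x": 0.0, "y": 0.0, "z": 0.0, "w": 1.0},
            },
        },
    })

if __name__ == "__main__":
    print(make_goal_message(2.5, -1.0))
```

On the robot side, a rosbridge server would decode this JSON and republish it as a native ROS message for the navigation stack; the gesture-recognition layer on the HoloLens would simply map each confirmed gesture to one such waypoint message.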

Associated projects

Type: Gobierno de España (Government of Spain)
Code: PID2019-105808RB-I00
Acronym: TASAR
Principal Investigator: Antonio Barrientos
Title: TASAR: Equipo de Robots para Misiones para Búsqueda y Rescate (Team of Robots for Search and Rescue Missions)

More information

Record ID: 92639
DC Identifier: https://oa.upm.es/92639/
OAI Identifier: oai:oa.upm.es:92639
Scientific Portal URL: https://portalcientifico.upm.es/es/ipublic/item/10091631
DOI: 10.1093/jcde/qwad061
Official URL: https://academic.oup.com/jcde/article/10/4/1479/72...
Deposited by: Antonio Barrientos
Deposited on: 07 Jan 2026 19:34
Last Modified: 07 Jan 2026 19:34