Natural User Interfaces for Human-Drone Multi-Modal Interaction

Suarez Fernandez, Ramon; Sanchez Lopez, Jose Luis; Sampedro, Carlos; Bavle, Hriday; Molina, Martin and Campoy Cervera, Pascual (2016). Natural User Interfaces for Human-Drone Multi-Modal Interaction. In: "2016 International Conference on Unmanned Aircraft Systems (ICUAS)", 7 June 2016, Miami Marriott Biscayne Bay, Miami, FL, USA.

Description

Title: Natural User Interfaces for Human-Drone Multi-Modal Interaction
Author(s):
  • Suarez Fernandez, Ramon
  • Sanchez Lopez, Jose Luis
  • Sampedro, Carlos
  • Bavle, Hriday
  • Molina, Martin
  • Campoy Cervera, Pascual
Document Type: Conference or Workshop Paper (Article)
Event Title: 2016 International Conference on Unmanned Aircraft Systems (ICUAS)
Event Dates: 7 June 2016
Event Location: Miami Marriott Biscayne Bay, Miami, FL, USA
Book Title: Proceedings of 2016 International Conference on Unmanned Aircraft Systems (ICUAS)
Journal/Publication Title: Proceedings of 2016 International Conference on Unmanned Aircraft Systems (ICUAS)
Date: 7 June 2016
Subjects:
School: E.T.S. de Ingenieros Informáticos (UPM)
Department: Inteligencia Artificial
Creative Commons Licenses: None

Full Text

PDF (965 kB)

Abstract

Personal drones are becoming part of everyday life. To fully integrate them into society, it is crucial to design safe and intuitive ways to interact with these aerial systems. Recent advances in User-Centered Design (UCD) applied to Natural User Interfaces (NUIs) aim to exploit innate human abilities, such as speech, gestures and vision, to interact with technology the way humans would with one another. In this paper, a Graphical User Interface (GUI) and several NUI methods are studied and implemented, along with computer vision techniques, in a single software framework for aerial robotics called Aerostack, which allows for intuitive and natural human-quadrotor interaction in indoor GPS-denied environments. These strategies include speech, body position, hand gesture and visual marker interactions used to directly command tasks to the drone. The NUIs presented are based on devices such as the Leap Motion Controller, microphones and small-size monocular on-board cameras, which are unobtrusive to the user. Thanks to this UCD perspective, users can choose the most intuitive and effective type of interaction for their application. Additionally, the proposed strategies allow for multi-modal interaction between multiple users and the drone, since several of these interfaces can be integrated in one single application, as shown in various real flight experiments performed with non-expert users.
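The integration of several interfaces into one application, as described in the abstract, can be sketched as a simple dispatcher that merges recognized commands from independent modality front ends into a single stream of drone actions. This is a minimal illustrative sketch, not Aerostack's actual API; the `Command` and `MultiModalDispatcher` names are hypothetical.

```python
# Hypothetical sketch of multi-modal command fusion: each NUI front end
# (speech recognizer, hand-gesture handler, visual-marker detector)
# publishes recognized commands into one shared queue, and a single
# dispatcher drains the queue and forwards actions to the drone.
from dataclasses import dataclass
from queue import Queue, Empty


@dataclass
class Command:
    modality: str   # e.g. "speech", "hand_gesture", "visual_marker"
    action: str     # e.g. "take_off", "land", "follow"


class MultiModalDispatcher:
    def __init__(self) -> None:
        self.queue: "Queue[Command]" = Queue()

    def publish(self, cmd: Command) -> None:
        # Called by any modality front end when it recognizes a command.
        self.queue.put(cmd)

    def dispatch_pending(self) -> list:
        # Drain all pending commands, in arrival order, regardless of
        # which modality produced them.
        executed = []
        while True:
            try:
                cmd = self.queue.get_nowait()
            except Empty:
                break
            executed.append(f"{cmd.action} (via {cmd.modality})")
        return executed


dispatcher = MultiModalDispatcher()
dispatcher.publish(Command("speech", "take_off"))
dispatcher.publish(Command("hand_gesture", "follow"))
print(dispatcher.dispatch_pending())
# → ['take_off (via speech)', 'follow (via hand_gesture)']
```

Because every modality feeds the same queue, multiple users can issue commands through different devices simultaneously, which mirrors the multi-user, multi-modal scenario the paper evaluates.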

Associated Projects

Type: Gobierno de España
Code: DPI2014-60139-R
Acronym: VA4UAV
Principal Investigator: Pascual Campoy Cervera
Title: Visual autonomy for UAV in Dynamic Environments

More Information

Record ID: 43928
DC Identifier: http://oa.upm.es/43928/
OAI Identifier: oai:oa.upm.es:43928
Deposited by: Martin Molina
Deposited on: 21 Nov 2016 07:02
Last Modified: 21 Nov 2016 07:02