Full text: PDF (965 kB). A PDF viewer such as GSview, Xpdf, or Adobe Acrobat Reader is required.
ORCID: https://orcid.org/0000-0003-4102-5899; Sánchez López, José Luis (ORCID: https://orcid.org/0000-0001-5018-0925); Sampedro Pérez, Carlos (ORCID: https://orcid.org/0000-0003-2414-2284); Bavle, Hriday; Molina González, Martín (ORCID: https://orcid.org/0000-0001-7145-1974) and Campoy Cervera, Pascual (ORCID: https://orcid.org/0000-0002-9894-2009) (2016). Natural User Interfaces for Human-Drone Multi-Modal Interaction. In: "2016 International Conference on Unmanned Aircraft Systems (ICUAS)", 7 June 2016, Miami Marriott Biscayne Bay, Miami, FL, USA.
| Title: | Natural User Interfaces for Human-Drone Multi-Modal Interaction |
|---|---|
| Document Type: | Conference or Workshop Paper (Article) |
| Event Title: | 2016 International Conference on Unmanned Aircraft Systems (ICUAS) |
| Event Dates: | 7 June 2016 |
| Event Location: | Miami Marriott Biscayne Bay, Miami, FL, USA |
| Book Title: | Proceedings of 2016 International Conference on Unmanned Aircraft Systems (ICUAS) |
| Journal/Publication Title: | Proceedings of 2016 International Conference on Unmanned Aircraft Systems (ICUAS) |
| Date: | 7 June 2016 |
| School: | E.T.S. de Ingenieros Informáticos (UPM) |
| Department: | Inteligencia Artificial |
| Creative Commons License: | None |
Personal drones are becoming part of everyday life. To fully integrate them into society, it is crucial to design safe and intuitive ways to interact with these aerial systems. Recent advances in User-Centered Design (UCD) applied to Natural User Interfaces (NUIs) aim to exploit innate human abilities, such as speech, gestures and vision, so that people interact with technology the way they would with one another. In this paper, a Graphical User Interface (GUI) and several NUI methods are studied and implemented, along with computer vision techniques, in a single software framework for aerial robotics called Aerostack, which allows intuitive and natural human-quadrotor interaction in indoor GPS-denied environments. These strategies include speech, body position, hand gesture and visual marker interactions used to directly command tasks to the drone. The NUIs presented are based on devices such as the Leap Motion Controller, microphones and small monocular on-board cameras, which are unobtrusive to the user. Thanks to this UCD perspective, users can choose the most intuitive and effective type of interaction for their application. Additionally, the proposed strategies allow multi-modal interaction between multiple users and the drone by integrating several of these interfaces in one single application, as shown in various real flight experiments performed with non-expert users.
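The multi-modal integration the abstract describes — several interfaces (speech, gestures, visual markers) commanding tasks on one drone — can be sketched with a shared dispatch table that normalizes each modality's recognition result into a common task command. This is purely an illustration, not Aerostack's actual API; every class, token and task name below is hypothetical:

```python
from dataclasses import dataclass
from typing import Dict, Optional, Tuple


@dataclass
class TaskCommand:
    name: str    # e.g. "take_off", "land" (hypothetical task names)
    source: str  # which modality issued the command


class MultiModalDispatcher:
    """Maps (modality, recognized token) pairs to drone task commands,
    so several natural interfaces can drive a single mission layer."""

    def __init__(self) -> None:
        self._table: Dict[Tuple[str, str], str] = {}

    def register(self, modality: str, token: str, task: str) -> None:
        self._table[(modality, token)] = task

    def dispatch(self, modality: str, token: str) -> Optional[TaskCommand]:
        task = self._table.get((modality, token))
        return TaskCommand(task, modality) if task else None


# Wire the same task to three different natural interfaces.
d = MultiModalDispatcher()
d.register("speech", "take off", "take_off")
d.register("gesture", "palm_up", "take_off")
d.register("marker", "aruco_7", "take_off")

cmd = d.dispatch("gesture", "palm_up")
```

Under this sketch, adding a new modality only means registering its tokens; the task layer never needs to know which interface produced a command, which is the property that lets multiple users interact through different devices at once.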
| Record ID: | 43928 |
|---|---|
| DC Identifier: | https://oa.upm.es/43928/ |
| OAI Identifier: | oai:oa.upm.es:43928 |
| Deposited by: | Martin Molina |
| Deposited on: | 21 Nov 2016 07:02 |
| Last Modified: | 08 Jul 2025 08:53 |