Alvarado Vasquez, Biel-Piero, González Martín, Rubén, Matía Espada, Fernando and Puente Yusty, Paloma de la (2018). Sensor Fusion for Tour-Guide Robot Localization. "IEEE Access", v. 6; pp. 78947-78964. ISSN 2169-3536. https://doi.org/10.1109/ACCESS.2018.2885648.
Title: | Sensor Fusion for Tour-Guide Robot Localization |
---|---|
Author/s: | Alvarado Vasquez, Biel-Piero; González Martín, Rubén; Matía Espada, Fernando; Puente Yusty, Paloma de la |
Item Type: | Article |
Journal/Publication Title: | IEEE Access |
Date: | 2018 |
ISSN: | 2169-3536 |
Volume: | 6 |
Subjects: | |
Freetext Keywords: | State estimation; extended Kalman filter; indoor localization; sensor fusion; laser localization; visual localization; RFID localization; social robotics |
Faculty: | E.T.S.I. Industriales (UPM) |
Department: | Automática, Ingeniería Eléctrica y Electrónica e Informática Industrial |
Creative Commons Licenses: | Attribution - NonCommercial - NoDerivatives |
Doris, a girl-like social robot, is being developed to serve as a tour guide in museums and trade fairs. External sensory information must be provided so that Doris can move around each new location, using landmark identification points that, combined with an extended Kalman filter, improve the robot's localization. Doris is equipped with a semantic map containing information such as the building structure, the sites the robot must pass, features (obstacles) of the built environment, and landmark locations. Three additional sensors were installed on Doris: an LMS-200 laser range finder, an omnidirectional Mobotix C25 camera, and an Impinj Speedway Revolution 220 RFID system. These sensors require different types of landmarks: 35-cm-high circular landmarks placed on the ground and covered with a reflective, laser-detectable material; markers similar to QR codes, placed 250 cm above ground level, that the omnidirectional camera can identify; and RFID-detectable dogbone tag antennas. One contribution is to demonstrate a simple localization methodology based on sensor fusion with a semantic map, without mapping the whole environment as a point cloud and without resorting to SLAM. Another contribution is a methodology for precise sensor calibration. The initial results show that each sensor functions efficiently when only the laser and the camera are used, given the low accuracy of the RFID system on its own. The final results show the behavior of the robot localization with both sensors operating simultaneously, in the presence of people and various objects. Because occlusions may affect the reflective landmarks or the visual markers, sensor fusion is implemented to achieve a more robust location estimate.
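The abstract describes the fusion scheme only at a high level: odometry drives an extended Kalman filter prediction, and landmark observations from the laser, camera, or RFID reader correct the estimated pose, with each sensor weighted by its own measurement noise. The sketch below is not taken from the paper; it is a minimal illustration of that standard EKF localization pattern, assuming a unicycle motion model, range-bearing measurements of landmarks whose positions come from the semantic map, and hypothetical function names.

```python
import numpy as np

def normalize_angle(a):
    """Wrap an angle to [-pi, pi)."""
    return (a + np.pi) % (2 * np.pi) - np.pi

def ekf_predict(x, P, u, Q, dt):
    """Propagate the pose (x, y, theta) with a unicycle odometry model.
    u = (v, w): linear and angular velocity; Q: process noise covariance."""
    v, w = u
    theta = x[2]
    # Motion model
    x_pred = x + np.array([v * np.cos(theta) * dt,
                           v * np.sin(theta) * dt,
                           w * dt])
    x_pred[2] = normalize_angle(x_pred[2])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                  [0.0, 1.0,  v * np.cos(theta) * dt],
                  [0.0, 0.0,  1.0]])
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_update_landmark(x, P, z, landmark, R):
    """Correct the pose with a range-bearing observation z = (r, phi) of a
    landmark whose map position (lx, ly) is taken from the semantic map.
    R is the measurement noise covariance of the sensor that saw it."""
    lx, ly = landmark
    dx, dy = lx - x[0], ly - x[1]
    q = dx**2 + dy**2
    # Expected measurement given the predicted pose
    z_hat = np.array([np.sqrt(q),
                      normalize_angle(np.arctan2(dy, dx) - x[2])])
    # Jacobian of the measurement model
    H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q),  0.0],
                  [ dy / q,          -dx / q,          -1.0]])
    y = z - z_hat
    y[1] = normalize_angle(y[1])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ y
    x_new[2] = normalize_angle(x_new[2])
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new
```

Under these assumptions, fusing several sensors amounts to calling `ekf_update_landmark` once per detected landmark in each cycle, with a measurement covariance `R` chosen per sensor (for example, tighter for the laser than for the RFID system), which is what makes the estimate robust when one landmark type is occluded.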
Item ID: | 54515 |
---|---|
DC Identifier: | https://oa.upm.es/54515/ |
OAI Identifier: | oai:oa.upm.es:54515 |
DOI: | 10.1109/ACCESS.2018.2885648 |
Official URL: | https://ieeexplore.ieee.org/document/8573764 |
Deposited by: | Memoria Investigacion |
Deposited on: | 03 Apr 2019 14:21 |
Last Modified: | 30 Nov 2022 09:00 |