Enhanced nighttime vehicle detection for on-board processing

Encío, Leyre ORCID: https://orcid.org/0000-0002-5534-3487, Fuertes Coiras, Daniel ORCID: https://orcid.org/0000-0002-5746-2199, Blanco Adán, Carlos Roberto del ORCID: https://orcid.org/0000-0003-0618-3488, Aguilar, Iu, Pérez Benito, Cristina ORCID: https://orcid.org/0000-0002-8470-7972, Jevtić, Aleksandar ORCID: https://orcid.org/0000-0003-4229-2606, Jaureguizar Núñez, Fernando ORCID: https://orcid.org/0000-0001-6449-5151 and García Santos, Narciso ORCID: https://orcid.org/0000-0002-0397-894X (2025). Enhanced nighttime vehicle detection for on-board processing. "IEEE Access", v. 13 ; pp. 44817-44835. ISSN 2169-3536. https://doi.org/10.1109/access.2025.3548837.

Description

Title: Enhanced nighttime vehicle detection for on-board processing
Author(s):
Document Type: Article
Journal/Publication Title: IEEE Access
Date: 1 January 2025
ISSN: 2169-3536
Volume: 13
Subjects:
SDGs:
Informal Keywords: Accuracy; cameras; computer vision; convolutional neural networks; deep learning; detectors; feature extraction; lighting; network; nighttime detection; perception; radar; shape; systems; training; urban areas; vehicle detection; yolo
School: E.T.S.I. Telecomunicación (UPM)
Department: Other
Creative Commons License: Attribution

Full text

PDF (10344230.pdf) - Download (6MB)

Abstract

Nighttime vehicle detection poses significant challenges, particularly in scenarios with limited lighting, where visibility is often compromised. To address this problem, this paper proposes a novel nighttime vehicle detection system that dynamically adapts to extreme lighting conditions, ranging from bright daytime scenarios to challenging nighttime conditions where the vehicle's appearance may be entirely lost. For this purpose, a multi-granularity detection approach is adopted, automatically combining bounding-box and point-based representations depending on the vehicle's visibility. Bounding-box detections, reporting location and size information, are selected when the vehicle appearance is mostly visible, such as in daytime or urban nighttime scenarios with sufficient artificial street illumination. Point-based detections, indicating only location information, are used when the vehicle's appearance is not discernible, such as in rural nighttime scenarios with little or no street illumination. The system is designed as a multi-head neural network built on a shared Hourglass backbone that accepts bounding-box and point-based annotations for training and can automatically predict, depending on the scenario, vehicle bounding boxes or point-based predictions. Extensive evaluations on a combined dataset of BDD100K and PVDN demonstrate that the proposed system achieves higher detection accuracy and robustness compared to existing methods, with mean Average Precision (mAP) scores of 0.7134 on BDD100K, 0.6621 on PVDN, and 0.6814 on the combined dataset. Additionally, a self-acquired dataset, FNTVD, further enhances the evaluation by providing real-world driving conditions. The system also achieves real-time performance at 45.45 FPS, making it suitable for practical applications.
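The abstract describes a multi-granularity detector that emits full bounding boxes when a vehicle's appearance is visible and point-only predictions when it is not. The sketch below only illustrates that selection logic; it is not the authors' implementation. The real system uses a learned Hourglass backbone and trained heads, whereas here the backbone and heads are stand-in random projections, and the `visibility_score` input and all tensor shapes are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def shared_backbone(image):
    # Stand-in for the shared Hourglass backbone: a fixed random
    # projection from pixels to a 16-d feature vector.
    w = rng.standard_normal((image.size, 16))
    return image.reshape(-1) @ w

def bbox_head(features):
    # Bounding-box head: predicts (cx, cy, w, h) -- location and size.
    w = rng.standard_normal((features.size, 4))
    return features @ w

def point_head(features):
    # Point head: predicts only a location (cx, cy).
    w = rng.standard_normal((features.size, 2))
    return features @ w

def detect(image, visibility_score, threshold=0.5):
    """Multi-granularity selection: report a full bounding box when the
    vehicle's appearance is discernible, a point prediction otherwise."""
    feats = shared_backbone(image)
    if visibility_score >= threshold:
        return ("bbox", bbox_head(feats))
    return ("point", point_head(feats))

# Well-lit frame -> bounding-box output; dark rural frame -> point output.
day = detect(rng.random((8, 8)), visibility_score=0.9)
night = detect(rng.random((8, 8)), visibility_score=0.1)
print(day[0], night[0])  # bbox point
```

In the paper the choice between representations is made automatically by the network per detection, not by an external threshold as in this simplified sketch.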

Associated projects

Type | Code | Acronym | Principal Investigator | Title
Government of Spain | PTAS-20211011 | Not specified | Not specified | Percepción Inteligente para los Vehículos Autónomos y Conectados
Government of Spain | PID2020-115132RB | SARAOS | Not specified | ShAred Reality for Advanced sOcial communicationS
Government of Spain | PID2023-148922OA-I00 | Not specified | Not specified | Enriched and Ecologically sustainable VOlumetric communiCATIONS

More information

Record ID: 88901
DC Identifier: https://oa.upm.es/88901/
OAI Identifier: oai:oa.upm.es:88901
Scientific Portal URL: https://portalcientifico.upm.es/es/ipublic/item/10344230
DOI: 10.1109/access.2025.3548837
Official URL: https://ieeexplore.ieee.org/document/10915593
Deposited by: iMarina Portal Científico
Deposited on: 30 Apr 2025 13:23
Last Modified: 30 Apr 2025 13:23