Homography-based ground plane detection using a single on-board camera
Arróspide Laborda, Jon; Salgado Álvarez de Sotomayor, Luis; Nieto Doncel, Marcos; Mohedano del Pozo, Raúl. "Homography-based ground plane detection using a single on-board camera". IET Intelligent Transport Systems, v. 4.
This study presents a robust method for ground plane detection in vision-based systems with a non-stationary camera. The proposed method is based on the reliable estimation of the homography between ground planes in successive images. This homography is computed using a feature matching approach which, in contrast to classical approaches to on-board motion estimation, does not require explicit ego-motion calculation. Instead, a novel homography calculation method based on a linear estimation framework is presented. This framework provides predictions of the ground plane transformation matrix that are dynamically updated with new measurements. The method is especially suited for challenging environments, in particular traffic scenarios, in which information is scarce and the homography computed from the images is often inaccurate or erroneous. The proposed estimation framework is able to remove erroneous measurements and to correct inaccurate ones, hence producing a reliable homography estimate at each instant. It is based on the evaluation of the difference between the predicted and the observed transformations, measured according to the spectral norm of the associated matrix of differences. Moreover, an example is provided of how to use the information extracted from ground plane estimation to achieve object detection and tracking. The method has been successfully demonstrated for the detection of moving vehicles in traffic environments.
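The gating step described in the abstract — comparing a predicted and an observed ground plane homography via the spectral norm of their difference — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the threshold `tau`, the scale normalization, and the example matrices are assumptions chosen for demonstration.

```python
import numpy as np

def spectral_gate(H_pred, H_obs, tau=1.0):
    """Accept the observed homography only if its spectral-norm
    distance from the predicted one is below tau (hypothetical threshold)."""
    # Homographies are defined up to scale; normalize so H[2, 2] == 1
    # to make the matrix comparison meaningful.
    H_pred = H_pred / H_pred[2, 2]
    H_obs = H_obs / H_obs[2, 2]
    # Spectral norm = largest singular value of the difference matrix.
    dist = np.linalg.norm(H_pred - H_obs, ord=2)
    return dist <= tau, dist

# Prediction from the linear estimation framework (here simply the
# previous estimate, standing in for the dynamically updated prediction).
H_pred = np.eye(3)

# A plausible inter-frame ground plane transformation: small rotation,
# scale and translation, close to the prediction.
H_good = np.array([[1.01, 0.00,  0.50],
                   [0.00, 0.99, -0.30],
                   [0.00, 0.00,  1.00]])

# A grossly erroneous measurement, e.g. from mismatched features.
H_bad = np.array([[ 1.50, 0.40, 20.00],
                  [-0.30, 0.70,  5.00],
                  [ 0.00, 0.00,  1.00]])

ok_good, d_good = spectral_gate(H_pred, H_good)
ok_bad, d_bad = spectral_gate(H_pred, H_bad)
print(ok_good, ok_bad)  # the consistent measurement passes, the outlier is rejected
```

In a full pipeline the accepted measurement would then be fused with the prediction to update the ground plane estimate, while rejected measurements leave the prediction in place.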