Stereo-Camera–LiDAR Calibration for Autonomous Driving

Sähkötekniikan korkeakoulu | Master's thesis
Date
2021-08-23
Major/Subject
ICT Innovation - EIT Digital Master School, Autonomous Systems (AUS)
Mcode
ELEC3055
Degree programme
Master's Programme in ICT Innovation
Language
en
Pages
62+7
Abstract
Perception is one of the key factors in successful self-driving. According to recent studies on perception systems, 3D range scanners combined with stereo camera vision are the most widely used sensors in autonomous vehicles. To enable accurate perception, the sensors must be calibrated before their data can be fused. Calibration minimizes measurement errors caused by the non-idealities of individual sensors and errors in the transformations between sensor frames. This thesis presents camera-LiDAR calibration, synchronisation, and data fusion techniques. It can be argued that the quality of the data matters more to the calibration than the choice of optimization algorithm; one challenge addressed in this thesis is therefore accurate data collection with different calibration targets, together with result validation using different optimization algorithms. We estimate the effect of the vehicle windshield on camera calibration and show that the error caused by the windshield can be reduced by using more complex distortion models than the standard model. Synchronisation is required to ensure that the sensors provide measurements at the same time. The sensor data used in this thesis was synchronised using an external trigger signal from a GNSS receiver. The camera-LiDAR extrinsic calibration was performed using synchronised 3D-2D (LiDAR points and camera pixels) and 3D-3D (LiDAR points and stereo-camera points) point correspondences. The comparison demonstrates that using 3D-2D point correspondences yields the best estimate of the camera-LiDAR extrinsic parameters. Moreover, a comparison between camera-based and LiDAR 3D reconstruction is presented. Because the sensors have different viewpoints, some data points are occluded; we therefore propose a camera-LiDAR occlusion-handling algorithm to remove the occluded points. The quality of the calibration is demonstrated visually by fusing and aligning the LiDAR point cloud with the image.
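
To illustrate the two extrinsic-calibration variants compared above, the sketch below estimates the LiDAR-to-camera rigid transform from 3D-2D correspondences with OpenCV's PnP solver, and from 3D-3D correspondences with a closed-form Kabsch/Umeyama fit. This is a minimal sketch under assumed inputs, not the thesis implementation; the function names and the choice of cv2.solvePnP and a plain least-squares rigid fit are illustrative assumptions.

    # Minimal sketch (not the thesis code; assumes NumPy + OpenCV): estimate the
    # LiDAR-to-camera extrinsics from synchronised point correspondences.
    import numpy as np
    import cv2

    def extrinsics_from_3d2d(lidar_pts, pixels, K, dist_coeffs):
        # lidar_pts: (N, 3) points in the LiDAR frame; pixels: (N, 2) matching
        # image pixels; K: 3x3 camera intrinsics; dist_coeffs: lens distortion.
        ok, rvec, tvec = cv2.solvePnP(
            lidar_pts.astype(np.float64), pixels.astype(np.float64),
            K, dist_coeffs, flags=cv2.SOLVEPNP_ITERATIVE)
        if not ok:
            raise RuntimeError("PnP did not converge")
        R, _ = cv2.Rodrigues(rvec)      # rotation vector -> 3x3 rotation matrix
        return R, tvec.ravel()          # x_cam = R @ x_lidar + t

    def extrinsics_from_3d3d(lidar_pts, stereo_pts):
        # Closed-form least-squares rigid transform (Kabsch/Umeyama) between
        # LiDAR points and the matching stereo-triangulated 3D points.
        mu_l, mu_s = lidar_pts.mean(axis=0), stereo_pts.mean(axis=0)
        H = (lidar_pts - mu_l).T @ (stereo_pts - mu_s)   # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflection
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = mu_s - R @ mu_l
        return R, t                     # x_stereo = R @ x_lidar + t

Given such estimates, reprojecting the LiDAR points into the image (e.g. with cv2.projectPoints) and inspecting the pixel residuals offers a quick check of which variant aligns the point cloud and the image better, in the spirit of the visual validation described in the abstract.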
Supervisor
Särkkä, Simo
Thesis advisor
Manninen, Petri
Hyyti, Heikki
Keywords
camera, LiDAR, calibration, synchronisation, stereo vision, 3D reconstruction