LiDAR-camera calibration plays a crucial role in autonomous driving. However, operation-induced factors such as physical vibrations and temperature variations degrade pre-deployment calibration accuracy, deteriorating environmental perception performance. Recent recalibration methods achieve online calibration without a target board by leveraging the relative attributes of the LiDAR and camera. In this paper, we propose a novel framework for LiDAR-camera online calibration that employs a Transformer network to learn crucial interactions between the camera and LiDAR sensors. In addition, our framework design enables effective calibration by exploiting correspondence-point information between the two sensors, which captures global spatial context and achieves high performance by integrating information across modalities. Experimental results show that our method outperforms state-of-the-art benchmarks.
LiDAR-camera Online Calibration by Representing Local Feature and Global Spatial Context
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2024
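As a rough illustration of the cross-modal fusion idea the abstract describes, the PyTorch sketch below has camera feature tokens attend to LiDAR feature tokens via cross-attention and regresses a 6-DoF extrinsic correction. This is a minimal sketch under assumed interfaces: the module names, token dimensions, and regression head are hypothetical stand-ins, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

class CrossModalCalibrator(nn.Module):
    """Hypothetical sketch: fuses camera and LiDAR tokens with
    cross-attention, then regresses a 6-DoF extrinsic correction
    (3 rotation + 3 translation parameters)."""
    def __init__(self, dim: int = 256, heads: int = 8):
        super().__init__()
        # Camera tokens (queries) attend to LiDAR tokens (keys/values),
        # giving each camera token access to global spatial context.
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)
        self.head = nn.Sequential(
            nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, 6)
        )

    def forward(self, cam_tokens, lidar_tokens):
        # cam_tokens: (B, N_cam, dim); lidar_tokens: (B, N_lidar, dim)
        fused, _ = self.cross_attn(cam_tokens, lidar_tokens, lidar_tokens)
        fused = self.norm(fused + cam_tokens)   # residual connection
        pooled = fused.mean(dim=1)              # global average pool
        return self.head(pooled)                # (B, 6) pose correction

# Usage with random tensors standing in for real encoder outputs.
model = CrossModalCalibrator()
cam = torch.randn(2, 196, 256)    # e.g. 14x14 image patch tokens
lidar = torch.randn(2, 512, 256)  # e.g. projected point-cloud tokens
delta_pose = model(cam, lidar)    # predicted extrinsic correction
print(delta_pose.shape)           # torch.Size([2, 6])
```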