Most existing LiDAR-inertial navigation systems are based on frame-to-map registration, which leads to inconsistency in state estimation. The newest solid-state LiDARs with a non-repetitive scanning pattern make it possible to achieve a consistent LiDAR-inertial estimator by employing frame-to-frame data association. Hence, we propose a consistent frame-to-frame LiDAR-inertial navigation system (FF-LINS) for solid-state LiDARs. With INS-centric LiDAR frame processing, the keyframe point-cloud map is built from the accumulated point clouds to construct the frame-to-frame data association. The LiDAR frame-to-frame measurements and the inertial measurement unit (IMU) preintegration measurements are tightly integrated using factor graph optimization, with online calibration of the LiDAR-IMU extrinsic and time-delay parameters. Experiments on public and private datasets demonstrate that the proposed FF-LINS achieves superior accuracy and robustness compared with state-of-the-art systems. Moreover, the LiDAR-IMU extrinsic and time-delay parameters are estimated effectively, and the online calibration notably improves the pose accuracy.
Authors: Hailiang Tang, Xiaoji Niu, and Tisheng Zhang from the Integrated and Intelligent Navigation (i2Nav) Group, Wuhan University.
Related Papers:
- Hailiang Tang, Tisheng Zhang, Xiaoji Niu, Liqiang Wang, Linfu Wei, and Jingnan Liu, “FF-LINS: A Consistent Frame-to-Frame Solid-State-LiDAR-Inertial State Estimator,” IEEE Robotics and Automation Letters, 2023.
- Hailiang Tang, Tisheng Zhang, Xiaoji Niu, Liqiang Wang, and Jingnan Liu, "LE-VINS: A Robust Solid-State-LiDAR-Enhanced Visual-Inertial Navigation System for Low-Speed Robots," IEEE Transactions on Instrumentation and Measurement, 2023.
- Xiaoji Niu, Hailiang Tang, Tisheng Zhang, Jing Fan, and Jingnan Liu, “IC-GVINS: A Robust, Real-time, INS-Centric GNSS-Visual-Inertial Navigation System,” IEEE Robotics and Automation Letters, 2023.
- Hailiang Tang, Tisheng Zhang, Xiaoji Niu, Jing Fan, and Jingnan Liu, “Impact of the Earth Rotation Compensation on MEMS-IMU Preintegration of Factor Graph Optimization,” IEEE Sensors Journal, 2022.
Contacts:
- For any technical issues, please send an email to Dr. Hailiang Tang ([email protected]).
- For Chinese users, we also provide a QQ group (481173293) for discussion. You are required to provide your organization and name.
We recommend Ubuntu 18.04 or Ubuntu 20.04 with a recent compiler (gcc>=8.0 or clang>=6.0).
```shell
# gcc-8
sudo apt install gcc-8 g++-8

# Clang
# sudo apt install clang
```
Follow the ROS Melodic installation instructions for Ubuntu 18.04, or the ROS Noetic installation instructions for Ubuntu 20.04.
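For example, ROS Noetic on Ubuntu 20.04 can be installed roughly as follows (a minimal sketch; the official instructions also cover adding the ROS apt key and further environment setup):

```shell
# Add the ROS package source (sketch; see the official wiki for the apt key setup)
sudo sh -c 'echo "deb https://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
sudo apt update
# Ubuntu 20.04 (Noetic); on Ubuntu 18.04 install ros-melodic-desktop-full instead
sudo apt install ros-noetic-desktop-full
# Source the ROS environment in every new shell
echo "source /opt/ros/noetic/setup.bash" >> ~/.bashrc
```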
Threading Building Blocks (TBB) is used for parallel point-cloud processing. We recommend oneTBB; please install the latest released version. Note that oneTBB should be installed before Ceres Solver.
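A typical from-source installation of oneTBB looks roughly like this (a sketch; the CMake options may vary with the oneTBB version):

```shell
# Build and install oneTBB from source (sketch)
git clone https://github.com/oneapi-src/oneTBB.git
cd oneTBB && mkdir build && cd build
cmake -DCMAKE_BUILD_TYPE=Release -DTBB_TEST=OFF ..
make -j8
sudo make install
```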
We use Ceres Solver (>=2.1.0) to solve the non-linear least-squares problem in FF-LINS. Please follow the Ceres installation instructions.
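A typical from-source build of Ceres Solver 2.1.0 looks roughly like this (a sketch; see the official installation guide for the full dependency list):

```shell
# Build and install Ceres Solver 2.1.0 from source (sketch)
wget https://ceres-solver.org/ceres-solver-2.1.0.tar.gz
tar zxf ceres-solver-2.1.0.tar.gz
mkdir ceres-bin && cd ceres-bin
cmake ../ceres-solver-2.1.0
make -j8
sudo make install
```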
FF-LINS also depends on Eigen (>=3.3.7), TBB, and glog (>=0.4.0). You can install them as follows:

```shell
sudo apt install libeigen3-dev libgoogle-glog-dev libtbb-dev
```
If your system repository cannot provide the required versions, you should build these libraries from source, as in the glog example below.
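For example, glog can be built from source roughly as follows (a sketch; the install prefix and options may need adjusting):

```shell
# Build and install glog from source (sketch)
git clone https://github.com/google/glog.git
cd glog
cmake -S . -B build -DCMAKE_BUILD_TYPE=Release
cmake --build build -j 8
sudo cmake --build build --target install
```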
yaml-cpp is employed for reading configurations. It can be installed as follows:

```shell
sudo apt install libyaml-cpp-dev
```
```shell
# Make the workspace directory
mkdir ~/lins_ws && cd ~/lins_ws
mkdir src && cd src

# Clone the repository into the src directory
git clone https://github.com/i2Nav-WHU/FF-LINS.git

# Back to the lins_ws directory
cd ..

# Build the source code using catkin_make
catkin_make -j8 -DCMAKE_BUILD_TYPE=Release -DCMAKE_C_COMPILER=gcc-8 -DCMAKE_CXX_COMPILER=g++-8
```
If you have already downloaded one of the open-source datasets, run the following commands.
```shell
# Open a terminal and source the workspace environments
# For bash
source ~/lins_ws/devel/setup.bash
# For zsh
# source ~/lins_ws/devel/setup.zsh

# Run the FF-LINS node
# 1. Download the dataset.
# 2. Change the outputpath in ff_lins_robot.yaml.
# 3. Change the path in the following command.
# 4. Run the following command.
roslaunch ff_lins ff_lins_read.launch configfile:=path/ff_lins_robot.yaml bagfile:=path/park/park.bag
```
The ROS messages used in FF-LINS are as follows:
| Sensor | Message | Default Topic |
| --- | --- | --- |
| Solid-State LiDAR | livox_ros_driver/CustomMsg | /livox/lidar |
| IMU | sensor_msgs/Imu | /livox/imu |
The IMU data should be in the front-right-down format in FF-LINS.
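You can quickly verify the topic names and message types in your bag before running FF-LINS (the bag file name below is only a placeholder):

```shell
# List the topics and message types contained in a bag (placeholder file name)
rosbag info your_data.bag
```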
| Sequence | Time Length (seconds) | Trajectory Length (km) | Baidu Cloud Link |
| --- | --- | --- | --- |
| Schloss-1 | 634 | 0.67 | Schloss-1.bag |
| Schloss-2 | 736 | 1.11 | Schloss-2.bag |
| East | 1251 | 3.64 | East.bag |
| Sequence | Time Length (seconds) | Trajectory Length (km) | Baidu Cloud Link |
| --- | --- | --- | --- |
| hku_main_building | 1160 | 0.97 | hku_main_building.bag |
| hkust_campus_00 | 1060 | 1.33 | hkust_campus_00.bag |
| hkust_campus_01 | 1149 | 1.46 | hkust_campus_01.bag |
We also open-source our self-collected robot dataset.
| Sequence | Time Length (seconds) | Trajectory Length (km) | Baidu Cloud Link |
| --- | --- | --- | --- |
| park | 1326 | 1.46 | park.bag |
You can also run FF-LINS with your self-collected dataset. Keep the following notes in mind:

- Prepare the solid-state LiDAR and IMU data in a ROS bag (see the recording example after this list);
- The IMU data should be in the front-right-down format;
- Modify the topic names in the ff_lins_read.launch or ff_lins_play.launch file;
- Modify the parameters in the configuration file.
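For reference, a bag with the default Livox topics can be recorded roughly as follows (a sketch; adjust the topic names and output file to your setup):

```shell
# Record the solid-state LiDAR and IMU topics into a bag (placeholder output name)
rosbag record /livox/lidar /livox/imu -O my_dataset.bag
```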
We use evo to evaluate the TUM-format trajectory files. We also provide some useful scripts (evaluate_odometry) for evaluation.
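For example, the absolute pose error of an estimated TUM trajectory against a ground-truth file can be computed with evo (file names are placeholders):

```shell
# Absolute pose error with trajectory alignment (placeholder file names)
evo_ape tum ground_truth.tum ff_lins_odometry.tum -a --plot
```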
We thank the following projects for their help in developing and evaluating FF-LINS:
- IC-GVINS: A Robust, Real-time, INS-Centric GNSS-Visual-Inertial Navigation System
- OB_GINS: An Optimization-Based GNSS/INS Integrated Navigation System
- evo: Python package for the evaluation of odometry and SLAM
The source code is released under the GPLv3 license.
We are still working on improving the code. For any technical issues, please contact Dr. Hailiang Tang ([email protected]) or open an issue in this repository.
For commercial usage, please contact Prof. Xiaoji Niu ([email protected]).