
An efficient and robust multisensor-aided inertial navigation system with online calibration that is capable of fusing IMU, camera, LiDAR, GPS/GNSS, and wheel sensors. Use cases: VINS/VIO, GPS-INS, LINS/LIO, multi-sensor fusion for localization and mapping (SLAM). This repository also provides multi-sensor simulation and data.


MINS


MINS is an efficient, robust, and tightly-coupled Multisensor-aided Inertial Navigation System that flexibly fuses all five sensing modalities (IMU, wheel encoder, camera, GNSS, and LiDAR) in a filtering fashion, overcoming the hurdles of computational complexity, sensor asynchronicity, and intra-sensor calibration.

Exemplary use cases of MINS:

  • VINS (mono, stereo, multi-cam)
  • GPS-IMU (single, multiple)
  • LiDAR-IMU (single, multiple)
  • wheel-IMU
  • Camera-GPS-LiDAR-wheel-IMU, or other combinations.


Key Features

  • Inertial(IMU)-based multi-sensor fusion including wheel odometry and arbitrary numbers of cameras, LiDARs, and GNSSs (+ VICON or loop-closure) for localization.
  • Online calibration of all onboard sensors (check exemplary results).
  • Consistent high-order on-manifold state interpolation, improved from our prior work (MIMC-VINS), and a dynamic cloning strategy for lightweight estimation.
  • Multi-sensor simulation toolbox for IMU, camera, LiDAR, GNSS, and wheel, enhanced from our prior work (OpenVINS).
  • Evaluation toolbox for consistency, accuracy, and timing analysis.
  • Highly detailed configuration options for each sensor, enabling general multi-sensor applications.

Dependency

MINS is tested on Ubuntu 18.04 and 20.04 and only requires the corresponding ROS distribution (Melodic or Noetic, respectively).

  • The default Eigen version is 3.3.7 (Noetic) or lower (Melodic). If a newer Eigen version is installed, compilation may fail due to the third-party LiDAR library (libpointmatcher); a quick version check is sketched below.
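
A quick way to check the installed Eigen version before building (a minimal sketch, assuming Eigen was installed via the Ubuntu libeigen3-dev package):

dpkg -s libeigen3-dev | grep Version   # e.g., Version: 3.3.7-2 on Ubuntu 20.04
# or equivalently
pkg-config --modversion eigen3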

Build and Source

mkdir -p $MINS_WORKSPACE/catkin_ws/src/ && cd $MINS_WORKSPACE/catkin_ws/src/
git clone https://github.com/rpng/MINS
cd .. && catkin build
source devel/setup.bash
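
Optionally, the workspace can be sourced automatically in new shells by appending the setup script to ~/.bashrc (a convenience step, not required by MINS; assumes $MINS_WORKSPACE is set as above):

echo "source $MINS_WORKSPACE/catkin_ws/devel/setup.bash" >> ~/.bashrc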

Run Examples

Simulation

roslaunch mins simulation.launch cam_enabled:=true lidar_enabled:=true

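The camera and LiDAR flags above can be toggled individually. For example, to simulate without LiDAR and visualize the result with the provided RViz config (a sketch using only the launch arguments shown in this README):

roslaunch mins simulation.launch cam_enabled:=true lidar_enabled:=false
# in a second terminal
rviz -d mins/launch/display.rviz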

Real-World Dataset

Directly reading the ROS bag file

roslaunch mins rosbag.launch config:=kaist/kaist_LC path_gt:=urban30.txt path_bag:=urban30.bag


Here are the rosbag files and ground truths we used in the evaluation. Specifically, we used kaist2bag to convert all sensor readings to rosbag files. All rights to the data belong to the KAIST Urban dataset.

| Rosbag | GT (csv) | GT (txt) | Rosbag | GT (csv) | GT (txt) |
| --- | --- | --- | --- | --- | --- |
| urban18.bag | urban18.csv | urban18.txt | urban19.bag | urban19.csv | urban19.txt |
| urban20.bag | urban20.csv | urban20.txt | urban21.bag | urban21.csv | urban21.txt |
| urban22.bag | urban22.csv | urban22.txt | urban23.bag | urban23.csv | urban23.txt |
| urban24.bag | urban24.csv | urban24.txt | urban25.bag | urban25.csv | urban25.txt |
| urban26.bag | urban26.csv | urban26.txt | urban27.bag | urban27.csv | urban27.txt |
| urban28.bag | urban28.csv | urban28.txt | urban29.bag | urban29.csv | urban29.txt |
| urban30.bag | urban30.csv | urban30.txt | urban31.bag | urban31.csv | urban31.txt |
| urban32.bag | urban32.csv | urban32.txt | urban33.bag | urban33.csv | urban33.txt |
| urban34.bag | urban34.csv | urban34.txt | urban35.bag | urban35.csv | urban35.txt |
| urban36.bag | urban36.csv | urban36.txt | urban37.bag | urban37.csv | urban37.txt |
| urban38.bag | urban38.csv | urban38.txt | urban39.bag | urban39.csv | urban39.txt |
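
Any other sequence from the table can be run the same way by swapping the ground-truth and bag paths, e.g. (adjust the paths to wherever the files are stored):

roslaunch mins rosbag.launch config:=kaist/kaist_LC path_gt:=urban38.txt path_bag:=urban38.bag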

Subscribing to the ROS messages

roslaunch mins subscribe.launch config:=euroc_mav rosbag:=V1_03_difficult.bag bag_start_time:=0

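Before subscribing, it can help to confirm that the topics recorded in the bag match those expected by the chosen config (a standard ROS check, not specific to MINS):

rosbag info V1_03_difficult.bag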

RViz

rviz -d mins/launch/display.rviz

Acknowledgements

This project was built on top of the following libraries, which can be found in the thirdparty folder.

Credit / Licensing

This code was written by the Robot Perception and Navigation Group (RPNG) at the University of Delaware. If you have any issues with the code, please open an issue on our GitHub page with relevant implementation details and references. For researchers who have leveraged or compared against this work, please cite the following:

The publication reference will be updated soon.

@article{Lee2023arxiv,
    title        = {MINS: Efficient and Robust Multisensor-aided Inertial Navigation System},
    author       = {Woosik Lee and Patrick Geneva and Chuchu Chen and Guoquan Huang},
    year         = 2023,
    journal      = {arXiv preprint arXiv:2309.15390},
    url          = {https://github.com/rpng/MINS},
}

The codebase and documentation are licensed under the GNU General Public License v3 (GPL-3). You must preserve the copyright and license notices in your derivative work and make the complete source code with your modifications available under the same license (see this; this is not legal advice).
