Modified from ORB-SLAM2.
Tested on Ubuntu 12.04, 14.04 and 16.04 (other versions may also work).
We use the new thread and chrono functionalities of C++11.
We use Pangolin for visualization and user interface. Download and install instructions can be found at: https://github.com/stevenlovegrove/Pangolin.
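If you do not have Pangolin installed yet, a typical from-source build on Ubuntu looks roughly like the sketch below (the package names and the `-j` value are assumptions; follow the upstream instructions if they differ):

```bash
# Sketch: build and install Pangolin from source (assumed prerequisites for
# older Pangolin releases; check the upstream README for your version).
sudo apt-get install libglew-dev cmake
git clone https://github.com/stevenlovegrove/Pangolin.git
cd Pangolin
mkdir build && cd build
cmake ..
make -j4
sudo make install
```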
We use OpenCV to manipulate images and features. Download and install instructions can be found at: https://opencv.org. At least OpenCV 2.4.3 is required. Tested with OpenCV 2.4.11 and OpenCV 3.2.
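To check which OpenCV version is already on your system (the pkg-config entry and the Python bindings are only present for some installations, so this is just a quick sanity check):

```bash
# Quick check of the installed OpenCV version (>= 2.4.3 is required).
pkg-config --modversion opencv
# Alternative check via the Python bindings, if installed:
python -c "import cv2; print(cv2.__version__)"
```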
Eigen3 is required by g2o (see below). Download and install instructions can be found at: https://eigen.tuxfamily.org. At least Eigen 3.1.0 is required.
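On Ubuntu 14.04/16.04 the packaged Eigen is recent enough; a minimal install sketch:

```bash
# Install Eigen3 from the Ubuntu repositories (>= 3.1.0 is required).
sudo apt-get install libeigen3-dev
```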
We use modified versions of the DBoW2 library to perform place recognition and g2o library to perform non-linear optimizations. Both modified libraries (which are BSD) are included in the Thirdparty folder.
We provide some examples to process the live input of a monocular, stereo or RGB-D camera using ROS. Building these examples is optional. In case you want to use ROS, a version Hydro or newer is needed.
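To check which ROS distribution your shell is currently using (the setup.bash path below assumes a Kinetic install; adjust it to your distro):

```bash
# Source the ROS environment and print the active distribution name.
source /opt/ros/kinetic/setup.bash
rosversion -d
```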
Clone the repository:
git clone https://github.com/xxx/Multi_ORB_SLAM.git Multi_ORB_SLAM
We provide a script build.sh to build the Thirdparty libraries and ORB-SLAM2. Please make sure you have installed all required dependencies (see section 2). Execute:
cd Multi_ORB_SLAM
chmod +x build.sh
./build.sh
This will create libORB_SLAM2.so in the lib folder and the rgbd_tum executable in the Examples folder.
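To verify that the build produced the expected artifacts (exact paths may vary slightly in this fork):

```bash
# Check that the library and the RGB-D example were built.
ls lib/libORB_SLAM2.so
find Examples -name rgbd_tum
```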
- You need to build your own dataset (a sketch for generating the image list files follows the commands below).
Associate the RGB and depth images for each of the two cameras separately using associate.py (the association script from the TUM RGB-D benchmark tools):
python associate.py PATH_TO_SEQUENCE/rgb.txt PATH_TO_SEQUENCE/depth.txt > associations.txt
python associate.py PATH_TO_SEQUENCE/rgb2.txt PATH_TO_SEQUENCE/depth2.txt > associations2.txt
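If you record your own sequences, each camera needs rgb.txt/depth.txt index files in the TUM RGB-D format (`timestamp filename` per line). A minimal sketch, assuming the images are stored as `<timestamp>.png` under rgb/, depth/, rgb2/ and depth2/ (this layout is an assumption; adjust it to your recorder):

```bash
# Sketch: generate TUM-style index files from folders of timestamp-named PNGs.
cd PATH_TO_SEQUENCE
for d in rgb depth rgb2 depth2; do
  {
    echo "# $d images"
    echo "# timestamp filename"
    for f in "$d"/*.png; do
      echo "$(basename "$f" .png) $f"   # e.g. "1305031102.175304 rgb/1305031102.175304.png"
    done
  } > "$d.txt"
done
```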
- Execute the following command, for example:
./Examples/RGB-D/rgbd_tum xxx/Multi_ORB_SLAM/Vocabulary/ORBvoc.txt xxx/Multi_ORB_SLAM/OtherFiles/multi.yaml PATH_TO_SEQUENCE PATH_TO_SEQUENCE/associations.txt PATH_TO_SEQUENCE/associations2.txt xxx/Multi_ORB_SLAM/OtherFiles/calibration.txt
- Add the path including Examples/ROS/ORB_SLAM2 to the ROS_PACKAGE_PATH environment variable. Open your .bashrc file and add the following line at the end, replacing PATH with the folder where you cloned Multi_ORB_SLAM:
export ROS_PACKAGE_PATH=${ROS_PACKAGE_PATH}:PATH/Multi_ORB_SLAM/Examples/ROS
- Execute the build_ros.sh script:
chmod +x build_ros.sh
./build_ros.sh
Install the ROS driver for the Orbbec Astra camera and write a multi-camera launch file. Put it under the driver's launch directory, for example /opt/ros/kinetic/share/astra_launch/launch/. You can use the two_astra.launch provided in /Multi_ORB_SLAM/OtherFiles/. Use this launch file to run the two cameras.
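The provided two_astra.launch should look roughly like the sketch below: it starts the Astra driver twice under different namespaces. The argument names follow the openni2-style astra.launch and the device_id values are placeholders, so check them against your astra_launch version:

```bash
# Sketch: a two-camera launch file (namespaces and device IDs are placeholders;
# verify the arguments exposed by your astra_launch version).
cat > two_astra.launch <<'EOF'
<launch>
  <!-- first Astra, publishes under /camera/... -->
  <include file="$(find astra_launch)/launch/astra.launch">
    <arg name="camera"    value="camera"/>
    <arg name="device_id" value="#1"/>
  </include>
  <!-- second Astra, publishes under /camera2/... -->
  <include file="$(find astra_launch)/launch/astra.launch">
    <arg name="camera"    value="camera2"/>
    <arg name="device_id" value="#2"/>
  </include>
</launch>
EOF
```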
For an RGB-D input from topics /camera/rgb/image_raw and /camera/depth_registered/image_raw, run node ORB_SLAM2/RGBD. You will need to provide the vocabulary file and a settings file. See the RGB-D example above.
rosrun ORB_SLAM2 RGBD PATH_TO_VOCABULARY PATH_TO_SETTINGS_FILE PATH_TO_calibration_FILE
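The topic names above are the stock ORB-SLAM2 defaults; the multi-camera node in this fork presumably also subscribes to the second camera's topics, so check the source of the RGBD node. If your driver publishes under different names, you can use standard ROS command-line remapping (the right-hand topic names below are only examples):

```bash
# Sketch: remap the node's input topics on the command line
# (replace the right-hand sides with the topics your driver actually publishes).
rosrun ORB_SLAM2 RGBD PATH_TO_VOCABULARY PATH_TO_SETTINGS_FILE PATH_TO_calibration_FILE \
  /camera/rgb/image_raw:=/camera/rgb/image_rect_color \
  /camera/depth_registered/image_raw:=/camera/depth_registered/sw_registered/image_rect_raw
```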
Alternatively, you can run the launch file directly:
roslaunch orbslam2.launch
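Before debugging the SLAM node, it is worth confirming that both cameras are actually publishing (the topic names are driver defaults and may differ on your setup):

```bash
# List the image topics from the two drivers and check the frame rate of one of them.
rostopic list | grep image_raw
rostopic hz /camera/rgb/image_raw
```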
You will need to create a settings file with the calibration of your camera. See the settings files provided for the TUM and KITTI datasets for monocular, stereo and RGB-D cameras. We use the calibration model of OpenCV. See the examples to learn how to create a program that makes use of the ORB-SLAM2 library and how to pass images to the SLAM system. Stereo input must be synchronized and rectified. RGB-D input must be synchronized and depth registered. To use this fork you will also need to build your own multi-camera dataset, as described in the dataset section above.
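For reference, a stock ORB-SLAM2 RGB-D settings file is an OpenCV FileStorage YAML with the keys sketched below. The values are placeholders, not a real calibration, and the multi-camera multi.yaml used by this fork may add further per-camera fields, so compare against OtherFiles/multi.yaml:

```bash
# Sketch: settings file in the stock ORB-SLAM2 format with placeholder values.
# Calibrate your own cameras and compare with OtherFiles/multi.yaml.
cat > my_camera.yaml <<'EOF'
%YAML:1.0
# OpenCV pinhole intrinsics and distortion coefficients
Camera.fx: 525.0
Camera.fy: 525.0
Camera.cx: 319.5
Camera.cy: 239.5
Camera.k1: 0.0
Camera.k2: 0.0
Camera.p1: 0.0
Camera.p2: 0.0
Camera.k3: 0.0
Camera.width: 640
Camera.height: 480
Camera.fps: 30.0
Camera.bf: 40.0          # baseline (in meters) times fx
Camera.RGB: 1            # 1: images are RGB, 0: BGR
ThDepth: 40.0            # close/far point threshold, in units of the baseline
DepthMapFactor: 1000.0   # depth scale: 5000 for TUM PNGs, often 1000 for OpenNI/Astra
# ORB extractor parameters
ORBextractor.nFeatures: 1000
ORBextractor.scaleFactor: 1.2
ORBextractor.nLevels: 8
ORBextractor.iniThFAST: 20
ORBextractor.minThFAST: 7
# Keep the Viewer.* entries from the provided example files as well.
EOF
```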
You can change between the SLAM and Localization mode using the GUI of the map viewer.
This is the default mode. The system runs three threads in parallel: Tracking, Local Mapping and Loop Closing. The system localizes the camera, builds a new map and tries to close loops.
This mode can be used when you have a good map of your working area. In this mode the Local Mapping and Loop Closing are deactivated. The system localizes the camera in the map (which is no longer updated), using relocalization if needed.
For the original ORB-SLAM2 documentation, see README_original.md.