This repository contains the codes for the IROS 2023 paper "Panoptic Mapping with Fruit Completion and Pose Estimation for Horticultural Robots".
| Panoptic Mapping | Fruit Completion |
| --- | --- |
| horti_map_part1.mp4 | horti_map_part2.mp4 |
Monitoring plants and fruits at high resolution plays a key role in the future of agriculture. Accurate 3D information can pave the way to a diverse range of robotic applications in agriculture, ranging from autonomous harvesting to precise yield estimation. Obtaining such 3D information is non-trivial, as agricultural environments are often repetitive and cluttered, and one has to account for the partial observability of fruit and plants. In this paper, we address the problem of jointly estimating complete 3D shapes of fruit and their pose in a 3D multi-resolution map built by a mobile robot. To this end, we propose an online multi-resolution panoptic mapping system where regions of interest are represented at a higher resolution. We exploit data to learn a general fruit shape representation that we use at inference time, together with an occlusion-aware differentiable rendering pipeline, to complete partial fruit observations and estimate the 7 DoF pose of each fruit in the map. The experiments presented in this paper, evaluated both in a controlled environment and in a commercial greenhouse, show that our novel algorithm yields higher completion and pose estimation accuracy than existing methods, with an improvement of 41% in completion accuracy and 52% in pose estimation accuracy, while keeping a low inference time of 0.6 s on average.
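At its core, the completion step can be thought of as jointly optimizing a learned latent shape code and a 7 DoF fruit pose against the observed fruit surface. The sketch below illustrates this idea in a strongly simplified form: it replaces the paper's occlusion-aware differentiable rendering loss with a plain point-to-surface SDF loss and uses an untrained toy decoder, so all network sizes, parameterizations, and hyperparameters are placeholders rather than the actual implementation.

```python
# Strongly simplified sketch: jointly optimize a DeepSDF-style latent shape code and a
# 7 DoF pose (scale, rotation, translation) so that the observed fruit surface points fall
# on the zero level set of the decoded signed distance field. The toy decoder, the
# point-to-surface loss, and all sizes/learning rates are placeholders; the paper uses a
# learned fruit shape prior and an occlusion-aware differentiable rendering loss instead.
import torch
import torch.nn as nn

def axis_angle_to_matrix(v):
    """Rodrigues' formula: axis-angle vector (3,) -> rotation matrix (3, 3)."""
    theta = torch.sqrt((v ** 2).sum() + 1e-12)
    k = v / theta
    K = torch.stack([
        torch.stack([torch.zeros_like(k[0]), -k[2], k[1]]),
        torch.stack([k[2], torch.zeros_like(k[0]), -k[0]]),
        torch.stack([-k[1], k[0], torch.zeros_like(k[0])]),
    ])
    return torch.eye(3) + torch.sin(theta) * K + (1.0 - torch.cos(theta)) * (K @ K)

# toy stand-in for the learned shape decoder sdf = f(latent, x)
decoder = nn.Sequential(nn.Linear(16 + 3, 64), nn.ReLU(),
                        nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 1))

obs = torch.randn(500, 3) * 0.05 + torch.tensor([0.3, 0.1, 0.8])  # fake partial observation

latent = torch.zeros(16, requires_grad=True)            # shape code
rot = torch.zeros(3, requires_grad=True)                # rotation (axis-angle)
trans = obs.mean(0).clone().requires_grad_(True)        # translation, initialized at centroid
log_scale = torch.zeros(1, requires_grad=True)          # scale, log-parameterized

opt = torch.optim.Adam([latent, rot, trans, log_scale], lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    R = axis_angle_to_matrix(rot)
    local = ((obs - trans) @ R) / log_scale.exp()        # world -> canonical fruit frame
    sdf = decoder(torch.cat([latent.expand(local.shape[0], -1), local], dim=1))
    loss = sdf.abs().mean() + 1e-3 * latent.norm()       # surface term + latent regularizer
    loss.backward()
    opt.step()
```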
conda create --name homa python=3.8
conda activate homa
conda install pytorch==2.0.0 torchvision==0.15.0 torchaudio==2.0.0 pytorch-cuda=11.7 -c pytorch -c nvidia
The commands depend on your CUDA version. You may check the instructions here.
pip3 install open3d==0.17 opencv-python scikit-image wandb tqdm plyfile
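As an optional sanity check (not part of the repository), you can verify that the main dependencies import and that PyTorch sees your GPU:

```python
# Optional sanity check: make sure the main dependencies import and PyTorch can see the GPU.
import torch
import open3d as o3d
import cv2

print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("open3d:", o3d.__version__, "| opencv:", cv2.__version__)
```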
git clone git@github.com:PRBonn/HortiMapping.git
cd HortiMapping
For the multi-resolution panoptic mapping part, we use our previous work Voxfield Panmap.
We provide an example data sequence generated from the public BUP20 sweet pepper dataset using multi-resolution panoptic mapping.
You can download this example data by:
sh scripts/download_bup_example.sh
After setting the data path, you can test the shape completion and pose estimation on the example data sequence by running:
python test_wild_completion.py -c ./configs/wild_pepper.yaml
You will see a visualizer showing the optimization process. You can then check the `submaps_complete` and `submaps_pose` folders in the example data folder for the completed mesh and pose of each fruit.
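As a hypothetical example of how one might inspect these outputs with Open3D, here is a short sketch; the paths, file names, and the pose format (assumed to be a plain-text 4x4 homogeneous matrix) are guesses, so adapt them to the actual contents of the output folders.

```python
# Hypothetical inspection snippet: load one completed fruit mesh and its pose with Open3D.
# Paths, file names, and the pose file format are assumptions, not the repository's spec.
import glob
import numpy as np
import open3d as o3d

mesh_files = sorted(glob.glob("path/to/example_data/submaps_complete/*.ply"))
pose_files = sorted(glob.glob("path/to/example_data/submaps_pose/*.txt"))

mesh = o3d.io.read_triangle_mesh(mesh_files[0])
pose = np.loadtxt(pose_files[0]).reshape(4, 4)
print("vertices:", np.asarray(mesh.vertices).shape)
print("pose:\n", pose)

mesh.compute_vertex_normals()
o3d.visualization.draw_geometries([mesh.transform(pose)])  # show the fruit in the map frame
```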
To run the code on the ECCV fruit shape completion benchmark, you can first download the data by:
sh scripts/download_fruit_shape_completion_dataset.sh
Then run:
python run_shape_completion_challenge.py
You can tune the parameters and switch the data split (`train`, `val`, or `test`) in the config file `./configs/shape_completion_challenge_pepper.yaml`. For faster operation, you can turn off the visualization by setting `vis_on` to `false`.
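If you prefer changing these options programmatically instead of editing the YAML by hand, a minimal sketch could look as follows; it assumes PyYAML is available, and the `split` key name is a guess since only `vis_on` is mentioned above.

```python
# Sketch of toggling the config options programmatically. Assumes PyYAML is available;
# "split" is a guessed key name, only "vis_on" is named in the instructions above.
import yaml

cfg_path = "./configs/shape_completion_challenge_pepper.yaml"
with open(cfg_path) as f:
    cfg = yaml.safe_load(f)

cfg["vis_on"] = False   # turn off the visualization for faster operation
cfg["split"] = "val"    # hypothetical key for the data split (train / val / test)

with open(cfg_path, "w") as f:
    yaml.safe_dump(cfg, f)
```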
For more details on the dataset and the challenge, please see here.
This script can also be applied to other datasets whose input consists of masked RGB-D images.
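For illustration only, a masked RGB-D frame can be thought of as an RGB image, a depth image, and a per-fruit instance mask; the file names, depth scale, and layout in the sketch below are placeholders and not the repository's actual data format.

```python
# Illustration only: one masked RGB-D frame = RGB image + depth image + fruit instance mask.
# File names, the depth scale, and the layout are placeholders, not the actual data format.
import cv2
import numpy as np

rgb = cv2.imread("frame_0000_rgb.png", cv2.IMREAD_COLOR)            # H x W x 3, uint8
depth = cv2.imread("frame_0000_depth.png", cv2.IMREAD_UNCHANGED)    # H x W, e.g. uint16 in mm
mask = cv2.imread("frame_0000_mask.png", cv2.IMREAD_GRAYSCALE) > 0  # binary fruit mask

depth_m = depth.astype(np.float32) / 1000.0   # assumed millimeter-to-meter conversion
depth_m[~mask] = 0.0                          # keep depth only inside the fruit mask
masked_rgb = rgb * mask[..., None]            # mask out everything but the fruit in the RGB
```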
If you use this repository for any academic work, please cite our paper.
@inproceedings{pan2023iros,
  author    = {Y. Pan and F. Magistri and T. L\"abe and E. Marks and C. Smitt and C.S. McCool and J. Behley and C. Stachniss},
  title     = {Panoptic Mapping with Fruit Completion and Pose Estimation for Horticultural Robots},
  booktitle = {Proceedings of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS)},
  year      = {2023}
}