MARS: An Instance-aware, Modular and Realistic Simulator for Autonomous Driving

CICAI 2023 Best Paper Runner-up Award

For business inquiries, please contact us at [email protected].

We have just finished refactoring our codebase. You can now install MARS instantly with pip! Please don't hesitate to contact us if you encounter any issues with the latest version. Thanks!

1. Installation: Setup the environment

Prerequisites

You must have an NVIDIA GPU with CUDA installed on your system. This library has been tested with CUDA 11.7. You can find more information about installing CUDA here.
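To confirm that CUDA is visible on your system, you can check the driver and toolkit versions (the exact output depends on your installation):

nvidia-smi       # shows the driver version and the highest CUDA version it supports
nvcc --version   # shows the installed CUDA toolkit version (e.g. 11.7)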

Create environment

Nerfstudio requires Python >= 3.7. We recommend using conda to manage dependencies. Make sure to install Conda before proceeding.

conda create --name mars -y python=3.9
conda activate mars

Installation

This section will walk you through the installation process. Our system depends on the tiny-cuda-nn project.
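Depending on your setup, you may first need a CUDA-enabled PyTorch build and a local clone of tiny-cuda-nn. A minimal sketch, assuming CUDA 11.7 and the upstream NVlabs repository:

pip install torch torchvision --extra-index-url https://download.pytorch.org/whl/cu117   # PyTorch wheels built against CUDA 11.7
git clone --recursive https://github.com/NVlabs/tiny-cuda-nn.git                          # --recursive pulls the required submodules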

pip install mars-nerfstudio
cd /path/to/tiny-cuda-nn/bindings/torch
python setup.py install
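To verify that the tiny-cuda-nn bindings built correctly, a quick import check can help (tinycudann is the module name used by the upstream PyTorch bindings):

python -c "import tinycudann"   # should exit without errors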

2. Training from Scratch

The following will train a MARS model.

Our repository provides dataparsers for the KITTI and vKITTI2 datasets. For your own data, you can either write a custom dataparser or convert your dataset to the format of the provided ones.
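As a rough sketch of how a run is launched: MARS builds on nerfstudio, so training goes through the ns-train entry point. The method name and data path below are placeholders; substitute the method registered by your installed version and the path to your prepared sequence:

ns-train <mars-method> --data /path/to/your/sequence   # <mars-method> is a placeholder for the registered MARS config name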

From Datasets

Data Preparation

The data used in our experiments should contain both camera pose parameters and object tracklets. The camera parameters include intrinsics and extrinsics. The object tracklets include bounding-box poses, types, IDs, etc. For more information, refer to the KITTI-MOT and vKITTI2 sections below.
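As an illustration, a downloaded KITTI-MOT sequence typically provides images, camera calibration, ego poses, and per-sequence tracklet labels; the layout below is a sketch of the official release, and the exact paths expected by the dataparser may differ:

kitti-mot/
  training/
    image_02/0006/*.png   # left color images for sequence 0006
    calib/0006.txt        # camera intrinsics and extrinsics
    oxts/0006.txt         # GPS/IMU ego poses
    label_02/0006.txt     # tracklets: frame, track id, type, 3D box, etc.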

KITTI