
OpenGVLab/UniHCP


UniHCP: A Unified Model for Human-Centric Perceptions

Usage

Preparation

  1. Install all dependencies listed in requirements.txt.
  2. Replace every path...to... placeholder in the .yaml configuration files with the absolute path to the corresponding dataset location.
  3. Place the MAE pretrained weight mae_pretrain_vit_base.pth under the core/models/backbones/pretrain_weights folder.
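Steps 2 and 3 above can be sketched as shell commands. This is only an illustration: the placeholder search pattern and the location you download the .pth file to are assumptions, and each config still needs its own correct dataset path edited in by hand.

```shell
# Sketch of preparation steps 2-3; adjust paths to your checkout.
CFG_DIR=experiments/unihcp/release
WEIGHT_DIR=core/models/backbones/pretrain_weights

# Step 2: list every remaining dataset-path placeholder so each config
# can be pointed at the right absolute dataset location.
grep -rln 'path' "$CFG_DIR" --include='*.yaml' || echo "no placeholders found under $CFG_DIR"

# Step 3: stage the MAE backbone weight (where you downloaded it is up to you).
mkdir -p "$WEIGHT_DIR"
cp mae_pretrain_vit_base.pth "$WEIGHT_DIR"/ 2>/dev/null || \
  echo "place mae_pretrain_vit_base.pth in $WEIGHT_DIR"
```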

*Only slurm-based distributed training & single-GPU testing are implemented in this repo.

Experiments

All experiment configuration files and launch scripts are located in the experiments/unihcp/release folder.

To perform full multi-task training of UniHCP, replace <your partition> in the train.sh launch script and run:

sh train.sh 88 coslr1e3_104k_b4324g88_h256_I2k_1_10_001_2I_fairscale_m256
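As a rough, hypothetical illustration of what a launch script of this shape does (the repo's actual train.sh may differ: the srun flags and the main.py entry point below are assumptions, not taken from the repo), here is a dry-run sketch:

```shell
# Hypothetical dry-run of a train.sh-style slurm launch: usage <num_gpus> <config_name>.
# Flags and entry point are assumptions; see the repo's train.sh for the real command.
GPUS=${1:-88}
EXP=${2:-coslr1e3_104k_b4324g88_h256_I2k_1_10_001_2I_fairscale_m256}
PARTITION=${PARTITION:-your_partition}   # stands in for <your partition>

CMD="srun -p $PARTITION -n $GPUS --gres=gpu:8 --ntasks-per-node=8 python -u main.py --expname $EXP"
echo "$CMD"   # dry run: print the command instead of launching it
```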

To perform evaluations, keep only the test_info_list assignments corresponding to the tests you want to run, replace <your partition>, then run:

sh batch_test.sh  1 coslr1e3_104k_b4324g88_h256_I2k_1_10_001_2I_fairscale_m256

Note that in this case, the program looks for checkpoints under experiments/unihcp/release/checkpoints/coslr1e3_104k_b4324g88_h256_I2k_1_10_001_2I_fairscale_m256
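A quick sanity check before launching evaluation (the directory name follows the config name as stated above; only the directory's existence is checked, no checkpoint filename is assumed):

```shell
# Verify the checkpoint directory the test script expects actually exists.
CKPT_DIR=experiments/unihcp/release/checkpoints/coslr1e3_104k_b4324g88_h256_I2k_1_10_001_2I_fairscale_m256
if [ -d "$CKPT_DIR" ]; then
  echo "found: $CKPT_DIR"
else
  echo "missing: $CKPT_DIR (train first, or place a downloaded checkpoint here)"
fi
```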

Pretrained Models

Please send the signed agreement to [email protected] to get the download link.

About

Official PyTorch implementation of UniHCP
