
[Pipeline overview figure: tracking → segmentation → silhouette-based recognition]

🎉🎉🎉 OpenGait has been accepted by CVPR2023 as a highlight paper! 🎉🎉🎉

All-in-One-Gait is a sub-project of OpenGait, provided by the Shiqi Yu Group, that implements a gait recognition system.

The workflow of All-in-One-Gait primarily involves the processes of pedestrian tracking, segmentation and recognition.

Users are encouraged to keep the gait recognition models up to date by following the latest SOTA methods in OpenGait.

Demo Results

[Demo videos: gallery (left), probe1 after recognition, probe2 after recognition]
The participants shown in the left video are gallery subjects, and those in the other two videos are probe subjects.

The recognition results are represented by the color of the bounding boxes.

How to use

A. Quick Start in Colab (Recommended)

Open In Colab

B. Run on the host machine

Step1. Installation

git clone https://github.com/jdyjjj/All-in-One-Gait.git
cd All-in-One-Gait
pip install -r requirements.txt
pip install yolox
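
To confirm that the core dependencies installed correctly, a quick import check along these lines can help (a minimal sanity check, not part of the original instructions; it assumes requirements.txt pulled in PyTorch):

python -c "import torch, yolox; print(torch.__version__)"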

Step2. Get checkpoints

demo
   |——————checkpoints
   |        └——————bytetrack_model
   |        └——————gait_model
   |        └——————seg_model
   └——————libs
   └——————output


checkpoints
   |——————bytetrack_model
   |        └——————bytetrack_x_mot17.pth.tar
   |        └——————yolox_x_mix_det.py
   |
   └——————gait_model
   |        └——————xxxx.pt
   └——————seg_model
            └——————human_pp_humansegv2_mobile_192x192_inference_model_with_softmax
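
The later steps create these folders one by one; if you prefer, the whole layout can be prepared up front (a convenience sketch based on the tree above):

cd All-in-One-Gait/OpenGait/demo
mkdir -p checkpoints/bytetrack_model checkpoints/gait_model checkpoints/seg_model
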
Get the checkpoint of gait model
cd All-in-One-Gait/OpenGait/demo/checkpoints
mkdir gait_model
cd gait_model
wget https://github.com/ShiqiYu/OpenGait/releases/download/v2.0/pretrained_grew_gaitbase.zip
unzip -j pretrained_grew_gaitbase.zip

Get the checkpoint of tracking model
cd All-in-One-Gait/OpenGait/demo/checkpoints/bytetrack_model
pip install --upgrade --no-cache-dir gdown
gdown https://drive.google.com/uc?id=1P4mY0Yyd3PPTybgZkjMYhFri88nTmJX5

Alternatively, you can download the checkpoint file manually and put it into the bytetrack_model folder.
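
For the manual route, the file just needs to end up in that folder; for example (assuming the file was saved to ~/Downloads):

mv ~/Downloads/bytetrack_x_mot17.pth.tar All-in-One-Gait/OpenGait/demo/checkpoints/bytetrack_model/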

Get the checkpoint of segmentation model
cd All-in-One-Gait/OpenGait/demo/checkpoints
mkdir seg_model
cd seg_model
wget https://paddleseg.bj.bcebos.com/dygraph/pp_humanseg_v2/human_pp_humansegv2_mobile_192x192_inference_model_with_softmax.zip
unzip human_pp_humansegv2_mobile_192x192_inference_model_with_softmax.zip

Step3. Run demo

cd All-in-One-Gait/OpenGait
python demo/libs/main.py

All-in-One-Gait mainly consists of three processes, i.e., pedestrian tracking, segmentation, and recognition. In main.py, you need to provide two videos as inputs and specify one as the gallery and the other as the probe to obtain the recognition results.

By default, the results are written to All-in-One-Gait/OpenGait/demo/output/Outputvideos/track_vis/{timestamp}.
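
To run the demo on your own footage, one approach is to drop the videos into the InputVideos folder described in Step4 before launching main.py (a sketch; the exact file names read by main.py may differ):

cp /path/to/my_gallery.mp4 All-in-One-Gait/OpenGait/demo/output/InputVideos/gallery.mp4
cp /path/to/my_probe.mp4 All-in-One-Gait/OpenGait/demo/output/InputVideos/probe1.mp4
cd All-in-One-Gait/OpenGait
python demo/libs/main.py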

Step4. See the result

cd All-in-One-Gait/OpenGait/demo/output

output
   └——————GaitFeatures: This stores the corresponding gait features
   └——————GaitSilhouette: This stores the corresponding gait silhouette images
   └——————InputVideos: This is the folder where the input videos are put
   |       └——————gallery.mp4
   |       └——————probe1.mp4
   |       └——————probe2.mp4
   |       └——————probe3.mp4
   |       └——————probe4.mp4
   └——————OutputVideos
           └——————{timestamp}
                   └——————gallery.mp4
                   └——————G-gallery_P-probe1.mp4
                   └——————G-gallery_P-probe2.mp4
                   └——————G-gallery_P-probe3.mp4
                   └——————G-gallery_P-probe4.mp4

{timestamp}: The tracking result videos are stored here, named consistently with the input videos. In addition, videos named G-{gallery_video_name}_P-{probe_video_name}.mp4 are produced after gait recognition.
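
To locate the most recent run, listing the OutputVideos folder by modification time works (a small convenience; the {timestamp} folder name depends on when the demo was run):

ls -t All-in-One-Gait/OpenGait/demo/output/OutputVideos/ | head -n 1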

Authors

OpenGait Team (OGT)

Acknowledgement

Citation

@InProceedings{Fan_2023_CVPR,
    author    = {Fan, Chao and Liang, Junhao and Shen, Chuanfu and Hou, Saihui and Huang, Yongzhen and Yu, Shiqi},
    title     = {OpenGait: Revisiting Gait Recognition Towards Better Practicality},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
    pages     = {9707-9716}
}

Note: This code is for academic purposes only; it must not be used for anything that might be considered commercial use.
