A flexible and extensible framework for gait recognition. You can focus on designing your own models and comparing with state-of-the-arts easily with the help of OpenGait.

ShiqiYu/OpenGait


OpenGait is a flexible and extensible gait analysis project provided by the Shiqi Yu Group and supported in part by WATRIX.AI. The corresponding paper was accepted to CVPR 2023 as a highlight paper.

What's New

  • [Jun 2024] The first large-scale gait-based scoliosis screening benchmark, ScoNet, was accepted to MICCAI 2024🎉 Congratulations to Zirui on his FIRST paper! The code is released here; see the project homepage for details.
  • [May 2024] The code of the Large-Vision-Model-based method BigGait is available here, along with checkpoints for CCPG.
  • [Apr 2024] Our team's latest checkpoints for projects such as DeepGaitV2, SkeletonGait, SkeletonGait++, and SwinGait will be released on Hugging Face. Previously released checkpoints will also be gradually made available there.
  • [Mar 2024] Chao gave a talk on 'Progress in Gait Recognition'. The video and slides are both available😊
  • [Mar 2024] The code of SkeletonGait++ is released here; see the readme for details.
  • [Mar 2024] BigGait was accepted to CVPR 2024🎉 Congratulations to Dingqiang on his FIRST paper!
  • [Jan 2024] The code of the transformer-based SwinGait is available here.
  • [Dec 2023] A new state-of-the-art baseline, DeepGaitV2, is available here!

Our Publications

  • [MICCAI'24] Gait Patterns as Biomarkers: A Video-Based Approach for Classifying Scoliosis. Paper, Dataset, and Code.
  • [CVPR'24] BigGait: Learning Gait Representation You Want by Large Vision Models. Paper and Code.
  • [AAAI'24] SkeletonGait++: Gait Recognition Using Skeleton Maps. Paper and Code.
  • [AAAI'24] Cross-Covariate Gait Recognition: A Benchmark. Paper, Dataset, and Code.
  • [arXiv'23] Exploring Deep Models for Practical Gait Recognition. Paper, DeepGaitV2, and SwinGait.
  • [PAMI'23] Learning Gait Representation from Massive Unlabelled Walking Videos: A Benchmark. Paper, Dataset, and Code.
  • [CVPR'23] LidarGait: Benchmarking 3D Gait Recognition with Point Clouds. Paper, Dataset, and Code.
  • [CVPR'23] OpenGait: Revisiting Gait Recognition Toward Better Practicality. Highlight Paper and Code.
  • [ECCV'22] GaitEdge: Beyond Plain End-to-end Gait Recognition for Better Practicality. Paper and Code.

A Real Gait Recognition System: All-in-One-Gait


The All-in-One-Gait workflow comprises pedestrian tracking, segmentation, and recognition. See here for details.
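The three stages can be sketched as a simple composition. Every function below is a hypothetical stub with placeholder logic (the names and signatures are ours, not OpenGait's); the real system plugs in an actual pedestrian tracker, a segmentation model, and a gait recognition network at each stage.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Track:
    """One pedestrian tracklet: an identity index plus its cropped frames."""
    person_id: int
    frames: List[str]

def track_pedestrians(video_frames: List[str]) -> List[Track]:
    """Stage 1: group raw video frames into per-person tracklets (dummy logic)."""
    return [Track(person_id=0, frames=video_frames)]

def segment(track: Track) -> List[str]:
    """Stage 2: extract a silhouette from each cropped frame (dummy logic)."""
    return [f"silhouette({frame})" for frame in track.frames]

def recognize(silhouettes: List[str], gallery: Dict[int, str]) -> str:
    """Stage 3: match the silhouette sequence against a gallery (dummy logic)."""
    key = len(silhouettes)  # placeholder "embedding": just the sequence length
    return gallery.get(key, "unknown")

def all_in_one_gait(video_frames: List[str], gallery: Dict[int, str]) -> List[str]:
    """Run tracking -> segmentation -> recognition for every detected person."""
    return [recognize(segment(t), gallery) for t in track_pedestrians(video_frames)]

gallery = {3: "subject_A"}
print(all_in_one_gait(["f0", "f1", "f2"], gallery))  # ['subject_A']
```

The point of the sketch is the data flow: each stage consumes the previous stage's output, so any tracker, segmenter, or recognizer with compatible interfaces can be swapped in.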

Highlighted features

Getting Started

Please see 0.get_started.md. We also provide the following tutorials for your reference:

Model Zoo

✨✨✨ You can find all the checkpoint files at Hugging Face Models! ✨✨✨

The result list of appearance-based gait recognition is available here.

The result list of pose-based gait recognition is available here.

Authors

OpenGait is now mainly maintained by Dongyang Jin (金冬阳), [email protected]

Acknowledgement

Citation

@InProceedings{Fan_2023_CVPR,
    author    = {Fan, Chao and Liang, Junhao and Shen, Chuanfu and Hou, Saihui and Huang, Yongzhen and Yu, Shiqi},
    title     = {OpenGait: Revisiting Gait Recognition Towards Better Practicality},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
    pages     = {9707-9716}
}

Note: This code is for academic purposes only and may not be used for anything that might be considered commercial use.
