University of Leicester

Associated Spatio-Temporal Capsule Network for Gait Recognition

journal contribution
Posted on 2021-05-24, 10:41. Authored by A. Zhao, J. Dong, J. Li, L. Qi, H. Zhou.
Identifying a person from their gait patterns is a challenging task. State-of-the-art approaches rely on the analysis of temporal or spatial characteristics of gait, and gait recognition is usually performed on single-modality data (such as images, skeleton joint coordinates, or force signals). Evidence has shown that using multi-modality data is more conducive to gait research. We therefore establish an automated learning system, with an associated spatio-temporal capsule network (ASTCapsNet) trained on multi-sensor datasets, to analyze multimodal information for gait recognition. Specifically, we first design a low-level feature extractor and a high-level feature extractor for spatio-temporal feature extraction of gait, built on a novel recurrent memory unit and a relationship layer. Subsequently, a Bayesian model is employed for the decision-making of class labels. Extensive experiments on several public datasets (normal and abnormal gait) validate the effectiveness of the proposed ASTCapsNet against several state-of-the-art methods.
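The abstract describes a three-stage pipeline: a low-level extractor, a high-level extractor over the resulting spatio-temporal features, and a Bayesian decision layer over class labels. The sketch below is a minimal, hypothetical illustration of that pipeline shape only — the feature functions are simple stand-ins (per-frame statistics and temporal deltas, not the paper's capsule network or recurrent memory unit), and the classifier is a plain Gaussian naive Bayes standing in for the Bayesian decision model. All names and data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def low_level_features(frames):
    # Hypothetical low-level extractor: per-frame mean and std across
    # sensor channels (a stand-in for the paper's spatial features).
    return np.stack([frames.mean(axis=1), frames.std(axis=1)], axis=1)

def high_level_features(low):
    # Hypothetical high-level extractor: frame-to-frame deltas appended
    # to the low-level features, then pooled over time (a stand-in for
    # the recurrent memory unit / relationship layer).
    deltas = np.diff(low, axis=0, prepend=low[:1])
    return np.concatenate([low, deltas], axis=1).mean(axis=0)

def gaussian_nb_fit(X, y):
    # Per-class Gaussian model, mirroring the Bayesian decision layer.
    classes = np.unique(y)
    stats = {c: (X[y == c].mean(0), X[y == c].std(0) + 1e-6) for c in classes}
    return classes, stats

def gaussian_nb_predict(classes, stats, x):
    def log_lik(c):
        mu, sd = stats[c]
        return -0.5 * np.sum(((x - mu) / sd) ** 2 + np.log(2 * np.pi * sd**2))
    return classes[np.argmax([log_lik(c) for c in classes])]

# Toy multi-modal gait data: 2 subjects, each with gait sequences of
# 10 frames x 6 sensor channels (e.g. joint coordinates + force signals).
X, y = [], []
for subject in (0, 1):
    for _ in range(20):
        frames = rng.normal(loc=subject, scale=0.5, size=(10, 6))
        X.append(high_level_features(low_level_features(frames)))
        y.append(subject)
X, y = np.array(X), np.array(y)

classes, stats = gaussian_nb_fit(X, y)
probe = high_level_features(low_level_features(rng.normal(1.0, 0.5, (10, 6))))
predicted = gaussian_nb_predict(classes, stats, probe)
print(predicted)
```

The point of the sketch is the staged structure (frame-level features, then sequence-level features, then a probabilistic decision over subject identities), not the particular feature choices, which in the actual ASTCapsNet are learned by capsule layers.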

History

Author affiliation

School of Informatics

Version

  • AM (Accepted Manuscript)

Published in

IEEE Transactions on Multimedia

Publisher

IEEE

ISSN

1520-9210

eISSN

1941-0077

Copyright date

2021

Available date

2021-05-24

Language

eng
