
DEEP LEARNING FOR ENLARGING HUMAN MOTION CAPTURE (MOCAP) DATASETS

The British Orthopaedic Research Society (BORS) 2023 Meeting, Cambridge, England, 25–26 September 2023.



Abstract

OBJECTIVES

The application of deep learning approaches to marker trajectories and ground reaction forces (mocap data) is often hampered by small datasets. Dataset size can be enlarged using simple numerical approaches, although these may not preserve the physiological relevance of mocap data. We propose augmenting mocap data using a deep learning architecture called "generative adversarial networks" (GANs). We demonstrate that appropriate use of GANs can capture variations in walking patterns due to subject- and task-specific conditions (mass, leg length, age, gender and walking speed), which significantly affect walking kinematics and kinetics, resulting in augmented datasets amenable to deep learning analysis approaches.

METHODS

A publicly available (https://www.nature.com/articles/s41597-019-0124-4) gait dataset (733 trials, 21 women and 25 men, 37.2 ± 13.0 years, 1.74 ± 0.09 m, 72.0 ± 11.4 kg, walking speeds ranging from 0.18 m/s to 2.04 m/s) was used as the experimental dataset. The GAN comprised three neural networks: an encoder, a decoder, and a discriminator. The encoder compressed experimental data into a fixed-length vector, while the decoder transformed the encoder's output vector and a condition vector (containing information about the subject and trial) into mocap data. The discriminator distinguished the encoded experimental data from randomly sampled vectors of the same size. By training these networks jointly on the experimental dataset, the generator (decoder) could generate synthetic data respecting specified conditions from randomly sampled vectors. Synthetic mocap data and lower limb joint angles were generated and compared to the experimental data by identifying the statistically significant differences across the gait cycle for a randomly selected subset of the experimental data from 5 female subjects (73 trials, aged 26–40, weighing 57–74 kg, with leg lengths of 868–931 mm, and walking speeds ranging from 0.81–1.68 m/s). By conducting these comparisons for this subset, we aimed to assess the synthetic data generated using multiple conditions.
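The three-network layout described above can be sketched as plain forward passes. This is a minimal shape-level illustration, not the authors' implementation: the layer sizes, the use of single linear maps in place of full networks, and the example condition values are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

T, D = 101, 12   # time points per gait cycle, mocap channels (assumed sizes)
Z, C = 16, 5     # latent size; condition: mass, leg length, age, gender, speed

# Single linear layers stand in for the real trained networks (illustrative only).
W_enc = rng.normal(size=(T * D, Z)) * 0.01
W_dec = rng.normal(size=(Z + C, T * D)) * 0.01
W_dis = rng.normal(size=(Z, 1)) * 0.01

def encoder(trial):
    """Compress a mocap trial into a fixed-length latent vector."""
    return trial.reshape(-1) @ W_enc

def decoder(z, cond):
    """Transform a latent vector plus a condition vector into a mocap trial."""
    return (np.concatenate([z, cond]) @ W_dec).reshape(T, D)

def discriminator(z):
    """Score a latent vector: encoded experimental data vs. random samples."""
    s = z @ W_dis
    return float(1.0 / (1.0 + np.exp(-s[0])))

trial = rng.normal(size=(T, D))
cond = np.array([72.0, 0.90, 37.0, 1.0, 1.3])   # assumed example condition values

z_real = encoder(trial)                 # encoding path, judged by the discriminator
z_prior = rng.normal(size=Z)            # randomly sampled vector of the same size
synthetic = decoder(z_prior, cond)      # generation path used after training

assert z_real.shape == (Z,) and synthetic.shape == (T, D)
```

After joint adversarial training (not shown), only the decoder path is needed to generate conditioned synthetic trials from random latent vectors.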

RESULTS

We visually inspected the synthetic trials to ensure that they appeared realistic. The statistical comparison revealed that, on average, only 2.5% of the gait cycle showed significant differences in joint angles between the two data groups. Additionally, the synthetic ground reaction forces deviated from the experimental data distribution for an average of 2.9% of the gait cycle.
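A per-time-point comparison of this kind can be illustrated as follows: test each of 101 gait-cycle points for a group difference and report the fraction of points flagged as significant. The stand-in data, the choice of Welch's t-test, and the large-sample critical value are assumptions for illustration; the abstract does not specify the exact statistical procedure used.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in joint-angle curves, resampled to 101 points per gait cycle.
n_exp, n_syn, n_pts = 73, 73, 101
experimental = rng.normal(loc=0.0, scale=5.0, size=(n_exp, n_pts))
synthetic = rng.normal(loc=0.0, scale=5.0, size=(n_syn, n_pts))

def welch_t(a, b):
    """Welch's two-sample t statistic at each gait-cycle point."""
    va, vb = a.var(axis=0, ddof=1), b.var(axis=0, ddof=1)
    return (a.mean(axis=0) - b.mean(axis=0)) / np.sqrt(va / len(a) + vb / len(b))

t = welch_t(experimental, synthetic)
significant = np.abs(t) > 1.98          # ~ two-sided alpha = 0.05, large-sample value
pct_different = 100.0 * significant.mean()

# With identically distributed stand-in groups, roughly alpha percent of the
# cycle is flagged by chance; the abstract reports 2.5% for joint angles.
assert 0.0 <= pct_different <= 100.0
```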

CONCLUSIONS

We introduced a novel approach for generating synthetic mocap data of human walking based on the conditions that influence walking patterns. The synthetic data closely followed the trends observed in the experimental data and reported in the literature, suggesting that our approach can augment mocap datasets while accounting for multiple conditions, which was not feasible in previous work. Creation of large, augmented datasets allows the application of other deep learning approaches, with the potential to generate realistic mocap data from limited and non-lab-based data. Our method could also enhance data sharing, since synthetic data does not raise ethical concerns. You can generate and download virtual gait data using our GAN approach at https://thisgaitdoesnotexist.streamlit.app/.

Declaration of Interest

I declare that there is no conflict of interest that could be perceived as prejudicing the impartiality of the research reported.