Affiliations 

  • 1 Department of Electrical, Electronic and Systems Engineering, Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, Bangi 43600, Malaysia
  • 2 Neurology Unit, Department of Medicine, Faculty of Medicine, Universiti Kebangsaan Malaysia Medical Centre, Kuala Lumpur 56000, Malaysia
  • 3 Department of Neurology, Radboud University Medical Center, P.O. Box 9101, 6500 HB Nijmegen, The Netherlands
  • 4 Pusat GENIUS@Pintar Negara, Universiti Kebangsaan Malaysia, Bangi 43600, Malaysia
Sensors (Basel). 2023 Jul 18;23(14).
PMID: 37514783 DOI: 10.3390/s23146489

Abstract

Gait analysis is an essential tool for detecting biomechanical irregularities, designing personalized rehabilitation plans, and enhancing athletic performance. Current gait assessment relies either on visual observation, which lacks inter-rater consistency and requires clinical expertise, or on instrumented evaluation, which is costly, invasive, time-consuming, and requires specialized equipment and trained personnel. Markerless gait analysis using 2D pose estimation has emerged as a potential solution, but it still demands significant computational resources and human involvement, which limits its practicality. This research proposes an automated method for temporal gait analysis that employs MediaPipe Pose, a pose estimation model with low computational requirements. The approach was validated against the Vicon motion capture system to evaluate its reliability. The findings reveal good (ICC(2,1) > 0.75) to excellent (ICC(2,1) > 0.90) agreement for all temporal gait parameters except double support time during the right-to-left transition and right swing time, which show only moderate agreement (ICC(2,1) > 0.50). The approach also produces temporal gait parameters with low mean absolute error. It will be useful for monitoring changes in gait and for evaluating the effectiveness of interventions, such as rehabilitation or training programs, in the community.
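The abstract does not include implementation details, but the pipeline it describes (per-frame landmark extraction with MediaPipe Pose, followed by gait-event detection to derive temporal parameters) can be sketched. The snippet below is a minimal illustration, not the authors' method: the video file name, the choice of the right heel landmark, and the heel-strike heuristic (local maxima of the normalized image y coordinate, assumed at least 0.5 s apart) are all assumptions introduced here.

```python
import cv2
import numpy as np
import mediapipe as mp
from scipy.signal import find_peaks

mp_pose = mp.solutions.pose

cap = cv2.VideoCapture("walk.mp4")   # hypothetical sagittal-view walking clip
fps = cap.get(cv2.CAP_PROP_FPS)

heel_y = []
with mp_pose.Pose(static_image_mode=False, model_complexity=1) as pose:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
        result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.pose_landmarks:
            lm = result.pose_landmarks.landmark[mp_pose.PoseLandmark.RIGHT_HEEL]
            heel_y.append(lm.y)          # normalized coords; y grows downward
        else:
            # Assumption: carry the last value when detection fails on a frame.
            heel_y.append(heel_y[-1] if heel_y else 0.5)
cap.release()

y = np.asarray(heel_y)
# Heel strikes approximated as local maxima of y (heel at its lowest point);
# distance=0.5*fps assumes successive right heel strikes are >= 0.5 s apart.
strikes, _ = find_peaks(y, distance=int(0.5 * fps))
stride_times = np.diff(strikes) / fps    # right stride time per gait cycle
print(f"mean right stride time: {stride_times.mean():.3f} s")
```

Agreement between such video-derived parameters and a reference system like Vicon can then be quantified with ICC(2,1) (two-way random effects, absolute agreement, single measures), for example via pingouin.intraclass_corr.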
