A depth camera motion analysis framework for tele-rehabilitation: motion capture and person-centric kinematics analysis

Ye, Minxiang and Yang, Cheng and Stankovic, Vladimir and Stankovic, Lina and Kerr, Andrew (2016) A depth camera motion analysis framework for tele-rehabilitation: motion capture and person-centric kinematics analysis. IEEE Journal on Selected Topics in Signal Processing, 10 (5). pp. 877-887. ISSN 1932-4553 (https://doi.org/10.1109/JSTSP.2016.2559446)

Accepted Author Manuscript: Ye_etal_IEEEJSTSP_2016_A_depth_camera_motion_analysis_framework_for_tele_rehabilitation.pdf

Abstract

With increasing importance given to tele-rehabilitation, there is a growing need for accurate, low-cost, and portable motion capture systems that do not require specialist assessment venues. This paper proposes a novel framework for motion capture using only a single depth camera, which is portable and cost-effective compared with most industry-standard optical systems, without compromising accuracy. Novel signal processing and computer vision algorithms are proposed to determine motion patterns of interest from infrared and depth data. To demonstrate the proposed framework's suitability for rehabilitation, we developed a gait analysis application built on the underlying motion capture sub-system. Kinematics parameters unique to each subject are calculated and stored to monitor that individual's progress through clinical therapy. Experiments were conducted on 14 subjects, 5 healthy and 9 stroke survivors. The results show very close agreement of the relevant joint angles with a 12-camera VICON system, a mean error of at most 1.75% in detecting gait events with respect to manually generated ground truth, and significant improvements in accuracy and execution time over a previous Kinect-based system.
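As a point of reference for the kind of person-centric kinematics the abstract describes, the sketch below shows one conventional way a joint angle (e.g., knee flexion) can be computed from three 3-D joint positions expressed in a depth camera's coordinate frame. This is a minimal illustrative example, not the paper's algorithm; the function name and the hip/knee/ankle coordinates are hypothetical.

```python
import numpy as np

def joint_angle(proximal, joint, distal):
    """Return the angle (degrees) at `joint` between the proximal and distal segments.

    Each argument is a 3-D position (x, y, z) in the depth camera's frame,
    e.g. hip, knee and ankle centres for a knee-flexion angle.
    """
    u = np.asarray(proximal, dtype=float) - np.asarray(joint, dtype=float)
    v = np.asarray(distal, dtype=float) - np.asarray(joint, dtype=float)
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip guards against rounding slightly outside [-1, 1] before arccos.
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Hypothetical hip/knee/ankle positions in metres, for illustration only.
hip, knee, ankle = [0.00, 0.90, 2.50], [0.05, 0.45, 2.55], [0.10, 0.05, 2.60]
print(f"Knee angle: {joint_angle(hip, knee, ankle):.1f} deg")
```

In practice such angles would be computed per frame from the tracked joint positions and the resulting time series used for gait-event detection and comparison against a reference system such as VICON.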

ORCID iDs

Ye, Minxiang (ORCID: https://orcid.org/0000-0003-0083-7145), Yang, Cheng (ORCID: https://orcid.org/0000-0002-3540-1598), Stankovic, Vladimir (ORCID: https://orcid.org/0000-0002-1075-2420), Stankovic, Lina (ORCID: https://orcid.org/0000-0002-8112-1976) and Kerr, Andrew (ORCID: https://orcid.org/0000-0002-7666-9283)