
Quantitative analysis of facial paralysis using local binary patterns in biomedical videos

He, Shu and Soraghan, J.J. and O'Reilly, Brian and Xing, D. (2009) Quantitative analysis of facial paralysis using local binary patterns in biomedical videos. IEEE Transactions on Biomedical Engineering, 56 (7). pp. 1864-1870. ISSN 0018-9294

Full text not available in this repository.

Abstract

Facial paralysis is the loss of voluntary muscle movement on one side of the face. A quantitative, objective, and reliable assessment system would be an invaluable tool for clinicians treating patients with this condition. This paper presents a novel framework for the objective measurement of facial paralysis. The motion information in the horizontal and vertical directions and the appearance features on the apex frames are extracted based on local binary patterns (LBPs) in the temporal-spatial domain of each facial region. These features are temporally and spatially enhanced by the application of novel block processing schemes. A multiresolution extension of uniform LBP is proposed to efficiently combine micropatterns and large-scale patterns into a single feature vector. The symmetry of facial movements is measured by the resistor-average distance (RAD) between LBP features extracted from the two sides of the face. A support vector machine is applied to provide a quantitative evaluation of facial paralysis on the House-Brackmann (H-B) scale. The proposed method is validated by experiments on 197 subject videos, demonstrating its accuracy and efficiency.
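The core symmetry measurement described in the abstract can be illustrated with a minimal sketch: compute an LBP histogram for each side of the face and compare the two histograms with the resistor-average distance, which combines the two directed Kullback-Leibler divergences harmonically (RAD(p, q) = (D(p‖q)⁻¹ + D(q‖p)⁻¹)⁻¹). This is not the authors' implementation — it uses a basic 8-neighbour LBP rather than their multiresolution uniform extension, and all function names here are illustrative:

```python
import numpy as np

def lbp_codes(img):
    """Basic 8-neighbour LBP codes for the interior pixels of a 2-D array."""
    c = img[1:-1, 1:-1]
    # Neighbour offsets, clockwise from the top-left pixel.
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(shifts):
        nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        # Set the bit if the neighbour is at least as bright as the centre.
        codes |= (nb >= c).astype(np.uint8) << bit
    return codes

def lbp_histogram(img, bins=256):
    """Normalised LBP histogram, lightly smoothed to avoid empty bins."""
    h, _ = np.histogram(lbp_codes(img), bins=bins, range=(0, bins))
    h = h.astype(float) + 1e-9
    return h / h.sum()

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) for strictly positive histograms."""
    return float(np.sum(p * np.log(p / q)))

def resistor_average_distance(p, q):
    """RAD(p, q): harmonic combination of the two directed KL divergences."""
    d_pq, d_qp = kl(p, q), kl(q, p)
    return 1.0 / (1.0 / d_pq + 1.0 / d_qp)
```

In the framework's terms, one would extract histograms from corresponding regions on the left and right sides of the face (mirroring one side) and feed the resulting RAD values, per region, into the SVM that assigns an H-B grade. A larger RAD indicates greater asymmetry between the two sides.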