Quantitative analysis of facial paralysis using local binary patterns in biomedical videos

He, Shu and Soraghan, J.J. and O'Reilly, Brian and Xing, D. (2009) Quantitative analysis of facial paralysis using local binary patterns in biomedical videos. IEEE Transactions on Biomedical Engineering, 56 (7). pp. 1864-1870. ISSN 0018-9294

Full text not available in this repository. (Request a copy from the Strathclyde author)

Abstract

Facial paralysis is the loss of voluntary muscle movement of one side of the face. A quantitative, objective, and reliable assessment system would be an invaluable tool for clinicians treating patients with this condition. This paper presents a novel framework for objective measurement of facial paralysis. The motion information in the horizontal and vertical directions and the appearance features on the apex frames are extracted based on the local binary patterns (LBPs) on the temporal-spatial domain in each facial region. These features are temporally and spatially enhanced by the application of novel block processing schemes. A multiresolution extension of uniform LBP is proposed to efficiently combine the micropatterns and large-scale patterns into a feature vector. The symmetry of facial movements is measured by the resistor-average distance (RAD) between LBP features extracted from the two sides of the face. A support vector machine is applied to provide quantitative evaluation of facial paralysis based on the House-Brackmann (H-B) scale. The proposed method is validated by experiments with 197 subject videos, demonstrating its accuracy and efficiency.
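
The symmetry measure described in the abstract can be illustrated with a minimal sketch: a uniform LBP histogram is computed for a facial region on each side of the face, and the two histograms are compared with the resistor-average distance (RAD), the "resistors in parallel" combination of the two directed Kullback-Leibler divergences. The sketch below is illustrative only and assumes scikit-image's local_binary_pattern and plain NumPy histograms; the paper's region extraction, temporal-spatial block processing, multiresolution LBP extension, and SVM-based H-B grading are not reproduced here.

    import numpy as np
    from skimage.feature import local_binary_pattern

    def uniform_lbp_histogram(region, n_points=8, radius=1):
        # Uniform LBP codes for one grayscale facial region; the 'uniform'
        # method yields n_points + 2 possible codes (n_points + 1 uniform
        # patterns plus one bin collecting all non-uniform patterns).
        codes = local_binary_pattern(region, n_points, radius, method="uniform")
        n_bins = n_points + 2
        hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
        return hist + 1e-10  # avoid empty bins before taking logarithms

    def kl_divergence(p, q):
        # Directed Kullback-Leibler divergence D(P || Q) between two histograms.
        return float(np.sum(p * np.log(p / q)))

    def resistor_average_distance(p, q):
        # RAD(P, Q): 1/RAD = 1/D(P||Q) + 1/D(Q||P).
        d_pq, d_qp = kl_divergence(p, q), kl_divergence(q, p)
        if d_pq + d_qp == 0.0:  # identical histograms: perfectly symmetric
            return 0.0
        return (d_pq * d_qp) / (d_pq + d_qp)

    # Example (hypothetical data): compare the left half of a face region with
    # the mirrored right half; 'face' is a 2-D grayscale NumPy array.
    # left, right = face[:, : face.shape[1] // 2], face[:, face.shape[1] // 2 :]
    # score = resistor_average_distance(uniform_lbp_histogram(left),
    #                                   uniform_lbp_histogram(np.fliplr(right)))

Larger scores indicate greater asymmetry between the two sides; in the paper's framework, per-region scores of this kind are the features from which the SVM predicts the House-Brackmann grade.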