Automatic detection of speech disorder in dysarthria using extended speech feature extraction and neural networks classification

Ijitona, T B and Soraghan, J J and Lowit, A and Di-Caterina, G and Yue, H (2017) Automatic detection of speech disorder in dysarthria using extended speech feature extraction and neural networks classification. In: The 3rd International Conference on Intelligent Signal Processing, 2017-12-04 - 2017-12-05, Savoy Place. (https://doi.org/10.1049/cp.2017.0360)

Filename: Ijitona_etal_ICISP_2017_Automatic_detection_of_speech_disorder_in_dysarthria.pdf
Accepted Author Manuscript
License: Strathprints license 1.0


Abstract

This paper presents the automatic detection of dysarthria, a motor speech disorder, using extended speech features called centroid formants. Centroid formants are the weighted averages of the formants extracted from a speech signal. The method extracts the first four formants of a speech signal and averages their weighted values, where the weights are determined by the peak energies of the resonant frequency bands (the formants). The resulting weighted averages are the centroid formants. In the proposed methodology, these centroid formants are used to automatically detect dysarthric speech with a neural network classification technique. Experimental results obtained by testing this algorithm are presented. The experimental data consist of 200 speech samples from 10 dysarthric speakers and 200 speech samples from 10 age-matched healthy speakers. The results show high classification performance using neural networks. Possible future research related to this work includes using these extended features for speaker identification and for recognition of disordered speech.
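To make the feature definition concrete, the sketch below shows one way the centroid-formant idea described in the abstract could be computed per speech frame: formants are estimated from an LPC model, the first four are kept, and a single centroid is formed as their energy-weighted average. The frame windowing, LPC order, formant-pruning thresholds and the use of the LPC spectral envelope for the peak energies are assumptions for illustration, not the authors' exact settings.

    import numpy as np
    from scipy.linalg import solve_toeplitz
    from scipy.signal import freqz

    def lpc_coefficients(frame, order=10):
        """Estimate LPC coefficients via the autocorrelation method."""
        frame = frame * np.hamming(len(frame))
        r = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
        a = solve_toeplitz(r[:order], r[1:order + 1])   # predictor coefficients
        return np.concatenate(([1.0], -a))              # A(z) = 1 - sum a_k z^-k

    def centroid_formant(frame, fs, n_formants=4, order=10):
        """Energy-weighted average of the first few formant frequencies."""
        a = lpc_coefficients(frame, order)
        roots = np.roots(a)
        roots = roots[np.imag(roots) > 0]               # one root per resonance
        freqs = np.angle(roots) * fs / (2 * np.pi)      # formant frequencies (Hz)
        bw = -fs / np.pi * np.log(np.abs(roots))        # formant bandwidths (Hz)
        idx = np.argsort(freqs)
        freqs, bw = freqs[idx], bw[idx]
        keep = (freqs > 90) & (bw < 400)                # discard spurious peaks
        freqs = freqs[keep][:n_formants]
        if freqs.size == 0:
            return 0.0
        # peak energy of the LPC spectral envelope at each formant frequency
        w, h = freqz([1.0], a, worN=2048, fs=fs)
        energies = np.interp(freqs, w, np.abs(h) ** 2)
        return float(np.sum(freqs * energies) / np.sum(energies))

Frame-level centroid formants computed this way would then form the feature vectors passed to the neural network classifier described in the paper.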

ORCID iDs

Ijitona, T B (ORCID: https://orcid.org/0000-0003-1801-8201), Soraghan, J J (ORCID: https://orcid.org/0000-0003-4418-7391), Lowit, A (ORCID: https://orcid.org/0000-0003-0842-584X), Di-Caterina, G (ORCID: https://orcid.org/0000-0002-7256-0897) and Yue, H (ORCID: https://orcid.org/0000-0003-2072-6223)