Deep convolution network based emotion analysis towards mental health care

Fei, Zixiang and Yang, Erfu and Li, David Day Uei and Butler, Stephen and Ijomah, Winifred and Li, Xia and Zhou, Huiyu (2020) Deep convolution network based emotion analysis towards mental health care. Neurocomputing, 388. pp. 212-227. ISSN 0925-2312 (https://doi.org/10.1016/j.neucom.2020.01.034)

Text. Filename: Fei_etal_Neurocomputing_2020_Deep_convolution_network_based_emotion_analysis_towards_mental.pdf
Accepted Author Manuscript
License: Creative Commons Attribution-NonCommercial-NoDerivatives 4.0


Abstract

Facial expressions play an important role in communication, allowing information about an individual's emotional state to be conveyed and inferred. Research suggests that automatic facial expression recognition is a promising avenue of enquiry in mental healthcare, as facial expressions can also reflect an individual's mental state. In order to develop user-friendly, low-cost and effective facial expression analysis systems for mental health care, this paper presents a novel deep convolution network based emotion analysis framework to support mental state detection and diagnosis. The proposed system is able to process facial images and interpret the temporal evolution of emotions through a new solution in which deep features are extracted from the fully connected layer 6 (FC6) of AlexNet, and a standard Linear Discriminant Analysis (LDA) classifier is used to obtain the final classification outcome. The system is evaluated on five benchmark databases: JAFFE, KDEF and CK+, as well as databases with images obtained 'in the wild', FER2013 and AffectNet. Compared with other state-of-the-art methods, our method achieves higher overall facial expression recognition accuracy. Additionally, when compared to state-of-the-art deep learning models such as VGG16, GoogLeNet, ResNet and AlexNet, the proposed method demonstrates better efficiency and lower hardware requirements. The experiments presented in this paper demonstrate that the proposed method outperforms the other methods in terms of accuracy and efficiency, suggesting it could act as a smart, low-cost, user-friendly cognitive aid to detect, monitor and diagnose a patient's mental health through automatic facial expression analysis.
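The core pipeline described in the abstract, deep features taken from AlexNet's FC6 layer and classified with LDA, can be approximated with off-the-shelf tools. The following is a minimal sketch, not the authors' implementation, assuming torchvision's pretrained AlexNet (whose classifier index 1 corresponds to the 4096-dimensional FC6 layer) and scikit-learn's LDA; the file paths and labels are hypothetical placeholders, since the benchmark databases (JAFFE, KDEF, CK+, FER2013, AffectNet) are not bundled here.

```python
# Sketch only: AlexNet FC6 features + LDA classifier, approximating the
# pipeline described in the abstract with torchvision and scikit-learn.
import numpy as np
import torch
import torch.nn as nn
from torchvision import models, transforms
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from PIL import Image

# Pretrained AlexNet, truncated so the forward pass stops at FC6
# (classifier children 0-1: Dropout followed by the 9216 -> 4096 Linear layer).
alexnet = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
alexnet.classifier = nn.Sequential(*list(alexnet.classifier.children())[:2])
alexnet.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),          # AlexNet input size in torchvision
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def fc6_features(image_paths):
    """Return an (N, 4096) array of FC6 features for a list of face images."""
    feats = []
    with torch.no_grad():
        for path in image_paths:
            img = Image.open(path).convert("RGB")
            x = preprocess(img).unsqueeze(0)      # shape (1, 3, 224, 224)
            feats.append(alexnet(x).squeeze(0).numpy())
    return np.stack(feats)

# Hypothetical training/test data: lists of face-image paths and emotion labels.
train_paths, train_labels = ["happy_01.png", "sad_01.png"], [0, 1]
test_paths = ["unknown_01.png"]

# Fit a standard LDA classifier on the FC6 features and predict emotions.
lda = LinearDiscriminantAnalysis()
lda.fit(fc6_features(train_paths), train_labels)
predicted_emotions = lda.predict(fc6_features(test_paths))
```

Because AlexNet is only used as a fixed feature extractor and LDA is a lightweight linear classifier, this arrangement avoids end-to-end fine-tuning, which is consistent with the abstract's claim of better efficiency and lower hardware requirements than training full VGG16-, GoogLeNet- or ResNet-style models.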

ORCID iDs

Fei, Zixiang; Yang, Erfu ORCID: https://orcid.org/0000-0003-1813-5950; Li, David Day Uei ORCID: https://orcid.org/0000-0002-6401-4263; Butler, Stephen ORCID: https://orcid.org/0000-0002-2103-0773; Ijomah, Winifred; Li, Xia; and Zhou, Huiyu.