Abstract
A facial expression can be viewed as the combination of an expressive component and a neutral component of a person's face. In this paper, we propose to recognize facial expressions by extracting information about the expressive component through a de-expression learning procedure, called De-expression Residue Learning (DeRL). First, a generative model is trained by a conditional generative adversarial network (cGAN). This model generates the corresponding neutral face image for any input face image. We call this procedure de-expression because the expressive information is filtered out by the generative model; however, that information remains recorded in the intermediate layers. Given the generated neutral face image, and unlike previous works that use pixel-level or feature-level differences for facial expression classification, our method learns the deposition (or residue) that remains in the intermediate layers of the generative model. This residue is essential, as it captures the expressive component deposited in the generative model by any input facial expression image. Seven public facial expression databases are employed in our experiments. With two databases (BU-4DFE and BP4D-spontaneous) used for pre-training, the DeRL method has been evaluated on five databases: CK+, Oulu-CASIA, MMI, BU-3DFE, and BP4D+. The experimental results demonstrate the superior performance of the proposed method.
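To make the residue idea concrete, the following is a minimal sketch, assuming a PyTorch-style encoder-decoder generator, 64x64 grayscale inputs, seven expression classes, and a simple summation over intermediate layers; the module names and the fusion scheme are illustrative assumptions, not the authors' released implementation.

    # Illustrative sketch of De-expression Residue Learning (DeRL).
    # Assumptions: PyTorch, 64x64 grayscale faces, 7 expression classes.
    import torch
    import torch.nn as nn

    class DeExpressionGenerator(nn.Module):
        """Encoder-decoder generator: expressive face -> neutral face.
        In DeRL this generator would be pre-trained adversarially with a cGAN."""
        def __init__(self):
            super().__init__()
            self.enc1 = nn.Sequential(nn.Conv2d(1, 32, 4, 2, 1), nn.ReLU())          # 64 -> 32
            self.enc2 = nn.Sequential(nn.Conv2d(32, 64, 4, 2, 1), nn.ReLU())         # 32 -> 16
            self.dec1 = nn.Sequential(nn.ConvTranspose2d(64, 32, 4, 2, 1), nn.ReLU())  # 16 -> 32
            self.dec2 = nn.Sequential(nn.ConvTranspose2d(32, 1, 4, 2, 1), nn.Tanh())   # 32 -> 64

        def forward(self, x):
            e1 = self.enc1(x)
            e2 = self.enc2(e1)
            d1 = self.dec1(e2)
            neutral = self.dec2(d1)
            # The intermediate feature maps carry the "residue" of the removed expression.
            return neutral, [e2, d1]

    class ResidueClassifier(nn.Module):
        """Predicts the expression from the residue left in the generator's
        intermediate layers, rather than from pixel- or feature-level differences."""
        def __init__(self, channels=(64, 32), num_classes=7):
            super().__init__()
            self.heads = nn.ModuleList(
                nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(c, num_classes))
                for c in channels
            )

        def forward(self, residues):
            # Sum the per-layer logits; the exact fusion of intermediate layers
            # here is a simplification for illustration.
            return sum(head(r) for head, r in zip(self.heads, residues))

    # Usage: de-expression first (generator frozen after cGAN pre-training),
    # then classify the expression from the residue.
    gen, clf = DeExpressionGenerator().eval(), ResidueClassifier()
    faces = torch.randn(8, 1, 64, 64)            # batch of expressive face images
    with torch.no_grad():
        neutral_faces, residues = gen(faces)
    logits = clf(residues)                       # expression prediction from residue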
Recommended Citation
H. Yang et al., "Facial Expression Recognition By De-expression Residue Learning," Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp. 2168 - 2177, article no. 8578329, Institute of Electrical and Electronics Engineers, Dec 2018.
The definitive version is available at https://doi.org/10.1109/CVPR.2018.00231
Department(s)
Computer Science
International Standard Book Number (ISBN)
978-1-5386-6420-9
International Standard Serial Number (ISSN)
1063-6919
Document Type
Article - Conference proceedings
Document Version
Citation
File Type
text
Language(s)
English
Rights
© 2023 Institute of Electrical and Electronics Engineers, All rights reserved.
Publication Date
14 Dec 2018