Volume 20, Issue 4 (Winter 2019)                   Advances in Cognitive Sciences 2019, 20(4): 46-61



1- M.Sc. Student, Computer Software, Fouman & Shaft Islamic Azad University
2- Faculty Member, Department of Computer, Fouman & Shaft Islamic Azad University
Abstract:

Introduction: Facial expressions are one of the most important channels of human communication and response to the surrounding environment. The purpose of this study is to apply the brain emotional learning (BEL) model to facial emotion recognition. The BEL model is inspired by the limbic system of the human brain, which is responsible for human emotional responses, and it is used here to improve the recognition rate of emotional expressions of the human face. Method: The input of the proposed model is the standard JAFFE dataset, which includes six emotional expressions: happiness, sadness, anger, surprise, fear, and disgust. After the images are read using MATLAB commands, they are passed to the feature extraction step, where the PCA method is used to extract the principal components of the images. Finally, to calculate the recognition rate of facial expressions, the extracted features are fed into the classification stage of the BEL model. In applying the BEL method, a relation matrix is built for the eyebrow, eye, and mouth components, and their contribution to each emotion is determined; in this way the facial expression can be recognized. To demonstrate the efficiency of the proposed model, the BEL model is also compared with a rival SVM model. Results: Analysis of the dataset shows a facial expression recognition rate of 93.8%. Conclusion: According to the results of this research, the BEL model recognizes emotional expressions with higher accuracy than the SVM model.
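The pipeline described above (reading the JAFFE images, reducing them with PCA, and classifying the resulting features) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the directory layout, image size, number of principal components, and the use of scikit-learn's SVC as the rival SVM classifier are assumptions, and the BEL classifier proposed in the paper would replace the SVM at the final step.

# Minimal sketch of the described pipeline: JAFFE images -> PCA features -> classifier.
# Assumptions: images stored as "<data_dir>/<emotion>/<file>.tiff" (hypothetical layout),
# 40 principal components, and scikit-learn's SVC standing in for the SVM baseline.
import glob
import os

import numpy as np
from PIL import Image
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

EMOTIONS = ["happiness", "sadness", "anger", "surprise", "fear", "disgust"]

def load_jaffe(data_dir, size=(64, 64)):
    """Read grayscale face images; return an (n_samples, n_pixels) array and labels."""
    features, labels = [], []
    for label, emotion in enumerate(EMOTIONS):
        for path in glob.glob(os.path.join(data_dir, emotion, "*.tiff")):
            img = Image.open(path).convert("L").resize(size)
            features.append(np.asarray(img, dtype=np.float64).ravel())
            labels.append(label)
    return np.array(features), np.array(labels)

X, y = load_jaffe("jaffe")  # hypothetical dataset folder
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

# PCA feature extraction: project raw pixels onto the leading principal components.
pca = PCA(n_components=40).fit(X_train)
X_train_pca, X_test_pca = pca.transform(X_train), pca.transform(X_test)

# SVM baseline (the rival model in the paper); the BEL classifier would be plugged in here.
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_train_pca, y_train)
print("Recognition rate: {:.1%}".format(clf.score(X_test_pca, y_test)))

The reported 93.8% figure comes from the paper's own experiments with the BEL classifier; the sketch above only reproduces the surrounding feature extraction and the SVM comparison point.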

Type of Study: Research | Subject: Special
Received: 2017/11/13 | Accepted: 2018/02/27 | Published: 2019/01/28

Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.