Facial Emotional Perception Training for Medical Students

PP. 1792-1801
DOI: 10.4236/ce.2019.108128

ABSTRACT

Introduction: The aim of this study was to determine the applicability and efficacy of a tool for improving medical students’ ability to recognize facial expressions of emotion. Material and methods: We conducted a controlled intervention study with 98 medical students. The control group underwent a 45-minute class whose content was not targeted at the rehabilitation objective, while the intervention group was trained to recognize emotions through tutorials and games on eye, face and microexpression recognition using the https://www.e-motionaltraining.com/ program. All participants were assessed before and after the intervention with the Ekman Faces Test. Results: When comparing the scores between the groups, we observed significant pre- to post-intervention improvements in the intervention group for the emotions of anger (p < 0.001), sadness (p = 0.031) and fear (p = 0.007). Moreover, when comparing the students whose pretest scores fell in Q1 vs. Q4, there were significant differences in change scores for happiness and anger (p = 0.005 and p = 0.003, respectively), with the students who achieved lower results in the initial assessment (Q1) showing greater improvement. Conclusions: We can conclude that training medical students to recognize facial emotions as part of their routine classroom training is feasible. The participants who benefited most were those who initially had lower scores in emotion recognition.

Share and Cite:

Vázquez-Campo, M., Vidal, L., Torres, A., Mateos, R., Olivares, J., García-Lado, I. and García-Caballero, A. (2019) Facial Emotional Perception Training for Medical Students. Creative Education, 10, 1792-1801. doi: 10.4236/ce.2019.108128.

Copyright © 2024 by authors and Scientific Research Publishing Inc.

Creative Commons License

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.