Exploring Student Emotion via Facial Expressions Using Transfer Learning
DOI: https://doi.org/10.58459/icce.2024.5030

Abstract
Recognizing student emotions can significantly enhance the learning process. This study investigates the effectiveness of transfer learning with VGG-16 and ResNet-18 models for classifying student emotions from facial expressions. Leveraging pre-trained models and employing cross-validation, we achieved a robust valence classification accuracy of 92% on the FER2013 dataset. However, when applied to the MAHNOB-HCI and ACADEMO datasets, which are characterized by limited and subtle emotional cues, performance declined to approximately 82% and the models showed signs of overfitting. To enhance generalization and mitigate overfitting, we propose strategies such as data augmentation, regularization techniques, and hyperparameter optimization. Our findings demonstrate the effectiveness of transfer learning for recognizing student emotions, which may significantly impact education through personalized learning, improved student engagement, and early intervention.
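For illustration, the sketch below shows one common way to set up the kind of transfer learning described above: loading an ImageNet-pretrained ResNet-18, freezing its convolutional backbone, and training only a new classification head for valence. This is a minimal example, not the authors' actual pipeline; the class count, freezing strategy, and learning rate are assumptions.

```python
import torch
import torch.nn as nn
from torchvision import models

# Assumption: binary valence classification (positive vs. negative).
NUM_CLASSES = 2

# Load a ResNet-18 pre-trained on ImageNet.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)

# Freeze the pre-trained feature extractor so only the new head is trained.
for param in model.parameters():
    param.requires_grad = False

# Replace the final fully connected layer with a trainable valence head.
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Optimize only the parameters of the new head; hyperparameters are illustrative.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()
```

The same pattern applies to VGG-16 by replacing the last layer of its classifier module instead of `fc`; cross-validation would wrap this setup in a loop over dataset folds.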