Real-time Adaptive Learning Environments Using Gaze and Emotion Recognition: Engagement and Learning Outcomes

Authors

  • Aboul Hassane CISSE, Graduate School of Informatics, Osaka Metropolitan University

DOI:

https://doi.org/10.58459/icce.2024.5044

Abstract

This study explores the integration of real-time adaptive learning environments with gaze tracking and emotion recognition technologies to enhance student engagement and learning outcomes. By leveraging artificial intelligence (AI) and machine learning, the research aims to develop a framework that dynamically adjusts instructional strategies based on students' cognitive and emotional states. The research pursues three objectives: integrating these technologies, evaluating their impact on engagement and learning, and assessing the effectiveness of automated communication strategies: Affective Backchannels (AB), Conversational Strategies (CS), and their combination (AB+CS). Using a mixed-methods approach, data from approximately 30 university students in digital learning environments will be analyzed. The findings are expected to provide valuable insights into the application of these technologies in education, potentially informing future educational policies and practices.
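As a concrete illustration of the adaptation loop described in the abstract, the sketch below shows one way a framework might map gaze-derived engagement and recognized affect onto the three communication strategies (AB, CS, AB+CS). The class names, signal definitions, and thresholds are illustrative assumptions for exposition, not the study's actual implementation.

    # Hypothetical sketch of strategy selection from gaze and emotion signals.
    # All names and thresholds here are assumptions, not the authors' system.
    from dataclasses import dataclass
    from enum import Enum


    class Strategy(Enum):
        AB = "affective_backchannel"     # empathetic acknowledgement
        CS = "conversational_strategy"   # task-focused prompt or hint
        AB_CS = "combined"               # both forms of support together
        NONE = "no_intervention"


    @dataclass
    class LearnerState:
        gaze_on_task: float      # fraction of recent gaze samples on the learning content (0-1)
        negative_affect: float   # confidence of frustration/confusion from emotion recognition (0-1)


    def select_strategy(state: LearnerState,
                        gaze_threshold: float = 0.6,
                        affect_threshold: float = 0.5) -> Strategy:
        """Map cognitive (gaze) and emotional signals to a communication strategy."""
        disengaged = state.gaze_on_task < gaze_threshold
        distressed = state.negative_affect > affect_threshold
        if disengaged and distressed:
            return Strategy.AB_CS   # combined affective and conversational support
        if distressed:
            return Strategy.AB      # affective backchannel only
        if disengaged:
            return Strategy.CS      # conversational redirection only
        return Strategy.NONE


    if __name__ == "__main__":
        # Example: a learner who is mostly off-task and showing signs of frustration.
        print(select_strategy(LearnerState(gaze_on_task=0.35, negative_affect=0.7)))
        # -> Strategy.AB_CS

In a real system, the gaze and affect inputs would be streamed from an eye tracker and a facial emotion recognition model and smoothed over a time window before a strategy is chosen; the rule-based selection above stands in for whatever learned or hand-tuned policy the framework ultimately uses.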

Published

2024-11-25

How to Cite

Real-time Adaptive Learning Environments Using Gaze and Emotion Recognition: Engagement and Learning Outcomes. (2024). International Conference on Computers in Education. https://doi.org/10.58459/icce.2024.5044