Real-time Adaptive Learning Environments Using Gaze and Emotion Recognition: Engagement and Learning Outcomes
DOI: https://doi.org/10.58459/icce.2024.5044

Abstract
This study explores the integration of real-time adaptive learning environments with gaze tracking and emotion recognition technologies to enhance student engagement and learning outcomes. By leveraging artificial intelligence (AI) and machine learning, the research aims to develop a framework that dynamically adjusts instructional strategies based on students' cognitive and emotional states. The study focuses on three objectives: integrating these technologies, evaluating their impact, and assessing the effectiveness of automated communication strategies—Affective Backchannels (AB), Conversational Strategies (CS), and their combination (AB+CS). Using a mixed-methods approach, data from approximately 30 university students in digital learning environments will be analyzed. The findings are expected to provide valuable insights into the application of these technologies in education, potentially informing future educational policies and practices.
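To make the adaptive loop described above concrete, the following is a minimal, hypothetical sketch of how a controller might map a student's estimated cognitive and emotional state to one of the automated communication strategies the study compares (AB, CS, or AB+CS). All names, signals, and thresholds here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch, NOT the study's actual system: a controller that
# selects an automated communication strategy from illustrative gaze-based
# engagement and emotion-valence scores, each assumed to lie in [0, 1].

from dataclasses import dataclass


@dataclass
class StudentState:
    engagement: float  # e.g., derived from gaze fixation on content, 0..1
    valence: float     # e.g., derived from facial emotion recognition, 0..1


def select_strategy(state: StudentState) -> str:
    """Pick AB, CS, AB+CS, or no intervention from the estimated state.

    Thresholds (0.4) are arbitrary placeholders for illustration.
    """
    low_engagement = state.engagement < 0.4
    negative_affect = state.valence < 0.4
    if low_engagement and negative_affect:
        return "AB+CS"  # combine affective backchannels with conversational strategies
    if negative_affect:
        return "AB"     # affective backchannel to address emotional state
    if low_engagement:
        return "CS"     # conversational strategy to re-engage attention
    return "none"       # engaged and positive: no adjustment needed


print(select_strategy(StudentState(engagement=0.3, valence=0.2)))  # AB+CS
```

In practice, the engagement and valence estimates would come from the gaze-tracking and emotion-recognition pipelines, and strategy selection could be learned rather than rule-based; the sketch only shows the shape of the state-to-strategy mapping.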