Understanding Learner Interaction Analytics through Data Visualization in Metaverse-Based Learning Environments
Abstract
Metaverse-Based Learning Environments (MBLEs) generate rich, high-volume multimodal interaction data (e.g., gaze, gesture, speech, movement) that can reveal learner engagement patterns. Yet instructors lack tools that can cluster and visualize these data in an actionable, real-time manner. This doctoral research aims to develop a data visualization framework that integrates unsupervised clustering of multimodal student interactions with adaptive, immersive dashboards to support pedagogical decision-making. The study proceeds in three phases: (1) establishing a validated operational framework for MBLEs, (2) designing and evaluating a clustering pipeline for multimodal learner data, and (3) prototyping immersive dashboards that reduce cognitive load and enhance instructional responsiveness. Preliminary outcomes include a Unity-based data capture prototype. This work advances immersive analytics, multimodal learning analytics, and Human–Computer Interaction by linking embodied learner behaviors to adaptive visual feedback.
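To make the clustering step concrete, the sketch below shows one way unsupervised clustering could group per-learner multimodal feature vectors. This is an illustrative example only, not the study's actual pipeline: the feature names (gaze fixation rate, gestures per minute, speech ratio, movement) and the plain k-means implementation are assumptions for demonstration.

```python
# Illustrative sketch: k-means clustering of hypothetical per-learner
# multimodal feature vectors (not the study's actual pipeline or data).
import random


def kmeans(points, k, iters=50, seed=0):
    """Plain k-means over lists of floats; returns (centroids, labels)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize from k distinct points
    labels = [0] * len(points)
    for _ in range(iters):
        # assign each point to its nearest centroid (squared Euclidean distance)
        for i, p in enumerate(points):
            labels[i] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
        # recompute each centroid as the mean of its assigned points
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:
                centroids[c] = [sum(dim) / len(members) for dim in zip(*members)]
    return centroids, labels


# Hypothetical per-learner vectors:
# [gaze fixation rate, gestures/min, speech ratio, movement]
learners = [
    [0.8, 5.0, 0.6, 1.2],  # highly engaged
    [0.7, 4.5, 0.5, 1.0],
    [0.2, 0.5, 0.1, 0.2],  # disengaged
    [0.1, 0.8, 0.0, 0.3],
]
_, labels = kmeans(learners, k=2)
```

With well-separated synthetic vectors like these, the two engaged learners land in one cluster and the two disengaged learners in the other; a real pipeline would add feature normalization and a method for choosing k.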
Published: 2025-12-01
Section: Articles
How to Cite
Understanding Learner Interaction Analytics through Data Visualization in Metaverse-Based Learning Environments. (2025). International Conference on Computers in Education. https://library.apsce.net/index.php/ICCE/article/view/5694