A holistic visualisation solution to understanding multimodal data in an educational metaverse platform – Learningverse

Authors

  • Yanjie SONG
  • Jiaxin CAO
  • Lei TAO
  • Dragan Gašević

DOI:

https://doi.org/10.58459/icce.2023.1501

Abstract

Traditional digital learning environments face challenges in obtaining comprehensive user interaction data, often yielding fragmented insights without a cohesive visual representation. The emergence of metaverse platforms has enriched this landscape, enabling detailed representation of user activity with multimodal data through avatars. However, understanding the multimodal data related to teaching, social, and cognitive presences, underpinned by the Community of Inquiry theoretical framework, remains a major challenge for educators in the metaverse. This study introduces a holistic visualisation solution to bridge this gap and support a better understanding of avatars’ behaviours in an educational metaverse platform – Learningverse, developed by our research team. The solution captures a range of multimodal data in Learningverse, such as avatar location, behaviours, emotions, and conversation. Key visualisation elements include heatmaps, points, and arrows, each with distinct informational value. Our next step is to integrate the solution with multimodal learning analytics to understand teaching, social, and cognitive presences.
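To illustrate one of the visualisation elements mentioned in the abstract, the sketch below shows how logged avatar positions could be aggregated into a location heatmap. This is a minimal illustration only: the coordinate format, sample data, and plotting choices are assumptions for demonstration and do not reflect the Learningverse data schema or implementation.

```python
# Illustrative sketch: binning avatar (x, y) positions into a 2D heatmap.
# The data below is synthetic; field names and scene coordinates are assumptions.
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical sample of avatar positions (x, y) in scene coordinates.
rng = np.random.default_rng(42)
positions = rng.normal(loc=[5.0, 3.0], scale=[2.0, 1.5], size=(500, 2))

# Bin positions into a grid; denser cells indicate where avatars spent more time.
heatmap, xedges, yedges = np.histogram2d(positions[:, 0], positions[:, 1], bins=40)

plt.imshow(heatmap.T, origin="lower",
           extent=[xedges[0], xedges[-1], yedges[0], yedges[-1]],
           cmap="hot", aspect="auto")
plt.colorbar(label="Avatar visits per cell")
plt.xlabel("Scene x")
plt.ylabel("Scene y")
plt.title("Avatar location heatmap (illustrative)")
plt.show()
```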

Downloads

Download data is not yet available.

Published

2023-12-04

How to Cite

A holistic visualisation solution to understanding multimodal data in an educational metaverse platform – Learningverse. (2023). International Conference on Computers in Education. https://doi.org/10.58459/icce.2023.1501