Measuring Semantic Fidelity in Student Discussion Posts
Abstract
Asynchronous discussion boards are central to many online and blended learning environments, yet assessing the quality of student contributions remains a challenge, particularly in capturing both literal adherence to prompts and deeper conceptual engagement. This paper introduces a layered framework for measuring semantic fidelity in student discussion posts, defined as the extent to which a post aligns with both the instructional prompt (literal fidelity) and the broader course themes (inferential fidelity). Using natural language processing (NLP), we operationalize this framework through two computational lenses: semantic similarity based on sentence-level embeddings to measure prompt adherence, and zero-shot topic modeling with the RoBERTa-large-MNLI model to assess alignment with course themes. A two-dimensional semantic fidelity quadrant is used to visualize and classify student posts based on their scores across these dimensions. We applied this approach to discussion data from 88 students (44 undergraduate and 44 graduate) drawn from a data-focused undergraduate course and a development-focused graduate cohort. The results revealed clear differences in engagement profiles. Undergraduate posts tended to score higher on literal prompt similarity but lower on inferential alignment, often restating questions without integrating broader concepts. In contrast, graduate responses were more evenly distributed, with many demonstrating conceptual alignment despite diverging from the prompt's wording, highlighting a more reflective and applied discourse style. This analytic framework offers educators a scalable and interpretable tool to better understand student thinking, surface off-prompt yet pedagogically rich contributions, and inform more equitable feedback practices. It also advances trustworthy learning analytics by balancing automated insight with human pedagogical goals. The study concludes with implications for instructional design, recommendations for adaptive scoring thresholds, and directions for extending the framework to multi-turn discussions and real-time learning dashboards.
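To make the two lenses concrete, the sketch below shows one plausible Python implementation. It is illustrative only: the abstract names RoBERTa-large-MNLI for zero-shot theme classification, but the specific sentence-embedding checkpoint (all-MiniLM-L6-v2), the example theme labels, and the 0.5 quadrant thresholds are assumptions for demonstration, not details reported in the paper.

```python
# Illustrative sketch of the two semantic-fidelity lenses described in the abstract.
# Assumptions beyond the abstract: embedding checkpoint, theme labels, thresholds.
from sentence_transformers import SentenceTransformer, util
from transformers import pipeline

# Sentence-level embedding model (checkpoint choice is an assumption;
# the abstract specifies only "sentence-level embeddings").
embedder = SentenceTransformer("all-MiniLM-L6-v2")

# Zero-shot classifier built on RoBERTa-large-MNLI, as named in the abstract.
classifier = pipeline("zero-shot-classification", model="roberta-large-mnli")

def literal_fidelity(post: str, prompt: str) -> float:
    """Cosine similarity between post and prompt embeddings (prompt adherence)."""
    post_vec, prompt_vec = embedder.encode([post, prompt], convert_to_tensor=True)
    return util.cos_sim(post_vec, prompt_vec).item()

def inferential_fidelity(post: str, course_themes: list[str]) -> float:
    """Highest zero-shot entailment score across course themes (theme alignment)."""
    result = classifier(post, candidate_labels=course_themes)
    return max(result["scores"])

def quadrant(lit: float, inf: float, t_lit: float = 0.5, t_inf: float = 0.5) -> str:
    """Place a post in the two-dimensional semantic fidelity quadrant.
    The 0.5 thresholds are placeholders; the paper recommends adaptive thresholds."""
    if lit >= t_lit and inf >= t_inf:
        return "high literal / high inferential"
    if lit >= t_lit:
        return "high literal / low inferential"
    if inf >= t_inf:
        return "low literal / high inferential"
    return "low literal / low inferential"

# Hypothetical usage: themes here are invented for illustration.
themes = ["data ethics", "statistical inference", "software development practice"]
post = "Reusing scraped data without consent raises the same ethics issues we saw in week 3."
prompt = "Discuss one ethical risk of working with scraped datasets."
print(quadrant(literal_fidelity(post, prompt), inferential_fidelity(post, themes)))
```

Scoring each post on both axes and comparing the scores against per-course thresholds reproduces the quadrant classification described above; in a real deployment the thresholds would be tuned rather than fixed at 0.5, in line with the paper's recommendation for adaptive scoring thresholds.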
Published
2025-09-05