A Federated Learning Neural Network For Student Dropout Prediction
Abstract
Federated learning provides a promising approach for training neural network models across distributed datasets while mitigating the risk of data leakage. This is particularly important in educational data mining, where student records often contain highly sensitive and confidential information. To address this challenge, we propose FedAvg-NNFL, a privacy-preserving neural network model for predicting student dropout using a federated learning framework. In this decentralized setup, each participating institution trains a local model on its private data, and a global model is constructed by aggregating the locally trained weights. This approach maintains data privacy and complies with institutional data-sharing restrictions while enabling collaborative model development. We evaluated the performance of the proposed FedAvg-NNFL model using a benchmark dataset from the Polytechnic Institute of Portalegre, which includes demographic, socioeconomic, academic, and macroeconomic features. Our model was compared against both a centralized neural network and a locally trained model. FedAvg-NNFL demonstrated strong, well-balanced performance across all metrics, achieving an accuracy of 0.9280, precision of 0.9320, recall of 0.8972, F1-score of 0.9142, and AUC of 0.92. These results highlight the effectiveness of federated learning in building accurate and privacy-aware predictive models for educational applications.
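The abstract describes FedAvg-style aggregation, in which a global model is built by averaging the weights trained locally at each institution. The following is a minimal sketch of that aggregation step, assuming each institution exposes its model weights as NumPy arrays and that clients are weighted by local dataset size as in standard FedAvg; the model architecture, client count, and training details are not specified in the abstract, so the names and shapes below are illustrative only.

import numpy as np

def fedavg_aggregate(client_weights, client_sizes):
    """Weighted average of per-client model weights (FedAvg-style).

    client_weights: list of per-client weight lists, each a list of np.ndarray layers
    client_sizes:   list of int, number of local training samples per client
    """
    total = float(sum(client_sizes))
    num_layers = len(client_weights[0])
    global_weights = []
    for layer in range(num_layers):
        # Weight each client's layer by its share of the total training data.
        layer_avg = sum(
            (n / total) * w[layer] for w, n in zip(client_weights, client_sizes)
        )
        global_weights.append(layer_avg)
    return global_weights

# Hypothetical example: three institutions with different local dataset sizes.
rng = np.random.default_rng(0)
clients = [[rng.normal(size=(4, 2)), rng.normal(size=(2,))] for _ in range(3)]
sizes = [1200, 800, 500]
global_model = fedavg_aggregate(clients, sizes)
print([w.shape for w in global_model])  # [(4, 2), (2,)]

In a full federated round, the aggregated weights would be broadcast back to each institution, which then continues local training before the next aggregation.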
Published
2025-12-01
Section
Articles
How to Cite
A Federated Learning Neural Network For Student Dropout Prediction. (2025). International Conference on Computers in Education. https://library.apsce.net/index.php/ICCE/article/view/5560