Privacy in the Age of Robotics: Protecting Personal Data in Classrooms
Abstract
This article explores the privacy challenges posed by artificial intelligence in embodied robotic systems and proposes technical, design, and governance responses. Robots generate raw sensor data and derived inferences, creating distinctive risks in human-robot interaction such as incidental capture, inferential leakage, algorithmic bias, and third-party exposure. Existing consent models and legal frameworks (GDPR, CCPA, EU AI Act) provide only partial protection, especially in contexts where robots operate persistently and without meaningful choice for users or bystanders. A classroom case study illustrates these concerns, showing how educational robots can expose children and teachers to privacy harms while also pointing to mitigation strategies, including privacy-by-design, transparency indicators, configurable controls, and privacy-enhancing technologies such as edge AI and federated learning. The discussion emphasizes interdisciplinary collaboration, participatory deployment, and privacy impact assessments. The article concludes that embedding dignity and digital self-determination into system design and governance is essential for aligning innovation with accountability and trust.
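To make the abstract's reference to federated learning concrete, the following is a minimal, illustrative sketch (not taken from the article) of federated averaging: each classroom robot keeps its raw sensor-derived records on device, trains a local model, and shares only model weights, which a server aggregates. All names, data, and parameters here are hypothetical.

```python
# Hypothetical federated-averaging sketch: raw data never leaves each "robot".
import numpy as np

rng = np.random.default_rng(0)


def local_train(weights, features, labels, lr=0.1, epochs=20):
    """One robot's on-device update of a shared linear model via gradient descent."""
    w = weights.copy()
    for _ in range(epochs):
        grad = features.T @ (features @ w - labels) / len(labels)
        w -= lr * grad
    return w  # only the updated weights leave the device


def federated_average(client_weights, client_sizes):
    """Server-side aggregation: weight each robot's model by its sample count."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))


# Simulate three classroom robots, each holding private local data.
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(40, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=40)
    clients.append((X, y))

global_w = np.zeros(2)
for _ in range(10):  # federated rounds
    updates = [local_train(global_w, X, y) for X, y in clients]
    global_w = federated_average(updates, [len(y) for _, y in clients])

print("aggregated weights:", global_w)  # approaches true_w without pooling raw data
```

The design point mirrors the abstract's argument: the aggregation step only ever sees model parameters, so the students' and teachers' underlying records stay on the device that captured them.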
Published
2025-12-01
Conference Proceedings Volume
Section
Articles
How to Cite
Privacy in the Age of Robotics: Protecting Personal Data in Classrooms. (2025). International Conference on Computers in Education. https://library.apsce.net/index.php/ICCE/article/view/5674