Machine translation by DeepL

Privacy in education

Nowadays, almost everything in education happens digitally. Teaching materials are offered digitally, digital testing programmes are used, pupils' development is recorded in a pupil monitoring system and communication is digital. As digitisation in education increases, the traditional analogue space in which children can develop freely keeps shrinking, and the risks of excessive data collection and data misuse grow. To provide a safe and reliable learning environment, educational institutions have a responsibility to handle the data they process carefully and securely. This is particularly important because education also involves the data of minors, who are especially vulnerable. Below, in brief, are some of Privacy First's current concerns about privacy in education.

Large-scale data collection and retention

Educational institutions collect, store and share large amounts of data on current and former students. This includes school-related information, such as test results and a student's attendance and behaviour during lessons. Health data, private circumstances and special needs of a student are also processed by educational institutions. This is sensitive, or even special, personal data of vulnerable pupils, and its misinterpretation or misuse could harm a child's development, so protecting it is especially important. Sustained attention to the protection of this data and a legal framework that takes the rights of the child into account can contribute to this, such as the Children's DPIA by Privacy at School, which won the Incentive Award at the Dutch Privacy Awards last year. Privacy First therefore invites organisations within education that have a privacy-friendly initiative or solution to sign up for the Privacy Awards 2024!

AI targeting children

Artificial intelligence (AI) systems are developing at a breakneck pace, and AI is playing an increasing role in education. Students use generative AI to produce essays and presentations, among other things. In a newsletter of 14 September 2023, concerns were raised about the handling of personal data in organisations using AI, especially when AI targets children. As these AI systems continue to develop, it is vital to better understand their impact and risks in the education sector. The European Commission has therefore prepared ethical guidelines for teachers on the use of AI and data in teaching and learning.

Big Tech in education

Educational institutions often engage external service providers for various digital education tools and platforms. The use of these platforms raises concerns about how students' personal data is handled. Large vendors such as Google and Microsoft often hold a dominant position and lack transparency, which makes it difficult for educational institutions to enforce proper safeguards and maintain control over data. As a result, students may end up ceding their data to these large vendors. Privacy First therefore supports the petition of the Coalition Fair Digital Education (CEDO), which calls for an alternative design of digital learning environments that safeguards public values and individual autonomy.

Privacy in education is an important area of focus for Privacy First in the coming years. If you would like to become active for us in this area as a volunteer or advisor, please contact us!