Too young for TikTok?
There are growing concerns in society about the impact of social media on young people, including exposure to harmful content and the influence of addictive algorithms. Privacy First advocates an approach that respects young people's privacy and autonomy while taking measures to protect them from the risks of social media. This includes parental guidance and education, as well as strict regulation of, and accountability from, the platforms themselves.

In March this year, the House of Representatives passed a motion calling for the establishment of a minimum age of 15 for using social media platforms.[1] The motion focuses mainly on platforms that use addictive algorithms, such as TikTok, Instagram and Snapchat. It does not come out of the blue: many people are concerned. A panel survey of parents by EenVandaag showed that 71% of those surveyed favour a minimum age of 15 for social media. Doctors, scientists and practitioners also recently published an urgent open letter calling for limits on screen time and social media use.[2]
The responsible state secretary has announced that he will issue advice on a minimum age for social media and on recommended screen times for children before the summer of 2025. He will also consult with other European countries to develop joint guidelines. Several European countries have expressed support for an age limit for social media. French President Macron has signalled his intention to push for a social media ban at European level within a few months; should the EU not act quickly enough, he says he will introduce the ban in France.[3]
The discussion is not limited to the Netherlands and Europe: the use of social media by young people raises questions about age limits around the world. In November 2024, Australia passed a law banning social media for children under 16.[4] Platforms that break these rules risk hefty fines of up to €31 million. The ban will not take effect immediately; implementation will take another year.
Why an age limit?
Social media can expose young people to inappropriate or harmful content, and an age limit can help protect children from it. Think of videos or photos that romanticise eating disorders, normalise self-harm, or contain extreme violence or hateful messages. Exposure to such content can have a major impact on young people's mental health, especially if they feel vulnerable or insecure. Fake news and conspiracy theories can also be harmful and give a distorted view of the world. What makes this extra complicated is that algorithms often show more of this kind of content once you click on or linger over it a few times, creating the danger of being caught in a negative spiral.
Social media algorithms are designed to keep users on the platform for as long as possible. They do this by analysing behaviour and serving ever more precisely targeted content that plays on emotions, curiosity and vulnerabilities. For young people, whose brains are still developing and who are extra sensitive to rewards, social affirmation and peer pressure, this is a risky combination. The algorithm learns at lightning speed which videos, images or messages catch their attention, and then feeds them more of the same, or something more extreme.
This mechanism leads to so-called "rabbit holes": young people end up in a bubble of one-sided or harmful content, for example around appearance, eating behaviour, violence or fake news, often without being aware of it. They are less able to critically filter that stream or to slow themselves down.[5]
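To make this feedback loop concrete, here is a deliberately simplified sketch, in Python, of how an engagement-maximising recommender can drift a user towards one topic. It is a toy model built on invented assumptions (the topic names, the engagement probabilities, the epsilon-greedy rule), not any platform's actual system.

```python
import random
from collections import Counter

# Hypothetical topics; "extreme_diets" stands in for a sensitive topic
# that this particular user happens to linger on.
TOPICS = ["sports", "music", "news", "extreme_diets"]

def recommend(shown: Counter, engaged: Counter) -> str:
    """Epsilon-greedy: usually pick the topic with the best engagement rate."""
    if random.random() < 0.1:                      # 10% exploration
        return random.choice(TOPICS)
    # Unseen topics get an optimistic rate of 1.0 so everything gets tried.
    return max(TOPICS, key=lambda t: engaged[t] / shown[t] if shown[t] else 1.0)

def simulate(steps: int = 500) -> Counter:
    shown, engaged = Counter(), Counter()
    for _ in range(steps):
        topic = recommend(shown, engaged)
        shown[topic] += 1
        # Assume this user lingers far more on the sensitive topic...
        p_engage = 0.9 if topic == "extreme_diets" else 0.3
        if random.random() < p_engage:
            engaged[topic] += 1                    # ...which the loop rewards.
    return shown

if __name__ == "__main__":
    random.seed(1)
    print(simulate())  # the sensitive topic ends up dominating the feed
```

Even in this toy version, a handful of early "successes" on the sensitive topic are enough for the loop to keep serving it: the rabbit hole is a property of the optimisation, not a deliberate editorial choice.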
The popular series Adolescence reveals the profound influence of social media on young people's lives. Social media play a major role in identity formation, self-image and relationships. Intensive use can have a major impact on young people's well-being, including risks such as bullying and low self-esteem.
Advantages
The use of social media does not have only negative sides. Used consciously, social media also offer clear benefits for young people. They strengthen social connectedness, support identity development and provide a platform for creative expression. Young people can turn to them for information and education, often in a form that suits their world better than traditional media do. In addition, social media offer scope for involvement in social issues and are a valuable source of support and recognition, especially for young people who feel less seen offline. Used 'properly', social media can contribute to personal growth, self-confidence and global citizenship, and offer young people opportunities for social interaction and participation in online communities. A ban or an age limit restricts the opportunity for a large group of young people to participate digitally and, for example, to find like-minded people online.[6]
Age verification: how do you do it?
Verifying age is complex. Young people can easily circumvent age restrictions by entering a false date of birth. At the same time, age verification should be implemented without sharing (sensitive) personal data: privacy by design. Techniques such as facial recognition or fingerprinting raise (too) many privacy questions.
To address these concerns, the European Commission has decided to build an age verification app. The idea is that the app will allow young people to prove they are old enough without compromising their privacy. Social media providers would be required to adapt their sign-up and login flows to work with this app.
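How could such an app prove "old enough" without handing over a birthdate? The sketch below illustrates the general idea of an age attestation: a trusted issuer checks the date of birth once and signs only a yes/no claim, which the platform then verifies. It is a minimal toy under assumptions of our own (the HMAC construction, the field names), not the Commission's actual design; a real scheme would rather use public-key signatures or zero-knowledge proofs, so that the platform cannot forge tokens and tokens cannot be linked across services.

```python
import hmac, hashlib, json, secrets

# Toy issuer key. In this sketch issuer and platform share one HMAC key,
# which a real deployment would never do; asymmetric signatures or
# zero-knowledge proofs would be used instead.
ISSUER_KEY = secrets.token_bytes(32)

def issue_token(birth_year: int, threshold: int, current_year: int = 2025) -> dict:
    """Issuer checks the birthdate once, then signs only 'over threshold'."""
    claim = {"over_age": current_year - birth_year >= threshold,
             "threshold": threshold,
             "nonce": secrets.token_hex(8)}        # makes each token unique
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def platform_verify(token: dict) -> bool:
    """The platform sees only a yes/no claim, never the date of birth."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"]) and token["claim"]["over_age"]

token = issue_token(birth_year=2008, threshold=15)
print(platform_verify(token))  # True: age proven, birthdate never shared
```

The point of the design is data minimisation: the only thing that ever leaves the issuer is the claim "over the threshold", never the birthdate itself.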
AVG, DSA and DFA
The AVG, as the GDPR is known in the Netherlands, explicitly states that children deserve extra protection when their personal data are processed. In the Netherlands, young people under 16 cannot independently consent to the use of their data on social media; platforms must therefore check whether parental consent has been obtained. Platforms are also not allowed to collect sensitive data from this age group or show them personalised ads. In addition, the AVG requires platforms to make active efforts to protect young people from misuse of their data.[7]
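As a small illustration of how this consent rule works in practice: Article 8 GDPR sets a default age of digital consent of 16, which member states may lower to no less than 13. The sketch below is hypothetical (the function name is our own), with consent ages shown for a few member states.

```python
# Age of digital consent under Article 8 GDPR for a few member states.
DIGITAL_CONSENT_AGE = {
    "NL": 16,   # the Netherlands keeps the GDPR default of 16
    "FR": 15,
    "BE": 13,   # member states may lower the age, but not below 13
}

def needs_parental_consent(age: int, country: str) -> bool:
    """True if the platform must obtain parental consent before processing."""
    return age < DIGITAL_CONSENT_AGE.get(country, 16)  # fall back to 16

print(needs_parental_consent(14, "NL"))  # True: parental consent required
print(needs_parental_consent(14, "BE"))  # False: above Belgium's threshold
```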
Since 2024, the Digital Services Act (DSA) has been in force in the EU. Major platforms such as TikTok, YouTube and Instagram are now required to identify and mitigate the risks their systems pose to young people. Think of algorithms that recommend harmful content, or mechanisms that encourage addiction. Platforms must take measures to protect minors, and transparency about how their recommendation algorithms work is mandatory.
Thanks to the DSA, changes are already visible. Meta provides more information on options for parents to regulate screen time and adjust privacy settings, and YouTube has announced new default settings. But it is certainly not enough yet, and it is not going fast enough: the DSA's guidelines need to be tightened and more needs to be done on enforcement.
The Digital Fairness Act (DFA) offers further opportunities: it is an important next step within the European Union to protect users from unfair digital practices. The proposal is expected in 2026.
And the parents?
Parental supervision is often cited as part of the solution. However, such monitoring often raises privacy concerns of its own. The Jeugdjournaal recently devoted an item to young people being tracked by their parents via an app, and rightly pointed out the importance of a free space in which young people can experiment. The right to privacy, and with it the right to decide for themselves what information they share with their parents, also applies to young people.
De Volkskrant spoke to eighteen children about how they use their phones.[8] Many of them report being confronted online with images they would rather not have seen. The threshold for involving parents appears to be high, and other research confirms that children are reluctant to turn to their parents when unpleasant things happen online.
Rights and interests of the child paramount
Simone van der Hof (Professor of Law and Digital Technology, Leiden University) examines the impact of technology on children's rights. She criticises the proposal to ban social media for children under 16.[9] She argues that such a ban excludes children from an important part of their social lives, including the positive aspects of social media.
It is the world turned upside down to deny children access instead of forcing social media platforms to provide a safe environment for young people. The UN Committee on the Rights of the Child has made explicit that children's rights also apply in the digital world: young people are entitled to protection, privacy, information and participation, online as well. Platforms need to adapt their design and policies accordingly.[10]
Instead of banning social media for young people, Privacy First advocates, first and foremost, respecting children's privacy and autonomy. In addition, measures are needed to protect young people. Parents can guide their children in this, and attention to media literacy in education can also contribute.
But above all, let's turn to the platforms themselves: by strengthening, tightening and enforcing regulations such as the DSA, the AVG and, in future, the DFA, and by compelling changes in social media design. Think of rules against addictive design and manipulative interfaces such as endless scrolling, and of better reporting options.
It is time for the platforms themselves to mature and provide a safe environment for their youngest users as well.
[1] Motion by MP Van der Werf et al. on differentiated age limits for social media platforms (20 February 2025). See also NRC, House of Representatives wants minimum age of 15 for social media, https://www.nrc.nl/nieuws/2025/03/04/tweede-kamer-wil-minimumleeftijd-15-jaar-voor-sociale-media-a4885178.
[2] See https://smartphonevrijopgroeien.nl/brandbrief/. See also NRC, Experts and parents want a social media ban; will it happen?, https://www.nrc.nl/nieuws/2025/05/26/experts-en-ouders-willen-een-socialemediaverbod-komt-dat-er-ook-a4894706.
[3] Security.nl, French president wants a European ban on social media for under-15s, https://www.security.nl/posting/891751/Franse+president+wil+Europees+verbod+op+social+media+voor+onder+de+15+jaar.
[4] NOS, Australia wants social media ban for under-16s, https://nos.nl/l/2546272
[5] Amnesty International (2023), TikTok's 'For You' feed risks pushing children and young people towards harmful content, https://www.amnesty.org/en/latest/news/2023/11/tiktok-risks-pushing-children-towards-harmful-content/.
[6] Netwerk Mediawijsheid, A minimum age for social media? A ban can even have the opposite effect, https://netwerkmediawijsheid.nl/een-minimumleeftijd-voor-sociale-media-een-verbod-kan-zelfs-een-tegengesteld-effect-hebben/.
[7] See Article 8 AVG.
[8] De Volkskrant, How do 10-year-olds handle their phones? For once, they tell it themselves. 'It's really disgusting, I can't believe I saw it', https://www.volkskrant.nl/wetenschap/hoe-gaan-kinderen-van-10-om-met-hun-telefoon-dat-vertellen-ze-nu-eens-zelf-het-is-heel-erg-vies-ik-kan-niet-geloven-dat-ik-het-heb-gezien~b1032b77/.
[9] NRC interview with Simone van der Hof: Banning social media solves nothing, says professor, https://www.nrc.nl/nieuws/2024/12/13/sociale-media-verbieden-lost-niets-op-zegt-hoogleraar-dwing-platformen-veilig-te-zijn-voor-kinderen-a4876442.
[10] For a further explanation of children's rights in relation to the digital world, see General comment no 25 of the UN Committee on the Rights of the Child (2021): https://open.overheid.nl/documenten/ronl-ad4f21ab-3f0d-4dfd-82a4-e6822c23c203/pdf.