
Snapchat under fire

The European Commission has launched a formal investigation into Snapchat. This is not a symbolic step but a serious test of how strictly Europe will regulate social media. The wind seems to be turning in this regard. In the US, a jury recently found Meta and Google liable for social-media addiction, because of the mental-health problems their addictive design causes in users. In the Netherlands, a court recently ruled that Meta must allow its Instagram and Facebook users to choose a timeline that is not based on profiling. In another recent Dutch case, the court banned the AI chatbot Grok (X) from generating deepfake nude photos.

Now Snapchat is under EU scrutiny and is being assessed against the obligations that the European Digital Services Act (DSA) imposes on major platforms. A key premise of the DSA is that platforms must ensure a high level of protection for young people.

Snapchat

Snapchat has been one of the most popular apps among young people for years. Quickly sending a picture, a message that disappears by itself: the platform feels approachable and fast. Signing up is easy. That makes Snapchat a favourite platform even for the very young, including primary-school-aged children. In the Netherlands, we are talking about millions of active users who open the app several times a day. However, Snapchat's ease of use attracts not only well-meaning users but also those with less benign intentions. That is where the concerns come from.

From complaint to European dossier

The starting point of the investigation is an individual complaint. A Dutch lung specialist raised the alarm about vape dealers on Snapchat. The complaint was initially taken up by the Netherlands Authority for Consumers and Markets (ACM) and grew into a wider European case, focusing on whether Snapchat complies with its obligations under the DSA, especially when it comes to the protection of minors.

Age verification

One of the main concerns in the investigation[1] is age verification. Snapchat's terms and conditions state that users must be at least 13 years old to use the app. Snapchat relies on self-declaration: users enter a date of birth and are then given access to the platform. According to the European Commission, Snapchat's measures are insufficient. The DSA guidelines[2] on the protection of minors explicitly state that self-declaration of age is not a reliable method.

Incidentally, age verification is not only about actively keeping out children under the minimum age, but also about adapting content when minors are involved.

Platform providers have been challenged many times on how age verification is carried out in practice. So far, they have hidden behind technical obstacles and pointed at each other. To overcome these technical objections, an app has now been developed on the Commission's initiative that allows users to prove they are old enough to access certain sites without, according to the Commission, compromising their privacy.[3]
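The core idea behind such an app can be illustrated with a small sketch. This is a hypothetical simplification, not the Commission's actual protocol: a trusted issuer checks the user's birth date out of band and then signs only a boolean "over the age threshold" claim, so the platform that verifies the token never sees the birth date itself.

```python
import hashlib
import hmac
import json

# Hypothetical shared secret held by the trusted issuer; a real system
# would use asymmetric signatures or zero-knowledge proofs instead.
ISSUER_KEY = b"issuer-demo-secret"


def issue_age_token(is_over_13: bool) -> dict:
    """Issuer verifies the birth date out of band, then signs only a boolean."""
    claim = {"age_over_13": is_over_13}  # note: no birth date in the token
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}


def platform_verifies(token: dict) -> bool:
    """The platform checks the signature; it learns only the boolean claim."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"]) and token["claim"]["age_over_13"]


token = issue_age_token(True)
print(platform_verifies(token))  # True: age proven, birth date never disclosed
```

The privacy gain is in the data minimisation: the platform receives a yes/no answer plus a signature, nothing more. The real difficulty, as critics note, lies in who the issuer is and what it learns about the user's browsing.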

The investigation into Snapchat and its conclusions could make an important contribution to sharpening providers' obligations regarding effective age verification. At the same time, age verification introduces new risks of its own, about which civil society organisations (including Privacy First) and academics have warned for years.[4]

Design choices and dark patterns

Another concern regarding Snapchat is the way the platform is designed. Under both the DSA and the GDPR, systems must be secure and privacy-friendly by default (privacy by design). With Snapchat, this is not self-evident. Users are actively encouraged to add new contacts and can connect with strangers relatively easily.

The European Commission is also taking a critical look at so-called dark patterns. The DSA prohibits interfaces that mislead or direct users in a way that undermines their freedom of choice.

Illegal content and abuse

A major concern with Snapchat is the presence of illegal and harmful activities on the platform. Signs point to drug trafficking and the sale of vapes and alcohol to minors. There is also the risk of grooming, in which adults impersonate minors in order to contact them.

This does not only apply to Snapchat. Social media in general serves as a recruiting ground for young people looking to make money illegally. In the Netherlands, some 6 per cent of young people aged 16 to 27 say they have been approached for 'odd jobs' via social media. One in five young people in the same age group say they have at some point seen solicitations for drug-related or violent crimes, according to a recently published survey.[5]

The obligation to tackle illegal practices and content follows from the DSA. This is not just about responding to reports, but also about structural measures, such as moderation systems, detection mechanisms and risk assessments. On precisely this point, the European Commission appears to doubt whether Snapchat is doing enough.

Investigation and enforcement

With the opening of the investigation, the European Commission has far-reaching powers under the DSA. The Commission can request information, conduct inspections and eventually impose sanctions. In addition, Snapchat may be required to modify its systems or make binding commitments. This makes the investigation an important enforcement moment within European platform regulation.

Beyond Snapchat

This case is broader than a single platform. It is a concrete test of how the DSA is applied in practice. If Snapchat is found to have violated the DSA, the fine could be up to 6 per cent of its annual global turnover.

The road ahead is clear. The protection of minors is central, and platforms must actively demonstrate that they take that responsibility. The combination of the DSA, the GDPR and additional guidelines rightly raises the bar for the online safety of young people. The Commission's investigation makes for a fine start to spring.

 

[1] https://ec.europa.eu/commission/presscorner/detail/en/ip_26_723

[2] Commission publishes guidelines on the protection of minors | Shaping Europe's digital future

[3] European age verification app to keep children safe online - European Commission

[4] See for example: https://edri.org/our-work/open-letter-the-dangers-of-age-verification-proposals-to-fundamental-rights-online/ and https://blog.xot.nl/2026/04/09/online-age-assurance-raises-thorny-questions/index.html.

[5] This is a study by the Free University Amsterdam (VU), Erasmus University Rotterdam and the Netherlands Study Centre for Crime and Law Enforcement, https://www.politieenwetenschap.nl/publicatie/de-rol-van-sociale-media-bij-de-betrokkenheid-van-jongeren-bij-zware-drugs–en-geweldscriminaliteit