The Digital Services Act imposes strict duties on online platforms accessible to minors, and the European Commission is already enforcing them. From age assurance to dark patterns, recommender systems and AI, this blog explains what you need to change in your product and governance now.
Article 28(1) of the Digital Services Act (DSA) requires providers of online platforms accessible to minors to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors on their service. In practice, the scope of this obligation is broader than many platforms assume, and the European Commission has set the bar high. The Commission is already taking action, having preliminarily found several major adult content platforms in breach of the DSA for failing to protect minors, and having opened formal proceedings against Snapchat. Platforms that have not yet assessed their compliance position should do so now.
What is an online platform under the DSA?
An online platform under the DSA is a hosting service that not only stores user-generated content but also disseminates it to the public. It covers social media platforms, online marketplaces, gaming platforms and community forums, but not services where content sharing is merely a minor and ancillary feature, such as comments under a news article.
Accessible to minors?
A provider cannot rely on a clause in its terms and conditions prohibiting access by minors to argue that Article 28(1) DSA does not apply. A platform that restricts access to users aged 18 or over will still be considered accessible to minors where no effective measures actually prevent underage access. Beyond that, a provider can reasonably be expected to know that minors are among its users where the platform is known for its appeal to minors, offers services comparable to those used by minors, is promoted towards minors, or where research – whether commissioned or not – identifies minors as users. Do not conclude that these obligations are out of scope without first conducting a thorough analysis.
What the Guidelines Require
On 7 October 2025, the European Commission published its Guidelines, a significant benchmark for compliance with Article 28(1) DSA. In April 2026, it followed up with a family-friendly booklet on protecting minors. We will take you through five key elements:
1. Age assurance: Self-declaration – whether a date-of-birth field or a click-through checkbox – does not meet the Commission’s requirements of robustness and accuracy. High-risk platforms should adopt double-blind age verification methods, such as the EU’s age verification app or, in due course, the European Digital Identity Wallet. Medium-risk platforms may adopt age estimation methods, such as facial age estimation based on a selfie.
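To make this concrete, here is a minimal sketch of how a risk-tiered method selection could be expressed in code. Everything in it (the tier names, the provider identifiers, the selectAgeAssurance function) is illustrative rather than prescribed by the Guidelines:

```typescript
// Hypothetical sketch: choosing an age-assurance method by platform risk tier.
// All names are illustrative, not taken from the DSA or the Guidelines.

type RiskTier = "high" | "medium" | "low";

type AgeAssuranceMethod =
  | { kind: "verification"; provider: "eu-age-verification-app" | "eudi-wallet" } // double-blind verification
  | { kind: "estimation"; provider: "facial-age-estimation" }                     // e.g. selfie-based
  | { kind: "none" };

function selectAgeAssurance(tier: RiskTier): AgeAssuranceMethod {
  switch (tier) {
    case "high":
      // High-risk services: robust, double-blind verification only.
      return { kind: "verification", provider: "eu-age-verification-app" };
    case "medium":
      // Medium-risk services: age estimation may be acceptable.
      return { kind: "estimation", provider: "facial-age-estimation" };
    case "low":
      return { kind: "none" };
  }
}

// Note what never appears here: a date-of-birth field or a checkbox.
// Self-declaration does not meet the robustness bar at any tier.
```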
2. Dark patterns & interface design: Interface defaults matter. Under the Guidelines (see the sketch after this list):
- minors’ accounts are set to private by default, so that their personal information and content remain hidden from unauthorised users;
- features contributing to excessive use – such as streaks, autoplay, read receipts and push notifications – are disabled by default;
- third-party accounts are not permitted to download or screenshot minors’ content, in order to prevent the unwanted dissemination of intimate content and sexual extortion;
- persuasive design features aimed at engagement are removed; and
- AI chatbots are subject to adequate safeguards, including a clear notification that the user is interacting with an AI and the option to disable the feature.
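These defaults are ultimately configuration decisions. Here is a minimal sketch of how they might be captured in a single safe-by-default profile, with all field names our own rather than taken from the Guidelines:

```typescript
// Hypothetical sketch: safe-by-default settings applied when a user is
// (or may be) a minor. Field names are illustrative, not prescribed.

interface AccountSettings {
  profileVisibility: "private" | "public";
  streaksEnabled: boolean;
  autoplayEnabled: boolean;
  readReceiptsEnabled: boolean;
  pushNotificationsEnabled: boolean;
  allowThirdPartyDownloads: boolean;
  allowThirdPartyScreenshots: boolean;
  aiChatbotEnabled: boolean;    // off until the user explicitly enables it
  aiChatbotDisclosure: boolean; // "you are talking to an AI" notice
}

const MINOR_DEFAULTS: AccountSettings = {
  profileVisibility: "private",      // content hidden from unauthorised users
  streaksEnabled: false,             // engagement-driving features off by default
  autoplayEnabled: false,
  readReceiptsEnabled: false,
  pushNotificationsEnabled: false,
  allowThirdPartyDownloads: false,   // limits unwanted dissemination
  allowThirdPartyScreenshots: false, // and sexual extortion risks
  aiChatbotEnabled: false,
  aiChatbotDisclosure: true,
};

function defaultSettings(isMinor: boolean, adultDefaults: AccountSettings): AccountSettings {
  return isMinor ? MINOR_DEFAULTS : adultDefaults;
}
```

The design point is that the safe configuration is the starting state, applied at account creation, rather than an option a minor must find and switch on.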
3. Content moderation: Platforms must establish and enforce clear policies on harmful content, ensure moderation operates around the clock with at least one staff member on call, complement automated tools with human review, provide child-friendly reporting mechanisms with prompt feedback, and offer parental control tools that meet the Guidelines’ minimum requirements.
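As an illustration, a report intake along these lines would route every report to human review and return prompt, age-appropriate feedback. The interfaces below are hypothetical, not a prescribed design:

```typescript
// Hypothetical sketch: child-friendly report intake with guaranteed
// acknowledgement and human review. All names are illustrative.

interface Report {
  id: string;
  reporterIsMinor: boolean;
  contentId: string;
  category: "bullying" | "sexual" | "violence" | "other";
  receivedAt: Date;
}

interface ModerationQueue {
  // Automated tools alone are not enough; a human reviews each report.
  enqueueForHumanReview(report: Report): void;
}

function handleReport(report: Report, queue: ModerationQueue): string {
  queue.enqueueForHumanReview(report);
  // Prompt feedback, phrased for the reporter's age.
  return report.reporterIsMinor
    ? "Thanks for telling us. A person on our team will look at this quickly."
    : `Your report ${report.id} has been received and will be reviewed.`;
}
```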
4. Recommender systems: Recommender systems must not rely on behavioural data capturing all or most of a minor’s activities. Platforms must prioritise explicit user signals over implicit engagement signals such as clicks and scrolls, and empower minors to control their own feeds. This strikes at the heart of many business models built on behavioural data.
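A hedged sketch of what "explicit over implicit" could look like in a ranking function for minors, with weights and field names chosen purely for illustration:

```typescript
// Hypothetical sketch: ranking for minors that prioritises explicit signals
// (topics followed, "not interested") over implicit engagement (clicks,
// scrolls), and takes no cross-activity behavioural profile as input.

interface ItemSignals {
  explicitInterestMatch: number;   // 0..1, from topics the minor chose to follow
  userMarkedNotInterested: boolean;
  implicitEngagementScore: number; // 0..1, derived from clicks and scrolls
}

function scoreForMinor(s: ItemSignals): number {
  if (s.userMarkedNotInterested) return 0; // the explicit signal always wins
  const EXPLICIT_WEIGHT = 0.9;
  const IMPLICIT_WEIGHT = 0.1; // implicit engagement demoted, never dominant
  return EXPLICIT_WEIGHT * s.explicitInterestMatch
       + IMPLICIT_WEIGHT * s.implicitEngagementScore;
}

// Note: no input here captures "all or most of a minor's activities";
// profiling data of that kind is excluded from the model by design.
```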
5. AI features: AI features such as chatbots must not be automatically activated on platforms accessible to minors, and minors must not be encouraged to use them. They must be aligned with minors’ evolving capacities and may only be made available following a risk assessment. AI features must not be used to steer minors towards commercial content or purchases.
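Expressed as a gate, and assuming hypothetical flag names, an AI feature would only become available to a minor once every precondition holds:

```typescript
// Hypothetical sketch: gating an AI feature for minors behind a completed
// risk assessment and an explicit, unprompted user opt-in.

interface AiFeatureGate {
  riskAssessmentCompleted: boolean;  // availability requires a prior risk assessment
  userExplicitlyEnabled: boolean;    // never auto-activated, never promoted to minors
  commercialSteeringDisabled: boolean; // no nudging towards content or purchases
}

function aiFeatureAvailableToMinor(gate: AiFeatureGate): boolean {
  return gate.riskAssessmentCompleted
      && gate.userExplicitlyEnabled
      && gate.commercialSteeringDisabled;
}
```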
What this requires from you
Compliance is not a purely legal exercise. It requires close collaboration across legal, product, engineering, and compliance teams, with governance structures that ensure legal and compliance teams are involved at the design stage, not after launch. Product decisions are now legal decisions. Compliance by design is no longer aspirational; under the DSA, it is a legal requirement.