The European Commission, in its preliminary investigation, has found that Meta may be in breach of the Digital Services Act (DSA), citing failures to prevent children under 13 from accessing Instagram and Facebook. The Commission said the company's existing safeguards appear insufficient to identify and remove underage accounts and to mitigate the risks linked to underage users, despite platform rules setting 13 as the minimum age.
What the EU found
According to the Commission’s preliminary findings, Meta’s enforcement systems do not adequately prevent minors under 13 from creating or maintaining accounts on its platforms. Children can bypass age restrictions by entering false dates of birth, with no effective verification mechanism to confirm accuracy.
The Commission also flagged shortcomings in reporting tools. Reporting an underage user requires multiple steps, and forms are not pre-filled with relevant user details. Even when reports are submitted, there is often little follow-up, allowing flagged accounts to remain active.
Beyond enforcement gaps, the Commission said Meta's risk assessment framework is incomplete and inconsistent. It does not fully account for the scale of underage usage, despite evidence suggesting that around 10–12 per cent of children under 13 in the EU use these platforms.
The findings also highlight a lack of consideration for research showing that younger users are more vulnerable to online harms.
What’s next
The European Commission has asked Meta to revise its risk assessment methods and strengthen systems to prevent, detect and remove underage users.
The Commission also emphasised the need to ensure a high level of privacy, safety and security for minors. It clarified that the findings are preliminary and do not prejudge the final outcome. However, it has called on Meta to improve both detection tools and its overall approach to risk assessment on Instagram and Facebook.
Meta can now respond to the preliminary findings as part of its right of defence. The company may review investigation documents, submit a written reply and propose corrective measures in line with DSA guidelines.
The European Board for Digital Services will be consulted during this process. If the Commission confirms its findings, it may issue a non-compliance decision. This could lead to fines of up to 6 per cent of Meta’s global annual turnover, along with periodic penalty payments to ensure compliance.
Where the case stems from
The case dates back to May 16, 2024, when the European Commission initiated formal proceedings against Meta under the DSA.
The probe examines whether Instagram and Facebook adequately protect minors, particularly in preventing underage access and limiting exposure to harmful content.
It also focuses on how Meta assesses and mitigates risks for younger users, including whether platform design contributes to addictive behaviour and “rabbit hole” effects.
The Commission said its findings are based on an in-depth review of Meta’s internal documents, risk assessments and responses to formal information requests.
As part of the evaluation, it used its 2025 DSA Guidelines on the protection of minors as a benchmark. These guidelines highlight the importance of age verification tools that are accurate, reliable, non-intrusive and non-discriminatory.
The Commission has also developed a blueprint for an EU-wide age verification system as a reference framework.