The European Commission has issued a preliminary finding against Meta, alleging that the US technology company has failed to adequately protect children under 13 from using Instagram and Facebook. The Commission states that Meta's systems violate the Digital Services Act (DSA), which imposes strict obligations on large online platforms to mitigate systemic risks, including those affecting minors.
Meta's own terms of service set 13 as the minimum age for both platforms. However, the Commission's investigation found that the company's age enforcement measures are largely ineffective: children can simply enter a false date of birth when signing up, and there is no robust mechanism to verify that information. According to the Commission, approximately 10 to 12 percent of children under 13 are active on Instagram and Facebook, a figure that contradicts Meta's internal assessments. The Commission also noted that Meta disregarded readily available scientific evidence indicating that younger children are particularly vulnerable to harms from social media services.
Meta's Response and the Industry-Wide Challenge
Meta has responded by stating that it disagrees with the preliminary findings. A company spokesperson told Euronews: "We're clear that Instagram and Facebook are intended for people aged 13 and older and we have measures in place to detect and remove accounts from anyone under that age." The spokesperson added that Meta continues to invest in technologies to find and remove underage users and will share more details about additional measures rolling out soon. "Understanding age is an industry-wide challenge, which requires an industry-wide solution, and we will continue to engage constructively with the European Commission on this important issue," they said.
The age verification challenge is not unique to Meta. Several EU member states are discussing plans to implement blanket social media bans for children under 15, but effective age verification methods remain a sticking point. European Commission President Ursula von der Leyen stated in April that a new age-verification app is technically ready and will be available for use soon, though no specific date was given. She told social media platforms there were "no more excuses" for not protecting children online.
Regulators are now demanding that Meta overhaul its risk assessment methodology and significantly strengthen its measures to prevent, detect, and remove underage users from both platforms. The company has the right to examine the Commission's investigation files and respond in writing to the findings before any final decision is made.
This case highlights broader European efforts to hold tech giants accountable for user safety, particularly where minors are concerned. The DSA remains a cornerstone of the EU's digital regulation, and the outcome of this investigation could set a precedent for how platforms handle age verification across the continent.


