European institutions are intensifying efforts to create a cohesive framework for protecting minors in digital spaces, as new data reveals the pervasive nature of young people's online engagement. According to a 2025 European Parliament report, 97% of young people in the EU are online daily, with 65% relying on social media as their primary news source. The report highlights that 78% of 13- to 17-year-olds check their devices hourly, while a quarter of 9- to 15-year-olds admit to smartphone addiction, spending up to three hours daily on social platforms.
The Push for a Pan-European Standard
Last week, Commission President Ursula von der Leyen announced the development of a new age-verification application, designed to enforce minimum age requirements for accessing social media while prioritising user privacy. This initiative responds to overwhelming public support, with 90% of EU citizens backing increased action on online child safety. The announcement follows the Parliament's 2025 push for an EU-wide age limit on social media and restrictions on addictive design features like infinite scrolling and engagement-driven algorithms.
Currently, an expert panel is advising the Commission on formulating a comprehensive EU strategy for child safety online. Their mandate is to avoid a confusing patchwork of national rules that could undermine the digital single market. The panel's final recommendations are scheduled for release by summer 2026.
Existing EU measures, including the Digital Services Act (DSA), the Digital Markets Act (DMA), the Strategy for a Better Internet for Kids, and the Action Plan Against Cyberbullying, already provide specific guidelines for protecting children online. However, as the Parliament's report notes, none of these instruments currently establishes a minimum age for accessing social media, online platforms, or AI tools.
Member States Move Faster Than Brussels
While Brussels works on a unified approach, several national governments are not waiting. France has already approved legislation banning social media access for children under 15. Spain, Austria, Greece, Ireland, Denmark, and the Netherlands are all preparing urgent political action to implement similar restrictions, reflecting growing political and parental concern across the continent.
This disparity between national and EU-level action creates a complex regulatory environment for global tech platforms operating in Europe. The Commission's proposed app and strategy aim to harmonise these efforts, ensuring consistent protection for minors regardless of their member state. The challenge lies in balancing effective age verification with fundamental rights to privacy and data protection—a tension at the heart of the digital sovereignty debate.
The issue of online safety intersects with broader European policy concerns, from digital regulation to mental health, as the bloc grapples with internal social challenges shaped by technology.
The development of the age-verification tool will be closely watched by privacy advocates and technology companies alike. Its success or failure could set a precedent for how democracies regulate digital spaces without resorting to outright censorship or surveillance. The final strategy, due in 2026, will need to accommodate the diverse legal and cultural approaches across the EU's 27 member states and the wider European continent.
As Europe navigates this complex terrain, the outcome will influence global standards for child protection online. The bloc's attempt to craft a privacy-preserving yet effective technical solution represents a significant test of its ability to translate regulatory ambition into practical, rights-respecting tools.


