Europe’s TikTok Crackdown

Carol Khorramchahi

As European leaders push age limits and tougher platform rules, the debate is no longer whether social media affects teens but what lawmakers should do about it.

Students using smartphones in classroom. RDNE Stock project. Pexels.

In Europe, the debate over teen social media use is moving fast. What once sounded like a parenting argument over screen time is increasingly becoming a policy fight over age limits, platform design and whether companies should be legally required to protect minors. Recent proposals in countries including Spain, Greece, France, Britain and Germany show how quickly governments are hardening their approaches to apps like TikTok and Instagram.

At the European Union level, lawmakers are pushing for a broader shift. In a November 2025 resolution, the European Parliament called for a harmonized digital minimum age of 16 for using social media, AI companions and video-sharing platforms while still allowing access for ages 13 to 16 with parental consent. The resolution is not legally binding, but it signals where the political momentum is heading: less focus on individual parental controls and more focus on rules that platforms must follow.

Germany is one of the clearest examples of that momentum. Reuters reported on Feb. 21, 2026, that Germany’s ruling conservatives backed a motion to ban social media use for children under 14, push stricter digital age verification for teenagers and support fines for platforms that fail to enforce limits. That does not mean a nationwide ban will happen immediately, as Germany’s federal system makes media regulation more complicated, but it shows how mainstream these proposals have become.

This is also why Australia keeps coming up in the European debate. Julie Inman Grant, Australia's eSafety Commissioner, says age-restricted platforms must now take "reasonable steps" to prevent under-16s from creating or keeping accounts, and platforms can face major penalties if they fail to comply. The model matters because it shifts the burden from parents and children to tech companies, which is exactly the direction many European policymakers now favor.

Still, the move is not without criticism. Professor Sonia Livingstone at the London School of Economics and Political Science argues that governments should be cautious and build better evidence before rushing into broad bans. That tension is at the center of the story; many officials believe action is overdue, while researchers and rights advocates warn that blunt bans may create new problems, including privacy concerns around age verification and weaker oversight if teens move to less-regulated spaces.

For parents, the practical takeaway is simple. The conversation is no longer just about family rules at home. Across Europe, governments are now asking whether social media platforms should be treated more like products with age restrictions and whether companies, not families, should be held responsible when those safeguards fail.

GET INVOLVED:

Learn more about youth online safety policy through the European Parliament’s digital policy coverage, follow implementation updates through Australia’s eSafety Commissioner social media age restrictions page and read research-based perspectives on children’s digital rights from the London School of Economics’ Media@LSE.

Carol Khorramchahi

Carol Khorramchahi is a student at Boston University, where she studies English and Psychology and minors in Journalism. She enjoys writing and reporting on stories that bring together culture, identity, and community, and has experience in both newsroom reporting and digital media. She is especially interested in thoughtful storytelling with a global lens.