Denmark plans to become the first European nation to ban social media access for children under 15, marking a significant escalation in the global debate over online safety and child protection. The legislation aims to shield young users from harmful content and exploitation by requiring tech platforms to verify users' ages and enforce the restrictions. While the policy won't take effect immediately, it reflects growing international concern about the impact of social media on children's well-being.
Caroline Stage, Denmark’s Minister for Digital Affairs, stressed the urgency of the situation, citing alarming statistics: 94% of Danish children under 13 and over half of those aged under 10 already use social media platforms. “The amount of violence, self-harm that they are exposed to online… is simply too great a risk for our children,” Stage declared. She also criticized tech companies for prioritizing profits over child safety despite their vast resources.
This move comes after Australia passed the world's first law banning social media for under-16s in late 2024, imposing hefty fines on platforms that fail to enforce age restrictions. The Australian legislation set a precedent for stricter regulation of online spaces, though it faces its own implementation challenges.
Enforcement: A Complex Challenge
Denmark acknowledges the practical difficulties of enforcing such a ban in an increasingly interconnected digital world. Stage emphasized that while the government cannot force tech giants to use its proposed age-verification app, it will require platforms to implement robust verification methods or face significant fines under the EU's Digital Services Act – up to 6% of their global annual turnover.
The Danish approach relies on two key elements: the national electronic ID system (used by nearly all citizens over 13) and a dedicated age-verification app now in development. This strategy reflects broader European efforts to establish effective age-verification tools, though whether it can be implemented effectively at scale remains to be seen.
A Broader Global Trend
This Danish initiative is part of a growing global trend towards stricter regulation of online platforms’ impact on children. China, for example, has imposed limits on gaming and smartphone usage for minors, while French prosecutors are currently investigating TikTok over allegations that its algorithms contribute to suicidal ideation among vulnerable young users. These cases highlight the complex ethical challenges posed by social media’s influence, prompting governments worldwide to seek solutions to protect children online.
Denmark’s decision underscores a crucial shift in the conversation surrounding digital safety: from relying on self-regulation by tech companies to implementing legally enforceable measures to safeguard children’s well-being in the digital sphere. Whether this bold step sets a global precedent remains to be seen, but it undoubtedly raises the stakes for the tech industry and ignites further debate about the balance between online freedom and child protection.