Turkey’s planned new restrictions on accessing social media platforms without identity verification will erode anonymity and further tighten censorship.
Such initiatives began with age checks for adult content sites in a handful of countries and are now expanding to restrict social media access without identity verification, under the broader pretext of safeguarding young users. Australia’s ban on social media for under-16s – which appears to have been ineffective so far – has sparked debate about emulating it elsewhere.
While such regulations cite child protection, they erode anonymity, restrict free expression and build digital surveillance infrastructure on a massive scale. Instead of bans that treat children as passive victims, I believe we need transparent, multi-stakeholder digital rights frameworks that empower them as rights-holding digital citizens.
In Turkey, the issue recently hit the headlines when Justice Minister Akın Gürlek announced an agreement with platforms such as X, Instagram and TikTok, under which unverified accounts will be shut down after a three-month transition period.
The system mandates e-State tokens for verification and restricts access for under-15s, even though the minimum age to use the government database is 16. The government frames the measure as a tool against cybercrime and disinformation, and as a safeguard for children. As with past laws governing the digital sphere, I suspect the real focus will once again be not children but digital surveillance infrastructure.
Let’s look at Australia’s ban for under-16s and how it risks patronising young people by treating them as passive victims. In December 2025, Medyascope’s Açık Oturum (Open Forum) programme, moderated by Göksel Göksu, brought together experts from different disciplines, including myself, to discuss the Australian ban and digital rights and liberties: Şevket Uyanık (communications specialist at the Turkish Human Rights Foundation), clinical psychologist Deniz Bozunoğulları, and technologist Ahmet Alphan Sabancı. We reached the same conclusion: the goal should not be adding restrictive layers, but empowering children as rights-holding citizens.
While this article focuses on government-driven identity checks, social media platforms and tech companies are far from innocent. They have spent years building systems that maximise attention, push addictive content and surface harmful material to children, all while hiding behind vague moderation promises and self-regulation.
If governments are serious about child protection, they cannot simply outsource the problem to platforms that have repeatedly shown they will not restrain themselves without hard rules, transparency and real enforcement.
Privacy and media freedom at risk