
Why Discord Decided to Consider Everybody a 13-Year-Old
In the ten years that I have used the communication software Discord, my personal data has been exposed in at least two major security breaches. The first was when hackers accessed the personal data of 760,000 users in 2024, and the second was when a different group of hackers accessed the personal data of 5.5 million unique users in October 2025.
The second hack was especially notable because roughly 70,000 users allegedly had their government IDs accessed.
The reason Discord had these IDs in the first place was that they sometimes required users to submit an ID and a face scan for age verification. However, there’s something that doesn’t completely line up here. Before the hack, Discord claimed that face scans and IDs would be deleted from their servers once age verification was confirmed. This, it turns out, was a lie.
As far as I know, my data was not compromised in either of the major Discord hacks. If my data had been compromised, it would amount to my location, my passwords, my age, my communication habits — nothing that Apple doesn’t already have. I never sent Discord my ID because I had never needed to verify my age. That could change as of March 2026.
On Feb. 9, Discord announced it would roll out a new “teen-by-default” setting for all new and existing users. In short, this means that users will not be able to access content on Discord that has been deemed “adult” — certain settings, servers, videos, direct messages — without confirming their age via face scan and government-issued ID.
They have reiterated that the face scans and ID will only be used to verify age and will be deleted immediately after. Why they should be believed this time around is anybody’s guess.
Discord has unveiled its new age verification policy as pressure mounts for the company to do something about the rampant child abuse that has been enabled by the platform. In recent years, a spotlight has been turned on ideologically motivated violent extremism (IMVE). Networks associated with IMVE use Discord as a platform to coordinate the grooming, extortion, abuse, assault and murder of children.
Granted, these IMVE groups do not use Discord in isolation. They operate using a combination of messaging platforms like Discord and Telegram, as well as video games like Roblox and Minecraft, to commit acts of child abuse. These groups have no physical gathering places and are made up of members from all over the world, which has made them difficult to pin down or even quantify. That said, Discord has been aware of the major IMVE groups since at least 2021, maybe even earlier, and has been, at best, slow to act.
Discord has passed the buck of dealing with this problem to its user base. I don’t plan on accessing any “adult” content on Discord, but normalizing face scans and government-issued ID uploads for everyone will not stop at simply regulating who can share or watch pornographic material on the platform.
Here’s a logical jump. Discord already has an internal structure for universities to facilitate class-specific servers. USask has several Discord servers for various colleges, clubs and classes. I am a part of a few myself. It would not surprise me if, going forward, Discord started to require scans and ID to access university-specific content, on the basis of wanting to verify that every user is genuinely a university student. After all, not everybody in university is an adult.
Since Discord doesn’t specify how it will determine whether something (or someone’s entire account) qualifies as adult content, the company gets to set the parameters of when and why it will require users to upload their private data. If you don’t engage with adult content on Discord but somebody in one of your servers does, who’s to say Discord won’t require every server member to verify their age? Not doing so would undercut the very thing Discord says this policy exists to prevent: teenage users interacting with adult content.
So, what then is Discord to do? If you weigh the implementation of wholesale age verification against the prospective abuse of children, isn’t it a no-brainer? Maybe, but there was an alternative option. If you ask me, a company struggling to root out rampant child abuse on its platform is entitled to nuclear options. However, if the software has been historically vulnerable to data breaches affecting millions of users, the nuclear option doesn’t lie in requiring more people to upload sensitive documents and biometrics. It lies in banning adult content entirely.
This will never happen, and the precedent for why it will never happen was set shortly after blogging platform Tumblr’s parent company, Yahoo, was bought out by wireless giant Verizon in 2017. In December 2018, Verizon banned all adult content on the platform. This came after Apple forced their hand by pulling the Tumblr app from the App Store, citing child sexual abuse material hosted on the site.
Tumblr had already been experiencing a drop in users in the four years since it was acquired by Yahoo in 2013 for US$1.1 billion. By 2016, its value had been written down by US$230 million.
With the new restrictions in place, Tumblr experienced a mass exodus of users, many of whom cited the adult content ban as the reason. Verizon sold Tumblr in 2019 for roughly US$3 million, and the blogging platform never again reached the traffic highs of the pre-NSFW-ban era. Sex still sells.
There were two clear options for Discord. One would potentially risk its users’ data, the other the company’s bottom line. Since Discord is a for-profit organization, the choice was clear. It’s the 560 million users who will be shouldering the responsibility of making Discord a “safer and more inclusive experience for users over the age of 13,” not the company itself.
It’s not even worth considering the possibility that Discord isn’t as forthcoming about what it does with its users’ personal data as it seems. The going rate for a person’s complete data profile on the black market can reach about US$4,000 at the high end. With a user base of 560 million (and counting), all being pushed to upload their data, I’ll let you do the math on what kind of payday that looks like. But that’s silly, because big tech doesn’t lie about how our data is managed. Right?
As tech giants become more brazen in the amount of data they ask of their users, it is our responsibility as users to call a spade a spade and identify when alternative measures can be taken, and ask why they weren’t. In this case, I believe Discord prioritized profit over taking real action to solve a horrifying and rapidly proliferating problem. Let’s just hope that some of those dollars they’re holding on to are dedicated to better cybersecurity going forward.