How Snapchat Is Failing to Protect Its Users
What Happens When Platforms Prioritize Growth Over Safety?
Last week, The Times of London published the piece “Snapchat harming children on industrial scale, says author,” discussing Snapchat’s rampant sextortion problem. But this is nothing new: reports like this have apparently been coming in since 2022, three years now. How has Snapchat kept brushing this under the rug for three years? That question led me down a rabbit hole to see how much (or how little) Snapchat is actually doing to protect its users.
Snapchat, like most social media platforms, uses AI to recommend and moderate content. Its algorithms categorize users and then suggest content that others in those categories have engaged with. But this can lead to trouble if that content is harmful. That’s where content moderation is supposed to step in. AI systems can be trained to identify and flag inappropriate or dangerous material for removal. Yet in Snapchat’s case, those systems—and the teams behind them—don’t appear to be effective enough.
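To make that concrete, here is a rough sketch of how an automated moderation pass might work. To be clear, this is not Snapchat’s actual pipeline; the categories, scores, and thresholds below are hypothetical, made up only to illustrate the basic idea of scoring content and routing it to removal or human review.

```python
# A minimal, hypothetical sketch of an automated moderation pass.
# This is NOT Snapchat's actual system; the categories, scores, and
# thresholds here are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class ModerationResult:
    post_id: str
    category: str      # e.g. "sexual_content", "csea", "drugs"
    score: float       # classifier confidence, 0.0 to 1.0
    action: str        # "allow", "human_review", or "remove"

def moderate(post_id: str, scores: dict[str, float]) -> list[ModerationResult]:
    """Route a post based on per-category classifier scores."""
    results = []
    for category, score in scores.items():
        if score >= 0.95:            # high confidence: remove automatically
            action = "remove"
        elif score >= 0.60:          # uncertain: send to a Trust & Safety queue
            action = "human_review"
        else:
            action = "allow"
        results.append(ModerationResult(post_id, category, score, action))
    return results

# Example: a post the classifier thinks is very likely drug-related.
print(moderate("post_123", {"drugs": 0.97, "sexual_content": 0.12}))
```

The point of the sketch is the tradeoff it exposes: everything depends on how aggressively the thresholds are set and how quickly the human review queue is worked through.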
In September 2024, New Mexico’s attorney general filed a lawsuit against Snapchat, claiming that its recommendation algorithms “foster the sharing of child sexual abuse material (CSAM) and facilitate child sexual exploitation.” The investigation behind the suit revealed that Snapchat’s Trust & Safety teams were receiving around 10,000 sextortion reports each month. In response, Snapchat argued that stronger safeguards would violate user privacy and lead to “disproportionate admin costs.”
The argument about the tradeoff between privacy and safety is nothing new. Some argue that social media platforms have a duty to keep their users safe; others argue that respecting user privacy is more important. Today, I want to look at the argument for safety.
Snapchat has reported over 850 million monthly active users, so 10,000 reports a month is a fraction of a fraction of that user base. But how many victims never report at all? In its 2024 Transparency Report, Snapchat stated that its automated moderation systems found 1,291,158 violations for sexual content and 664,819 violations for Child Sexual Exploitation, while users submitted another 5,383,707 reports of sexual content and 1,388,230 reports of Child Sexual Exploitation to the Trust & Safety team. That’s a total of 8,727,914 incidents caught or reported, still a fraction of a fraction.
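For anyone who wants to check the math, here is how those figures add up (the violation counts are as reported; the 850 million monthly active user figure is Snapchat’s own):

```python
# Quick arithmetic check on figures from Snapchat's 2024 Transparency Report.
auto_sexual = 1_291_158   # sexual content, caught by automation
auto_csea   =   664_819   # Child Sexual Exploitation, caught by automation
user_sexual = 5_383_707   # sexual content, reported by users
user_csea   = 1_388_230   # Child Sexual Exploitation, reported by users

total = auto_sexual + auto_csea + user_sexual + user_csea
print(total)                # 8727914 incidents caught or reported
print(total / 850_000_000)  # roughly 0.01 incidents per monthly active user
```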
Across all of 2024, Snapchat’s automated moderation systems caught a total of only about 3.4 million violations. For a platform with 850 million monthly active users, that is a mind-bogglingly low detection rate, and it doesn’t even begin to account for what’s slipping through the cracks.
The most concerning number for me was the turnaround time from detection to action. For general sexual content, the average response time was 31 minutes. For child sexual exploitation, it was 9,505 minutes, or roughly six days and 14 hours. Almost a week. Snapchat acknowledged the problem, stating, “Moving forward, we have increased the size of our global vendor teams significantly to reduce turnaround times and accurately enforce on reports of potential CSEA.”
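The conversion from minutes to days is straightforward if you want to verify it yourself:

```python
# Converting Snapchat's reported CSEA turnaround time into something readable.
minutes = 9_505
days, rem = divmod(minutes, 24 * 60)   # 1,440 minutes per day
hours, mins = divmod(rem, 60)
print(f"{days} days, {hours} hours, {mins} minutes")  # 6 days, 14 hours, 25 minutes
```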
But we’ve heard that line before. In 2021, Snapchat reported that it was taking action to address the rise in drug activity on the platform by improving its automated moderation systems and “hiring more staff to respond to law enforcement for criminal investigations.” However, a 2024 NBC article examined Snapchat’s employment data and found that “in 2019, the company had 763 workers, including full-time employees and contract workers, doing safety and moderation tasks. By 2021, that number had increased to 3,051. But in 2022 it shrunk to 2,592, and in 2023 shrunk again to 2,226.” That directly contradicts Snapchat’s claim that it was going to grow its safety teams.
Did Snapchat decide to rely on AI instead? In its 2024 Transparency Report, Snapchat states that it “started including drug violation” data in its 2022 transparency report, but I could not find the 2022 report online. So instead, we can look at the 2023 report, which did not break down drug violations by whether they were caught by automated systems or reported by users. In 2023, Snapchat reported a total of 642,421 drug violations. In 2024, that number jumped to 1,563,386, with 1,110,704 caught by automation and another 452,682 reported by users. That’s a 143% increase in total reports between 2023 and 2024.
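The 143% figure comes straight from the reported numbers:

```python
# Year-over-year change in reported drug violations.
violations_2023 = 642_421
violations_2024 = 1_110_704 + 452_682      # automation + user reports = 1,563,386
increase = (violations_2024 - violations_2023) / violations_2023
print(f"{increase:.0%}")                   # ~143%
```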
We can’t say for sure whether this means drug activity on the platform actually rose or whether Snapchat simply got better at catching it. Monthly active users did grow by roughly 50 million between 2023 and 2024. One concerning trend stands out, though: the average turnaround time for drug-related violations more than tripled, from 28 minutes in 2023 to 94 minutes in 2024.
Looking at the numbers, it’s hard to believe Snapchat when it claims to care about the safety of its users. Snapchat notes in its 2024 report that it provides resources to users in need: “Our Here For You search tool shows resources from expert local partners when users search for certain topics related to mental health, anxiety, depression, stress, suicidal thoughts, grief and bullying,” and there is “a page dedicated to financial sextortion and other sexual risks and harms.” These efforts come across as performative at best, especially when the scale of the harm and the slow pace of response are taken into account.
By choosing not to invest in better content moderation or in growing its Trust & Safety teams, Snapchat shows that it is not taking its responsibility seriously. As a society, we need to demand better of our social media platforms. The risk of harm is too great to allow them to be this negligent.