Meta allegedly let sex trafficking accounts get 16 strikes before a ban
The lawsuit paints a picture of Big Tech companies treating engagement like it’s the only vital sign that matters.
According to a newly unredacted court filing, Meta apparently believed in second chances. And third chances.
And fourth. And, well, up to sixteen chances, even for accounts allegedly involved in human sex trafficking.
The revelation comes from testimony by Vaishnavi Jayakumar, Meta’s former head of safety and well-being, in a massive child safety lawsuit brought by US school districts.
In her deposition, Jayakumar said that accounts could rack up 16 violations for prostitution and sexual solicitation before finally getting suspended on number 17.
You know, just a casual warning system for crimes, like a punch card at a frozen yogurt shop.
The court filing claims internal documentation backed up this policy, which Jayakumar described as having a “very high strike threshold” compared to industry standards.
Most companies don’t give you 16 tries to stop doing something extreme and illegal.
And that’s just the appetizer.
The unredacted documents also allege that Instagram, for an alarmingly long time, didn’t even have a specific way for users to report child sexual abuse material (CSAM).
When Jayakumar raised this issue internally, she claims she was told it would be too much work to build the tools needed to handle all those reports.
Apparently, child safety lost to “engineering bandwidth.”
The lawsuit, which also targets Google, TikTok, and Snap, keeps returning to the same theme: if a safety change threatened to tank user activity, the filing suggests, it had a tough time making it out of the meeting room alive.
In 2019, Meta reportedly discussed making teen accounts private by default to reduce harassment.
But according to the lawsuit, the idea was scrapped after being flagged as something that would “likely smash engagement.”
Another internal study allegedly found that hiding like counts made users feel significantly better about themselves.
Sounds great, right? Well, the idea was reportedly pulled back because it was “pretty negative to FB metrics,” which is possibly the most Meta sentence ever written.
It gets darker: Meta is also accused of bringing back beauty filters it had briefly considered removing, because getting rid of them might hurt growth, even though its own researchers found the filters worsened body image issues in young girls.
Meta, unsurprisingly, says the claims are misleading.
A company spokesperson told The Verge that these allegations rely on “cherry-picked quotes” and ignore years of work on teen safety features and parental controls.
Still, the unredacted filing has added fuel to the ongoing legal and political pressure on Meta over its role in teen mental health and online safety.
The lawsuit claims the question Meta kept asking wasn’t “Is this safe for kids?” but “Yeah, but… will people log off?”
And apparently, that question has a minimum of 16 wrong answers before the platform takes it seriously.
