TikTok is under investigation over child sexual abuse material
The DOJ and DHS are investigating the app.

TikTok has found itself at the center of investigations by U.S. government agencies over how the platform handles child sexual abuse material (CSAM).
Both the Department of Homeland Security (DHS) and the Department of Justice (DOJ) are looking into the app.
News of the investigations surfaced via a report from the Financial Times. Sources told the publication that the DHS is investigating the app over its handling of CSAM.
Another source says that the DOJ is looking into a specific privacy feature that predators are exploiting.
TikTok’s skyrocketing popularity has led to moderation struggles

The platform reported only 155,000 videos to the National Center for Missing and Exploited Children (NCMEC) last year. Compare that to Instagram, which has a similar number of users but made more than 3 million reports in the same time frame, and it's clear that TikTok is lagging behind.
TikTok has responded, noting:
“TikTok has zero-tolerance for child sexual abuse material. When we find any attempt to post, obtain or distribute [child sexual abuse material], we remove content, ban accounts and devices, immediately report to NCMEC, and engage with law enforcement as necessary. We are deeply committed to the safety and wellbeing of minors, which is why we build youth safety into our policies, enable privacy and safety settings by default on teen accounts, and limit features by age.”
Erin Burke, head of the child exploitation investigations unit at DHS, called TikTok the “platform of choice” for predators. “It is a perfect place for predators to meet, groom, and engage children,” she told the Financial Times.
Burke continued, adding that international companies are more reluctant to help law enforcement in the U.S. “We want [social media companies] to proactively make sure children are not being exploited and abused on your sites — and I can’t say that they are doing that, and I can say that a lot of US companies are,” she said.
Hopefully, these investigations prove productive and lead to real changes in TikTok’s moderation. It’s clear the platform isn’t doing enough to deter this kind of content, and there’s no excuse for that.
