The Department of Homeland Security reportedly investigated how TikTok handled material that detailed the sexual abuse of children. The Financial Times reported that the Department of Justice also reviewed how predators exploited a privacy feature on the platform.

Material depicting the sexual abuse of children was allegedly traded among private accounts using the platform’s “Only Me” feature, which makes videos visible only to someone logged into the specific profile.

The Times added that Seara Adair, a child safety campaigner, reported the conduct to law enforcement.

TikTok was accused of doing less than other platforms to detect and prevent grooming attempts.

“It is a perfect place for predators to meet, groom and engage children,” said Erin Burke, unit chief of the Child Exploitation Investigations Unit at the Department of Homeland Security’s cybercrime division, who called it the “platform of choice” for that type of behavior.

Burke alleged that international companies like TikTok, which is owned by Chinese parent company ByteDance, often failed to work with law enforcement. “We want [social media companies] to proactively make sure children are not being exploited and abused on your sites — and I can’t say that they are doing that, and I can say that a lot of US companies are,” she stated.

The platform is designed to be popular among teenagers.

For its part, TikTok said that it cooperated with law enforcement “as necessary.”

“TikTok has zero-tolerance for child sexual abuse material,” the company stated. “When we find any attempt to post, obtain or distribute [child sexual abuse material], we remove content, ban accounts and devices, immediately report to NCMEC, and engage with law enforcement as necessary.”

TikTok told CNET in an email that it was not aware of any investigation by the U.S. government.

"We're not aware of any of the government investigations as alleged by the Financial Times, but TikTok has a zero tolerance policy on CSAM. Upon reading this story, we reached out to HSI to begin a dialogue and discuss opportunities to work together on our shared mission of ending child sexual exploitation online -- just as we regularly engage with law enforcement agencies across the country on this crucial topic."