ISD reported the account, along with 49 others, in June for violating TikTok’s policies on hate speech, encouraging violence against protected groups, promoting hateful ideologies, glorifying violent extremism and Holocaust denial. In each case, TikTok initially found no violations and allowed the accounts to remain active.
A month later, TikTok banned 23 accounts, indicating that the platform is removing at least some violating content and channels over time. The 23 banned accounts had garnered at least 2 million views before being removed.
The researchers also created new TikTok accounts to understand how TikTok’s powerful algorithms promote Nazi content to new users.
Using accounts created in late May, the researchers watched 10 videos from a network of pro-Nazi users, clicking through the comments sections but not engaging in any way, such as liking, commenting or bookmarking. They also viewed 10 pro-Nazi accounts. The researchers then switched to the app’s “recommended” feed, and after just three videos, the algorithm suggested a video featuring a World War II-era Nazi soldier overlaid with a graph of the U.S. murder rate and a racial breakdown of perpetrators. They then saw a video of an AI-translated speech by Hitler overlaid with a recruitment poster from a white supremacist group.
Another account created by ISD researchers was served even more extreme content in its main feed, with 70% of its videos either coming from self-described Nazis or featuring Nazi propaganda. After the account followed a number of pro-Nazi accounts to access content on channels set to private, TikTok’s algorithm recommended that it follow other Nazi accounts as well. All 10 of the accounts TikTok initially recommended to it either used Nazi symbols or keywords in their usernames or profile pictures, or featured Nazi propaganda in their videos.
“This isn’t particularly surprising,” said Abby Richards, a disinformation researcher who specializes in TikTok. “These are things we’ve found time and time again. I’ve certainly found them in my research.”
Richards wrote about white supremacist and radical accelerationist content on the platform in 2022, including a TikTok video featuring Paul Miller, a neo-Nazi serving a 41-month sentence on firearms charges, that garnered more than 5 million views and 700,000 likes in the three months it was on the platform before being removed.
Markus Bosch, a researcher at the University of Hamburg who monitors TikTok, told WIRED the report’s findings “aren’t all that surprising” and that he doesn’t expect TikTok to do anything to fix the problem.
“I don’t know exactly where the problem lies,” Bosch said. “TikTok says it has about 40,000 content moderators, so a clear policy violation like this should be easy to spot. But the sheer volume of content and the ability of bad actors to quickly adapt leave me convinced that neither AI nor adding more moderators will ultimately solve the entire problem of disinformation.”
TikTok announced that it has completed a mentorship program with Tech Against Terrorism, an organization that works to disrupt terrorist activity online and has helped TikTok identify online threats.
“Despite aggressive measures, TikTok’s growing popularity continues to make it a target for exploitation by extremist groups,” Adam Hadley, executive director of Tech Against Terrorism, told WIRED. “ISD’s research shows that adversarial asymmetries can allow a small number of violent extremists to wreak havoc on large platforms. This report therefore highlights the need for cross-platform threat intelligence supported by improved AI-powered content moderation. The report also reminds us that Telegram must also be held accountable for its role in the online extremism ecosystem.”
As Hadley noted, the report’s findings point to significant loopholes in the company’s current policies.
“The far right’s use of TikTok, I’ve always described it as a messaging platform,” Richards said, “and more than anything, it’s about repetition — being exposed to the same hateful narratives over and over again, because at a certain point, if you see enough, you start to believe things and it really starts to affect your worldview.”