TikTok removed 2.4 million videos from Nigerian users in the fourth quarter of 2024 for violating its content policies, the company revealed in its latest Community Guidelines Enforcement Report.
Nigeria was among the top 50 countries where policy-violating content originated during the period. However, the highest number of violations was recorded in the United States, with 8.5 million videos removed. Globally, TikTok took down a total of 153 million videos in Q4 2024.
According to the report, the top 50 markets accounted for approximately 90 percent of all content removals. The affected videos were found to have breached TikTok’s policies on Integrity and Authenticity, Privacy and Security, Mental and Behavioural Health, Safety, and Civility, among others.
In addition to content takedowns, TikTok said it removed 211.5 million accounts during the quarter, most of them suspected to be fake or operated by users under the age of 13.
Fake accounts made up the largest share, with 185.3 million removed. Additionally, 20.5 million accounts suspected to belong to underage users were taken down, while 5.6 million accounts were removed for other unspecified reasons.
“We remain vigilant in our efforts to detect external threats and safeguard the platform from fake accounts and engagement. These threats persistently probe and attack our systems, leading to occasional fluctuations in the reported metrics within these areas.
“In Q4 2024, we also updated the way we classify fake likes and followers removed over time, which contributed to some of the overall increases observed in these categories. This update better reflects the scale of our existing work to promptly identify and remove any accounts, content, or activities that seek to artificially boost popularity on our platform,” TikTok stated in the report.
Despite its enforcement measures, TikTok continues to face scrutiny from regulators worldwide over content moderation and user safety.
In October 2024, 13 U.S. states and the District of Columbia filed lawsuits against the company, accusing it of failing to protect young users from harm. The suits, filed separately in New York, California, the District of Columbia, and 11 other states, allege that TikTok's platform is designed to be addictive, exploiting children's vulnerabilities for profit.