TLDR:
- TikTok was aware of risks to young users but downplayed them publicly
- Internal documents show TikTok tracked extensive usage metrics for kids/teens
- Time management tools were largely ineffective at reducing screen time
- TikTok allegedly prioritized “beautiful people” in its algorithm
- Content moderation metrics were described internally as “largely misleading”
TikTok, the popular short-form video app, is at the center of controversy after internal documents and communications came to light suggesting the company was well aware of the risks its platform posed to young users.
These details emerged from a lawsuit filed by the state of Kentucky; portions of the complaint that were inadvertently left unredacted shed light on TikTok’s internal practices and knowledge.
The documents allege that TikTok had quantified how quickly young users became hooked on the platform, referring to a “habit moment” that occurs when users have watched 260 videos or more during their first week on the app.
This milestone could be reached in as little as 35 minutes, given that some TikTok videos are as short as 8 seconds.
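The ~35-minute figure follows directly from those two numbers. A minimal sketch of the back-of-envelope arithmetic, assuming (per the complaint) that every video runs at the 8-second minimum:

```python
# Back-of-envelope check of the "habit moment" timing cited in the complaint.
# Assumes every video runs at the 8-second minimum, i.e. the fastest possible case.
videos_to_habit = 260    # videos watched in the first week, per the complaint
min_video_seconds = 8    # shortest TikTok video length cited

minutes = videos_to_habit * min_video_seconds / 60
print(f"Fastest path to the 'habit moment': {minutes:.1f} minutes")
# -> Fastest path to the 'habit moment': 34.7 minutes (roughly 35)
```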
Internal presentations from spring 2020 indicated that TikTok had already “hit a ceiling” among young users, with estimates showing at least 95% of smartphone users under 17 using TikTok at least monthly. The company closely tracked metrics for young users, including time spent watching videos and daily active users, using this information to refine its algorithm and drive user engagement.
TikTok’s internal studies, conducted by a group called “TikTank,” noted that compulsive usage was “rampant” on the platform. An unnamed executive was quoted acknowledging the algorithm’s effectiveness while expressing concern about its impact on activities such as sleep, eating, and interacting with others.
The lawsuit also calls into question the effectiveness of TikTok’s publicized safety measures. In March 2023, TikTok introduced a 60-minute daily screen time limit for minors, presenting it as a tool to help teens manage their time on the platform.
However, the complaint argues that this feature was more of a public relations tool than an effective measure to reduce screen time.
Internal metrics for measuring the success of the time-limit feature did not include reducing teen screen time as a primary goal. Instead, the first metric was “improving public trust in the TikTok platform via media coverage.” An experiment conducted by TikTok found that the time-limit prompts reduced average teen usage from 108.5 to 107 minutes per day, a mere 1.5-minute decrease.
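To put that decrease in perspective, a quick calculation using the experiment’s reported figures shows the prompts moved usage by well under 2%:

```python
# Relative effect of the time-limit prompts, computed from the figures
# reported from TikTok's internal experiment as cited in the complaint.
before = 108.5   # average daily teen usage in minutes, without prompts
after = 107.0    # average daily teen usage in minutes, with prompts

drop = before - after
pct_drop = drop / before * 100
print(f"Reduction: {drop:.1f} minutes/day ({pct_drop:.1f}%)")
# -> Reduction: 1.5 minutes/day (1.4%)
```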
The complaint alleges that TikTok executive Zhu Wenjia approved the feature only on the condition that its impact on TikTok’s “core metrics” would be minimal. This suggests that the company prioritized engagement over meaningful reductions in screen time for young users.
Beyond screen time concerns, the lawsuit touches on other controversial aspects of TikTok’s operations. It alleges that the platform’s algorithm “prioritized beautiful people” despite internal acknowledgment that this could “perpetuate a narrow beauty norm.” The complaint states that TikTok modified its algorithm after an internal report noted a high “volume of … not attractive subjects” in the app’s main “For You” feed.
The lawsuit also raises concerns about TikTok’s content moderation practices. Internal communications allegedly describe the company’s moderation metrics as “largely misleading” because they don’t account for content that evades detection.
The complaint cites significant “leakage” rates for harmful content, reporting that approximately 36% of content normalizing pedophilia and 50% of content glorifying minor sexual assault went unmoderated.
TikTok has responded to these allegations, with spokesperson Alex Haurek stating that the complaint “cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.”
The company maintains that it has implemented robust safeguards, including proactive removal of suspected underage users and voluntary safety features such as default screen time limits and family pairing options.
The revelations come at a challenging time for TikTok, as it faces multiple legal battles and potential bans in various countries. The platform is currently fighting a U.S. federal law that could force its sale or ban, citing concerns over its China-based parent company, ByteDance. TikTok argues that such a ban would infringe on the free speech rights of its millions of U.S. users.