TikTok designed to be an addiction machine, internal documents reveal
TikTok reportedly ignored the adverse mental health impact its features could have on teen users. Executives and employees at TikTok knew that the app's features promoted compulsive use, according to documents reviewed by NPR from a lawsuit filed by Kentucky Attorney General Russell Coleman. The lawsuit accuses TikTok of "falsely claiming [that it's] safe for young people," with Coleman saying the app was "specifically designed to be an addiction machine, targeting children who are still in the process of developing appropriate self-control."
Internal research reveals negative mental health effects
TikTok's own research reportedly found that compulsive usage of the app correlates with several negative mental health effects, including increased anxiety and the loss of analytical skills, memory formation, contextual thinking, empathy, and conversational depth. The documents also revealed that TikTok executives were aware that compulsive use can interfere with sleep, school, and work responsibilities, and even "connecting with loved ones."
Time-management tool ineffective
Internal documents have revealed that TikTok's time-management tool, which limits app usage to 60 minutes a day, has been largely ineffective at curbing screen time among teens. Despite the tool's implementation, teens were still spending an average of 107 minutes on the app daily. The company reportedly measured the tool's success by how it "improved public trust in the TikTok platform via media coverage," while acknowledging privately that "minors do not have executive function to control their screen time."
Existence of dangerous 'filter bubbles' acknowledged
TikTok is reportedly aware of the existence and potential dangers of "filter bubbles" on its platform. According to internal studies, users can be drawn into negative filter bubbles, such as those focusing on painful ("painhub") and sad ("sadnotes") content, within 30 minutes of use in a single sitting. The company's researchers also noted that the way TikTok's algorithm operates can promote "thinspiration," content associated with disordered eating.
Struggles with content moderation
TikTok is reportedly grappling with content moderation issues, according to the lawsuit documents. An internal investigation revealed that underage girls were receiving "gifts" and "coins" in exchange for live stripping. The company's higher-ups allegedly instructed moderators not to remove users reported to be under 13 unless their accounts explicitly stated their age. NPR reports that TikTok acknowledged that a significant amount of content violating its rules slips through its moderation techniques, including videos normalizing pedophilia and glorifying the sexual assault of minors.
Lawsuit alleges TikTok prioritizes 'beautiful people'
The lawsuit also claims that TikTok has favored "beautiful people" on its platform. The complaint states that the company altered its algorithm after an internal report observed a high "volume of ... not attractive subjects" in the app's main "For You" feed. It also accuses TikTok of publicizing content-moderation metrics that are "largely misleading," indicating a gap between the company's public image and internal practices.
TikTok defends practices amid lawsuit
In response to the allegations, TikTok spokesperson Alex Haurek defended the company's commitment to community safety. He argued that the Kentucky AG's complaint "cherry-picks misleading quotes and takes outdated documents out of context." Haurek also highlighted TikTok's "robust safeguards," including the proactive removal of suspected underage users, and noted that the company voluntarily launched safety features such as default screen-time limits, family pairing, and privacy by default for minors under 16.