TIKTOK’S DEEPENING SCRUTINY: SYSTEMIC RISKS AND THE ALGORITHMIC ECONOMY OF HARM

Amnesty International’s October 2025 report adds new weight to growing evidence that TikTok’s recommendation systems endanger minors through what may be described as algorithmic grooming — the rapid funneling of young users toward emotionally charged and self-destructive content.

Using test accounts registered as 13-year-olds in France, researchers observed that within 15–20 minutes the “For You” feed became saturated with videos about anxiety, depression, and self-harm. After three to four hours, the same accounts were exposed to material that explicitly romanticized or normalized suicide. Amnesty argues that such exposure patterns are not isolated moderation errors but systemic symptoms of the platform’s engagement-driven design. The study suggests that the algorithm itself functions as an emotional amplifier — rewarding vulnerability, sensationalism, and distress because these states maximize attention and time-on-screen.

From the perspective of E.U.LABORATORY’s critical framework, this case exemplifies the economic unconscious embedded in digital architectures: platforms transform human fragility into a profitable behavioral signal. What is sold to advertisers is not merely visibility, but psychic activation — states of anxiety and excitation that sustain interaction loops. The European Union’s Digital Services Act (DSA) explicitly requires very large online platforms to assess and mitigate “systemic risks,” including risks to mental health and to minors, and to undergo annual independent audits of their compliance, including the design of their recommender systems.

The Amnesty findings thus not only expose ethical lapses but potentially reveal non-compliance with European regulation. The key question becomes whether the algorithmic infrastructure of attention can ever be aligned with psychological safety when its core business logic depends on stimulation, comparison, and emotional volatility.

In this sense, TikTok’s case should be seen less as an isolated failure than as an indicator of a structural disorder within cybercapitalism: the monetization of affect and the industrialization of psychic distress. Regulating this economy of harm requires not only oversight of moderation policies but a deeper transformation of recommendation logic — where well-being replaces engagement as the metric of value.

REFERENCES

In 2023, Amnesty International published two complementary reports, “Driven into the Darkness: How TikTok Encourages Self‑harm and Suicidal Ideation” and “‘I feel exposed’: Caught in TikTok’s surveillance web,” highlighting abuses suffered by children and young people using TikTok.

Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act).
