When Trust Isn’t Spoken: Algorithmic Bias in TikTok Searches

A report from the Institute for Strategic Dialogue says TikTok recommends hate

Photo by BoliviaInteligente on Unsplash

I want to thank the reader who shared this report link in response to the article I published.

It’s worth pausing here. Many of us think of TikTok as short dances, quick laughs or bite-sized advice. But under the hood, the same mechanics that decide which clip you see next are quietly shaping collective ideas of truth, danger, morality and identity.

The report shows that search results on TikTok don’t simply reflect what people are looking for. They amplify certain views, suppress others and nudge users toward particular frames of thinking. In other words, the algorithm is not neutral. It behaves like a cultural editor, invisible yet deeply persuasive.

The report (ISD, Feb 2025) explores biases in TikTok’s search behavior across English, French, German and Hungarian. It systematically tracks how derogatory prompts surface biased, demeaning content about marginalized…
