Why Is My TikTok So Different from Yours? How Algorithms Are Reshaping Public Debate

Recommendation algorithms are fragmenting our shared public sphere while simultaneously creating new forms of publicness.

Open TikTok next to a friend and you’ll notice that the two of you see very different content. You might be immersed in cooking tips, K-pop dance, or gaming highlights, while your friend’s feed features political protests, environmental actions, or life advice. These seemingly random recommendations are in fact governed by the recommendation algorithm, which determines what we see, hear, and ignore.

The Internet was originally imagined as a global public square where people from different backgrounds could share information and exchange views. Many hoped it would break free of traditional constraints and enable a freer, more inclusive form of communication. That vision no longer matches reality. Digital media has reshaped the way humans connect with the world: we are no longer exposed to the same news or culture, but to a stream of personalized content tailored by algorithms. When people no longer share a basic understanding of events or even the same information channels, what is left of the so-called public space?

Image by u_icjer0igil from Pixabay

Some movements have gained widespread attention through algorithmic dissemination, such as #MeToo and #BlackLivesMatter, but these are exceptions. Most of the time, algorithms simply aim to prolong users’ time on the platform and push more content. They trap people in a repetitive, monotonous information loop that makes it difficult to encounter new perspectives.

Recommendation algorithms have had a serious negative impact on public conversation: they exacerbate social divisions, entrench existing biases, and mask the political and commercial motives underneath. Although these algorithms appear to connect people, they actually weaken democratic dialogue. Protecting the space for public discussion is a structural challenge that requires political intervention.

Fragmenting the ‘Common World’

During the early stages of the Internet, the information available to everyone was broadly similar. Websites and blogs were updated chronologically, and people could come across a range of viewpoints simply by browsing. This open format allowed differences and conversation to coexist, sustaining a shared information world. Even when opinions diverged, people still felt they were communicating in the same space.

The current online environment is completely different. Platforms such as Facebook and Instagram do not optimise for diversity; they rely on personalized algorithms that predict users’ preferences and keep pushing similar content based on past clicks and dwell time. Before long, people rarely encounter opposing viewpoints and become trapped in “filter bubbles”. The “square” that everyone once shared has been divided into isolated realities.
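To make the mechanism concrete, here is a minimal sketch in Python of an engagement-driven ranking loop. It is a toy illustration, not any platform’s actual system: the item fields, the affinity score, and the example history are all assumptions. What it shows is that when ranking rewards only predicted engagement with familiar topics, the feed narrows on its own.

```python
# A minimal sketch (hypothetical, not any platform's real recommender) of
# engagement-driven ranking: topics a user already lingered on get boosted,
# so the feed narrows over time.
from collections import Counter

def rank_feed(candidate_items, watch_history):
    """Score candidates by how often their topic appears in past engagement."""
    topic_affinity = Counter(item["topic"] for item in watch_history)

    def score(item):
        # Predicted engagement ~ prior affinity for the topic;
        # nothing in this score rewards novelty or diversity.
        return topic_affinity[item["topic"]]

    return sorted(candidate_items, key=score, reverse=True)

history = [{"topic": "cooking"}] * 5 + [{"topic": "politics"}]
candidates = [{"id": 1, "topic": "cooking"},
              {"id": 2, "topic": "politics"},
              {"id": 3, "topic": "climate"}]
print(rank_feed(candidates, history))  # cooking first; unseen topics sink to the bottom
```

Because every interaction feeds back into the affinity counts, each round of engagement makes the next feed more homogeneous; this is the filter-bubble dynamic in its simplest form.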

Eli Pariser TED Talk on the concept of filter bubbles

This change is profoundly influencing democratic life. When people do not encounter different voices, they become more entrenched in their original positions and less willing to consider other perspectives. Algorithms keep people within familiar content and push users into “echo chambers”, where the same viewpoints are repeatedly amplified. As these circles narrow, difference is no longer an opportunity for exchange but a threat. The public online space gradually loses its capacity for dialogue and compromise, leaving only parallel worldviews: people believe in different “facts” and interpret the same world in completely opposite ways.

Photo by SCARECROW artworks on Unsplash

Behind these platforms lies the logic of “surveillance capitalism”: users’ activity is comprehensively collected and used to decide what to push next. Content that triggers our emotions spreads faster than posts that require deeper thought, so the voices that genuinely aid understanding are drowned out. As this model continues to operate, the space for complex discussion is steadily compressed, and public dialogue becomes a reflection of immediate emotion.

Ultimately, algorithms do not just affect people’s satisfaction with online activity; they reshape the way people interpret the world. When the shared information environment splinters into individualized spaces, the possibility of establishing common understanding disappears with it. If platform design continues to centre on traffic and emotional intensity, the Internet will only deepen social division and mistrust. Rebuilding meaningful dialogue will require rethinking the systems that control access and attention, so that they serve public life rather than private profit.

Amplifying Bias, Weakening Democracy

Algorithms not only record reality; they reshape it. They convert social patterns into digital rules and embed the platform’s commercial goals within those systems. Most algorithms are trained on historical data, and when that data reflects inequality, the algorithm repeats it. Amazon’s recruitment tool is a classic example. Because its training data came largely from a male-dominated industry, it gradually learned to equate “male” with “career success”. Résumés with feminine markers were scored lower, and the tool was eventually abandoned. The problem was not that the computer maliciously discriminated against women, but that the system treated past biases in the data as facts, allowing old injustices to continue in a new form.
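The dynamic is easy to reproduce in miniature. The sketch below is a hypothetical Python example, not Amazon’s actual system: the résumé snippets, the labels, and the choice of a simple bag-of-words logistic regression are all invented for illustration. It shows only that a model trained on biased hiring outcomes will penalise a new résumé for containing a gendered phrase, even when the qualifications are otherwise identical.

```python
# Toy illustration (invented data): a classifier trained on biased historical
# hiring outcomes reproduces that bias when scoring new resumes.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical "historical" resumes: past hires (label 1) skew male-coded.
resumes = [
    "captain of men's chess club, software engineer",    # hired
    "fraternity president, backend developer",            # hired
    "women's coding society lead, software engineer",     # rejected
    "women's robotics team captain, backend developer",   # rejected
]
hired = [1, 1, 0, 0]

vec = CountVectorizer()
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Two equally qualified candidates, differing only in one gendered phrase.
candidates = ["chess club captain, software engineer",
              "women's chess club captain, software engineer"]
scores = model.predict_proba(vec.transform(candidates))[:, 1]
print(dict(zip(candidates, scores.round(2))))  # the "women's" resume scores lower
```

Nothing in this pipeline “intends” to discriminate; the word “women’s” simply correlates with rejection in the historical data, and the model passes that correlation forward as if it were a fact about merit.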

The danger lies in the opacity of these systems. Algorithms operate silently, without clear rules or signals; users see the results, not the underlying logic. For instance, searching Google Images for “CEO” or “scientist” mostly returns images of men, while searching for “nurse” or “receptionist” mostly returns images of women. At first glance these results seem neutral and harmless, but over time they subtly shape people’s sense of who holds power and who is subordinate, deepening existing social inequalities.

Video on gender stereotypes in image search

This influence is not limited to perceptions of occupations; it also affects how credible people find social media and how they understand political discourse. Research shows that when people engage with right-wing topics, YouTube’s recommendation system often leads them toward more extreme, sensational content, even if they started with neutral videos. This pattern makes it harder to distinguish fact from opinion and erodes trust in the information environment.

This is not a minor design flaw; it is a political question about how power is distributed in public discourse. Platforms are no longer neutral channels for transmitting information; they act as “gatekeepers”, and their algorithms now redefine public knowledge by determining which information counts as credible, which narratives dominate, and which viewpoints are excluded.

Photo about algorithms as gatekeepers of information by Gaspar Uhas on Unsplash

All of this occurs in an environment lacking transparency and democratic oversight: platforms prioritise engagement over fairness and accountability, users lose their say, and what appears in public spaces is shaped by commercial interests rather than civic values. We must therefore hold platforms accountable for the hidden choices behind content visibility and reach.

Case Study: TikTok and the Performance of Political Participation

On TikTok, political information has not vanished; it has been reshaped to fit the platform’s visual and aesthetic style. It no longer appears as dull policy speeches or debates, but as audio clips, lip-sync performances, or satirical skits. During the 2022 Australian federal election, young users’ exposure to political information varied enormously with their browsing habits. Some were bombarded with star emojis, while others saw funny videos about rent or slogans supporting Indigenous rights. Public discussion of the election was not driven by shared civic dialogue but shaped by a recommendation system selecting whatever could best attract attention. TikTok does not prioritise political significance. It optimises for interaction, and in doing so, fragments public understanding.

User-created humorous TikTok video

Alongside this shift, youth political enthusiasm reached record highs. Data from the Australian Electoral Commission shows that enrolment among 18 to 24-year-olds increased to 85.4%. But higher participation did not always translate into informed choices. Many relied on quick, fragmented videos to understand complex issues, often mistaking popularity for trustworthiness. Nearly half of users under 30 now turn to TikTok for political updates, but much of what they see lacks depth. The platform makes politics feel immediate and emotionally charged, but rarely explains how policies work or what parties represent. Instead of gaining real understanding, users often leave with fragments of feeling rather than clarity.

TikTok use for news and politics by age group

Political actors adapted quickly. Instead of promoting detailed policy platforms, candidates now post short clips with filters, music, and humour to stay visible. The aim is no longer to persuade but simply to be noticed. A viral dance might carry a campaign slogan, but it offers little context or analysis. Style outweighs substance, and politicians and voters play by the same algorithmic rules.

This does not mean young people are disengaged. On the contrary, it reflects a different mode of participation. Expression replaces deliberation, and emotional reaction stands in for careful evaluation. Users engage in ways that match the platform’s fast-moving pace. On the surface, it looks like a political awakening. But in reality, it may be more performance than conversation. In a system where attention is the metric of success, space for informed citizenship is quietly being dismantled.

Counterpoint: Platformed Activism and Algorithmic Visibility

Algorithms are often criticized for fueling division, but they have also driven some of the most influential social movements of recent years. #MeToo, #BlackLivesMatter, and the global climate strikes spread largely through algorithmic dissemination on TikTok, Instagram, and Twitter rather than through traditional newspapers or television. In Australia, content about Indigenous land rights has reached well beyond its original community because recommendation systems pushed activists’ posts to users who would never have sought out that information. Emotionally resonant testimonies, slogans, and images were rapidly amplified and reached people who had previously kept their distance from politics.

Photo of BlackLivesMatter by Teemu Paananen on Unsplash

During the 2019–2020 bushfires, the hashtag #AustraliaOnFire was widely retweeted on Twitter, documenting both environmental and cultural losses and often providing real-time updates faster than mainstream media. In moments of crisis, social platforms can concentrate information quickly, enabling people to organise public discussion, coordinate action, and raise attention. Yet however powerful these examples are, they do not prove that algorithms were designed to support democratic discussion. Platforms made this content popular not out of civic values, but because it was emotionally charged and attention-grabbing.

Screenshot of Twitter feed during #AustraliaOnFire, taken by the author

In the vast flow of information, complex issues are often converted into forms that grab attention more easily: shocking visuals or personal stories that quickly evoke empathy. Such content is more eye-catching, but it often obscures necessary background and in-depth explanation. Over time, public issues are reduced to a single emotion, while key details and arguments fade from view. The same algorithmic amplification can just as readily spread bullying, conspiracy theories, and misinformation, all under the guise of “fairness”.

This reveals a deeper issue: digital activism may indeed spread rapidly through algorithms, but its position is fragile, depending on the operating logic of the platform rather than on any alignment with democratic ideals. In the absence of institutional accountability, the technology that makes protest visible today may become a tool to suppress it tomorrow.

Conclusion: Reclaiming Democratic Control

Addressing algorithm-driven polarization in democratic countries requires reform on three fronts at once: policy, platform design, and public education. First, beyond emphasizing “transparency”, platforms must be held accountable for how their algorithms affect public discourse. The EU’s Digital Services Act has taken a step forward by requiring platforms to conduct impact assessments and allowing researchers to access data. Australia’s Online Safety Act points in a similar direction, but making it meaningful will require stricter enforcement and a clear commitment to the public interest, not just superficial risk mitigation. Regulators must exercise closer supervision.

Second, public institutions should play a greater role in building alternative platforms. Public broadcasters such as the ABC or SBS, working with universities and libraries, could build non-commercial digital spaces that place civic education and user experience at their core. The value of such platforms would lie not in click counts but in listening to users, and in contributing to the public interest and democratic life.

Third, citizens’ digital literacy must be strengthened so that they understand how algorithmic biases shape their worldview, diversify their information sources, and stay alert to the voices being excluded. Digital literacy is not merely a technical skill but a public practice: citizens should be able to understand the media, participate actively in communication, and help shape their own and society’s roles in public life.

Photo of students learning digital literacy by Ron Lach

These plans cannot be accomplished through technology alone; political determination is needed to challenge the platforms’ profit-driven incentives. Algorithms are now deeply embedded in public life and have become a new kind of social infrastructure. The real question is whether we govern these systems democratically so that they serve the public interest, or allow ourselves to be carried along by them, dominated by commercial logic and technical decisions.

Reference List:

@channyvee89. (2024, September 15). [TikTok video]. TikTok. https://www.tiktok.com/@channyvee89/video/7500194738176314631

Association for Computing Machinery. (2015, May 18). Unequal representation and gender stereotypes in image search results for occupations [Video]. YouTube. https://youtube.com/watch?v=xeZherjrfSE&t=6s

Australia. (2021). Online Safety Act 2021 (Cth). Federal Register of Legislation.

Chowdhury, I. (2022, April 19). What will young Australians do with their vote? Australian National University. https://www.anu.edu.au/news/all-news/what-will-young-australians-do-with-their-vote

Dastin, J. (2018, October 10). Amazon scrapped ‘sexist AI’ recruiting tool. Reuters. https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G

European Commission. (2022). The Digital Services Act explained [Video]. European Union Audiovisual Service. https://audiovisual.ec.europa.eu/corporateplayer/index.html?video=I-200453&language=EN

European Commission. (2022). The Digital Services Act Package. European Union.

Flew, T., Bruns, A., Burgess, J., Crawford, K., & Shaw, F. (2014). Social media and its impact on crisis communication: Case studies of Twitter use in emergency management in Australia and New Zealand. In 2013 ICA Shanghai Regional Conference: Communication and Social Transformation (pp. 1–1).

Gao, Y., Liu, F., & Gao, L. (2023). Echo chamber effects on short video platforms. Scientific Reports, 13(1), 6282.

Gillespie, T. (2018). All platforms moderate. In Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media (pp. 1–23). Yale University Press. https://doi.org/10.12987/9780300235029

Gorwa, R. (2019). The platform governance triangle: Conceptualising the informal regulation of online content. Internet Policy Review, 8(2). https://doi.org/10.14763/2019.2.1407

Haroon, M., Chhabra, A., Liu, X., Mohapatra, P., Shafiq, Z., & Wojcieszak, M. (2022). YouTube, the great radicalizer? Auditing and mitigating ideological biases in YouTube recommendations. arXiv. https://arxiv.org/abs/2203.10666

Kay, M., Matuszek, C., & Munson, S. A. (2015). Unequal representation and gender stereotypes in image search results for occupations. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI ‘15), 3819–3828. https://doi.org/10.1145/2702123.2702520

Lach, R. (n.d.). Students learning digital literacy [Photograph]. Pexels. https://www.pexels.com/photo/students-learning-digital-literacy-10638075/

Leimbach, T., & Palmer, J. (2022). #AustraliaOnFire: Hashtag activism and collective affect in the Black Summer fires. Journal of Australian Studies, 46(4), 496–511. https://doi.org/10.1080/14443058.2022.2121744

Li, M., Suk, J., Zhang, Y., Pevehouse, J. C., Sun, Y., Kwon, H., Lian, R., Wang, R., Dong, X., & Shah, D. V. (2024). Platform affordances, discursive opportunities, and social media activism: A cross-platform analysis of #MeToo on Twitter, Facebook, and Reddit, 2017–2020. New Media & Society, 0(0). https://doi.org/10.1177/14614448241285562

Literat, I., & Kligler-Vilenchik, N. (2023). TikTok as a key platform for youth political expression: Reflecting on the opportunities and stakes involved. Social Media + Society, 9(1). https://doi.org/10.1177/20563051231157595

Mihailidis, P., & Thevenin, B. (2013). Media literacy as a core competency for engaged citizenship in participatory democracy. American Behavioral Scientist, 57(11), 1611–1622. https://doi.org/10.1177/0002764213489015

Mundt, M., Ross, K., & Burnett, C. M. (2018). Scaling social movements through social media: The case of Black Lives Matter. Social Media + Society, 4(4). https://doi.org/10.1177/2056305118807911

Noble, S. U. (2018). A society, searching. In Algorithms of Oppression: How Search Engines Reinforce Racism (pp. 15–63). NYU Press.

Oden, A., & Porter, L. (2023). The kids are online: Teen social media use, civic engagement, and affective polarization. Social Media + Society, 9(2). https://doi.org/10.1177/20563051231186364

Paananen, T. (n.d.). BlackLivesMatter protest with people holding signs [Photograph]. Unsplash. https://unsplash.com/photos/grayscale-photo-of-city-buildings-rd5uNIUJCF0

Pariser, E. (2011, March). Beware online “filter bubbles” [Video]. TED Conferences. https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles

Pew Research Center. (2023, June 29). Social media, online activism and 10 years of #BlackLivesMatter. https://www.pewresearch.org/internet/2023/06/29/blacklivesmatter-turns-10/

Pew Research Center. (2024, August 20). About half of TikTok users under 30 say they use it to keep up with politics/news. https://www.pewresearch.org/short-reads/2024/08/20/about-half-of-tiktok-users-under-30-say-they-use-it-to-keep-up-with-politics-news/

van Dijck, J., Poell, T., & de Waal, M. (2018). The platform society as a contested concept. In The platform society: Public values in a connective world (pp. 5–32). Oxford University Press.

SCARECROW artworks. (n.d.). Multiple screens with spiral patterns [Photograph]. Unsplash. https://unsplash.com/photos/woman-in-white-shirt-sitting-on-chair-eJ93vVbyVUo

School Strike 4 Climate. (2023). About Our Movement. Global Climate Strike Network.

TikTok. (n.d.). For You. TikTok Help Center. Retrieved September 20, 2025, from https://support.tiktok.com/en/getting-started/for-you

u_icjer0igil. (n.d.). Image symbolizing algorithmic control over information visibility [Photograph]. Pixabay. https://pixabay.com/users/u_icjer0igil-44310198/?utm_source=link-attribution&utm_medium=referral&utm_campaign=image&utm_content=8820274

Uhas, G. (n.d.). Algorithms as gatekeepers of information [Photograph]. Unsplash. https://unsplash.com/@gasparuhas

Wikipedia contributors. (2025, January 17). MeToo movement. In Wikipedia. https://en.wikipedia.org/wiki/MeToo_movement

Zuboff, S. (2023). The age of surveillance capitalism. In Social theory re-wired (pp. 203–213). Routledge.
