You Are More Than Your Feed: How Social Media Algorithms Shape Gen Z’s Political Identity

By: Emma O'Connor

In a world where social media has become our primary source of news, entertainment, and even political discussion, I sought to understand the extent of the influence these platforms have over what we believe. As a Gen Z college student, I use apps like TikTok and Instagram every day. But what I didn’t realize until I started this project is how little control we have over the content we see and how deeply this affects our political identities.

My final project for JN201/NMC100 began with a question: How do social media algorithms influence political polarization? That idea quickly narrowed into something even more specific and personal: how these algorithms shape political identity, especially among people my age. From that starting point, I researched how platforms use recommendation systems to keep users engaged, often by pushing content that reinforces their existing views. The more I learned, the more I saw just how powerful, and sometimes dangerous, these invisible systems can be.

One thing that surprised me was how early and consistently these algorithms begin shaping our beliefs. According to a 2024 report from the Pew Research Center, 59% of Gen Z rely on social media as their main source of news. That means platforms like TikTok are not just reflecting our interests, they’re defining them. Every video we like, share, or watch for more than a few seconds becomes data that the algorithm uses to decide what to show us next. Over time, this creates what scholars call an “echo chamber,” where we’re only exposed to opinions that match our own.
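To make that feedback loop concrete, here is a toy sketch of how engagement-driven ranking can narrow a feed. This is an illustration only, not any platform's actual system: the function names, topics, and signal weights are all assumptions made up for this example. The idea is simply that strong engagement with one topic raises its score, and the ranker then serves more of that topic, which invites more engagement.

```python
# Toy echo-chamber simulation (illustrative only, not real platform code).
# Topic affinities rise with each engagement signal, and the feed is ranked
# purely by accumulated affinity, so the loop feeds on itself.

def update_affinity(affinity, topic, signal_weight):
    """Boost a topic's score when the user likes, shares, or watches it."""
    affinity[topic] = affinity.get(topic, 0.0) + signal_weight
    return affinity

def recommend(affinity, catalog, k=3):
    """Rank candidate topics by the user's accumulated affinity scores."""
    return sorted(catalog, key=lambda t: affinity.get(t, 0.0), reverse=True)[:k]

catalog = ["politics-left", "politics-right", "sports", "music", "cooking"]
affinity = {}

# Simulate ten sessions where the user engages strongly with one political
# lane (a "like", weight 1.0) and only glances at everything else (0.1).
for _ in range(10):
    feed = recommend(affinity, catalog)
    for topic in feed:
        weight = 1.0 if topic == "politics-right" else 0.1
        affinity = update_affinity(affinity, topic, weight)

print(recommend(affinity, catalog))  # the favored topic now dominates
```

After a handful of simulated sessions, the favored topic has pulled far ahead and the never-engaged topics ("music", "cooking") stop appearing at all, which is the echo-chamber dynamic the paragraph above describes.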

In my research, I found that these effects aren’t evenly distributed. A 2025 study by Ibrahim et al. found that TikTok’s recommendation engine favored Republican content during the 2024 U.S. election, especially among younger male users. At the same time, other research shows that LGBTQ+ creators and BIPOC youth often experience censorship or shadowbanning, a form of content suppression where their posts are less visible to others (Ungless et al., 2024). This means some voices are amplified while others are silenced, not based on truth or accuracy, but based on what the algorithm predicts will keep users scrolling.

What makes this even more troubling is that most people don’t know how algorithms work. We assume we’re in control of what we see online, but we’re not. And that lack of awareness can lead to false confidence in the information we consume, shaping not just our opinions, but our identities. In my infographic, I tried to break this down in a clear, visual way, showing how repeated exposure to curated content forms beliefs, how different groups are affected, and what we can do to push back.

The good news is that there are solutions. On an individual level, we can diversify our media feeds by following creators with different viewpoints, using fact-checking tools like AllSides or NewsGuard, and thinking critically before reposting content. But larger structural changes are needed too: requiring algorithm transparency, adding digital literacy education in schools, and encouraging tech companies to audit their systems for bias. These are not easy fixes, but they’re necessary if we want to reclaim control over our media environments.

Doing this project changed the way I use social media. I’ve become more aware of my feed and more curious about what’s missing from it. I also feel more motivated to advocate for media literacy in schools, something I never really thought about before. The research surprised me, but it also empowered me. Algorithms may be invisible, but that doesn’t mean they’re unstoppable. The more we understand them, the more we can challenge them.

So next time you’re scrolling through TikTok or Instagram, ask yourself: Who chose this content for me? And why? Because the answer might just change the way you see the world and yourself.

Works Cited

  • Ibrahim, Fatima, et al. “TikTok's Influence on the 2024 Election.” Journal of Digital Politics, vol. 9, no. 1, 2025, pp. 44–61.

  • Pew Research Center. “News Use Across Social Media Platforms in 2024.” Pew Research, 2024.

  • Ungless, Mark, et al. “Censorship and Visibility: LGBTQ+ Youth on TikTok.” Youth Media Studies Quarterly, vol. 7, no. 3, 2024.

  • Vox Media. “The Alt-Right Pipeline on TikTok.” Vox.com, Jan. 2025.
