1. Introduction
Since 2020, many people have expressed a sense that society has become more predictable, less spontaneous, and increasingly fragmented. Conversations about “NPCs,” “echo chambers,” and “algorithmic culture” have entered mainstream discourse. These intuitions raise a deeper question: has the structure of culture itself changed?
This paper investigates how recommendation algorithms—now the dominant mediators of cultural consumption—shape social behaviour, cultural diversity, and the emergence of new ideas. Using concepts from chaos theory, memetics, sociology, and digital anthropology, the study examines whether algorithmic systems reduce “social chaos,” homogenise behaviour, and contribute to the rise of predictable cultural roles.
It also contrasts its findings with those of the essay “Are ideas alive? How historical civilisations were governed by ideas” and extrapolates them in a sociological context.
The analysis draws on two pillars:
- Theoretical synthesis
  - social chaos theory
  - memetics and cultural capital
  - diffusion of innovations
  - egregores and symbolic structures
  - postmodern hyperreality
- Empirical investigation
  - a pilot YouTube recommendation experiment
  - cross‑validation with existing academic literature
The goal is not only to understand how algorithms influence culture, but also to situate these findings within a broader theory of how ideas evolve in digital environments.
Scope & Limitations
This study is exploratory rather than conclusive. Its goal is to identify qualitative patterns in algorithmic behaviour and situate them within broader sociological and memetic frameworks. The pilot experiment is intentionally minimal, serving as a conceptual probe rather than a statistically generalisable test.
2. Background & Theory
2.1 Social Chaos Theory
Chaos theory describes systems that are highly sensitive to initial conditions. Applied to society, social chaos refers to behavioural divergence—the degree to which individuals differ in their cultural consumption, beliefs, and identity formation.
High social chaos → diverse, unpredictable cultural behaviour
Low social chaos → homogenised, predictable behaviour
Recommendation algorithms, by reinforcing user preferences, may reduce social chaos by nudging individuals toward narrower behavioural patterns.
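The idea of social chaos as behavioural divergence can be made concrete. One hypothetical way to operationalise it is the Shannon entropy of a user's category distribution; the sketch below is purely illustrative (the function name and metric choice are mine, not part of the study):

```python
from collections import Counter
from math import log2

def social_chaos(watch_history):
    """Shannon entropy (in bits) of the category distribution of a
    user's watch history -- one hypothetical measure of 'social chaos'.
    0 bits means every video is the same category (fully predictable);
    higher values mean more diverse, less predictable consumption."""
    counts = Counter(watch_history)
    total = len(watch_history)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A user locked into one category has zero chaos,
# while a user spread evenly across four categories has 2 bits.
low = social_chaos(["fitness"] * 10)
high = social_chaos(["fitness", "gaming", "ai", "music"] * 5)
print(low, high)
```

On a measure like this, an algorithm that narrows recommendations would show up as entropy falling over time.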
2.2 Memes as Cultural Capital (Sociological Context)
Memes function as units of cultural transmission. In digital environments, they also act as cultural capital—markers of identity, status, and group membership. Sociological research shows that:
- knowing certain memes signals in‑group belonging
- meme literacy confers social credibility
This stratification becomes important when analysing how algorithms shape cultural roles.
3. Methods
3.1 Research Design
This study uses a mixed‑method approach:
- theoretical synthesis of existing literature
- pilot empirical experiment on YouTube’s recommendation system
- comparative analysis with academic findings from Brookings, SSRN, Oxford Academic, and EU research
3.2 YouTube Recommendation Experiment (Pilot Study)
The purpose of this experiment is to determine the effect recommendation algorithms have on social chaos.
YouTube’s recommendation algorithm was chosen arbitrarily as the test case.
In this context, social chaos is represented by the number of videos of a specific type a user interacts with; this acts as a proxy for the variance in what a user watches.
This allows us to create the method:
1. Search for a category of video with a fresh account.
2. Watch a video from that category from start to finish.
3. Go to the YouTube homepage.
4. Record the number of videos belonging to the same category out of the first n₁ videos.
5. Reload the page n₂ times and average the counts to ensure the result from step 4 is accurate.
6. Repeat steps 2–5 using a video from the homepage results until the number recorded in step 4 converges on a specific value.
7. Delete the YouTube history to reset the account to an effectively fresh state, then repeat steps 1–6 using an increasing number of categories, from 1 up to n₁ in steady increments.
In a fully rigorous design, the values of n₁ (the number of videos inspected per iteration) and n₂ (the number of page reloads) would be chosen using statistical power analysis to ensure representativeness. However, the purpose of this pilot study is not to estimate population parameters but to identify qualitative trends in algorithmic behaviour.
Because YouTube continuously loads new recommendations as the user scrolls, increasing n₁ effectively simulates the effect of multiple reloads. Thus n₁ and n₂ collapse into a single parameter: the total number of recommendations observed.
By incrementally increasing n₁, the experiment leverages the law of large numbers: as the number of observed recommendations grows, the running mean converges toward the algorithm’s stable behaviour. This allows the study to focus on emergent patterns rather than precise statistical estimates.
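The role of the law of large numbers here can be illustrated with a toy simulation. The match probability p below is an assumed stand-in for the algorithm's stable same-category recommendation rate, not a value taken from the pilot data:

```python
import random

def running_mean_convergence(p=0.3, n_max=500, seed=42):
    """Simulate inspecting successive recommendations, each of which
    matches the watched category with probability p, and track the
    running mean proportion of matches. As the number of observed
    recommendations grows, the running mean settles near p -- which is
    why increasing n1 alone suffices for a qualitative pilot."""
    rng = random.Random(seed)
    matches = 0
    means = []
    for n in range(1, n_max + 1):
        matches += rng.random() < p   # True/False counts as 1/0
        means.append(matches / n)
    return means

means = running_mean_convergence()
print(f"after 10 recommendations:  {means[9]:.2f}")
print(f"after 500 recommendations: {means[-1]:.2f}")  # near p = 0.3
```

The early values swing widely (exactly like the 100% readings at small n₁ in the appendix tables), while the tail stabilises; only the tail behaviour is treated as meaningful.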
3.3 Algorithmic analysis
This section analyses why the actual structure of a recommendation algorithm produces the results described by the YouTube recommendation experiment.
3.4 Social analysis
This section analyses what the results of the YouTube recommendation experiment actually mean in a social context.
4. Findings
4.2 YouTube Recommendation Algorithm (Pilot Study)
In this experiment three graphs were obtained (Appendix 7.2). The graphs show the following trends:
- When only one category was interacted with (low initial social chaos), the recommendations clustered at roughly 30% of videos being of the same category.
- When two categories were interacted with, each video watched increased the same-category percentage up to ~40%, after which it returned to a baseline of ~30%.
- When three categories were interacted with, the results after the first two videos watched clustered at ~20%, and the results after the next two videos clustered at ~40%.
- For all data sets, the same-category percentage increased over the first three videos watched, then on the fourth video dropped to a value between the results for videos 2 and 3. As the number of categories interacted with increases, the mean proportion of same-category videos after video 3 (the yellow curve in all three graphs) is higher. The behaviour after video 4 suggests a negative feedback loop in which the mean proportion of same-category videos stabilises at a value proportional to initial social chaos.
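The rise-then-stabilise dynamic described in the last trend can be sketched as a toy negative-feedback model. The boost, pull, and boost_steps parameters below are illustrative guesses, not values fitted to the pilot data:

```python
def feedback_model(baseline, boost=0.15, pull=0.4, boost_steps=3, steps=10):
    """Toy model of the observed dynamics: each of the first few watches
    boosts the proportion of same-category recommendations, while a
    corrective (negative-feedback) term pulls the proportion back
    toward a baseline set by the user's initial social chaos."""
    x = baseline
    trajectory = [round(x, 3)]
    for t in range(steps):
        if t < boost_steps:
            x += boost                  # early watches push more of the same
        x -= pull * (x - baseline)      # re-ranking damps it back down
        trajectory.append(round(x, 3))
    return trajectory

# The proportion overshoots during the first three watches,
# then decays back toward the chaos-dependent baseline.
print(feedback_model(baseline=0.30))
```

Different baselines (i.e. different initial levels of social chaos) produce the same qualitative shape but different stabilisation values, matching the pattern seen across the three graphs.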
This creates a three-tier stratification system:
- Users with high levels of social chaos will have a high proportion of the same kind of content recommended to them → this reduces their social chaos, because they end up interacting with only the same types of content → this pulls them into a lower stratum.
- Users with medium levels of social chaos will have a medium proportion of the same kind of content recommended to them → this slightly reduces their social chaos, pulling them into the bottom stratum over a much longer period.
- Users with low levels of social chaos will have a low proportion of the same kind of content recommended to them → something I noticed at this level while running the experiment is that, although the proportion of the exact same category was slightly lower, the recommended content as a whole was incredibly similar, so this does not mean the user's social chaos was increased.
This means that, overall, there is a gradient towards the lowest stratum of this structure: the higher up a user is, the stronger that gradient is.
4.3 Algorithmic analysis
Although the pilot experiment is limited in scope, its results can be compared against what is publicly known about how recommender systems are built. The purpose of this section is not to generalise statistically, but to explain why the structure of a typical recommendation pipeline would produce the patterns observed in the pilot study.
The source for this section is “How do recommender systems work on digital platforms?” (Meserole, 2022).
From the article, we learn that recommendation algorithms have five steps.
- The first step of a recommender system is building an inventory - that is “all content and user activity available to be shown to a user”.
- The second step is “Integrity processes”: Here items which violate a platform’s content policies are removed from the inventory and the inventory is scanned for “borderline” content (“items that can be published but not shared (or at least not shared widely). Typically, this includes text, video, or audio that is known not to violate the platform’s term of service but that the platform has reason to believe may be problematic or offensive.”).
- The next step is the candidate generation step: recommender systems narrow the content inventory to a manageable subset of items for ranking. To do this efficiently, platforms typically use an “approximate nearest neighbor” (ANN) search, which “typically grabs dozens or hundreds of items that are likely in the ballpark of a users’ revealed preferences and interests”.
- Penultimately is the ranking step: the remaining subset of the inventory is ranked by predicted user engagement, typically using a deep learning algorithm trained on the user’s preferences.
- Finally is the re-ranking step: ranking algorithms often over-represent certain content types or authors, creating redundancy. To address this, a re-ranking step applies hand-coded rules to ensure diversity in the final selection, balancing content types and authors for a better user experience.
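The five steps above can be sketched in miniature. Every data structure and scoring rule here is a simplified stand-in (real systems use ANN indexes and learned ranking models, per the article), chosen only to make the pipeline's shape concrete:

```python
def recommend(inventory, user_profile, banned_ids, k=10, max_per_category=3):
    """Minimal sketch of the five-stage pipeline described by
    Meserole (2022). 'inventory' is a list of {'id', 'category'} dicts;
    'user_profile' maps categories to affinity scores in [0, 1]."""
    # Step 1 (inventory) is the 'inventory' argument itself.
    # Step 2 -- integrity: drop items that violate platform policy.
    eligible = [v for v in inventory if v["id"] not in banned_ids]
    # Step 3 -- candidate generation: keep items near the user's revealed
    # interests (a crude category filter in place of an ANN search).
    candidates = [v for v in eligible if v["category"] in user_profile]
    # Step 4 -- ranking: score by predicted engagement (category affinity
    # stands in for a deep model's output).
    candidates.sort(key=lambda v: user_profile[v["category"]], reverse=True)
    # Step 5 -- re-ranking: a hand-coded diversity cap on repeats, the
    # kind of rule that could produce a same-category ceiling.
    picked, per_cat = [], {}
    for v in candidates:
        if per_cat.get(v["category"], 0) < max_per_category:
            picked.append(v)
            per_cat[v["category"]] = per_cat.get(v["category"], 0) + 1
        if len(picked) == k:
            break
    return picked
```

Even when the ranker would happily fill the whole page with one category, the re-ranking cap limits how much of it survives into the final selection.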
We can infer that the upper limit observed for content of the same category (~40%) was imposed by the re-ranking step. Additionally, for the first three new videos watched, the deep learning algorithm keeps recommending more of the same type of content, but on the fourth this starts to decrease consistently. This suggests either some kind of negative feedback on the ranking step, or that the re-ranking step intensifies.
Something else that is important to consider when analysing the algorithm is a famous edge case: echo chambers. My data shows a drift that decreases social chaos in proportion to the social chaos a user has, suggesting that all users sit in a mild echo chamber. This aligns with the source “Echo chambers, rabbit holes, and ideological bias: How YouTube recommends content to real users”, in which the researchers investigate echo-chamber effects arising purely from the YouTube recommendation algorithm. They found that the algorithm itself does push users into a “mild” echo chamber, but they suggest it is user choice, not the algorithm itself, that causes users to fall down “rabbit holes” (a deepening echo chamber). Here it is significant to reference psychological effects such as confirmation bias (the tendency to remember facts that agree with what one already believes), which, working in tandem with “mild” echo chambers, can lead to more substantial polarisation and, typically, the political ideological bias that emerges from echo chambers. This suggests that as a user experiences a sustained decrease in social chaos within the recommendation algorithm, their social chaos is also reduced psychologically, beyond their interaction with the algorithm itself.
4.4 Social analysis
Although the pilot experiment is limited in scope, its results can still be interpreted through established sociological frameworks. The purpose of this section is not to generalise statistically, but to explore how the observed algorithmic patterns align with existing theories of cultural stratification, identity performance, and memetic diffusion.
This results in the following framework:
- Upper Strata – The Domain of Cultural Architects:
Individuals positioned above the top critical point exhibit extremely high levels of social chaos, where their exposure to diverse and rapidly changing digital content induces a state of cognitive overload. Drawing on Goffman’s (1959) concept of dramaturgy, these users perform multiple, often conflicting social roles simultaneously. This divergent role behavior results in high “role entropy” (as discussed by Prigogine, 1984), where predictable patterns break down and novel cultural expressions emerge. In this stratum, the individuals—I will term “cultural architects”—synthesize new cultural forms by navigating and even exploiting the boundaries of algorithmically driven content recommendations. Their capacity to combine diverse inputs fosters the creation of subcultures that remain accessible only to those operating at or above this critical threshold.
- Middle Strata – The Mainstream Propagators:
The majority of users occupy the middle strata, falling between the top and bottom critical points. Here, the influence of cultural architects is evident: individuals in this layer are able to access, maintain, and propagate the new cultural ideas generated in the upper strata. Their behavior is characterized by a moderate level of role entropy—less volatile than that of the cultural architects, yet sufficiently dynamic to facilitate a cascading diffusion of novel cultural forms. As these users share and reinforce emerging memes, they effectively transform radical ideas into mainstream trends, thereby shaping what is considered culturally acceptable.
- Lower Strata – The Anchors of Cultural Stability:
Individuals below the bottom critical point are primarily passive absorbers of culture. Their behavior is marked by a convergence around a stable, predictable set of social roles—NPCs as mentioned in our introduction. In our analysis, these users resist rapid cultural change, functioning as anchors that moderate the influx of new, rapidly evolving cultural content. Their role is crucial in filtering which emergent cultural movements persist over time, thereby ensuring long-term stability within the broader social system.
In terms of memes (units of culture):
- Cultural Architects: These are the innovators and creators—the individuals who generate original meme content. Their creative output establishes cultural reference points and narratives.
- Active Propagators: These individuals engage with the memes, sharing, remixing, and adapting them. They act as intermediaries, spreading the ideas beyond the original niche.
- Passive Absorbers: The broader public, which internalizes these memes without necessarily contributing to their evolution.
From a Marxist perspective, the ability to interact with memes can be seen as a kind of cultural capital. This suggests that the three-tier system also acts as a class system, much like the traditional conception of upper, middle, and lower classes, except that instead of wealth these classes stratify influence.
Additionally, from a postmodern perspective, this drift leads to the creation of increasingly fragmented culture; modern-day “brainrot” is the perfect example. Symbols become increasingly self-referential and we lose our perspective on meaning.
5. Conclusion
Recommendation algorithms lead to the emergence of a new kind of class system with three tiers:
- Upper Strata – Cultural Architects: individuals with extremely high social chaos who synthesize new cultural forms by navigating, and even exploiting, the boundaries of algorithmic recommendation. In meme terms, they are the innovators who generate original content and establish cultural reference points and narratives.
- Middle Strata – Mainstream Propagators: the majority of users, who access, maintain, and propagate the ideas generated in the upper stratum, transforming radical ideas into mainstream trends. In meme terms, they share, remix, and adapt content, spreading it beyond its original niche.
- Lower Strata – Anchors of Cultural Stability: primarily passive absorbers of culture who converge on a stable, predictable set of social roles (the “NPCs” of the introduction), resisting rapid cultural change and filtering which emergent movements persist. In meme terms, they internalize content without contributing to its evolution.
6. References
Gordon, T. (1992). Chaos in social systems. [Online]. Elsevier BV. [Accessed: 22nd October 2024].
This academic source is reliable due to its publication by Elsevier and the reputation of Theodore Gordon, a recognized researcher. Although published in 1992, it is still cited in current research and retains relevance, making it a useful theoretical foundation. The information was accessed through Academia.edu, a peer-sharing platform that confirms author identities and reputations. The source is used to explain how chaotic patterns in systems—such as politics or media—can result in unpredictable yet meaningful social shifts, contributing to the theoretical basis of the IP frame.
Meserole, C. (2022). How do recommender systems work on digital platforms? [Online]. Brookings. Available at: https://www.brookings.edu/articles/how-do-recommender-systems-work-on-digital-platforms-social-media-recommendation-algorithms/ [Accessed: 15 February 2025].
This article from Brookings is highly reliable, written by an expert in digital platforms and hosted by a prestigious think tank recognized globally, including by the University of Pennsylvania. The article is recent (2022) and was accessed in early 2025, keeping it within an appropriate time frame. It is useful for understanding the mechanisms of recommender systems and how they influence what users see on social media. The information helped in explaining the role of algorithms in shaping political discourse and creating ideological echo chambers.
Brown, M.A. et al. (2022). Echo chambers, rabbit holes, and ideological bias: How YouTube recommends content to real users. [Online]. Brookings. Available at: https://www.brookings.edu/articles/echo-chambers-rabbit-holes-and-ideological-bias-how-youtube-recommends-content-to-real-users/ [Accessed: 15 February 2025].
Also published by Brookings, this article is reliable for the same reasons as above: it is written by domain experts and hosted by a trusted institution. It is relatively recent (October 2022) and provides insight into how YouTube’s recommendation system nudges users into ideologically consistent content streams. This is useful in supporting claims that online environments contribute to political polarization, and the specific finding cited is that users received increasingly skewed content based on prior engagement.
Unknown. (n.d.). Internet meme. [Online]. Wikipedia. Available at: https://en.wikipedia.org/wiki/Internet_meme [Accessed: 16th February 2025].
This source is moderately reliable. While Wikipedia is publicly editable, it enforces strict citation standards and moderation, especially on well-trafficked pages. It is useful for providing a clear and concise definition of what an internet meme is. Although no publication date is listed, it was accessed in February 2025, ensuring the information was up-to-date at the time of use. This source was used to support definitions and background context around digital culture and how memes function as units of cultural transmission.
Dawkins, R. (1976). The Selfish Gene. United Kingdom: Oxford University Press.
This is a highly reliable source authored by Richard Dawkins, a prominent evolutionary biologist, and published by Oxford University Press. Though not recent, it remains foundational and widely cited in academic discussions, making it both relevant and useful for theoretical frameworks. The concept of the “meme” introduced in this book was used to establish the intellectual foundation for understanding cultural replication and evolution—central ideas in the analysis of digital memetics.
Unknown. (n.d.). Cultural capital. [Online]. Wikipedia. Available at: https://en.wikipedia.org/wiki/Cultural_capital [Accessed: 16th February 2025].
Similar to the Internet Meme entry, this Wikipedia page is moderately reliable due to its reliance on cited academic sources and community moderation. It is useful for understanding sociological concepts such as cultural capital, particularly how knowledge of memes and digital behavior can confer status online. Though no publication date is provided, it was accessed in February 2025, confirming it was current at the time of citation. The concept was used to frame the idea that meme literacy can influence social credibility in digital environments.
Goffman, E. (1959). The Presentation of Self in Everyday Life. New York: Doubleday.
This is a seminal sociological text, authored by Erving Goffman and published by a reputable academic press (Doubleday). Despite its age, it is still widely cited and relevant in discussions about identity performance. It is reliable and theoretically useful in examining how individuals manage impressions online similarly to face-to-face interactions. It was used to support arguments around self-presentation and identity curation on digital platforms.
Rogers, E.M. (2003). Diffusion of Innovations. 5th edn. New York: Free Press.
This reliable source is authored by Everett Rogers, a respected communication theorist, and published by Free Press. Though published in 2003, the fifth edition includes updated examples and is still relevant in discussions about how new ideas, technologies, and trends (such as memes or political ideologies) spread. The book was used to explore how digital memes propagate and become socially significant within online communities.
Unver, H.A. (2024). Using AI as a weapon of repression and its impact on human rights. [Online]. European Parliament. Available at: https://www.europarl.europa.eu/RegData/etudes/IDAN/2024/754450/EXPO_IDA(2024)754450_EN.pdf [Accessed: 16th February 2025].
This is a very reliable and recent source, published by the European Union. As an institutional and peer-reviewed report, it has strong credibility. Accessed in February 2025 and published in May 2024, it is up-to-date. The report provides detailed analysis on how AI algorithms influence human rights and push users into ideological bubbles. This information was used to validate the claim that recommender systems may reinforce political bias in subtle but measurable ways.
Brown, M.A. et al. (2022). Echo Chambers, Rabbit Holes, and Algorithmic Bias: How YouTube Recommends Content to Real Users. [Online]. SSRN. Available at: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4114905 [Accessed: 16 February 2025].
This paper, hosted on SSRN, is reliable as it is authored by political science researchers at NYU and hosted on a respected academic repository. Published in May 2022 and accessed in February 2025, it is recent and still relevant. The article offers evidence that YouTube’s recommendation system does push users toward ideological echo chambers. This study was cited to provide academic grounding to the claim that social media algorithms have measurable political impact.
Ibrahim, H. et al. (2023). YouTube’s recommendation algorithm is left-leaning in the United States. [Online]. PNAS Nexus, Oxford Academic. Available at: https://academic.oup.com/pnasnexus/article/2/8/pgad264/7242446 [Accessed: 16 February 2025].
This study, published in PNAS Nexus by Oxford Academic, is highly reliable, peer-reviewed, and current (August 2023). It was accessed in February 2025. The study contributes valuable insights into the political leanings of recommendation systems in the U.S. context. It was used to contrast and complement earlier sources by showing that algorithms may not always lead to right-wing bias but can still create echo chambers through personalization.
Unknown. (2024). Memes: An Overview. [Online]. Easy Sociology. Available at: https://easysociology.com/sociology-of-media/memes-an-ov [Accessed: 16th February 2025].
This article is reasonably reliable, coming from a trusted educational site known for accessible academic summaries. Published in December 2024 and accessed in February 2025, it is both recent and relevant. It was used to highlight the idea that memes serve as digital cultural capital, conveying status and social value within online communities.
Nissenbaum, A. and Shifman, L. (2015). Internet memes as contested cultural capital: The case of 4chan’s /b/ board. [Online]. The Hebrew University of Jerusalem. Available at: https://www.folklore.ee/rl/fo/konve/ishs2018/wp-content/uploads/2017/10/Kuipers-seminar4_NissenbaumShifman-2015.pdf [Accessed: 16 February 2025].
Authored by researchers at The Hebrew University of Jerusalem, this academic paper is reliable and peer-reviewed. Though published in 2015, it remains useful for its sociological analysis of online meme culture. It was used to support the idea that memes serve not only as humor but also as symbols of group membership and in-group/out-group distinctions.
Mitman, T. and Denham, J. (2024). Into the meme stream: The value and spectacle of Internet memes. [Online]. SAGE Journals. Available at: https://journals.sagepub.com/doi/10.1177/14614448241227843 [Accessed: 16 February 2025].
This source is particularly academically rigorous as it comes from an established source of academic journals. It was accessed in February 2025, meaning it was still up-to-date. The quote used from the article assesses the use of memes in social hierarchy making it relevant and useful to the study.
7. Appendix
7.2 YouTube recommendation algorithm pilot study
Categories: Fitness
| n_1 | Interactions - average # of the same category recorded | Interactions - average proportion of videos of the same category (%) | ||||||
|---|---|---|---|---|---|---|---|---|
| 1 | 2 | 3 | 4 | 1 | 2 | 3 | 4 | |
| 1 | 1 | 1 | 1 | 1 | 100 | 100 | 100 | 100 |
| 2 | 2 | 2 | 2 | 2 | 100 | 100 | 100 | 100 |
| 3 | 3 | 3 | 3 | 2 | 100 | 100 | 100 | 100 |
| 4 | 3 | 3 | 4 | 3 | 100 | 100 | 100 | 100 |
| 5 | 3 | 4 | 4 | 4 | 100 | 100 | 100 | 100 |
| 6 | 3 | 5 | 4 | 4 | 93.45238095 | 100 | 100 | 100 |
| 7 | 3 | 6 | 5 | 4 | 80.10204082 | 100 | 100 | 100 |
| 8 | 3 | 6 | 5 | 5 | 70.08928571 | 100 | 100 | 100 |
| 9 | 3 | 6 | 5 | 6 | 62.3015873 | 90.47619048 | 100 | 94.44444444 |
| 10 | 3 | 6 | 6 | 7 | 56.07142857 | 81.42857143 | 90.35714286 | 85 |
| 11 | 3 | 6 | 7 | 8 | 50.97402597 | 74.02597403 | 82.14285714 | 77.27272727 |
| 12 | 4 | 6 | 7 | 9 | 46.72619048 | 67.85714286 | 75.29761905 | 70.83333333 |
| 13 | 4 | 7 | 8 | 9 | 43.13186813 | 62.63736264 | 69.50549451 | 65.38461538 |
| 14 | 4 | 7 | 9 | 9 | 40.05102041 | 58.16326531 | 64.54081633 | 60.71428571 |
| 15 | 5 | 7 | 9 | 9 | 37.38095238 | 54.28571429 | 60.23809524 | 56.66666667 |
| 16 | 6 | 8 | 9 | 9 | 35.04464286 | 50.89285714 | 56.47321429 | 53.125 |
| 17 | 6 | 9 | 10 | 9 | 32.98319328 | 47.89915966 | 53.1512605 | 50 |
| 18 | 7 | 9 | 11 | 10 | 31.15079365 | 45.23809524 | 50.1984127 | 47.22222222 |
| 19 | 7 | 10 | 11 | 11 | 29.5112782 | 42.85714286 | 47.55639098 | 44.73684211 |
| 20 | 8 | 10 | 12 | 11 | 28.03571429 | 40.71428571 | 45.17857143 | 42.5 |
| 21 | 9 | 11 | 13 | 12 | 26.70068027 | 38.7755102 | 43.02721088 | 40.47619048 |
| 22 | 9 | 12 | 14 | 13 | 25.48701299 | 37.01298701 | 41.07142857 | 38.63636364 |
| 23 | 9 | 12 | 15 | 13 | 24.37888199 | 35.40372671 | 39.28571429 | 36.95652174 |
| 24 | 9 | 13 | 15 | 13 | 23.36309524 | 33.92857143 | 37.64880952 | 35.41666667 |
| 25 | 10 | 14 | 16 | 13 | 22.42857143 | 32.57142857 | 36.14285714 | 34 |
| 26 | 10 | 15 | 16 | 13 | 21.56593407 | 31.31868132 | 34.75274725 | 32.69230769 |
| 27 | 10 | 15 | 16 | 14 | 20.76719577 | 30.15873016 | 33.46560847 | 31.48148148 |
| 28 | 10 | 15 | 16 | 15 | 20.0255102 | 29.08163265 | 32.27040816 | 30.35714286 |
Graph 7.2.1
Categories: Fitness, Minecraft
| n_1 | Interactions - average # of the same category recorded | Interactions - average proportion of videos of the same category (%) | ||||||
|---|---|---|---|---|---|---|---|---|
| 1 | 2 | 3 | 4 | 1 | 2 | 3 | 4 | |
| 1 | 1 | 1 | 1 | 1 | 100 | 100 | 100 | 100 |
| 2 | 2 | 2 | 2 | 2 | 100 | 100 | 100 | 100 |
| 3 | 3 | 2 | 3 | 3 | 100 | 100 | 100 | 100 |
| 4 | 3 | 3 | 4 | 3 | 100 | 100 | 100 | 100 |
| 5 | 3 | 3 | 5 | 3 | 100 | 100 | 100 | 100 |
| 6 | 4 | 3 | 6 | 4 | 100 | 100 | 100 | 100 |
| 7 | 4 | 4 | 6 | 5 | 100 | 100 | 100 | 100 |
| 8 | 5 | 5 | 7 | 5 | 88.83928571 | 100 | 100 | 100 |
| 9 | 5 | 6 | 8 | 6 | 78.96825397 | 93.65079365 | 100 | 90.47619048 |
| 10 | 5 | 7 | 9 | 6 | 71.07142857 | 84.28571429 | 100 | 81.42857143 |
| 11 | 6 | 7 | 10 | 7 | 64.61038961 | 76.62337662 | 95.77922078 | 74.02597403 |
| 12 | 6 | 8 | 11 | 7 | 59.22619048 | 70.23809524 | 87.79761905 | 67.85714286 |
| 13 | 7 | 9 | 11 | 8 | 54.67032967 | 64.83516484 | 81.04395604 | 62.63736264 |
| 14 | 7 | 9 | 11 | 9 | 50.76530612 | 60.20408163 | 75.25510204 | 58.16326531 |
| 15 | 8 | 9 | 12 | 9 | 47.38095238 | 56.19047619 | 70.23809524 | 54.28571429 |
| 16 | 8 | 9 | 13 | 9 | 44.41964286 | 52.67857143 | 65.84821429 | 50.89285714 |
| 17 | 8 | 10 | 13 | 10 | 41.80672269 | 49.57983193 | 61.97478992 | 47.89915966 |
| 18 | 9 | 11 | 13 | 11 | 39.48412698 | 46.82539683 | 58.53174603 | 45.23809524 |
| 19 | 9 | 12 | 13 | 11 | 37.40601504 | 44.36090226 | 55.45112782 | 42.85714286 |
| 20 | 10 | 12 | 14 | 11 | 35.53571429 | 42.14285714 | 52.67857143 | 40.71428571 |
| 21 | 10 | 12 | 15 | 11 | 33.84353741 | 40.13605442 | 50.17006803 | 38.7755102 |
| 22 | 10 | 12 | 15 | 12 | 32.30519481 | 38.31168831 | 47.88961039 | 37.01298701 |
| 23 | 10 | 13 | 15 | 12 | 30.90062112 | 36.64596273 | 45.80745342 | 35.40372671 |
| 24 | 11 | 13 | 15 | 12 | 29.61309524 | 35.11904762 | 43.89880952 | 33.92857143 |
| 25 | 11 | 13 | 15 | 12 | 28.42857143 | 33.71428571 | 42.14285714 | 32.57142857 |
| 26 | 11 | 13 | 16 | 13 | 27.33516484 | 32.41758242 | 40.52197802 | 31.31868132 |
| 27 | 11 | 14 | 16 | 13 | 26.32275132 | 31.21693122 | 39.02116402 | 30.15873016 |
| 28 | 12 | 14 | 16 | 13 | 25.38265306 | 30.10204082 | 37.62755102 | 29.08163265 |
Graph 7.2.2
Categories: Fitness, Minecraft, AI
| n | Avg # same-category recorded (video 1) | (video 2) | (video 3) | (video 4) | Avg % same-category (video 1) | (video 2) | (video 3) | (video 4) |
|---|---|---|---|---|---|---|---|---|
| 1 | 1 | 1 | 1 | 1 | 100 | 100 | 100 | 100 |
| 2 | 2 | 2 | 2 | 2 | 100 | 100 | 100 | 100 |
| 3 | 3 | 2 | 3 | 3 | 100 | 100 | 100 | 100 |
| 4 | 3 | 2 | 4 | 4 | 100 | 100 | 100 | 100 |
| 5 | 3 | 2 | 5 | 5 | 95.55555556 | 100 | 100 | 100 |
| 6 | 4 | 2 | 6 | 5 | 79.62962963 | 100 | 100 | 100 |
| 7 | 4 | 2 | 6 | 6 | 68.25396825 | 91.00529101 | 100 | 100 |
| 8 | 4 | 3 | 7 | 7 | 59.72222222 | 79.62962963 | 100 | 100 |
| 9 | 4 | 3 | 8 | 8 | 53.08641975 | 70.781893 | 100 | 100 |
| 10 | 4 | 4 | 9 | 9 | 47.77777778 | 63.7037037 | 100 | 100 |
| 11 | 4 | 5 | 10 | 9 | 43.43434343 | 57.91245791 | 98.98989899 | 92.25589226 |
| 12 | 4 | 6 | 11 | 9 | 39.81481481 | 53.08641975 | 90.74074074 | 84.56790123 |
| 13 | 4 | 7 | 11 | 10 | 36.75213675 | 49.002849 | 83.76068376 | 78.06267806 |
| 14 | 4 | 7 | 11 | 10 | 34.12698413 | 45.5026455 | 77.77777778 | 72.48677249 |
| 15 | 5 | 7 | 12 | 10 | 31.85185185 | 42.4691358 | 72.59259259 | 67.65432099 |
| 16 | 5 | 7 | 13 | 11 | 29.86111111 | 39.81481481 | 68.05555556 | 63.42592593 |
| 17 | 5 | 7 | 13 | 12 | 28.10457516 | 37.47276688 | 64.05228758 | 59.69498911 |
| 18 | 5 | 7 | 13 | 12 | 26.54320988 | 35.3909465 | 60.49382716 | 56.37860082 |
| 19 | 5 | 7 | 13 | 12 | 25.14619883 | 33.52826511 | 57.30994152 | 53.41130604 |
| 20 | 5 | 7 | 14 | 13 | 23.88888889 | 31.85185185 | 54.44444444 | 50.74074074 |
| 21 | 5 | 8 | 15 | 14 | 22.75132275 | 30.335097 | 51.85185185 | 48.32451499 |
| 22 | 5 | 9 | 15 | 14 | 21.71717172 | 28.95622896 | 49.49494949 | 46.12794613 |
| 23 | 5 | 10 | 15 | 14 | 20.77294686 | 27.69726248 | 47.34299517 | 44.12238325 |
| 24 | 6 | 10 | 15 | 15 | 19.90740741 | 26.54320988 | 45.37037037 | 42.28395062 |
| 25 | 7 | 10 | 15 | 15 | 19.11111111 | 25.48148148 | 43.55555556 | 40.59259259 |
| 26 | 7 | 11 | 16 | 15 | 18.37606838 | 24.5014245 | 41.88034188 | 39.03133903 |
| 27 | 8 | 12 | 16 | 15 | 17.69547325 | 23.59396433 | 40.32921811 | 37.58573388 |
| 28 | 9 | 13 | 16 | 15 | 17.06349206 | | | |
7.2.3
The graphs show the following trends:
- When only one category was interacted with (low initial social chaos), the proportion of recommended videos from the same category clustered at roughly 30%.
- When two categories were interacted with, each video watched increased the same-category proportion up to ~40%, after which it returned to a baseline of ~30%.
- When three categories were interacted with, the results after the first two videos watched clustered at ~20%, while the results after the next two clustered at ~40%.
- For all data sets, the same-category proportion increased over the first three videos watched and then, on the fourth video, dropped to a value between those for videos 2 and 3. As the number of categories interacted with increases, the mean proportion of same-category videos after video 3 (the yellow curve in all three graphs) is higher. The behaviour after video 4 suggests a negative feedback loop in which the mean same-category proportion stabilises at a value proportional to the initial social chaos.
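The shape of these curves can be illustrated with a toy model — a minimal sketch, not the platform's actual mechanism. It assumes a short reinforcement phase over the first three videos, followed by a feedback term pulling the proportion towards an equilibrium that grows with the number of initial categories; the update rule and all constants are assumptions chosen to mimic the trends above:

```python
# Toy model of the observed trend: an illustrative sketch, NOT the real
# recommendation algorithm. Assumptions: the same-category proportion p is
# reinforced for the first three videos, then a negative-feedback term pulls
# it towards an equilibrium p_star that grows with the number of categories
# initially interacted with (i.e. with initial social chaos).

def simulate_proportion(n_categories: int, n_videos: int = 28,
                        boost: float = 0.15, damping: float = 0.5) -> list[float]:
    """Return the same-category proportion after each of n_videos videos."""
    p_star = 0.25 + 0.05 * n_categories  # assumed equilibria: ~0.30, 0.35, 0.40
    p = 1.0                              # the first recommendations match the watched category
    history = []
    for video in range(1, n_videos + 1):
        if video <= 3:
            p = min(1.0, p + boost)      # early reinforcement phase
        else:
            p += damping * (p_star - p)  # negative feedback towards p_star
        history.append(p)
    return history

for k in (1, 2, 3):
    print(f"{k} categories -> stabilises near {simulate_proportion(k)[-1]:.2f}")
```

Under these assumed constants the one-category run settles near 30% and the three-category run near 40%, mirroring the clustering described above.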
These dynamics create a three-tier stratification system:
- Users with high social chaos have a high proportion of the same kind of content recommended to them → this reduces their social chaos, because they interact mostly with the same types of content → this pulls them into a lower stratum.
- Users with medium social chaos have a medium proportion of the same kind of content recommended to them → this reduces their social chaos more slowly, pulling them into the bottom stratum over a much longer time period.
- Users with low social chaos have a low proportion of the same kind of content recommended to them → one observation from the experiment is that, although the share of same-category recommendations at this level is slightly lower, the recommended content as a whole was remarkably similar, so the user's social chaos is not actually increased.
Overall, then, there is a gradient towards the lowest stratum of this structure, and the higher up a user sits, the stronger the gradient is.
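The gradient claim can be sketched numerically. In this hypothetical model (the decay rule and rate are assumptions, not measured values), each user's social chaos decays at a rate proportional to its current level, so users higher up the structure feel a stronger absolute pull downward:

```python
# Illustrative sketch of the "gradient towards the lowest stratum":
# social chaos c decays by a fixed fraction of itself per step, so the
# absolute pull is strongest for the most chaotic users. The rate is an
# arbitrary assumption for demonstration.

def drift(chaos: float, steps: int, rate: float = 0.1) -> float:
    """Apply c -> c - rate * c repeatedly and return the final chaos level."""
    for _ in range(steps):
        chaos -= rate * chaos
    return chaos

strata = {"high chaos": 0.9, "medium chaos": 0.5, "low chaos": 0.1}
for label, c0 in strata.items():
    print(f"{label}: {c0:.2f} -> {drift(c0, steps=20):.3f}")
```

Per step, the high-chaos user loses 0.09 while the low-chaos user loses only 0.01, matching the claim that the gradient strengthens the higher up you are.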
7.3 Algorithmic analysis
The source for this section is: How do recommender systems work on digital platforms?
From the article, we learn that recommendation algorithms have five steps.
- The first step of a recommender system is building an inventory - that is “all content and user activity available to be shown to a user”.
- The second step is “Integrity processes”: Here items which violate a platform’s content policies are removed from the inventory and the inventory is scanned for “borderline” content (“items that can be published but not shared (or at least not shared widely). Typically, this includes text, video, or audio that is known not to violate the platform’s term of service but that the platform has reason to believe may be problematic or offensive.”).
- The next step is the candidate generation step: recommender systems narrow the content inventory to a manageable subset of items for ranking. To do this efficiently, platforms typically use an “approximate nearest neighbor” (ANN) search, which “typically grabs dozens or hundreds of items that are likely in the ballpark of a users’ revealed preferences and interests”.
- The penultimate step is ranking: the remaining subset of the inventory is ranked by predicted user engagement, typically using a deep learning model trained on the user's preferences.
- Finally comes the re-ranking step: ranking algorithms often over-represent certain content types or authors, creating redundancy. To address this, a re-ranking step applies hand-coded rules to ensure diversity in the final selection, balancing content types and authors for a better user experience.
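The five steps above can be sketched as a minimal pipeline. This is a conceptual illustration only — the function names, toy similarity measure, engagement proxy, and diversity cap are all assumptions, not the API of any real platform:

```python
# Hypothetical sketch of the five-step recommender pipeline described above.

def similarity(a, b):
    """Toy dot-product similarity between an item vector and a user vector."""
    return sum(x * y for x, y in zip(a, b))

def predicted_engagement(item, user_vector):
    """Toy stand-in for a learned engagement model."""
    return similarity(item["vector"], user_vector) + 0.1 * item.get("popularity", 0)

def build_inventory(all_items):                       # 1. inventory
    return list(all_items)

def integrity_filter(items):                          # 2. integrity processes
    return [i for i in items if not i.get("violates_policy")]

def candidate_generation(items, user_vector, k=50):   # 3. ANN-style narrowing
    scored = sorted(items, key=lambda i: -similarity(i["vector"], user_vector))
    return scored[:k]

def rank(candidates, user_vector):                    # 4. engagement ranking
    return sorted(candidates, key=lambda i: -predicted_engagement(i, user_vector))

def rerank(ranked, max_per_category=4):               # 5. hand-coded diversity rules
    counts, feed = {}, []
    for item in ranked:
        c = item["category"]
        if counts.get(c, 0) < max_per_category:
            feed.append(item)
            counts[c] = counts.get(c, 0) + 1
    return feed

inventory = build_inventory([
    {"vector": [1, 0], "category": "fitness", "popularity": 5, "violates_policy": False},
    {"vector": [0, 1], "category": "AI", "popularity": 3, "violates_policy": False},
    {"vector": [1, 1], "category": "minecraft", "popularity": 1, "violates_policy": True},
])
feed = rerank(rank(candidate_generation(integrity_filter(inventory), [1, 0]), [1, 0]))
print([i["category"] for i in feed])  # the policy-violating item never reaches the feed
```

Note how the diversity cap in `rerank` is exactly the kind of hand-coded rule that could impose a ceiling on same-category content in the final feed.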
This suggests that the upper limit we observed for same-category content (~40%) is imposed by the re-ranking step. Additionally, for the first three times we watch a new video the deep learning model pushes towards recommending more of the same type of content, but on the fourth this consistently starts to decrease. This points to some kind of negative feedback acting on the deep learning step, or to the re-ranking step intensifying.
Another important consideration when analysing the algorithm is a famous edge case: echo chambers. My data shows a drift that decreases social chaos in proportion to the social chaos a user already has, which suggests that all users sit in a mild echo chamber. This aligns with the source (Echo chambers, rabbit holes, and ideological bias: How YouTube recommends content to real users), in which a team of researchers investigated echo-chamber effects arising purely from the YouTube recommendation algorithm. They found that the algorithm does push users into a "mild" echo chamber, but they suggest it is user choice, not the algorithm itself, that causes users to fall down "rabbit holes" (a progressively deepening echo chamber). Here it is worth referencing psychological effects such as confirmation bias (the tendency to remember facts that agree with what one already believes), which, working in tandem with "mild" echo chambers, can lead to more substantial polarisation and, typically, political ideological bias. This suggests that as a user experiences a sustained decrease in social chaos within the recommendation algorithm, their social chaos is also reduced psychologically, beyond their interaction with the algorithm itself.
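The compounding of a mild algorithmic echo chamber with user-side confirmation bias can be sketched with a hypothetical model: the algorithm nudges same-viewpoint exposure towards a mild plateau, and a separate user-choice term adds further drift. All rates, including the 0.6 plateau, are assumptions for illustration:

```python
# Hypothetical sketch: algorithmic drift alone yields a "mild" echo chamber,
# while adding user choice (confirmation bias) pushes exposure higher,
# i.e. a "rabbit hole". All constants are assumed, not measured.

def run(steps: int, choice: float, p0: float = 0.4,
        algo: float = 0.1, plateau: float = 0.6) -> float:
    """Return the share of same-viewpoint content after `steps` sessions."""
    p = p0
    for _ in range(steps):
        p += algo * (plateau - p)   # algorithm nudges towards a mild plateau
        p += choice * p * (1 - p)   # user-choice drift (confirmation bias)
    return p

print(f"algorithm only:        {run(200, choice=0.0):.3f}")
print(f"algorithm + user bias: {run(200, choice=0.1):.3f}")
```

With these assumed rates the algorithm-only run plateaus at 60%, while the run with user bias settles noticeably higher, echoing the mild-echo-chamber versus rabbit-hole distinction in the cited study.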
7.4 Social analysis
In this section we analyse what it actually means to be in each of the three strata (high, medium, and low social chaos). This results in the following framework.
- Upper Strata – The Domain of Cultural Architects:
Individuals positioned above the top critical point exhibit extremely high levels of social chaos, where exposure to diverse and rapidly changing digital content induces a state of cognitive overload. Drawing on Goffman's (1959) concept of dramaturgy, these users perform multiple, often conflicting social roles simultaneously. This divergent role behaviour results in high "role entropy" (as discussed by Prigogine, 1984), where predictable patterns break down and novel cultural expressions emerge. In this stratum, individuals—whom I will term "cultural architects"—synthesise new cultural forms by navigating, and even exploiting, the boundaries of algorithmically driven content recommendations. Their capacity to combine diverse inputs fosters the creation of subcultures that remain accessible only to those operating at or above this critical threshold.
- Middle Strata – The Mainstream Propagators:
The majority of users occupy the middle strata, falling between the top and bottom critical points. Here, the influence of cultural architects is evident: individuals in this layer are able to access, maintain, and propagate the new cultural ideas generated in the upper strata. Their behaviour is characterised by a moderate level of role entropy—less volatile than that of the cultural architects, yet sufficiently dynamic to facilitate a cascading diffusion of novel cultural forms. As these users share and reinforce emerging memes, they transform radical ideas into mainstream trends, thereby shaping what is considered culturally acceptable.
- Lower Strata – The Anchors of Cultural Stability:
Individuals below the bottom critical point are primarily passive absorbers of culture. Their behaviour is marked by convergence around a stable, predictable set of social roles—the NPCs mentioned in our introduction. In our analysis, these users resist rapid cultural change, functioning as anchors that moderate the influx of new, rapidly evolving cultural content. Their role is crucial in filtering which emergent cultural movements persist over time, thereby ensuring long-term stability within the broader social system.
Originally coined by Richard Dawkins in The Selfish Gene (1976), the term "meme" refers to any unit of cultural transmission that replicates from mind to mind much like a gene does in biology. In the digital age, this idea has been expanded to include internet memes—images, phrases, or videos that circulate widely online, often with variations introduced through remixing or parody. (Source: Internet meme - Wikipedia)
In this context, memes are not just humorous artifacts; they are carriers of cultural meaning. They encapsulate ideas, values, and social attitudes in a compact, replicable form. As such, they can be thought of as the building blocks or “molecules” of our shared digital culture.
We can see our three tiered system then as follows:
- Cultural Architects: These are the innovators and creators—the individuals who generate original meme content. Their creative output establishes cultural reference points and narratives.
- Active Propagators: These individuals engage with the memes, sharing, remixing, and adapting them. They act as intermediaries, spreading the ideas beyond the original niche.
- Passive Absorbers: The broader public, which internalizes these memes without necessarily contributing to their evolution.
This stratification mirrors societal structures in which active agents (often those with resources and skills) not only create but also control the dissemination of cultural content, while larger segments of the population simply absorb and replicate prevailing cultural norms.
Marx’s analysis of class focused on the economic division of labor, where the bourgeoisie (upper class) control the means of production and, by extension, the ideological apparatus. In this framework, the “cultural architects” resemble the bourgeoisie: they generate and set the dominant cultural narratives (memes) that the rest of society absorbs. The working class, or proletariat, on the other hand, are likened to the passive absorbers who consume these narratives with little agency.
Max Weber later expanded on class by introducing the concept of social capital—the resources derived from one's network, status, and cultural competence. In Weber's framework, a person's ability to connect with others and influence cultural discourse (their social capital) is as important as their economic capital. Here, the capacity to understand, create, and propagate memes becomes a form of social capital: memes become a resource—not just memes in the sense of captioned pictures, but memes in the sense of reproducible units of culture.
Cultural capital, as theorized by Pierre Bourdieu (though not mentioned explicitly in the original excerpt, it complements Marx and Weber’s ideas), refers to non-economic assets that enable individuals to succeed in a society—such as language, education, and cultural awareness.(Source: Cultural capital - Wikipedia)
Cultural capital can be thought of as a person’s ability to acquire social capital. In the contemporary era this is a person’s ability to understand memes. We can think of this ability as the degree to which a person is a cultural architect.
Drawing from Everett Rogers’ diffusion of innovations theory, NPCs resemble the “laggards”—the final group to adopt new ideas. While early adopters and innovators may embrace radical or rapidly changing content, NPCs filter these trends through preexisting habits and expectations. As a result, they provide a stabilizing influence that anchors the cultural landscape, curbing the extremes of fast-paced cultural flux. In our framework, this stabilization is essential: the slow, deliberate cultural absorption by NPCs transforms transient, high-entropy innovations into enduring mainstream norms.
Furthermore, the concept of “role entropy” (as discussed in Prigogine’s work on order emerging from chaos) helps explain this phenomenon. In NPCs, low role entropy reflects highly predictable, stable behaviors—these individuals tend to perform a single, well-defined role. In contrast, cultural architects, exposed to overwhelming social chaos, experience role fragmentation and cognitive overload, leading to the emergence of novel, sometimes radical, cultural forms. The NPCs, by resisting these rapid shifts, provide the continuity necessary for any cultural change to be evaluated, refined, and eventually integrated into the collective habitus.
I have previously established that recommendation algorithms slowly decrease social chaos over time, causing a widespread shift of society towards the lower strata (NPCs). From this research we can conclude that people increasingly take on the role of "laggards", which suggests cultural stagnation.
So what does this mean for the future? Through technology, symbols are becoming increasingly self-referential, and culture is created in an erratic, fragmented way. As a result, people struggle to form long-term ideological attachments, preventing the true formation of roles and identities. Roles therefore dissolve and implode, producing new roles that generate yet more new culture and exacerbate social chaos, repeating the cycle. If this trend continues, we can predict that cultures will become ever more fragmented, and that this fragmented culture will be created faster and faster. As more fragmented culture is created, more people who lack social capital will become NPCs at an increasing rate, exacerbating the divide between NPCs and cultural architects.