If you’re someone who creates content or just loves scrolling through social media platforms like YouTube or Instagram, then you’re probably familiar with algorithmic curation. It’s basically the process by which these platforms use algorithms to select and prioritize content for users to see based on their interests and past behavior.
So, for example, if you’re into cooking videos and you’ve been watching a lot of recipe tutorials lately, the algorithm will likely show you more food-related content. Similarly, if you’re someone who loves watching travel vlogs, the algorithm will show you more content from travel creators.
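To make the idea concrete, here is a minimal, hypothetical sketch of interest-based curation in Python: it simply ranks candidate videos by how well their topics overlap with a user's watch history. Real platform rankers are vastly more sophisticated; every name and number below is invented for illustration.

```python
from collections import Counter

def rank_feed(watch_history, candidates):
    """Rank candidate videos by overlap with the user's watch-history topics.

    A toy model of interest-based curation: the more a candidate's topics
    match what the user has already watched, the higher it ranks.
    """
    # Count how often each topic appears in the user's history.
    interest = Counter(topic for video in watch_history for topic in video["topics"])
    # Score each candidate by summing the user's affinity for its topics.
    scored = [
        (sum(interest[t] for t in video["topics"]), video["title"])
        for video in candidates
    ]
    # Highest score first: the feed mirrors past behavior.
    return [title for score, title in sorted(scored, reverse=True)]

history = [
    {"title": "Pasta basics", "topics": ["cooking"]},
    {"title": "Knife skills", "topics": ["cooking"]},
    {"title": "Tokyo on $50", "topics": ["travel"]},
]
candidates = [
    {"title": "Sourdough 101", "topics": ["cooking"]},
    {"title": "Hiking Patagonia", "topics": ["travel"]},
    {"title": "Intro to chess", "topics": ["chess"]},
]
print(rank_feed(candidates=candidates, watch_history=history))
# → ['Sourdough 101', 'Hiking Patagonia', 'Intro to chess']
```

Notice that the chess video, which the user has never engaged with, lands at the bottom – the mechanism behind both the convenience and the narrowing discussed below.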
It’s become a hot topic because these algorithms are getting more sophisticated and are shaping what we see and don’t see on social media platforms. In his 2018 book “Ten Arguments for Deleting Your Social Media Accounts Right Now”, Jaron Lanier raised serious concerns about how big tech deploys algorithms to modify user behavior.
On the one hand, personalized content recommendations give individuals a convenient and efficient way to access information. On the other hand, they can narrow the diversity of perspectives and information people consume.
What are the major concerns?
One concern is that algorithms can reinforce pre-existing biases and create echo chambers.
By only presenting information that aligns with an individual’s past preferences and behaviors, algorithms may limit exposure to alternative viewpoints. This can result in a narrow and polarized understanding of the world, as well as the reinforcement of incorrect or misleading information.
Overall, the impact of algorithmic curation is an important issue that deserves ongoing attention and analysis. It highlights the need for media literacy education and the development of algorithms that prioritize public interest over commercial interests.
Examples of how algorithmic curation impacts you
Personalized content recommendations can create filter bubbles. A filter bubble is a sort of “echo chamber” in which you only encounter content and opinions similar to your own. This has several dangerous offshoots, a few of which I discuss below:
1. Reinforcing political biases
If you frequently engage with content that aligns with your political views, algorithms may prioritize similar content in your recommendations.
On the surface, this may appear harmless, but you can end up trapping yourself in a self-reinforcing cycle – one that limits exposure to alternative viewpoints. This can contribute to political polarization and the reinforcement of false or misleading information.
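The self-reinforcing cycle can be sketched with a toy simulation. The model below is entirely hypothetical: it assumes the user engages with one viewpoint in proportion to its (squared, renormalized) share of the feed, and that the algorithm then sets the next feed to match observed engagement. Even a mild starting lean drifts toward a one-sided feed.

```python
def simulate_feedback_loop(rounds=5, initial_bias=0.6):
    """Toy simulation of a recommendation feedback loop.

    Hypothetical model: each round, engagement skews toward the majority
    viewpoint (squared share, renormalized), and the algorithm updates
    the feed to mirror that engagement. Small leanings get amplified.
    """
    share_a = initial_bias  # fraction of the feed showing viewpoint A
    history = [round(share_a, 3)]
    for _ in range(rounds):
        # Engagement concentrates on the viewpoint the user already leans toward.
        engagement_a = share_a**2 / (share_a**2 + (1 - share_a)**2)
        share_a = engagement_a  # the algorithm chases the engagement signal
        history.append(round(share_a, 3))
    return history

print(simulate_feedback_loop())
```

Starting from a 60/40 feed, the share of viewpoint A climbs toward 100% in just a few rounds – no malice required, only an algorithm optimizing for what you already click on.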
2. Narrowing interests
Let’s say you’re a huge fan of motivational videos, and you watch them every chance you get – commuting, relaxing at home, whenever. Before you know it, the YouTube algorithm will prioritize showing you mostly motivational content.
Now, don’t get me wrong – there’s nothing inherently wrong with loving motivational videos. But it’s important to remember that the content we consume shapes our worldview and influences our thinking.
If you want to broaden your horizons and learn about different topics, you might have to be a bit more proactive in seeking out that content yourself. Don’t let the algorithm dictate what you see and don’t see!
3. Commercial interests over public interest
Have you ever wondered why some posts or articles seem to get a lot of attention and visibility on social media, even if they’re not high quality or factually accurate? Well, it’s all thanks to the algorithm!
You see, algorithms are designed to prioritize content that generates engagement and clicks. This means that sensational or clickbait headlines can end up receiving more visibility in recommendations.
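A minimal sketch of that incentive: the ranker below scores posts purely on predicted clicks and shares (a made-up weighting), with accuracy nowhere in the formula. All headlines and numbers are invented.

```python
def engagement_rank(posts):
    """Rank posts purely by predicted engagement (clicks + weighted shares).

    Toy sketch of engagement-first curation: quality and accuracy are not
    inputs to the score, so a sensational headline with high predicted
    engagement outranks a careful, accurate piece.
    """
    return sorted(posts, key=lambda p: p["clicks"] + 2 * p["shares"], reverse=True)

posts = [
    {"headline": "Careful analysis of the new budget",
     "clicks": 120, "shares": 10, "accurate": True},
    {"headline": "You won't BELIEVE what the budget hides!",
     "clicks": 900, "shares": 400, "accurate": False},
]
for post in engagement_rank(posts):
    print(post["headline"])
```

The clickbait headline wins the top slot despite being inaccurate, because the scoring function literally cannot see accuracy – only engagement.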
Like the other issues we discussed, this might seem harmless at first, but it can have serious consequences: it can perpetuate false information. So the next time you feel tempted to like or share something, remember this: if it seems too good (or too outrageous) to be true, it probably is.
4. Lack of media diversity
If algorithms are trained on data that reflects the biases and preferences of certain groups, they may end up prioritizing content that reflects those biases. This can have a real impact on the representation of diverse voices.
Let’s say an algorithm is trained on data showing that people of a certain gender or race tend to engage with certain types of content. It may then start to prioritize that content over other, more diverse perspectives, reinforcing harmful stereotypes in the process.
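The skew is easy to illustrate. In the hypothetical example below, 90% of the logged users come from one audience segment, so a ranker that equates "popular in the logs" with "good" simply learns that segment's tastes. The creators, groups, and counts are all invented.

```python
from collections import Counter

def popularity_from_logs(logs):
    """Rank creators by raw engagement counts in the training logs.

    Toy illustration of training-data bias: if the logged users skew
    toward one group's tastes, "popular" just means "popular with that
    group", and other creators sink in the ranking.
    """
    counts = Counter(entry["creator"] for entry in logs)
    return [creator for creator, _ in counts.most_common()]

# 90% of logged engagement comes from one audience segment, so that
# segment's favorite creator dominates the learned ranking.
logs = (
    [{"user_group": "majority", "creator": "creator_A"}] * 90
    + [{"user_group": "minority", "creator": "creator_B"}] * 10
)
print(popularity_from_logs(logs))
# → ['creator_A', 'creator_B']
```

Nothing in the code is "biased" on its face – the bias lives entirely in whose behavior was logged in the first place.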
The best remedy is to intentionally seek out content from a variety of sources and creators rather than relying solely on the recommendations the algorithm serves up.
5. Amplifying conspiracy theories
Have you ever fallen down a rabbit hole of conspiracy theories and false information online? Well, it turns out that algorithmic curation can actually amplify these kinds of ideas and make it harder to discern fact from fiction.
A case in point is the spread of the “QAnon” conspiracy theory, which gained popularity on social media platforms in recent years. The theory promoted the wild idea that a secret cabal of powerful people was running a global child sex-trafficking ring and that former President Donald Trump was secretly fighting to expose and dismantle it.
No evidence was found to support these claims, but the theory gained a large following online, with QAnon believers often sharing and promoting false information and baseless claims.
The problem is that this kind of algorithmic amplification breeds widespread mistrust, as people become more entrenched in particular beliefs and less willing to consider other perspectives.
6. Underrepresentation of important news
One major downside of algorithmic curation is that it may lead to the underrepresentation of important news stories that aren’t as popular or attention-grabbing.
In other words, it may prioritize entertaining and clickbait content over less flashy but still important news. This could result in a limited and skewed understanding of current events.
Which companies practice algorithmic curation?
Many technology companies, including social media platforms, search engines, and e-commerce sites, employ algorithmic curation. These include:
1. Facebook – Facebook uses algorithms to curate and personalize users’ news feeds, based on their engagement history, interests, and demographic data. This has been criticized for amplifying fake news, hate speech, and political propaganda.
2. Google – Google’s search algorithms curate and rank search results based on relevance and authority, but they have also been accused of perpetuating bias and misinformation. For example, Google’s autocomplete feature can perpetuate stereotypes and false information.
3. Amazon – Amazon’s recommendation algorithms curate and personalize product suggestions for users based on their purchase history and preferences. This has been criticized for perpetuating consumerism and promoting products that are not environmentally sustainable.
4. YouTube – YouTube uses algorithms to recommend videos to users based on their viewing history, likes, and comments. This has been criticized for promoting extremist content and conspiracy theories.
It’s important to remember that the content we see on social media is not just a reflection of what’s out there in the world. It’s shaped by the algorithms that curate it. There is no doubt that there are many benefits to algorithmic curation such as improved user experiences, but algorithms have a dark side, too, as we saw in this post.
They can perpetuate biases and amplify false information and conspiracy theories. They can lead to the creation of filter bubbles and a narrow understanding of the world, which can have serious consequences for individuals and society as a whole.
As users, it’s important to be aware of the potential negative impacts of algorithmic curation and actively seek out reliable sources of information. Only then can we make informed decisions and have a well-rounded understanding of the world around us.
©BookJelly. All rights reserved