How algorithms are turning us into a “closed loop” (or even pushing us towards extremism)

This is something that I have been advocating for years: we need to expose ourselves to differing views so that we do not become a “closed loop”.

Birds of a feather flock together. It is well known that humans tend to associate with individuals who are similar to them. To prevent social disintegration, Singapore famously adopted measures that are unheard of in the Western world, such as mandating that only a certain proportion of each race may live in a given residential block, so that different races intermingle.

“I am often accused of interfering in the private lives of citizens. Yes, if I did not, had I not done that, we wouldn’t be here today. And I say without the slightest remorse, that we wouldn’t be here, we would not have made economic progress, if we had not intervened on very personal matters – who your neighbour is, how you live, the noise you make, how you spit, or what language you use. We decide what is right.”

– Lee Kuan Yew (Singapore’s first Prime Minister)

Such measures seemed adequate until technology began exacerbating the problem. How so?

It is well established that social media algorithms promote to us whatever they think we like. When we disagree with a friend, we can simply “unfriend” them. Considering how much time we spend with screens instead of in person, especially with home-based work gaining traction due to the pandemic, it is easier than ever to block out differing views.

The current solution is to steer individuals away from unhealthy material. Several social media platforms have taken steps to prevent users from accessing content associated with hate, discrimination and violence. These, however, are clear-cut topics that run against widely shared values; other inclinations, such as one’s political beliefs, can be far more subtle. When we expose people only to what they like, and to what they are predicted to like, what we create is an echo chamber. Differences between individuals magnify over time, and paradigms end up being reinforced instead of challenged. And paradigms are inherently difficult to overcome until one is presented with overwhelming evidence against them.

Furthermore, the algorithm must be aware of new, trending topics before it can do anything about them, which means the people building it must be aware of them first. With so many new trends, topics and subcultures emerging every day, whether we can address the issue in a timely manner remains questionable.
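The reinforcement loop described above can be sketched as a toy simulation. Everything here is an illustrative assumption on my part (the two topics, the scoring rule, the numbers), not any platform’s actual algorithm; the point is only to show how a recommender that rewards engagement can collapse a feed onto one topic:

```python
import random

random.seed(42)

# Hypothetical user who starts with only a mild preference for topic "A".
user_pref = {"A": 0.6, "B": 0.4}   # probability of "liking" a post on each topic
scores = {"A": 1.0, "B": 1.0}      # the recommender's engagement estimates

feed = []
for _ in range(200):
    # Always recommend the topic the algorithm currently scores highest.
    topic = max(scores, key=scores.get)
    feed.append(topic)
    # Every "like" reinforces that topic's score; the other topic never
    # gets a chance to prove itself, so its score never grows.
    if random.random() < user_pref[topic]:
        scores[topic] += 0.1

share_a = feed.count("A") / len(feed)
print(f"Share of feed on topic A: {share_a:.0%}")
```

A mild 60/40 preference is enough: topic B is never shown again after the first tie-break, so the feed becomes 100% topic A. The echo chamber is a property of the feedback loop, not of any strong initial bias in the user.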

While it may not turn us all into extremists, at the very least it keeps our eyes glued to the screen.




