Media Buddhi

Republic TV, FB, WhatsApp, YouTube: What Do They Have in Common?

Echo Chambers and Filter Bubbles, that's what.

By - H R Venkatesh | 25 Nov 2020 4:20 PM GMT

It shouldn't matter whom we support or what we believe, so long as we are exposed to a multitude of perspectives. But this condition is getting increasingly difficult to achieve today, because of filter bubbles and echo chambers.

Here's a quick experiment or game that can help us understand these concepts.

We'll need: access to a digital device (a laptop is best, but a tablet or mobile phone will do too) and a YouTube account.

  1. Go to YouTube as you normally do and glance at the recommended videos. (If you're on a phone, open the YouTube app.)
  2. Next, open YouTube in a different browser and look at the video choices before you. You will see that the recommended videos in this browser are different from the ones in your regular browser or app.
  3. You can go a step further. In your non-regular browser, click on videos you wouldn't normally watch. Once you do that, YouTube will give you more recommendations based on what you watched. Click on a few more videos that you would not ordinarily view. If you have the time, do this off and on for a couple of days.
  4. Now compare the results. Your two YouTube pages (in your regular and non-regular browsers) will look entirely different.

This experiment works better if you open YouTube in Incognito mode (Chrome) or Private mode (Safari, Firefox).

You can do the same thing with Facebook, except that you will need to use two accounts (if you have them) or be willing to alter your existing one. Go to Facebook and start 'liking' posts and pages that you would normally never 'like', and before too long the feed will look entirely different.

Hacking the algorithm

Masato Kajimoto, a news literacy professor based at the University of Hong Kong, conducts this exercise with his students. He calls it 'hack the algorithm'. If anything, he says, Facebook is easier to manipulate than YouTube because the feed changes or adapts more quickly to your inputs.

The game is called 'hack the algorithm' because both YouTube and Facebook are driven by algorithms: computer programs that automatically decide what you would like to see based on your previous behaviour.
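
Neither YouTube nor Facebook publishes its recommendation code, but the basic feedback loop is simple enough to sketch. The toy Python snippet below (all titles, topics and the scoring rule are invented for illustration, not taken from any real platform) ranks videos by how much their topics overlap with what a viewer has already watched; two viewers with different histories immediately get different feeds.

    # Illustrative only: a toy recommender that scores videos by how closely
    # their topics match the viewer's watch history. Real platforms use far
    # more signals, but the feedback loop works the same way.
    from collections import Counter

    CATALOGUE = [
        {"title": "Cricket highlights",        "topics": ["sport", "cricket"]},
        {"title": "Football skills",           "topics": ["sport", "football"]},
        {"title": "Street food tour",          "topics": ["food", "travel"]},
        {"title": "Moon landing was faked?!",  "topics": ["conspiracy", "space"]},
        {"title": "Flat Earth 'proof'",        "topics": ["conspiracy", "earth"]},
        {"title": "Deep state exposed",        "topics": ["conspiracy", "politics"]},
        {"title": "Pasta recipe",              "topics": ["food", "cooking"]},
    ]

    def recommend(watch_history, catalogue, k=3):
        # Count the topics the viewer has already watched...
        interests = Counter(t for video in watch_history for t in video["topics"])
        # ...then score every unwatched video by how many of those topics it shares.
        unwatched = [v for v in catalogue if v not in watch_history]
        return sorted(unwatched,
                      key=lambda v: sum(interests[t] for t in v["topics"]),
                      reverse=True)[:k]

    # Two viewers with different histories get very different top recommendations.
    viewer_a = [CATALOGUE[0]]                  # watched a cricket video
    viewer_b = [CATALOGUE[3], CATALOGUE[4]]    # watched two conspiracy videos

    print(recommend(viewer_a, CATALOGUE, k=1)[0]["title"])   # -> 'Football skills'
    print(recommend(viewer_b, CATALOGUE, k=1)[0]["title"])   # -> 'Deep state exposed'

The more conspiracy videos viewer B watches, the more the "conspiracy" topic dominates their interest count, and the more such videos the sketch recommends: the loop reinforces itself.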

This can be dangerous. If a man, say X, likes to watch conspiracy theory videos on YouTube, then the algorithm will start recommending more and more extreme types of videos to him. (We'll get into why algorithms do this a little later in the post.)

Filter Bubbles

On its own, a habit of watching conspiracy theory videos on YouTube is relatively harmless, but algorithm-driven personalised recommendations are everywhere, not just on YouTube and Facebook. Twitter and Instagram give us feeds tailored to our preferences and biases too.

Scarily enough, even a Google search is dictated by algorithms. Duck Duck Go, a search engine that focuses on users' privacy, said in a study that your personal data informs search results. It stated that some search hits are "moved up or down or added to your Google search results, necessitating the filtering of other search results altogether". This, Duck Duck Go claimed, was especially true of political topics:

"…undecided and inquisitive voters turn to search engines to conduct basic research on candidates and issues in the critical time when they are forming their opinions on them. If they're getting information that is swayed to one side because of their personal filter bubbles, then this can have a significant effect on political outcomes in aggregate."

So when we spend hours every day hopping from YouTube to Facebook to Twitter to Instagram and checking stuff on Google search, we're relentlessly exposed to a certain worldview driven by algorithms. This worldview can shape our attitudes and then our behaviour.

It's like each of us is living in a bubble of our own.

This is called a 'Filter Bubble', a bubble created by filtering out different perspectives.

Echo Chambers

But it's not just algorithm-driven content that places us in bubbles. Even non-algorithmic content does this.

For example, WhatsApp groups function without algorithms. Yet, they too create or reinforce a particular worldview. Being in a WhatsApp group is like being in a chamber where you hear voices like your own being echoed back to you. Hence the term echo chamber.

Echo chambers are found offline too. If you regularly watch a particular TV news channel, for example Republic TV or Fox News, you are exposed to a very particular set of attitudes. (The same is true of TV news networks with a different slant.)

We are exposed to echo chambers (not driven by algorithms) and filter bubbles (driven by algorithms) every day, every hour and every minute. Is it any wonder then that we are unable to agree on anything, even on basic facts that ought to be indisputable?

Why do we stay in echo chambers and filter bubbles?

Life is tough, even for the relatively privileged. And we humans are wired towards finding other people who think like us.

People who prefer echo chambers and filter bubbles that reflect back extremist perspectives are often people who are isolated, angry and alienated. Even those who don't feel isolated have grievances—like all humans—that can be easily weaponized in this new world of hyper connectivity.

The philosophy professor C. Thi Nguyen at Utah Valley University writes that echo chambers are "the real problem". He says that filter bubbles are easier to pop and we can do that by exposing those inside these bubbles to "arguments they've missed".

He takes climate change deniers as an example:

"They are fully aware of all the arguments on the other side. Often, they rattle off all the standard arguments for climate change, before dismissing them. Many of the standard climate change denial arguments involve claims that scientific institutions and mainstream media have been corrupted by malicious forces.
What's going on, in my view, isn't just a bubble. It's not that people's social media feeds are arranged so they don't run across any scientific arguments; it's that they've come to systematically distrust the institutions of science."

His piece is worth reading, because he adds a layer of nuance to our understanding of filter bubbles and echo chambers.

Why don't social networks change their algorithms?

One might reasonably ask: why don't social networks alter their algorithms in order to serve us multiple viewpoints?

The quick answer to that is: money. These networks make more money when we stay on their platforms for longer periods of time. They ensure this by serving us content that appeals to our sense of community, or to our fear and anger. Here, extremist viewpoints work better than moderate ones.
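
The platforms do not disclose their ranking formulas, but the incentive is easy to sketch: if posts are ordered purely by how long they are predicted to keep us engaged, the most emotionally charged content rises to the top. The Python snippet below is a made-up illustration, with invented headlines and engagement numbers, not anyone's actual algorithm.

    # Illustrative only: rank posts purely by predicted time-on-platform.
    # Calm, factual items sink; outrage floats to the top of the feed.
    posts = [
        {"headline": "City council passes budget",          "predicted_minutes": 0.6},
        {"headline": "Local team wins quietly",              "predicted_minutes": 0.9},
        {"headline": "THEY are hiding the truth from you",   "predicted_minutes": 4.2},
        {"headline": "Outrage: you won't BELIEVE this",      "predicted_minutes": 3.8},
    ]

    # Sort the feed by how long each post is predicted to keep you scrolling.
    feed = sorted(posts, key=lambda p: p["predicted_minutes"], reverse=True)

    for post in feed:
        print(f'{post["predicted_minutes"]:.1f} min  {post["headline"]}')

With this objective, the two inflammatory posts appear first every time, simply because they are predicted to hold attention longer.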

The media does this too, on a daily basis, but Silicon Valley has raised it to an art, having spent millions, if not hundreds of millions, of dollars on the science of attention hacking and addiction.

This morning, I received a message from an Indian reader of Media Buddhi who tracks US politics closely. He had just read last week's issue on political gaslighting and felt strongly that I had presented only one side of the issue. With his permission, I'm including the message here with some edits:

"Gaslighting - while it was very well written, I thought it was too one-sided. The media's concerted effort to cover up Joe Biden's obvious cognitive decline (he forgot what office he is running for, for crying out loud) or for that matter the 1983 Nellie massacre come to mind...I may be completely wrong here, but to express only one side's point of view and covering up the crimes/ineptitude of the other by journalists could also be clubbed under gas-lighting, no?! In fact, when journalists/media companies decide NOT to present all sides of the story (given that tribalism, echo chambers, news bubbles etc., exists - which you have yourself written about not too long ago) shouldn't it be clubbed under gas-lighting as well? We in the South, after all, like to give Ravana and Duryodhana the same benefit of doubt (or at least some) unlike those who believe everything is black and white!"

What do you all think?