Bias: Four-Letter Word That Explains The World Of Misinformation

The idea that biases affect our judgment of what is trustworthy and shareable is fairly recent. The more aware we are of them, the better.

You know that giddy feeling when you've learnt something new and your entire worldview has been reoriented because of it?

That's how I felt when I learned about the concept of biases.

However, I don't mean the word quite as we use it in everyday conversation, as in "he's so biased against me" or "I'm biased towards her", but as social scientists use it to describe a characteristic human trait. (Of course, there is considerable overlap between the two usages.)

So what are biases?

There are several definitions out there, but I'll go with this: biases are hidden prejudices that shape what we see, think, and do.

It is both a profoundly simple and comprehensive definition, and I literally took it from the title of Jennifer L. Eberhardt's brilliant book, Biased: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do. (It is also a very timely book, because she uses the lens of race through which to examine the issue.)

Eberhardt is a professor of psychology at Stanford University and a recipient of the MacArthur 'genius' grant. She is also African-American—a point she uses to drive home the idea of bias.

She writes that the first 12 years of her life were spent in an "all-black world." What she means is that until then, every single meaningful interaction she had was with other black people.

However, when her parents moved to a more affluent neighbourhood, she came into contact with white people. She also had to switch schools, and at her new school, she had an unexpected crisis. The then 12-year-old was expecting to be treated badly on account of her race, but her new friends were friendly, and indeed, went out of their way to make her feel welcome.

It was she who felt she was treating them badly. Not because she wanted to, but because she had an inexplicable problem: she kept forgetting her new friends' names. Indeed, she not only forgot their names, but also regularly confused one person with another.

As she writes, this was because she was "confronted with a mass of white faces that I could not distinguish from one another" every single day.

She continues:

"I'd had no practice recognizing white faces. They all looked alike to me. I could describe in detail the face of the black woman I happened to pass in the shopping mall. But I could not pick out from a crowd the white girl who sat next to me in English class every day."

Eberhardt's point is that this is a universal thing. People are better at recognizing people from their own race.

Her story rang a bell for me because I remembered that as a young child, I too had trouble distinguishing white and black people's faces. Ditto with fellow Indians from the north east—people who are usually described as having 'East Asian' features. Once I started watching sports regularly, I was exposed to a wide variety of faces, but I can still remember the early incomprehension.

How does this square with the idea of bias?

Eberhardt says that this inability to see unfamiliar groups as comprising unique individuals arises from both biological reasons and social conditioning. It leads us to create categorizations and associations that quickly turn into stereotypes. And when we create stereotypes, we attach certain feelings and judgements to them. Sometimes, unpleasant and traumatic experiences can calcify into stereotypes that become difficult to dislodge. Such stereotypes are then transmitted to other people who imbibe them, and in turn influence others as well, creating a chain that can stretch across generations, cultures and regions.

As Eberhardt writes, "simply seeing a black person can automatically bring to mind a host of associations that we have picked up from our society: this person is a good athlete, this person doesn't do well in school, this person is poor…this person should be feared. The process of making these connections is called bias. It can happen unintentionally. It can happen unconsciously. It can happen effortlessly. And it can happen in a matter of milliseconds."

These types of bias are not restricted to race alone. They come into play whenever you are confronted with a group that you are not familiar with. In May 2019, when I was one of 17 JSK Fellows at Stanford, I posed the following question to Eberhardt during her visit to our cohort:

"How would you approach understanding bias when it comes to caste?"

She replied that race can be a proxy for many other situations: "Certain elements of bias operate in the same way no matter whether it is race, caste or gender. Our brains are categorization machines…that process of categorizations and acts of stereotyping are universal."

So it's not just race, caste and gender. Biases can unconsciously affect us in many other situations. I've spent 15 years in New Delhi, but I still view people I meet for the first time through my Bangalore eyes. I think at least two biases are at play here: language bias and region bias. I believe I'm mostly aware of these biases, but perhaps I'm fooling myself.

How do biases affect the world of misinformation and polarization?

To be clear, Eberhardt is writing about a particular set of biases, called implicit biases, in her book. There are many other types, and many of them are grouped under the term 'cognitive bias'. We will get into those biases—especially 'confirmation bias'—in another issue of this newsletter. But for now, consider this:

If I have a negative stereotype regarding people of a certain gender, caste, race, religion, class, language, sexuality, region, able-ness, etc., I am primed to believe bad stuff about them—especially if the person who spreads the bad stuff is from my group. If someone confronts my negative stereotypes, I might actually think of them as biased.

Here's an example.

X is a man who has been brought up in a household that is anti-feminist. X would have imbibed all kinds of unconscious biases about women, about how they ought to behave, what they should wear, whom they should talk to, what professions they should enter, and so on. X also has a difficult life, partly because he doesn't have the habit of taking responsibility for his reaction to things that happen to him. X would then be highly susceptible to any messaging that criticizes the feminist movement. He might dismiss those who attempt to counter his narrative by casting doubt on their intentions.

In many ways, we are all X. We all have tough lives, and not many of us consistently take responsibility for our reactions to difficulties. If it's not gender, then it's race, caste, religion or any other social construct.

So what ultimately happened to Jennifer L. Eberhardt in school?

Here's where we last left her story: she was having trouble recognizing and remembering her friends. Eberhardt writes that as a result, she began to be afraid of hurting the feelings of her new friends. During one particular period, she would see her friends "whispering among themselves", and when she would join them, "they'd fall silent".

However, this is a story with a happy ending. One day, she was invited to a restaurant lunch by a popular girl in school, and when she arrived, her classmates yelled out "Happy birthday!":

"I scanned their faces and realized that these were the classmates I'd seen whispering in the hall, planning a surprise party for the new girl who still hadn't managed to get their names right."

Further reading

For those who want to explore the fascinating ways in which we are hardwired for biases, read Daniel Kahneman's 2011 book Thinking, Fast and Slow. I first came across it in 2013 and my worldview was reoriented (as I said at the beginning of this piece). Kahneman, who won the Nobel Prize in Economics for his work with the late Amos Tversky, writes of 'System 1' or 'fast thinking' (the kind that leads us to fall back on our biases) and its antidote, 'System 2' or 'slow thinking' (the kind that helps us examine our biases).

For those who prefer their non-fiction with a narrative (like me), an account of Kahneman's work and his friendship with Tversky is detailed in the wonderful book The Undoing Project: A Friendship That Changed Our Minds by Michael Lewis.

In Nudge: Improving Decisions About Health, Wealth, and Happiness, the economists Richard Thaler and Cass Sunstein write about the same two systems. Only they refer to System 1 (or fast thinking) as the Automatic System, and System 2 (or slow thinking) as the Reflective System.

Note: This piece was originally published on our Substack website.

If you value our work, we have an ask:

Our journalists work with TruthSeekers like you to publish fact-checks, explainers, ground reports and media literacy content. Much of this work involves using investigative methods and forensic tools. Our work is resource-intensive, and we rely on our readers to fund our work. Support us so we can continue our work of decluttering the information landscape.

📧 Subscribe to our newsletter here.

📣 You can also follow us on Twitter, Facebook, Instagram, YouTube, LinkedIn and Google News.