A new study has contradicted the long-held belief that the ongoing fake news epidemic in India is directly tied to low digital literacy and ignorance. The study revealed that prejudice and ideology along with religion, caste and gender played a much bigger role in the mass forwarding of mis/disinformation.
The study was conducted by media and communication researchers at the London School of Economics (LSE), and was one of 20 research projects to receive a $50,000 grant from WhatsApp.
The aim of this grant was to independently study the use of the instant messaging app in the spread of mis/disinformation leading to violent consequences around the world.
Mob Violence And Technology
The past decade has seen a steady rise in mob violence in India, with rumours of cattle theft, cow slaughter and child kidnapping playing a big part in the gathering and mobilisation of mobs.
While mob violence itself is not new, the use of technology - especially of instant messengers like WhatsApp - to rapidly spread such rumours has become a major point of debate in India when discussing the causes of such mob activity.
LSE researchers Shakuntala Banaji and Ram Bhat teamed up with Anushi Agrawal, Nihal Passanha and Mukti Sadhana Pravi to carry out a research project that would look into the utilisation of WhatsApp in triggering or organising mob violence in India.
Titled "WhatsApp Vigilantes: An exploration of citizen reception and circulation of WhatsApp misinformation linked to mob violence in India", the study notes that sociopolitical context and xenophobia play a major role in fuelling such incidents.
The project was conducted through in-depth qualitative interviews with expert stakeholders such as people in law enforcement, legal services, government and journalism, along with focus groups with various sets of users across Karnataka, Maharashtra, Madhya Pradesh and Uttar Pradesh. A total of 275 people were interviewed between November 2018 and August 2019.
Tech Literacy vs Prejudice
"People forgave themselves for forwarding quite dangerous and violence-inducing misinformation around the time of Pulwama (terror attack), because they felt that they were doing their national duty at that time."- Dr. Shakuntala Banaji, Researcher, London School of Economics and Political Science
One of the most crucial and contentious revelations of the study was that the influence of low digital literacy in the proliferation of fake news in India is largely exaggerated.
Instead, the study found that pre-existing beliefs and prejudices against discriminated groups such as Muslims, Christians, Dalits and Adivasis played a much bigger role in the sharing of misinformation and hate speech by upper and middle caste Hindu men - both rural and urban - and, in some cases, women.
"A key finding is that in the case of violence against a specific group (Muslims, Christians, Dalits, Adivasis, etc.) there exists widespread, simmering distrust, hatred, contempt and suspicion towards Pakistanis, Muslims, Dalits and critical or dissenting citizens amongst a section of rural and urban upper and middle caste Hindu men and women. WhatsApp users in these demographics are predisposed both to believe disinformation and to share misinformation about discriminated groups in face-to-face and WhatsApp networks. Regardless of the inaccuracy of sources or of the WhatsApp posts, this type of user appears to derive confidence in (mis)information and/or hate-speech from the correspondence of message content with their own set of prejudiced ideological positions and discriminatory beliefs." - Excerpt from the study
The lack of digital literacy has conventionally been considered a major factor in the spread of misinformation, and WhatsApp has undertaken large-scale initiatives in India to impart digital literacy as a measure to curb fake news.
The LSE study, however, goes against this convention, stating that technologically literate men from upper and middle class backgrounds in both urban and rural areas were found to be more likely to share certain types of ideologically charged mis/disinformation and hate speech.
"Digital skills and functional media literacy allied to strong ideological prejudices and hatred against the other can assist the spread of disinformation, and sometimes be the trigger for misinformation." - Excerpt from the study
In contrast, it suggested that WhatsApp users from lower castes, Dalits, Muslims and/or women from rural areas with lower levels of technological literacy are "less likely to create and curate and unlikely even to forward ideologically-charged misinformation and disinformation."
Xenophobia, Trauma and Emotional Exhaustion
However, when it came to rumour-based violence about child kidnappers and organ harvesting, the motivation to share such rumours stemmed from xenophobia rather than ideology.
"In the arena of malicious rumour-based violence about child snatchers and organ harvesting, user motivations for forwarding misinformation and fake news stories are not based on ideology but rather on a generalised mistrust of strangers, and on affinity to the message sender and extreme trust in the group through which the user received the message. Based on fieldwork, participants in the spread of such rumours seem overwhelmingly to be male; albeit a minority, women, too, participate in online, face-to-face and telephonic sharing of concerns around such issues, and are also sometimes culpable in shielding the perpetrators of violence during the planning and execution of lynchings." - Excerpt from the study
The study also found that the emotional disturbance and exhaustion caused by viewing violent or overwhelming content played a role in spreading misinformation: users would forward such messages either to discuss the trauma felt from viewing the content or pass them on without checking them fully.
"In both cases of violence motivated by prejudice (against a particular group or community) and cases motivated by rumour, many WhatsApp users we spoke to also acknowledged the affective and temporal labour required to contribute to, circulate and consume misinformation. In some cases, the emotional disturbance felt by users on viewing a clip of spectacular violence or overwhelming content (train or road accidents, harm caused by natural disasters) impelled the recipients of these WhatsApp messages to share them with others and/or discuss them within their networks. In other cases, this kind of content contributed to a sense of emotional fatigue and exhaustion whereby WhatsApp users would forward disinformation without checking the message fully or would bulk delete messages from their most prolific groups." - Excerpt from the study
Related: In July 2019, BOOM did a podcast episode on the vicarious trauma experienced from watching violent content on WhatsApp and social media.
The Case Of End-To-End Encryption
WhatsApp's end-to-end encryption - which rules out the traceability of messages - has long been considered the big roadblock to authorities' efforts to curb fake news.
The Indian government has made several requests in the past to WhatsApp for local data storage and access to user messages in India.
In contrast, the researchers behind the LSE study found that end-to-end encryption had little to do with the forwarding of misinformation.
Dr. Shakuntala Banaji, one of the authors of the report, told BOOM that the view that end-to-end encryption was the real problem was "found to be a false premise" during the course of the study.
"A lot of the stuff that went viral on WhatsApp was also being spread on social media. In cases where people had actually reported the originators of something, nothing was being done about it," she said.
"We feel it's a red herring, to go after encryption. It's about the ideological motivations of those who are enforcing the law. There are laws about hate speech but they're selectively enforced and selectively ignored. De-encrypting messages simply plays into the hands of an already authoritarian set of state practices," she added.
What Could Be Done?
The violent consequences of misinformation spread through instant messengers have led to increasing pressure on tech companies to accept accountability for providing a platform for such misinformation to spread.
However, the LSE study suggested that tech companies alone are not to blame, and recommended that national and local governments, along with civil society organisations and local WhatsApp users, "share greater responsibility and are held accountable in a systematic way until justice has been served and the misuse is demonstrably curbed."
The study also made several recommendations to WhatsApp that could help curb the spread of mis/disinformation on the app.
One of the recommendations stated that WhatsApp introduce a "beacon" feature, "where a warning/advisory can be broadcast from WhatsApp to users in specific locations about specific issues".
"Finally, we also recommend that WhatsApp introduce a mechanism whereby users, especially women and sexual minorities are able to report hate speech, misogyny, sexual violence etc. on a separate fast tracked route in partnership with local or state-level law enforcement." - Excerpt from the study
Need For Further Research
While the study lays the groundwork for further research on human-tech interaction in the Indian sociopolitical context, much remains to be done to fully understand the situation unfolding in countries like India and Brazil with respect to the use of smartphones and the internet.
With a user base of 400 million, India is currently the biggest market for WhatsApp.
Banaji told BOOM that future researchers should look into expanding the scale of research by carrying out similar studies across different states in India. She also stated that better access to resources and more cooperation from government officials would be useful for such research.
"It's important to get access to, and comment from, government officials who participate in such groups; it would be interesting to get more access - keeping within ethical boundaries - to examples of bullying against those who stand up to hate speech in groups, if any."