Social media platforms, and the tech companies that own them, were the major topics of discussion for the second week of BOOM's #TruthSeekers Fest.
This week, we explored the role of Big Tech in fighting misinformation/disinformation and polarisation, with four esteemed panelists - Tarunima Prabhakar, research lead at Tattle, Lucina Di Meco, gender equality expert and co-founder of #ShePersisted, Berges Y. Malu, director of public policy and communications at ShareChat, and Apar Gupta, executive director at the Internet Freedom Foundation (IFF).
The panelists presented us with a diverse set of opinions on how tech companies could improve their contribution to the fight against fake news, and on whether the law could be made more effective in taking action against those who spread disinformation and engage in trolling.
Tech Platforms And Information Overload
The talk started off with Prabhakar weighing in on how the information overload created by social media platforms has taken a toll on our critical thinking, which eventually pushes us to share unverified information.
"There has been a spike in the amount of information that we are hit with. As a result of this information overload, there is a cognitive coping mechanism, that you become less critical. In such a situation we end up falling for something that is inaccurate," Prabhakar said.
She added that a small nudge prompting people to verify information before forwarding it might slow its spread a little, but noted that motivated reasoning can push people to believe and share information even when they perceive it to be inaccurate.
"Social media content often align with our historical, cultural and political beliefs. People maybe be aligned to believe such information, even if they are inaccurate, that is motivated reasoning. There is also lazy reason, where people just share something because they are too lazy to verify," she added.
Accountability: Tech Companies vs Law Enforcement
Berges Y. Malu, who represented the tech industry among the panelists, stated that social media platforms do have a responsibility to stop inaccurate information. However, he added that the work of correcting such information should lie with fact-checkers, rather than the platforms.
"And this is why we need fact-checkers, to stop such information. If I do it, I will be accused of being right-wing, left-wing or centrist," Malu said. "Platforms should remain independent and not lose ability to have safe harbour privileges."
However, Apar Gupta, a lawyer and digital rights activist, disagreed, arguing that the most effective solutions to the fake news problem lie at the technical and platform level, rather than the legal one.
"Such platforms do provide social value through the facilities to exchange information. We saw that during the second wave (of the pandemic) on how social media was used to arrange medical facilities. However, certain choices made on the algorithms, such as to not make it open to public review, or review by independent experts create an issue," Gupta said.
Quoting American law professor Frank Pasquale, Gupta said that such decisions create a black box society, where secret algorithms determine our choices.
Enforcing Stereotypes Through Tech
Lucina Di Meco, who had previously conducted research in over 30 countries, including India, on women's experiences on social media, highlighted how tech platforms were being used to reinforce stereotypes about women in order to keep them out of the political space.
"We found that women were being targeted with overwhelming amount of abuse and with disinformation narratives, particularly women in politics," Di Meco stated.
"Using gender stereotypes, women are often portrayed as untrustworthy, stupid, oversexual, transgender, asexual. It passes along the message that women are unfit for office," she added.
Echoing Di Meco, Gupta gave the example of 'Sulli Deals', a misogynistic app hosted on GitHub for a limited period of time, which auctioned the Twitter profiles of politically active Muslim women.
He also used this example to explain how the law is insufficient in dealing with such issues, highlighting the police's inaction against the makers of the app.
"When it got played up in the press, the police was pressured to file a First Information Report against the makers of the app. The police says, GitHub is not cooperating. However, GitHub has well laid-out policies, which includes request for information," he said.
"I saw this as a lawyer, as a person who believes in the constitution and the law, that the legal response to this is not the most efficient one," Gupta added.