Facebook Aware Of Anti-Muslim Content In India: Whistleblower Frances Haugen

The hearing was one of the most impactful yet on the topic of regulating big tech companies, and marks a historic crisis for the aggressively growing Facebook.

By - Archis Chowdhury | 7 Oct 2021 2:54 AM GMT

Facebook whistleblower Frances Haugen testified before the United States Senate on Tuesday, where she alleged that top-level Facebook executives blatantly disregarded the company's own internal research highlighting the platform's harmful effects on foreign democracies and on the mental health of children.

"There is a pattern of behaviour that I saw at Facebook - Facebook choosing to prioritize its profits over people," Haugen said during the testimony.

Haugen, a former product manager on the platform's civic integrity team, had previously leaked documents to the Wall Street Journal for an exposé on Facebook and its decision to rake in profits at the cost of misleading its users and causing mental health issues, especially among young users.

The hearing was one of the most impactful yet on the topic of putting a leash on big tech companies, and marks a historic crisis for the aggressively growing Facebook.

In a series of complaints filed by Haugen and her lawyers with the US Securities and Exchange Commission, an undated internal company document called 'Adversarial Harmful Networks – India Case Study' was cited, which highlighted the company's awareness of the prevalence of anti-Muslim content in India, along with "fear mongering" content promoted by RSS user groups and pages.

"Anti-Muslim narratives targeted pro-Hindu populations with [violent and incendiary] intent… There were a number of dehumanizing posts comparing Muslims to 'pigs' and 'dogs' and misinformation claiming the Quran calls for men to rape their female family members," the complaint says.

In response to Haugen's testimony and allegations, Lena Pietsch, director of policy communications at Facebook, released a statement aimed at discrediting the details she shared about the company, while calling for new internet regulations.

"It's time to begin to create standard rules for the internet," Pietsch said in a statement. "It's been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act."

Here are the key takeaways from the hearing:

Also Read: WSJ Exposé On Facebook & BJP Triggers Political Row In India

Facebook's Big Tobacco Moment

A phrase repeated several times during the hearing, both by Haugen and by the US lawmakers questioning her, was that Facebook (and to an extent big tech) was facing its 'Big Tobacco moment'.

This was a reference to a 1994 hearing, when Big Tobacco companies appeared before the US Congress and falsely testified that tobacco was not addictive, despite having internal research showing it was.

Haugen drew a parallel with how US lawmakers took action against tobacco companies after being convinced of the harmful effects of tobacco, and urged Congress to do the same by regulating big tech companies for the harm they do to democracy and to the mental health of children.

Also Read: Qatar Has Put India On 'Exceptional Red' List. How To Travel There?

The Adverse Impact On Children

One of the central topics of discussion during the hearing was the harmful effects social media, especially Instagram, had on children, including those who are underage.

Haugen brought up downstream MSI (meaningful social interactions), an algorithm that predicts which content is likely to go viral and then promotes it on the platform, and explained how it frequently pushed content promoting eating disorders, misleading information, hatred and divisiveness to users from an early age.

"Facebook knows its engagement ranking on Instagram can lead children from very innocuous topics like healthy recipes to anorexic content over a very short period of time," Haugen alleged. "Facebook knows they are leading young users to anorexia content."

She further alleged that children were a demographic of interest for Facebook, referring to the company's children-focused initiative "Instagram Kids", which has been paused since the company came under public scrutiny.

"I would be sincerely surprised if they do not continue working on Instagram kids," Haugen speculated, responding a senator's question on the topic. "Facebook intends to make sure that the next generation is just as engaged with Instagram as the current one, and the way they'll do that, making sure children establish habits before they have good self-regulation."

Also Read: Explained: India's Coal Supplies Are Running Dangerously Low

Disregarding Misuse Of Platform Abroad

Haugen also highlighted during the hearing that Facebook does not pay the same heed to hateful, divisive and misleading content in non-English languages as it does to English content.

"Facebook invests more in users that make them more money, even though danger may not be evenly distributed based on profitability," she said.

She added that 87 per cent of Facebook's spending on tackling misinformation goes solely to English content, even though only 9 per cent of its users are English speaking.

This resource gap, she said, is fuelling violence in Ethiopia, as it did in Myanmar.

Also Read: 2017 Video From France Of Muslims Praying On A Street Peddled As UK

"Buck Stops With Mark"

On the question of who at Facebook decides to implement measures that may prove harmful, Haugen repeatedly highlighted the central role played by Facebook CEO and founder Mark Zuckerberg, compared with the founders of other tech companies.

While current staff members have increasingly beaten around the bush to keep any single name from appearing as the lead decision-maker, Haugen was unequivocal about who makes the big decisions at Facebook: "The buck stops with Mark [Zuckerberg]."

Haugen pointed out that Zuckerberg owns more than 50 per cent of the voting shares in the company, and that he alone is accountable for the decisions made by the company and their consequences.

Regulating Big Tech

Haugen also suggested several steps that could be taken to regulate big tech platforms.

The measures included greater transparency from Facebook on its news feed algorithm and downstream MSI. Haugen said an independent government body, staffed with former tech employees who understand these algorithms, would be needed to frame the policies and regulations that should be implemented.

She also said that the news feed should be made chronological, rather than ranking content through a virality-predicting algorithm. She further urged Facebook to disclose its internal research and to declare moral bankruptcy over its past decisions to disregard that research's findings, especially on the platform's harmful effects on children.

You can watch the hearing here:
