News

Meta, OpenAI Combat Manipulated Content Targeting Indian Elections, Sikh Groups

While Meta removed 'Chinese-origin' pages targeting the Sikh community, OpenAI disrupted a 'covert influence campaign' by an Israeli firm using its AI tools for 'anti-BJP content'.

By Hera Rizwan | 3 Jun 2024 4:01 PM IST

Amid a surge in manipulated content aimed at India, two major tech firms, Meta and OpenAI, each released reports this week detailing their actions against accounts posting content intended to sway public opinion, particularly regarding India's general elections.

In its report, OpenAI, the US-based artificial intelligence (AI) company behind ChatGPT, revealed its disruption of a covert influence campaign by an Israeli firm. The campaign used OpenAI's AI model to fabricate fake social media personas and generate content related to the Indian elections, including 'anti-Bharatiya Janata Party (BJP) material', which was disseminated across multiple social media platforms.

On the other hand, Meta, the parent company of Facebook, Instagram, and WhatsApp, announced the removal of numerous Facebook accounts, pages, and groups due to violations of its policy against "coordinated inauthentic behavior". Originating from China, these accounts, groups, and pages specifically targeted the Sikh community, extending beyond India to various other countries.

'Israeli firm STOIC tried to disrupt Lok Sabha polls 2024': OpenAI

According to its report, OpenAI intervened in certain activities related to the Indian elections within less than 24 hours of their commencement in May. This is also the company's first report on ‘AI and Covert Influence Operations: Latest Trends’.

The company highlighted that STOIC, an Israeli political campaign management firm, was producing content concerning the Gaza conflict and, to a lesser degree, the Histadrut trade union organisation in Israel, alongside the Indian elections.

Histadrut, or the General Organisation of Workers in Israel, is Israel's national trade union centre and represents the majority of Israel's trade unionists.

Alluding to the network operated by STOIC, the firm said, “We banned a cluster of accounts operated from Israel that were being used to generate and edit content for an influence operation that spanned X, Facebook, Instagram and YouTube.”

The company added, “In some cases, we identified this operation’s fake accounts commenting on social-media posts made by the operation itself, likely in an attempt to create the impression of audience engagement.” Nonetheless, as per OpenAI, this campaign garnered minimal engagement.

Additionally, it stated that in May the network shifted its focus to generating comments centered on India, critiquing the ruling BJP and praising the opposition Congress party.

Responding to OpenAI's report, the incumbent BJP labeled this a “dangerous threat” to democracy, stating that OpenAI should have informed the public when the threat was first detected in May.

Minister of State for Electronics and IT Rajeev Chandrasekhar wrote on X, “It is absolutely clear and obvious that BJP was and is the target of influence operations, misinformation, and foreign interference, being done by and/or on behalf of some Indian political parties.”

Last month, Decode also identified at least 8 chatbots in OpenAI's GPT Store that were created for the Indian elections and clearly violated the company's policy on election campaigning. The chatbots were later taken down.

Anti-Sikh fake accounts backed by China: Meta

Meta, in its 'Adversarial Threat Report', stated that it had taken down 37 Facebook accounts, 13 Pages, five Groups, and nine Instagram accounts for breaching its policy on coordinated inauthentic behaviour. This network, originating in China, aimed its efforts at the global Sikh community, spanning countries such as Australia, Canada, India, New Zealand, Pakistan, the UK, and Nigeria.

The activity extended across multiple social media platforms and comprised several clusters of fabricated accounts, including one linked to an unattributed Coordinated Inauthentic Behavior (CIB) network. CIB involves orchestrated attempts to influence public discourse for strategic purposes, with fake accounts playing a central role. This particular network, also originating from China and previously disrupted by Meta in early 2023, targeted India and the Tibet region.

The report read, "They appeared to have created a fictitious activist movement called Operation K which called for pro-Sikh protests, including in New Zealand and Australia. We found and removed this activity early, before it was able to build an audience among authentic communities.”

According to Meta, their posts mainly revolved around news and current affairs in both English and Hindi. These included images that appeared to be altered using photo editing software or created by artificial intelligence, “in addition to posts about floods in the Punjab region, the Sikh community worldwide, the Khalistan independence movement, the assassination of Hardeep Singh Nijjar, a pro-Khalistan independence activist in Canada, and criticism of the Indian government”, it added.

Meta also confirmed that it removed 510 Facebook accounts, 11 pages and one group, as well as 32 Instagram accounts, linked to the same operation associated with STOIC. The company announced that it has banned STOIC from its platforms and issued a cease-and-desist letter “demanding that they immediately stop activity that violates Meta’s policies.”
