Decode

It’s Terrifying To Be A Woman Content Creator In The Age Of Deepfakes

The internet is a scary place for women, but with deepfakes being easy to create it has become even more dangerous.

By - Adrija Bose | 27 Nov 2023 6:36 AM GMT

Susan* woke up one morning to a flurry of hate messages in her Instagram inbox. Her video had been morphed: her face had been replaced with that of the actress Kajol. The deepfake video had gone viral in India, while Susan was thousands of kilometres away in another country.

“It was terrifying to watch the video. It was my body, my video but not my face,” she told Decode.

There would have been no way to identify her, except that news organisations had carried stories naming her even though she wanted to remain anonymous.

“Not just as a creator but as a human being it’s terrifying to think we will begin to struggle to know what’s real and what’s fake. It will cause a lot of hurt as there are malicious people already creating footage to hurt others,” she said.

A few days before the Kajol deepfake went viral, another content creator's video had been morphed to use actress Rashmika Mandanna's face. While the dangers are evident, what has been lost in the narrative are the rights of the content creators whose videos are being used to create such deepfakes.

Decode found a number of deepfake videos on X (formerly Twitter) that had superimposed faces of Indian actresses on bodies of other women.


“So many of my content creator friends don’t post their photos immediately; they wait a few days before posting so their location is not revealed,” said Sonia Thomas, a creator with nearly 18k followers on Instagram. “It could get much worse,” she said, talking about the dangers of deepfakes for creators who have a public profile.

With deepfakes, Thomas believes it is the smaller content creators, those without managers, who will bear the brunt. “It’s getting more convincing and there’s no way to trace it back to who made it,” she said.

“When I post a photo of myself I make sure my feet are not visible. Because you never know, it may reach pornographic sites,” she added. She said she almost always uses text to hide her feet in photos she posts.

Shreemi Verma, a creator who tweets about Bollywood, faced trolling last year. One of her tweets went viral and made headlines as an ‘anti-Hindu’ post. What Shreemi had not expected was that the trolls would find her personal photos on Instagram and use them out of context. “I deactivated my Twitter account and made my Instagram private,” she said.

She is back on the internet, but the fears haven’t left her. “If someone bookmarks a photo I share on Twitter, I get scared. I don’t know what they are going to do with it,” she said.

Thomas says that marginalised creators, such as Muslim or Dalit women, are likely to be targeted even more.

Lisanne Buik, an AI ethics consultant based in the Netherlands, has seen some of the deepfake videos that went viral in India. “It is scary,” she told Decode. However, Buik also believes that incidents like these will force the government to come up with policies. “It’s unfortunate, but only when something goes wrong does the government act and bring in regulations,” she said.

She may be right. The Union Minister for Electronics and Information Technology and Communications, Ashwini Vaishnaw, has just announced that the government is set to introduce draft regulations to tackle the growing concern of deepfakes.

“We are heading towards this synthetic media where we don't know what is true or not true, and women are more vulnerable than any other group,” she said. The only way to deal with this age of artificial intelligence, Buik added, is to keep your radar on and sense when something is wrong.

So, are deepfakes always bad? Consent is the key, says Buik. In the case of the deepfakes of Indian actresses, neither the actresses nor the content creators had consented to them.

"A 2019 study found that 96% of deepfakes are of non-consensual sexual nature, and of those, 99% are made of women. This is content aimed to silence, shame, and objectify women. And tech expects the victims to uncover and report the material. For example, it’s left to women to proactively request the removal of the harmful pages from Google Search,” pointed out Patricia Gestoso, an inclusion strategist and a technologist with over 20 years of experience in digital transformation with a focus on scientific modeling, artificial intelligence, and inclusive design of products.

“There’s a gap in the law,” pointed out Tania Duarte, Co-Founder and CEO of We and AI, a UK non-profit focused on activating diverse communities to make AI work for everyone. “This is not just about deepfakes but using someone’s content in a harmful manner. It’s pitting women against women,” she said, talking about the social media creators whose content was used to make the deepfakes.

Criminalising these offences is welcome, but tracing who created them is difficult, and women do not have the support in place to even talk about it, added Medina Bakayeva, who researches AI governance and literacy at We and AI and works as a consultant in cybersecurity policy at the European Bank for Reconstruction and Development's Digital Hub.

“Women’s bodies are already sexualised. Even when their content has no sexual connotation, they are penalised on social media,” Gestoso said, adding that this is how women are dehumanised.

Mia Shah-Dand, Founder of Women in AI Ethics, said that women are already at higher risk of gender-based violence offline, and new AI tools have made it even easier to harass and exploit women online.

Male-owned and male-led social media platforms are not taking the threats against women seriously, she said. “In many cases tech companies are benefiting from these harmful activities,” Shah-Dand said, pointing out that VC-funded Civitai has launched a new feature that allows users to post “bounties” to create AI models of specific real people. “No woman, regardless of her celebrity status or privilege, is safe,” she added.

Susan*, meanwhile, hasn’t stopped posting her photos but she fears her content may be used in a harmful way. “My content is created to help women feel more confident in their natural bodies, but I’ve experienced either Indian men sexualising me or using the content to hurt other women,” she said, adding that it is simply “disgraceful”.


*Name changed to protect identity.