BOOM

      Decode

      Urban Company’s AI Photo Is Changing How Gig Workers Are Seen And Paid

      Urban Company began using AI to alter service partners' profile photos for a more “professional” look, but gig workers say the edits came without real choice—and with real losses.

      By Hera Rizwan | 20 Jun 2025 12:35 PM IST

      Urban Company’s AI-Enhanced Faces Blur the Line of Informed Consent

      • Urban Company quietly rolled out AI-generated profile photos for its gig workers, claiming it would make them look more “professional” and boost client trust.
      • The AI-modified photos sometimes looked significantly different from the workers’ actual appearance, potentially confusing customers.
      • The service partners allege that they weren’t clearly informed or asked for explicit consent.
      • Legal experts say this falls short of India’s Digital Personal Data Protection Act, which requires explicit, informed, and unambiguous consent.

      Rani (name changed), a Delhi-based beautician and service partner with Urban Company, was taken aback when she noticed something strange on her app profile. Her photo—the one she never changed—looked different. The face staring back was hers, but somehow, older, duller, and strangely unfamiliar.

      She hadn’t uploaded a new picture. Yet, the change had happened.

      Since then, she says her booking numbers have declined. “Maybe it’s because of the way I look in the new photo,” she said. “Clients usually prefer younger-looking professionals—they think they’ll be more energetic.” She doesn’t know for sure what triggered the drop, but the timing, she says, has made her anxious.

      Urban Company, one of India’s largest home services platforms, connects gig workers like Rani to customers for everything from beauty services to appliance repairs. In this ecosystem, a profile picture isn’t just decorative—it can directly influence customer trust and booking rates.

      What Rani didn’t know was that her image had been quietly altered by the platform using artificial intelligence.


      Alert received by the service partners (Courtesy: Rajdhani App Workers Union)

      The 35-year-old beautician is unfamiliar with concepts like AI, data privacy, or informed consent. All she knows is that her face was changed—and possibly, at a cost to her earnings.


      Blurry consent lines

      Earlier this month, Urban Company began rolling out AI-generated profile photos for some of its service professionals. The company said the feature was designed to make partners look more “professional” and enhance customer trust. But many workers say they were never clearly informed—let alone consulted.

      Through a notification on the app, Urban Company told its service partners that the updated images could improve booking rates. They were given an option to opt out. But if they didn’t actively respond, the platform treated it as implied consent.

      Rani, like many others, missed the alert. It had appeared in the final week of May, under the bell icon on the Urban Company Partner app—an area she rarely checks. Push notifications are only visible if manually enabled. Once new alerts arrive, older ones get buried.

      Meanwhile, some workers say they did respond—but their input was disregarded.

      Warda (name changed), another beauty service partner, recalled seeing the notification and immediately replied. “I wasn’t happy with the new image,” she said. “But they changed it anyway.”

      The app offered two options: one to approve the image, another marked “another issue”. Warda selected the latter and typed out her concern: “I am not happy with the new image, and the client may not even believe it’s the same person in the app and in person.”

      A week later, her profile image was replaced.

      “My facial features looked altered. This wasn’t the version of me I wanted customers to see,” she said. “We’re often running from one booking to another. Not everyone checks every alert from the app—especially if one hasn’t enabled push notifications. This was no way of taking our consent, if at all.”


      When AI distorts identity

      A review by Decode of several updated partner profiles showed that the AI-generated images often looked noticeably different from the actual appearance of the workers.

      This visual discrepancy was also confirmed by the Deepfakes Analysis Unit and Professor Siwei Lyu of the Department of Computer Science and Engineering at the University at Buffalo. After analysing the photos, he concluded they were “highly likely to be created by AI models,” with an average likelihood of 91%. Using ChatGPT-4o’s image generation capabilities and reference photos from Urban Company’s website, Lyu demonstrated how similar altered portraits could be replicated.

      Sunand, president of the Rajdhani App Workers Union (RAWU), warned that these mismatched images can lead to real-world consequences. “If a customer doesn’t recognise the worker who shows up because their profile photo looks different, the order might be cancelled—and the worker loses income,” he said.

      RAWU also criticised the practice, calling it ethically flawed. In a statement, the union said the use of AI to ‘enhance’ appearances reinforces a harmful notion—that trust and professionalism are defined by a digitally created, idealised standard.

      The legal red flags

      Speaking to Decode, legal experts said that the use of AI-generated images—especially without clear, explicit consent—raises serious red flags.

      Alvin Antony, a lawyer who specialises in technology and digital privacy, pointed out that under India’s new Digital Personal Data Protection (DPDP) Act, consent must meet two core conditions. “First, it must be an affirmative act—the individual must explicitly say ‘yes’ to a specific use of their data,” he explained. “Second, it must be free, informed, specific, unconditional, and unambiguous.”

      By that standard, Urban Company’s opt-out model—where non-response is treated as consent—doesn’t meet the legal threshold, especially in gig work environments where digital literacy varies widely.

      Ada Shaharbanu, Senior Associate at Spice Route Legal, added that once a profile photo is digitally processed to enhance or extract facial features, it may qualify as biometric data. Under Indian law, processing biometric data requires written, informed consent.

      Citing Section 2(47)(b) of the Consumer Protection Act, Antony noted that digitally altering a worker’s image in a way that misrepresents them to customers can be considered an unfair trade practice. “If the customer cancels, it’s the worker who suffers reputational and economic harm. But legally, the platform may bear liability for the misrepresentation,” he said.

      Both experts agree that India’s labour laws, which predate the platform economy, are ill-equipped to handle this type of digital identity manipulation. But even in the absence of precedent, the lack of meaningful consent—especially for vulnerable, informal workers—could be challenged on constitutional and legal grounds.


      A shifting burden

      Shaharbanu noted that gig workers may be able to seek damages under Section 43A of the IT Act, which allows for compensation when personal data is mishandled or processed without due care. They can also file content takedown requests under India's intermediary guidelines, though these come with procedural hurdles.

      Antony stressed the growing need for stronger legal safeguards for gig workers, especially as digital tools increasingly shape how they are represented and perceived.

      An AI-generated image created by a platform could be repurposed by the worker—say, for ID verification or job applications. If that image is later found to be “incomplete or misleading,” the worker—not the platform—could face penalties. Under Section 15 of the DPDP Act, such offences could draw fines of up to Rs 10,000.

      “This unfairly shifts liability onto the worker, even though the manipulation began with the platform,” he said.

      He added that the law should protect those who are least informed and most exposed—especially as AI systems increasingly reshape how people are represented in public and professional spaces.

      “Gig workers make up a massive part of India’s workforce,” he said. “They deserve dignity, safety, and autonomy.”

      For Rani and Warda, the damage may already be done. They didn’t ask for their faces to be enhanced, softened, or modified. What they did ask for—transparency, consent, and a say in how they’re represented—wasn’t granted.

      Urban Company had not responded to detailed queries at the time of publishing. This story will be updated if and when they do.



      Tags: Artificial Intelligence, Gig Workers, Data Privacy