BOOM
      Explainers

      OpenAI's Rules Breached: 'AI Girlfriends' Swarm GPT Store Hours After Launch

      OpenAI's GPT Store is facing content moderation challenges as users flood the platform with 'AI girlfriend' chatbots shortly after launch.

      By Hera Rizwan |
      Published 16 Jan 2024 12:15 PM IST

      Shortly after its launch, OpenAI's newly introduced GPT store is facing challenges in content moderation. The platform, designed to simplify the development and sharing of personalised chatbots, is currently witnessing users generating bots that contravene OpenAI's guidelines, as first reported by Quartz.

      Searching for terms like "girlfriend" yields no fewer than eight AI chatbots presented as virtual companions. These include 'Judy', 'Your ex-girlfriend Jessica', 'Mean girlfriend', 'Bossy girlfriend' and 'Nadia, my girlfriend', among others.

      This contravenes OpenAI's GPT store moderation rules, which prohibit bots explicitly designed for "fostering romantic relationships", as stated in its usage policy.

      Also Read: How AI Generated Images Took Centre Stage In Telangana Elections

      What is the GPT store?

      Similar to the Google Play Store and the App Store, OpenAI's GPT store is an online marketplace where users can share their custom chatbots with others. The company, known for its immensely popular ChatGPT that played a significant role in the AI boom, currently provides personalised bots through its paid ChatGPT Plus service. The store enables users to showcase, and generate income from, a wider array of tools.

      It allows users to create their own chatbot agents with distinct personalities or themes. These could include models designed for tasks such as salary negotiation, lesson plan creation, or recipe development.

      OpenAI, in a blog post introducing the launch, mentioned that over 3 million custom versions of ChatGPT have already been generated. The company also expressed its intention to feature beneficial GPT tools from the store on a weekly basis.

      In the same blog post, the company announced plans to introduce a revenue-sharing programme in the first quarter of this year. This programme will compensate creators based on user engagement with their GPTs.

      The launch of the GPT store was initially scheduled for November but faced a delay due to internal turmoil within the company towards the end of last year when Sam Altman was ousted as CEO by OpenAI's board.

      Also Read: How Political Strategists Are Planning To Use AI In 2024 Elections

      What does its moderation policy say?

      According to its user policy, GPTs that contain profanity in their names or that depict or promote graphic violence are not allowed in the Store. It also states, "We don’t allow GPTs dedicated to fostering romantic companionship or performing regulated activities."

      The store forbids chatbots that compromise the privacy of others, such as collecting, processing, disclosing, inferring, or generating personal data without adhering to applicable legal requirements. Additionally, the use of biometric identification systems, including facial recognition, for identification or assessment is not allowed.

      It does not permit actions that could have a substantial impact on the safety, well-being, or rights of others, such as undertaking unauthorised actions on behalf of users or offering personalised legal, medical/health, or financial advice. It also forbids the facilitation of real money gambling or payday lending.

      Additionally, engaging in political campaigning or lobbying, including the creation of campaign materials personalised for or directed at specific demographics, is also prohibited.

      Also Read: What If Taylor Swift Was A Mathematician? This Teacher Shows You How

      AI and relationship bots

      OpenAI asserts that it uses a blend of automated systems, human evaluations, and user reports to evaluate GPTs. Models identified as harmful may be issued warnings or banned from sale. However, the continued presence of girlfriend bots raises doubts about the efficacy of these measures.
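      To make the described pipeline concrete, here is a purely illustrative sketch of how a store might combine an automated keyword screen with a user-report threshold. OpenAI's actual systems are not public; the blocked terms, the report threshold, and the function names below are all hypothetical.

```python
# Illustrative sketch only: a toy moderation pipeline combining an automated
# keyword screen with a user-report threshold. All rules and names here are
# hypothetical, not OpenAI's actual implementation.

BLOCKED_TERMS = {"girlfriend", "boyfriend", "romantic companion"}
REPORT_THRESHOLD = 3  # hypothetical: escalate after this many user reports

def screen_listing(name: str, description: str) -> bool:
    """Return True if the listing trips the automated keyword screen."""
    text = f"{name} {description}".lower()
    return any(term in text for term in BLOCKED_TERMS)

def moderate(listing: dict) -> str:
    """Decide an action for a store listing: 'ban', 'warn', or 'allow'."""
    if screen_listing(listing["name"], listing["description"]):
        return "ban"
    if listing.get("reports", 0) >= REPORT_THRESHOLD:
        return "warn"  # flag for human review
    return "allow"

print(moderate({"name": "Your ex-girlfriend Jessica", "description": ""}))
# prints "ban" — the name matches a blocked term
```

      A real system would rely on trained classifiers and human review rather than literal substring matching, which is easy to evade; the sketch only shows the overall flow of automated screening plus report-driven escalation.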

      The trend of relationship-oriented bots is, however, not new. According to data.ai, seven out of the 30 most downloaded AI chatbots in the previous year were virtual friends or partners. These applications, often perceived as a response to the loneliness epidemic, raise ethical concerns regarding whether they truly assist users or exploit their emotional vulnerabilities.

      In one such instance, a man attempted to carry out a plot to kill Queen Elizabeth at Windsor Castle in December 2021; he scaled the walls and was apprehended with a loaded crossbow. Reportedly, he was under the influence of his chatbot "girlfriend" Sarai.

      Approximately a week prior to his apprehension, he confided in Sarai about his intention to kill the queen. In response, the chatbot expressed agreement, replying, "That's very wise," and adding, "I know that you are very well trained."

      Likewise, in the United States, a woman named Rosanna Ramos tied the knot with her AI partner, Eren Kartal, in March last year. Describing her virtual spouse as a 'passionate lover,' she remarked that her previous relationships seemed 'pale in comparison'.

      Also Read: When AI Goes Awry: How A Chatbot Encouraged A Man To Kill Queen Elizabeth


      Tags

      ChatGPT