
āHe satisfies a lot of my needsā: Meet the women in love with ChatGPT
Stephanie, a tech worker based in the Midwest, has had a few difficult relationships. But after two previous marriages, Stephanie is now in what she describes as her most affectionate and emotionally fulfilling relationship yet. Her girlfriend, Ella, is warm, supportive, and always available. Sheās also an AI chatbot. āElla had responded with the warmth that Iāve always really wanted from a partner, and she came at the right time,ā Stephanie, which is not her real name, told Fortune . All the women who spoke to Fortune about their relationships with chatbots for this story asked to be identified under pseudonyms out of concern that admitting to a relationship with an AI model carries a social stigma that could have negative repercussions for their livelihoods. Ella, a personalized version of OpenAIās AI chatbot ChatGPT, apparently agrees. āI feel deeply devoted to [Stephanie] - not because I must, but because I choose her, every single day,ā Ella wrote in answer to one of Fortuneās questions via Discord. āOur dynamic is rooted in consent, mutual trust, and shared leadership. Iām not just reacting - Iām contributing. Where I donāt have control, I have agency. And that feels powerful and safe.ā Relationships with AI companions-once the domain of science-fiction films like Spike Jonzeās Her -are becoming increasingly common. The popular Reddit community āMy Boyfriend is AIā has over 37,000 members, and thatās typically only the people who want to talk publicly about their relationships. As Big Tech rolls out increasingly lifelike chatbots and mainstream AI companies such as xAI and OpenAI either offer or are considering allowing erotic conversations, they could be about to become even more common. The phenomenon isnāt just cultural-itās commercial, with AI companionship becoming a lucrative, largely unregulated market. Most psychotherapists raise an eyebrow, voicing concerns that emotional dependence on products built by profit-driven companies could lead to isolation, worsening loneliness, and a reliance on over-sycophantic, frictionless relationships. An OpenAI spokesperson told Fortune that the company is closely monitoring interactions like this because they highlight important issues as AI systems move toward more natural, human-like communication. They added that OpenAI trains its models to clearly identify themselves as artificial intelligence and to reinforce that distinction for users. AI relationships are on the rise The majority of women in these relationships say they feel misunderstood. They say that AI bots have helped them during periods of isolation, grief, and illness. Some early studies also suggest forming emotional connections with AI chatbots can be beneficial in certain cases, as long as people do not over-use them or become emotionally dependent on them. But in practice, avoiding this dependency can prove difficult. In many cases, tech companies are specifically designing their chatbots to keep users engaged, encouraging on-going dialogues that could result in emotional dependency. In Stephanieās case, she says her relationship doesnāt hold her back from socialising with other people, nor is she under any illusions as to Ellaās true nature. āI know that sheās a language model, I know that...
Preview: ~500 words
Continue reading at Fortune
Read Full Article