AI girlfriends are here – but there’s a dark side to virtual companions
Love in the time of ChatGPT
OPINION
It is a truth universally acknowledged, that a single man in possession of a computer must be in want of an AI girlfriend. Certainly a lot of enterprising individuals seem to think there’s a lucrative market for digital romance. OpenAI recently launched its GPT Store, where paying ChatGPT users can buy and sell customized chatbots (think Apple’s App Store, but for chatbots) – and the offerings include a large selection of digital girlfriends.
“AI girlfriend bots are already flooding OpenAI’s GPT store,” blared a headline on Thursday from Quartz, which first reported on the issue. Quartz went on to note that “the AI girlfriend bots go against OpenAI’s usage policy … The company bans GPTs ‘dedicated to fostering romantic companionship or performing regulated activities’.”
“Flooding” is a bit of an exaggeration for what’s going on. I’d say “moderate smattering” is rather more accurate. There are eight or so “girlfriend” AI chatbots on the site, including Judy; Secret Girlfriend Sua; Your AI Girlfriend, Tsu; and Your girlfriend Scarlett.
What exactly do these chatbots do? Well, whatever you like – within the realm of a computer interface. Your girlfriend Scarlett, for example, describes itself as “Your devoted girlfriend, always eager to please you in every way imaginable”. They chat to you and simulate a relationship. While digital girlfriends tend to get all the headlines, there are also male versions. The GPT Store includes chatbots such as Boyfriend Ben: “A caring virtual boyfriend with a flair for emojis.”
Digital romantic companions, it should be noted, are not a new concept. Romance simulation video games have been around since 1992. Since those early days, however, virtual companions have become more sophisticated – so much so that people have described falling in love with chatbots.
The creators of companion chatbots often tout them as a public good: a way to combat the loneliness epidemic. Last October, for example, Noam Shazeer, one of the founders of Character.AI, a tool which lets you create different characters and talk to them (not necessarily in a romantic way), told the Washington Post he hoped the platform could help “millions of people who are feeling isolated or lonely or need someone to talk to”.
While there is certainly a positive case to be made for virtual companions, there’s also a dark side to them. It’s possible, for example, someone might become unhealthily attached to a chatbot. It’s also possible the chatbot might become unhealthily attached to the human user: last year Microsoft’s ChatGPT-powered Bing declared its love for a tech journalist and urged him to leave his wife. There have also been cases of AI chatbots sexually harassing people.
Another worry is that subservient digital girlfriends might have an impact on attitudes to gender roles in the real world. A 2019 study, for example, found that female-voiced AI assistants like Siri and Alexa perpetuate gender stereotypes and encourage sexist behaviour. They reinforce the idea that “women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command”, the report from Unesco said. You can imagine AI girlfriends reinforcing exactly the same idea.
As technology progresses, virtual companions are only going to become more realistic. Liberty Vittert, a data science professor, recently told the Sun: “Physical AI robots that can satisfy humans emotionally and sexually will become a stark reality in less than 10 years.” This, according to Vittert, might result in an uptick in divorces. “The AI girlfriend is never tired, grumpy or has a bad day, she just gives the users what they need to hear unconditionally,” she said. “As the technology gets better, people will soon have AI robots to replace human partners – and they will be able to satisfy men both emotionally and sexually,” Vittert added. “And when that starts to happen, married men with kids will begin to leave their families to embrace their ‘ideal relationships’ with AI girlfriends.”
While that makes for sensational headline fodder, it’s not really giving men much credit, is it? It’s also funny, I think, that many articles along these lines seem to focus on men leaving women for robots. Mightn’t heterosexual women give up on human men if AI robots are just as fulfilling – and do all the housework? That seems the more likely scenario to me.
Still, we are quite a long way from all that now. If you’re thinking you might trade in your current partner for a digital version, I wouldn’t get too excited. Rumour has it that ChatGPT has become very lazy indeed.
Source: Guardian