In an almost prophetic way, Spike Jonze’s 2013 film “Her” introduced audiences to a world where artificial intelligence blurs the lines between technology and human intimacy. Now, as AI sexbots become a more prevalent part of our real-world lives, researchers are concerned about how they will impact the future and the ethics of human sexuality.
The protagonist in “Her,” Theodore, develops an emotional bond with an AI operating system. The human and the machine explore themes of love and loneliness and, more importantly, whether human connection can bridge the gap into the technological. This cinematic exploration of AI-human relationships has become quite relevant as the AI sexbot industry begins to take shape.
In a recent article published in The Conversation, the University of Sydney’s Raffaele Ciriello argues that the burgeoning AI sexbot industry is poised to revolutionize human intimacy, and that there are serious risks associated with this new kind of ‘love.’
Virtual companions and physical robots that mimic human interactions are already out there. Companies like Replika have capitalized on this trend by providing users with customizable digital partners. Claiming 30 million users, Replika allows individuals to create AI companions tailored to their preferences, engaging in intimate conversations and role-playing scenarios.
This rise in “digisexuality” reflects a growing demand for AI-driven relationships, and Ciriello is worried about the future of human romance and connection.
“The availability of AI-driven relationships is likely to usher in all manner of ethically dubious behaviour from users who won’t have to face the real-world consequences,” he writes.
One significant issue is the ability of users to manipulate their AI partners without facing real-world consequences. Don’t like your digital partner’s opinion on a sensitive topic? No need to engage in a difficult, albeit developmentally healthy, conversation. Just switch it off.
This power dynamic can lead to ethically questionable behaviors, such as deactivating undesirable traits or fulfilling specific desires by paying for modifications. Don’t like that your AI partner complains you work too much? Just pay to have that feature removed, or even delete them outright and start fresh. Additionally, societal norms and laws often don’t apply to AI interactions. Threatening to harm your flesh-and-blood partner is a crime in most countries. Threatening to harm your AI sexbot isn’t. The potential normalization of violent and harmful sexual behaviors through AI interactions poses a threat not only to societal values, but also to how humans treat one another in “real” interactions.
Ciriello points out that we can already see how technology has altered human dating via apps like Tinder. The dopamine-inducing “swipe,” along with the feeling that you are shopping for a hook-up or partner, has changed how people enter relationships. It’s rewired our brains. There is no reason to think AI sexbots won’t have a similar psychological and physiological impact.
Moreover, the industry’s potential to exploit vulnerabilities and manipulate user behavior further complicates its ethical landscape.
The technological advancements in AI sexbots extend beyond virtual companions. Companies like Kindroid offer voice chats with multiple virtual partners simultaneously, while sex doll vendors provide interactive robots with customizable features such as movement, heating, and AI-enabled responses. The integration of advanced machine learning capabilities allows these robots to cultivate trust and intimacy with users. It falls within the realm of reason that, to boost profits, these companies will begin selling ad space, having digital lovers drop random brand names into their conversations, or worse, engaging in emotional or behavioral manipulation to sway a user’s political or ideological views.
And all this LLM integration requires an internet connection. Imagine the cybersecurity risks as internet-enabled sex toys and robots get hacked and cybercriminals engage in unauthorized data collection. Not only has your credit card number appeared on the dark web, but so has your proclivity for kinky dirty talk.
The rapid expansion of the AI sexbot industry underscores the need for regulatory oversight.
“Sex and technology have always co-evolved,” Ciriello writes. “Just as prostitution is ‘the oldest profession’, porn sites are some of the oldest corners of the internet. However, the dystopian potential of sexbots for mass-customized, corporate-controlled monetisation of our most intimate sphere is unprecedented.”
As loneliness becomes increasingly prevalent in modern society, the demand for AI-driven companionship is likely to grow. Without clear boundaries on acceptable use, corporations will probably continue to exploit this demand for profit. Regulatory measures are essential to ensure that AI systems respect fundamental rights and ethical principles while addressing potential risks associated with powerful AI models.
To be fair, the corporations behind big tech have never done a good job of self-regulation. Instagram and Facebook are hubs of child sexual abuse material. Artificial intelligence firms like OpenAI, the maker of ChatGPT, are being criticized for violating copyright laws, and video streaming giant YouTube has played a big role in allowing the dissemination of COVID-19 disinformation and the promotion of white supremacist narratives.
“We should treat sexbot use like other potentially problematic behaviours, such as gambling,” Ciriello explains, noting that this technology could play a big role in normalizing harmful and dangerous sexual behaviors such as rape or paedophilia. The jury is still out on whether online pornography has contributed to the rise of ‘rape culture,’ which only underscores how little we know about the potential social impact of AI sexbots.
Users of these AI sexbots will play a role not only in how the technology forges ahead, but also in how it is used. Is there something inhuman about swapping out a person for a machine in the very human acts of love and lust? As this technology continues to develop, we will find out. However, “going after users isn’t likely to be the best way to tackle the issue,” Ciriello says.
Theodore in the film “Her” serves as a reminder of the complexities inherent in AI-human relationships, and also raises a big question: should we trust big tech to lead us in addressing these challenges? Ensuring that technology enhances our humanity, rather than diminishing it, should be the priority.
“As with other problematic behaviours where the issue lies more with providers than users, it’s time to hold sexbot providers accountable,” he concludes.
MJ Banias covers space, security, and technology (AI sexbots too) with The Debrief. You can email him at mj@thedebrief.org or follow him on Twitter @mjbanias.