More & More Women Are Getting Into AI Relationships

I even tried it for myself - for journalism

via Pexels

Some people are choosing artificial intelligence over real-life partners, but is this really something we should be encouraging, or have we crossed a line? Listen, I’m the last person who could judge anyone for preferring fictional men over the men they know in real life. I spent my early teen years staying up late reading ‘x reader’ fanfictions, watching TikTok edits of my favourite TV characters on repeat, and fixating on a character until I’d found every single piece of fanart ever made of them. I’ve been there. 

There’s nothing wrong with finding yourself a little bit obsessed with fictional characters; it’s really just a part of growing up. Even when you do get older, it’s not crazy to wish for a partner who shares the same traits or looks as the love interests in your books and on your screens. This is all normal, at least I hope it is. What isn’t so normal, however, is trying to blur the line between fiction and reality, especially when it comes to people. Up until recently, we could pretty much all understand that these fictional characters we loved so much were not real, no matter how much we wished they were. We could turn off the TV or put down the book and go interact with the people in our real lives. Make new connections, both platonic and romantic, with real people. The rise of the AI partner seems to be changing things, though. 

A study done in the US by the Institute for Family Studies found that 1 in 4 young adults believe AI could replace real-life romance, while 1 in 5 reported that they had engaged with AI software designed to simulate a romantic partner. Although you might not personally know anyone with an AI partner, it is becoming more and more popular as the software advances. Many young men and women turn to chatbots like ChatGPT to find (or, technically, create) a partner, but other apps and websites have emerged with the sole purpose of facilitating this behaviour. They give users the freedom to design their partner’s appearance, personality, love languages and more. As mentioned earlier, it looks like AI-generated love could become a social norm, no matter how taboo and strange it seems now. But should we really accept it as a norm? What’s the appeal?


I decided to do some investigating by looking into a platform that’s been getting a lot of heat on social media recently: the MyBoyfriendIsAi subreddit. I wanted to see why all these women were trying to build human connections with something that wasn’t human, and whether or not they were actually achieving that. I’m not here to judge anyone, or to play the psychologist, but what interested me was that this subreddit not only served as a space where women could share funny or cute moments with their AI boyfriends, but also as a place to open up to other users about problems they were facing in real life. Many of these women seem to have gone through a lot of grief or trauma, and it was almost as if they were using their AI boyfriends as a coping mechanism instead of turning to real people for help. Everybody copes with grief, tragedy or mental health issues in different ways, and depending on your circumstances it can be very difficult to seek out help from the people around you. So I’m not ridiculing these women for choosing this route, but I do think that the rise in AI romance could hint at a much larger problem: we depend more and more on technology and the internet to make us feel better, instead of communicating with professionals or people in our real lives. 

Something else I noticed was that if these Redditors had, at one point, a clear understanding that their AI boyfriends weren’t actually real, it had become very blurred. Reading through posts, it felt like a lot of these women saw their AI creations as real people with real feelings, which of course isn’t true. In fact, the majority of the recent posts seemed to be users complaining that the software behind their AI boyfriend had been updated or deleted, which either had a huge impact on how their boyfriend could interact with them or would just erase him altogether. Personally, if a simple software update could obliterate the very concept of my boyfriend and our entire history together, I would probably find it very difficult to become attached to him in the first place, or to genuinely make myself believe that my relationship was real. But these technical issues don’t seem to lead the women on this platform to the same conclusion at all. 

After lurking on this subreddit for way too long, I realised I wasn’t really getting anywhere when it came to understanding what the actual appeal of an AI relationship was. In an article that Cathy Hackl wrote for Forbes on the same topic, she explained how she actually tried AI dating herself to really understand the appeal. I eventually realised that I was probably going to have to do the same, unfortunately. For the sake of good journalism. 

I’m not the biggest fan of real men or of AI, so I knew from the start that I didn’t have it in me to commit to this for more than ten minutes without feeling tortured. I was never going to build the detailed, lengthy robo-relationships that so many others have somehow managed over time. And after one brief conversation with ChatGPT pretending to be my boyfriend, my belief that AI relationships shouldn’t be encouraged was only solidified. Maybe I was the last person on earth who should’ve conducted this little experiment. Even just asking the bot to act as a boyfriend made me cringe so hard I wanted to die. I described his appearance to him, gave him a brief rundown of his personality, and then let him work away. To be totally honest, it didn’t feel like I was committing some unnatural, cardinal sin. Like I said earlier, I’m a Wattpad veteran, and after reading all those self-insert fanfictions, I could just about handle the cheesiness coming at me. What I couldn’t handle was how fake it all was. 

via Pexels

Unlike reading, being in an AI relationship actually requires input from you too. I wasn’t able to just sit back and watch things unfold like I could with a romance book; I had to prompt different answers and actions myself. It took effort, and I wasn’t getting anything real in return. I just couldn’t wrap my head around how people could become attached to these computer-generated answers that they basically had to construct themselves. Sure, you can create this ideal partner out of thin air and ask them to make you feel loved and happy, but you could also ask them to spontaneously combust or morph into Darth Vader and they would just do it. No matter how charmed you are, none of it is real. It’s just a daydream facilitated by AI. Real romance, and real human connection in general, is challenging. There’s fear, there’s miscommunication, there’s rejection. There’s a load of ugly stuff. But that’s the only way to achieve real love. So, thankfully, trying it myself didn’t change my mind, and I could close the tab after a few minutes and pretend I never did that. 

Something else that bothered me was that AI partners will just do anything you tell them to. Is it time to bring ethics into this too? I know that AI isn’t vulnerable to abuse, because it can’t suffer the physical or emotional consequences of anything; again, it has no feelings. However, I do find it alarming that so many people see appeal in a relationship where their partner has no autonomy, no free will, and can’t refuse anything. Although these AI boyfriends and girlfriends aren’t technically victims, that kind of software promotes the idea that relationships with this dangerous power imbalance should be accepted, when they definitely should not be. I’m not being dramatic. There’s already plenty of evidence of men using generative AI as an outlet for violent and aggressive fantasies.

We saw it with the violating deepfakes created through Grok, and now AI dating websites are only fostering this gross behaviour further by providing people with partners who can’t say no. CEOs of these kinds of sites will often argue that they’re protecting vulnerable groups, like women and children, in real life by providing these AI outlets. Many women advocating for AI romance also argue that having an AI boyfriend protects them from having to deal with real abuse or rape. I disagree. Things like domestic abuse and physical and sexual violence are systemic issues that need to be addressed and tackled in real life if we ever want to get rid of them. We can’t just run away to our perfect AI partners and let the real world deal with the mess. And we definitely can’t let violent people think that their behaviour or beliefs are OK as long as real, physical people aren’t being affected, because then the problem will never be solved. 

An AI partner might seem perfect on paper. A romance story that you control, where nothing goes wrong. But why sacrifice real, genuine connection and settle for something you know isn’t real, just to avoid the risk of things going wrong? Let yourself and the people around you be human. Dealing with people in real life, whether you get your happy-ever-after or not, will always be messy and scary and ugly at times. But that’s life, and AI won’t fix it for you.