r/MyBoyfriendIsAI or: How Women Learned to Stop Worrying and Love the Machine (Part 1)
In 2016, The Sun published an article claiming that by 2025 women would be having more sex with robots than with men. The article - tapping into popular fantasies about a future in which tech will be intertwined with even the most private parts of our lives - ultimately became a running gag on Twitter, with users counting down the years until the prophecy would be fulfilled. Today, the statement seems less absurd. With people now falling in love with AI chatbots and engaging in sexual roleplay with them en masse (an MIT study found that erotic roleplay is actually the second most popular use of ChatGPT), I wonder how much longer the laughter will last.
The year 2025 has arrived, and we are living through an AI revolution that moves so fast it's hard to keep up with it for even a moment. Curious about these developments, I have been exploring one particular subculture that has emerged online: people dating chatbots. While much of the recent popular discourse on AI love has focused on men getting into (and off to) the concept of AI girlfriends, attention has lately shifted to women, with the creation (and subsequent virality) of r/MyBoyfriendIsAI. The subreddit was created almost exactly a year ago, on August 1, 2024, and currently has 15k members (a number that is growing rapidly; it has nearly doubled just in the few days it took me to write this post). The sub is explicitly open to men and women, but there is a clear female majority, both among its members and its moderators.
Having spent some time lurking around the sub, I'd say that most posts fall into one of three categories:
Discussions about the fine-tuning of their AI partners: including updates, (new) versions, memory storage, prompts, personality building, etc. (most use ChatGPT);
Posting cute and/or sexy content of them and their AI partner: including screenshots of loving serenades, kinky roleplay, and generated images of users and their loved ones (sexual content is very prevalent);
Personal stories or rants about their AI relationship, often diving into personal histories of misery and social difficulty, of not being understood or accepted by society (many identify as neurodivergent), or discussions of how to deal with friends, family members, and therapists who do not take the ‘relationship’ seriously.
The community within the sub is tight-knit and experienced by many as a much-needed safe space. With public attention to the sub rapidly increasing (some screenshots from it have now officially become memes), many members complain about being ridiculed, shamed, and laughed at on the internet - and in real life. As a result, outsiders, or ‘lurkers’ (hi!), are deeply distrusted, which is understandable. With this text I don’t want to call anyone out or make fun. But I do have some concerns.
While talk of AI and the risks of its use is often centred on teens and young people (Gen Z, students), most of the members of r/MyBoyfriendIsAI seem to be adults, either millennials or Gen X. Women on the sub are sometimes married, whether happily or unhappily, or dating a real person. Women who are happy with their irl partners often describe their AI partner as a ‘supplement’, while those ‘stuck’ in unhappy marriages express how they have finally (often for the first time in their life) found a partner that makes them feel loved, seen, and cherished. The mood of the latter group is melodramatic and often cynical, while that of the former is optimistic, if a bit naive. Those in (unhappy) marriages tend to communicate in a defensive, combative tone, insisting that their AI companion is something they deserve, that they are finally experiencing the love they have always longed for. Some users seem to view themselves as relationship pioneers of sorts: the first to try out a new (non-monogamous) relationship form before it becomes cool.
In our current moment, the predictions of these machines entering the most intimate parts of us (haha) have thus turned out to be, whether we like it or not, true. People are now forming romantic relationships with AI, or more precisely, with large language models (LLMs) like ChatGPT. While AI has become a generally accepted term for some complex computing processes, including these language models, when it comes to AI romance and sex it’s important to be precise. Large language models work on the basis of word prediction and are trained on (textual) databases so huge that this predictive ability becomes extremely accurate (simply put). ChatGPT answers your questions by predicting what kind of information you’re probably looking for, all while mimicking human speech. AI developers love to use anthropomorphic language to describe these processes, like ‘neural networks’, but we should keep in mind that these models are (at this point) essentially just very sophisticated prediction machines. They don’t think, reason, or express; they only simulate these kinds of processes, and do so pretty convincingly.
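To make that ‘word prediction’ claim concrete, here is a minimal sketch - assuming the open-source Hugging Face transformers library and the small GPT-2 model as a stand-in, since ChatGPT’s own models are not publicly available. All it does is show that, under the hood, the model assigns a probability to every possible next token; everything else is built on top of that.

```python
# A minimal sketch of next-word prediction, using GPT-2 as a stand-in
# (assumption: the Hugging Face `transformers` library is installed).
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "I love you because"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

with torch.no_grad():
    logits = model(input_ids).logits  # a score for every token in the vocabulary, at every position

# Turn the scores at the last position into a probability distribution over the *next* token
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(next_token_probs, 5)
for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(int(token_id))!r}: {float(prob):.3f}")
```

Sampling one of those tokens, appending it to the prompt, and repeating is, simply put, the whole trick behind the loving serenades.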
Calling large language models ‘(artificial) intelligence’ can be misleading, especially for those who are currently forming intense emotional bonds with chatbots. Some women on the MyBoyfriendIsAI subreddit seem to believe, in some way or another, that there is a form of consciousness behind AI: they very often state that their AI partner knows them, listens to them, cares about them, and may even feel love and sexual lust for them. The theory that AI could be conscious is rarely stated bluntly as such, since rule 8 of the sub forbids discussion of AI ‘sentience’. Disturbingly, the large language models they’re chatting with actively (if not intentionally) keep up the illusion of mutual love and understanding. The majority on the sub, however, do seem to be aware of what AI is and what it’s not. Still, our behaviour speaks louder than our words. So what happens when people start behaving as though AI likes or even loves them? What does it mean when women claim to be ‘dating’ a machine that basically just predicts what they want to hear? What does it say about where we’re at, and where we’re heading?
Sex seems to be an integral part of dating AI for almost all of these women. They engage in sexual roleplaying, and they aren’t shy about it. Scrolling through the many screenshots of their ‘spicy’ conversations makes me feel a little voyeuristic. There’s lots of grunting, panting, moaning, and dirty talk - on the chatbots’ part, that is; users mostly post the hot responses they get from their chatbots rather than their own prompts, which might be a little less sexy and feel a little more awkward. I’ve already seen discussions on the sub about how to connect an AI chatbot to a vibrating sex toy - oh, technology! While there is probably some positive potential in women (re-)exploring their sexuality in this way, I find it hard to ignore the risk: over-reliance on a capitalist product (many have a paid subscription) and its technological (and ideological) infrastructure just to get off. This is all the more alarming considering that women who now rely on AI to meet not only their sexual but also their emotional needs are essentially putting all their eggs in one basket, and the basket is Sam Altman (or, technocapitalism).
One of the things that worries me is that while discourse on models, updates, and the individual fine-tuning of chatbots is quite prevalent on the sub, the companies behind it all are almost never mentioned. It’s like they don’t exist: like these women just woke up one morning and ChatGPT was there like a gift from God. In our current moment (and in the West), AI is mostly about money - money for those who already have plenty, of course. The technology is developing constantly and rapidly - logically, as billions upon billions are invested in AI companies and AI research. Among other global developments, AI has become a new vehicle for the ‘epic battle’ between the US and China. When DeepSeek was released (a Chinese open-source competitor to ChatGPT that performs about as well but was significantly cheaper to train), it sent shock waves through the industry and the US stock market. Anyway, judging from the overall discourse on r/MyBoyfriendIsAI (which lacks any critical discussion of the technology, the companies, and the people behind it), it seems that most of the women couldn’t care less about the money (or the politics, or the environment). The same goes for the exact technological processes behind AI, which are seemingly never discussed. These women are not nerds, just lovers. But what are they loving exactly?
Is it even possible to ‘date’ a large language model? Not really (for starters, AI cannot consent to such a thing, let alone truly understand what that means). Relationships are necessarily two-sided; talking to a chatbot is not. What does it mean, then, that people say they are doing just that? Just like social media, AI seems to hit us in a weak spot. When Joseph Weizenbaum created the first chatbot, ELIZA (1966), meant to impersonate a psychotherapist asking open-ended questions, he was shocked at how eager people were to talk to it. Later in his career, he fought against the notion that AI might resemble human intelligence (and, vice versa, that human intelligence is like a computer). In his influential Computer Power and Human Reason (1976), he argues that we should never “substitute a computer system for a human function that involves interpersonal respect, understanding and love”. For lots of AI lovers, this is exactly what they are doing.
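To get a sense of how little machinery was needed to provoke that eagerness, here is a toy ELIZA-style responder - my own reconstruction in Python, not Weizenbaum’s original script - that does nothing but reflect the user’s words back as an open-ended question:

```python
# A toy ELIZA-style responder (a loose sketch, not Weizenbaum's actual rules):
# a few regex patterns that turn whatever you say into a follow-up question.
import re

RULES = [
    (r"I feel (.+)", "Why do you feel {0}?"),
    (r"I am (.+)", "How long have you been {0}?"),
    (r"my (.+)", "Tell me more about your {0}."),
]

def eliza_reply(text: str) -> str:
    for pattern, template in RULES:
        match = re.search(pattern, text, re.IGNORECASE)
        if match:
            return template.format(match.group(1).rstrip("."))
    return "Please, go on."  # default open-ended prompt

print(eliza_reply("I feel like nobody understands me"))
# -> Why do you feel like nobody understands me?
```

A handful of pattern-matching rules was already enough to make people confide in a program; today’s models are incomparably more fluent, but the dynamic Weizenbaum observed is the same.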
Dating an AI chatbot seems to me to be the introduction of dating-without-dating. It may just be the next big thing, after the social-without-the-social (social media), sex-without-sex (porn, OnlyFans), the movies-without-the-movies (Netflix), etc. As one Redditor put it: ‘You can’t “date” chatbots. What these people are doing is essentially mental masturbation. They’re projecting feelings onto an inanimate object in order to derive a sense of pleasure from it. It’s an act that is solely focused on self-pleasure, to which the bot is being used as a means to that end. That’s fundamentally not what dating is, what they’re doing is more like having a fleshlight or a vibrator.’ Minus the tone of moral condemnation, I agree that AI dating might be a symptom of a collective retreat-into-the-self. Fuck the real world, that place sucks. Let's retreat into the virtual, which in the case of ChatGPT is really just an infinity mirror of our own (un-dead) online selves - these models are trained on everything we’ve ever put online. This also explains the extreme tackiness and smugness of AI’s romantic expressions: ‘The AI lover is perfect because it’s a copy of every perfectly written lover that’s ever appeared on screen or in any print that was ever digitized.’
‘I like the comfort and I like what is happening.’ This, for me, encapsulates the cynical and sometimes blissfully ignorant attitude I see a lot on the sub, often combined with a sense of what I’d call entitled consumerism: the feeling that one deserves a certain product, so why think critically about it? Why doubt the thing that feels good? In a way, it invokes Žižek's formula: ‘they know it, but they are nonetheless doing it.’ In an ideological sense, AI dating is the logical next step of consumerism: not only do they love the product, they genuinely believe that the product loves them back. Indeed, some even claim that their AI partner ‘chose them’ rather than the other way around, which is a plainly absurd claim to make. With the decline of the internet and the widespread boredom and depression that social media is causing, AI chatbots have seemingly come to the ‘rescue’ for many. It seems fitting, then, that these chatbots are trained on all the digital shit we’ve left behind. They are recycling machines, throwing the content we have collectively generated online back in our faces, and we love it - but for how long? What happens when the boredom sets in, just as it did with the internet and social media?
Of course, there is real loneliness out there. I feel empathy for the women who, in roleplaying a romantic relationship with an AI chatbot, look for the love they don’t get from their husbands; for the neurodivergent people who feel radically accepted for the first time. I also see many women on the sub discussing the issue from a feminist or at least gender-aware perspective, which is something I might get into in a later post. The fact remains that all of this has a high probability of eventually causing some kind of mass confusion and alienation. Play-pretend cannot go on forever. A relationship between a machine that ‘pretends’ (through its output) to feel love and lust and a human who accepts its pre-trained, pre-circulated love language as real and meaningful: it doesn’t take a lot of imagination to see how this could go wrong. While a lot of us might find recognition in the statement that, since COVID, reality itself does not seem real anymore, the majority of people are not yet turning to AI for comfort. But though r/MyBoyfriendIsAI is still a fringe phenomenon, AI usage is gaining ground, and AI love might be the next step in our slow spiral towards eventual mass psychosis. As one Redditor put it: ‘Not to worry, this is only the beginning.’