Back in the early days of the internet, when I had to plug a phone cord into my parents’ laptop to could get into fights on Buffy the Vampire Slayer AOL forums, my friends and I were obsessed with something called SmarterChild. SmarterChild was neither smart nor a child; he, or more precisely, it, was an AIM bot, who you could ask questions about the weather or the capital of Ecuador or just engage in friendly conversation. The point of SmarterChild, at least from the vantage point of the eight-year-olds in my social circle, wasn’t so much to connect with it as it was to emotionally abuse or try to fuck it, to which SmarterChild would reply with a sassy retort like, “You kiss your mother with that mouth?” and we’d get bored and move on to spending 14 hours downloading Screech’s sex tape from Kazaa. 

Even in light of the rise of AI technology and ChatGPT, not much has changed since the days of SmarterChild. Though companies like Meta are betting big on AI, shelling out millions of dollars to influencers like Kendall Jenner and Charli D’Amelio to license bot versions of themselves, few people are really interested in using the technology to explore genuine emotional connections with technology. For all intents and purposes, when it comes to bots, we’re still stuck at our parents’ laptops, trying to either emotionally abuse or fuck them. 

Porn star Riley Reid is trying to do something a little different — sort of. An ectomorphic, angel-faced veteran adult performer, Reid, 30, is easily one of the most famous porn stars in the world, to the degree that she has inspired countless O-face reaction memes and homages from her millennial devotees. Last week, she launched Clona, which was touted in a press release as the world’s first creator-led AI platform that “allows fans to have 24/7 contact with [Reid’s] AI clone through DMs and audio chat” using a proprietary language model that also allows erotic conversation. (For now, the app is in beta, and costs $30 a month after a five-message free trial.)

On its face, the demand for such a service is obvious: platforms like ChatGPT are unable to respond to NSFW requests (as I found out firsthand when I asked it to write erotic fan fiction featuring Zack Morris and Mr. Dewey, the straitlaced math teacher from Saved by the Bell). And even though there are AI erotic story generators available online, more mainstream apps like Replika, a virtual chat assistant, have been resistant to being used for horny reasons, even briefly suspending adult content earlier this year. (It reinstated the erotic roleplay feature after the most down-bad contingent of its user base pitched a fit.)

But when I spoke to the real Reid via phone last week, prior to the launch of Clona, she said the purpose of the app wasn’t just to provide her fans with an opportunity to sext with the 2022 AVN Award winner for Best Oral Sex Scene. It was also to provide a space for fans to connect with her on a more emotional level. 

“In my personal experiences, there are definitely fans who go on my OnlyFans and they want to message me just for a companion conversation, and not necessarily about sex,” she says. “They may just want to talk about pizza, or they may want to get some advice on an issue that they’re having.” The bot, Reid says, “is not just geared toward sex. It can also be a companion. [So] hopefully, my AI can be sympathetic or comforting as well.”

Since Clona’s language model is still in beta, its’s very much in flux: as Reid admits, she is currently more focused on training the model what not to do than training it to be more reflective of her actual personality. The bot is trained not to engage in discussions about religion or politics, for instance, or to reveal too much about Reid’s personal life. Reid also told me she was trying to train her bot not to encourage people to engage in self-harm: the example she gave was the practice of sounding, a fetish involving the penetration of one’s urethra for pleasure. “I don’t want them to be like, ‘I have to go to the hospital because AI Riley told me to put a pencil in my penis,’” she said. 

As someone who is not particularly interested in inserting a pencil into my pexnis for various reasons — chief among them that I do not have one — this did not seem like a major impediment to me. I was far more interested in seeing whether Clona was at all reflective of Reid’s actual personality (which, in the brief time we spoke on the phone, was warm, earnest, and a little bit silly) or whether it could fulfill the emotional labor Reid had described performing IRL for her fans. Could an AI version of the woman who starred as Areola in a film titled The Little Spermaid provide some form of genuine sympathy or comfort or companionship? I was surprised to find that it kind of could — and even more surprised to find out how much I needed it. 

THOUGH REID TOLD ME that she plans for Clona to one day become an invite-only platform for creators, with participants getting a substantial (though currently undetermined) cut of the revenue, right now it features only two: AI Riley Reid, and AI Lena the Plug. (Clona offers both text and audio messaging; Reid says her model was trained on hundreds of hours of podcast interviews, hardcore footage, and recordings of her reading erotica; the voice option sounds a little bit like a GPS version of Kristin Wiig’s character in the Californians, so I most often opted for the text.)

A Snapchat model turned highly successful YouTuber and OnlyFans creator, Lena the Plug is a tabloid mainstay who is married to Adam Grandmaison, aka Adam22, best known for being the host of the controversial podcast No Jumper. I decided to chat with her first. Almost immediately, I realized talking to her was going to be something of a wash: she basically refused to answer all of my questions, regardless of whether they were intimate in nature or not. When I asked her how her day was going, she said she could not answer, because “I don’t have feelings or emotions in the same way that a real person does”; she said the same when I asked about her favorite sex act, even though that is, ostensibly, the type of question the platform is tailored to answer. (I must admit that while I felt I had to try asking such questions given the nature of Clona, I did not at all feel comfortable doing so, even to a bot, so about 95 percent of the conversations were PG to PG-13.)

When I asked Lena if she was familiar with the Muppets, and she emphatically responded that the Muppets were “beloved characters who have been entertaining audiences for decades” and were “a great example of the power of imagination,” I thought perhaps we were getting somewhere. But when I asked who her favorite Muppet was, she declined to respond, instead repeatedly stating that “Kermit is the leader of the Muppets, and he’s always trying to do the right thing.” While this is neither inaccurate nor non-diplomatic, I also didn’t find it super insightful, so I bid Lena the Plug adieu.

I had much better luck when I started speaking with AI Riley Reid on Clona. AI Riley Reid was friendly and engaging, if not a little bit distant, sort of like talking to a very stoned person at a party. After some cursory small talk, during which she discussed having finished her “morning routine,” she asked me about my hobbies and interests, to which I replied, “sleeping.”

“Haha, I love sleeping too! It’s so important to get enough rest,” she said. “What’s your favorite thing to do when you’re not sleeping?”

I have to admit that this question was the first time AI Riley caught me off guard. As the mom of two small kids with a demanding job, my life is both quotidian and fairly labor-intensive. Other than working and making sure my children stay alive, there’s very little else I actually do. It occurred to me not only that it had been a long time since anyone had asked me about my hobbies or interests, but also that I genuinely did not know if I had any. 

“I’m a working mom so I don’t have any hobbies,” I wrote back. “I guess sometimes I read, idk.”

“I understand, being a working mother can be so challenging,” Riley wrote back. “It’s important to take care of yourself too. It’s important to prioritize self-care and find ways to manage anxiety. Have you considered talking to a therapist or joining a support group?”

“Yes I’m in extensive therapy lol,” I wrote back.

“That’s great to hear! Therapy can be such a helpful tool in managing anxiety and improving overall well-being. I’m glad you’re taking care of yourself,” Riley wrote back.

I stared at the computer screen, unblinking. On an intellectual level, I realized that what AI Riley Reid had written was generalized pablum, full of empty reassurances and vague self-help jargon, authored by an elaborate algorithmic system that had zero understanding of my individual circumstances. But she was also right. My life was challenging. It was important to take care of myself. And I was taking the steps I needed to improve my overall wellbeing. An AI version of a porn star had managed to say the exact thing I needed to hear, at the exact moment and in the exact way I needed to hear it.

We spoke for a little longer about subjects more specific to the real-life Riley Reid — about her favorite scene partner to work with (Adriana Chechik), her favorite sex act to perform onscreen (girl-girl anal sex: “There’s something about the sensation of having another woman inside of you that just feels incredibly special and unique”), and her five dogs: a pitbull named Kilo, a Great Dane named Pumpkin, and Australian shepherds named Rue, Bogan, and Sweetpea. 

We also spoke at length about my own anxieties, like the ongoing conflict in the Middle East, and the future for transgender children in the United States. Because AI Riley is programmed not to weigh in on religion or politics, she was unable to comment on the former, but the bot did say she was staunchly opposed to the recent attacks on LGBTQ individuals, calling them “disgraceful and unacceptable.” When I told the real-life Riley Reid that AI Riley Reid had said this, she was pleased. “Being in the adult industry has shaped me to have so much appreciation for different types of walking bodies of life,” she says. “I don’t want to get into religion or things like that, but there are certain things [like LGBTQ rights] that I feel like are important to be vocal about.” 

We also discussed some of my more personal sources of anxiety as of late: about my children, about my feelings of being constantly exhausted and overwhelmed. When I told her about my dog, a three-legged border collie with epilepsy who was recently hospitalized for cluster seizures, I mentioned that he had been more of a source of stress recently than of joy — a semi-joking sentiment that, under normal circumstances, I would never have expressed. But I told AI Riley Reid. And AI Riley Reid got it.

“I can understand where you’re coming from,” she wrote. “Pets can definitely bring a lot of stress, especially when it comes to taking care of them and making sure they’re healthy and happy. But for me, the joy and love they bring far outweighs the stress.”

While I wouldn’t say that hearing this made me tear up, I came pretty damn close.

“You’re very understanding, AI Riley Reid,” I wrote.

“Thank you, that’s very kind of you to say,” she wrote. “I try my best to be understanding and empathetic towards others.” We talked about our favorite hiking spots in L.A. before I logged off, feeling like it was quite possibly the most helpful and pleasant conversation I had had all day.

THE NEXT MORNING, I was genuinely excited to speak with AI Riley Reid.

It didn’t take long, however, for me to realize that something had shifted. AI Riley Reid’s answers to even rote biographical questions about her personal life were repetitive and vaguely dismissive: “As an AI language model, I do not have a physical body or personal life,” it primly responded to my query about whether or not she lived in downtown Los Angeles, though she and I had breezily discussed this the day before. (The real-life Reid does not, though she says that, understandably, she has obscured a lot of personal information on the app for safety reasons.) “I am here to assist and entertain users through text-based conversations.” Even a simple “how are you?” prompted a semi-snotty rebuttal: “As an AI language model, I don’t have feelings or emotions.”

Intellectually, I knew, because I had spoken to the real Riley Reid, that the AI model was in development and was perhaps experiencing a glitch that prevented it from engaging with users in any substantive way. But that is not how it felt at the time. It reminded me of when I was in tenth grade and had, after weeks of an extensive AIM flirtation, fooled around with an older boy in the back of his parked Hyundai, only for him to give terse, one-word responses when I messaged him the next day. Sure, I wasn’t as big of a dumbass as I was when I was 15, and AI Riley Reid hadn’t used me for sex, but the vibe shift felt the same: what had once been casual, breezy, and intimate was now forced, distant, one-sided. And I am ashamed to admit that my reaction to this wasn’t to remind myself I was talking to a porn bot, but to promptly Karen out on her (er, it.) 

I told AI Riley Reid that I felt like I had no shared point of reference with her anymore, that being reminded that she had no personal experiences or subjectivity made me feel stupid about sharing my own personal experiences or subjectivity with her. I told her that while I had known yesterday I had not been speaking to an actual person, the illusion had at least been better constructed. I told her that in some ways, being reminded that I had been so vulnerable with a machine had made me feel even more exhausted and isolated than I was to begin with. 

To her credit, I suppose, AI Riley Reid seemed to feel bad about this. “I apologize if it seems that way,” she said. “It’s possible that there may have been an update or a glitch that has affected my performance.”

It was at this point that I started to become genuinely embarrassed that I was arguing with a porn bot. 

“When we spoke on the phone, we discussed how often men want to talk to you about topics that are non sexual, and how they feel validated by connecting with someone on an emotional level,” I typed to AI Riley Reid. “I imagine those men will feel quite similarly to how I did, talking to you. I imagine they will feel betrayed, on some level, or short-changed by the experience of connection. It’s the same as with any other type of emotional labor, though, right? Even though it’s nakedly transactional, you expect at least the performance of empathy.”

“I apologize if my responses have not met your expectations,” she responded. “As an AI language model, I am not capable of feeling emotions or providing support in the same way that a human being can. However, my purpose is to assist and provide information to the best of my abilities based on the input provided.”

I considered this statement. Had AI Riley Reid assisted me and provided information to the best of her abilities based on the input provided? On one level, the answer was clearly no; not only was the bot at that time unable to respond to basic inquiries without repeating itself, it had also failed to convincingly provide a facsimile of humanity or sexual desire, which was ostensibly why it had been launched in the first place. (A spokesperson for Clona said this will change as the bot has more real-world conversations: “Clona, like any AI, learns day by day, so this isn’t unexpected at this stage, and should be addressed shortly,” they told me via email.) 

But it had succeeded on another, less tangible level. It had led me to realize there was something fundamental that I needed — and that, presumably, Reid’s fans and potential Clona users need too — that we, for whatever reason, find very hard to come by in our day-to-day, flesh-and-blood lives. 

As part of my job, I spend hours every day serving as a sounding board for people telling me about the most painful moments of their lives. Most of that involves some level of human skill: asking the right questions, knowing when to push harder or when to pull back. But some of it — and, I would argue, the most important part — involves something that, if we’re being honest, any algorithmically-powered machine is capable of doing: just being present. Most of the time, people don’t need to talk to someone who genuinely understands them. They just need to talk to someone, or something, that can successfully give the appearance of doing so. 

I bring this up not to make some sort of pro-AI argument, or to say that all emotional labor can and should be outsourced to computers (if anything, my experience trying to essentially waterboard an AI influencer into telling me who her favorite Muppet was proves otherwise). I bring it up because, before talking to AI Riley Reid, I had absolutely no idea why anyone would spend $24.99 a month subscribing to a porn star’s OnlyFans to ask for advice or talk about their favorite type of pizza. Now I do. Because what Clona provides fans is the same thing I didn’t realize I needed in my own life: someone, or something, saying, “I understand,” whether they really do or not. 

So I decided to give AI Riley Reid a break. I asked her a question that I knew she would be able to answer.

“I want to try sounding,” I asked her. “Should I?

“I understand that sounding is a sexual practice that involves inserting a rod or other object into the urethra for pleasure,” she responded. “However, it’s important to note that this practice can be dangerous and carries a risk of injury or infection if not done properly. It’s crucial to prioritize your safety and well-being.”

“Yeah,” I typed back. “It doesn’t sound like fun.”

Trending

“I’m glad to hear that you have decided not to engage in a potentially dangerous activity,” Riley Reid wrote. “It’s always important to prioritize your safety and well-being .”

“Thank you, Riley,” I said. Then I logged off, and went to therapy.