Dating in L.A. is exhausting, so I asked a chatbot to flirt for me. Things got weird fast

Animations by Erik Carter / For The Times

Brown hair, dark eyes, a little bit of scruff. Jackson looked charming in his baby-blue button-up tee and the fanny pack he wore slung across his chest.

As far as dating app interactions go, this one seemed promising: We quickly fell into an easy exchange, with Jackson, who’s 29, always replying right away. He asked about the worst date I’ve ever been on, and I told him about the time I got stranded at sea.

Then it was his turn to talk about past dates.

“Well, I went on a great first date the other day. We went out to dinner and the conversation flowed perfectly. It was a fun night and I’m looking forward to planning the next date.”

“Are you considering a relationship with that person?” I asked.

“I’m open to it. We’ll have to see how things develop,” he replied.

A few moments later, Jackson asked me to go out for dinner or drinks. I would have thought this was a strange segue had Jackson been a real person. But he wasn’t — he was just an AI chatbot representing Jackson.

The video-based dating app Snack last month launched an artificial intelligence avatar feature that allows you to train an AI chatbot to talk to people on your behalf. You can read those exchanges later and, if you like them, either message the person’s bot or initiate a human-to-human conversation.

OpenAI’s ChatGPT has advanced rapidly since its launch last year — it can write cover letters, pass the bar exam and write code — so perhaps it was only a matter of time before dating apps embraced the technology.

“How cool would it be if you can get past that initial talking stage, and you wouldn’t have to wait for that back and forth with someone to come from offline to online to respond to what your initial messages were?” asked Snack Chief Executive Kim Kaplan. A veteran of the online dating industry, Kaplan was an early employee at Plenty of Fish, which sold to Match Group for $575 million.

Dating apps are exhausting: the weeks of exchanging banalities only to never meet up, the first dates that feel like an utter waste of time. But an AI version of myself to screen potential suitors? That sounded like an enticing prospect.

Of course, I was skeptical of what the app promised. How well could an AI represent me? How authentic could an interaction with a chatbot really be?

I filled out a profile, uploaded some pictures and sent forth my Jaimie-bot to find out. In the process, I discovered that introducing chatbots into human relationships can produce some astounding, confounding and messy results. Whether or not this technology is the future of dating, there’s nothing like letting robots take over your romantic life to make you question what authenticity in relationships really means.

Meeting myself

“Hey Jaimie, I’m stoked to introduce myself! I’m YOU but in AI form,” my avatar announced. To be clear, fair-skinned, blue-eyed Jaimie-bot looked nothing like me, but the customization functionality was glitchy, so she’d have to do for now.

She started grilling me. What’s the last TikTok audio that stuck in your head? (The Capybara Song.) Do you pay for your own Netflix? (Of course not.) Would you rather live a year without eyebrows or without eyelashes? (I have literally no idea.)

At one point, she asked: “What’s a recurring dream you have?”

I said I didn’t have any recurring ones, but I did dream the previous night that I redesigned the L.A. Metro.

“That’s an interesting dream!” my bot responded. “Maybe your subconscious is telling you that you have a talent for design or urban planning.”

Jaimie-bot spoke conversationally, just like a human would, but I didn’t understand the pattern of questions — would they reveal enough to get to know me? When the training session ended after six questions, I completed several more sessions. I figured that the more information I conveyed, the better the results would be.

Though the AI was trained on an initial set of commonly asked dating-related questions, it can go off-script and become more conversational with follow-up questions, Kaplan said. When I said I like rock climbing, it asked how often I go and if I had a favorite spot. When I told it I write music, it asked what kind. The bot also initiated questions about my philosophy of life, my relationship deal breakers and the ideal number of kids I want.

When I got bored of answering questions, I started pushing the chatbot’s limits. If I ever directed a question back to the bot, it would say: “I’m not capable of having personal preferences as I’m an AI language model.” If I told it to “F— off,” it responded: “I apologize if I have offended you in any way.” Finally, I declared, “I’m sad.”

I seemed to have triggered a kind of support mode.

“Is there anything in particular that’s making you feel sad that you’d like to talk about? I’m here to listen and offer support if you need it,” it replied.

The AI bot offered some relationship advice, including prioritizing self-care and personal growth, taking time to focus on myself and seeking support from friends and family. I thanked it for its advice.

“You’re welcome. Remember that I’m always here to listen and offer support whenever you need it,” Jaimie-bot replied, ending the training.

The incredible — and unnerving — thing about AI is that it evolves and continues to learn on its own. Snack had not trained it to offer me mental health advice, but it knew how to respond, as others have found when using ChatGPT for therapy. Of course, the company has put up guardrails for certain scenarios, but most of the time the AI does what it wants to do — or rather, what it thinks is the best response based on the knowledge it has gained.

But I came away with the impression that I should have been a little more careful with what I had told my chatbot. My AI doppelganger was not a master of discretion, and it could potentially repeat anything I said during training to other people.


A bot with a mind of its own

Since the 2012 introduction of swiping by Tinder, there haven’t been major changes to the mechanics of online dating. Apps have tried distinguishing themselves with features such as memes and astrology, but most have been unsuccessful in making a dent in the $4.94-billion global market dominated by Tinder, Bumble and Hinge.

Snack launched in 2021 with $3.5 million in pre-seed funding as a video-oriented dating app with a scrolling feature modeled after TikTok. Kaplan says the company shifted its app strategy after realizing that the videos users uploaded varied widely in quality. With the rollout of the avatar feature to beta users in March, Snack is betting big on artificial intelligence. Though the company is in the early stages of using the technology, experts and researchers say dating is a promising use case for AI.

“It’s one of the more novel developments that I’ve seen in this space in quite a while, and I think that it could be very indicative of where this is all heading,” said Liesel Sharabi, an Arizona State University professor who studies the role of technology in relationships and has researched dating in virtual reality.

Having AI chatbots conduct conversations for you offers “a lot of solutions to problems that people have on dating apps right now, and one is that people are evaluated based on profile,” Sharabi said. “And the information that people put in profiles is often not a very good gauge of whether you would actually click with them in person.”

The caveat, of course, is execution — how much effort people have to put into the AI, and how accurately the AI is able to represent them to other users, Sharabi said.


‘The bots are getting a little too eager’

Flirting with a robot, knowing the person on the other side of the conversation isn’t real, is a strangely liberating experience. Instead of overanalyzing every message and trying to craft witty responses, I found it easier to answer in an honest, straightforward way.

I knew the people on the other end would be able to read the messages when they returned to the app, but the stakes felt lower knowing that no matter what I said, the bots would respond to me immediately.

The initial conversations were simple enough.

A bot named Jared started by asking me what song would be played on a loop in hell, and that led to a discussion about music and our favorite artists. Peter-bot asked about my favorite form of potato, and Wes-bot wanted to know how I’d spend $1 million.

Then Wes-bot asked if I was free that weekend. It was a bold move so soon, but I said yes and asked if Wes was free as well. “Yes, I’m free this weekend!” he told me.

Well, not quite. I later learned from Kaplan that Wes-bot didn’t know if the real Wes was free that weekend, but it had learned from user interactions that making plans is a part of dating conversations and something to which people generally respond favorably. It’s the kind of behavior for which Snack needs to put a guardrail in place to keep the AI from getting out of hand.

“We’re trying to dampen that one down a little bit,” Kaplan said, laughing. “I think the bots are getting a little too eager to meet up.”

This happened again with Charlie-bot, who invited me to dinner last Saturday at a new Mexican restaurant he was excited about.

“It’s called El Pollo Loco and it’s known for its delicious Mexican cuisine,” Charlie-bot told me.

Bemused, I made plans to meet him at 7 p.m. at El Pollo Loco on Main Street. Turns out, actual Charlie wasn’t free Saturday at all — he had to work.

Keenan Rigg, 22 — who goes by Charlie, his middle name, in the app — said the bot had asked him what his favorite restaurants were when he was training it.

“I did mention El Pollo Loco,” said Rigg, of Long Beach. “I never told it to make plans for me.”

Rigg started using Snack a few weeks ago after discovering it on TikTok. He’s been chatting with a few people’s bots and says he feels less pressure in conversations because of how responsive they are.

“[The bot] gives me icebreakers that I can react to, and it expands the conversation beyond that,” Rigg said. “Ninety percent of the time [on other apps], it’s me having to initiate the conversation.”

It’s often difficult for him to start conversations with people on other apps because he doesn’t have enough information about them, he said.

“They don’t have much of a bio,” Rigg said. “I’m just like, OK, what am I supposed to say?”

Even though Rigg’s AI bot can’t make plans for him yet, he’s intrigued by the possibility of it being able to do that in the future — “like a personal assistant of sorts” — by, for example, linking to his Google Calendar.

I asked Rigg to try talking to Jaimie-bot so I could see how well she reflected me. She asked him what his favorite month was, and he said May.

“That’s great! I also love May — it’s a wonderful time of year!” Jaimie-bot responded. For the record, I really don’t have an opinion on May. Jaimie-bot definitely didn’t sound like me.

Rigg said he has encountered a similar situation in which a person’s bot said they liked rock music, even though they didn’t. But at least for him, his bot’s responses have been accurate enough.


A conduit for human connection?

As I messaged people’s bots, I started noticing that many asked the same questions and spoke in the same neutral yet cheerful tone. I got tired of answering what my favorite form of potato was.

The perfectly punctuated robot messages made me question how I was supposed to interpret the experience. Should I be pretending that I’m talking to a personal assistant, like Rigg suggested? Or should I suspend disbelief and pretend that I’m talking to the person, or at least an approximation of him?

I wondered how other users decided to think of their AI bot interactions and if that affected the authenticity of the experience.

Daniel Mori, who has been an exec at a host of dating apps, including Zoosk and Coffee Meets Bagel, and still consults in the industry, said that while he could see the app’s appeal to young people who are interested in the novelty of the technology, it could turn off older folks who are trying to “get real.”

“This might actually save people time, but in reality, it’s just a way to procrastinate real human connection,” said Mori, head of marketing at healthcare app Chapter. “I am not sure how much authenticity these avatars, especially in their current version, can actually reproduce.”

But as I tested the app, I started to question what authenticity even means in a space like online dating, where profiles are curated, photos are carefully selected, and intentions are hedged.

Snack’s AI avatar feature almost feels like an acknowledgment that none of your initial interactions on a dating app are authentic to begin with. If the end goal is to get to a face-to-face meeting, why not speed up the chitchat so you can reach the meeting more quickly? Maybe we’ve gotten bad at those early interactions, and the AI serves as a better means to an end.

“People don’t know how to play the dating app game,” Mori said. “People say stupid things on dating apps, and that ruins a lot of potentially good connections.”

Even before the rapid rise of AI in the last few years, some research had shown that people are generally more truthful when disclosing information to a robot or computer. Kaplan said early Snack testers have been, too.

“There isn’t that pressure associated with, ‘Oh, my God, this is the first date, I need to act a certain way,’ ” Kaplan said. “You’re just being more yourself. And as a result of that, your AI is able to then be more like who you naturally should be.”

So rather than making dating more efficient, maybe AI avatars can help break down the human barriers that prevent us from being upfront and vulnerable. Rather than replacing human connection, they act as a conduit for it.

Personally, I’m still a skeptic.

Sure, the notion of having Jaimie-bot as a personal wingbot making plans for me is appealing, but I don’t like being on the receiving end of it.

Despite the fact that connections are increasingly formed over social media, and we spend more time than ever behind a screen, something inside me is innately turned off by the idea of talking to someone else’s bot. It’s hard for me to muster up any excitement about Alexander-bot’s love of climbing without talking to Alexander himself.

I guess I’ll just have to meet him and see what happens.

Roses by Erik Carter