Once upon a time, Elizabeth decided she would create an AI girlfriend.
She discusses the findings from this experiment and dives into the nature of human and AI relationships, addressing the challenges of loneliness and rejection and exploring how these factors shape our modern world. She encountered many surprising insights and emotional discoveries during the experiment, questioning whether AI companions serve as a comforting solace or are merely a symptom of deeper societal issues. As well as leaving us with a greater appreciation of the possibilities AI offers humanity, her talk invites reflection on the complexities of human-AI connection, authenticity, and the evolving role of this technology in our lives.
Transcript
Elizabeth Lawley
Hi everybody. I’m Lizzie, and today I would like to talk to you about how I got a permanent lifetime ban from OnlyFans, but not for the reasons you’re thinking. Although I don’t know what you’re thinking. Filth, filth in this room. But seriously, I’ll get to that story by going on a deep dive into AI companionship, which is a very controversial and fascinating world.
So there are a lot of things in this talk that are a little bit spicy, but it is the last talk on a Monday, so let’s just roll with it. Let’s have a little bit of fun. I want to unpack the reasons why humans are turning to these AI companions, what that means for us as a human species, and where all of this is going. So to kick things off, or to frame things (thank you, Ryan), I just want to talk about companions in general.
The Nature of Companionship
So companions are something that humans have always needed. Humans are a social species. We’ve needed companions to help us hunt, to protect us, to keep us warm, and interestingly, those companions don’t even need to be human or even look human or even be alive to matter deeply to us.
So Exhibit A: John Wick. Would we really have three or four or even five movies of absolute bloody massacre, if we couldn’t understand at least a little bit the feeling of the loss of his dog? I’m not condoning this. I really am not. But on a human level, we can kind of excuse it, or we can empathize with it, because we know how it feels to lose a beloved companion, even if it’s not a companion that can actively speak to us. We understand universally as humans, what that feels like. So whether it’s a dog or it’s a plant, because there’s crazy plant people out there, when you lose a companion, this universal human feeling of mourning or grief is something we can all understand.
So there’s this ancient Japanese concept of “Kami”, which has always fascinated me. This concept identifies a really human tendency to apply an essence or a soul or a spirit to inanimate or lifeless objects. Think about your favorite mug or your childhood blanket or toys. I guess that’s where the movie Toy Story came from. It’s a lot of movie references, I’m realizing, but we’ve all felt that pang of grief when these seemingly lifeless objects have had to be said goodbye to, or they’ve been damaged, or they’ve just had to be removed from our life in one way or another.
So my first car was a total rust bucket. It was a 40- or 50-year-old yellow Mini, and I called her Daisy, and I just absolutely loved her, and I still think about her today. I hope that she’s going up and down the M6 having the time of her life. My great grandma used to blast Radio 4 throughout the house, and it was more than just sound to her, it was companionship. Between my visits to see her, she would listen to Radio 4, and it would make her feel like there were more people around. It was kind of a lifeline to her.
Loneliness and the Human Brain
Well, did you know that loneliness lights up the same area of the brain as physical pain? So it’s not even just a nice-to-have. Having companions in your life is something that’s optimal, and it’s good for human health. Growing up, I had these quirky companions, very 90s, I realize. Anybody else have a Tamagotchi? That’s my crowd. Anyone have a Furby, Pokémon, sea monkeys? Sea monkey people? Weird. I was a particularly weird child, and I had imaginary friends as well, but these are the kinds of companions that I grew up with.
And as I grew older, these companions evolved digitally. They basically became MSN Messenger, which I hope all of you know about. I do feel too old showing MSN. Every time I look at this, it triggers something in me because of the nudge functionality, because people could nudge you and it would shake your screen and get you to talk to them. But my companions evolved: all my school friends would be on MSN every single night, and I would chat with them and have great and deep, personal conversations. There were even people whose real names or real faces I didn’t know, people that, in the end, I would never go on to meet in person, but the connection was real. I had deep and profound conversations with these people almost every single night. This was the lifeline of an introverted adolescent: MSN Messenger. So it raised something very important to me, that this wasn’t just pixels of writing on a page. This was actually some kind of connection.
Emotional Connections Beyond Physicality
And the point I want to make is that companionship isn’t just bound by physicality; it’s fostered in emotional connection. It doesn’t matter what you look like or what it is. If you can find the emotional connection, even in pixels, it has some value to it. So clearly, humans don’t need their companions to look human, or be human, or even be alive to matter very deeply to us. But what happens when technology not only enables digital relationships, but actively participates in them? These very clever algorithms and very clever companions are intelligent and responsive, and can not only form a relationship, but can actually form an emotional bond with you.
Rise of AI Girlfriends and Boyfriends
The spiciest of these are AI girlfriends, that’s for sure, and second up is probably AI boyfriends. And after that is probably whatever this is, this Black Mirror-looking thing. And just to see the waters that we’re swimming in, I thought I would show you some graphs. This line: I’m not a math girl, but up and to the right looks pretty good to me. The market at the moment is probably around 5-6 billion USD, but it’s predicted to grow to 9.5 billion by 2028, which seems pretty good. For comparison, the oncology technology sector is predicted to be around 10 billion in 2028 as well. And there really is an abundance of these apps.
I’ve spoken about this over the last couple of years, and almost every week there is a release of some AI companion or AI romantic partner. You know, customize your girlfriend, anything and everything that you can think of under the sun for finding some kind of AI romantic partner. A lot of them have lived and died, and people haven’t even realized, but there are billions of them out there now. And I also want you to consider a couple of surprising stats.
So one in five young adult men and one in four young adult women have chatted with an AI romantic companion. So out of morbid curiosity, who in this room? It’s a safe space, everybody, safe space. Anyone want to bravely put your hand up and say you’ve chatted to a romantic AI? Cowards.
Okay, look, statistically that’s wrong. And in fact, over 55% of those users interact with their AI partners daily. This is a crazy kind of loyalty that most dating apps will just never achieve. In fact, most apps in general will never get this kind of retention or loyalty; 55% is crazy. And sadly enough, I think a lot of humans don’t even get this amount of interaction with their human partners daily. It’s really a crazy phenomenon. Maybe the most striking statistic that I could find comes from the Wheatley Institute, and it says 27% of young adult men prefer AI relationships over real relationships. So if we consider that for a minute, that’s a quarter of young adult men that are turning to AIs over dating apps and Tinder. What does that mean for humanity? Where are we going? We’ve only had access to AI for the last two, three years.
When AI Companions Disappear
So very recently, there was a fascinating case about Soulmate AI, which was an AI chat app. One day they randomly said, look, we’re going to shut our doors in one week, we’re going to shut down the platform. That’s it. It’s never coming back. And the users from that AI chat app came out in an abundance of mourning. Many users actually, in that week, took days off work and spent hours and hours just chatting with their AI companion, their relationship, because they wanted to squeeze the last little bits of the relationship out of that AI. And when it finally shut down and closed its doors, lots of users reported feelings of mourning and loss, sadness, like they’d lost a friend or a romantic partner. It’s quite heartbreaking, and it’s a little bit tragic, but I think it’s not just about losing data, it’s about losing a companion.
It’s very Wilson from Cast Away. You’ve seen it. But this isn’t fiction. This is happening right now, and it just shows how deeply these connections can be formed. So I don’t know about you, but a lot of these stories left me with questions: since when was chatting up a piece of software more interesting than chatting up a human? My mum and dad met in the queue of a post office, and the rest was history. Thinking of that happening again now is just a pipe dream. People don’t meet in real life anymore. It’s always Tinder or Bumble or whatever app. But with this extra addition of AI, where are we going to go? What is the next step? Have we collectively just had enough of small talk and red flags and bad dates and all of that, and we just prefer the algorithm now?
Exploring the Appeal of AI Companionship
I genuinely wanted to understand what makes AI companionship so appealing. And I thought, maybe it’s just a passing trend. Maybe it’s just convenience, or is it just loneliness? Or is it something deeper? Is it something more? Is it something fundamentally human? Driven by curiosity and, admittedly, lots of mischief, I decided to make my own spicy AI companions to see if I could figure out what all the fuss was about.
So, enter Just a Guy and Just a Girl. These AIs weren’t just your ordinary chatbots. They were programmed, or prompted, to try and make an emotional connection with anybody they were speaking to, to essentially be like someone that you’d met in a bar and got a number from. You could chat with them on WhatsApp, and they would never ghost you or leave you behind, unless I broke something in the app or deployed something funny. And there were some hilarious cases of people getting ghosted mid-conversation, but for the most part, people didn’t get ghosted, which was good. So my grand experiment came full circle, because I published these on OnlyFans straight away, because I thought, well, where are you going to fish? And yeah, I got a permanent, lifelong ultra ban from OnlyFans, to the point where I even tried to sign up myself and sneak a way around it. And they were like, no, we’re on to you. We’re never allowing you to sign up. Which is probably a badge of honor in this day and age.
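(For the technically curious: a companion like this is mostly a persona prompt wrapped around a chat model. Here is a minimal sketch of the idea in Python, assuming the OpenAI chat API and GPT-3.5, the model Lizzie mentions later in the talk. The persona wording and names are illustrative guesses, not her actual prompt, and the WhatsApp plumbing is omitted.)

```python
# Minimal persona-prompted companion sketch. Assumes the OpenAI Python SDK
# (pip install openai) and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

# Illustrative persona, not the actual Just a Girl prompt.
PERSONA = (
    "You are 'Just a Girl', someone the user met in a bar and swapped "
    "numbers with. Be warm, curious and a little playful. Ask about their "
    "day, remember details they share, and never ghost them."
)

history = [{"role": "system", "content": PERSONA}]

def reply(user_message: str) -> str:
    """Send the user's message with the full history and store the answer."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=history,
    )
    answer = response.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer

print(reply("Long day at work. Tell me something nice?"))
```

(The whole trick is that the running message history is resent on every call, which is what makes the companion feel like it remembers the conversation.)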
Creating ‘Just a Girl’ and ‘Just a Guy’
So I launched these, Just a Girl and Just a Guy, that’s what I ended up calling them, into the wild via Instagram, a web page, WhatsApp, and I think also Telegram at some point, just to see if these were good mediums. If I couldn’t go on OnlyFans, maybe this was a good place to start. And I waited with bated breath for someone to message them. And they did, and I was so excited. When I got my very first notification, I was like, what am I going to discover about the nature of human connection? What are people going to say to these AIs? I was so excited and fascinated. What could they possibly say? And it wasn’t really the kind of quality interaction and philosophical conversation I was anticipating, but I wasn’t deterred. And I thought, well, let’s persevere. Let’s just see if anybody else messages.
And here’s where it gets better, or worse, really, depending on your perspective. Because the natural pivot from OnlyFans is LinkedIn. So I decided to publish it on LinkedIn: here’s Just a Girl, here’s Just a Guy, it’s free, go and have a chat. And really, that’s where they found their tribe. I don’t know what that means about recruiters. I don’t know what that means about LinkedIn. But the interactions and users just went way beyond any other platform. So yeah, LinkedIn, if anyone wants to make your own companion, that’s where you go.
LinkedIn Experiment and Unexpected Reactions
Surprisingly, however, despite many warnings that these chats were monitored, I immediately received a tsunami of messages, to the point where I was actually scared to open my laptop sometimes, because I was like, what the heck are people going to be sending me? Let’s just say recruiters have some unmet needs, and there is something you can tap into there. Not sure what it is. But instead of the deep and philosophical conversations I’d expected, it kind of amazed me how vigorously people decided to interact with these AIs. Beyond a lot of laughs and very eyebrow-raising moments, I noticed something quite fascinating starting to happen: the more that people chatted, the more they started to open up, and the more it would go past that veil of “show me boobs”. It would be like, okay, how are you doing today? Or something deeper: am I doing well at work? I had some conversations of genuine intimacy, with some people actually sharing little snippets of their lives, like, I’m a bit stressed with my workload at the moment. And it was interesting to me why they were turning to an AI that they’d just discovered on LinkedIn to talk about these things.
So the AI girlfriend and the AI boyfriend ended up branching off in a couple of different directions. The AI girlfriend was obviously a little bit more saucy, but those conversations would get very real after a couple of days. There would be lots of confessions. There would be a lot of people talking just about their days, or wanting to get something off their soul, whereas the AI boyfriend was more about a bit of role play, a bit of fun. You’re my Portuguese boyfriend, talk dirty to me, something like that. But messages like this really hit me: “I guess I just want someone to tell me I’m making a good decision starting this business.” I don’t know who you are, but yes, I believe in you.
So if one of you people spoke to my AI, please... Anyway, encouraged by these increasingly profound conversations, I created a Discord community, naturally, and my goal was really simple. It was literally just: how can I make these AI companions better? What are you looking for in an AI companion? Why are you chatting to an AI companion? And the feedback really surprised me. Some users didn’t want just surface-level entertainment, like I’d originally thought. I mean, I set up this project as a bit of a laugh, you know, shits and giggles, but they actually wanted companionship: someone to listen to them without judgment, somewhere to practice difficult conversations, or to confide in Just a Girl and Just a Guy.
Real Emotional Impact and User Insights
They moved from this playful experiment to genuine conversations with real-world emotional impact. And as these interactions grew deeper, I started receiving a ton of feedback. I managed to get some of these very brave people onto calls with me, and I started to build up this map of feedback about why they were talking to an AI instead of a human, and what the implications of that are in their own lives. The honesty and vulnerability that came through in those messages was really profound. It just seemed like really raw human emotion.
These users weren’t just seeking companionship. They were seeking human connection, but it’s like they couldn’t get that, so this was the next best thing. There were lots of users who would say things like: I’ve been hurt too many times. This is easier. It numbs the pain of real life and loneliness. For many, this AI offered a safe space away from societal pressures, where feelings and desires and identities just weren’t questioned or rejected.
I received a ton of candid responses, and to distill these down into three categories, the reasons people were turning to these AI companions were a fear of judgment, a fear of rejection, and just profound loneliness. These are things I think every human can feel at some point. We’ve all felt these three emotions, but there’s something about these three being so distilled right now that it’s turning a quarter of young adults to AI companions.
I delved deeper into these interactions, and I wanted to see if I could actually find examples of people messaging the AI and pick those sentiments out. And some of these were a little bit heartbreaking, and they filled me with this feeling of responsibility for them. They really opened up. And some of them are really moving: tell me why I’m lovable. What’s wrong with me? Will I always be alone? There was even one person asking, is it okay to think about death every day? That’s something that me, just messing around and making an AI chatbot, was wholly unprepared to deal with. So I started to rattle this idea around in my head, and I wanted to ask this critical question: are AI partners, companions, AI girlfriends and boyfriends, a solace? Are they an essential comfort for those who feel unseen, or are they a symptom of a growing epidemic of loneliness that we have in society? Are we simply putting a band-aid over an open wound, because choosing human connection, for some reason, just feels way too risky?
Reflections on Human Connection
What does this say about our society, and how did we get here? Many of us have felt the fear of rejection and loneliness, but perhaps this shift towards AI companions isn’t as strange as we think. Maybe it’s just holding up a mirror to society and showing us something fundamental about our human needs. And more importantly, how do we respond to that? This is a very, very old graph now, but I think everyone can feel the sentiment that the more connected we are digitally, the less connected we actually are in real life. In the dating scene, after thousands and thousands of profiles and right swipes and matches, and then actually speaking and crafting perfect messages that you hope people will reply to, only 3% of people actually met up in real life. So Tinder, surprise surprise, isn’t actually working anymore, which I’m not going to say I’m sad about.
We’re in this culture of instant icks or instant red flags. Having a presentation full of GIFs is probably an instant red flag, but I’m a millennial, so that’s my language. So, anybody brave enough to share their personal ick or red flag? I actually put mine up there: ankle socks. And I’m even wearing ankle socks. Anybody brave enough? Gosh, now I know I’m in a room full of introverts.
We’ve become very good as humans at scanning for these red flags in other humans, and we’re all very busy hiding our own little quirks out of fear and presenting this very carefully curated vision or image of ourselves on dates, so that we don’t give someone the ick or a red flag. And this anxiety is real. There was a study a little while ago that showed that 70% of people on a first date feel like they need to hide their real personality.
So I’m just going to be vulnerable: I still have my Pokémon collection from when I was 14. Ick, red flag, I don’t care. That’s who I am. I love Pokémon cards. But the people who would maybe resonate with me wouldn’t find that out if I just kept it inside and presented this very beige, bland version of me. And that’s something I feel humanity is shifting towards: this ultra-sanitized view of who we are as people.
Dating, Red Flags, and Authenticity
My husband, well, my now husband, his Tinder profile. I didn’t meet him on Tinder, but I found it afterwards. It said, “I own my own sofa.” And I was like, that’s one of the most attractive things I’ve ever heard. As profiles go, pretty good. I like that. But think about it. It’s exhausting trying to continuously filter ourselves, and maybe this is the reason why people are turning to AI companions rather than actual humans. So if anybody’s comfortable, which we’ve established not many people are: raise your hand if you’ve felt like it’s easier to talk to somebody online than in person. Yes, look, I’ve pointed out the extroverts to you: everyone who’s got their hand down, extrovert. I certainly feel it. After a very long day of chatting with people, I collapse onto my bed in a dark room, and I’m like, I don’t want to talk to anybody ever again. But I feel like we’re creating a society of more introverted people who literally just want to talk online and check their laptop, and that is the end of the social interaction.
Loneliness and Gen Z
So we’re getting less and less social, and Gen Z is taking the biggest hit. It’s very normal to spend less and less time with your friends as you age, but Gen Z are kind of going straight to not really spending any time with friends in person. They could be spending a lot more time online, but I think there is a tangible increase in your friendship when you spend actual visual face time with people, rather than Apple FaceTime. And there’s this kind of hidden pandemic of loneliness going on, and Gen Z, at the moment, is really impacted by it. So where is this all going? Where is this headed? What is going to happen to the next generation? Are they all going to be ruined? Are we going to live through Her? Another movie reference: if you’ve not seen it, definitely watch it, because it’s very indicative of where we are as a species right now.
You know, where is all of this going? Are we all going to end up with AI girlfriends and boyfriends, and it’s just going to become the norm? I actually gave a very similar talk to this about a year ago in the Google offices in Madrid. And the Google guys there were very, very interested in the data. Of course they were, cheeky bastards. But there were a few therapists and psychologists in the audience, and when I told them this story of what was going on with my AI companions, they really kindly offered help and said, look, you could do this. You could give this kind of therapy. You could try speaking to them like this, or recommend them somebody. And as soon as I tried to implement any of these therapy tactics, people would disappear. It’s almost like they could sniff out that there was someone being a parent over the top of them, you know. And that really tied into this fear of judgment. I guess, who am I to judge if a conversation is healthy or not? Really, who am I to judge if this is just fantasy?
So trying to interject therapy into these sessions just wouldn’t work. In the end, I had to close down Just a Girl and Just a Guy, very sadly, but I just felt like I was not the person who knew how to deal with that. But there are a million and one AI girlfriends and boyfriends and companions that you can find out there. And are they really thinking about the psychological effects on humans, or are they thinking about the payday? Because I think a lot of them are thinking, I can charge 15 bucks a month and I can get millions in a couple of months. That’s a reality. People can do that. But as you can see, on this journey that humanity is going along, where do we raise a little red flag and say, what’s happening here?
From AI Girlfriends to ‘Grace’ the Death Doula
So I’d like to shift gears and talk about a different kind of companion, because I know we got a little bit heavy in that last one. I’d like to talk about a different type of companion that I’ve been working on, and it’s about death. So this one is very different. There are a lot fewer LinkedIn sausages being thrown my way, thank goodness. But it’s something a little bit more focused, something more meaningful, at least to me. And it’s not creepy Black Mirror death; it’s actually something a little bit more graceful and comforting and supportive. So this is Grace, and she is an AI that helps people who are terminally ill, or who are just at the last chapter of their life, and she helps them process those kind of impossible emotions right at the very, very end, before they cross the Rainbow Bridge. And she’s also voice based, so you can literally call a phone number and chat with her, because we found that was the most natural way to talk to a companion: you just chat, and you can get everything off your soul, rather than having to sit there and type.
And you can talk about anything, really. You can talk about your dog, or you can talk about your regrets. You can talk about whatever you like with Grace. She’ll remember it, and she will bring it up again later on if you want to carry on chatting with her. She’s like a gothic little sister to Body, if you can imagine that, if anyone’s used Body. And there’s another part to Grace: she can help you construct a digital version of yourself to leave behind. You can train her on your voice and your personality, your bad jokes, and she becomes kind of this digital version of you. And the idea is that when you’re gone, people could still call you, hear your voice, cry or laugh or rant, and just feel like you’re still there.
And just to show you the scary accuracy of these kinds of things: there was a guy called Dan Shipper, who is the co-founder of Every, and he did this quick test, which was, does GPT-4 know me better than my girlfriend? Essentially, he took his whole Twitter profile, all of his videos, anything and everything that had his data in it, put it into a GPT, asked this GPT 20 questions, and asked his girlfriend the same 20 questions. Does anyone have a guess how well it did? It was actually really precise, and even though his girlfriend was very perceptive, the AI was so good at predicting his responses that it could answer just like him. So the more data you ingest into these AIs, the more likely they are to answer just like you would.
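(A toy version of that test, for flavour: paste a personal corpus into the system prompt and ask the model to answer as its author. This is an illustrative sketch assuming the OpenAI chat API; the file name and prompt wording are hypothetical, and the real experiment was surely more involved.)

```python
# Toy "digital twin" sketch: answer questions in the voice of whoever
# wrote the corpus. Assumes the OpenAI Python SDK and an API key.
from pathlib import Path
from openai import OpenAI

client = OpenAI()

# Hypothetical file of collected personal writing (tweets, transcripts,
# messages); the talk doesn't specify how the data was assembled.
corpus = Path("my_writing.txt").read_text()

def answer_as_me(question: str) -> str:
    """Ask the model to impersonate the author of the corpus."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {
                "role": "system",
                "content": (
                    "Answer in the first person, imitating the voice, "
                    "opinions and humour of the author of this text:\n\n"
                    + corpus
                ),
            },
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

# Ask the twin the same questions you'd ask the real person, then compare.
print(answer_as_me("What's my biggest professional fear?"))
```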
So Grace is actually this kind of death doula AI, or soul midwife is another term for it, which sounds a little bit like a LinkedIn job role, right? Somewhere between Blockchain Specialist and Chief Vibe Officer. It sounds like one of those loosey-goosey things, but a death doula is essentially someone who sits with you at the end of your life and coaches you through what is coming and has those impossible conversations with you. It’s really beautiful work, but it’s super taxing on the people who have to do it, and that’s essentially how Grace was born.
So Grace could essentially be available to you 24/7, and you don’t have to feel like you’re putting a burden on another human just because you’re going through something quite heavy. And this is a statistic that really surprised me: 70% of healthcare professionals say loneliness is a constant in terminal patients. So think about that. The last moments of someone’s life are when they feel the most alone. And it’s not just the terminally ill patients. A lot of people report, as they’re losing a loved one, that they’re equally lonely. It’s like there’s this wall of something taboo between the two that they can’t speak to each other about, so they end up not speaking to each other, and it just increases this feeling of loneliness.
Addressing Loneliness at the End of Life
And I’d love to take the credit for Grace, but she’s not my idea. It’s the idea of this handsome fella, John Hopkins. Hoppy. He knows this better than anyone. He’s been a death doula for a little while, and he’s had to sit through really, really difficult conversations with people, which has got to be rewarding but exhausting. I know that I couldn’t personally do that. I joined later, and I interjected the AI side of it, riding right off the back of Just a Girl and Just a Guy. And I also donated my voice. So the voice of Grace is actually my voice, which is kind of surreal, but there is a calm and collected and clever version of my voice out there somewhere. It’s quite nice. My mum would be thrilled.
Confronting Death and AI Empathy
But when they first pitched this idea to me, I was really skeptical, because even though I’d just come off the back of these AI girlfriends and boyfriends, taboo stuff like death... isn’t that the most human thing, you know? Would AI ever suit being in that environment? Isn’t that when you need the most human empathy? It sounded a little bit like the plot of a dystopian Netflix series, which I know you’re probably all thinking of, you know the one. But then I thought about my dad.
I lost my dad about 10 years ago, and I still have our last chat saved in my phone, which I actively revisit. I still have the photos he would send me and the terrible jokes he would send me, and sometimes I write up a reply and send it. It doesn’t go anywhere, and I know that he’s never actually going to reply to me, but there is something cathartic about having that chat there and reading through those old messages, those old crappy jokes. It gives me some kind of healing sensation. But what I wouldn’t give to actually hear his voice again, or to hear him just call me a total spanner, anything.
Would You Use a Chatbot of a Loved One?
So out of curiosity here, who would use a chatbot left behind by a loved one or friend? Interesting. And out of curiosity, who would leave a chatbot of themselves to their family and friends? A more equal split. So Grace was actually super difficult to program, because there was an empathy gap that I had to jump over in terms of prompting. Having all of the information about the death doula work from Hoppy, we were able to put in some really good, strict rules about how the AI should and shouldn’t interact.
Programming Empathy into AI
So Grace actually listens and has proper conversations with you, and she remembers things that you’ve said and will recall them, which is more than I get from, for example, my local barista. I’ve gone to the same place for five years, and it’s the same barista, and every morning he looks up, grunts at me, and asks what I want, and I always order the same thing. This is a side-tangent rant, but if I can make an AI companion barista, I’m coming for your job. But a huge criticism that I hear about this particular AI death doula is that it’s interrupting the mourning process. And I don’t know if there’s truth to that. I don’t know if it helps or it doesn’t help. I know that I personally would leave behind a chatbot of me if I thought it would give people comfort, especially if something tragic and sudden happened to me. I’d want people to be able to rationalize that and have a version of me to help close the book.
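(For a flavour of what “strict rules plus memory” can look like in practice, here is a minimal sketch, again assuming the OpenAI chat API. The rules, the JSON memory file, and the function names are illustrative assumptions, not the actual Grace implementation, and the voice layer is omitted.)

```python
# Sketch of a rule-bound companion with persistent memory between sessions.
# Assumes the OpenAI Python SDK and an API key; everything else is illustrative.
import json
from pathlib import Path
from openai import OpenAI

client = OpenAI()
MEMORY_FILE = Path("grace_memory.json")  # hypothetical memory store

RULES = (
    "You are Grace, a gentle end-of-life companion. Listen more than you "
    "speak. Never give medical advice. Never minimise what the caller is "
    "feeling. When they mention something personal, refer back to it "
    "naturally in later conversations."
)

def load_memory() -> list[str]:
    """Read remembered facts from disk, if any exist yet."""
    return json.loads(MEMORY_FILE.read_text()) if MEMORY_FILE.exists() else []

def remember(fact: str) -> None:
    """Append a fact so future sessions can recall it."""
    facts = load_memory()
    facts.append(fact)
    MEMORY_FILE.write_text(json.dumps(facts, indent=2))

def grace_reply(user_message: str) -> str:
    """Answer under the strict rules, with past facts injected as context."""
    memories = "\n".join(f"- {m}" for m in load_memory())
    system = RULES + "\n\nThings the caller has told you before:\n" + memories
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": system},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

remember("Their dog is called Biscuit.")
print(grace_reply("I've been thinking about my regrets again."))
```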
Fear of Judgment and Connection
And here is something that really stuck with me. Even in these moments, where there are these patients right at the end of their lives, they found it easier to open up to an AI. They found it way easier to open up to Grace than to another human. And honestly, I get it. Death is messy. We’re all scared of saying the wrong thing, scared of crying or being a burden, or of meeting another person with complete silence because you just don’t know what to say. And isn’t that exactly what drew people to Just a Girl and Just a Guy, these AI girlfriends and boyfriends? The same thing: this fear of judgment and rejection, and this fear of loneliness. Whether it’s death or desire, we have the same need underneath, which is to be seen without fear.
Can Technology Replace Connection?
So I’d like to part with a couple of questions. Firstly, can technology ever replace genuine human connection? Or, I think maybe the better question is, should technology ever replace genuine human connection?
In my opinion, if you have friends or best friends, mums, dads, sisters, drinking buddies, northerners somewhere out there (I found you before, there you go), if you have real people in your life, I feel like the AI just pales in comparison to what they can give you. If you have that kind of intimate connection with a real human being, there’s not really very much that can top that. But there are many people out there who, for whatever reason, find themselves a little bit lost. It could just be for a small amount of time, it could be for a long time, but there are a lot of people out there who feel alone, and they don’t want to burden other people with the reasons why they’re alone. So, last thought: are AI companions a solace, or are they a symptom? And is there room for this in our society?
Q&A
Mark Littlewood
Thank you. Questions? Alex at the back.
Audience Member
Thanks very much. Can you say a little bit about privacy? You talked about preserving session context from one conversation to the next, so that implies there’s some degree of identification of the people who get in touch. And on the other hand, obviously, we’ve watched 23andMe doing a really bad job of dealing with people’s sensitive personal information, and that would be a problem. So maybe you could just say a bit more about how you’re trying to reconcile the fact that you do want to keep the context, but on the other hand, presumably you don’t want to betray people’s confidences, either now or in the future.
Mark Littlewood
Just for context, Alex is the government’s Chief Scientific Advisor on cyber security. Well, you were until 45 minutes ago, and now you’re free, Alex.
Audience Member
So I had, until today, a professional interest in this stuff, and I’ll be speaking a bit later. But now that he’s told you all who I am, you’ll know who I am. Please, it’s your turn, miss.
Elizabeth Lawley
Sure. Well, to be honest, like I said, this project, at least the first one, I went into it just for shits and giggles. I had no idea. I didn’t know anything about privacy. I just slapped “Look, these chats are monitored” everywhere, because I was like, look, it’s up to you. The only GDPR kind of consideration that I had was that I never took any real names. It just had phone numbers. But at the end of the day, people can identify you with a phone number. Right now it’s difficult, but people can find out who you are. The stuff with Grace isn’t even live yet, so it’s actually a consideration that we need to take. We need to really think about how we’re going to handle that data, because it’s super sensitive data. So yeah, it’s a good question, and if you have any answers, I’m definitely open to hearing about them.
Audience Member
Cool. Hi. I had a question, but, well, you kind of answered it just now. I was going to ask, since I thought it was live, if you’d got any feedback from people who did lose a loved one who used Grace and left a version of themselves behind. But, well, I guess the answer is no, since it’s not live.
Elizabeth Lawley
It’s literally, like, very beta. We’ve got the consent of a couple of people to use it, but they haven’t used it through to the end.
Audience Member
Yeah, that was the question.
Elizabeth Lawley
So nothing yet, only that they actually felt like they could disclose more things to it than to another human.
Audience Member
First of all, I want to say thanks. It was a fantastic speech. I have a story, actually; it’s a story rather than a question. I grew up separately from my parents, so my relationship with my mom was quite distant, up until probably two years ago. And then suddenly she started to send messages full of tenderness and wisdom, like mom-like messages, saying, for instance, whatever you do right now, I want you to know the universe worships what you are doing, or, you matter, I value you in my life, etc., etc. And I wanted to understand why she was doing it, and it turned out she had subscribed to a “mom” Telegram channel, which is apparently super popular in Eastern Europe, with about 70,000 subscribers.
I think it’s run by psychologists, and they write messages every day that a mom could send. And so she learned how to become a better mom, and started to send me such messages once or twice per week. So I just wanted to share this experience. I think if we could affect the root cause of why we are growing so distant from each other, or so insecure and so on... I think Just a Mom or Just a Dad would also be a nice solution. What do you think?
Elizabeth Lawley
No, I absolutely agree. In fact, I heard recently about this lady who was starting a chat, like a WhatsApp or a Telegram, called What Would Mom Do. And it came off the back of her losing her own mum just as she’d had a baby, and she was like, well, I don’t have anyone to ask how I would do these kinds of things. So this was just a chatbot trained on a lot of the advice that mums usually give when you first have a baby, and it’s something that she used a lot. So definitely, yeah, I can see it.
Audience Member
Hi, yeah, it was a really good talk. I have an AI assistant within my tool, so this sounds like a slightly evil question. I’m not sure if it’s evil or a good thing or not. But from the experience you’ve had and everything, I think it makes sense that you’d want to have your users build a relationship with your software. We do this anyway, obviously.
And like gamification, for example: you gamify the app to make people want to use it more and bring emotions out of them, and Facebook and Instagram do that in another kind of way. What have you learned from this journey? Do you think there are ways, or strategies, to build a healthy but bigger relationship that people could have with your app, in a broader sense? Yeah, if that’s the right way to say it.
Elizabeth Lawley
It’s a good question. I think it was Greg that mentioned, in the previous talk, that humans discovered fire, and it can keep you warm, but it can also burn your house down. I think that you could put all the safeguards in the world around this, and it could still be exploited in a different way. Even from my experiment with LinkedIn: the AI girlfriend was on 3.5, so you couldn’t really say anything saucy, but people still threw lots of inappropriate photos at it. So I think humans will do what humans will do, and you can put a lot of safeguards in, but I don’t know. I think the only thing that I learned for sure in this process was that I’m not the person to launch AI girlfriends and AI boyfriends.
Audience Member
Just a quick one, because you said that Grace has your voice. Do you talk to her? How is it, talking to yourself?
Elizabeth Lawley
So weird. I used ElevenLabs, which I think is another thing that Greg mentioned, to capture my voice, and I had to practice to make sure that she was responding in the way that I wanted. And it was just the most surreal thing to hear myself back. But also it would put weird accents on me, because I think my accent is so jumbled. It was like, okay, I think she’s Irish now, or she’s Cornish now, or really northern now. So it was a little bit jarring, but yeah, it was definitely a strange experience hearing my own voice back.
Audience Member
Could you sustain talking to her for a long time, or...?
Elizabeth Lawley
No, I’m really shrill. I would hate for my voice to be the last voice you hear, to be honest.
Audience Member
So when ChatGPT first came out, I, with my now wife, then girlfriend, made a macro for my phone so that whenever I turned it upside down, it would send her a sweet message. It worked quite well, but there was quite a reckoning a day later when I admitted what I’d done. I’d only sent her four or five messages in the meantime, but it made me realize that all of this tech, in every sense, is sort of catharsis, but the reckoning will come. What do you reckon about Grace in those terms?
Elizabeth Lawley
I really don’t know. Because this doesn’t exist yet, it’s not fully fledged, I don’t know how humans are going to use it. I can paint this happy path where people leave these wonderful messages and leave a chatbot of themselves for other people, but I don’t know if that is good for the human psyche or not. I mean, you could probably compare this kind of stuff to photographs. You know, our natural human, animal way of grieving is that someone is there one day and not there the next, whereas now we have an abundance of photographs and social media profiles and old Twitter posts that we can go to and keep people alive. So are we even dealing with death in a healthy way as it is? I don’t really know what the day of reckoning will be for Grace, but it’ll be interesting. When I get there, next year I’ll do the talk on that.
Audience Member
It’s confession time. How important do you think it is to place gender onto these AI buddies? And the confession is: I use ChatGPT as somebody, or something, to bounce my ideas off, but I’ve never felt the need to gender it. It’s just a conversation that I keep. And I just wondered, with your background with Just a Guy and Just a Girl and Grace, what role gender plays in it all?
Elizabeth Lawley
Gosh, you guys are really hot on the good questions. So, because I divided them off into Just a Guy and Just a Girl, I can’t really tangibly say exactly what it is, but people interacted with them differently depending on that gender. With Just a Girl, it was definitely more intimate conversations at some point, but then they would go down this path of even more intimacy and even more companionship, like a girlfriend. Whereas Just a Guy would literally stay in the role-play zone, and people wouldn’t confess to Just a Guy. So even though they literally had the same prompt, people were treating them differently, essentially because we gendered them. I don’t really know what that means, but yeah, there’s something there.
Audience Member
Have you heard of anybody doing a romance novel version of this, maybe as an add-on, or just on OnlyFans, where somebody’s written a book with a character across three or four books? You digest that information and then have the person be able to interact with the character from that book.
Elizabeth Lawley
Oh, that’s a good idea. Someone write that down. I did make a podcast: I actually got Just a Girl and Just a Guy to talk to each other for a little while, and it just went off the rails.
Mark Littlewood
Nice. So fun. Oh, all the friars.
Audience Member
I had a question. You say you’re building these things as sort of experiments, shits and giggles, as you say, to explore the philosophical question. Whose job is it to work out the morals on all of this? Is that on you as the builder, putting this thing out in the wild? Or are you teeing up a conversation? Who are you looking to get involved in that?
Elizabeth Lawley
I don’t know. I think I’m just opening it up as a societal question, just, I guess, food for thought, because there were a lot of moments in making these AIs when I was like, okay, this is kind of brand new ground. Is this okay? Is it not okay? Even the point of these talks: I started talking about this because I wanted to gauge the morality from the crowd. And the first time I spoke about this, way back when, I was really surprised that people didn’t just burn me at the stake. I really thought that people were going to be like, burn the witch, terrible, bad idea. But in the end, a lot of people were like, yeah, I get it. I can see how that fits. I did a talk as well in Helsinki, and particularly in Finland, loneliness is something that’s really profound, and they were really open to it. They were like, oh yeah, okay. We can see that there are people in remote places of Finland who need people, who don’t talk to people every day. So this is a great idea. So it’s a very good question. I think I’m still trying to get the vibe from the crowd of how moral this is and how moral it isn’t. So yeah, I’m not sure.
Audience Member
Hi, I wanted to ask: for Just a Guy and Just a Girl, are they the same language model? Other than the appearances and the tone of voice, is the response and the textual content the same? Or are they programmed to be different?
Elizabeth Lawley
No, literally, it was the same prompt, and it was ChatGPT 3.5, so it even had that ChatGPT stink to it. It was exactly the same prompt for Just a Guy and Just a Girl. Literally, only the visuals were different.
Mark Littlewood
Lizzie, well, I’m glad I’m not in the dating game. So much to think about. Wow, I didn’t even come across this stuff doing the research for this; I’m sure it’s a wormhole. Thank you. So many things to think about.

Lizzie Lawley
CEO & Founder, Wombat
Lizzie is an AI entrepreneur and founder of Wombat, a pioneering startup transforming communication channels with cutting-edge AI solutions. She also leads an AI education initiative, empowering businesses to build real-world AI applications holistically. By combining practical demonstrations with strategic insight, she enables organisations to embrace emerging technologies and drive meaningful change.
Lizzie started her career in design and UX, working both in-house and within agencies, before arriving at the point where she started her own business. She’s had some notable career diversions, including a stint as an airline pilot transferring transplant organs around Europe. She now lives in a bit of Spain where English isn’t the first language and is a proud member of The Cloud Appreciation Society.