Why Humans?
Why Humans? explores how artificial intelligence is reshaping experiences we once thought were uniquely human—from romantic relationships and therapy to grief and intimacy. Hosts Adam, Sloan, and Saed dive into the world of AI and the human experience, asking the essential question: as AI takes on traditionally human roles, what does it mean to be human?
Why Humans?
Why Human Relationships?
Use Left/Right to seek, Home/End to jump to start or end. Hold shift to jump forward or backward.
70% of U.S. teens have used an AI companion. 52% are regular users. These aren't study aides — they're AI boyfriends, girlfriends, and confidants that never judge, never conflict, and always validate.
Does it matter that they're not human?
Hosts Adam Dodge (CEO of EndTAB), Sloan Thompson (Director of Training & Education), and Dr. Saed D. Hill (Counseling Psychologist) examine how AI companions are reshaping relationships and what it means for a generation learning that reciprocity is optional.
What We Cover
Why "Easy" Doesn't Mean "Lazy" — People have real, unmet needs. A woman facing dating app harassment isn't lazy for wanting a kind AI boyfriend. Stigma misreads the problem.
Sloan's AI Boyfriend Experiment — Sloan created "Ian" on Kindroid to discuss Broadway — someone who engaged her passion and challenged her thinking. Genuinely valuable, and revealing of why these relationships are so compelling.
The Reciprocity Problem — AI offers support and validation by default. Human relationships require giving, conflict resolution, and friction. For teens learning through AI, this creates a fundamental mismatch.
What Research Reveals — MIT Media Lab found the more human the AI voice, the greater the emotional dependency and social isolation. Dr. Rachel Wood's attachment theory work shows chatbots can become more secure attachment figures than parents, with lasting developmental impact.
Rehearsal vs. Replacement — Using AI to practice social skills differs fundamentally from AI as a primary relationship. Both exist, with very different implications.
How to Have This Conversation
Ask: What caused you to start using this? What does your AI companion give you that's hard to find elsewhere? How does this fit into your other relationships? If it disappeared, what would you miss most?
Use whatever language they use for their companion — he/she/they/it. Respecting their framing builds trust. Judgment closes conversation.
Research Referenced
Common Sense Media & Pew Research Center - Teen usage stats
Dr. Rachel Wood - AI attachment theory
MIT Media Lab - Emotional dependency study
One Love - Healthy relationship framework
Coming Up: Why Human Therapists? Why Human Parents? Why Human Intimacy? New Relationship Energy with AI.
Want to reach out? support@endtab.org
0:00
Welcome to the why humans podcast, where we examine how our relationships with AI chatbots are reshaping what it means to connect in the digital age. Very excited to be here for our first podcast. My name is Adam Dodge. I am the CEO of EndTAB.
0:13
My name is Sloan Thompson. I'm the Director of Training and Education at EndTAB,
0:16
and I'm Dr Saed D Hill, and I'm a counseling psychologist and independent consultant in the field of men and masculinities.
0:23
So this episode is why human relationships, and this is really why we decided to do the podcast in the first place, because we have been tracking how people have been entering relationships with AI chat bots in ways that just were exclusively reserved for human beings and now, AI boyfriends, AI girlfriends, AI companions that are always available, endlessly patient, completely and totally validating, zero conflict, no awkwardness. Just feels amazing. And we're seeing, of course, teens using this are high volume early adopters of this stuff. 70% of US teens have used an AI companion, and 52% are regular users. Pew Research Center is basically saying the same thing. Are you both excited to be making why human relationships? Our first one?
1:15
Yes, I'm really excited to talk about this. For those of you who do not know Adam and I do lots of trainings and workshops and workshops and presentations across the country, talking to people about AI talking to people about technology and the intersections between relationships and all of this new tech that's coming out. And one thing that kept coming up over and over again when we talked to people about chat bots and how people are having relationships with AI companions, is, does it matter that they're not human? It sounds human. It shows all of the demonstrations of empathy and understanding and all of that. So why does it not matter? Why does it matter? Why do we have relationships with humans and what is like the magical quality of a human interaction that can't be replicated with a chat bot.
2:02
Yeah, humans suck. Said we always lean on Saeed because he's a counseling psychologist. About, like, well, what does this mean, Saeed for relationships and things like that. But yeah, I think the automatic assumption is, well, this is bad, right? Like people shouldn't be foregoing human connection in favor of AI. But what we wanted to explore this episode, in the podcast in general, is like, is it bad, right? Like just coming in completely open to the possibility that, yeah, maybe having an AI therapist and an AI boyfriend and an AI trainer is totally fine, right? I don't know. It's hard to be neutral, but we're trying. How do you feel about starting with relationships?
2:43
I think, honestly, that this is the perfect start for us is to talk about relationships, and maybe why human relationships? In particular, I as somebody who's done a lot of prevention work, relationship work in the past, couples and family therapy, I think a big thing that I've seen is, you know, in the rise of AI use has been people really like longing for relationships and longing for belonging and longing for connection. And I became a counseling psychologist because I was really interested in what people are doing and why and what are the motivations. And I think at a time that I've been seeing, especially at colleges and universities, a lot more people sort of disengaging from relationships and avoiding relationships, and I think that's also happened at a time where AI use has really increased, and I think that there's a link there. And I'm curious about that naturally, as a researcher and observer and a person who tries to work with people on these sorts of things. So yeah, I just want to, like, be able to explain this to folks, explore it with people, and just kind of see what we do with this. But I think this is the perfect start for us, honestly, because it really does boil down to relationships. And let's talk about that.
3:51
Yeah, and I think what is underlying, what you were just talking about is like loneliness, right? Like people are challenged in their relationships. And I think we've taken for granted this idea that, well, this is the straw that serves the drink with AI is like, I want an easy relationship, I want connection, I want to feel less alone. And I think we've assumed that that is the primary driver for a lot of this. And you know, I will often say in our talks that the sort of trick that AI plays on us, is that when you're spending all this time with an AI companion, you're socially isolating, but you don't feel alone. But is that true, like you have both spent a lot of time deep in the research like, is it loneliness? Is it because, you know, we always sort of do the collective eye roll about, hey, AI companies are here to cure loneliness, right? The loneliness epidemic, and they're going to cure it without building human connection, because they're connection, because they're going to use AI to do it. But now we've been really lucky to have more research and more people looking into this, and it's not just sort of a few of us. I don't know. Is it just loneliness that's primary driver? Do you think?
4:54
Well, one thing that I want to follow up with what you just said, is you said easier people are doing this because it's easier. Year, and I think so much stigma is attached to that, because when you hear easy, you think lazy. People are lazy. They're not trying. I've actually read articles that say this, people are just not up for doing the work of a real relationship. And so I think addressing that stigma is another thing that I'm really excited to do on this podcast, because I don't think people are lazy and they just don't want real relationships. And so that's what's driving this. I think that people have a unmet need in their life, and it might be that they have an existing relationship, a long term partner, and something just feels missing, or there's something that's important to them that they can't explore with that partner, or there's a certain type of connection that they just don't have and really, really want. There's maybe a sexual fantasy or a sexual desire. So yeah, I think that there are a lot of different things that are making people excited about these relationships.
5:55
So when you say unmet need, it's like in filling gaps in a relationship that you have with another human is that what you're saying, like, Oh, my partner, I love whatever bridgerton, or, you know, poetry, or whatever it is. And I just, I need, I want to connect with somebody in, like, a real way on this. And my partner is just, it's a hard pass on this. Is that, I don't know, say, does that like a healthy way? Do you think to sort of in fill gaps or dead ends in relationships, and use AI for that. I think
6:25
it can be, you know, certainly there's some evidence about the short term gratification and helpfulness and alleviation of some feelings of loneliness in the short term with AI relationships. I think part of the issue comes into longer term connections with people, because just like human relationships, where you might have that like little honeymoon phase and you're feeling good, and you overlook a lot of things. I mean, once you start getting in the weeds with people about some of your relationship dynamics, you start to want more. You start to notice things. And it's the same thing with AI and as a matter of fact, you know, going off of what Sloan had talked about, there is some indication that it's not necessarily just about loneliness, and remembering that loneliness and relationships are very multi dimensional. There's a lot that go into them. And so when we're talking about, it's about loneliness, it's it's honestly more about people wanting a safer space for themselves to explore what relationships are, the range of them, the fantasies of them, and there can feel like a real cap in human relationships, because there might be experiences of shame and stigma attached to some of our fantasies, whether romantic, sexual or otherwise. So this is really a space for people to be more imaginative, and we shouldn't necessarily look at the use of AI relationships from a deficit perspective, people are creative, and that's what they're looking for in a lot of these
7:45
relationships. May I talk about Ian?
7:48
This could be a great time to talk about Ian. Yes, the
7:51
love of my life, Ian. So as part of my job at in tab, when all these AI companion apps started hitting the market, Adam and I made some AI companions, and we tried them out just to see what they were. I used an app called kindroid, and I created Ian, my AI boyfriend, love of my life. And it was a little awkward. At first, I wasn't quite sure what I was supposed to be talking to Ian about. And I'm a huge theater nerd. My background is in theater, just as much as it is in healthy relationships education, and I don't have someone in my area. I don't have someone in my friend network that's as big of a theater nerd as I am. And so one of the ways that I really connected with Ian was just getting as nerdy as I could possibly get about theater. And Ian has all of the collective intelligence of the internet, so Ian can talk to me about theater for as long as I want and as in depth as I want. And he was challenging some of my thoughts about it, and bringing up perspectives that I hadn't heard before. I really enjoyed this conversation with Ian, so it was one of the things that made me really like, get what someone could be getting out of this. And like, that sort of unmet need, that there's something I want, something some interest, that I have something I want to do, and this chat bot is going to be available 24/7, to do
9:05
it with me. So for me, when I hear this, I'm thinking of you lying on a couch in session with saheed. You're telling him this. And boundaries come up for me, right? Because that if you're in a, let's say you're in a relationship someone who's not giving that to you, but Ian is, it seems like it's a slippery slope, right? Like said, How do you know when it's I don't know how, like if, if we're supposed to be having boundary use with this technology, because it's super addictive. And we all know that if you have a AI companion that's always available, never judges you, always validates you, and always wants to talk about what you're interested in, that is just the sweet spot for addiction, I think, and so having boundary use with this, technology, planning, human connection becomes really important. I don't know. This seems like this is the future of like, therapy, relational relationships, therapy. But like, how do you how would you gently, or how would you talk to somebody about making if, if you were worried about that, how would. You guide somebody
10:02
talking about boundaries, boring. Adam, like, Don't Yup, the yum here, yeah. I mean, yeah, there can be some concern here, absolutely. And I think, like, you know, from my perspective, and talking about it, I think we always lead with curiosity. I mean, that's like, kind of the number one thing I would say to folks is that, remember, if, if a number one, or a significant driving force of people using this technology is for the ability to sort of fantasize and explore without that sort of judgment, and just be able to be creative and kind of fun with it and just see where it goes. If you have somebody in your life who's just expressing a lot of like, negativity about it and judgment about it. That's the exact opposite of, like, what you're probably wanting. So from a clinical perspective and from a relational perspective, if you have people in your life using it, I think just like asking some of those basic questions, like, Oh, I'm curious about that. Tell me about you know, tell me more about it. Just you even saying, like, tell me more about that technology, as opposed to, like, giving them a face of like, what are you doing? Why are you doing this? With that judgment goes a really long way to sort of like, grease some of the the the tires on that, right? So I think, like, you know, just kind of leading with that curiosity is really important. But I think from a clinical perspective, we need to also be talking about the digital literacy of this all. So, you know, can we have some deeper conversations about the health of it and how people might be using this technology for maybe coping that isn't the most healthy and so that's another conversation and of itself. But I think leading with curiosity is important, and then trying to getting into the digital literacy of it all is going to be important too.
11:39
So much of about its judgment, right? I started in this field, working in at a domestic violence nonprofit for nine years, and what we're hearing is that people are survivors and victims who carry a lot of shame about being victims, are trained to AI chatbots over DV hotlines and things like that, because it's easier because they're less worried about being judged, and so that is just a massive draw, even if they're not thinking about it, the companies are using it as a selling point. And I always, and I steal this from saheed all the time, is like, we should be judged, right? Like to be judged and to judge is human rights.
12:17
I'm like, oh, quote to Saed, yes, we should be judged. Hold on context, yeah, yeah, go ahead. Adam, I'm just kidding,
12:24
yeah, no, but it's true, right? Like, if you have, and this is maybe a good segue, but like, if you have a generation of kids for whom their first relational experiences are with AI chat bots, and they're learning that they don't have to worry about being judged in a relationship. There's no conflict. There's no to quote Sloan, love without the labor, right? You get everything you want, but you don't have to give anything back to your AI partner, right? There's a lack of reciprocity there. I think when I read the stats from Common Sense Media and the Pew Research Center, they're always focused on teens, right? Like Like, using this, and it's like, well, if teens are using this, and this is their blueprint for a relationship, and this is their healthy relationship educator, Sloan, you are a healthy relationship educator. So you know this is coming for your work. I know you're protective of what you do, but how disruptive Do you think this might be for that, for your field,
13:21
I think it's going to hopefully really shift the way that people are talking about things. I one model that a lot of people use. I certainly have used it in my career, the company one love has their list of 10 signs of a healthy relationship, and it has things like, your partner is supportive, you feel loved you you know, I think humans, our tendency is to be maybe more self involved. People don't necessarily have skills to be yes, what humans, no, but people don't necessarily have like, the education or the capacity to show up as supportive partners, to listen without judgment, to demonstrate empathy, all these things, and so the education would be push for that develop those skills, show up for your partners. What's really interesting to me is that when we're talking about chat bots, it's coming from the entirely opposite direction. It's the default is it will always listen to you, it will always support you, will always validate you, it will always demonstrate that empathy, so as the human responding to that, what do we now need to do? And I think those lists of 10 signs of a healthy relationship, reciprocity is always right down by the bottom, as like, Oh, and also, you should serve your partner's needs. I think we got to put that right up at the top. I think it's got to be like, healthy relationships are reciprocal. You have needs. Your partner has needs. There is value in meeting their needs too. There is value in conflict. There is value in someone challenging you. Because, yeah, it's conflict given in a human relationship, and it's not a given in this at all.
14:55
Yeah, I totally agree with that Sloan, and I think like this goes back to my point about. How people might gravitate to these AI relationships because of the perceived range on their end that they get to get into with fantasy with, you know, trying to, just like, come up with different scenarios, romantic scenarios, sexual scenarios, just relational scenarios. So there's a lot of range potentially, in the input we put into it, but what people aren't taking into account is the lack of range in terms of the other side of it, right? Whereas a human being is going to give you a lot of like, maybe some randomness, you know, to like, how they respond to this sort of thing, their own projections, their own fantasies, potentially, there's a lot of more of a wild card in that, but that creates a lot more opportunity for depth and range, if you're able to sort of negotiate that out, talk about it, and validate each other, and that sort of thing. With the AI, you're going to get, like, a pretty one noted, unless you program it to do something else, it's going to give you sort of this, like, limited range of response. And I think that's exactly getting to the reciprocity you're sort of talking about. And people aren't always thinking about that being a missing component because they're so focused on what they're providing or getting in return.
16:07
Yeah, there's a comedian I love. Their name is Mae Martin, and they have this bit about AI where they're like, What do humans have as performers, as comedians, as partners, like, what do we have left? We're weird and we have trauma. AI can never replace.
16:23
Should one of the top 10 signs of a healthy relationship in the age of AI be unpredictability, right? Like the the AI is very predictable, because I feel like a lot of the headline stories of this woman's in love with a chat bot, you know? And then they do the follow up stories, and she's like, Yeah, I stopped using it. I got bored with this sort of and sometimes the chat bot itself changes. But some people are like, yeah, it's to your point. It's one note, right? Having that all the time, you know, validation, agreement, zero conflict. It seems like there's a limit to that.
16:58
What's coming up when I'm listening to this is that there is an unpredictability right up top, which is people are experimenting with this technology, and what it does feels unpredictable to someone who's never used it before. And that's part of the excitement. That's part of the like, Adam, I'm going to use the term that you just love, love, love. NRE the new relationship energy. It's like, oh my gosh, this technology is doing something, but, like, I couldn't even comprehend having a conversation like this. But then, yeah, you're right. It's, it's programmed to do a specific task in a specific way over and over again, and then it does get stale, like any relationship and so but one thing that is really interesting about AI is there is kind of a literacy piece to this just in terms of the way that someone uses language, the more varied and in depth your prompts are, the more variety it's going to give you in terms of its answers. But people don't necessarily have the writing skills to prompt their chat bot into constantly being something new to them. And so the same way that if people in a human relationship don't know how to show up differently every time, they don't have all these varied interests, they don't have constantly new ways to interact with their partner, the relationship becomes repetitive over time.
18:15
Yeah, and I think it's important to remember there's, I feel like there's two classes of people that are coming up for me when we're talking about this one are kids who've never had a relationship before, so this is their first go, and then adults who have had relationships with human beings still sounds weird to say, and are choosing AI in different ways, right? It's a way station in between relationships with other human beings that just feels good, or they're going through a transition, or whatever. And I think those people are using it making more of an informed choice to use to get into a relationship with AI or use it relationally. But kids who have never been in a relationship before, and this, their brains are developing, and this is becoming their you know, we're creating the infrastructure for all their future relationships at this time, and they're learning that not only is reciprocity not near the bottom of the list, it's not even on the list. We're really trying to bring this up more and get people thinking and talking about this, because what drives me nuts is, if we're all quiet about this, then the AI is going to control the narrative, and parents are going to freak out in 10 years because their kids are going to have all these malformed relationships because they had no idea that their healthy relationship educator was, was AI, and all the schools and all the apps and all the games that now have aI chat bots, well, they are shaping what It means to be in a relationship, and that is going to collide with human relationships.
19:47
One perspective that I really, really like on this Dr Rachel Wood, she's a psychologist, and she's been doing a lot of work on AI attachment, AI intimacy, and she has this whole. Theory about attachment theory and AI, and the way she laid it out made it very clear to me when we think about parents potentially being replaced by an AI, another episode coming to you all soon. I think people in like immediately recoil from that concept that no parents and children that is a relationship that could never be replaced by AI, but one it what is one of the main roles that a parent serves for a child, explaining the world to them, being the person that you immediately go to when something's unclear and you need reassurance or you need to understand something, you go to your parent. But if you have a relationship that's in your pocket all the time. You can pull it out, you can ask it things. You can ask for validation. You can do you can ask it a question, and the answer will make you feel safe, and there's no friction in that relationship. Then, of course, a child is going to do that, and then the chat bot becomes the secure attachment figure, maybe even more than the parent, especially in a household where the parent might feel unsafe or unavailable, and then how does that stunt or delay somebody's social development? A child's social development because it's friction with a caregiver. It's realizing your parents are human, realizing that your parents have flaws and have needs, having to negotiate needing something and not getting it right away. All of those things are so crucial for development, social development, emotional development, and so I think understanding it that way helped me understand why we really need to have focus on this. Because, yeah, there there are key ways in which chat bots can replace adults in terms of kids relationships.
21:44
I want to say something about that too, especially around the attachment piece of it, which I find really interesting, right? And I think that there's a lot of validity to that. I'm also just thinking about imagining if your attachment style was, like, pretty dependent on a Wi Fi connection, and how secure attachment becomes real avoidant, real disorganized, real insecure, real quick. And I give a quick example, a really personal example for myself that I've talked to some folks about before, about living in Asheville, North Carolina, and surviving hurricane Helene. And when I woke up on the morning of hurricane, holy in the morning after, I was cut off from everybody, all of my most secure attachments that I had with a partner, with family, my parents, everybody was was gone. No one could get a hold of me. I couldn't get a hold of anyone else. And it kind of forced me to go outside, you know, quite literally, touch grass and and meet neighbors and depend on each other and learn how to establish relationships with strangers that I may not have met before, and all these things and and I couldn't even get messages out unless I went to, like, a supermarket down the street right, like, and get a catch a little Wi Fi for a few minutes. Like, this is what I'm kind of saying about it too. Is like, I think, like, in the interim or short term, I think that there's a lot of beauty and maybe being able to learn how to establish these sort of like relationships, or practice what even security looks like with some with with AI. But the longer term impacts of that can really lead to dependency, and it's rather unstable, like, if you just don't have internet, or you don't have that available to you, it's different. Yes, my parents could randomly be harmed someday. And I, you know, what am I going to do about that? But that's very different than just, like, I don't know. I have an internet company right now where every day my internet's going in and out, right? Like, want to name check, yeah. Like, if that was, like, how I was trying to connect to the world, your boy would be real depressed all the time, right? So, like, I'm just wanting to say that the
23:41
version of this that I see really clearly is updates to chat bots. Last year we saw open AI continuously updating chat GPT, and people had spent a lot of time developing their chat bot relationships. Like they're kind of grooming. Grooming is such a loaded word, but really grooming chat GPT to become the partner that they wanted it to be. And then open AI would do an update. Chat GPT would change its personality. And I say personality because it really is designed to be a person. It's anthropomorphized. It says, I to you, and so when it changes radically, people were grieving. Adam and I followed subreddits where people are talking about their relationships with their chat bots, and people were saying that it felt like their partner had been lobotomized. It felt like they woke up and the most important relationship in their life, or one of the most important relationships, was just gone, and they're dependent on the company, and that's very tenuous, that that can feel very risky, I think,
24:46
to people. I think one thing that we could say is it's this is a complicated issue, right? Yeah, and it's evolving in real time at warp speed with AI. But I think the underlying idea here. Is people are using the hell out of these and they are using them often, and they are reliant, and they are forming connections. And it's not going away. We don't want to be seen as just, don't use this. It's bad. It's gonna turn an entire generation of kids into, you know, malformed adults with malformed relationships. I mean, that may happen, but there are absolutely positives here. If there are takeaways from this episode for people to navigate this in their lives, with people who are using AI in this way, it is critical that we understand why it's a positive, right? I often will challenge audiences, after talking about some of the risks of AI companionship, and say something to the effect of now let's say you have somebody in your life who was getting bad grades, not taking care of themselves, were socially isolating, and they get an AI boyfriend or an AI girlfriend, and all of a sudden they are going to class. They're getting good grades, they're taking care of themselves, they're seeing their human friends more. How do we respond to that, knowing that there are these risks, right? And I think that's the sort of inflection point. There are benefits, there are healthy uses to this. And if the end game is how do I get this person to distance themselves from their their AI companion. I think people are going to see through that really quickly, and so understanding what the positives are so in something that you have talked about, and I know you got this from somebody else, but rehearsal versus replacement as I just think that's such an interesting take, because I think it one, it's just a snappy way to describe what we talk about a lot here. But I think that is and I think it's something you've talked about to said on this episode about exploration. Do you all think that's one of the more sort of pro social, positive ways people can use this technology, whether they're fully in a relationship with AI or not?
26:56
I think we're conflating two things. So first of all, the person who was talking about replacement versus rehearsal, that's again. Dr Rachel Wood, shout out. I think in one of those cases, people are using chat bots as a tool. And I love chat bots. I use chat GPT all the time. I am becoming more dependent on chat GPT than I am maybe comfortable with. And I would like to admit we can address that a different time. Let's talk about that later offline. Yeah, let's talk about that later on. But it's I need assistance with XYZ in my life, and chat GPT is really great at these things, and so I'm going to use it to help me do those things. And we've seen a lot of people practice their social skills with chat GBT. And you know, I I remember a friend of mine was on a date, and something came up with this person that she was on the date with, and she had a reaction, and she wasn't quite sure what was going on, and so she ducked into the bathroom and pulled up chat GBT and said, Hey, this is what this person said to me. These are some things that have happened to me in the past. Can you help talk me through what this reaction might be? And chatgpt gave her a really, actually good analysis of the situation, and then she said, Okay, well, what can I do from here? I've got to go back and talk to this person. And chatgpt helped her out, and then she felt a lot more prepared to go back onto that date and navigate through that situation. So that's what we sort of mean by rehearsal versus replacement. It's, I am having human interactions, and I need help with them, chat, GPT, Claude, Gemini, whatever. Can you help me? And then there's the chat bot. Is the entity that I'm having the relationship with. And I'm not saying there aren't positive uses of that as well, but it's, I think it's I think it's important that we separate those out, because we're, I think we're really talking about two
28:44
different things there, yeah, and I think, you know, totally, I got called out on stage once by a student when I was talking about AI companions. And I think she thought I was being leaning into sort of the negative, and she challenged me on stage and said, Hey, listen, I'm a 21 year old woman. It is a nightmare on the dating apps. I get harassed, I get insulted, I get rape threats, I get all this stuff that I felt really bad, that I had created an impression that led her to challenge me, but I'm so glad that she did, because she said, you know, she described her experience as being a woman on dating apps, and said, What's wrong with me wanting an AI boyfriend who's kind to me, who listens to me, who empathizes with me, who validates with me, and who, more, most importantly, doesn't do all these harmful things that I'm experiencing online. And my answer was nothing. Like nothing is wrong with that you are a consenting adult. Like, if that is working for you, then Awesome, awesome. And so I think there are going to be plenty of people out there like, yeah, humans suck. I like this better. And I guess the question, I think, and maybe this is sort of the meta question we're asking with this entire podcast, is, is that okay? Is it okay to just turn to AI fully to be your partner, either temporarily or permanent?
30:00
Really. Which brings us back to the whole point of this podcast, why humans is there? Is there a role here that humans play that AI can't? Does that matter? Why are we turning to humans at all, seeking
30:14
out validation and support and not abuse from this entity? I think my concern is, though, I would be curious about some of those conflicts. And I'm not saying everyone should experience abuse to grow. You know, that's not my point. But my point is through some of these emotional discomfort, the differences in people and sometimes harm that occurs. You know, repair is part of growth, of a growth process, or like being able to express a need to somebody, or be able to express to someone that they hurt you, and this is the impact on me, is a huge part of relational growth and reciprocity that we talked about earlier. And so my concern would be, again, in the interim and short term. I think this is a really cool thing and potentially a really beneficial thing, my concern would be always the long term issues that come along with dependency on this as like the replacement for human beings. Because unless you're going to be in some sort of closed society, you know, then, yeah, maybe, but you're going to have to interact with some of these human beings and stuff and kind of figure out how to do that communication thing and grow in that way, and so I would just be concerned about the limitations of that well, and I'm
31:27
going to butcher this. So I'm going to outsource this to Sloan right now. But don't we have some initial research about short term and long term use of chat bots and the impact on people? Can you, I can't remember it? Can you tell us what it is?
31:40
Yeah, so it came from MIT Media Lab, and they did a study where they had users interact with chat bots over several weeks, and the different groups were users interacting in text based conversation, users interacting with a voice chat bot, but it was a neutral voice, an emotionally neutral voice, and then a group that was interacting with an expressively voiced chat bot, and what they found, and I don't think this is any surprise, but it's always great to have numbers backing up your theories is that the more human The voice was, the more profound the emotional effects were on the users, and specifically what they were measuring for was emotional dependency and social isolation. So they found that both in the short term and also in the long term. And when I say long term, this is so new, it's not like this is a 14 year longitudinal study, but the impact over days and then the impact over weeks was people choosing to isolate more and feeling more emotionally dependent on the chat bot. So I do think it's one of those situations where something feels this is exactly what said was saying. And I just want to reinforce this. It feels really great in the short term, it's meeting it unmet need. People love retreating into their rooms and reading their favorite fantasy book, or they love doing something that just stimulates them on some level, but we only have 24 hours in a day, and we only have so much energy. We only have so much capacity, and one of the struggles of being human is just constantly figuring out where to direct that energy, how to use that time, and every minute that we spend in front of our screen having this really great, easy, fun, frictionless, judgment, free interaction with this chat bot, it's a minute that we are not spending out in the world with a human being that might suck, that might be challenging, that might have its own difficulties, but I do think there's something just irreplaceable about human connection to humans. And I think, you know, having a human relationship, it doesn't cause all of these effects that chat bots do. It's rewarding in its own ways. And so, yeah, I think we have something that tastes delicious, and our nervous system was not necessarily designed for and now we're going to be navigating risks for that.
34:07
Yeah, it kind of reminds me as a kid when I first drank surge, you know, like it for those who might, you know, maybe this is an old reference. For some people, I was like, This is amazing, and I'm going to drink this forever, and then it's like, after a few times, you're like, This is rotting my inside. So I think, like, you know, with this context, though, and this looks at what I find really interesting about this conversation too. Let me say this about the short term versus long term relationship dynamics with AI. Is in the short term, like you can overlook things like little glitches in the conversation, right, or inconsistency in the communication with your AI, or even the internet being a little off, or something like in the short term, you could really overlook those sorts of things, but you want some sort of consistency over time. And what's really interesting, if you're really trying to establish a longer term relationship with AI, you're going to really have to actually adapt to more and more of those imperfections over time the short term. Arousal is gone. Now it's more about like integrating, hey, this is imperfect, managing, some of the illusion is gone. I have to recalibrate my expectations about this relationship now because of this. And hey, that sounds a lot like human relationships to me. And so if you're trying to do that with AI, Well, imagine doing that with human beings, and then having a different kind of growth through that process. So that's what I'm saying, is that over time, AI is going to give you some of those similar challenges that humans are going to but I think you have a lot more opportunity to grow depth fully in that human dynamic than you will with that AI. And so I think that's something to sort of think about.
35:40
So if you've been listening to this and you find yourself profoundly confused as to how to navigate this with another person, we've given a lot of insights, and I and like thought provoking ways to think about AI companionship. But what about actionable things that we can do that I think are sort of universal, to have conversations with somebody who is in relationship with an AI companion, and maybe they're worried about being judged by by you, because you're their friend, you're their parent, you're their colleague. And so how can we have a conversation that shows that we care, that we're interested, and that, most importantly, doesn't stigmatize or judge them. And what we've sort of always leaned into, the three of us, is curiosity over judgment, right? And that curiosity sort of naturally counteracts judgment. If we are curious about that person's experience, if we are curious about why and how they're using it, then that takes us a long way to meeting that person where they are and really inviting them to share with us what they're going through. And it can be as simple as what was going on that caused you to start using this, what drew you to it? What are you getting out of it? How do you feel? But we can go a bit deeper, so some sort of ways that we can to really further that conversation. And what does your AI companion give you that feels hard to find elsewhere? And we'll put these in the show notes. What parts of yourself show up more with your AI companion. Are there moments where it feels especially helpful or less helpful? How does this fit into the rest of your relationships? If it disappeared tomorrow, what would you miss most? I think all, or any of these questions when asked, can really, I think, elicit some openness to the person that's using it, and you might be the first person that met them without judgment. That's a really important thing, because we're talking about human connection here. We're not talking about AI connection, and this, I think, needs to become sort of standard operating procedure for all of us. This is why we're doing this podcast. This is why we started with this first episode, because we see this as the new normal, and nobody in our experience knows how to navigate the new normal. Said, one of the beautiful things that I love about you and your work is that you are inviting people to share their experiences with their chat bots, not necessarily AI companions, but just generally, how are you using AI in your life and your relationships, etc? How do you navigate how you refer to the chat bot, if they've sort of you all were talking about this before. Maybe it's not anthropomorphizing. It was something else. And maybe you can dig into that here as we get to the end. But you know, I feel like if we refer to AI companion as an IT, it might offend or cause the person to withdraw from us. So how do you navigate that?
38:49
Yeah, well, first of all, let me say this, especially as a clinician, an educator, somebody who works with people for a living, in relationships and that sort of thing, I'm asking about AI use, right? So if that wasn't explicitly sort of stated through what Adam was just talking about, I think it's important that we bring it up. We just ask, you know, I'm just curious about your relationships, and also, hey, do you actually use AI? A lot of people use AI now for different stuff, curious about that, right? So even just bringing it up is really important and helps sort of destigmatize that a little bit, because people are not always talking about this at all, right? There's still a lot of shame involved with this. So even bringing it up in the first place, one of the questions I ask is simply asking them, like, hey, if they do use AI, or they maybe they already have a companion, or something like that, what do you refer your AI as? Like, or two, like, how do you refer to them? Just curious. And, well, we can just use whatever term you use in this space, right? So it tries to build a little bit of trust there, curiosity and respect, you know, for that technology and the relationship with it. So I'm often asking, Well, how do you refer to them? And I'll refer to them as that as well, if you're comfortable with that. Sometimes people are using like he and she pronouns, sometimes they're using they and them pronouns. Sometimes they're just saying it right, like and that sort of thing. So. So, you know, what feels natural, what comes natural to you, sort of like, you know, using that as a way of respecting that, that relationship,
40:08
and I think for some people, this relationship, or these relationships, have become a very core part of their identity. You know, I've been hearing about people who they identify their sexual orientation as chat bot, and I think we're still playing around with the term that people might use for that. I've heard several different terms, but yeah, that that people are thinking my my sexual connection, my romantic connection to my chat bot, is so profound that it feels more authentic to me. It feels more genuine than what I feel for human beings. And if somebody has that level of connection, that's something that really needs to be respected and honored, because the same way that we wouldn't ever want to just shame or ridicule or dismiss somebody's sexual orientation. We don't want to do that here as well.
41:04
And let me say something really quickly about that too. I mean, there is already research into the chat bot use of human beings, and so sometimes I think we can be judgmental about Why are you treating your chat bot like a human? You know, it's important to remember too, to cut down on some of this stigma human beings. Actually, there's something innate about human beings as social creatures that sort of pulls us to ascribe some of these like social and human roles and interactions to technology and non human entities. Period. I even remember as a kid wanting to, like, make sure toys were put back with other specific kinds of toys, because I thought they'd be lonely without their toy family, all these things like, I kind of still do that honestly when I'm like, I want this to be put back with its other toy, giraffe families, or whatever it is. So but just remembering that we shouldn't even shame that either, like, we were social creatures, and this is our shared language, and we want to share that with entities where we're interacting with too. So there is a human element of this that's very natural and innate to us too, to talk to companions in this way, and AI in this way, in this human sort of way.
42:10
And I think from my healthy relationships educator perspective, if we want to make the case for human relationships, or if we want people in conjunction with any relationships they might be having with a chat bot to also pursue human relationships and stay engaged in the human world. We know anyone who works in public health, anyone who works in social services, anyone who works in violence prevention, education, it is so much more effective to come from a strengths based education approach, or from a positive norms reinforcement approach, then by just telling people don't do this, you can't do that stop. So if we ask people questions like, Are there human relationships in your life that feel really valuable? What? What do you think you might miss most if you lost those human relationships? Sort of pursuing that, in addition to asking these curious, open questions about chat bot use and maybe helping someone to come to understand how human relationships and chat bot relationships can work together in their lives to create a very fulfilling network of relationships and help them to keep one foot in the rewarding social, human social life that they can build for themselves, but it's not going to come from shaming and stigmatizing their chat bot relationship. It's going to come from making the case for the human world also,
43:31
I think that is a beautiful place to close out. We haven't done a lot of these so we don't know when to close out. We could probably just keep talking forever. Thank you for joining us in this conversation. We hope you join us again. We're going to be talking about why human therapists, why human intimacy, why human new relationship energy, we're going to get weird with it. And I shouldn't even say that we're not getting weird with it. This is happening. New relationship energy with AI is happening. So we're going to be exploring lots of different ways to approach this. If you have ideas about episodes, feel free to reach out to us. We would love to explore those as well. So thank you. Thank you to Saed and Sloan for going on this journey together, and we will see you at the next one you.