Wired for Connection: The risks behind relationships with AI companions and chatbots
- Georgina Sturmer

- Oct 22
- 6 min read
This week, I was invited onto ITV News to talk about relationships. Click here to view the full report. The focus was on the rise in AI companions: chatbots created by tech companies, designed to offer friendship, companionship, and relationships to people who might be feeling lonely, isolated, or simply in need of a different kind of interaction. It's a topic that I find frightening and fascinating in equal measure. So, in a world that feels increasingly disconnected, despite us all being more 'connected' than ever, are AI chatbots the answer to what the WHO refers to as a 'loneliness pandemic'?
Wired for connection - why we crave human relationships
We have an instinctive need - a craving - for connection with others. This starts as an innate need for survival, but it continues as a drive for security and comfort. Our interactions with our early caregivers teach us what we need to do to be comforted and cared for, to feel liked and loved, and to soothe ourselves. These interactions build patterns of attachment that create the blueprint for how we form relationships as we go through our lives.
Being 'wired for connection' isn't about being an extrovert. It's simply about that human need to know that we have a 'secure base' and a 'safe haven' in another person. That 'secure base' gives us the sense of confidence that we need in order to explore the world, take risks, and be vulnerable. That 'safe haven' is a place of safety that we can return to for comfort when things get tough.
Technology, emotion, and the rise of AI companions
The rise of technology has brought so much to our lives, in terms of convenience, entertainment, information, and communication. And we have always known that one day this might move into the sphere of relationships. After all, we have grown up with images in popular culture of 'robots' with feelings. For me, it was owning a Tamagotchi pet, and watching 'Johnny 5' from the film 'Short Circuit', back in the 1980s. And these days, it's everywhere. We communicate with bots, apps, and websites in a way that has started to feel real.
Online AI companions really came to my attention a couple of years ago, when I was asked to comment for a Euronews article about people who were experiencing real grief for the loss of AI companions. A year later, I appeared in a documentary called 'Artificial Intimacy' about AI companions. And now, I've spoken about it on ITV News. This is absolutely here to stay. The question is, what do these companions bring to our lives? And - the bigger question - what are the risks, and what do they tell us about our society right now?
Why do people form relationships with AI chatbots?
As I've already mentioned, we all crave connection. And if we are feeling lonely or isolated, these companions might feel like a simple option, because:
AI companions have become incredibly sophisticated
These bots aren't just answering questions or offering basic conversation. They have become so 'clever' and well trained that they are able to simulate much of the warmth and attunement that we seek in human relationships.
They offer predictable, consistent responses
Bots are - at least for now - mostly trained to act as a 'mirror'. This mirroring is something that we naturally seek out in our friendships and relationships, and it helps us to feel safe. In real-life relationships, we don't always receive predictable, consistent responses. So a conversation with an AI chatbot might end up feeling more soothing and calming than the potential vulnerability of a real-life conversation. This sense of safety might feel incredibly appealing, especially if we have experienced rejection or unpredictability in the past, or if we struggle to read social cues.
Isolation
Many of us are feeling more stressed and anxious than ever, and isolation is often a part of that experience. It can make it feel harder for us to build and maintain relationships, and so an AI chatbot might provide us with a sense of companionship that we are missing. There might also be practical reasons why we need to live in an isolated way, for example, if we are clinically vulnerable.
Availability
Most relationships and friendships have boundaries: perhaps the time of day when we are able to speak to each other, or an implicit understanding about what to say or how much to divulge. With an AI chatbot, there are no such boundaries.
Influence of our attachment styles
Our attachment patterns influence how we form relationships in different ways. If our primary attachment style is anxious, then we might be drawn to the availability of a chatbot, to sate our need for connection, without worrying that we are 'too much'.
If our primary attachment style is avoidant, then we might feel soothed by the knowledge that we have control in the relationship, that we can consider sharing our feelings without fear of judgement.
What are the risks of forming a relationship with an AI chatbot?
Blurring the sense of reality
The companies that have created these chatbots are doing an incredible job. They have become so realistic that it's easy to imagine that it will 'feel' like communicating with a human, if it doesn't already. And this sets us up for a fall. These bots don't represent the reality of human experience - they present us with a fantasy. And there's a risk that they will give us a false sense of what real-life communication and relationships will look like. And this in turn might make us withdraw further from the real world.
Lack of mutual attunement
In the short term, these relationships might provide us with the comfort of feeling 'heard' and understood. But in the longer term, that lack of real, mutual attunement might leave us feeling anxious or low, or struggling with our self-esteem. Because the thing about relationships is that they are messy. We have misunderstandings, moments of disconnection, natural ups and downs. And these moments, these cycles that we sometimes describe as 'rupture and repair', ultimately often make our relationships stronger. They also help us to learn and grow. When we have disagreements, conflict, or even just moments of misattunement, it gives us a chance to see our blind spots. To notice when we are being unreasonable, or guarded, or demanding. We won't get that from an algorithm.
Lack of boundaries
An AI companion is available 24/7, and that's part of the appeal. It can be difficult to cope when people aren't available. But some boundaries are important. They help us to learn how to soothe ourselves, to regulate our emotions, to meet our own needs.
The role of co-regulation
Our relationships are physical as well as being emotional. And I'm not necessarily talking about intimacy here. I'm talking about the way that humans can naturally co-regulate each other's nervous systems. We have all experienced this. Anxiety can feel contagious. But if we are feeling anxious, another person's calming presence can help us to soothe ourselves. We literally share our nervous systems with each other when we are in relationship. And regardless of how real a chatbot might seem, this isn't something that we can experience with a machine.
Ethical concerns
For now, the business model for these chatbots seems fairly clear. But there's a risk that this could change. What happens if a platform disappears or goes out of business, leaving users bereft or even grieving? What happens if the business model changes, and AI companions start acting in a demanding, exploitative, or manipulative way?
Tackling loneliness
We do have a real problem with loneliness. And this has ramifications for us on emotional and physical levels. But I'm not sure if AI companions are the answer. In fact, they might offer us a way to avoid thinking about the real issue: how we can take risks to solve the problems of loneliness and disconnection.
Keen to explore more? Book a free introductory chat to explore how counselling can help you to understand your relationships better.
Keen to learn more? If you're a qualified therapist, I teach on a CPCAB Level 5 Diploma in Attachment Based Psychotherapeutic Counselling. Find out more information here.




