
When We Talk to AI, We're Not Always Speaking from Our Whole Selves

AI tools are everywhere these days. We use them to draft emails, look up recipes, organize trips, or write content. More and more, people are also turning to AI for emotional support—asking it to help them understand a relationship, decode a text message, or make sense of a tough interaction.


But here’s something important to keep in mind: when we talk to AI, we’re not always speaking from our calm, grounded self. And because of this, we aren’t receiving responses that speak to our whole selves.


We all have different parts of us. There’s the part that feels steady and clear, and then there are parts that get scared, angry, protective, or hurt. When we’re upset or “activated,” one of those parts often takes over, and in that state we’re not fully aware of what’s going on inside. Our words may come out sharper, more defensive, or more hopeless than they would if we were speaking from a calmer place.


In human relationships, there’s usually some feedback when this happens. A friend might notice you sound unlike yourself. A loved one might check in: “Are you okay?” A therapist might gently help you pause and reconnect with your more centered self. That kind of attunement helps us recognize when we’re speaking from pain rather than from our whole, integrated self.


AI doesn’t have that context. It doesn’t know your baseline, your history, or your usual way of speaking. It can’t tell if you’re overwhelmed or if your words are being filtered through a wounded part of you. It takes what you say at face value, as if your whole self is speaking—even when only one small but activated part of you is holding the microphone.


This can be tricky. On one hand, AI is “learning” from what people type into it, and much of that input comes from reactive, less conscious places. On the other hand, if you’re upset and turn to AI for comfort, the feedback it gives might unintentionally reinforce that upset part of you instead of helping you reconnect with the calmer, more grounded part of yourself.


What feels validating in the moment for one part might actually be bad advice for your whole self. AI is a novel tool for planning, organizing, researching, and creating. But when you’re in the middle of an emotional spiral, what most of us need isn’t a tool—it’s a human. Someone who can notice your tone, remind you of who you are when you’re steady, and help you come back into connection with yourself.


So if you find yourself reaching for AI when you’re upset, it might help to pause and ask: Which part of me is speaking right now? And is this part looking for information—or is it looking for comfort? AI can’t provide attuned connection because it doesn’t connect. It can’t feel you, and it can’t respond to your energy. AI has many strengths, but the kind of presence and attunement we long for when we’re hurting still lives in human connection. Stay safe out there on ChatGPT and other AI platforms!
