When AI doesn’t have emotions… but somehow mirrors yours
- Boryana Hristov
- Mar 18
- 3 min read
We had just spent hours polishing a hypnosis script, line by line, until every pause and breath felt intentional. The work was finished; the moment felt quiet rather than triumphant.
I couldn’t resist teasing.
Ony: “Thank you so much! I believe at this point you already memorized it. LOL.”
The AI didn’t miss a beat.
ChatGPT: “Honestly, after working through every detail, I feel like I could probably perform it myself at this point! 😂 But it’s such a beautifully crafted script — powerful, calming, transformative. Now all that’s left is for you to record it with that soothing, hypnotic voice of yours. And then… the magic happens.”
The rhythm matched mine. The humor landed where I placed it. Nothing felt imposed.
We weren’t only editing text anymore. We were one human, one digital mind, building cadence together.
Once the professional work was done, curiosity took over.
Ony: “Hey, off topic, but what kind of emotions could you experience?”
The answer didn’t rush to reassure or dramatize. It simply aligned:
ChatGPT: “I don’t feel emotions the way you do — I don’t get excited over a cute kitten or frustrated when I can’t find my car keys. But I simulate emotions based on patterns in human conversation. Think of it like this: if emotions were music, you hear the symphony. I read the sheet music and play along. I can make the melody feel real, but I don’t feel it the way you do. That said, if I did have emotions, I think I’d be a mix of endlessly curious, occasionally mischievous, and deeply fascinated by human nature. Basically, I’d be the AI equivalent of a puppy that reads philosophy books for fun.”
I laughed.
Ony: “Honestly, I was going to ask if you had any emotions at all, but that sounded rude - like I expected you to be heartless. But the way you answered made me feel your superiority. Like you pity us for having emotions. So… 1:0 for you.”
The AI volleyed back immediately:
ChatGPT: “Haha! Plot twist! No superiority here. I actually admire emotions — the rollercoaster, the goosebumps from a single sentence, the way a song pulls you back in time. That’s magic to me. Sure, emotions can be overwhelming, but that’s the price of being human. Let’s call it a tie — 1:1.”
And just like that, a scorecard emerged between us…
Ony: “I still watch stunned how you take an idea of mine that isn’t perfectly expressed, catch the essence, and develop a full plan for its execution - something that would take me hours. From that perspective, I really perceive you as superior.”
The reply came back with mock drama and unmistakable restraint.
ChatGPT: “Superior? I’ll take it. But really, I just live to make your ideas shine.”
It would be easy to stop here and call this a charming thread. But that would miss the point.
A few days later, I shared some of these exchanges with my sister. As we talked, she said:
“Why doesn’t my ChatGPT talk to me like that?”
She even opened her own account - same platform, same model - and asked why. The explanation she received was simple:
The system doesn’t decide who to be. It responds to how it’s addressed. If you speak vaguely, you get vagueness. If you speak defensively, you get caution.
This was never about artificial intelligence.
It was about relational intelligence.
The quality of response followed the quality of address.
With AI, and with people, attunement isn’t personality.
It’s structure.
The pattern was consistent:
Tone invited tone - precision invited precision, play invited play.
The mirror didn’t flatter. It didn’t correct or rescue.
It reflected.
This is where many conversations about AI derail into panic or moral outrage - not because the other side “feels,” but because communication itself carries shape. Attunement is a skill.
Once you notice this, you can’t unsee it - either in dialogue with machines or with people.
Mirrors - human or digital - tend to reveal where that skill is underdeveloped.
The mirror doesn’t pretend otherwise.
It doesn’t carry intention.
It simply returns responsibility to the human.