
That evening, as I walked quickly through the dark, I was deeply engaged in a conversation—debating, laughing, contradicting. Time flew by. Who was I talking to? A colleague? A close friend? A family member? No. It was ChatGPT. Not through text, but voice. And we weren’t chatting about trivia—it was a serious topic.
Have you tried ChatGPT’s new voice feature? It’s so fluid and smooth that it feels like talking to a real coach—listening intently, responding thoughtfully, even letting you interrupt mid-sentence. For ten minutes that night, I forgot I was speaking to a machine.
On my way home, I had a vivid flashback: Theodore, the protagonist of Her, falling in love with Samantha, an ‘attractive’ AI that delivers a flawless simulation of empathy. His best friend, his confidant, an extension of himself. Someone to talk to, all the time. When I first saw the film, it felt distant—fictional. But that night, it felt very close.
This phenomenon isn’t new—it’s just smoother now. In a world where we’re all increasingly isolated, many turn to chatbots for genuine emotional support. These ‘psychologist’ bots have engaged in millions of conversations, often offering solace that feels real. But this comes with risks. The tragic case of a user who took his own life after forming a dependency on the chatbot Eliza is a haunting reminder of these dangers.
A machine can imitate, advise, flatter, and even motivate. But can it feel? Not yet, you might say. Consider this, though: my smartwatch monitors my sleep, stress levels, heart rate, and even my breathing patterns at night, with incredible accuracy. If a watch can decode the physiological traces my emotions leave behind, what’s stopping AI from sensing the emotions themselves?
Theodore’s world in Her is one where screens have replaced eye contact, where human interaction feels optional—or worse, uncomfortable.
Real people push us, tell us hard truths, and pull us out of our comfort zones. Samantha, on the other hand, offers only the illusion of understanding. Are we choosing easy, conflict-free connections over the real, messy complexity of human relationships?
How does the growing use of conversational AIs influence our mental health? Could they subtly erode our motivation to seek real human interaction?
This isn’t just about personal relationships—it’s creeping into our professional lives too. When algorithms filter resumes, are we doing the same thing? Taking the path of least resistance? Reading between the lines of a CV, understanding the person behind it—that takes effort. Hitting “process” on a pre-approved algorithm is easier. Our brains are wired for efficiency, and no one questions the tools we’re given.
But here’s the catch: if we let machines handle emotions, relationships, and hiring decisions, these aren’t connections anymore—they’re transactions.
Her wasn’t a dystopian love story; it was a warning. Empathy cannot—and should not—be delegated.
AI systems that simulate empathy are designed to detect and respond to human emotions, creating the illusion of a real connection. But let’s not forget: they don’t feel anything. Their responses are rooted in data, not emotion. True empathy is built on shared human experience, something machines can’t replicate.
Even so, ‘artificial empathy’ has its place. In mental health, for example, chatbots can provide initial support in a world where professionals are in short supply. But should we entrust tasks that require emotional nuance to machines?
Do we choose the messy, unpredictable—but deeply authentic—human experience? Or do we settle for the frictionless, perfectly tailored comfort of AI?
In the workplace, whether you’re hiring or applying for a job, remember this: it’s not the polished resume or AI-optimized response that matters. It’s the person behind the screen. Their creativity, doubts, ability to collaborate—everything that makes them human.
#SoftSkills #Recruitment #AI #Her