Don’t outsource your feelings to AI — Empathy cannot be automated

As a professor in a university setting, our days do not usually end when the lectures do. That day had a heavy beginning: I spent almost two hours with students, not discussing their projects or grades but simply being the one to listen patiently to the chaos within them. One was dealing with the crushing weight of parental expectations, while another was navigating the loneliness that often engulfs students from small cities as they learn to find a firm footing in the rhythm of ‘Life in a Metro’. I listened, asked questions and tried my best to offer what I usually do: not solutions, but a non-judgemental space where they could open up and share their jumbled feelings.
And just when I thought my cup had almost run dry, leaving me a little weary and blank, a third student knocked and entered with an academic query. He asked, I replied, and then, at the door, he paused, looked at me and asked, “Ma'am, are you okay? You look sad. Would you like to talk?” I didn’t know what to say. I was numb. Because this was the same student who had once sat in that very chair across from me, a few months back, struggling silently, while I tried to convince him that the storm would pass. I had been his sounding board then, as I had been for the other two today. And today, the roles had reversed.
As academicians, we often do more than just take classes or mark assignments. While interacting with students in the classroom, what we actually do is far more layered. We notice changes in students' behaviour: a change in the way they dress, a sudden loss of interest in discussions, an unusual quietness, a quick temper or a forced smile. These are things that often go unnoticed by others and uncovered by textbooks, rare human data that isn’t captured on dashboards or mind maps.
Over the past few months, I have been noticing a rapid shift that genuinely worries me. More and more students are turning to Artificial Intelligence — not just for assignments and projects but for emotional support. They are leaning on AI as a friend and confidante, seeking guidance on dealing with stress, anxiety, relationships, parental expectations and peer pressure. Some of them even shared with me that they use AI to draft replies to their personal chats so that they don’t sound ‘too attached’, ‘too emotional’ or ‘too provocative’. And the global numbers affirm what I see in the classroom. A 2026 Trends Report by the American Psychological Association highlights that synthetic relationships are stepping in to fill the void of loneliness left by fading human connections. Amid what researchers are now calling a loneliness epidemic, the number of AI companion apps designed to simulate trusted relationships has grown by a whopping 700 per cent between 2022 and mid-2025 alone. Doesn’t that ring a bell? No, an alarm?
In India, the stigma around mental health still runs deep. So the appeal of pouring one's heart out to an invisible yet omnipresent AI interface that doesn’t judge, doesn’t interrupt and doesn’t try to ‘fix’ is understandable. I don’t dismiss it, but I do worry about its implications in this already interconnected, disconnected world.
Sociologist George Herbert Mead’s theory of Symbolic Interactionism explains how we as humans create meaning through social interaction and shared symbols, through the push and pull of real conversations. When I listen to a student, I don’t just process what they say. I also read their eyes, their hand movements, their pauses, their reflexes. These readings are backed by the large dataset stored in my memory, perfected by experience and guided by human empathy. I personally believe that emotions must remain raw, not calculated, strategically structured or guided by an AI drawing on random datasets.
Because human beings are not data points. It is the vulnerable, not-so-perfect, raw communication that shapes us as humans. And the empathy, subjectivity and emotional intelligence we add at an individual level truly make communication meaningful, even in this algorithm-driven synthetic world. So while algorithms may be perfected to gauge emotional probability, I think we still need ‘Human Mentors’ to decode the unsaid, the unmeant and the unfelt. When we analyse a situation with our experience and empathy, it's not synthetic; it's real, with all its fallacies and fantasies. And that imperfection is not a flaw, because human emotions leave enough room for the ‘what ifs’, and in matters of the heart, ‘what ifs’ matter more than anything else. Everything in life isn’t black or white; it is in the grey areas that life truly exists.
That student at the door wasn’t guided by any keyword. He asked because he felt. He asked because something in him read something in me that wasn’t said out loud. And I am so glad he asked. I hadn’t even realised what those two hours had done to me, but he saw it. It just reassured me that empathy still cannot be automated.
The writer is Professor and Media Incharge of University School of Mass Communication, GGSIPU; Views presented are personal.