CEO speaks: Mental Health in the Age of Algorithms

Update: 2025-11-12 19:09 GMT

We are living through one of the most paradoxical moments in human history. Artificial Intelligence (AI) has made our workplaces smarter, faster, and more efficient than ever before — yet our minds, ironically, are growing more fatigued, fragmented, and fragile.

AI is being hailed as the next industrial revolution — a transformative force capable of reshaping how we live and learn. But an uncomfortable question lingers: Can AI heal the very minds it is beginning to dominate? Can a machine that understands our data also understand our despair?

Automation was once celebrated as the great liberator — freeing humans from monotonous, repetitive work. Today, it often feels like a new form of captivity. We move seamlessly from screen to screen, from Zoom to spreadsheet to smartphone, trapped in a loop of endless notifications. Work no longer ends; it merely changes devices. The lines between professional and personal life have long blurred into a haze of constant connectivity.

According to the World Health Organization, nearly 15% of working-age adults globally live with a mental disorder. “Burnout” has become an officially recognized occupational phenomenon. In India, according to a UNICEF report, one in seven young people between 15 and 24 years of age experiences depression or anxiety — the very generation entrusted with building the AI future.

The modern workplace has evolved into a paradox of progress — hyper-connected, data-driven, and efficient, yet emotionally depleted. In our obsession with measuring performance, we have forgotten that human beings are not machines. Productivity metrics can track output, but not purpose. Algorithms can predict engagement, but not joy. When “optimization” becomes the primary goal, mental health often becomes collateral damage.

Across industries, employees now face what psychologists call “cognitive overload” — a state where the brain’s natural capacity to focus, reflect, and rest is overwhelmed by constant digital stimulation. Even performance monitoring powered by AI — designed to increase accountability — can morph into a form of silent surveillance, creating an atmosphere of anxiety and self-censorship.

Universities, too, are not immune to this trend. Students increasingly rely on AI tools to assist in research, writing, and problem-solving. While this revolutionizes learning, it also fuels a different kind of anxiety: the fear of obsolescence. They worry that the next version of ChatGPT might render them irrelevant even before they enter the workforce. The very technology meant to empower them sometimes leaves them questioning their worth.

And yet, ironically, AI might also be part of the solution. Around the world, innovators are building AI-powered mental health assistants that can detect early signs of distress by analyzing tone, language, or typing patterns to identify emotional decline. These systems can recommend mindfulness exercises or connect users to human counsellors before a crisis escalates. Start-ups like Woebot and Wysa are pioneering this space, democratizing access to emotional support and making help available to those who might otherwise suffer in silence.

In clinical settings, too, AI is changing psychiatry. Predictive diagnostics, sentiment analysis, and brain-imaging pattern recognition are enabling early detection of disorders and personalized treatment pathways. AI can augment therapists and psychiatrists by giving them better insight into patient progress and emotional fluctuations over time.
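
To make the idea concrete, the sketch below shows, in deliberately simplified form, how a text-based assistant might flag rising distress in what a user writes and suggest escalation to a human counsellor. It is a toy keyword-counting illustration, not the method used by Woebot, Wysa, or any clinical system; the word list, weights, thresholds, and function names are all invented for this example.

```python
# Toy illustration only: a lexicon-based "distress screen" over a journal entry.
# The lexicon, weights, and thresholds are invented for this sketch and bear no
# relation to how any real mental-health tool actually scores text.

from dataclasses import dataclass

# Hypothetical lexicon: words loosely associated with low mood, with toy weights.
DISTRESS_LEXICON = {
    "exhausted": 2, "hopeless": 3, "worthless": 3,
    "anxious": 2, "overwhelmed": 2, "tired": 1, "alone": 2,
}


@dataclass
class ScreenResult:
    score: int        # cumulative toy distress score
    suggestion: str   # next step the assistant would surface


def screen_entry(text: str) -> ScreenResult:
    """Score one journal entry and map the score to a suggested next step."""
    words = (w.strip(".,!?") for w in text.lower().split())
    score = sum(DISTRESS_LEXICON.get(w, 0) for w in words)

    if score >= 6:
        suggestion = "Offer to connect the user with a human counsellor."
    elif score >= 3:
        suggestion = "Suggest a short mindfulness or breathing exercise."
    else:
        suggestion = "No action; keep listening."
    return ScreenResult(score, suggestion)


if __name__ == "__main__":
    entry = "I feel exhausted and overwhelmed, and lately I am anxious all the time."
    result = screen_entry(entry)
    print(result.score, "->", result.suggestion)
```

Even this crude example makes the design point clear: the software only routes and recommends; the actual care still has to come from a person.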

But no matter how advanced, AI remains a mirror — not a mind, at least for now. It can reflect distress but cannot feel it. Algorithms cannot empathize. The healing power of human presence — the reassurance in a counsellor’s tone, the shared, often awkward, silence of understanding — remains irreplaceable. Technology can assist care, but it cannot replace caregivers. The real question is whether we can design human-centred workplaces — environments where technology amplifies, rather than erodes, our capacity for empathy, reflection, and well-being.

At Sister Nivedita University, we have embraced this philosophy. Beginning January 2026, we will launch the Centre for Wellbeing and Emotional Intelligence under our School for Lifelong Learning — an initiative that integrates mental health awareness, emotional literacy, and AI-assisted analytics into both student life and organizational culture. Our approach blends three ideas: embedding well-being into institutional policy, cultivating empathy as a professional competency, and encouraging mindful use of technology through digital detox hours and AI-human balance workshops.

AI can measure engagement, but it cannot ascribe meaning. It can detect fatigue, but it cannot restore purpose. The responsibility to preserve and enrich the human spirit rests not with the machine, but with us — educators, leaders, and organizations that choose, deliberately and consciously, to value people over processes.

If we truly wish to build sustainable organizations, we must redefine “success” itself. Success in the AI era cannot be measured only by efficiency or output, but by how meaningfully humans live and work. That is the vision we uphold — to nurture graduates who are not just skilled but deeply human.

As Swami Vivekananda said, “Education is the manifestation of the perfection already in man.” Perhaps the true frontier of progress lies not in building more intelligent machines, but in awakening more compassionate humans.

The author is the Vice-Chancellor of Sister Nivedita University and Group CEO, Techno India Group. A visionary leader, he is shaping future-ready institutions and inspiring students to lead with purpose.
