
Tuesday, 2 December 2025

People Are Turning to AI for Therapy, Grief & Love – And It’s a BIG Problem

More people than ever are asking AI chatbots to talk them off the ledge, help them say goodbye to loved ones, and find life partners. It feels private, always available, and cheaper than human alternatives. But when the mediator is software optimised for engagement, we hand over the parts of life that define us.

Here’s what’s happening in the world of chatbot therapy, grief companions and dating assistants, including what they promise – and what they’re really doing to those who trust these models in their most fragile moments.

Artificial therapy provides an instant listener with a perfect memory and no judgement. The obvious upside is access: people who would never sit on a waiting list or be able to afford human support can talk immediately. But the risk most people underestimate is false competence. Models can mirror warmth and recall your triggers, but they cannot carry a legal duty of care, clinical judgement, or the moral weight of advice in life-or-death situations.

Using bots as grief tools can offer a digital presence of the dead. Families upload messages, voice notes and photos, and the systems generate a familiar tone that replies on cue. The benefit here is comfort, but the real downside is arrested grief. It’s a farewell that never ends, and the living can get stuck in looped conversations with a simulation that never moves on. 

Dating assistants within AI models are on the rise, promising better profiles, cleaner openers, and round-the-clock coaching. Users grow more confident, with timid people starting conversations, busy people filtering faster through potential partners, and neurodiverse users gaining structure. The risk here should be obvious: outsourcing charm all but guarantees that you end up matched with strangers who expect the scripted version of you when meeting face-to-face.