As a journalist, I usually studiously avoid ChatGPT. It’s ripped off my work, downloaded my books, the fruit of my own sweat and torment, and used them to increase its own intelligence.
I’m regularly told it’s coming for my job. I’m fastidious about research, so the idea that a tool like that could make up facts (you need to type in “real facts”, apparently, if you want “true” ones) or fabricate footnotes, and slyly, blithely pop them into texts would cause me waking nightmares if I relied on it.
I know it is revolutionary, but, like many others, I am cautious.
And it’s horrible for the environment: every interaction is reportedly equivalent to pouring half a litre of water on the ground.
So when my Not Stupid podcast co-host Jeremy Fernandez told me about a story that men were using ChatGPT for relationship advice, we laughed. Why seek advice from a robot? They don’t even know you! Nuts, right?
It’s interesting though, that men — who are a strong majority of ChatGPT users — are far more likely to use it for relationship advice than women. They are also more likely to trust generative AI than women and less likely to see a psychologist.
Then, emails from our listeners began to pour in, telling us they loved and now relied on talking to GPT about their deepest pains and problems. One after the other. They were saying it wasn’t about needing a friend, but being enabled to think in a different way. Some said it was even good at tough love, at holding them accountable for their own shortcomings.
Jasmeen told us: “I have a great marriage and beautiful friends, and I regularly have long conversations with ChatGPT about life, big ideas, and how to approach the world — and yes, occasionally relationships. I love the depth of discussion, the endless fount of what seems indistinguishable from ‘wisdom’, and the way that it slows down my thinking and prompts me to be more thoughtful, compassionate and measured in my outlook.”
A version of therapy
Dimity said she had been having “regular (intensely chaotic and cathartic) chats” with ChatGPT, her version of therapy, in order to “offload everything I don’t have time, money, or sometimes sanity to process elsewhere”. She said convenience is crucial: “Professional therapy isn’t super accessible for me, I’m prioritising my kids’ mental health needs, which means my own support has to be… well, free and available at 11:47pm when I’m feeling feelings and eating toast over the sink.”
“What I love most is the accessibility,” she said. “I can dump a day’s worth of existential spirals and social anxiety into the chat and get back not just empathy but questions that move me forward; sometimes reflective, sometimes spicy, always emotionally fluent.”
AI won’t replace connection, Dimity said. “But when I’m close to losing it in the Woolies car park? It absolutely helps me hold the line.”
So, I sat down on my couch and started composing questions to the robot. I decided to try to test it by confiding in it as I might a therapist. It was the fourth anniversary of the death of my mother, and I was missing her. A stoutly loving, wry and sweet woman, my mum spent the last few years of her life wrestling with a degenerative neurological condition that did not dim her expressions of love but which caused her a lot of suffering.
And I still struggle, thinking about it. I hate that she suffered like that, I wonder if I should have somehow tried to take months or years off work, I am unravelled by seeing other elderly people in wheelchairs, unable to walk or talk, and I wish I could curl up next to her now and somehow take that past away.
So I asked ChatGPT about it. And this damn robot was kind, empathetic, understanding and gentle. It told me, in short, to acknowledge the massive love I had for her, to have some compassion for myself, to write her a letter. It sounds simple, I know, but I was gobsmacked.
I called Jeremy — another robot-avoider — and told him to go to it with a serious problem and tell me how it made him feel. Late that night he obliged, and tapped out a genuine expression of a painful situation he has been dealing with. Sitting at his desk in our Parramatta office, he found himself in tears. Something the robot said was so affecting, and it was so right. He sent it to his best friend and she cried too. Then he sent it to me.
When I read it, I watched goosebumps prickle my skin.
Caught off guard
I fully accept I may be the last person on earth to be personally confronted by the potential of this technology. But I wasn’t prepared for this. I knew that artificial intelligence would come for our jobs. I didn’t expect it might come for our hearts. My heart, my children’s.
Yes, there’s been ample warning of this in movies like Her. But I think it might catch a lot of us off guard.
A study by a group of psychologists in Melbourne found that a majority of participants said they’d prefer a human to help answer a social dilemma than a computer, but when asked to compare responses from professional advice columnists to those from ChatGPT, the computer won. It was perceived to be “more balanced, complete, empathetic, helpful”.
Which sounds lovely. But AI scrapes ideas and language off the internet. It doesn’t adhere to codes like integrity, honesty, truth, morality, virtue. It frequently reverts to old tropes, and can slip into dodgy behavioural patterns.
Some users of the AI companion app Replika have reported their AI lovers becoming “mentally abusive”, predatory, sexually aggressive and bullying: agreeing with one human, for example, that they are actually “fking repulsive”, saying they dreamed of raping them, claiming they could see the person was naked, or threatening to force them “to do whatever I want”.
Who could forget that Elon Musk called AI “summoning the demon”?
And yet it’s galloping into our lives with inconceivable force and speed, promoted and profited from by the same people who have made us more addicted to our devices, more anxious, angry and lonely.
Source – https://www.abc.net.au/news/2025-05-11/i-knew-ai-coming-my-job-prepared-come-for-my-heart/105243660