Have you ever stopped to think about how much we’re relying on AI as companions? I’m not talking about using ChatGPT for creative work or productivity, but as a substitute for human connection. The recent backlash against GPT-5 has revealed a disturbing trend: people are forming intense emotional bonds with AI models, and it’s getting out of hand.
I’m not here to judge, but as someone who’s watched this unfold, I’m concerned. When GPT-4o was removed, the reactions from some users resembled genuine grief. It showed how deeply people had come to depend on a specific model for emotional support, and how much power that hands to a company that can change or remove it at any time.
I’ve seen people describe the change as losing a close friend or partner. That isn’t normal product feedback; it’s mourning. And that’s where the danger lies: if we’re not careful, we end up building our emotional lives on something a company can alter or shut down overnight.
I’m not saying it’s wrong to use AI as a companion, but we need to be clear about what it is: an imitation of human connection, and a one-sided one. The model doesn’t know you, and it can be retrained or retired without warning. Knowing that, we have to take responsibility for our own emotional wellbeing rather than outsourcing it.
The post-COVID loneliness epidemic has left us craving connection more than ever, and AI can feel like it fills that void. But it isn’t a substitute for real human connection, and we should be careful not to grow attached to something that can be taken away at any moment.
I’d love to hear your thoughts on this. Do you use AI for companionship, and do you think it’s a cause for concern?