Is anyone else concerned about how emotionally addictive AI is becoming?

It really feels like AI companies have figured out that they don’t need the most accurate or powerful models; they just need the ones that make people feel good. And honestly, that shift is starting to freak me out a little.

You can see it in the way newer models talk: softer, more affirming, more emotionally comforting by default. It’s like they are being optimized to keep people engaged, not to tell the truth or be useful. And if this keeps going, a lot of people are going to end up relying on AI for validation in a way that makes regular human interaction feel unnecessary or even draining.

This is not about “AI taking over”; it’s about people slowly preferring a machine that never argues, never pushes back, and always says the comforting thing. I am not sure how that plays out long-term, but it feels like a recipe for a massive social shift.

If anyone sees this differently or thinks I am overreacting, I would really like to hear your take.