Post your dam AI art here and keep it out of the other threads, it's GROSS 🤮

Apparently users are already making these choices for themselves in many cases. In some news-related podcasts I listen to, I’ve heard recently about how many young people are actively engaging with chatbots to combat loneliness or to get advice or feedback (or even “practice”) before interacting with another actual human being. One I heard yesterday mentioned the backlash OpenAI saw after the release of ChatGPT-5, and then I found an article that provided some interesting context:

OpenAI’s launch of its new one-size-fits-all ChatGPT 5 model sparked an immediate user rebellion this week. Longtime users flooded social media with complaints about lost functionality, broken workflows, and even lost emotional connection.

The one “emotional connection” that comes to mind when I interact with bots is the frustration I experience when I talk to a Google Home Mini and Google Assistant doesn’t respond the way I think it should, even when I speak more clearly and rephrase the request. This is not where I look for emotional engagement. AI is not my “bro”.

The Forbes article taught me a new phrase: “parasocial bonds”.

The AI noted that users develop “parasocial bonds” with different model personalities, treating them like “familiar colleagues.”

It also provided some suggestions directed toward “product leaders and marketers” that have real validity even apart from any discussion of AI — things the decision makers at Wyze might do well to consider. It’s a brief article and worth a quick read, I think.
