I know not many of you care about LLMs/other AI models, but I think this really shows the amount of loneliness in our society. Look at how it presents itself on Google: an AI that feels alive, that's always available, that understands you. People don't use this service to summarize text or get help with their programming homework like they might with ChatGPT. They are selling artificial companionship.
As far as I know, yes. It's pretty much all pretend conversations, not so much useful assistant tasks. And maybe not all of its use is detrimental, but… I just worry about it. I've seen how important it is to some users, and with so many of them, there are bound to be a lot of people who have an unhealthy relationship with these types of things.
Yeah, I've seen how this stuff talks; it's just a "tells you what you want to hear" machine. It would be worrying if someone were using it instead of actually interacting with people.