The LLM marketing hype campaign has very successfully changed the overall perceived definition of what “AI” is and what “AI” could be.
Arguably it makes actual general AI harder to develop as a concept, because financing and subsidies will likely keep flowing toward LLM projects instead of attempts to emulate general intelligence.
the average person was always an NPC who goes by optics instead of fundamentals
“good people” to them means clean, resourced, wealthy, privileged
“bad people” means poor, distraught, dirty, refugee, etc
so it only makes sense that an algorithm box with the optics of a real voice and proper English grammar and syntax would be perceived as “AI”
That’s very insightful, and you’re right. I assume that an upcoming LLM product with a posh British waifu accent politely telling nerds how special they are would likely make fucking bank and maybe even be seen as the first ascended artificial being.
EDIT: I’m not wild about calling any human being an “NPC” though, just because that dehumanizing shit is a common techbro and chud concept.