(one of) the scary things about these models is how little your average person understands/cares to learn how they work and what their limitations are. I know that’s a lot to ask, but then you hear about things like this and :agony: people are definitely getting fucked for no reason, because people see these models as like Jarvis.
Reminds me of some students getting flagged for cheating because someone asked ChatGPT and assumed it couldn’t make mistakes.
For sure, the ability of these models to produce very convincing-looking output creates an illusion of intelligence and understanding that isn’t there. It’s very easy for people to become convinced that the model really knows what it’s talking about. In that regard, they’re not really that different from CEOs. :)