misk@sopuli.xyz to Technology@lemmy.world · English · 6 months ago
We have to stop ignoring AI’s hallucination problem (www.theverge.com) · 202 comments
UnsavoryMollusk@lemmy.world · English · edited · 6 months ago
They are right, though. LLMs at their core are just determining what is statistically most probable to spit out.
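A toy sketch of the commenter’s point, using made-up probabilities (the token list and its numbers are illustrative, not from any real model): the model scores candidate next tokens and emits the most likely one, optimizing for likelihood rather than truth.

```python
# Hypothetical next-token distribution after the prompt
# "The capital of France is" -- values are invented for illustration.
next_token_probs = {
    "Paris": 0.72,
    "Lyon": 0.05,
    "a": 0.03,
    "the": 0.02,
}

# Greedy decoding: pick the statistically most probable token.
# Nothing here checks facts -- if the training data skewed the
# distribution, the top token would be emitted just as confidently.
most_probable = max(next_token_probs, key=next_token_probs.get)
print(most_probable)  # Paris
```

That likelihood-only objective is one common framing of why hallucinations happen: a fluent but false continuation can still be the highest-probability one.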
Cyberflunk@lemmy.world · English · 6 months ago
Your one sentence makes more sense than the slop above.