misk@sopuli.xyz to Technology@lemmy.world · English · 2 years ago
We have to stop ignoring AI’s hallucination problem (www.theverge.com)
198 comments
Cyberflunk@lemmy.world · edited 1 month ago
deleted by creator
UnsavoryMollusk@lemmy.world · 2 years ago
They are right, though. LLMs, at their core, just determine what is statistically most probable to spit out.
Cyberflunk@lemmy.world · edited 1 month ago
deleted by creator
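A minimal sketch of the point made above: a language model assigns a score (logit) to every candidate token and emits the statistically most likely one; nothing in that loop checks truth, which is why confident-sounding hallucinations fall out naturally. The vocabulary and logit values below are invented for illustration and do not come from any real model.

```python
import math

# Toy vocabulary and the logits a model might assign after the
# prompt "The sky is" -- all values here are made up.
vocab = ["blue", "green", "falling", "the"]
logits = [4.0, 1.5, 0.5, -1.0]

def softmax(xs):
    """Convert raw scores into a probability distribution."""
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax(logits)

# Greedy decoding: emit the single most probable token,
# regardless of whether the resulting sentence is true.
next_token = vocab[max(range(len(vocab)), key=lambda i: probs[i])]
print(next_token)  # -> "blue"
```

Real models sample from this distribution (temperature, top-k, etc.) rather than always taking the argmax, but the objective is the same: probable continuations, not verified facts.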