misk@sopuli.xyz to Technology@lemmy.world · English · 6 months ago
We have to stop ignoring AI’s hallucination problem (www.theverge.com) · 202 comments
Cyberflunk@lemmy.world · 6 months ago
Wtf are you even talking about.
UnsavoryMollusk@lemmy.world · edited 6 months ago
They are right, though. LLMs, at their core, are just determining what is statistically the most probable thing to spit out.
Cyberflunk@lemmy.world · 6 months ago
Your one sentence makes more sense than the slop above.
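(A minimal sketch of the "statistically most probable continuation" idea from the comment above, not anything the commenters wrote: a toy bigram model that counts which word tends to follow which and then greedily emits the most frequent continuation. The corpus and names are made up for illustration; a real LLM uses a neural network over subword tokens, but the core objective of picking by probability rather than truth is the same.)

```python
from collections import Counter, defaultdict

# Toy "language model": count which word follows which in a tiny corpus,
# then always emit the statistically most likely next word (greedy decoding).
corpus = (
    "the model predicts the next word "
    "the model predicts the most probable word "
    "the most probable word is not always the true word"
).split()

# Bigram counts: how often does `nxt` follow `prev`?
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def most_probable_next(word: str) -> str:
    """Return the statistically most frequent continuation of `word`."""
    candidates = follows.get(word)
    if not candidates:
        return "<end>"
    return candidates.most_common(1)[0][0]

# Generate a short continuation greedily from a seed word.
word = "the"
output = [word]
for _ in range(8):
    word = most_probable_next(word)
    if word == "<end>":
        break
    output.append(word)

print(" ".join(output))
# Likely output: "the model predicts the model predicts the model predicts"
# -- fluent-looking and statistically grounded in the corpus, but it tracks
# frequency, not truth, which is the gist of the hallucination complaint.
```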