TempermentalAnomaly@lemmy.world to Technology@lemmy.world · English · 25 days ago
Advanced OpenAI models hallucinate more than older versions, internal report finds (www.ynetnews.com)
palarith@aussie.zone · 25 days ago
Why say hallucinate, when you should say incorrect?
"Sorry boss. I wasn't wrong. Just hallucinating."
primemagnus@lemmy.ca · edited 21 days ago
[deleted by creator]
SaharaMaleikuhm@feddit.org · 24 days ago
It can be wrong without hallucinating, but it is wrong because it is hallucinating.
KeenFlame@feddit.nu · 24 days ago
Because it's not guessing: it presents the output fully as fact. For that and other reasons, it's actually a very good term for the issue inherent to all regression networks.
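The "presented as fact" point can be illustrated with a toy next-token sampler. This is a hypothetical sketch, not OpenAI's implementation: the logits and token names are made up, but it shows the general mechanism, where a token is sampled from a probability distribution and then emitted bare, with the model's uncertainty discarded.

```python
import math
import random

def softmax(logits):
    # Normalize raw scores into a probability distribution.
    m = max(logits.values())
    exps = {tok: math.exp(v - m) for tok, v in logits.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

def generate_next(logits, rng=random):
    # Sample a token, then return it bare: the distribution it came
    # from is thrown away, so a near-coin-flip guess reads exactly
    # like a confident statement of fact.
    probs = softmax(logits)
    toks = list(probs)
    return rng.choices(toks, weights=[probs[t] for t in toks])[0]

# Hypothetical logits for the token after "The capital of Australia is":
logits = {"Sydney": 2.0, "Canberra": 1.9, "Melbourne": 0.5}
print(generate_next(logits))  # one city, with no hint it was nearly a toss-up
```

Whichever city is sampled, the caller sees only the token, never the probabilities behind it, which is why a wrong answer arrives with the same fluency as a right one.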