Pro@programming.dev to Technology@lemmy.world · English · 1 month ago
Anthropic apologizes after one of its expert witnesses cited a fake article hallucinated by Claude in the company's legal battle with music publishers (chatgptiseatingtheworld.com)
tyler@programming.dev · 1 month ago
There are so many services for formatting sources… why use an LLM for this?
mic_check_one_two@lemmy.dbzer0.com · edited · 1 month ago
Because many people still don't understand that AI isn't a reliable source of info. I have seen people use it to fact-check things in arguments before, as if it would actually give them a factual answer.
Kornblumenratte@feddit.org · 1 month ago
This is far worse than not being a reliable source of info. Ms Chen had all the info she needed, and Claude falsified it.