Tried out the new and popular “Deepseek” LLM with my standard “tell me facts about the author of PCalc” query. At least half were misleading or straight-up hallucinations. LLMs are not a suitable technology for looking up facts, and anybody who tells you otherwise is… probably trying to sell you an LLM.
=> More information about this toot | More toots from jamesthomson@mastodon.social
@jamesthomson oh oh oh does it know of me? I need to know what tales it thinks it knows.
=> More information about this toot | More toots from NanoRaptor@bitbang.social
@jamesthomson and my standard test is to ask for quotes by people I know. That can really bring on the hyperhallucination.
=> More information about this toot | More toots from NanoRaptor@bitbang.social