Tried out the new and popular “Deepseek” LLM with my standard “tell me facts about the author of PCalc” query. At least half were misleading or straight-up hallucinations. LLMs are not a suitable technology for looking up facts, and anybody who tells you otherwise is… probably trying to sell you an LLM.
=> More information about this toot | More toots from jamesthomson@mastodon.social
I then asked for a list of ten Easter eggs in the app, and every single one was a hallucination, bar the Konami code, which I did actually do.
=> More information about this toot | More toots from jamesthomson@mastodon.social
@jamesthomson I am glad you included the ability to type in the numerical sequence 5318008 and flipping the phone over. So many calculators don't include these necessary features.
=> More information about this toot | More toots from tvwonder@mastodon.social
@tvwonder The thing is, I actually didn’t :)
=> More information about this toot | More toots from jamesthomson@mastodon.social
@jamesthomson it's just funny that the LLM said they were features you added. As if you have to specifically allow people to type in a specific sequence of numbers and flip the phone over.
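(A minimal sketch of why no special support is needed: on a seven-segment display, certain digits resemble letters when the screen is viewed upside down, so "typing 5318008 and flipping the phone" is just a property of the glyphs. The digit-to-letter mapping below is the commonly cited one, not anything from PCalc itself.)

```python
# Upside-down seven-segment reading: flip the display by reversing the
# digit order, then map each digit to the letter it resembles inverted.
# This mapping is the conventional one for this trick, not an app API.
UPSIDE_DOWN = {"0": "O", "1": "I", "3": "E", "5": "S", "8": "B"}

def flipped(digits: str) -> str:
    """Return what a seven-segment display would 'spell' when inverted."""
    return "".join(UPSIDE_DOWN[d] for d in reversed(digits))

print(flipped("5318008"))  # the classic calculator joke
```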
And relevant to your original statement - my daughter is one who has started to use ChatGPT as her primary search engine, no matter what I say about its hallucinations. 🤷🏼‍♂️
=> More information about this toot | More toots from tvwonder@mastodon.social
@tvwonder Yeah, hallucinations in search is just not what the world needs right now!
=> More information about this toot | More toots from jamesthomson@mastodon.social
@jamesthomson exactly
=> More information about this toot | More toots from tvwonder@mastodon.social