Been thinking about the movie Minority Report a lot in the context of AI engines hallucinating. I read somewhere that OpenAI had started building a second AI engine into their AI engine to try to deal with hallucination. So maybe the future of AI is the 'Minority Report solution': have 3 AI engines answer and have them vote if they disagree. Of course this does raise the question of power usage requirements... #ai
=> More information about this toot | More toots from Cfkschaller@fosstodon.org
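The three-engines-vote idea above could be sketched as a simple majority vote over independent model answers. A minimal sketch; the `majority_vote` helper and the string answers are illustrative assumptions, not anything OpenAI has described:

```python
from collections import Counter

def majority_vote(answers):
    """Return the answer given by a majority of engines,
    or None if no majority exists (the 'minority report' case)."""
    counts = Counter(answers)
    answer, votes = counts.most_common(1)[0]
    return answer if votes > len(answers) // 2 else None

# Three hypothetical engine outputs to the same prompt:
print(majority_vote(["Paris", "Paris", "Lyon"]))    # two of three agree
print(majority_vote(["Paris", "Lyon", "Berlin"]))   # all three disagree
```

Note that the disagreement case still needs a fallback (re-ask, escalate to a human, or refuse to answer), and each extra engine multiplies the compute cost, which is the power-usage concern raised in the toot.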
@Cfkschaller Surely adding more AI to your AI will fix the problem with AI. #AI
=> More information about this toot | More toots from serebit@floss.social
@serebit Well, to be fair, that is how we dealt with the unreliability of our own intelligence: by designing processes where things are double-checked by multiple people.
=> More information about this toot | More toots from Cfkschaller@fosstodon.org
@Cfkschaller Sure, but we're not dealing with intelligence here, we're dealing with probabilistic text autocompletion. Adding more models to adjust the output will not address the inherent biases in the training data, nor will it magically solve these models' inability to follow formal logic rules. This is diminishing returns: a doubling or tripling of energy requirements for a slight improvement in output reliability.
=> More information about this toot | More toots from serebit@floss.social