Imagine a furby running a local #LLM with voice input/output.
Enjoy the free nightmare cuz it’s probably coming. 😂
=> More information about this toot | More toots from bouncing@twit.social
@bouncing
Not nearly as much of a nightmare as a non-local LLM. I'd wager that's coming earlier.
=> More information about this toot | More toots from Blort@social.tchncs.de
@Blort Datacenter LLMs too speedy for a children’s toy?
=> More information about this toot | More toots from bouncing@twit.social