How do you run this locally? What program does one use? I know you can take LLM models and load them into ollama or gpt4all. What about this?
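For reference, the ollama workflow mentioned above usually looks like the sketch below. The model name `llama3` and the file path `./model.gguf` are placeholders, not anything from this post; a locally downloaded model in GGUF format is imported via a Modelfile.

```shell
# Pull and run a model from the ollama registry (only if ollama is installed;
# "llama3" is a hypothetical model name).
if command -v ollama >/dev/null 2>&1; then
  ollama pull llama3
  ollama run llama3 "Hello"
fi

# Import a local GGUF file instead, via a Modelfile
# ("./model.gguf" is a placeholder path):
cat > Modelfile <<'EOF'
FROM ./model.gguf
EOF
# ollama create mymodel -f Modelfile   # commented out: needs the actual model file
# ollama run mymodel
```

Whether a given model can be loaded this way depends on its format; ollama and gpt4all both expect GGUF-style quantized weights.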
=> More toots from PerogiBoi@lemmy.ca