They claimed to have used a cluster of 2,000+ NVIDIA H800s (roughly $18 million USD in GPUs).
It's not the quality of the model that matters; it's the fact that it's good enough and can run on anyone's consumer hardware. You can pull it from Hugging Face, or via Ollama, GPT4All, etc., and run it on your gaming box. You can't do that with the more expensive OpenAI models.
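For anyone unfamiliar with the workflow being described: a minimal sketch of pulling and running a distilled model locally with Ollama. The model tag here is an assumption for illustration; check the Ollama model library for the exact name and the size that fits your VRAM.

```shell
# Hypothetical example: fetch a quantized distill of the model and chat with it
# locally. "deepseek-r1:7b" is an assumed tag -- substitute whatever your
# hardware can handle (7b fits on most consumer GPUs; larger needs more VRAM).
ollama pull deepseek-r1:7b
ollama run deepseek-r1:7b "Summarize the tradeoffs of running LLMs locally."
```

The same weights can also be downloaded directly from Hugging Face and loaded with other local runners; Ollama just wraps the download, quantization format, and chat loop in one CLI.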
I hate how everyone is saying it's "open source." You'd need all the training data for it to be open source, so that anyone could replicate the model. The model is free to download, but it's nothing but billions of weight matrices that no one can really inspect, understand, or retrain from scratch.
=> More information about this toot | View the thread | More toots from djsumdog@djsumdog.com