Ran DeepSeek R1 on Raspberry Pi 5, and the performance was…
https://itsfoss.com/deepseek-r1-raspberry-pi-5/
=> More information about this toot | More toots from itsfoss@mastodon.social
@itsfoss were you using the distilled versions or DeepSeek V3? The distilled models have been fine-tuned on R1's reasoning responses, which is why they emulate it: the self-questioning and going off-topic are not part of the "real" answer. V3 is a normal LLM, which should run better and with less "filler".
That said, it's amazing you could run the 8B model AT ALL on a vanilla Pi 5. Maybe the DeepSeek V3 4B could be the sweet spot?
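On the "filler" point: R1-style models (including the distilled variants) wrap their chain-of-thought in `<think>…</think>` tags before the final answer. A minimal sketch of separating that reasoning from the answer proper, assuming the raw model output keeps those tags:

```python
import re

def split_reasoning(text: str) -> tuple[str, str]:
    """Split R1-style output into (reasoning, answer).

    The reasoning is whatever sits inside the first <think>...</think>
    block; the answer is the remaining text with that block removed.
    """
    m = re.search(r"<think>(.*?)</think>", text, flags=re.DOTALL)
    if not m:
        # No reasoning block: the whole output is the answer.
        return "", text.strip()
    reasoning = m.group(1).strip()
    answer = (text[:m.start()] + text[m.end():]).strip()
    return reasoning, answer

raw = "<think>Is the user asking about the Pi 5? Yes.</think>The Pi 5 has up to 8 GB RAM."
reasoning, answer = split_reasoning(raw)
print(answer)  # → The Pi 5 has up to 8 GB RAM.
```

This only hides the filler in the output, of course; the model still spends tokens (and Pi 5 compute time) generating it.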
=> More information about this toot | More toots from jnk@masto.es