Ancestors

Written by Garrick Aden-Buie on 2024-08-10 at 13:59

About to hop on a plane to head to #PositConf2024 but want a local coding assistant for inflight hacking? With #ollama and #Shiny for #Python you can bring a coding assistant with you!

  1. Download Ollama from https://ollama.com/

  2. Run ollama run llama3.1:8b to get a decent, lightweight ~5GB model

  3. Create a Shiny app from our Ollama chat template https://shiny.posit.co/py/components/display-messages/chat/#ai-quick-start

  4. Set model="llama3.1:8b" in app.py

  5. Run shiny run app.py

  6. Local coding assistant!
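The steps above boil down to sending chat messages to the Ollama server that `ollama run` starts on localhost port 11434. A minimal sketch of that request, using only the Python standard library (the `OLLAMA_URL` constant, `build_chat_payload`, and `ask` helpers are illustrative names, not part of the template; the payload shape follows Ollama's /api/chat endpoint):

```python
# Sketch: talk to a local Ollama server over its HTTP API.
# Assumes an Ollama server is running on the default port (11434).
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # hypothetical constant name


def build_chat_payload(model: str, messages: list, stream: bool = False) -> dict:
    """Build the JSON body Ollama's /api/chat endpoint expects."""
    return {"model": model, "messages": messages, "stream": stream}


def ask(model: str, prompt: str) -> str:
    """Send a single-turn chat request and return the assistant's reply."""
    payload = build_chat_payload(model, [{"role": "user", "content": prompt}])
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]


# Usage (requires the local server):
#   ask("llama3.1:8b", "Write a Python one-liner to reverse a string.")
```

The Shiny chat template wraps this same loop in a UI: it collects the message history, forwards it to the model named in app.py, and streams the reply back into the chat component.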

=> More information about this toot | More toots from grrrck@fosstodon.org

Toot

Written by Garrick Aden-Buie on 2024-08-10 at 14:00

I've also tried deepseek-coder-v2 which is an 8GB download and pretty darn good https://ollama.com/library/deepseek-coder-v2

=> More information about this toot | More toots from grrrck@fosstodon.org

Descendants

Written by Garrick Aden-Buie on 2024-08-10 at 14:10

Quick screen recording of a local chat interface in #Positron with #ollama and #Shiny for #Python

=> View attached media

=> More information about this toot | More toots from grrrck@fosstodon.org

Written by Ben Bolker on 2024-08-10 at 15:05

@grrrck Suggestions for R equivalents? Or does ollama already handle R?

=> More information about this toot | More toots from bbolker@fediscience.org

Written by Jordi Rosell on 2024-08-10 at 20:39

@grrrck could Google Chrome be embedded there?

=> More information about this toot | More toots from jrosell@mastodon.social

Written by Thomas on 2024-08-11 at 17:05

@grrrck how's this impact your battery life? How much RAM are you working with?

=> More information about this toot | More toots from thomasw@toot.bldrweb.org

Written by Garrick Aden-Buie on 2024-08-11 at 21:47

@thomasw Amazingly, it doesn't seem to be a big battery draw at all. I'm on an M1 with 32GB of memory (my work laptop), but it's definitely not using all of the memory either.

=> More information about this toot | More toots from grrrck@fosstodon.org

Written by Thomas on 2024-08-12 at 14:23

@grrrck I was testing out the Alpaca[1] desktop GUI for Ollama a few days ago. On my machine it was a bit slow, but it was still pretty amazing to watch these words appear on my screen "out of thin air," fully severed from the internet.

[1] https://jeffser.com/alpaca/

=> More information about this toot | More toots from thomasw@toot.bldrweb.org

Proxy Information
Original URL
gemini://mastogem.picasoft.net/thread/112938085265993465
Status Code
Success (20)
Meta
text/gemini

This content has been proxied by September (3851b).