Ancestors

Toot

Written by Eniko Fox on 2025-01-28 at 15:27

Anyone happen to know if this is legit? https://github.com/doxdk/deepseek-desktop

I could use this to write emails to bureaucrats who don't deserve my time/energy/anxiety

EDIT: not legit apparently, it's just loading the website :|

=> More information about this toot | More toots from eniko@peoplemaking.games

Descendants

Written by demize on 2025-01-28 at 15:29

@eniko

lol. it's not running it locally, it's just showing you the website

=> More information about this toot | More toots from demize@unstable.systems

Written by damien on 2025-01-28 at 15:30

@demize @eniko this is such a lazy way to do this I stg lmao

=> More information about this toot | More toots from eramdam@erambert.me

Written by damien on 2025-01-28 at 15:29

@eniko this is just a thin wrapper around the web UI for Deepseek. You’re better off opening it in a browser tbh lmao.

If you want to run the model locally you’ll need something like ollama AIUI

=> More information about this toot | More toots from eramdam@erambert.me

Written by Eniko Fox on 2025-01-28 at 15:31

@eramdam blehhhh

=> More information about this toot | More toots from eniko@peoplemaking.games

Written by demize on 2025-01-28 at 15:33

@eniko @eramdam the model is supposed to be available to run locally, I think? but getting it set up is probably much more of a pain than you'd like

=> More information about this toot | More toots from demize@unstable.systems

Written by damien on 2025-01-28 at 15:45

@demize @eniko yeah I did it yesterday to mess around with. Depending on the number of weights it can take quite a bit of storage tho.

For your use case I’d sign up for DeepSeek with a random email address tbh lmao

=> More information about this toot | More toots from eramdam@erambert.me

Written by Eniko Fox on 2025-01-28 at 15:58

@eramdam @demize yeah I looked at the steps to run it locally and I do not have the energy right now to mess with all that

=> More information about this toot | More toots from eniko@peoplemaking.games

Written by Cosmix on 2025-01-28 at 15:33

@eniko don't really want to advocate AI, but I think you can use a thing called Ollama. But I sorta feel those bureaucrats will just use an AI to go through their email and ignore you anyway...

=> More information about this toot | More toots from Cosmix@mastodon.art

Written by Eniko Fox on 2025-01-28 at 15:58

@Cosmix cyprus bureaucrats barely know what the internet is

=> More information about this toot | More toots from eniko@peoplemaking.games

Written by void ✦ on 2025-01-28 at 15:36

@eniko if it's not super specific emails you need to write, then there's https://cannedemails.com

=> More information about this toot | More toots from voidedmain@craftodon.social

Written by Eniko Fox on 2025-01-28 at 15:58

@voidedmain it is super specific, unfortunately

=> More information about this toot | More toots from eniko@peoplemaking.games

Written by Ivor Hewitt on 2025-01-28 at 15:39

@eniko There's a decent guide to getting started running it locally on The Register: https://www.theregister.com/2025/01/26/deepseek_r1_ai_cot/?td=rt-3a

=> More information about this toot | More toots from ivor@ivor.org

Written by Jari Komppa 🇫🇮 on 2025-01-28 at 15:46

@eniko gpt4all will likely get the model sooner or later. Isn't there yet, though.

=> More information about this toot | More toots from sol_hsa@peoplemaking.games

Written by Zoo on 2025-01-28 at 16:11

@eniko whatcha wanna do is download Ollama (it's the framework that handles LLMs for you)

Once you have Ollama set up, you can use it to download the current version of DeepSeek

https://github.com/ollama/ollama

=> More information about this toot | More toots from zoozoo@mastodon.social
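For reference, the Ollama route described above boils down to a few commands. This is a sketch, not an endorsement of any particular model size: the `deepseek-r1:7b` tag is one of the smaller distilled variants Ollama publishes, and exact tag names should be checked against the Ollama model library before running.

```shell
# Install Ollama (Linux one-liner; macOS and Windows have installers at ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a distilled DeepSeek-R1 variant; smaller tags need far less disk and RAM
# than the full model, which is what the storage complaint upthread refers to
ollama pull deepseek-r1:7b

# Chat with the model locally
ollama run deepseek-r1:7b "Draft a short, formal email requesting a status update."
```

The `run` command drops into an interactive session if no prompt is given, so it can also be used as a plain local chat REPL.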

Written by Eniko Fox on 2025-01-28 at 16:16

@zoozoo yeah I looked at all the steps required and it just felt like too much for my current fragile mental state so I was hoping someone had automated all the steps

=> More information about this toot | More toots from eniko@peoplemaking.games

Written by ChiefGyk3D on 2025-01-30 at 17:21

@eniko Use LM Studio or Ollama to run it locally, that's what I do with my LLMs

=> More information about this toot | More toots from chiefgyk3d@social.chiefgyk3d.com

Proxy Information
Original URL
gemini://mastogem.picasoft.net/thread/113906682112842441