Toot

Written by Noam Ross on 2025-01-19 at 15:29

Anyone have a preferred GUI for LLMs that's good for both local (Ollama) and remote APIs? I like @simon's llm for the CLI.

It would be great if there were a GUI front end that used the llm library, so the GUI and the CLI could share logs and templates.

I like LM Studio but it's local-only. Zed and other IDEs support both but are coding-specific UIs.

https://msty.app/ actually does fit the bill. I haven't found it that ergonomic for some reason. Maybe I'll give it another go.
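
For context, a minimal sketch of what that shared-logs wish could look like if a GUI embedded the llm Python library directly. The model name, the logs.db location, and the table/column names here are assumptions for illustration, not anything confirmed in the thread.

```
# Sketch: a GUI reusing the `llm` Python library so history stays shared
# with the CLI. Model name and log schema below are assumptions.
import sqlite3

import llm  # Simon Willison's llm library (pip install llm)

# Run a prompt through whichever model the user picked in the GUI;
# "gpt-4o-mini" is only a placeholder name.
model = llm.get_model("gpt-4o-mini")
response = model.prompt("Summarise this thread in one sentence.")
print(response.text())

# Read back prompt/response history from the SQLite log the CLI keeps,
# assuming the default logs.db under llm's user directory and a
# `responses` table with `prompt` and `response` columns.
logs_db = llm.user_dir() / "logs.db"
with sqlite3.connect(logs_db) as conn:
    rows = conn.execute(
        "SELECT prompt, response FROM responses ORDER BY rowid DESC LIMIT 5"
    )
    for prompt, reply in rows:
        print((prompt or "")[:60], "->", (reply or "")[:60])
```

A GUI built this way would pick up whatever models, keys, and plugins (e.g. llm-ollama for local models) the user already has configured for the CLI, which is the appeal over a standalone app.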

#llm #ollama


Descendants

Written by Josh Maxwell on 2025-01-20 at 12:42

@noamross @simon I’ve been using TypingMind for this for a while now and really like it.

https://www.typingmind.com/

