This page permanently redirects to gemini://bbs.geminispace.org/u/drh3xx/24508.
=> Re: "An LLM for the Geminispace" | In: s/AI
@softwarepagan haven't you heard we NEED AI in everything? Just waiting for the AI trainers with integrated cameras on the toe peg that can text me when my laces have come undone and record a log on a nice web dashboard.
=> drh3xx
Jan 30 · 2 days ago
=> lucss21a · Jan 30 at 13:08:
no. just no. i don't want my little corner of the internet to be crawled by the tentacles of big ai. it's horrid to imagine that. please reconsider your life choices and go back to the world wide web again. please. touching grass is also a better choice. dumbass.
=> lucss21a · Jan 30 at 13:09:
sorry for being rude but please don't. it's a net negative for us here.
=> HanzBrix · Jan 30 at 13:22:
For the ones who don't want to be crawled, scraped and the like, aggressive rate limiting would solve the problem.
=> lucss21a · Jan 30 at 13:25:
i heard some old-web-oriented platforms such as nekoweb and poyoweb include scraper blockers
=> stack · Jan 30 at 14:00:
@vi, totally with you!
We need to start thinking about how to build anti-tank weapons against these things. Pollution is not a bad idea.
=> HanzBrix · Jan 30 at 15:04:
@lucss21a You can set up ufw or iptables to outright ban connection spam, basically killing all scrapers.
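A sketch of what that could look like for a Gemini server (port 1965 is the protocol's default). The rule values here are illustrative assumptions, not a tested recommendation; ufw's built-in `limit` action denies an address that opens six or more connections within thirty seconds, and the iptables version uses the `recent` match to achieve a similar effect:

```shell
# ufw: throttle repeated connections to the Gemini port (1965/tcp).
# "limit" denies an IP that attempts 6+ connections within 30 seconds.
sudo ufw limit 1965/tcp comment 'throttle gemini scrapers'

# Roughly equivalent iptables rules using the "recent" module:
# track each new connection, then drop any source that makes more
# than 10 new connections to port 1965 within 60 seconds.
sudo iptables -A INPUT -p tcp --dport 1965 -m conntrack --ctstate NEW \
    -m recent --name gemini --set
sudo iptables -A INPUT -p tcp --dport 1965 -m conntrack --ctstate NEW \
    -m recent --name gemini --update --seconds 60 --hitcount 10 -j DROP
```

Note that this only slows bulk scraping from a single address; a distributed crawler would need per-network or application-level limits instead.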
=> byte · Jan 30 at 15:44:
why would anyone want this. the purpose of gemini is to stay minimal and clean, not to spread slop generated by a bullshitting machine trained on stolen data. there's no good gen-AI, the whole concept is rotten to the core. ew. yikes, even.
=> argenkiwi [OP] · Jan 30 at 18:36:
Thanks @jdcard. That is exactly the thought experiment I had in mind. I wasn't thinking about filling the Geminispace with bots, but I guess it is a reasonable fear. It seems it will not be easy to prevent that. How do we know one of us is not a bot right now?
But as @CarloMonte says, there is only enough content for an SLM at this stage. Although I'm sure eventually they will develop techniques to learn from a small knowledge base and make an expert SLM for a specific use case.
Now I am thinking of making a parody of the LLM I was imagining when I started this post. It would be a self-hating one. What should I call it? RATMGPT, DeepRATM...
=> s/AI
An LLM for the Geminispace – With the DeepSeek breakthrough reviving the hype around large language models I've just had a thought: what if we trained an LLM on the contents of the Geminispace to measure how much knowledge is added as time progresses? I think it would be an interesting experiment considering the Geminispace is in its early stages. It may give some interesting insights on how knowledge accumulates over time.
=> argenkiwi · 16 comments · Jan 30 · 2 days ago