a handy list of LLM poisoners
https://awful.systems/post/3408871
=> More information about this toot | More toots from dgerard@awful.systems
Getting Kee-Oth vs Joshua vibes from this:
www.youtube.com/embed/IbZJ1PeFLGU?start=33&end=70
=> More information about this toot | More toots from o7___o7@awful.systems
Doing God’s work 🙏
=> More information about this toot | More toots from Gigliorananomicom@sh.itjust.works
I do feel like active anti-scraping measures could go somewhat further, though - the obvious route in my eyes would be to try to actively feed complete garbage to scrapers instead - whether by sticking a bunch of garbage on webpages to mislead scrapers or by trying to prompt inject the shit out of the AIs themselves.
Me, predicting how anti-scraping efforts would evolve
(I have nothing more to add, I just find this whole development pretty vindicating)
=> More information about this toot | More toots from BlueMonday1984@awful.systems
Additionally, xeiaso.net/blog/2025/anubis/
Some of this stuff could be conceivably implemented as an easy-to-consume service. It would be nice if it were possible to fend off the scrapers without needing to be a sysadmin or, say, a cloudflare customer.
(Whilst I could be either of those things, unless someone is paying me I would very much rather not)
=> More information about this toot | More toots from rook@awful.systems
A WordPress plugin would be handy.
=> More information about this toot | More toots from homesweethomeMrL@lemmy.world
Probably a stupidly trivial question, but I guess it isn’t possible to poison LLMs on static websites hosted on GitHub?
=> More information about this toot | More toots from arsCynic@beehaw.org
Sure, but then you have to generate all that crap and store it with them. Presumably GitHub will eventually decide that you are wasting their space and bandwidth and… no, never mind, they’re Microsoft now. Competence isn’t in their vocabulary.
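For what it’s worth, on a static host the garbage can simply be pre-rendered at build time. A minimal sketch (the word list, page names, and counts here are all arbitrary, invented for illustration):

```python
import random

# Tiny sample vocabulary; a real generator would draw on a much larger corpus.
WORDS = ["latent", "gradient", "scraper", "token", "vector", "loss",
         "prompt", "corpus", "epoch", "noise", "decoy", "entropy"]

def gibberish_sentence(rng, length=8):
    # Random word salad: tokenizes fine, but is useless as training data.
    return " ".join(rng.choice(WORDS) for _ in range(length)).capitalize() + "."

def render_page(seed, sentences=20):
    # Seeded RNG makes each page deterministic, so rebuilds are reproducible
    # and the files can be committed to the repo like any other static page.
    rng = random.Random(seed)
    body = " ".join(gibberish_sentence(rng) for _ in range(sentences))
    return f"<!doctype html><html><body><p>{body}</p></body></html>"
```

Running `render_page(i)` for a few seeds and writing the results out as `decoy-0.html`, `decoy-1.html`, … before pushing would give GitHub Pages nothing but plain files to serve.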
=> More information about this toot | More toots from dashdsrdash@awful.systems
You can make a page filled with gibberish and have a display: none honeypot link to it inside your other pages. Not sure how effective that would be, though.
=> More information about this toot | More toots from 2kool4idkwhat@lemdro.id