Ancestors

Toot

Written by Ethan Blanton on 2025-01-16 at 02:28

So ... is there any reason I shouldn't mark a nest of meaningless pages on my web site as forbidden in robots.txt, link them into my pages in a way that should be invisible to (or obviously ignored by) humans and un-followed by browsers, and then feed requests for some of the deeper pages to fail2ban? It feels like this should allow human traffic, allow well-behaved search crawlers, and blackhole AI crawlers.

=> More information about this toot | More toots from elb@social.sdf.org
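
For concreteness, a minimal sketch of how these pieces could fit together, assuming an nginx or Apache access log in the common combined format. The /trap/ path, the ai-trap filter name, the log path, and the thresholds are all illustrative stand-ins, not anything specified in the thread:

```
# robots.txt -- compliant crawlers are told to stay out of the trap
User-agent: *
Disallow: /trap/

# A link into the trap, hidden from humans, placed somewhere in a page
# template (hypothetical markup):
#   <a href="/trap/deep/a/" style="display:none" aria-hidden="true" tabindex="-1"></a>

# /etc/fail2ban/filter.d/ai-trap.conf -- match requests for the deeper trap pages
[Definition]
failregex = ^<HOST> .+"(GET|HEAD|POST) /trap/deep/\S* HTTP/[0-9.]+"
ignoreregex =

# /etc/fail2ban/jail.local -- no legitimate client ever requests these URLs,
# so a single hit is enough to ban
[ai-trap]
enabled  = true
port     = http,https
filter   = ai-trap
logpath  = /var/log/nginx/access.log
maxretry = 1
bantime  = 86400
```

Setting maxretry to 1 reflects the premise of the scheme: no human and no robots.txt-compliant crawler should ever fetch these URLs, so the first request is already grounds for a ban.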

Descendants

Written by aburka 🫣 on 2025-01-16 at 02:41

@elb clever idea to use fail2ban as an enforcement mechanism for robots.txt!

=> More information about this toot | More toots from aburka@hachyderm.io

Written by Mark Eichin on 2025-01-16 at 21:57

@aburka

Ooh, I've used fail2ban for ssh for something like a decade; it hadn't occurred to me that it could score off of other log files. Neat!

@elb

=> More information about this toot | More toots from eichin@mastodon.mit.edu
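
Indeed, fail2ban filters are just regular expressions applied against whatever logpath a jail names, so a web server access log works the same way as sshd's auth log. A hypothetical dry run of the trap filter sketched above, using the stock fail2ban-regex tool (same illustrative paths as before):

```
# Show which access-log lines the filter would treat as failures
fail2ban-regex /var/log/nginx/access.log /etc/fail2ban/filter.d/ai-trap.conf
```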
