Only 0.2% of my website's unique visitors are (likely) human #indieweb
=> More information about this toot | More toots from trobador@mastodon.social
At least the big LLM vendors seem to be respecting my robots.txt now (though OpenAI's scrapers still check it daily). A sample robots.txt follows below.
=> More information about this toot | More toots from trobador@mastodon.social
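A minimal robots.txt along those lines, for reference. The user-agent tokens are the ones OpenAI (GPTBot, ChatGPT-User) and Common Crawl (CCBot) document for their crawlers; which bots to list is a per-site choice.

```
# robots.txt: opt out of known LLM/AI crawlers
User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: CCBot
Disallow: /
```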
Most of the traffic comes from botnets/vulnerability scanners. This is pretty common, but I feel I should do something about it... Maybe I should implement some fail2ban rules on 400s (a sketch follows below).
=> More information about this toot | More toots from trobador@mastodon.social
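A minimal fail2ban sketch of that idea, assuming nginx logging in the default "combined" format. The filter name nginx-4xx and the thresholds are illustrative, not a filter that ships with fail2ban.

```
# /etc/fail2ban/filter.d/nginx-4xx.conf (hypothetical filter)
[Definition]
# Match any request answered with a 4xx status in the
# default nginx "combined" access-log format.
failregex = ^<HOST> -.*"(GET|POST|HEAD)[^"]*" 4\d\d
ignoreregex =
```

```
# /etc/fail2ban/jail.local excerpt enabling the filter above
[nginx-4xx]
enabled = true
port = http,https
filter = nginx-4xx
logpath = /var/log/nginx/access.log
# Ban for an hour after 10 4xx responses within 60 seconds
maxretry = 10
findtime = 60
bantime = 3600
```

Tune maxretry and findtime to your traffic, so a legitimate visitor following a few stale links does not get swept up with the scanners.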
@trobador Yes, fail2ban is your friend. The bandwidth and processing consumed by bad bots and scrapers can add up. Everybody can read robots.txt; crawlers that request paths the Disallow rules forbid get ever-increasing bans. Some bad bots get filtered by their CIDR range.
=> More information about this toot | More toots from omz13@mastodon.social
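A minimal sketch of the escalating-ban and CIDR ideas. The escalation options assume fail2ban 0.11 or later, which added bantime.increment; and since fail2ban bans individual hosts, the CIDR-wide drop is done in the firewall instead, with the documentation prefix 203.0.113.0/24 standing in for a known-bad network.

```
# /etc/fail2ban/jail.local excerpt (fail2ban 0.11+):
# each repeat offence multiplies the previous ban time
[DEFAULT]
bantime.increment = true
bantime.factor = 2
# cap escalated bans at four weeks
bantime.maxtime = 4w
```

```
# Drop a whole network by CIDR with nftables
nft add table inet filter
nft add chain inet filter input '{ type filter hook input priority 0; policy accept; }'
nft add rule inet filter input ip saddr 203.0.113.0/24 drop
```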