On gopher to http proxies

I have done some searching on the www search engines, and my (and everyone else's) gopher content is live on the web because robots.txt is not set up correctly on the gopher-to-HTTP proxies. When I got to that last sentence, I realized that I am part of the problem: I had forgotten about the early version of the RPoD proxy living on SDF. This is now fixed with:

```
User-agent: *
Disallow: /
```

So, I want to apologize, and also to ask everyone running a gopher proxy to create a robots.txt file in their www root to keep gopher and the web away from each other.
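If you want to sanity-check that your blanket-disallow policy actually blocks crawlers, Python's standard library ships a robots.txt parser. This is just an illustrative sketch; the path used below is this phlog's own URL, standing in for any path on your proxy:

```python
from urllib import robotparser

# Parse the two-line policy shown above.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# With a blanket Disallow, no well-behaved crawler may fetch any path.
print(rp.can_fetch("*", "/Phlog/gmi/20171204b.gmi"))        # False
print(rp.can_fetch("SomeSearchBot", "/any/other/path"))     # False
```

Note this only models what compliant crawlers will do; a misbehaving bot ignores robots.txt entirely.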

Thanks

Proxy Information
Original URL: gemini://1436.ninja/Phlog/gmi/20171204b.gmi

This content has been proxied by September (ba2dc).