does one of you around here happen to have an excellent robots.txt that tells google, chatgpt, et al. to fuck off?

23 - script kitty & actual real life vampire
wife: @evie-src
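a minimal sketch of the usual approach: list the AI crawlers by their published user-agent names and disallow everything. the names below are the ones the vendors document, but they drift over time, so treat this as a starting point rather than a complete list.

```
# AI / LLM crawlers -- user-agent names as published by the vendors; verify before relying on them
User-agent: GPTBot
User-agent: ChatGPT-User
User-agent: Google-Extended
User-agent: CCBot
User-agent: ClaudeBot
User-agent: anthropic-ai
User-agent: PerplexityBot
User-agent: Bytespider
Disallow: /

# regular google search crawler, if you want google itself gone too
User-agent: Googlebot
Disallow: /
```

note that Google-Extended only opts you out of gemini / vertex training, not search, which is why the separate Googlebot block is there.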
the best part is that this doesn't stop indexing, since google decided noindex has to be an actual http header rather than a robots.txt rule. but it should stop them from actually fetching the content. ideally. that's what they say, at least.
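for reference, the header in question is X-Robots-Tag. a sketch of how it might look under nginx (the add_header directive and the noindex value are real; the surrounding server block is just illustrative, and apache/caddy have equivalents):

```nginx
server {
    # ... existing listen / server_name / root config ...

    # send the noindex directive as an HTTP response header on every response;
    # google stopped honoring noindex inside robots.txt back in 2019
    add_header X-Robots-Tag "noindex, nofollow" always;
}
```

one catch worth knowing: google only sees that header on pages it is allowed to fetch, so a url that robots.txt blocks outright can still sit in the index as a bare link.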
of course. robots.txt is a suggestion, after all.
thanks for the http header stuff. will def be reviewing that soon.