To prevent flooding/spam by bots (some bots are brutal when they crawl, hitting every page they can reach within milliseconds), I am going to apply limits to my PHP location block:
# In the http block:
limit_req_zone $binary_remote_addr zone=one:10m rate=1r/s;
limit_conn_zone $binary_remote_addr zone=addr:10m;

# In the server block:
location ~ \.php$ {
    limit_req zone=one burst=5;
    limit_conn addr 10;
}
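One option, if the concern is that a flat per-IP limit will also throttle search engines, is to exempt known crawler IP ranges from the rate-limit key. This is only a sketch: the ranges below are example values (66.249.64.0/19 is commonly cited for Googlebot, but ranges change, so they should be verified, ideally via reverse DNS, before relying on them). When the key evaluates to an empty string, nginx does not count the request against the zone.

```nginx
# Assumed/example crawler ranges -- verify before use.
geo $is_trusted_bot {
    default        0;
    66.249.64.0/19 1;   # example Googlebot range (assumption)
    157.55.39.0/24 1;   # example Bingbot range (assumption)
}

# Empty key => request is not rate-limited by nginx.
map $is_trusted_bot $limit_key {
    0 $binary_remote_addr;
    1 "";
}

limit_req_zone $limit_key zone=one:10m rate=1r/s;
```

The trade-off is maintenance: the list has to be kept current, and matching on IP ranges is safer than matching on User-Agent, which anyone can spoof.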
But will applying these limits to all PHP pages have bad repercussions with Google/Bing/Baidu/Yandex etc.? I don't want them flooding me with requests either, but at the same time I don't want them to start receiving 503 errors.
What's a good setting that won't affect legitimate, decent (I think I just committed a crime calling some of these companies decent?) crawlers like Google, Bing, Baidu, Yandex, etc.?
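Without whitelisting anyone, a gentler configuration is to allow a larger burst so a crawler fetching a handful of pages in quick succession is queued rather than rejected, and to return 429 (Too Many Requests) instead of the default 503 so well-behaved crawlers back off instead of treating the site as down. The numbers below are illustrative starting points, not recommendations:

```nginx
location ~ \.php$ {
    # burst=20: queue up to 20 extra requests instead of rejecting them;
    # nodelay: serve queued requests immediately while still enforcing the rate.
    limit_req zone=one burst=20 nodelay;
    limit_conn addr 10;

    # Reject with 429 rather than 503 (limit_req_status requires nginx 1.3.15+).
    limit_req_status  429;
    limit_conn_status 429;
}
```

Note that 1r/s is quite strict for pages that load assets through PHP; if the limit only needs to stop abusive crawlers, something like 5r/s or 10r/s with a modest burst may be a safer middle ground.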