Just in case it is relevant, I'll explain the setup: it's a vBulletin site, with Apache serving PHP and nginx serving static content. I have this in nginx.conf: limit_zone one $binary_remote_addr 10m; and this in the vhost config of the site in question: limit_conn one 35; I hope that is a reasonable setting; I haven't noticed any issues with regular traffic. But occasionally we get some IPs breaking the zo…
by karabaja - Nginx Mailing List - English
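For context, the two directives quoted above fit together like this. Note that limit_zone is the old (pre-1.1.8) syntax; on current nginx the same thing is spelled limit_conn_zone. A minimal sketch:

```nginx
http {
    # 10 MB shared memory zone named "one", keyed on the client address.
    # (Old syntax; on nginx >= 1.1.8 use:
    #   limit_conn_zone $binary_remote_addr zone=one:10m;)
    limit_zone one $binary_remote_addr 10m;

    server {
        # allow at most 35 simultaneous connections per client IP
        limit_conn one 35;
    }
}
```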
Thanks Antonio. So far we've only had fake Google bots, so it seems fine as it is for now, but I'll apply those if there is any issue with Yahoo or some other spider being faked. I am assuming that other user agents are not affected by the current rule.
Thanks everyone for being so helpful. I've ended up applying Igor's suggestion, but I dropped this line as I wasn't sure what to do with it: "~(?i)(Purebot|Lipperhey|MaMaCaSpEr|libwww-perl|Mail.Ru|gold crawler)" 1; I am guessing it can be used if I want to match more than just Google's user agent. In any case, what I did worked very nicely. I tested it using a Firefox user agen…
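For anyone finding this thread later: the dropped line has the shape of a map entry ("pattern" value;), so it would slot into a map block keyed on the user agent to catch additional bad bots. A sketch, assuming the map is keyed on $http_user_agent and with $bad_bot as a made-up variable name:

```nginx
map $http_user_agent $bad_bot {
    default 0;
    # the dropped entry; (?i) makes the alternation case-insensitive
    "~(?i)(Purebot|Lipperhey|MaMaCaSpEr|libwww-perl|Mail.Ru|gold crawler)" 1;
}

server {
    if ($bad_bot) {
        return 403;
    }
}
```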
Hello everyone. Sorry for double posting this question in the How to section of the forum, but I noticed later that there are a lot of unreplied threads there and that the mailing list is more active, so I am assuming I have a better chance of getting some help here. I am hoping this is possible and I'd really appreciate some help configuring it. We have some bots on our site that are using Google…
Hello everyone. I am hoping this is possible and I'd really appreciate some help configuring it. We have some bots on our site that use the Google spider user agent, but they are fake and their IP range has nothing to do with Google. So I am looking for a solution that would match visitors whose user agent is Google but whose IP doesn't start with, for example, 66.x or 70.x, and block th…
by karabaja - How to...
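Igor's reply isn't quoted in this excerpt, but one common way to express this in nginx is to combine geo (for the trusted address ranges) with map (for the user agent) and block on the combination. A sketch under the assumption that the 66.x/70.x ranges from the post are the ones to trust; the real Googlebot ranges are wider, and Google actually recommends reverse-DNS verification rather than static IP lists:

```nginx
# in the http {} context
geo $trusted_ip {
    default     0;
    66.0.0.0/8  1;   # example ranges from the post, not authoritative
    70.0.0.0/8  1;
}

map $http_user_agent $claims_googlebot {
    default        0;
    "~*Googlebot"  1;
}

# 1 only when the UA claims to be Googlebot but the IP is untrusted
map "$claims_googlebot:$trusted_ip" $fake_googlebot {
    default  0;
    "1:0"    1;
}

server {
    if ($fake_googlebot) {
        return 403;
    }
}
```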