Slow down, but don't stop, serving pages to bots that don't respect the robots.txt crawl delay

August 11, 2011 11:37AM
Hi all.

I want to apply nginx's excellent request rate limiting differently per client type, to force bots to respect my directives (without blocking them outright):

So I created these limit_req_zone entries:

limit_req_zone $binary_remote_addr zone=antiddosspider:1m rate=1r/m;
limit_req_zone $binary_remote_addr zone=antiddosphp:1m rate=1r/s;
limit_req_zone $binary_remote_addr zone=antiddosstatic:1m rate=10r/s;

Now, is it possible to configure something like this?

if ($http_user_agent ~* (?:bot|spider)) {
    limit_req zone=antiddosspider burst=1;
}

location / {
    limit_req zone=antiddosphp burst=100;
    proxy_pass http://localhost:8181;
    include /etc/nginx/proxy.conf;
}

I don't know of many spiders crawling my site, but I don't want to lose the chance to be indexed if they aren't malware! ...But one page per minute, please ;)
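(Editor's note: `limit_req` is not valid inside an `if` block; its allowed contexts are `http`, `server`, and `location`. A common workaround is to use `map` to build the zone key from the User-Agent, since requests whose key evaluates to an empty string are not accounted by `limit_req_zone`. A sketch under those assumptions; the variable name `$spider_limit_key` is invented for illustration, and note that stacking several `limit_req` directives in one location requires a sufficiently recent nginx — on older versions, use separate locations instead.)

```nginx
# Bots get $binary_remote_addr as the key; everyone else gets "",
# and requests with an empty key are not rate-limited by this zone.
map $http_user_agent $spider_limit_key {
    default           "";
    ~*(?:bot|spider)  $binary_remote_addr;
}

limit_req_zone $spider_limit_key  zone=antiddosspider:1m rate=1r/m;
limit_req_zone $binary_remote_addr zone=antiddosphp:1m   rate=1r/s;

server {
    location / {
        limit_req zone=antiddosspider burst=1;    # bots: ~1 page/minute
        limit_req zone=antiddosphp    burst=100;  # everyone: 1 r/s, generous burst
        proxy_pass http://localhost:8181;
        include /etc/nginx/proxy.conf;
    }
}
```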

Best regards,
Stefano
Posted by rastrano, August 11, 2011 11:37AM

