Kon Wilms
April 22, 2009 08:44PM
On Wed, Apr 22, 2009 at 5:17 PM, davidr <nginx-forum@nginx.us> wrote:
> What's the best way to limit the number of requests an IP can make in a given time period, say 15 minutes? Is there a way to block them at the webserver (nginx) layer and move it away from the application layer, since app-layer blocking incurs too much of a performance hit? I'm looking for something that would simply count the number of requests over a particular time period and add the IP to iptables if it ever crosses the limit.
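For the pure rate-limiting part of the question, nginx can do this itself with the limit_req module (available since 0.7.21), without touching the application layer. A minimal sketch; the zone name, size, and rates below are placeholders you'd tune for your traffic:

```nginx
http {
    # Shared-memory zone keyed by client IP; 10m stores state for
    # tens of thousands of addresses. rate=1r/s caps sustained traffic.
    limit_req_zone $binary_remote_addr zone=perip:10m rate=1r/s;

    server {
        location / {
            # Allow short bursts of up to 20 requests; anything beyond
            # that is rejected (503 in nginx of this vintage).
            limit_req zone=perip burst=20;
        }
    }
}
```

This throttles per-IP inside nginx rather than feeding iptables, so it won't ban anyone permanently, but it stops a scraper from hammering the box.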

You could try fail2ban - it's pretty easy to build rules for it.

The trick is that you don't want fail2ban monitoring your main nginx
access log. Instead, place links to bogus URLs in your HTML pages that
are invisible to a human visitor. When a scraper follows one of those
links, nginx writes an entry to its error log, and your fail2ban
monitor will catch x of those hits within y amount of time and block
the host in iptables.
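To make that concrete, the trap link could be something like `<a href="/trap-xyz.html" style="display:none"></a>` in your pages. A sketch of the matching fail2ban filter and jail follows; all file names, URLs, and thresholds here are illustrative, and the failregex assumes the standard nginx error-log line for a missing file (`open() ... failed ... client: <ip>`):

```ini
; /etc/fail2ban/filter.d/nginx-trap.conf (hypothetical filter name)
[Definition]
; Match an error-log entry for the hidden trap URL and capture the IP.
failregex = \[error\].*open\(\).*trap-xyz\.html.*client: <HOST>
ignoreregex =

; Excerpt for /etc/fail2ban/jail.local
[nginx-trap]
enabled  = true
filter   = nginx-trap
action   = iptables[name=nginx-trap, port=http, protocol=tcp]
logpath  = /var/log/nginx/error.log
maxretry = 3
findtime = 600
bantime  = 900
```

With these numbers, three hits on the trap URL within 10 minutes get the host dropped by iptables for 15 minutes; crank bantime up if you want scrapers gone for longer.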

Subject: Help: How to deal with content scrapers?
Author: davidr, April 22, 2009 08:17PM
