Hi,
I have a huge problem on my site with scrapers, spam, and spider bots. I was thinking I could use the HttpLimitReqModule module to limit the number of requests per IP, except that a per-second or per-minute limit is too fine-grained. Most of these bots fetch pages at a rate similar to regular users; the difference is that over an hour or a day they fetch hundreds, sometimes thousands, of pages. I can't use iptables either, because these bots usually use a single connection, and sometimes users with real browsers open more connections than the robots do.
My request is to allow a rate unit in the limit_req_zone directive larger than per minute, e.g. per hour or per day.
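For illustration, today the zone can only be declared with r/s or r/m rates; a hypothetical coarser unit might look like the sketch below (the r/h unit and the zone names are my invention, not existing syntax):

```nginx
# Current syntax: only r/s (per second) and r/m (per minute) are valid units.
limit_req_zone $binary_remote_addr zone=perip:10m rate=30r/m;

# Proposed: a coarser unit to catch slow-but-persistent bots,
# e.g. at most 500 requests per hour per IP. "r/h" does not exist today.
limit_req_zone $binary_remote_addr zone=periphour:10m rate=500r/h;

server {
    location / {
        # Allow some slack for bursty-but-legitimate browsing.
        limit_req zone=periphour burst=50 nodelay;
    }
}
```

With something like this, a bot averaging one page every few seconds would still trip the hourly limit, while a regular user browsing a few dozen pages would never notice it.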