In the documentation and in most examples online, everyone is limiting requests to prevent flooding on dynamic pages, video streams, etc.
But when you visit an HTML page, that page also loads a lot of different elements: .css, .js, .png, .ico, .jpg files and so on.
To prevent those elements also being flooded by bots or malicious traffic, I was going to do the following.
#In http block
limit_conn_zone $binary_remote_addr zone=addr1:100m;
limit_req_zone $binary_remote_addr zone=two2:100m rate=100r/s; #style sheets javascript etc
#end http block
#in server location block
location ~* \.(ico|png|jpg|jpeg|gif|swf|css|js)$ {
    limit_conn addr1 10; #Limit open connections from same ip
    limit_req zone=two2 burst=5; #Limit max number of requests from same ip
    expires max;
}
#end server location block
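One detail to be aware of with that setup: without `nodelay`, nginx queues requests above the rate and releases them on schedule, and anything beyond `burst=5` is rejected (503 by default). Since a single page load can fire dozens of asset requests almost simultaneously, a larger burst with `nodelay` is a common variant. This is only a sketch under my assumptions (zone names kept from above, the 10m sizes and burst=50 are placeholders, not tested recommendations):

```nginx
# In the http block: one state entry per client IP;
# 10m is roughly enough for ~160,000 addresses.
limit_conn_zone $binary_remote_addr zone=addr1:10m;
limit_req_zone  $binary_remote_addr zone=two2:10m rate=100r/s;

# In the server block
location ~* \.(ico|png|jpg|jpeg|gif|swf|css|js)$ {
    limit_conn addr1 10;
    # Serve a short spike of asset requests immediately instead of
    # delaying them; anything beyond rate + burst is rejected.
    limit_req zone=two2 burst=50 nodelay;
    expires max;
}
```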
Because on my sites I know that a single HTML page request will never pull in more than 100 of those static elements, I set the limit_req rate to "rate=100r/s;", i.e. 100 requests a second.
Does anyone have any recommended limits for these element types, in case my value is perhaps too high or too low? I set it according to roughly how many media files I know can get requested each time an HTML page gets rendered.
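One way to check the value empirically rather than guessing: have nginx log when the limit trips and watch the error log under real traffic. A sketch of that, reusing the zones above (the 429 status is just my preference here; nginx's default rejection status is 503):

```nginx
location ~* \.(ico|png|jpg|jpeg|gif|swf|css|js)$ {
    limit_conn addr1 10;
    limit_req zone=two2 burst=5;
    # Log rejected requests at "warn" (delayed ones are logged one
    # level lower), so you can see how often the limit actually fires.
    limit_req_log_level warn;
    # Return 429 Too Many Requests instead of the default 503.
    limit_req_status 429;
    expires max;
}
```

If the error log shows legitimate visitors hitting the limit during normal page loads, the rate or burst is too low; if it never fires even under abuse, it is too high.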
http://www.networkflare.com/