Thundering herd

Posted by Shoikan 
August 15, 2012 11:27AM

We are running a server farm that has to serve out many small AND large files (very little in between), that are produced by a single 'source' server. We are running into severe thundering herd scenarios, which we have solved with an ungodly number of squid instances in between. Obviously, we'd like something a bit more intelligent so we are looking at nginx.

I've now got a proof of concept set up on our testbed, but I cannot generate enough load to stress it to the point where we trigger the thundering herd issue, so I cannot see the protection in action. So, I am wondering.... Do I need to turn something specific on (or off) to set up that protection, or, is it set up 'by default'?
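For reference, the stampede protection does not appear to be on by default; it is the proxy_cache_lock directive in the proxy module. A minimal sketch of what enabling it might look like (the upstream name "source_backend", the cache path, and the zone name "my_cache" are illustrative, not from the original setup):

```nginx
# Sketch: serialize concurrent cache misses for the same key.
proxy_cache_path /var/cache/nginx levels=1:2
                 keys_zone=my_cache:10m max_size=10g inactive=60m;

server {
    listen 80;

    location / {
        proxy_pass http://source_backend;   # illustrative upstream name
        proxy_cache my_cache;

        # On a cache miss, only one request per cache key goes upstream;
        # the others wait for the cache entry to be populated.
        proxy_cache_lock on;
        proxy_cache_lock_timeout 5s;
    }
}
```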

Thanks in advance!

Re: Thundering herd
August 20, 2012 10:11AM
OK, I found the 'cache stampede' section and how to set it up: set proxy_cache_use_stale to updating. However, I still have a question about that.
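For reference, the setting in question looks something like this (a minimal sketch; the upstream and zone names are made up):

```nginx
location / {
    proxy_pass http://source_backend;   # illustrative upstream name
    proxy_cache my_cache;               # illustrative zone name

    # Serve the stale cached copy while a single request refreshes it,
    # instead of letting every client hit the source server at once.
    proxy_cache_use_stale updating error timeout;
}
```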

Say you have one single source server and several nginx proxies. In front of the nginx proxies are several Apaches (maybe some day those will also be upgraded to nginx :D). If a new request comes in that would cause a cache stampede, it arrives at all servers, so every proxy will get it from many clients (think >40,000). The new file simply will not exist until it has been generated, and there is no stale content that could be served instead. What does nginx do then? Will it simply put all the incoming requests on hold until either the cache is filled or the request has timed out?
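From reading the proxy module documentation, the cold-miss case (no stale copy yet) seems to be what proxy_cache_lock covers, and the hold time for the waiting requests is bounded by proxy_cache_lock_timeout. A sketch of the relevant knobs (upstream and zone names illustrative):

```nginx
location / {
    proxy_pass http://source_backend;   # illustrative upstream name
    proxy_cache my_cache;               # illustrative zone name

    proxy_cache_lock on;                # one upstream fetch per key on a cold miss
    proxy_cache_lock_timeout 5s;        # waiters are held at most this long; per the
                                        # docs, after the timeout they are passed to
                                        # the upstream (without caching the response)
}
```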