On 11 Dec 2011 16:01 WET, mdounin@mdounin.ru wrote:
Hello Maxim,
> Hello!
>
> Here is a patch series which addresses several issues in the cache
> (first 3 patches) and adds cache lock support (last 2).
>
> I'm planning to commit the first 3 patches shortly; the cache lock
> patches will remain experimental for a while. Review and testing
> appreciated.
I will. If I understood correctly, the latter two patches implement a
locking mechanism: if /foobar is not cached and, say, 5 requests for
/foobar arrive within the cache lock timeout, only the first request
goes upstream; the other requests are queued and are served up when
the upstream returns a response.
Example:
location /foobar {
    # usual proxy cache stuff...
    proxy_cache_lock on;
    # can be tweaked; the default in the patch seems to be 5s
    proxy_cache_lock_timeout 500ms;
    proxy_pass http://my_upstream;
}
1. Request for /foobar comes in at t=0
2. Additional request for /foobar comes in at t=200ms
3. Additional request for /foobar comes in at t=220ms
Requests 2 and 3 are queued by the cache manager. Request 1 is
forwarded upstream. When a response arrives, requests 2 and 3 are
served.
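
If I got the timeout semantics right, requests 2 and 3 only wait up to
proxy_cache_lock_timeout for request 1 to finish, so with a slow
upstream one would presumably raise it. Something along these lines,
with a purely illustrative value:

location /foobar {
    # usual proxy cache stuff...
    proxy_cache_lock on;
    # illustrative: let waiting requests block for up to 2s
    proxy_cache_lock_timeout 2s;
    proxy_pass http://my_upstream;
}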
For the moment, cache entries that are being updated will continue to
be handled by the existing 'proxy_cache_use_stale updating' setting.
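
Just to check how the two features combine, I assume a setup roughly
like the sketch below is what's intended (directive values are only
illustrative):

location /foobar {
    # usual proxy cache stuff...
    # keep serving the stale entry while it is being refreshed
    proxy_cache_use_stale updating error timeout;
    # coalesce concurrent misses into a single upstream request
    proxy_cache_lock on;
    proxy_cache_lock_timeout 5s;
    proxy_pass http://my_upstream;
}

i.e. concurrent misses are coalesced by the lock, while requests for an
entry that is merely being refreshed get the stale copy.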
Thanks,
--- appa