Re: Sharing data when downloading the same object from upstream

August 26, 2013 06:00PM
Try the proxy_cache_lock directive; I think it is what you are looking
for. Don't forget to adjust proxy_cache_lock_timeout for your use case.
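For example, something along these lines (the cache path, zone name, sizes
and upstream name below are only placeholders for your setup):

    proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=my_cache:10m
                     max_size=10g inactive=60m;

    server {
        location / {
            proxy_pass http://backend;
            proxy_cache my_cache;

            # requests for the same cache element wait for the first one to
            # populate the cache (up to the timeout below) instead of all
            # opening their own upstream connections
            proxy_cache_lock on;
            proxy_cache_lock_timeout 5s;
        }
    }
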
On Aug 26, 2013 6:54 PM, "Alex Garzão" <alex.garzao@azion.com> wrote:

> Hello guys,
>
> This is my first post to nginx-devel.
>
> First of all, I would like to congratulate the NGINX developers. NGINX
> is an amazing project :-)
>
> Well, I'm using NGINX as a proxy server with the cache enabled. I noted
> that when two (or more) users try to download the same object in
> parallel and the object isn't in the cache, NGINX downloads it from the
> upstream once per request. In other words, NGINX opens one upstream
> connection per request and writes each copy to its own temp file. This
> works, but in some situations we have seen more than 70 parallel
> downloads of the same object on a single server (in that case, an
> object larger than 200 MB).
>
> If possible, I would like some insight into how I can avoid this
> situation. I checked whether this is just a matter of configuration,
> but I didn't find anything.
>
> IMHO, the best approach is to share the temp file. If possible, I
> would like to know your opinions about this approach.
>
> I looked at the code in ngx_http_upstream.c and ngx_http_proxy_module.c,
> and I'm trying to change the code to share the temp file. I think I
> need to do the following tasks:
>
> 1) Register the downloads currently in progress from upstreams. I can
> probably address this with an rbtree, where each node holds the unique
> object id and a list of downstreams (requests?) waiting for data from
> the temp file (see the rough sketch after item 3 below).
>
> 2) Decouple reading from the upstream from writing to the downstream.
> Today, in the ngx_event_pipe function, NGINX reads from the upstream,
> writes to the temp file, and writes to the downstream. But since I can
> have N downstreams waiting for data from the same upstream, I probably
> need to move the write to the downstream somewhere else. The only way
> I can think of is implementing a polling loop, but I know that is
> wrong because NGINX is event based and polling wastes a lot of CPU.
>
> 3) When I know there is more data in the temp file to be sent, which
> function should I use? ngx_http_output_filter?
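>
> A rough, untested sketch of what I have in mind for (1), (2) and (3),
> reusing ngx_rbtree_t/ngx_queue_t and ngx_http_output_filter(); every
> name below is made up, nothing here is working code:
>
>     #include <ngx_config.h>
>     #include <ngx_core.h>
>     #include <ngx_http.h>
>
>     /* (1) one node per object currently being fetched from an upstream;
>        nodes would live in an rbtree keyed by the cache key, e.g. via
>        ngx_str_rbtree_insert_value() / ngx_str_rbtree_lookup() */
>     typedef struct {
>         ngx_str_node_t       sn;        /* rbtree node + key */
>         ngx_file_t          *temp;      /* shared temp file being written */
>         off_t                written;   /* bytes already in the temp file */
>         unsigned             done:1;    /* upstream finished */
>         ngx_queue_t          waiters;   /* downstreams waiting for data */
>     } shared_download_t;
>
>     typedef struct {
>         ngx_queue_t          queue;
>         ngx_http_request_t  *request;
>         off_t                sent;      /* bytes already sent downstream */
>     } shared_waiter_t;
>
>     /* (2) instead of polling, call this every time the upstream reader
>        appends data to the temp file, and walk the waiting downstreams */
>     static void
>     notify_waiters(shared_download_t *d)
>     {
>         ngx_buf_t        *b;
>         ngx_chain_t       out;
>         ngx_queue_t      *q;
>         shared_waiter_t  *w;
>
>         for (q = ngx_queue_head(&d->waiters);
>              q != ngx_queue_sentinel(&d->waiters);
>              q = ngx_queue_next(q))
>         {
>             w = ngx_queue_data(q, shared_waiter_t, queue);
>
>             if (w->sent >= d->written) {
>                 continue;   /* nothing new for this downstream yet */
>             }
>
>             b = ngx_calloc_buf(w->request->pool);
>             if (b == NULL) {
>                 return;
>             }
>
>             /* a file buffer pointing at the bytes this downstream
>                has not seen yet */
>             b->in_file = 1;
>             b->file = d->temp;
>             b->file_pos = w->sent;
>             b->file_last = d->written;
>             b->last_buf = d->done;
>             b->flush = !d->done;
>
>             out.buf = b;
>             out.next = NULL;
>
>             /* (3) push the new bytes through the output filter chain */
>             (void) ngx_http_output_filter(w->request, &out);
>
>             w->sent = d->written;
>         }
>     }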
>
> Suggestions are welcome :-)
>
> Thanks people!
>
> --
> Alex Garzão
> Software Designer
> Azion Technologies
> alex.garzao (at) azion.com
>
> _______________________________________________
> nginx-devel mailing list
> nginx-devel@nginx.org
> http://mailman.nginx.org/mailman/listinfo/nginx-devel
>
_______________________________________________
nginx-devel mailing list
nginx-devel@nginx.org
http://mailman.nginx.org/mailman/listinfo/nginx-devel