Re: is it possible to queue requests one by one

chen cw
October 29, 2012 06:04AM
Use ngx_http_cleanup_add() to add a handler to each request; when a request
is done, this handler will run.
Then, in the handler, take the ngx_http_request_t of a blocked request and run
ngx_http_core_run_phases(r) on it.
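
To make this concrete, here is a minimal sketch of an access-phase handler
built on that idea. It is only an illustration, not an existing module: the
names (my_access_handler, my_request_done, my_queue_head, my_active) are made
up, the counter is kept per worker instead of in shared memory, and the code
that registers the handler in the access phase is omitted.

#include <ngx_config.h>
#include <ngx_core.h>
#include <ngx_http.h>

/* per-worker state: how many requests may run now, plus a queue of
   parked requests; call ngx_queue_init(&my_queue_head) once per worker,
   e.g. in the module's init_process hook */
static ngx_uint_t   my_active;
static ngx_queue_t  my_queue_head;

typedef struct {
    ngx_queue_t          queue;
    ngx_http_request_t  *request;
} my_blocked_node_t;

/* cleanup handler: runs when the request that registered it is done */
static void
my_request_done(void *data)
{
    my_blocked_node_t  *node;

    my_active--;

    if (ngx_queue_empty(&my_queue_head)) {
        return;
    }

    node = ngx_queue_data(ngx_queue_head(&my_queue_head),
                          my_blocked_node_t, queue);
    ngx_queue_remove(&node->queue);

    /* resume the parked request right where it stopped */
    ngx_http_core_run_phases(node->request);
}

/* access-phase handler: lets one request run at a time */
static ngx_int_t
my_access_handler(ngx_http_request_t *r)
{
    my_blocked_node_t   *node;
    ngx_http_cleanup_t  *cln;

    if (my_active > 0) {
        /* park this request; returning NGX_AGAIN keeps it in this
           phase, so the handler runs again when the request is
           resumed with ngx_http_core_run_phases() */
        node = ngx_palloc(r->pool, sizeof(my_blocked_node_t));
        if (node == NULL) {
            return NGX_HTTP_INTERNAL_SERVER_ERROR;
        }

        node->request = r;
        ngx_queue_insert_tail(&my_queue_head, &node->queue);

        return NGX_AGAIN;
    }

    /* allowed to run: arrange to wake the next parked request */
    cln = ngx_http_cleanup_add(r, 0);
    if (cln == NULL) {
        return NGX_HTTP_INTERNAL_SERVER_ERROR;
    }

    cln->handler = my_request_done;
    cln->data = r;

    my_active++;

    return NGX_DECLINED;
}

A real module also has to remove a parked request from the queue if the client
closes the connection before the request is resumed, and has to keep the
counter in shared memory if the limit must apply across all workers.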

This method does not fit one situation: requests that are processed entirely
by nginx itself, e.g. static files. The problem there is stack overflow (or
OOM), because each finishing request ends up resuming the next one deeper on
the same stack. That is why I first used this method in limit_upstream:
https://github.com/cfsego/nginx-limit-upstream
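
One generic way to break that recursion (a common nginx pattern, not
necessarily how limit_upstream actually does it) is to let the cleanup handler
post the blocked request's write event instead of calling
ngx_http_core_run_phases() directly, so the request is resumed from the event
loop on a fresh stack. Reusing the queue, node type and counter from the
sketch above:

/* variant of my_request_done that defers the resume to the event loop */
static void
my_request_done_deferred(void *data)
{
    my_blocked_node_t   *node;
    ngx_http_request_t  *next;

    my_active--;

    if (ngx_queue_empty(&my_queue_head)) {
        return;
    }

    node = ngx_queue_data(ngx_queue_head(&my_queue_head),
                          my_blocked_node_t, queue);
    ngx_queue_remove(&node->queue);
    next = node->request;

    /* when the posted event fires, nginx calls
       next->write_event_handler(next), which re-enters the phase engine
       on a new stack instead of on top of the finishing request */
    next->write_event_handler = ngx_http_core_run_phases;
    ngx_post_event(next->connection->write, &ngx_posted_events);
}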

On Mon, Oct 29, 2012 at 1:22 PM, Jossan Davis <jossandavis@gmail.com> wrote:

> Hi,
>
> thanks for your reply!
>
>
> My current solution is a modified limit_conn module. I add a counter in shared
> memory. When a request has a previous request ahead of it, it is blocked and
> rechecks the counter every 500 ms using an nginx timer.
>
>
> Your solution is better than mine, but I still have a question: *how does a
> finished request notify a blocked request that is in the same
> worker?*
> On 2012/10/29 11:08 AM, "chen cw" <crk_world@yahoo.com.cn> wrote:
>
>> On Fri, Oct 26, 2012 at 2:38 AM, Jossan Davis <jossandavis@gmail.com> wrote:
>>
>>> Hi
>>>
>>> The nginx module limit_req can only limit request frequency by a key (e.g. a
>>> url parameter). It delays a request with "ngx_add_timer(r->connection->write, delay)".
>>>
>>> My requirement is a little different: I need the next request to be
>>> processed immediately after the previous one has finished.
>>> I have tried to use "ngx_http_cleanup_add", but it seems that
>>> ngx_http_request_t is not in shared memory, so I can't use
>>> "ngx_add_timer(r->connection->write, 0)" to notify the next request.
>>>
>>>
>> You can store the counter in shared memory as the limit_req module does, and
>> also keep the blocked requests in a global per-worker queue, as the upstream
>> keepalive module does. Then, in the handler you register with
>> "ngx_http_cleanup_add", you can get the next ngx_http_request_t from that
>> list. I think this method meets your requirement.
>>
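
For contrast, the polling approach described in the quoted message above (a
counter in a shared memory zone plus a 500 ms recheck timer) looks roughly
like the sketch below. my_shm_counter() and the handler names are
hypothetical; the accessor stands in for a counter kept in a shared memory
zone in the spirit of limit_req/limit_conn, and incrementing/decrementing it
as well as registering the handler are omitted.

#include <ngx_config.h>
#include <ngx_core.h>
#include <ngx_http.h>

/* hypothetical: returns how many requests are currently ahead of us,
   read from a shared memory zone visible to all workers */
static ngx_uint_t my_shm_counter(void);

static void
my_recheck_handler(ngx_http_request_t *r)
{
    if (my_shm_counter() > 0) {
        /* still busy: poll the shared counter again in 500 ms */
        ngx_add_timer(r->connection->write, 500);
        return;
    }

    /* the previous request is done: hand control back to the phase engine */
    r->write_event_handler = ngx_http_core_run_phases;
    ngx_http_core_run_phases(r);
}

static ngx_int_t
my_polling_access_handler(ngx_http_request_t *r)
{
    if (my_shm_counter() == 0) {
        return NGX_DECLINED;            /* nothing ahead of us, proceed */
    }

    /* park this request and poll every 500 ms via its write event timer */
    r->write_event_handler = my_recheck_handler;
    ngx_add_timer(r->connection->write, 500);

    return NGX_AGAIN;
}

The cleanup-handler approach avoids the up-to-500 ms extra latency this
polling adds, which is exactly the improvement the thread is after.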



--

Charles Chen

Software Engineer

Server Platforms Team at Taobao.com