
Limiting number of connections to upstream servers

April 27, 2010 12:06PM
Hello,

Currently we use nginx 0.7 in combination with multiple fastcgi backends that process requests dynamically without caching. We'd like to prevent a DoS attack by limiting the number of connections handled simultaneously by each backend.

I'm wondering what the best way is to do this. I'd love to be able to specify the maximum number of open connections for each upstream server individually; that seems to be the most straightforward solution, though I couldn't find anything in the docs that allows it. There's worker_processes and worker_connections, but those are global to the entire nginx server. Since the server also handles static requests (many more than dynamic fcgi requests), there's little I can do with those.
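[For readers finding this thread later: newer nginx releases do expose exactly this knob, the `max_conns` parameter on an upstream `server` entry (in open-source nginx since 1.11.5; it does not exist in 0.7). A minimal sketch, assuming two hypothetical fastcgi backends on local ports 9000 and 9001:]

```nginx
# Cap simultaneous connections nginx will open to each backend.
# Requests beyond the cap are passed to the next server in the
# group, or fail if no server is available.
upstream fastcgi_backends {
    server 127.0.0.1:9000 max_conns=32;
    server 127.0.0.1:9001 max_conns=32;
}

server {
    listen 80;

    location ~ \.php$ {
        fastcgi_pass fastcgi_backends;
        include fastcgi_params;
    }
}
```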

The other solution I can think of is to have the fcgi backend processes monitor the number of connections they're handling themselves. That has the drawback that every type of backend process must implement it. Also, I imagine one backend could refuse a connection while another still had open slots; nginx itself could balance that better.
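[Another option already available in stock nginx is to limit concurrent connections per client rather than per backend, via the limit_conn machinery; this doesn't cap the backends directly, but it blunts a DoS from any single source. A sketch in the modern syntax (`limit_conn_zone` + `limit_conn`; older releases used `limit_zone` instead), with zone name and limits chosen arbitrarily:]

```nginx
http {
    # Shared-memory zone keyed by client address.
    limit_conn_zone $binary_remote_addr zone=peraddr:10m;

    server {
        location ~ \.php$ {
            # At most 10 simultaneous connections per client IP;
            # excess connections are rejected with a 503.
            limit_conn peraddr 10;
            fastcgi_pass 127.0.0.1:9000;
            include fastcgi_params;
        }
    }
}
```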

How should this best be resolved?
Subject | Author | Posted
Limiting number of connections to upstream servers | brama | April 27, 2010 12:06PM
Re: Limiting number of connections to upstream servers | brama | May 02, 2010 06:12AM
Re: Limiting number of connections to upstream servers | Ryan Malayter | May 04, 2010 12:14AM
Re: Limiting number of connections to upstream servers | theromis1 | May 06, 2010 02:26PM


