Limiting parallel requests by source IP address

February 08, 2015 05:41PM
I am using nginx as a reverse proxy in front of a Ruby on Rails application served by unicorn on multiple load-balanced application servers. This configuration allows many HTTP requests to be serviced in parallel. I'll call the total number of requests that can be serviced in parallel 'P'; it is the same as the total number of unicorn processes running across the application servers.
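
For context, the setup looks roughly like this (host names, ports and the upstream name are placeholders, not my real configuration):

    # nginx proxying to several load-balanced application servers,
    # each running a number of unicorn worker processes
    upstream rails_upstream {
        server app1.example.com:8080;
        server app2.example.com:8080;
    }

    server {
        listen 80;
        location / {
            proxy_pass http://rails_upstream;
        }
    }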

I have many users accessing the nginx server and I want to ensure that no single user can consume too much (or all) of the resources. There are existing modules for this type of thing: limit_conn and limit_req. The problem is that, as far as I can tell, limit_req is based on request rate (i.e. requests per second), and limit_conn simply rejects connections above a hard cap. Rate is a less-than-ideal way to limit resources because the rate at which requests are made does not equate to the load a user puts on the system. For example, if the requests being made are simple (and quick to service) then it might be fine for a user to make 20 per second. However, if the requests are complex and take a long time to service then we may not want a user to make more than 1 of these expensive requests per second. So it is impossible to choose a single rate that allows many quick requests but only a few slow ones.
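
For reference, the rate-based approach looks roughly like this (the zone name, size, rate and upstream name are placeholders):

    # limit_req: at most 10 requests per second per client IP;
    # excess requests are rejected with an error
    http {
        limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

        server {
            location / {
                limit_req zone=perip;
                proxy_pass http://rails_upstream;
            }
        }
    }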

Instead of limiting by rate, it would be better to limit the number of *parallel* requests a user can make. So if the whole system can service P parallel requests, we would limit any one user to, say, P/10 requests. From the perspective of any one user, the system then appears to have 1/10th of the capacity it really does. We don't need to limit each user to P/number_of_users because in practice most users are inactive at any point in time. We just need to ensure that no matter how many requests, fast or slow, one user floods the system with, they can't consume all of the resources and impact other users.
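
To make that concrete: limit_conn already counts per-IP concurrency, so a cap of roughly P/10 per address would be written something like the sketch below (the zone and upstream names are placeholders, and 10 stands in for whatever P/10 works out to). What it does not do is queue the excess; it just returns an error:

    # limit_conn: at most 10 requests in flight per client IP;
    # request number 11 from the same address is rejected, not queued
    http {
        limit_conn_zone $binary_remote_addr zone=peraddr:10m;

        server {
            location / {
                limit_conn peraddr 10;
                proxy_pass http://rails_upstream;
            }
        }
    }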

Note that I don't want to return a 503 error to a user who tries to make more than P/10 requests at once. I just want to queue the next request so that it still executes eventually, just more slowly.
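
The closest existing behaviour I can find is the burst parameter of limit_req, which (without nodelay) delays excess requests instead of failing them immediately, but the underlying limit is still a rate rather than a number of parallel requests (again, the numbers and names below are just for illustration):

    # limit_req with a burst queue: excess requests are delayed, not rejected,
    # up to 10 waiting requests; anything beyond that still gets an error
    limit_req_zone $binary_remote_addr zone=perip:10m rate=1r/s;

    location / {
        limit_req zone=perip burst=10;
        proxy_pass http://rails_upstream;
    }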

I can't find any existing module for nginx that does this. Am I missing something?

I am planning to write a module that will let me implement resource limits in this way. But I am curious whether anyone can see a hole in this logic, or an alternative way to achieve the same thing.

Thanks,

Chris.
Subject                                               Author    Posted

Limiting parallel requests by source IP address       ChrisAha  February 08, 2015 05:41PM
Re: Limiting parallel requests by source IP address   B.R.      February 09, 2015 02:44PM
Re: Limiting parallel requests by source IP address   ChrisAha  February 09, 2015 10:23PM


