Re: How to cap server load?

Stefan Caunter
January 05, 2013 06:36PM
There are a number of ways: Varnish does this with its director backend
definitions, gdnsd also lets you use health checks to manage server
pools, and you have mentioned HAProxy. The reason I mentioned the
database at the beginning is that you ultimately have to "protect"
the database with your system, by identifying which requests hurt you
most frequently and looking for creative ways to reduce that pressure
on the database. Unless you can provide relief and backend pool
management with a protection system, larger webservers and more
backends can actually make things worse.
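As a concrete illustration of the backend-pool protection described above, here is a minimal HAProxy sketch; server names, addresses, and limits are hypothetical, not from this thread. Per-server maxconn caps make excess requests queue inside HAProxy instead of piling onto the PHP backends and, ultimately, the database.

```haproxy
# Hypothetical backend pool; addresses and limits are placeholders.
backend php_pool
    balance leastconn
    timeout queue 5s                  # give up on queued requests after 5s
    # "maxconn 50" caps concurrent connections per server; overflow waits
    # in HAProxy's queue rather than hammering the backend and database.
    server web1 10.0.0.11:80 check maxconn 50
    server web2 10.0.0.12:80 check maxconn 50
    server spare 10.0.0.13:80 check backup   # used only if the others are down
```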



On Sat, Jan 5, 2013 at 1:32 PM, KT Walrus <kevin@my.walr.us> wrote:
> Good idea on measuring actual response time, but I'm just looking for a cruder limit: a maximum number of concurrent requests for the localhost server before requests are "bounced" to another backend.
>
> But do you know how to do the "response time" limit within NGINX? Or do I need to run this test with scripts outside NGINX, have all load balancers that send requests to this backend do health checks (again, outside NGINX), and then edit the configuration file and reload NGINX?
>
> If so, I might as well put HAProxy in front of NGINX, since it can do what I want.
>
> I was looking for a simple way within NGINX to see how many concurrent requests there are for the localhost backend.
>
> Just exploring my options...
>
> Kevin
>
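One way to approximate the "bounce after N concurrent requests" idea inside nginx itself is the limit_conn module, which rejects requests over a concurrency cap with a 503 that can then be re-routed. A hedged sketch follows; the zone name, cap, and peer hostname are hypothetical, and limit_conn counts concurrent connections, which only approximates concurrent requests.

```nginx
# Sketch only: cap concurrency on the local PHP backend and bounce
# overflow to a peer. Names and numbers are placeholders.
limit_conn_zone $server_name zone=phpconn:1m;

server {
    listen 80;

    location ~ \.php$ {
        limit_conn phpconn 100;      # reject beyond 100 concurrent connections
        error_page 503 = @overflow;  # re-route the rejection instead of serving it
        fastcgi_pass 127.0.0.1:9000;
        include fastcgi_params;
    }

    location @overflow {
        proxy_pass http://peer.example.com;  # another nginx in the farm
    }
}
```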
> On Jan 5, 2013, at 1:20 PM, Stefan Caunter <stef@scaleengine.com> wrote:
>
>> You need to test the response time of a sample PHP script. Mark the
>> backend as down if it exceeds the response-time threshold a certain
>> number of times. After you back off, it should recover health if your
>> algorithm is working. Remember, the database is likely to be the
>> ultimate bottleneck for PHP performance.
>>
>>
>> ----
>>
>> Stefan Caunter
>> https://www.scaleengine.com/
>>
>>
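nginx can approximate the response-time check described above passively, without an external script: a tight read timeout counts as a failure, and max_fails/fail_timeout temporarily mark the peer down. A hedged sketch, with placeholder addresses and thresholds:

```nginx
# Sketch: passive response-time health checking. Placeholders throughout.
upstream php_pool {
    server 10.0.0.11:9000 max_fails=3 fail_timeout=30s;
    server 10.0.0.12:9000 max_fails=3 fail_timeout=30s;
}

server {
    location ~ \.php$ {
        fastcgi_pass php_pool;
        include fastcgi_params;
        fastcgi_read_timeout 2s;               # response-time threshold
        fastcgi_next_upstream error timeout;   # a timeout counts as a failure
    }
}
```

With these settings, three timeouts within 30s mark a peer unavailable for the next 30s, after which nginx tries it again; that is roughly the back-off-and-recover behavior described.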
>> On Sat, Jan 5, 2013 at 11:15 AM, KT Walrus <kevin@my.walr.us> wrote:
>>> I really want to ensure that my web servers are not overloaded.
>>>
>>> Can I do this with nginx?
>>>
>>> That is, is there a variable I could test to decide whether nginx should send the request to the local PHP backend or forward it to other nginx servers in the farm, based on the load of the PHP backend? Maybe a variable that contains how many concurrent requests to nginx are waiting for a response?
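Stock nginx does not expose a ready-made per-backend concurrency variable, but the stub_status module reports server-wide connection counts that an external check can poll. A hedged sketch; the listen address and path are placeholders:

```nginx
# Sketch: expose connection counters for external load checks.
server {
    listen 127.0.0.1:8080;
    location /status {
        stub_status on;      # reports "Active connections: N" plus totals
        allow 127.0.0.1;
        deny all;
    }
}
```

Note that "Active connections" counts all client connections to this nginx, not requests waiting on the PHP backend specifically, so treat it as a rough load signal.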
>>> _______________________________________________
>>> nginx mailing list
>>> nginx@nginx.org
>>> http://mailman.nginx.org/mailman/listinfo/nginx