Re: RE: Nginx setting up >25.000 concurrent connections per second

Aleksandar Lazic
October 06, 2011 03:20PM
Hi,

On 06.10.2011 21:00, atadmin wrote:
> Hi,
>
> I get the same result with 4 servers as with 9 servers executing ab
> at the same time. I tested it step by step until I had configured the
> 9 client servers. With 1 server I can get between 12000 - 14500
> concurrent connections.

Maybe you are using the wrong tool?

http://kristianlyng.wordpress.com/2010/10/23/275k-req/
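
As a quick check it might also be worth re-running ab with keep-alive
enabled (the -k switch); without keep-alive, ab itself is often the
bottleneck long before nginx is. For example, with your original
parameters it would look something like:

    ab -k -n 500000 -c 200 http://192.168.1.11/a.txt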

Here is a statement from http://haproxy.1wt.eu/news.html:
###
Oct 23rd, 2010: new httperf results: 572000 reqs/s

This morning I came across this interesting post from Kristian
Lyngstol about the performance tests he ran on the Varnish cache. What
struck me was the number of requests per second Kristian managed to
reach: 275000, no less. I'm not surprised at all that Varnish can
withstand such high rates, it's known for being very fast. My surprise
came from the fact that Kristian managed to find fast enough tools to
run this test. My old injector is limited to around 100k requests per
second on my machines, as it does not support keep-alive, and Apache's
ab to around 150k with keep-alive enabled. And when I managed to reach 2
million requests per second, I was feeding a constant stream of
pipelined requests with netcat, which is particularly inconvenient to
use.

Kristian said he used httperf. I tried it in the past but did not
manage to get good numbers out of it. He said he found some "httperf
secrets", so that made me want to try again. First tests were limited to
approximately 50000 requests per second with httperf at 100% CPU,
something close to what I remembered. But reading the man page, I found
that httperf can work in a session-based mode with the "--wsess"
parameter, where it also supports HTTP pipelining. Hmmm nice, we'll be
less sensitive to packet round-trips :-) So I tried again with haproxy
simply doing redirects. Performance was still limited to 50000 requests
per second.

In fact, there appears to be a default limit of 50000 requests per
second when "--rate" is not specified. I set it to 1 million and ran the
test again. Result: about 158000 requests per second with httperf at
100% CPU and haproxy at 44%. Since my machine is a Core2 Quad at 3 GHz,
I fired 3 httperf processes against one haproxy process. The load
reached a max of 572000 requests/s with an average around 450000
requests per second. This time, haproxy and all 3 httperf processes
were using 100% CPU. What an improvement!

These tests mean nothing at all for real-world uses of course,
because when you have many clients, they won't send you massive amounts
of pipelined requests. However, it's very nice to be able to stress-test
the HTTP engine for regression testing. And this will be an invaluable
measurement tool to test the end-to-end keep-alive when it's finished. I
still have to figure out the meaning of some options and how to make the
process less verbose. Right now it fills a screen with many zeroes,
making it hard to extract the useful numbers. I'm grateful to Kristian
for having made me revisit httperf!
###
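
If you want to try httperf in that session-based mode, a rough starting
point (the numbers are only placeholders, not tuned or tested against
your setup) could be something like:

    httperf --hog --server 192.168.1.11 --uri /a.txt \
            --wsess=100000,20,0 --burst-length 20 --rate 2000

--wsess enables the session mode mentioned above, --burst-length lets
httperf pipeline several requests per burst, and giving --rate
explicitly avoids relying on the default rate limit described in the
text.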

> # ab -n 500000 -c 200 http://192.168.1.11/a.txt
> This is ApacheBench, Version 2.3 <$Revision: 655654 $>
> Copyright 1996 Adam Twiss, Zeus Technology Ltd,
> http://www.zeustech.net/
> Licensed to The Apache Software Foundation, http://www.apache.org/
>
> Benchmarking 192.168.1.11 (be patient)
> Completed 50000 requests
> Completed 100000 requests
> Completed 150000 requests
> Completed 200000 requests
> Completed 250000 requests
> Completed 300000 requests
> Completed 350000 requests
> Completed 400000 requests
> Completed 450000 requests
> Completed 500000 requests
> Finished 500000 requests
>
>
> Server Software: nginx/0.7.67
> Server Hostname: 192.168.1.11
> Server Port: 80
>
> Document Path: /a.txt
> Document Length: 5 bytes
>
> Concurrency Level: 200
> Time taken for tests: 34.309 seconds
> Complete requests: 500000
> Failed requests: 0
> Write errors: 0
> Total transferred: 107500000 bytes
> HTML transferred: 2500000 bytes
> Requests per second: 14573.44 [#/sec] (mean)
> Time per request: 13.724 [ms] (mean)
> Time per request: 0.069 [ms] (mean, across all concurrent
> requests)
> Transfer rate: 3059.85 [Kbytes/sec] received
>
> Connection Times (ms)
>               min  mean[+/-sd] median   max
> Connect:        0     4   73.8      2  3006
> Processing:     0    10    5.2      9   225
> Waiting:        0     9    5.2      8   224
> Total:          1    14   73.9     11  3021
>
> Percentage of the requests served within a certain time (ms)
> 50% 11
> 66% 13
> 75% 14
> 80% 15
> 90% 17
> 95% 19
> 98% 21
> 99% 24
> 100% 3021 (longest request)
>
>
> Thanks.
>
> Posted at Nginx Forum:
> http://forum.nginx.org/read.php?2,216332,216336#msg-216336

_______________________________________________
nginx mailing list
nginx@nginx.org
http://mailman.nginx.org/mailman/listinfo/nginx