Hi,
I need your help and advice to optimize my Linux system running Nginx.
I would like to handle 2000 requests per second without packet loss.
But it is difficult because, every second, each user executes an HTTP GET to download a JSON file.
The file changes every second, but its name does not, so I don't use caching.
The file is on another disk (with the same performance).
Currently (testing with JMeter) I lose 10% of requests every second.
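Would very short client-side caching still help in my case? A sketch of what I mean (the location path and values are placeholders, not my current config):

```nginx
# Hypothetical: let clients cache the JSON for at most one second,
# since the file is regenerated every second under the same name.
location /data/file.json {
    expires 1s;
    add_header Cache-Control "public, max-age=1, must-revalidate";
}
```

I am not sure this is safe for a file that changes every second, so any advice on this point is welcome too.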
My config:
user www-data;
worker_processes 4;
worker_rlimit_nofile 16384;
error_log /var/log/nginx/error.log;
pid /var/run/nginx.pid;
events {
    worker_connections 4096;
    multi_accept on;
    use epoll;
}
http {
    include /etc/nginx/mime.types;
    #access_log /var/log/nginx/access.log;
    access_log off;
    sendfile on;
    #tcp_nopush on;
    tcp_nodelay on;
    #keepalive_timeout 0;
    #client_body_timeout 4;
    #client_header_timeout 4;
    #keepalive_timeout 4;
    #send_timeout 4;
    keepalive_timeout 10;
    #reset_timedout_connection on;
    gzip on;
    gzip_http_version 1.1;
    gzip_vary on;
    gzip_comp_level 6;
    gzip_proxied any;
    gzip_types text/plain application/json application/georacing application/octet-stream;
    gzip_buffers 16 8k;
    #gzip_disable "MSIE [1-6]\.(?!.*SV1)";
    gzip_disable "msie6";
...
Do you have any ideas to optimize my configuration?
Thank you for your help.
Best regards.
Charlie.