Yes, a firewall would be another option. But before that, I would like to try out all the options at the nginx level, to see whether one of them resolves the issue at the nginx layer itself. Can't we put accept() filters in place? Or how does the deny option work? Can we use the deny option to stop accepting new connections from a client IP once its number of connections exceeds the max limit? Are there any third-party modules…
by Phani Sreenivasa Prasad - Nginx Mailing List - English
I assume it would help to drop connections. Since we set a rate limit per IP, for any client IP that looks suspicious by sending requests in bulk (say 10000 connections/requests), it makes sense not to accept connections/requests from that IP at all. Thoughts?
by Phani Sreenivasa Prasad - Nginx Mailing List - English
Hi All, I am using nginx in our products. When I run the goldeneye DoS attack script against nginx, it is not able to defend against the attack and normal users get impacted: python goldeneye.py http://<ipaddress> -w 5 -s 10000 -m random -d We are using the nginx limit_req options below, but they didn't help. The nginx documentation says these options are used to limit the request rate…
by Phani Sreenivasa Prasad - Nginx Mailing List - English
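The thread above asks about combining rate limiting with per-IP connection caps. A minimal sketch of what that could look like; the zone names, thresholds, and socket path below are placeholders, not taken from the poster's configuration:

```nginx
# Track request rate and concurrent connections per client IP.
# Zone names and limits are illustrative only.
limit_req_zone  $binary_remote_addr zone=req_per_ip:10m rate=10r/s;
limit_conn_zone $binary_remote_addr zone=conn_per_ip:10m;

server {
    listen 80;

    location / {
        # Allow short bursts; excess requests are rejected immediately.
        limit_req zone=req_per_ip burst=20 nodelay;
        limit_req_status 503;

        # At most 20 simultaneous connections per client IP.
        limit_conn conn_per_ip 20;
        limit_conn_status 503;

        fastcgi_pass unix:/var/run/app.sock;
        include fastcgi_params;
    }
}
```

Returning status 444 instead of 503 would make nginx close the connection without a response, which is closer to "not accepting" traffic from an abusive IP, though a true accept-level block still needs a firewall.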
Is there no other way to read this variable than reading it from the access_log file? Is it possible to export it to all other processes running on the system?
by Phani Sreenivasa Prasad - Nginx Mailing List - English
On my dev setup, the logs are disabled due to a memory constraint. Also, the log_format directive would log many more fields into that file than I am interested in. Is there a way I can read it through a fastcgi param from my FastCGI app? One problem I see here is that nginx sends all the env variables as a record when the request initially comes in, and later on it doesn't send any more env variables after…
by Phani Sreenivasa Prasad - Nginx Mailing List - English
Hi, so how can I read this value from my FastCGI app for each request/response?
by Phani Sreenivasa Prasad - Nginx Mailing List - English
Hi, I want to read the value of the nginx variable $body_bytes_sent from my FastCGI application, to check how many bytes nginx has sent to the client. I tried the following: fastcgi_param BODY_BYTES_SENT $body_bytes_sent in my fastcgi_params, and then read the fastcgi param value from the FastCGI application. But it always returns 0. Please help me with how to read the nginx variable…
by Phani Sreenivasa Prasad - Nginx Mailing List - English
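For context on why the value is always 0: fastcgi_param values are evaluated when nginx builds the FastCGI request, i.e. before any response bytes have gone to the client, so $body_bytes_sent has not been counted yet. The variable is only final at the log phase. A sketch, if logging could be re-enabled, of a minimal log_format that records only that variable (file path and format name are placeholders):

```nginx
http {
    # $body_bytes_sent is final only after the response has been
    # sent; a fastcgi_param sees it as 0 at request time.
    log_format bytes_only '$request $body_bytes_sent';
    access_log /var/log/nginx/bytes.log bytes_only;
}
```

A stripped-down format like this avoids the "many more fields than I am interested in" problem mentioned elsewhere in the thread.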
Hi, I am using FastCGI for my application to talk to nginx. I have a requirement such that when my application has processed a request and sent the response, I would like to check whether nginx also sent the response successfully to the client. How can this be achieved? Is there a way I can register a callback with nginx through any of its directives?
by Phani Sreenivasa Prasad - Nginx Mailing List - English
Hi B.R., Please find below the nginx configuration that we are using; any help would be greatly appreciated.

nginx -V
=================
nginx version: nginx/1.8.0
built with OpenSSL 1.0.2h-fips 3 May 2016
TLS SNI support enabled
configure arguments: --crossbuild=Linux::arm --with-cc=arm-linux-gnueabihf-gcc --with-cpp=arm-linux-gnueabihf-gcc --with-cc-opt='-pipe -Os -gdwarf-4 -mfpu=neon --sysroot…
by Phani Sreenivasa Prasad - Nginx Mailing List - English
Hi, I have a question the other way around: how can I enable pipelining on the upstream side? Or at least, how can I make nginx open multiple loopback connections to serve requests pipelined from the client side? Thanks, Prasad.
by Phani Sreenivasa Prasad - Nginx Mailing List - English
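nginx does not pipeline requests over a single upstream connection, but for proxied HTTP upstreams it can keep a pool of idle connections open for reuse, which is the closest supported mechanism. A sketch using the keepalive directive (upstream name, port, and pool size are placeholders):

```nginx
upstream backend {
    server 127.0.0.1:8080;
    # Keep up to 8 idle connections to the backend open per worker.
    keepalive 8;
}

server {
    location / {
        proxy_pass http://backend;
        # Both settings are required for upstream keepalive to work:
        proxy_http_version 1.1;
        proxy_set_header Connection "";
    }
}
```

Concurrency across those connections comes from handling multiple client requests in parallel, not from pipelining several requests down one upstream socket.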
Hi all, for one of our products we have chosen nginx as our webserver, using FastCGI to talk to the upstream (application) layer. We have a use case where the client sends a huge payload, typically in MB, and nginx is quick to read all the data and buffer it, whereas our upstream server is much slower in consuming the data. This resulted in a timeout on the client side since the upstream can't…
by Phani Sreenivasa Prasad - Nginx Mailing List - English
Setting fastcgi_request_buffering and fastcgi_buffering to "off" in nginx.conf has no effect: nginx is still buffering the requests and responses. I need help with this.
by Phani Sreenivasa Prasad - How to...
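For reference, a minimal sketch of the two directives involved; the location and socket path are placeholders. Note that fastcgi_request_buffering only exists from nginx 1.7.11 onwards, so the exact nginx version matters here:

```nginx
location /app {
    fastcgi_pass unix:/var/run/app.sock;
    include fastcgi_params;

    # Stream the request body to the upstream as it arrives,
    # instead of buffering it first (nginx >= 1.7.11).
    fastcgi_request_buffering off;

    # Pass response data to the client as soon as it is received
    # from the upstream, rather than buffering whole responses.
    fastcgi_buffering off;
}
```

If these directives sit in a different context (http/server) than where the request is actually handled, a more specific location block can silently override them, which is one common reason they appear to have no effect.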
Hi All, I have found two issues with nginx version 1.8.1.

1. nginx does not forward HTTP pipelined requests to the FastCGI server. I have a case where the browser sends a GET request to submit a job to my FastCGI application for processing, and then sends a DELETE request to cancel the job while it is in progress. But even though the two requests may be pipelined at nginx, it is not the same…
by Phani Sreenivasa Prasad - How to...
Hi Ameisen, how did you resolve this issue? I am facing the same issue with a FastCGI application too.

> It accepts connections on the listener socket, and defers them to other threads
> to process (which do so using a fiber pool) while the listener thread
> keeps listening for more connections.

This is how many network applications accept concurrent requests, by doing mult…
by Phani Sreenivasa Prasad - How to...