This is a small example of my code. I changed the code, and in the current version, on POST/PUT requests it tries to read the buffer and print it; on any other request it just sends a response (I tried to minimize it as much as I could). In the current flow, when running a POST request with 1 MB of data, ngx_http_read_client_request_body returns NGX_AGAIN and does not call the post_handler. I am not getting the ngx…
by Ortal - Nginx Mailing List - English
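NGX_AGAIN from ngx_http_read_client_request_body is expected here: it means the body is still being read asynchronously, and post_handler will be invoked once reading completes, provided the content handler returns NGX_DONE so nginx does not finalize the request early. Below is a minimal sketch of that conventional handler shape; it assumes the standard nginx module headers, and ngx_my_body_handler is a hypothetical callback name, so this is an illustration rather than a drop-in fix.

```c
/* Sketch of the conventional body-reading pattern in a content handler.
 * ngx_my_body_handler is a hypothetical post_handler that runs once the
 * full body is available. */
static ngx_int_t
ngx_my_content_handler(ngx_http_request_t *r)
{
    ngx_int_t rc;

    rc = ngx_http_read_client_request_body(r, ngx_my_body_handler);

    if (rc >= NGX_HTTP_SPECIAL_RESPONSE) {
        return rc;            /* e.g. 413 Request Entity Too Large */
    }

    /* If reading did not finish synchronously, the call returns
     * NGX_AGAIN; returning NGX_DONE keeps the request alive (the call
     * already incremented r->main->count) until the post_handler runs. */
    return NGX_DONE;
}
```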
Hi, I am developing my own nginx module. I get POST requests, parse the data and send 204. It worked with nginx version release-1.9.15, and I am trying to upgrade to version release-1.15.5. After the upgrade, POST requests with a payload larger than 1M are getting blocked. From the nginx log: 2018/11/21 20:03:10 13470#0: *2 http reading blocked. I attached to the process and this is…
by Ortal - Nginx Mailing List - English
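The 1M threshold is exactly nginx's built-in default for client_max_body_size, so while debugging an upgrade it is worth ruling out a body-size limit (note that exceeding it normally produces a 413 response rather than a silent block). A hypothetical configuration snippet for that experiment:

```
# Hypothetical debugging config: 1m is the built-in default for
# client_max_body_size, so raise it to rule body-size limits out.
server {
    client_max_body_size    10m;
    client_body_buffer_size 1m;   # keep bodies up to this size in memory
}
```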
Hello, I am building an nginx module, using ngx_http_upstream. I am using the ngx_http_request_t struct and I would like to know whether my assumption is correct that request_body->bufs will not be reused (freed) until the connection is finalized.
by Ortal - Nginx Mailing List - English
Wrong stack...
#3 0x00000000004799f5 in ngx_http_free_request (r=0x2a04d80, rc=0) at src/http/ngx_http_request.c:3434
#4 0x00000000004798f0 in ngx_http_close_request (r=0x2a04d80, rc=0) at src/http/ngx_http_request.c:3405
#5 0x0000000000479383 in ngx_http_lingering_close_handler (rev=0x27821d0) at src/http/ngx_http_request.c:3265
#6 0x0000000000444dd2 in ngx_event_process_posted (cycle=0x2…
by Ortal - Migration from Other Servers
Hello, I am building a nginx module which gets parsed HTTP requests, sends them to my server, waits for the response from my server, and sends the client the HTTP response. Nginx closes my client connections due to lingering_close:
#3 0x0000000000477647 in ngx_http_terminate_request (r=0x1cb1050, rc=-2) at src/http/ngx_http_request.c:2466
#4 0x0000000000476fb5 in ngx_http_finalize_request (r=0x…
by Ortal - Migration from Other Servers
Hello, I am writing a nginx module. I would like to know in which flow the error field of the connection is set to 1. I am running a test in which the nginx epoll loop calls ngx_http_finalize_request with r->connection->error = 1, and this terminates my request before I have finished my job. Thanks
by Ortal - Nginx Mailing List - English
Hello, I am developing my own nginx module and I would like to use the "keepalive_timeout" option. I tried to find a callback that signals the timeout has expired, before nginx calls ngx_http_close_connection. I allocate some memory myself and would like to free it at that point. Thanks, Ortal Levi
by Ortal - Nginx Mailing List - English
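Rather than hooking the keepalive timeout itself, memory tied to a connection can be registered with a pool cleanup: nginx runs all cleanups when it destroys the pool, which happens in ngx_http_close_connection, including the keepalive-timeout path. A sketch under that approach (my_ctx_t, my_ctx_free and my_attach_ctx are hypothetical names; the standard nginx headers are assumed):

```c
/* Sketch: register a cleanup on the connection pool so the allocation
 * is released whenever nginx closes the connection, keepalive timeout
 * included. */
static void
my_ctx_free(void *data)
{
    my_ctx_t *ctx = data;

    /* release whatever ctx owns (file descriptors, malloc'ed memory...) */
}

static ngx_int_t
my_attach_ctx(ngx_http_request_t *r, my_ctx_t *ctx)
{
    ngx_pool_cleanup_t *cln;

    cln = ngx_pool_cleanup_add(r->connection->pool, 0);
    if (cln == NULL) {
        return NGX_ERROR;
    }

    cln->handler = my_ctx_free;
    cln->data = ctx;

    return NGX_OK;
}
```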
Makes sense... Thanks
by Ortal - Nginx Mailing List - English
Hello, I have created my own NGINX module, which gets a REST request and returns a response. When I send a response with status 500 (NGX_HTTP_INTERNAL_SERVER_ERROR) or 400 (NGX_HTTP_BAD_REQUEST), I see in the memory usage that the nginx process keeps growing:
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
8734 ortal 20 0 7761628 7.335g 1736 S 69.0 4…
by Ortal - Nginx Mailing List - English
Hello, I am building my own NGINX module. When I try to set the Last-Modified out header, I have a struct timespec which I need to convert to Last-Modified. When I set Last-Modified from tv_sec, the header contains an invalid date. How can I set Last-Modified correctly?
by Ortal - Nginx Mailing List - English
Hello, I created a NGINX module and I am trying to benchmark it. I would like to check the performance on POST requests (different files...). I tried to use ab, wrk and locust, running each one of the tools on the same NGINX server and on different servers. In all of my tests NGINX did not pass 30% CPU while the tools got to over 100%. My question is: Which t…
by Ortal - Nginx Mailing List - English
Hello, I am writing a NGINX module, and I would like to know if there is an option to send the client a response in parts. Meaning, in case of large files, my server returns the response to the GET request in parts, and I would like to send each part to the user without saving it to a temp file. I would like each part of the response to be sent, and the client will get a few responses to the same requ…
by Ortal - Nginx Mailing List - English
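Streaming a body in pieces is supported: send the headers once with ngx_http_send_header, then call ngx_http_output_filter repeatedly, setting flush on intermediate buffers and last_buf only on the final one (with no Content-Length, nginx uses chunked transfer encoding for HTTP/1.1 clients). A sketch, assuming the standard nginx headers; my_send_part is a hypothetical helper and the data pointer is assumed to stay valid until the buffer is sent:

```c
/* Sketch: push one piece of the response body; 'last' marks the final
 * piece. Headers are assumed already sent via ngx_http_send_header(). */
static ngx_int_t
my_send_part(ngx_http_request_t *r, u_char *data, size_t len,
             ngx_uint_t last)
{
    ngx_buf_t   *b;
    ngx_chain_t  out;

    b = ngx_calloc_buf(r->pool);
    if (b == NULL) {
        return NGX_ERROR;
    }

    b->pos = data;
    b->last = data + len;
    b->memory = 1;        /* data lives in memory nginx must not modify */
    b->flush = !last;     /* push intermediate parts out to the client */
    b->last_buf = last;   /* only the final part ends the response */

    out.buf = b;
    out.next = NULL;

    return ngx_http_output_filter(r, &out);
}
```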
Hi, I am using "Emiller's Guide To Nginx Module Development"; in section "3.1.4. Sending the body" there is an allocation of ngx_chain_t out. I do not understand where this allocation is freed. Should there be a handler that frees the buf? Thanks, Ortal Levi
by Ortal - Nginx Mailing List - English
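There is no explicit free in that example because none is needed: the `ngx_chain_t out` in that section is a stack variable, and the ngx_buf_t it points to is allocated from r->pool. Nginx destroys the request pool in ngx_http_free_request, which releases every pool allocation at once, so no per-buffer free handler is required (or possible). A sketch of the allocation pattern, assuming the standard nginx headers:

```c
/* Sketch: the chain link is a stack variable, and the buffer comes
 * from the request pool, so it is released automatically when nginx
 * destroys r->pool at the end of the request. */
ngx_buf_t   *b;
ngx_chain_t  out;

b = ngx_pcalloc(r->pool, sizeof(ngx_buf_t));
if (b == NULL) {
    return NGX_HTTP_INTERNAL_SERVER_ERROR;
}

out.buf = b;
out.next = NULL;
```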
Hello, I am new to NGINX. I am using the boto client with NGINX. When I send a PUT request, I get a response with an ETag which does not match the MD5 of the payload. My question is: what could be the reason for that? Thanks, Ortal
by Ortal - Nginx Mailing List - English