

Output buffering in nginx+php+fastcgi

Posted by keenharlequin 
September 13, 2011 09:46PM
I have nginx set up on an Ubuntu server, with PHP running through FastCGI. A particular web app of mine uses PHP to parse a very large text file (about 2.5 million lines), searching for particular regexp matches and outputting each one as it is found. The process takes a considerable amount of time (about 37 seconds).
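To make the setup concrete, here is a minimal sketch of the PHP side (the pattern and data are placeholders, not from my actual app): scan line by line and flush() after every match.

```php
<?php
// Sketch: stream regexp matches to the client as they are found.
$pattern = '/ERROR \d+/';   // hypothetical regexp

// Stand-in for the 2.5M-line file; the real app would fopen() a file on disk.
$fh = fopen('php://temp', 'r+');
fwrite($fh, "ok\nERROR 17 disk full\nok\nERROR 42 timeout\n");
rewind($fh);

while (($line = fgets($fh)) !== false) {
    if (preg_match($pattern, $line)) {
        echo $line;
        flush();   // pushes PHP's output buffer to the SAPI, but FastCGI
                   // and nginx can still hold the bytes in their own buffers
    }
}
fclose($fh);
```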

What I would like is for each chunk of output to be sent to the client the instant it is produced. I have PHP flush()ing at the appropriate times, but because of buffering in FastCGI, the output is not sent to the client until the buffer fills up. The default buffer is 4096 bytes; it can be changed with the fastcgi_buffer_size directive in the nginx config, but that leads to another issue: if the buffer is too small, nginx fails with the error "upstream sent too big header while reading response header from upstream".
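For reference, this is roughly the config I am experimenting with (paths and sizes are illustrative, not my exact values):

```nginx
# Sketch: shrinking the FastCGI buffers so output reaches the client sooner.
# Caveat: fastcgi_buffer_size must still fit the entire response header, or
# nginx fails with "upstream sent too big header while reading response header".
location ~ \.php$ {
    fastcgi_pass  unix:/var/run/php-fpm.sock;  # hypothetical upstream
    include       fastcgi_params;

    fastcgi_buffer_size 1k;   # first buffer: must hold all response headers
    fastcgi_buffers 8 1k;     # remaining buffers, used for the body
}
```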

One option is to pad each piece of output from PHP to 4K, or whatever the smallest buffer size FastCGI can be set to. But that would unnecessarily increase the amount of traffic, which is metered.
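The padding workaround would look something like this (a sketch only; emit_padded and the 4096-byte size are my assumptions about the default buffer, not a tested fix):

```php
<?php
// Sketch of the padding workaround: grow each chunk to the FastCGI buffer
// size so the buffer fills and is forwarded immediately. The downside noted
// above: every chunk now costs a full 4 KB of metered traffic.
function emit_padded(string $chunk, int $bufSize = 4096): void {
    // Pad with spaces; an HTML page might hide the padding in a comment instead.
    echo str_pad($chunk, $bufSize);
    flush();
}
```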

Any thoughts on how to get fastcgi to flush the buffer every time there's output from php?
Re: Output buffering in nginx+php+fastcgi
November 24, 2011 10:32AM
I would also like a response on this, preferably from Maxim Dounin or Igor Sysoev. We have a similar issue: we want to show progress status for a long-running PHP process in the browser, without creating separate AJAX requests just to poll the status.

Feedback appreciated!
