apparent deadlock with fastcgi

October 10, 2012 06:18PM
I have a simple FastCGI responder that converts a JSON POST body into a CSV download. It works in a streaming fashion, writing the response before it has finished reading the request. According to the FastCGI specification, this is allowed:


"The Responder application sends CGI/1.1 stdout data to the Web server over FCGI_STDOUT, and CGI/1.1 stderr data over FCGI_STDERR. The application sends these concurrently, not one after the other. The application must wait to finish reading FCGI_PARAMS before it begins writing FCGI_STDOUT and FCGI_STDERR, but it needn't finish reading from FCGI_STDIN before it begins writing these two streams."

Using a debugger, I observe that my responder blocks while reading from FCGI_STDIN before it has received the whole request, which I think implies there is a bug on the nginx side.
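For context, the interleaved read/write pattern I rely on can be sketched as a generator that emits each CSV row as soon as the corresponding piece of the request has been parsed. This is a minimal illustration, not my actual responder: it assumes the request body arrives as JSON Lines (one object per line), and the names `json_lines_to_csv` and `_drain` are made up for the example.

```python
import csv
import io
import json


def _drain(buf):
    """Return and clear the contents of a StringIO buffer."""
    data = buf.getvalue()
    buf.seek(0)
    buf.truncate()
    return data


def json_lines_to_csv(lines, fields):
    """Convert an iterable of JSON-object lines to CSV, row by row.

    Each CSV row is yielded as soon as its JSON line has been parsed,
    so output begins before the input iterable is exhausted -- the same
    interleaving of FCGI_STDIN reads and FCGI_STDOUT writes that a
    streaming FastCGI responder performs.
    """
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    yield _drain(buf)  # header goes out before any input is read
    for line in lines:
        if not line.strip():
            continue
        writer.writerow(json.loads(line))
        yield _drain(buf)  # one row out per input line consumed
```

In the real responder, `lines` would be fed from FCGI_STDIN and each yielded chunk written to FCGI_STDOUT immediately; the blocking read I describe happens while pulling the next input line.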

If I buffer the entire response in memory before sending it back to nginx, I don't see the problem with the blocking read on the request.
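The fully buffered workaround amounts to consuming the whole request before producing any output, so the two streams are never interleaved. A minimal sketch, again with a hypothetical function name and assuming the body is a JSON array of objects:

```python
import csv
import io
import json


def buffered_json_to_csv(body, fields):
    """Read the entire JSON request body, then render the complete CSV.

    No CSV output exists until the request has been fully consumed, so
    the responder never writes FCGI_STDOUT while it still has
    FCGI_STDIN left to read -- sidestepping the apparent deadlock at
    the cost of holding both request and response in memory.
    """
    records = json.loads(body)  # whole request buffered here
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(records)
    return out.getvalue()  # whole response buffered here
```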

I only observe the problem when the request is "large enough", but I haven't determined the exact size that triggers it.

Does nginx support this sort of behavior for fastcgi responders? If not, is that fact documented somewhere that I missed?
Subject                              Author         Posted
apparent deadlock with fastcgi       jkl            October 10, 2012 06:18PM
Re: apparent deadlock with fastcgi   Maxim Dounin   October 10, 2012 06:50PM
