apparent deadlock with fastcgi

October 10, 2012 06:18PM
I have a simple FastCGI responder that converts a JSON POST into a CSV download. It works in a streaming fashion, writing the response before it has finished reading the request. According to the FastCGI specification, this mode of operation is allowed:

http://www.fastcgi.com/devkit/doc/fcgi-spec.html

"The Responder application sends CGI/1.1 stdout data to the Web server over FCGI_STDOUT, and CGI/1.1 stderr data over FCGI_STDERR. The application sends these concurrently, not one after the other. The application must wait to finish reading FCGI_PARAMS before it begins writing FCGI_STDOUT and FCGI_STDERR, but it needn't finish reading from FCGI_STDIN before it begins writing these two streams."

Using a debugger, I observe that my responder blocks while reading from FCGI_STDIN before it has received the whole request, which I think implies there is a bug on the nginx side.

If I buffer the entire response in memory before sending it back to nginx, I don't see the blocking read on the request.
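The buffered variant that works looks roughly like this; it would replace the body of the accept loop in the sketch above, with the same caveats (pass-through standing in for the conversion, error handling omitted):

/* Buffered variant (sketch): drain FCGI_STDIN completely before
 * writing anything to FCGI_STDOUT, so the response only starts once
 * the whole request body is in memory. */
#include <fcgiapp.h>
#include <stdlib.h>
#include <string.h>

static void handle_buffered(FCGX_Request *req)
{
    char  *body = NULL;
    size_t len = 0, cap = 0;
    char   buf[4096];
    int    n;

    /* First, read the entire request body into memory. */
    while ((n = FCGX_GetStr(buf, sizeof(buf), req->in)) > 0) {
        if (len + (size_t)n > cap) {
            cap = cap ? cap * 2 : 8192;
            body = realloc(body, cap);
        }
        memcpy(body + len, buf, (size_t)n);
        len += (size_t)n;
    }

    /* Only now start writing the response (pass-through again stands
     * in for the real JSON-to-CSV conversion). */
    FCGX_FPrintF(req->out, "Content-Type: text/csv\r\n\r\n");
    FCGX_PutStr(body, (int)len, req->out);
    free(body);
}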

I only observe the problem when the request is "large enough", but I'm not sure exactly what size triggers it.

Does nginx support this sort of behavior for fastcgi responders? If not, is that fact documented somewhere that I missed?