My PHP application requires max_execution_time > 60, but nginx always cuts the connection after 60 seconds.
I studied the manuals but couldn't find a solution :/
Here are my system specs:
Ubuntu 10.04.4 LTS
nginx/0.7.65
php-fpm 5.3.2-1ubuntu4.7ppa5lucid1
What I have done so far:
php.ini
max_execution_time = 300
max_input_time = 300
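To double-check that php-fpm really picks these values up (and that no other php.ini overrides them), I could dump the effective settings through the same pool with a small script; the following is only an untested sketch, and check_limits.php is just a name I made up:
check_limits.php
<?php
// Sanity check: print the limits exactly as this php-fpm pool sees them,
// served through nginx like any other page.
echo 'max_execution_time = ', ini_get('max_execution_time'), "\n";
echo 'max_input_time = ', ini_get('max_input_time'), "\n";
?>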
php5-fpm.ini
request_terminate_timeout = 0
nginx site config:
location ~* \.php$ {
    fastcgi_pass unix:/tmp/php.sock;
    fastcgi_index index.php;
    fastcgi_param SCRIPT_FILENAME /var/www/wiki$fastcgi_script_name;
    include fastcgi_params;
    fastcgi_send_timeout 300;
}
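As far as I understand, fastcgi_send_timeout only covers sending the request to the upstream; the directive that limits waiting for the response is fastcgi_read_timeout, which defaults to 60s. I have not tested yet whether that is the piece I'm missing, but the block would then look roughly like this:
location ~* \.php$ {
    fastcgi_pass unix:/tmp/php.sock;
    fastcgi_index index.php;
    fastcgi_param SCRIPT_FILENAME /var/www/wiki$fastcgi_script_name;
    include fastcgi_params;
    fastcgi_send_timeout 300;
    fastcgi_read_timeout 300;   # untested guess: response read timeout, default is 60s
}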
Because the log files from PHP and nginx don't show which side drops the connection (php-fpm or nginx), I wrote a short PHP script that increments a counter in a text file once per second.
timeout.php
<?php
// Probe script: append a counter to a text file and to the response once per
// second, to see how far it gets before the connection is cut.
$fh = fopen("timeout_test.txt", "w+");
if (!$fh)
{
    echo "File couldn't be created!";
    exit;
}
for ($i = 1;; $i++)
{
    sleep(1);
    fwrite($fh, $i."<br />\r\n");
    fflush($fh); // make the counter visible on disk right away
    echo $i."<br />\r\n";
    flush();     // push the output towards the client
}
// Never reached: the loop only ends when the request is terminated.
fclose($fh);
?>
The nginx error log shows this message after exactly 60 seconds:
2012/06/18 12:20:39 [error] 10271#0: *8264 upstream timed out (110: Connection timed out) while reading response header from upstream, client: xxx.xxx.xxx.xxx, server: xxxx.xxxx.xx.xx, request: "GET /timeout.php HTTP/1.1", upstream: "fastcgi://unix:/tmp/php.sock:", host: "xxxx.xxxx.xx.xx"
I also reproduced this on a machine running nginx/1.0.15 and php-fpm 5.3.13.
Does anybody have an idea what causes this timeout?