Sunday, July 31, 2016

php fpm - FastCGI: "comm with server aborted: read failed" only for one specific file




Related question: FastCGI and Apache 500 error intermittently. The solution given there does not work for me.

The problem:



I have a Laravel 5.1 application (it has been in production on other servers without any problems) running on a fresh Ubuntu 14.04 server with Apache 2.4.7 and PHP served through PHP-FPM via mod_fastcgi.



Everything works fine as long as a certain file isn't loaded by the application:



$compiledPath = __DIR__.'/cache/compiled.php';

if (file_exists($compiledPath)) {
    require $compiledPath; // this causes a "500 Internal Server Error"
}


It's a Laravel-specific file that the framework creates automatically to speed things up a little (so it's not a bug in my code). The file really exists, I have full access permissions, and it's about 600 kB in size. When I remove it, everything works fine. But when I tell Laravel to create it again and then hit any route of the application, I get a "500 Internal Server Error" with the following log entries:




[fastcgi:error] [pid 14334] (104)Connection reset by peer: [client xxx.xxx.xxx.xxx:41395] FastCGI: comm with server "/var/www/clients/client1/web1/cgi-bin/php5-fcgi-yyy.yyy.yyy.yyy-80-domain.com" aborted: read failed

[fastcgi:error] [pid 14334] [client xxx.xxx.xxx.xxx:41395] FastCGI: incomplete headers (0 bytes) received from server "/var/www/clients/client1/web1/cgi-bin/php5-fcgi-yyy.yyy.yyy.yyy-80-domain.com"

[fastcgi:error] [pid 14334] (104)Connection reset by peer: [client xxx.xxx.xxx.xxx:41395] FastCGI: comm with server "/var/www/clients/client1/web1/cgi-bin/php5-fcgi-yyy.yyy.yyy.yyy-80-domain.com" aborted: read failed

[fastcgi:error] [pid 14334] [client xxx.xxx.xxx.xxx:41395] FastCGI: incomplete headers (0 bytes) received from server "/var/www/clients/client1/web1/cgi-bin/php5-fcgi-yyy.yyy.yyy.yyy-80-domain.com"




What I've tried:



I tried the solution from the related question mentioned above, which also reflects most of the other suggestions I could find for this problem: tweak the common PHP-FPM settings to assign more resources. The accepted answer also mentions the option of abandoning FastCGI entirely, but I don't want to go there. So I played around with the values, but had no luck.




There is no load on the server whatsoever since I'm the only one using it, so I really doubt that it's an issue with the available resources (it's a VPS with 12 GB RAM). Could it have something to do with the file size? It's the only PHP file that big.
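One way to check whether the file size alone is what kills the worker would be a standalone test script, requested through Apache so it runs under the same PHP-FPM pool. This is just a sketch: the script name and the path to compiled.php are placeholders and need to be adjusted.

<?php
// Hypothetical standalone test (e.g. require-test.php in the web root),
// requested through Apache so it runs under the same PHP-FPM pool as the app.
// The path below is a placeholder; point it at the real compiled.php.
$compiledPath = '/var/www/clients/client1/web1/web/bootstrap/cache/compiled.php';

if (!is_readable($compiledPath)) {
    die('compiled.php is not readable: ' . $compiledPath);
}

// Outside the framework bootstrap this may also trigger an ordinary PHP fatal
// error, but a connection reset / segfault at this point would narrow things down.
require $compiledPath;
echo 'compiled.php loaded, ' . filesize($compiledPath) . ' bytes';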



I could reproduce the problem on two different servers with the same configuration. It did not occur on an Ubuntu 12.04 server with Apache 2.2 and FastCGI.



My current configuration:



PHP-FPM:



pm.max_children = 10
pm.start_servers = 2
pm.min_spare_servers = 1
pm.max_spare_servers = 5
pm.max_requests = 0
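Since "incomplete headers (0 bytes)" means the FPM worker died before sending any response, it may also be worth making the crash visible in the FPM logs. A sketch of additional pool directives (standard PHP-FPM options; the log path is a placeholder):

; forward the workers' stderr / fatal errors into the FPM log
catch_workers_output = yes
php_admin_flag[log_errors] = on
php_admin_value[error_log] = /var/log/php5-fpm-web1.log

; allow core dumps so a segfaulting worker can be inspected afterwards
rlimit_core = unlimited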






Apache vhost (mod_fastcgi):

...
Alias /php5-fcgi /var/www/....
FastCgiExternalServer /var/www/.... -idle-timeout 300 -socket /var/lib/php5-fpm/web1.sock -pass-header Authorization



php.ini:



memory_limit = 512M
output_buffering = on


Answer



If PHP is failing only on specific source files, the most probable reason is that a PHP code accelerator (opcode cache) such as XCache, APC or eAccelerator has issues with that file. This can be due to bugs in the accelerator or in PHP itself.



You can try to run your script through the PHP command-line interface (the php CLI binary), as PHP CLI typically doesn't load any accelerator.
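To see which accelerator (if any) is actually loaded under PHP-FPM as opposed to the CLI, a small hypothetical check script can be requested once through Apache and run once from the command line; the extension names are the usual ones for these caches.

<?php
// Hypothetical check: which opcode-cache extensions are loaded in this SAPI?
// Request it through Apache (FPM) and also run it via the CLI, then compare.
echo 'SAPI: ' . php_sapi_name() . "\n";
foreach (array('apc', 'apcu', 'xcache', 'eaccelerator', 'Zend OPcache') as $ext) {
    echo $ext . ': ' . (extension_loaded($ext) ? 'loaded' : 'not loaded') . "\n";
}

If an accelerator shows up only under FPM, temporarily disabling it for that pool is a quick way to test whether it is what chokes on the large compiled.php.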

