A single user/website (WordPress) is returning HTTP 503 Service Unavailable:
The server is temporarily unable to service your request due to maintenance downtime or capacity problems. Please try again later.
Other users/websites are working just fine.
How do I troubleshoot this?
I’ve checked all the logs at /usr/local/apnscp/storage/logs, /var/log/httpd, and /var/log/php-fpm, as well as the per-account logs under /home/virtual/example.com/var/logs, and they show nothing — no fatal errors, nothing relevant to resource usage, etc.
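For anyone else chasing a similar 503, the log sweep can be scripted. A rough sketch, assuming example.com stands in for the affected site (adjust the paths to your layout):

```shell
# Sweep the usual log locations for 503s, FastCGI proxy errors, or fatals.
for dir in /usr/local/apnscp/storage/logs /var/log/httpd /var/log/php-fpm \
           /home/virtual/example.com/var/logs; do
  if [ -d "$dir" ]; then
    # -R recurse, -E extended regex, -i case-insensitive, -h omit filenames
    grep -REih '503|proxy_fcgi|fatal' "$dir" 2>/dev/null | tail -n 5
  fi
done
```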
Environment
# cpcmd misc:cp-version
revision: 4921e9dac660d66dccd58c0ef39b44009bb9819d
timestamp: 1738426006
ver_maj: 3
ver_min: 2
ver_patch: 45
ver_pre: ''
dirty: false
debug: false
# uname -r
4.18.0-553.36.1.el8_10.x86_64
Appears to be related to/same issue as:
Any idea what can cause a website in ApisCP to start throwing a HTTP 503 - Service Unavailable error? Error message is:
The server is temporarily unable to service your request due to maintenance downtime or capacity problems. Please try again later.
Doesn’t appear to be an issue with the website itself.
Apache error_log is showing the following whenever I try to refresh the website:
[proxy_fcgi:error] [pid 603735:tid 603758] [client 69.162.124.229:60452] AH01067: Failed to read FastCGI he…
This time, the chown ... did not fix the issue.
I found that last time I had renamed website.com/wp-content/debug.log, and that it was significantly large (4 GB). This time, the same debug.log file was again 4 GB; removing the file fixed the issue.
Maybe the error has to do with debug.log being maxed out and WP being unable to write to it any further.
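If the runaway debug.log is the culprit, a quick sweep for oversized logs can confirm it before the site goes down again. A sketch, assuming the /home/virtual layout from this server (the exact docroot path is a guess; adjust to your install):

```shell
# Find WordPress debug logs that have grown past 1 GB under the virtual roots.
WEBROOT=${WEBROOT:-/home/virtual}
find "$WEBROOT" -name debug.log -size +1G -print 2>/dev/null || true
# For each hit, truncating in place (instead of deleting) keeps PHP's open
# file handle valid, e.g.:
# truncate -s 0 /home/virtual/example.com/var/www/html/wp-content/debug.log
```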
The maximum file size is 4 GB, to restrict runaway logging… which is exactly what happened here. It may be overridden using the system.process-limits Scope.
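A hedged sketch of working with that Scope from the ApisCP host (run as root). The exact value format varies by release, so inspect it before setting anything — the set command below is a placeholder, not a verified syntax:

```shell
# Inspect the Scope's documentation and current value first
cpcmd scope:info system.process-limits
cpcmd scope:get system.process-limits
# Then apply a new limit using the format scope:info reports, e.g.:
# cpcmd scope:set system.process-limits '<value per scope:info>'
```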
ETA: here’s a related post that walks through validating the limit:
What’s the HTTP response code? What application is accepting the upload? Does it always fail at the 4 GB boundary?
You’d need to alter the value in:
- limits.conf, via the system.process-limits Scope; this affects the maximum file size created within the VFS. It can be tested within the account using dd if=/dev/zero of=~/file bs=1G count=5.
- Apache, which may be overridden using LimitFSIZE; its primary purpose is to keep log files from running away.
- If uploaded from a PHP app, per site, by overriding resources.file_size.
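To interpret the dd test above without writing 5 GB of zeroes, the shell reports the effective cap directly. A small sketch — ulimit -f counts in 512-byte blocks, so the 4 GB default shows up as 8388608:

```shell
# Report the effective per-process file-size cap inside the account.
blocks=$(ulimit -f)
if [ "$blocks" = "unlimited" ]; then
  echo "no file-size cap in effect"
else
  # blocks * 512 bytes / 1048576 = MiB, i.e. blocks / 2048
  echo "cap: $((blocks / 2048)) MiB"
fi
```

When the cap is hit, dd aborts with “File size limit exceeded” rather than writing a short file.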
…
Ah great; as it should be. Thanks for confirming!