PHP Warning: exec(): Unable to fork [rm some_file.txt] in some.php on line 111
A similar question has been asked before about this subject: PHP Warning: exec() unable to fork. My problem looks similar, but it is not the same.
ulimit -a
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 31364
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 1024
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 10240
cpu time (seconds, -t) unlimited
max user processes (-u) 31364
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
My limits are shown above, and it looks like no server limit is low enough to cause this error.
I tried unsetting variables after using them, both with unset() and by setting them to null, to free up memory, but it had no effect.
unset($var);
$var = null;
The "Unable to fork" error occurs because some resource is exhausted, but I can't find which one. Can you suggest which logs I should look at?
Any ideas or workaround for this problem?
The problem is likely a flaw in your code, like it was in https://stackoverflow.com/a/20649541/2038383. So the workaround is fixing it.
Can you suggest which logs I should look at?
There are your PHP logs, then your system / kernel logs.
You already know where to get the PHP log and what is in it by default. Unfortunately, you're not going to get much more out of PHP. You could catch the error yourself with set_error_handler(), but that won't give you any more useful info (it'll give you PHP's error number, not UNIX's errno).
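If you want the failure in your own logs rather than just PHP's, a minimal sketch along these lines should work (matching on the 'Unable to fork' string is an assumption about the warning text staying stable):

<?php
// Intercept exec()'s E_WARNING ourselves; this still only yields PHP's
// message, never the underlying UNIX errno.
set_error_handler(function ($errno, $errstr, $errfile, $errline) {
    if (strpos($errstr, 'Unable to fork') !== false) {
        error_log("fork failure at $errfile:$errline: $errstr");
    }
    return false; // fall through to PHP's normal error handling
});
exec('rm some_file.txt');
restore_error_handler();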
As for system logs: I said in the comments to check your syslog. There might be something in there, and it's always a good starting point, but you won't generally see ulimit violations in syslog. Some get logged (for example, a stack size violation generates a segfault, which gets logged), but many won't. This post deals with how to get logs of ulimit violations: https://unix.stackexchange.com/questions/139011/how-do-i-configure-logging-for-ulimits. It is surprisingly non-trivial.
The way ulimit violations are supposed to be reported by a system call is by setting an errno. For example, if max user processes is hit, fork() will return EAGAIN.
So you need to get at that UNIX errno to know what is really going on. Unfortunately, I don't think there is a way to do that for exec() in PHP (there is posix_errno(), but I'm pretty sure that is limited to PHP's posix_* function library). Also note that it's PHP generating the "Unable to fork" message; how that maps to the actual system call error is not completely transparent.
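One partial workaround, assuming the pcntl extension happens to be available (CLI builds often have it): reproduce the fork pressure with pcntl_fork(), which, unlike exec(), does surface the errno:

<?php
// Sketch only: pcntl_fork() exposes the real errno on failure.
$pid = pcntl_fork();
if ($pid === -1) {
    // e.g. "Resource temporarily unavailable" when EAGAIN is hit
    echo pcntl_strerror(pcntl_get_last_error()), "\n";
} elseif ($pid === 0) {
    exit(0); // child: exit immediately
} else {
    pcntl_waitpid($pid, $status); // parent: reap the child
}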
Failing that, you're best off looking at other ways to debug, of which there are plenty. System monitoring tools such as ps, dstat and strace might be a good start.
Related
I have no idea why or how this came to be, but for some odd reason, PHP scripts on my server crash completely once they use ini_set() to change the memory_limit setting. No error messages, nothing. If I call the script through the browser, all I get is a blank page.
Any hints on this?
Update:
running 'free' returns
total used free shared buffers cached
Mem: 8190820 7922056 268764 0 565124 6598656
-/+ buffers/cache: 758276 7432544
Swap: 2102456 0 2102456
Is something hogging my memory?
Running ps aux | grep apache gives me 'ERROR: Unsupported option (BSD syntax)'.
Checking manually, I found a whole bunch of lines referring to:
/usr/sbin/apache2 -k start
All at about 0.3% memory usage and owned by 'www-data'.
The scary part is that none of the processes listed by 'ps aux' uses more than 0.8% of the memory, and if I add up all the percentages listed, I never arrive at what 'free' is telling me.
I seem to remember there being a problem with requesting anything over 2GB. I think 2GB is the magic cut-off in at least some versions of PHP.
Try this code (-1 removes PHP's memory limit entirely):
ini_set('memory_limit', '-1');
I'm trying to increase the allowed memory for a certain PHP script. No matter what I do, for instance this:
ini_set('memory_limit', '512M');
... the script always runs out of memory at around 300MB:
Fatal error: Out of memory (allocated 25165824) (tried to allocate 343810589 bytes) in \\Foo\phpQuery\phpQuery.php on line 255
I've verified by several means that memory_limit is actually changed. The issue seems to be that PHP can't physically allocate a total of ~352 MB of memory (25165824 bytes + 343810589 bytes = 352 MB).
I've tried both PHP/5.3.0 and PHP/5.3.9 in two different Windows-based computers with the following specs:
Windows XP / Windows Server 2003 (both computers are 32-bit boxes with 1GB of RAM)
Official PHP 32-bit VC9 binaries
Running as Apache 2.2 module (third-party 32-bit VC9 binaries)
I understand that using half of the physical RAM will force swapping and slow things down like hell, but I just need to make sure the script actually works so it can be deployed to the live server. I've also tried larger values (which produced the same error) and smaller values (which either made my script hit the limit or made Apache crash).
What can be the source of this apparently hard-coded memory limit?
Update #1: I've done further testing with the Windows Server 2003 box (which is actually a VMware virtual machine). I've increased the "physical" RAM to 2 GB and verified that the paging file is allowed to grow up to 1152 MB. Task Manager shows that the current transaction load is 886 MB and there are 1.5 GB of free physical memory. However, I'm getting the same error with exactly the same figures.
Update #2: As I said, the memory_limit directive is fine. It shows up in both ini_get() and phpinfo(). The error message you'd get is slightly different from mine; mine indicates a PHP crash. Please compare:
Out of memory (allocated 25165824) (tried to allocate 343810589 bytes)
Allowed memory size of 25165824 bytes exhausted (tried to allocate 343810589 bytes)
I'll try to compose a script to reproduce the issue and report back.
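A minimal reproduction along these lines, using the byte count from the error above, should behave differently on a healthy setup (succeed within 512M) and on the affected box (die with the same message):

<?php
// Sketch only: mimic phpQuery's large single allocation.
ini_set('memory_limit', '512M');
$big = str_repeat('x', 343810589); // one ~328 MB allocation
echo strlen($big), " bytes allocated\n";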
An OOM error is different from the memory limit warnings.
This means PHP can't actually allocate the memory because insufficient resources are available within your operating system.
You'll need to check the system has sufficient memory/paging available to support this.
Try max_input_time; sometimes when PHP says memory_limit it actually means max_input_time (-1 is infinite for this one).
I had a similar problem, with out-of-memory errors popping up at numbers as low as 250MB. If you find the Apache config file that controls ThreadsPerChild (for me it was /conf/extra/httpd-mpm.conf) and reduce ThreadsPerChild from 150 to 50 or so, you should see a noticeable improvement. Here's a script to test it out:
echo "Memory limit: ".ini_get("memory_limit")."<br><br>";
$a=array();
if (ob_get_level() == 0) ob_start();
for($i=0;$i<200;$i++)
{
$a[]=str_pad('',1024*1024*32);
echo "Pass ".$i.", memory used: ".number_format((memory_get_usage())/(1024*1024),0)." MB<br>";
ob_flush();
flush();
}
I'm using a PHP script to create image thumbnails, and this error is thrown while creating some of the thumbs:
Fatal error: Allowed memory size of 31457280 bytes exhausted (tried to allocate 227 bytes)
This is what top shows:
top - 07:43:49 up 44 days, 22:21, 1 user, load average: 0.00, 0.00, 0.00
Tasks: 171 total, 1 running, 170 sleeping, 0 stopped, 0 zombie
Cpu(s): 0.0%us, 0.2%sy, 0.0%ni, 99.7%id, 0.2%wa, 0.0%hi, 0.0%si, 0.0%st
Mem: 6097648k total, 3459060k used, 2638588k free, 566924k buffers
Swap: 4194296k total, 0k used, 4194296k free, 1991920k cached
I haven't looked at optimizing the phpThumb code, but is there any other way to free the already used memory? Maybe a cron job could be used to free this memory at regular intervals?
Your image is probably larger than ~10-15MB. PHP has a limit on the amount of memory it can use per script (memory_limit in php.ini).
What happens is that you load an image into memory (and then resize it, creating a second image)...
Change the memory limit if you're allowed to, or don't load such a large image...
AFAIK there is no streaming image reader...
If you can't change the memory limit, a workaround might be calling the command-line ImageMagick or GraphicsMagick tools, if they're installed...
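A sketch of that command-line route, assuming ImageMagick's convert is on the PATH (the paths and the 150x150 geometry here are illustrative only):

<?php
// Delegate thumbnailing to ImageMagick so the pixels never enter PHP's memory.
$src = escapeshellarg('/path/to/original.jpg');
$dst = escapeshellarg('/path/to/thumb.jpg');
exec("convert $src -thumbnail 150x150 $dst 2>&1", $output, $status);
if ($status !== 0) {
    error_log('convert failed: ' . implode("\n", $output));
}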
This is a typical php.ini problem. If you are running this script on a VPS or a dedicated server, edit the php.ini file and set memory_limit to 99 MB (or more); also look out for max_execution_time, as that can stop a script after a certain number of seconds.
Don't forget to restart Apache after you have made the changes.
If you are running this on a shared server, you might have trouble solving this, as you can't edit the settings file. You can try to set the settings in the actual script, but this usually doesn't work.
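For reference, the relevant php.ini directives look like this (the values here are only examples, not recommendations):

; php.ini — illustrative values
memory_limit = 128M
max_execution_time = 120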
I need to run a PHP CLI script and give it a LOT of memory (it's an automatic documentation generator that needs to cover a large code base). I have a powerful machine and have allocated 5GB to PHP, both in php.ini and in an inline ini_set('memory_limit', '5120M') call in the script file.
If I add these lines to the top of the script:
phpinfo();
exit();
... it claims that it has a memory limit of 5120M, as I specified.
But the script still errors out, saying
Fatal error: Allowed memory size of 1073741824 bytes exhausted
... which is 1GB, not 5GB as I specified.
Is there any other place that the script might be looking to determine its memory limit? This is running in Fedora Linux.
(Yes, the ultimate solution may be to rewrite the script to be more efficient, but I didn't write it in the first place, so before I resort to that, I want to just throw resources at it and see if that works.)
The heap limit property is a size_t, which is 32 bits on a 32-bit machine; if it is in bytes, that would cap the memory limit at 4 GB. You may try running it on a 64-bit machine with 64-bit PHP.
Edit: confirmed, heap->limit is a size_t (unsigned int) and is in bytes. A memory_limit of -1 sets heap->limit to 4GB and does not disable it as the documentation implies. Setting it to 5GB makes it wrap around to 1GB.
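The arithmetic matches the error message exactly. A quick illustration of the wraparound (run on 64-bit PHP, where integers are wide enough to show it):

<?php
// 5120M truncated to the low 32 bits of a size_t is exactly 1 GB.
$requested = 5 * 1024 * 1024 * 1024;  // 5368709120 bytes (5 GB)
$wrapped   = $requested % pow(2, 32); // keep only the low 32 bits
echo $wrapped; // 1073741824 — the 1 GB figure from the fatal error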
I've got a PHP script that I call to run MySQL database backups to .sql files, TAR/GZip them, and e-mail them to me. One of the databases is hosted by a different provider than the one hosting the web server. Everything is hosted on Linux/Unix. When I run this command:
$results = exec("mysqldump -h $dbhost -u $dbuser -p$dbpass $dbname > $backupfile", $output, $retval);
(FYI, I've also tried this with system(), passthru() and shell_exec().)
My browser loads the page for 15-20 seconds and then stops without processing. When I look at the server with an FTP client, I can see the resulting file show up a few seconds later, and the file size then builds until the database is backed up. So the backup file is created, but the script stops working before the file can be compressed and sent to me.
I've checked the max_execution_time variable in PHP and it's set to 30 seconds (longer than it takes for the page to stop working), and I have set set_time_limit() to as much as 200 seconds.
Anyone have any idea what's going on here?
Are you on shared hosting, or are these your own servers? If the former, your hosting provider may have set the max execution time to 15-20 seconds and made it so it cannot be overridden (I have this problem with 1&1 and these types of scripts).
Re-check the execution-time-related parameters with a phpinfo() call... maybe it's all about what Paolo writes.
It could also be a (reverse) proxy that is giving up after a certain period of inactivity. Granted, it's a long shot, but anyway... try
// test A
$start = time();
sleep(20);
$stop = time();
echo $start, ' ', $stop;
and
// test B
for($i=0; $i<20; $i++) {
sleep(1);
echo time(), "\n";
}
If the first one times out and the second doesn't, I'd call that not proof, but evidence.
Maybe the provider has set another resource limit beyond the php.ini setting.
Try
<?php passthru('ulimit -a');
If the command is available, it should print a list of resources and their limits, e.g.:
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 4095
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 1024
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 4095
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
Maybe you'll find some settings more restrictive than these on your shared server.
Do a manual dump and diff it against the broken one. This may tell you at which point mysqldump stops/crashes.
Consider logging mysqldump's output, as in mysqldump ... 2>/tmp/dump.log
Consider executing mysqldump detached, so that control is returned to PHP before the dump is finished (see the sketch below).
On a side note, it is almost always a good idea to use mysqldump -Q.
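A sketch of the detached approach, reusing the question's variables (the log path is illustrative): with output redirected and the process backgrounded, exec() returns immediately instead of holding the request open.

<?php
// Build the dump command; redirect stdout to the backup file and stderr to a
// log, then background it with & so PHP does not wait for it to finish.
$cmd = sprintf(
    'mysqldump -Q -h %s -u %s -p%s %s > %s 2>/tmp/dump.log &',
    escapeshellarg($dbhost),
    escapeshellarg($dbuser),
    escapeshellarg($dbpass),
    escapeshellarg($dbname),
    escapeshellarg($backupfile)
);
exec($cmd);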