I have the following set in my PHP script so that it can supposedly run as long as it needs to in order to parse, run MySQL queries, and fetch images for over 100,000 rows.
ignore_user_abort(true);
set_time_limit(0);
#begin logging output
error_reporting(E_ALL);
ini_set('memory_limit', '512M');
I run the command like this in the shell:
nohup php myscript.php > output.txt
After running for about 8 to 10 hours, the script will still show as running, but execution just stops: no more output. It's not a zombie process (I checked top). It hasn't hit the memory limit either, and if it had, wouldn't it exit?
What is going on? It's a real pain to babysit this script and write custom code to nudge it along. I read up on Unix and cleaning up zombies, but it's not a zombie. I know it's not the PHP settings, and it's not running through a web server; it runs from the command line only. So what gives?
It looks like you haven't detached your process correctly. As it stands, if your process's parent dies, your process will die too. If you place your process in the background (creating a real daemon), you won't run into this kind of trouble.
You can execute your PHP script this way to really detach it:
php myscript.php > output.txt 2>&1 &
For your information:
> output.txt
will redirect standard output (i.e. your echo, print, etc.) to the output.txt file
2>&1
will redirect error output to standard output, writing it to the same output.txt file
&
is the most important thing in your case: it detaches your process, creating a real daemon.
Edit: if you're having trouble when disconnecting from your shell, the simplest approach is to put the command in a bash script, for example run.sh:
#!/bin/bash
php myscript.php > output.txt 2>&1 &
And you'll run your script this way :
bash run.sh &
In that case, your shell will "think" your program has ended when the shell script finishes, not when the PHP daemon does.
Long-running PHP scripts shouldn't die or hang without reason. I've had scripts that run continuously for 6 months +. There must be something else going on inside of your script body.
I know I should post this as a comment, but I don't have enough reputation to do so...
Maybe your process is consuming 100% of the CPU. I once had this issue with a while loop that never called sleep() or usleep() at the end of each iteration.
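For illustration, a minimal polling loop of that shape (the job source here is a hypothetical stand-in, not the asker's code) would peg a core without the usleep():
<?php
// Hypothetical polling loop: without the usleep() below it spins at 100% CPU.
$queue = array(); // stand-in for a real work source (database table, directory, ...)
while (true) {
    $job = array_shift($queue); // null when there is no work
    if ($job !== null) {
        echo "processing $job\n";
    } else {
        usleep(250000); // yield for 250 ms before polling again
    }
}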
Related
When executing multiple scripts from PHP using the exec command, is each script run one at a time (one after the other), or are they run simultaneously?
exec('/usr/bin/php -q process-duplicates.php');
exec('/usr/bin/php -q process-images.php');
exec('/usr/bin/php -q process-sitemaps.php');
Just want to make sure they run one after the other before attempting to rewrite my crontabs.
Sure: the only way to run one in the background is to add & to the command line, which would put that exec()'d process into the background:
exec("php test.php &");
So you are right, they run one after the other.
NOTE: in your case you shouldn't use &, as it will cause all the scripts to run simultaneously.
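If you want to convince yourself, a throwaway sketch like this (the timing code is mine; the script names are from the question) will show each script starting only after the previous one returns:
<?php
// Each exec() blocks until its command exits, so the elapsed times increase.
$start = microtime(true);
exec('/usr/bin/php -q process-duplicates.php');
printf("duplicates done after %.1fs\n", microtime(true) - $start);
exec('/usr/bin/php -q process-images.php');
printf("images done after %.1fs\n", microtime(true) - $start);
exec('/usr/bin/php -q process-sitemaps.php');
printf("sitemaps done after %.1fs\n", microtime(true) - $start);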
exec waits for the script to return; see php.net:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
But speaking as a devops person: please, please, please do not run your cron jobs like this! Create a crontab entry for each script, or put them in a shell script and have cron run that script.
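For example, the crontab entries could look something like this (hypothetical paths and schedule, to be adapted):
# run the jobs nightly, spaced out so each finishes before the next starts
0 2 * * * /usr/bin/php -q /path/to/process-duplicates.php
30 2 * * * /usr/bin/php -q /path/to/process-images.php
0 3 * * * /usr/bin/php -q /path/to/process-sitemaps.php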
I'm trying to write a cronjob which launches multiple processes that I want to run in parallel.
I'm using a foreach to call each command, but the command line waits for the output. I don't want it to wait.
I was wondering if anyone has ever used a library for this?
Add an ampersand after the command:
$ php task.php &
It will run that instance of php in the background and continue.
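So a foreach that fires everything off in parallel could look like this sketch (the command list is hypothetical; the output redirect matters, as the next answer explains):
<?php
// Each command is detached with &, so exec() returns immediately
// instead of waiting for the command to finish.
$commands = array(
    'php task1.php',
    'php task2.php',
    'php task3.php',
);
foreach ($commands as $cmd) {
    exec($cmd . ' > /dev/null 2>&1 &');
}
echo "all tasks launched\n";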
If you read the manual on passthru you'll notice it tells you how to avoid this...
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
So you can rely on Unix file descriptors to redirect output: to something like /dev/null if you don't care about the output, or to a file if you want to keep it. That stops PHP waiting on the command to finish.
passthru("somecommand > /some/path/to/file");
I currently use nohup to run a long PHP script, redirecting the live output to a file with this command:
nohup php long_file.php >logs 2>&1 &
so I just keep checking the logs file to see the results.
Now I want to do the exact same thing, using another PHP file to execute the above command.
I tried the above command with PHP's exec, and the output redirection doesn't seem to be working.
I know I can just retrieve the output in PHP and store it with any file-write function, but the thing is, the output is too long; that's why I keep it running in the background on the server.
A similar question:
Shell_exec php with nohup, but it had no answer.
Any solution?
Please try it with -q:
nohup php -q long_file.php >logs 2>&1 &
http://ubuntuforums.org/showthread.php?t=977332
Did you try passthru instead of exec?
You are redirecting STDOUT to a file by using >
This will truncate the file each time the script is run. If two scripts simultaneously redirect their output to the same file, the one started last will truncate the output from the first script.
If you really want to append properly to a log file with multiple concurrently running scripts, consider using >> to avoid having the log file truncated.
The side effect, however, is that the log file is never truncated and keeps growing, so if it gets really large you might consider including it in a logrotate scheme.
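Applied to the command from the question above, the appending variant would be:
nohup php long_file.php >> logs 2>&1 &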
On an Apache server I want to run a PHP script as a cron job that starts another PHP file in the background and exits just after starting it, without waiting for the script to complete, since that script will take around 60 minutes to finish. How can this be done?
You should know that there are no threads in PHP.
But you can execute programs and detach them easily if you're running on a Unix/Linux system.
$command = "/usr/bin/php '/path/to/your/php/to/execute.php'";
exec("{$command} > /dev/null 2>&1 & echo -n \$!");
That may do the job. Let's break it down a bit:
exec($command);
executes /usr/bin/php '/path/to/your/php/to/execute.php': your script is launched, but Apache will wait for the execution to finish before running the next line of code.
> /dev/null
will redirect standard output (i.e. your echo, print, etc.) to a virtual file (everything written to it is discarded).
2>&1
will redirect error output to standard output, writing to the same virtual file. This avoids having logs end up in your apache2/error.log, for example.
&
is the most important thing in your case: it detaches the execution of $command, so exec() returns immediately and your PHP code continues.
echo -n \$!
will print the PID of your detached process: it is returned by exec(), which lets you work with it (for example, store the PID in a database and kill the process after some time to avoid zombies).
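A small sketch of that last idea (the timeout policy is made up for illustration; posix_kill requires PHP's posix extension):
<?php
// Launch the worker detached and capture its PID via the trailing echo.
$command = "/usr/bin/php '/path/to/your/php/to/execute.php'";
$pid = (int) exec("{$command} > /dev/null 2>&1 & echo -n \$!");

// ... later, e.g. from a cleanup job: terminate it if it is still alive.
if ($pid > 0 && posix_kill($pid, 0)) { // signal 0 only tests for existence
    posix_kill($pid, 15);              // 15 = SIGTERM
}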
You need to use the "&" symbol to run the program as a background process.
$ php -f file.php &
That will run the command in the background.
You can also write a shell script:
#!/bin/bash
php -f file.php &
and run that script from the crontab.
This may not be the best solution to your specific problem. But for the record, there are threads in PHP:
https://github.com/krakjoe/pthreads
I'm assuming you know how to use threads. This is very young code that I wrote myself, but if you have experience with threads, mutexes and the like, you should be able to solve your problem using this extension.
This is clearly a shameless plug of my own project, and if you don't have the access required to install extensions it won't help you, but many people find Stack Overflow, and it will no doubt solve other people's problems...
I have this in one PHP file:
echo shell_exec('nohup /usr/bin/php -f '.CRON_DIRECTORY.'testjob.php > /dev/null 2>&1 &');
and in testjob.php I have:
file_put_contents('test.txt',time()); exit;
And it all runs just dandy. However, if I look at the process list, testjob.php isn't terminating after it runs.
(Having to post this as an answer instead of a comment, as Stack Overflow still won't let me post comments...)
Works for me. I made testjob.php exactly as described, and another file test.php with just the given line (except I removed CRON_DIRECTORY, because testjob.php was in the same directory for me).
To be sure I was measuring correctly, I added "sleep(5)" at the top of testjob.php, and in another window I have:
watch 'ps a |grep php'
running. This happens:
I run test.php
test.php exits immediately but testjob.php appears in my list
After 5 seconds it disappears.
I wondered if shell might matter, so I switched from bash to sh. Same result.
I also wondered if it might be because your outer script is long-running. So I put "sleep(10)" at the bottom of test.php. Same result (i.e. testjob.php finishes after 5 seconds, test.php finishes 5 seconds after that).
So, unhelpfully, your problem is somewhere other than the code you've posted.
Remove & from the end of your command. That symbol tells nohup to keep running in the background, so shell_exec waits for the task to complete... and waits... and waits... until the end of time ;)
I don't even understand why you would run this command with nohup.
echo shell_exec('/usr/bin/php -f '.CRON_DIRECTORY.'testjob.php > /dev/null 2>&1');
should be enough.
You're executing PHP and making that execution a background task, which means it will run in the background until it finishes; shell_exec will not kill that process or anything of the sort.
You might want to set an execution limit: the PHP CLI defaults to unlimited execution time. See also set_time_limit in the PHP Manual.
So if you wonder why the PHP process does not terminate, you need to debug the script. If that's too complicated and you can't find out why the script runs that long, you might just want to terminate the process after some time, e.g. 1 minute.
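For instance, a hard cap at the top of testjob.php could look like this (a sketch; note that on Linux, set_time_limit() counts only the script's own execution time, so sleep() and time spent in system calls don't count toward the limit):
<?php
// testjob.php with a one-minute execution limit as suggested above.
set_time_limit(60);

file_put_contents('test.txt', time());
exit;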