I'm trying to write a cronjob which launches multiple processes that I want to run in parallel.
I'm using a foreach to call each command, but the command line waits for each one's output before moving on. I don't want it to wait.
Has anyone used a library for this?
Add an ampersand after the command:
$ php task.php &
This runs that instance of PHP in the background and lets the shell continue.
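For example, a minimal sketch of the foreach from the question, assuming exec() and an illustrative list of commands; redirecting output and appending & lets each one run in parallel:
$commands = array('php task1.php', 'php task2.php'); // illustrative commands
foreach ($commands as $command) {
    // redirect output and background the process so exec() returns immediately
    exec($command . ' > /dev/null 2>&1 &');
}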
If you read the manual on passthru, you'll notice it tells you how to avoid this...
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
So you can use UNIX file descriptors to redirect the output: to /dev/null if you don't care about it, or to a file if you want to save it. Either way, PHP will not wait for the command to finish.
passthru("somecommand > /some/path/to/file");
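Or, if you want to discard the output and background the command entirely (the command name is illustrative):
// discard output and detach so PHP returns immediately
passthru("somecommand > /dev/null 2>&1 &");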
Related
I currently use nohup to run a long PHP script and redirect the live results to a file using this command:
nohup php long_file.php >logs 2>&1 &
so I just keep checking the logs file to see the results.
Now I want to do the exact same thing using another PHP file to execute the above command.
I tried the above command with PHP's exec(), but the output redirection doesn't seem to work.
I know I could just retrieve the output with PHP and store it using any file-write function, but the output is too long; that's why I keep it running in the server's background.
A similar question: Shell_exec php with nohup, but it had no answer.
Any solution?
Please try with -q:
nohup php -q long_file.php >logs 2>&1 &
http://ubuntuforums.org/showthread.php?t=977332
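The same command should work from the other PHP file with exec(), as long as the full redirection is kept; without it, exec() blocks until the child exits. A sketch, with the file names from the question:
// nohup keeps the child alive after the parent exits; the redirects
// and trailing & let exec() return immediately
exec('nohup php -q long_file.php > logs 2>&1 &');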
Did you try passthru instead of exec?
You are redirecting STDOUT to a file by using >
This will truncate the file each time the script is run. If two scripts simultaneously redirect their output to the same file, the one started last will truncate the output from the first script.
If you really want to properly append to a log file with multiple concurrent running scripts, consider using >> to avoid having the log file truncated.
The side effect, however, is that the log file is never truncated and keeps growing, so if it gets really large, consider including it in a logrotate scheme.
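For example, the earlier command with append-mode redirection:
nohup php long_file.php >> logs 2>&1 &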
I have this set in my PHP script to make it supposedly run as long as it needs to, parsing and doing MySQL queries and fetching images for over 100,000 rows:
ignore_user_abort(true); // keep running even if the client disconnects
set_time_limit(0);       // no execution time limit
#begin logging output
error_reporting(E_ALL);
ini_set('memory_limit', '512M');
I run the command like this in shell:
nohup php myscript.php > output.txt
After running for about 8 to 10 hours, the script will still be running, but execution just stops: no more output. It's not a zombie process (I checked top), and it hasn't hit the memory limit either; if it had, wouldn't it exit?
What is going on? It's a real pain to babysit this script and write custom code to nudge it along. I read up on cleaning up zombies in Unix, but it's not a zombie. I know it's not PHP settings, and it's not running through a webserver, only from the command line, so what gives?
It looks like you haven't detached your process correctly. Currently, if your process's parent dies, your process will die too. If you place your process in the background (create a real daemon), you won't run into such trouble.
You can execute your PHP script this way to really detach it:
php myscript.php > output.txt 2>&1 &
For your information:
> output.txt
will redirect standard output (i.e. your echo, print, etc.) to the output.txt file.
2>&1
will redirect error output to standard output, writing it to the same output.txt file.
&
is the most important thing in your case: it will detach your process to create a real daemon.
Edit: if you're having trouble when disconnecting your shell, the simplest fix is to put your command in a bash script, for example run.sh:
#!/bin/bash
php myscript.php > output.txt 2>&1 &
And you'll run your script this way :
bash run.sh &
That way, your shell will "think" your program has ended when the shell script ends, not when the PHP daemon does.
Long-running PHP scripts shouldn't die or hang without reason. I've had scripts run continuously for 6+ months. There must be something else going on inside your script body.
I know I should use a comment to answer this, but I don't have enough reputation to do it...
Maybe your process is consuming 100% of the CPU; I had a similar issue with a while loop that never called sleep() or usleep() at the end of each iteration.
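For example, a busy loop like this (do_work() is hypothetical) will pin a CPU core unless it sleeps:
while (true) {
    do_work();      // hypothetical unit of work
    usleep(100000); // yield the CPU for 100 ms each iteration
}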
How can I run a non blocking system call in PHP?
The system call will invoke a streaming service run by a second PHP script, so my page sits and waits on this call.
My two thoughts on a solution:
1: There exists a native method/parameter to execute a system call in a non-blocking way.
2: Run system() on a new C++ program that will then fork itself and run the actual PHP script on a separate thread.
Is there a native method of executing system calls in a non-blocking manner, or do I need to hack around this?
I currently have shell_exec('nohup php /path/to/file.php &') but it still blocks.
From the PHP manual:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
An example is provided in a comment on the same page (linux based):
If you want to start a php process that continues to run independently from apache (with a different parent pid), use nohup. Example:
exec('nohup php process.php > process.out 2> process.err < /dev/null &');
I need to run a Python script in the background after being called from a PHP file. The PHP file should continue to run independently of the Python script (i.e. it shouldn't hang waiting for the Python script to finish processing, but should instead carry on processing itself).
The Python script takes one argument and produces no output (it merely processes some data in the background), then exits. I'm running Python 2.6, PHP 5.2.6, and Ubuntu 9.04.
You could use exec() to kick off the Python interpreter and have it send its output either to a file or to /dev/null with redirection. Using the & operator in the exec call will cause the command to be started and PHP to continue without waiting for a result.
http://www.developertutorials.com/tutorials/php/running-background-processes-in-php-349/ goes into more detail.
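A minimal sketch under those assumptions (the script name and argument are illustrative):
$arg = 'some-data'; // the single argument the Python script expects
// discard all output and background the process so exec() returns immediately
exec('python script.py ' . escapeshellarg($arg) . ' > /dev/null 2>&1 &');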
PHP Process Control can be used for this. The proc_open command can be used to start a process. You can later check up on it, read its output, etc.
View the manual entry: http://www.php.net/manual/en/function.proc-open.php and search around google for PHP Process Control
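A minimal proc_open() sketch (the command and the descriptor choices are assumptions):
$descriptors = array(
    0 => array('pipe', 'r'),              // child's stdin
    1 => array('pipe', 'w'),              // child's stdout
    2 => array('file', '/dev/null', 'a'), // discard stderr
);
$process = proc_open('python script.py', $descriptors, $pipes);
if (is_resource($process)) {
    stream_set_blocking($pipes[1], false); // read stdout without blocking
    // ... carry on with other work; poll proc_get_status($process) to check on it ...
}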
I'm guessing the PHP file is called via Apache, in which case you won't be able to fork(). You should make your Python script daemonize. Check out python-daemon.
You could use:
<?php
shell_exec('./test.sh &');
?>
where ./test.sh is the command line that runs your script.
I want to initiate one PHP page as a background process from another PHP page.
Use popen():
$command = 'php somefile.php';
pclose(popen($command,'r'));
This launches somefile.php as a background process.
This is a technique I used to get around restrictions applied by my webhost (who limited cronjobs to 15 minutes of execution time, so my backup scripts would always timeout).
exec('php somefile.php > /dev/null &');
The breakdown of this line is:
exec() (PHP reference) - Runs the specified command, as if from the Linux command line.
php somefile.php - Invokes PHP to open, and run, somefile.php. This is the same behaviour as what would happen if that file were accessed through a web browser.
> ("redirect") - Sends the output of the preceding command to a specified target. In this instance, it redirects the content which would normally be read by the web browser accessing the file.
/dev/null - A black hole. No, not kidding. It is a place where you send output if you just want it to disappear.
& - Appending this character to the end of a Linux command means "Do not wait - Send this to the background and continue."
So, in summary, the provided code will execute a PHP script, return no output, and not wait for it to finish before continuing onto the next line.
(And, as always, if any of these assumptions on my part are in error, I would love to be corrected by more knowledgeable members of the community.)
You have to make sure that the background process is not terminated when the processing of the page finishes. If you are on a Linux system, you could try to use the nohup command:
$command = 'nohup php somefile.php';
pclose(popen($command,'r'));
If it still gets terminated, you could try the "daemon" command.
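A variant that also discards output and backgrounds the process, so pclose() returns immediately (a sketch combining the ideas above):
$command = 'nohup php somefile.php > /dev/null 2>&1 &';
pclose(popen($command, 'r'));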