So I was wondering if there's a better way to run a PHP script in the background in 2013.
Older answers suggest something like this:
shell_exec('nohup php script.php some_argument another_argument > /dev/null 2>&1 &');
Which seems to work, but it looks weird. And the thing is that I don't have much control over the script I execute. For example, I would like the possibility to terminate it from another request. The only solution I found is to place a file somewhere, like "exit_now.txt", then inside that script check if that file exists and call exit() if it does.
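A minimal sketch of that flag-file workaround (do_work_chunk() is a hypothetical stand-in for the script's real work):

<?php
// Inside the long-running script: poll for the flag file and exit
// cleanly when it appears.
$flag = __DIR__ . '/exit_now.txt';

while (true) {
    clearstatcache();        // file_exists() caches results per path
    if (file_exists($flag)) {
        unlink($flag);       // remove the flag so the next run starts clean
        exit(0);
    }
    do_work_chunk();         // hypothetical: one unit of the script's real work
    sleep(1);
}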
You can send a signal to the invoked process (with the kill command from the shell) and install a signal handler in your program to react however you wish.
Documentation for installing signal handler
Example with pcntl_fork() and pcntl_signal()
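A minimal sketch of a SIGTERM handler, assuming the pcntl extension is available (CLI only; do_work_chunk() is a hypothetical stand-in):

<?php
// `kill <pid>` from the shell sends SIGTERM; this handler turns it
// into a clean exit instead of an abrupt termination.
declare(ticks=1); // let PHP check for pending signals between statements

pcntl_signal(SIGTERM, function ($signo) {
    // flush state, close handles, etc., then stop
    exit(0);
});

while (true) {
    do_work_chunk(); // hypothetical: one unit of the script's real work
    sleep(1);
}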
Related
How can I run a PHP file IN THE BACKGROUND after submitting a form? The loading has to happen in the background since it usually takes a very long time.
Basically it's just like running a cronjob, except I want to trigger it manually and with my browser.
There are several possible ways to do this.
Try setting ignore_user_abort to TRUE in your script.
If changed to TRUE, scripts will not be terminated after the client has aborted the connection.
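One common shape of that approach (do_long_job() is a hypothetical stand-in for the slow work):

<?php
ignore_user_abort(true);  // don't die when the client disconnects
set_time_limit(0);        // no execution time limit

ob_start();
echo 'Job started';
header('Connection: close');
header('Content-Length: ' . ob_get_length());
ob_end_flush();
flush();                  // the browser now has its full response

do_long_job();            // the script keeps running after the client left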
Take a look at popen() and pclose(). On Windows you can do something like this:
pclose(popen("start php /path/to/myscript.php", "r"));
You can kick off a separate PHP process with a system() or exec() call. Something like this:
system('php /path/to/myscript.php >/dev/null 2>&1 &');
Start the request using AJAX. The browser will continue running while waiting for a response. You can even show a popup or some information when the request is finished, although you don't have to.
I noticed exec() and shell_exec() are causing perpetual loading.
Basically, I'm trying to do something as simple as loading a PHP script in the background. When I try to do that, it just loads and loads.
My code is as follows
exec('php test.php -- '.escapeshellarg($param1).' > /dev/null ');
I first thought it was my other script, so I pointed it to a file with just:
echo $argv[1];
But it still loads perpetually.
Don't wait for the process to exit
exec() waits for the process to return an exit code, so your request blocks until test.php finishes. The link I provided above may help you.
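For example, a version of the call from the question that returns immediately (note the added 2>&1 and trailing &):

// Redirect stdout and stderr and background the process so exec()
// doesn't block waiting for an exit code.
exec('php test.php -- ' . escapeshellarg($param1) . ' > /dev/null 2>&1 &');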
Oh, and since you tagged Linux for whatever reason, I assume you're on a Linux distro.
You could consider this as well:
http://ca1.php.net/pcntl_fork
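A minimal pcntl_fork() sketch, again assuming the pcntl extension (CLI only; do_long_job() is a hypothetical stand-in):

<?php
$pid = pcntl_fork();
if ($pid === -1) {
    die('fork failed');
} elseif ($pid === 0) {
    // child: do the long-running work
    do_long_job();
    exit(0);
}
// parent: returns immediately while the child keeps running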
I have this set in my PHP script so that it can supposedly run as long as it needs to parse rows, run MySQL queries, and fetch images for over 100,000 rows.
ignore_user_abort(true);
set_time_limit(0);
#begin logging output
error_reporting(E_ALL);
ini_set('memory_limit', '512M');
I run the command like this in shell:
nohup php myscript.php > output.txt
After running for about 8 to 10 hours, the script will still be running but execution just stops: no more output. It's not a zombie process (I checked top), and it hasn't hit the memory limit either; if it had, wouldn't it exit?
What is going on? It's a real pain to babysit this script and write custom code to nudge it along. I read up on Unix cleaning up zombies, but it's not a zombie. I know it's not PHP settings, and it's not running through a web server; it runs from the command line only. So what gives?
It looks like you haven't detached your process correctly. Currently, if your process's parent dies, your process will die too. If you place your process in the background (creating a real daemon), you won't run into such trouble.
You can execute your PHP this way to really detach it:
php myscript.php > output.txt 2>&1 &
For your information:
> output.txt
will redirect standard output (i.e. your echo, print, etc.) to the output.txt file;
2>&1
will redirect error output to standard output, writing it to the same output.txt file;
&
is the most important thing in your case: it puts your process in the background so the shell does not wait for it to finish.
Edit: if you're having trouble when disconnecting from your shell, the simplest approach is to put your command in a bash script, for example run.sh:
#!/bin/bash
php myscript.php > output.txt 2>&1 &
And you'll run your script this way :
bash run.sh &
In such a case, your shell will "think" your program has ended at the end of the shell script, not at the end of the PHP daemon.
Long-running PHP scripts shouldn't die or hang without reason. I've had scripts run continuously for 6+ months. There must be something else going on inside your script body.
I know I should use a comment to answer this, but I don't have enough reputation to do it...
Maybe your process is consuming 100% of the CPU. I had a similar issue with a while loop that never called sleep() or usleep() at the end of the loop.
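For example (job_done() and process_item() are hypothetical placeholders):

<?php
// Without the usleep(), this loop pins one CPU core at 100%.
while (!job_done()) {
    process_item();   // one unit of work
    usleep(100000);   // sleep 100 ms, yielding the CPU
}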
I know there's been similar questions but they don't solve my problem...
After checking out the folders from the repo (which works fine).
A method is called from jQuery to execute the following in PHP:
exec ('svn cleanup '.$checkout_dir);
session_write_close(); //Some suggestion that was supposed to help but doesn't
exec ('svn commit -m "SAVE DITAMAP" '.$file);
These would output the following:
svn cleanup USER_WORKSPACE/0A8288
svn commit -m "SAVE DITAMAP" USER_WORKSPACE/0A8288/map.ditamap
1) The first line (exec('svn cleanup ...')) executes fine.
2) As soon as I call svn commit, my server hangs and everything goes to hell.
The apache error logs show this error:
[notice] Child 3424: Waiting 240 more seconds for 4 worker threads to finish.
I'm not using the php_svn module because I couldn't get it to compile on windows.
Does anyone know what is going on here? I can execute the exact same cmd from the terminal windows and it executes just fine.
Since I cannot find any documentation on jQuery exec(), I assume this is calling PHP. I copied this from the documentation page:
When calling exec() from within an apache php script, make sure to take care of stdout, stderr and stdin (as in the example below). If you forget this and your shell command produces output, the sh and apache daemons may never return (they will normally time out after a few minutes). From the calling web page the script may seem to not return any data.
If you want to start a php process that continues to run independently from apache (with a different parent pid), use nohup. Example:
exec('nohup php process.php > process.out 2> process.err < /dev/null &');
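Applied to the svn commit from the question, that pattern might look like this (a sketch, reusing $file from the question):

// Redirect output so the sh and apache processes aren't left waiting.
exec('svn commit -m "SAVE DITAMAP" ' . escapeshellarg($file) . ' > /dev/null 2>&1');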
hope it helps
Okay, I've found the problem.
It actually didn't have anything to do with exec running in the background, especially because a one-file commit doesn't take a lot of time.
The problem was that the commit was expecting a --username and --password that didn't show up, and just caused apache to hang.
To solve this, I changed svnserve.conf in the folder where I installed svn to give non-authenticated users write access.
I don't think you'd normally want to do this, but my site already authenticates the user name and pass upon logging in.
Alternatively, you could pass --username and --password directly in the exec() call.
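Something like this (a sketch; $user and $pass are placeholders for however you store credentials):

// Supply credentials explicitly so svn never prompts and never hangs.
exec('svn commit -m "SAVE DITAMAP" --username ' . escapeshellarg($user)
   . ' --password ' . escapeshellarg($pass) . ' '
   . escapeshellarg($file) . ' > /dev/null 2>&1');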
How can I run a non blocking system call in PHP?
The system call will invoke a streaming service run by a second PHP script, so my page sits and waits on this call.
My two thoughts on a solution:
1: There exists a native method or parameter to execute a system call in a non-blocking way.
2: Run system() on a new C++ program that forks itself and runs the actual PHP script on a separate thread.
Is there a native method of executing system calls in a non-blocking manner, or do I need to hack around this?
I currently have shell_exec('nohup php /path/to/file.php &') but it still blocks.
From PHP manual:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
An example is provided in a comment on the same page (Linux-based):
If you want to start a php process that continues to run independently from apache (with a different parent pid), use nohup. Example:
exec('nohup php process.php > process.out 2> process.err < /dev/null &');
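Applied to the shell_exec() call above, the missing piece is the output redirection:

shell_exec('nohup php /path/to/file.php > /dev/null 2>&1 &');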