How can I run a non-blocking system call in PHP?
The system call will call a streaming service run by a second PHP script, so my page sits and waits on this call.
My two thoughts on a solution:
1: There exists a native method/parameter to execute a system call in a non-blocking way.
2: Run system() on a new C++ program that then forks itself and runs the actual PHP script on a separate thread.
Is there a native method of executing system calls in a non blocking manner or do I need to hack around this...
I currently have shell_exec('nohup php /path/to/file.php &'), but it still blocks.
From the PHP manual:
If a program is started with this function, in order for it to
continue running in the background, the output of the program must be
redirected to a file or another output stream. Failing to do so will
cause PHP to hang until the execution of the program ends.
An example is provided in a comment on the same page (Linux-based):
If you want to start a php process that continues to run independently from apache (with a different parent pid) use nohup. Example:
exec('nohup php process.php > process.out 2> process.err < /dev/null &');
Related
I have 2 websites, hosted on 2 different servers. They are kind of interlinked. Sometimes I just do stuff on Website-1 and run a script on Website-2. Like I edited something on Website-1 and now I want to run a script on Website-2 to update accordingly on its server.
Till now I have been using the following code on Website-1.
$file = file_get_contents('Website-2/update.php');
But the problem with this is that my Website-1 script stops and waits for the file to return some data, and I don't want to do anything with that data. I just want to run the script.
Is there a better way to do this, or a way to tell PHP to move on to the next line of code?
If you want to call the second site without making your user wait for a response,
I would recommend using a message queue.
Site 1 request would put a message to the queue.
Cron job to check the queue and run the update on site 2 when a message exists (see the sketch after the list of queue apps below).
Common queue apps to look at:
https://aws.amazon.com/sqs/?nc2=h_m1
https://beanstalkd.github.io/
https://www.iron.io/mq
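A sketch of the pattern, assuming beanstalkd with the Pheanstalk 4 client (the tube name, payload, and runUpdate() helper are illustrative assumptions, not part of the answer above):
<?php
use Pheanstalk\Pheanstalk;

// --- On site 1: enqueue a message instead of calling site 2 directly ---
$queue = Pheanstalk::create('queue.example.com');
$queue->useTube('site2-updates');
$queue->put(json_encode(array('action' => 'update')));

// --- On site 2: a cron-driven worker drains the queue ---
$queue = Pheanstalk::create('queue.example.com');
$queue->watch('site2-updates');
while ($job = $queue->reserveWithTimeout(1)) {
    runUpdate(json_decode($job->getData(), true)); // hypothetical helper
    $queue->delete($job);                          // remove the finished job
}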
What you're trying to achieve is called a web hook and should be implemented with proper authentication, so that not anybody can execute your scripts at any time and overload your server.
On server 2 you need to execute your script asynchronously via workers, threads, message queues or similar.
You can also run the command asynchronously from your server 1. There are many ways to achieve this; here are some links with more on this, plus a small sketch after them.
Async curl request in PHP
https://segment.com/blog/how-to-make-async-requests-in-php/
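A fire-and-forget sketch of the idea in those links, using a raw socket so PHP never waits for the response (host and path are placeholders):
<?php
// Open a socket to the remote server, send the HTTP request,
// and close without reading the response.
$fp = fsockopen('website-2.example.com', 80, $errno, $errstr, 1);
if ($fp) {
    $out  = "GET /update.php HTTP/1.1\r\n";
    $out .= "Host: website-2.example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    fclose($fp); // do not wait for the response
}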
Call your remote server as normal. But in the PHP script you normally call, take all the functionality and put it in a third script. Then, from the old script, call the new one with (on Linux):
exec('php -f "{path to new script}.php" ' . $args . ' > /dev/null &');
The & at the end makes this a background or non-blocking call. Because you call it from the remote server, you don't have to change anything on the calling server. The php -f runs a php file. The > /dev/null discards that script's output.
On Windows you can use COM and WScript.Shell to do the same thing:
$WshShell = new \COM('WScript.Shell');
// 0 = hidden window, false = don't wait for the command to finish
$oExec = $WshShell->Run('cmd /C php {path to new script}.php', 0, false);
You may want to use escapeshellarg on the filename and any arguments supplied.
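For example, a quick sketch (the variable names here are illustrative):
<?php
$scriptPath = '/var/www/jobs/update.php';
$userInput  = isset($_GET['id']) ? $_GET['id'] : '';
// quote the path and argument so shell metacharacters can't break out
$cmd = 'php -f ' . escapeshellarg($scriptPath) . ' '
     . escapeshellarg($userInput) . ' > /dev/null &';
exec($cmd);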
So it will look like this:
1: Server1 calls Server2.
2: The script that was called (on Server2) runs exec and kicks off a background job (on Server2), then exits.
3: Server1 continues as normal.
4: Server2 continues the background process.
So using your example instead of calling:
file_get_contents('Website-2/update.php');
You will call
file_get_contents('Website-2/update_kickstart.php');
In update_kickstart.php put this code
<?php
exec('php -f "{path}update.php" > /dev/null &');
Which will run update.php as a separate background (non-blocking) call. Because it's non-blocking, update_kickstart.php will finish and return to Server1, which can go about its business while update.php runs on Server2 independently.
Simple...
The last note is that file_get_contents is a poor choice. I would use SSH, probably via phpseclib 2.0, to connect to Server2 and run the exec command directly with a user that has access only to that file (chroot it or something similar). As it is, anyone can call that file and run it. Behind an SSH login it's protected, and with that "special" user chrooted, it can only run that one file.
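A rough sketch of that SSH approach with phpseclib 2.0 (host, user, and key path are placeholders):
<?php
use phpseclib\Net\SSH2;
use phpseclib\Crypt\RSA;

$key = new RSA();
$key->loadKey(file_get_contents('/path/to/deploy_key'));

$ssh = new SSH2('server2.example.com');
if (!$ssh->login('deploy', $key)) {
    exit('SSH login failed');
}
// kick off update.php in the background and return immediately
$ssh->exec('nohup php /home/deploy/update.php > /dev/null 2>&1 &');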
I'm trying to write a cronjob which launches multiple processes that I want to run in parallel.
I'm using a foreach to call each command, but the command line waits for the output. I don't want it to wait.
Was wondering if anyone ever used any library for this?
Add an ampersand after the command:
$ php task.php &
It will run that instance of php in the background and continue.
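Applied to the cron job, a minimal sketch (the command list is illustrative):
<?php
$commands = array(
    'php task.php alpha',
    'php task.php beta',
    'php task.php gamma',
);
foreach ($commands as $cmd) {
    // redirect output and append & so exec() returns immediately
    exec($cmd . ' > /dev/null 2>&1 &');
}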
If you read the manual on passthru you'll notice it tells you how to avoid this...
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
So you can use UNIX file descriptors to redirect output, for example to /dev/null if you don't care about the output, or to some file if you do want to save it. This will avoid PHP waiting on the command to finish:
passthru("somecommand > /some/path/to/file");
I have a problem with PHP passthru() blocking when it is supposed to start a daemon.
I have a Node.js daemon with a bash script wrapper around it. That bash script uses a bit of process substitution because the Node.js server can't directly log to syslog. The bash script contains a command like this:
forever -l app.log app.js
But because I want it to log to syslog, I use:
forever -l >(logger) app.js
The logger process substitution creates a file descriptor like /dev/fd/63 whose path is passed to the forever command as the logfile to use.
This works great when I start the daemon using the bash script directly, but when the bash script is executed using PHP passthru() or exec() then these calls block. If I use a regular logfile instead of the process substitution then both passthru() and exec() work just fine, starting the daemon in the background.
I have created a complete working example (using a simple PHP daemon instead of Node.js) on Github's Gist: https://gist.github.com/1977896 (needs PHP 5.3.6+)
Why does the passthru() call block on the process substitution? And is there anything I can do to work around it?
Unfortunately, passthru() will block in PHP even if you start a daemon. I've heard some people have had luck rewriting it with nohup:
exec('/path/to/cmd');
then becomes:
exec('nohup /path/to/cmd &');
Personally, what I've had the most luck with is exec()'ing a wget call to another script (or the same script) that actually runs the blocking exec. This frees the calling process from getting blocked by handing the work to another HTTP process not associated with the live user. With the appropriate flags, wget will return immediately rather than waiting for a response:
exec('wget --quiet --tries=1 -O - --timeout=1 --no-cache http://localhost/path/to/cmd');
The http handler will eventually time out, which is fine and should leave the daemon running. If you need output (hence the passthru() call you're making), just run the script redirecting its output to a file, and then poll that file for changes in your live process.
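A sketch of that redirect-and-poll idea (the output path and polling loop are made up for illustration):
<?php
// start the daemon, sending its output to a known file
exec('nohup /path/to/daemon.sh > /tmp/daemon.out 2>&1 &');

// poll the file for new output from the live process
$lastSize = 0;
for ($i = 0; $i < 10; $i++) {
    clearstatcache();
    $size = file_exists('/tmp/daemon.out') ? filesize('/tmp/daemon.out') : 0;
    if ($size > $lastSize) {
        echo file_get_contents('/tmp/daemon.out', false, null, $lastSize);
        $lastSize = $size;
    }
    sleep(1);
}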
I need to run a Python script in the background after being called from a PHP file. The PHP file should continue to run independently of the Python script (i.e. it shouldn't hang waiting for the Python script to finish processing, but should instead carry on processing itself).
The Python script takes one argument and produces no output (it merely processes some data in the background), then exits. I'm running Python 2.6, PHP 5.2.6, and Ubuntu 9.04.
You could use exec() to kick off the Python interpreter and have it send its output to either a file or to /dev/null with redirection. Using the & operator in the exec call will cause the command to be started and PHP to continue without waiting for a result.
http://www.developertutorials.com/tutorials/php/running-background-processes-in-php-349/ goes into more detail.
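A minimal sketch of that exec() approach (the script path and argument are placeholders):
<?php
$arg = escapeshellarg('data-to-process');
// redirect all output and background the interpreter
exec("python /path/to/script.py $arg > /dev/null 2>&1 &");
// PHP reaches this line immediately, without waiting for Python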
PHP Process Control can be used for this. The proc_open() function can be used to start a process; you can later check up on it, read its output, etc.
View the manual entry: http://www.php.net/manual/en/function.proc-open.php and search around google for PHP Process Control
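A short proc_open() sketch for when you want to check on the process later (paths and the descriptor layout are illustrative):
<?php
$descriptors = array(
    0 => array('pipe', 'r'),                 // child's stdin
    1 => array('pipe', 'w'),                 // child's stdout
    2 => array('file', '/tmp/py.err', 'a'),  // child's stderr
);
$proc = proc_open('python /path/to/script.py myarg', $descriptors, $pipes);
if (is_resource($proc)) {
    fclose($pipes[0]);                      // nothing to send on stdin
    stream_set_blocking($pipes[1], false);  // non-blocking reads
    // ... do other work, reading $pipes[1] occasionally ...
    $status = proc_get_status($proc);       // e.g. check $status['running']
}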
I'm guessing the PHP file is called via Apache, in which case you won't be able to fork(). You should make your Python script daemonize. Check out python-daemon.
You could use:
<?php
shell_exec('./test.sh &');
?>
where ./test.sh should be the command line that runs your script
I want to initiate one PHP page as a background process from another PHP page.
Use popen():
// the trailing & stops pclose() from waiting for the script to finish
$command = 'php somefile.php > /dev/null &';
pclose(popen($command, 'r'));
This launches somefile.php as a background process.
This is a technique I used to get around restrictions applied by my webhost (who limited cronjobs to 15 minutes of execution time, so my backup scripts would always time out).
exec('php somefile.php > /dev/null &');
The breakdown of this line is:
exec() (see the PHP reference) - Runs the specified command, as if from the Linux command line.
php somefile.php: Invokes PHP to open, and run, somefile.php. This is the same behaviour as what would happen if that file was accessed through a web browser.
> ("redirect") - Sends the output of the preceding command to a specified target. In this instance, it redirects the content which would normally be read by the web browser accessing the file.
/dev/null - A blackhole. No, not kidding. It is a place where you send output if you just want it to disappear.
& - Appending this character to the end of a Linux command means "Do not wait - Send this to the background and continue."
So, in summary, the provided code will execute a PHP script, return no output, and not wait for it to finish before continuing onto the next line.
(And, as always, if any of these assumptions on my part are in error, I would love to be corrected by more knowledgeable members of the community.)
You have to make sure that the background process is not terminated when processing of the page finishes. If you are on a Linux system, you could try to use the nohup command:
$command = 'nohup php somefile.php > /dev/null 2>&1 &';
pclose(popen($command, 'r'));
If it still gets terminated, you could try the "daemon" command.
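A hedged one-liner, assuming the libslack daemon utility is installed on the host:
<?php
// "daemon" fully detaches the child from the web server process
pclose(popen('daemon -- php /path/to/somefile.php', 'r'));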