PHP: how can I call a function in a non-blocking fashion?

I saw a couple of other questions on this issue but no clear answer.
I have a PHP file (it must be PHP; cron or other tools are not an option) running from the CLI, where I must call the same function multiple times with different arguments:
doWork($param1);
doWork($param2);
doWork($param3);
function doWork($data)
{
    //do stuff, write result to db
}
Each call makes HTTPS requests and parses the response. A single call can take up to a minute to complete. I must prevent the "convoy effect": each call must be executed without waiting for the previous one to complete.
PECL pthreads is not an option due to server constraints.
Any ideas?

As far as I know, you cannot do exactly what you are looking for.
Instead of calling a function with its parameters, you have to call another CLI PHP script in a non-blocking manner and put your function in that script.
This is your main script:
callDoWork($param1);
callDoWork($param2);
callDoWork($param3);

function callDoWork($param){
    // If $param contains spaces or other characters that are special to the
    // command line, you have to escape them (e.g. with escapeshellarg()).
    $cmd = 'start "" /b php doWork.php '.$param;
    pclose(popen($cmd, 'r'));
}
doWork.php would look like:
if (is_array($_SERVER['argv'])) $param = $_SERVER['argv'][1];
doWork($param);

function doWork($data)
{
    //do stuff, write result to db
}
More information about $argv is in the PHP manual: http://www.php.net/manual/en/reserved.variables.argv.php
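Note that start "" /b is Windows-specific. Below is a rough, untested sketch of how the launcher above could pick the right form per platform; the doWork.php name is taken from the answer, and the escaping is done with escapeshellarg():

function callDoWork($param)
{
    $arg = escapeshellarg($param);
    if (strtoupper(substr(PHP_OS, 0, 3)) === 'WIN') {
        // Windows: 'start "" /b' detaches the child process
        pclose(popen('start "" /b php doWork.php ' . $arg, 'r'));
    } else {
        // Unix-like: discard output and background with '&' so the call returns immediately
        exec('php doWork.php ' . $arg . ' > /dev/null 2>&1 &');
    }
}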

How about adding "> /dev/null 2>/dev/null &"?
exec('php myFile.php > /dev/null 2>/dev/null &');
You can check the exec() documentation for more: http://www.php.net/manual/en/function.exec.php
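For reference, here is what each part of that command line does, assuming a POSIX shell:

// '> /dev/null'  discards stdout, so PHP does not wait to read the output
// '2>/dev/null'  discards stderr as well ('2>&1' would merge it into stdout)
// '&'            puts the command in the background, so exec() returns at once
exec('php myFile.php > /dev/null 2>/dev/null &');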

Related

Call a .phar executable from inside a web app controller

I need to run composer.phar update from a web controller.
I can run all kinds of regular commands in this way (ls, cp, etc) but when I invoke the phar file I get empty output.
The code I have looks like this:
class Maintenance_Controller
{
    public function do_maintenance()
    {
        echo exec("/usr/bin/env php composer.phar", $out, $ret);
        var_dump($out); // outputs -> array()
        var_dump($ret); // outputs -> int(127)
    }
}
Exit code 127 indicates a bad path, but I am sure I'm in the right directory.
Also, this works when using a php_cli wrapper, so maybe it has to do with the www-data user? chmod 777 does not help, and I hate doing that anyway.
I have also tried passthru(), system(), and the backtick syntax. I am unable to find the reason this doesn't work: I can't seem to interrogate stderr or stdout from the exec() call beyond the 127 exit code.
Obvious Question:
What am I doing wrong?
Better Question:
Is there a better way to interrogate and execute .phar files from within a script?
UPDATE:
From this question:
Value 127 is returned by /bin/sh when the given command is not found within your PATH system variable and it is not a built-in shell command.
Try using exec('php composer.phar', $out, $ret); and see if that works. You might also need to use the full path to php if it's in a non-standard location, which you can probably get from which php.
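As a rough sketch of that suggestion (assuming a Unix-like host where which is available; the paths are illustrative):

// Resolve the absolute path of the CLI php binary, then invoke composer.phar with it.
$php = trim((string) shell_exec('which php')); // e.g. "/usr/local/bin/php"
exec(escapeshellarg($php) . ' composer.phar update', $out, $ret);
var_dump($ret); // expect int(0) on success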
I'm not sure why you are using passthru() here; I would use exec() instead, for better handling:
class Maintenance_Controller
{
    public function do_maintenance()
    {
        exec("composer.phar update", $out, $ret);
        if (!$ret) {
            // handle success
        } else {
            // handle error
        }
    }
}
This way you have all the output, line by line, in $out, as well as the shell return value (0 if OK, > 0 if not). If you want to get really fancy, you can loop over $out, scan for errors, and then build an exception to throw.
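A minimal sketch of that idea (the 2>&1 redirection is my addition, so that error lines land in $out at all):

exec("composer.phar update 2>&1", $out, $ret);
if ($ret !== 0) {
    // Non-zero exit status: bundle the captured output into the exception message.
    throw new RuntimeException("composer.phar update failed:\n" . implode("\n", $out));
}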

PHP exec() re-executing same script again and again

Here I am describing my flow in detail.
I am doing an AJAX request on the same host:
1) AJAX call to saveUser.php.
2) In saveUser.php, I have included Common.php.
3) Common.php has the createFile1() and createFile2() functions.
4) createFile1() just creates a sample1.php for another purpose.
5) In createFile2(), I create a file sample2.php and run exec('php /home/public/sample2.php > /dev/null &'); to execute that file.
I am calling createFile1() and createFile2() respectively from saveUser.php, which is invoked by the AJAX request.
As I have set the exec command to run in the background with '&' at the end of the command, it returns without waiting for a response, and everything goes smoothly up front.
But when I check my server, the files sample1.php and sample2.php are being created again and again. It seems the whole saveUser.php action is executed over and over until I stop the sample2.php process from SSH.
I checked the process list: each time, 'php /home/public/sample2.php' has a new process id, which confirms it is being executed again and again. If I remove the exec() call and run sample2.php from SSH, it works as expected; there is no such problem.
Can you suggest what is going wrong? Is there a problem with the server configuration? I am using a HostGator shared account.
I am also including the same Common.php file in sample2.php, in case that helps.
Thanks for your help in advance.
saveUser.php code
include_once dirname(__FILE__).'/Common.php';
createFile1();
createFile2();
echo 'saved successfully!';
Common.php code
function createFile1()
{
    $template = file_get_contents('sendDmTemplate.php');
    $serviceFp = fopen('sendDmFiles/sample1.php', "w");
    fwrite($serviceFp, $template);
    fclose($serviceFp);
}
function createFile2()
{
    $fileContent = file_get_contents(dirname(__FILE__).'/sampleFileTemplate.php');
    $serviceFp = fopen('sample2.php', "w");
    fwrite($serviceFp, $fileContent);
    fclose($serviceFp);
    exec('php '.dirname(__FILE__).'/sample2.php > /dev/null &');
}
sample2.php code
include_once dirname(__FILE__).'/Common.php';
echo 'hi';
This exact same thing happened to me. I was trying to have one PHP script call exec() on another PHP script, but for some strange reason it would just exec() itself again, creating an infinite loop. The currently accepted answer didn't get me anywhere, but I was able to get it to work by specifying an absolute path to php.
So, instead of
exec('php /home/public/sample2.php > /dev/null &');
it would be
exec('/usr/local/bin/php /home/public/sample2.php > /dev/null &');
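If you don't know the absolute path up front, one option (my assumption, not part of the answer above) is the PHP_BINARY constant, available since PHP 5.4; note that under a web SAPI it points at the serving binary (e.g. php-fpm) rather than the CLI one:

// PHP_BINARY holds the path of the interpreter running the current script (PHP >= 5.4).
// Under mod_php/FPM it may not be a usable CLI binary, so fall back to 'which php'.
$php = (PHP_SAPI === 'cli' && defined('PHP_BINARY'))
    ? PHP_BINARY
    : trim((string) shell_exec('which php'));
exec(escapeshellarg($php) . ' /home/public/sample2.php > /dev/null &');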
Check the following:
1) Are you sure you are NOT calling your script saveUser.php multiple times? Maybe a coding error somewhere in the JavaScript XHR? Check this by looking in the (Apache?) log.
2) Are you sure your PHP executes all right without the -q? I use php -q pathtoscript.php.
3) If not 1 or 2: post the code of saveUser.php here (or somewhere).
EDIT: I see your problem. The file you create includes Common.php again, and executes that again. Etc. <-- wrong. Oops, I wrote that too early; looking into it again now.
4) Is it possible you use some error handling that redirects to your saveUser.php?
Edit:
5) A problem might arise from the fact that you are including the file that executes the command itself, in combination with include_once, but I am not sure. You could simply test this by adjusting sampleFileTemplate.php (which provides the content of sample2.php): create a Common2.php with identical content to Common.php, and use that one instead. Could you test-drive that setup and report back?

PHP execute external script without waiting, while passing variables

I'm making a feature for my users where they can upload large XML files to synchronize with my database.
When a user uploads a file to upload.php, I want to start processing the data in the background with process.php, preferably from a shell command, and redirect the user to status.php, which shows the progress of the synchronization.
I need to pass some variables to the process.php script when executing it: either just the user id, with the other variables put into a text file (I would probably prefer this, so I won't have to cram too much data into the exec() command), or the user id plus a bunch of $_POST variables.
One solution I had in mind is executing the PHP script like this:
exec("php -f ./process.php > /dev/null 2>/dev/null &");
This allows me to lock process.php away from HTTP access, which is good since it's a processing-heavy script. The only thing I need here is to pass a variable somehow, but I don't know how to do it.
So my main question is:
How do I pass a variable in the above solution?
Or do any of you have a better solution? Possibly one where I won't have to go through exec()? Keep in mind that I do not want the user to wait for the script to execute, and I need to pass at least one variable.
Update: For future reference, remember to use escapeshellarg() when passing arguments through exec() or similar functions.
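A minimal sketch of that update, assuming a single $userId argument (the variable name is mine, for illustration):

// escapeshellarg() wraps the value in quotes and escapes embedded quotes,
// so user-supplied data cannot inject extra shell commands.
$userId = '42; rm -rf /'; // hostile example input, passed through safely
exec('php -f ./process.php ' . escapeshellarg($userId) . ' > /dev/null 2>/dev/null &');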
You can try this:
exec("php -f ./process.php var1 var2 > /dev/null 2>/dev/null &");
If you want to read those values back, you can access them through the global $argv variable. Printing that variable shows:
print_r($argv);
Array
(
    [0] => process.php
    [1] => var1
    [2] => var2
)
You can pass parameters like the following.
// call process.php
exec("php -f ./process.php foo=bar bar=foo > /dev/null 2>/dev/null &");
// process.php
if ($argc > 0) {
    for ($i = 1; $i < $argc; $i++) {
        parse_str($argv[$i], $tmp);
        $_REQUEST = array_merge($_REQUEST, $tmp);
    }
}
var_dump($_REQUEST);
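For the call above, the var_dump() in process.php should print something like the following (a sketch of the expected output, assuming $_REQUEST starts out empty on the CLI):

array(2) {
  ["foo"]=>
  string(3) "bar"
  ["bar"]=>
  string(3) "foo"
}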
I don't really understand your goal, but passing an argument to a PHP script works much like it does for any other shell script. See: http://www.php.net/manual/en/features.commandline.usage.php (Example #2)
"When a user uploads a file […], I want to start processing the data in the background": you can't access an upload before it has finished, in PHP using CGI.
Here is my solution.
The advantage is that you can use this script for command-line usage as well as for regular web usage. If you call it from the command line, the $argv variable is set, so the parse_str() part extracts the variables and puts them into the $_GET array. If you call it from the web, $argv is not set, so the values come from the URL.
// executingScript.php
// You have to do the percent-escaping yourself
exec("php -f ./executedScript.php foo=bar%20foo bar=foo%20bar > /dev/null 2>/dev/null &");
// executedScript.php
// The if-statement avoids crashing when calling the script from the web
if (isset($argv)) {
    parse_str(implode('&', array_slice($argv, 1)), $_GET);
}
This will let you access the variables as usual:
echo $_GET["foo"]; // outputs "bar foo"
echo $_GET["bar"]; // outputs "foo bar"

Run and forget system call in php

So I am trying to execute a script from my PHP code, which lives in the page blah.php:
<?php
// ....
// just basic web site that allows upload of file...
?>
Inside, I use a system call:
if (system("blah.pl arg1") != 0)
{
    print "error\n";
}
else
{
    print "working on it..you will receive e-mail at completion\n";
}
This works great, but it waits for the script to complete before printing "working on it".
(I am aware that I am calling a Perl script from PHP; that's not a typo.)
How can I just start the program and let it complete in the background?
The blah.pl script handles the e-mail notification.
Anyone?
Thanks, I appreciate it.
From the system() function documentation:
Note: If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
Simply redirect the output of your Perl script and PHP should be able to run without having to wait for the script to finish.
system("blah.pl arg1 > /dev/null 2>&1 &");
Can you add an ampersand to spawn the process in the background?
if (system("blah.pl arg1 &") != 0)
You could probably use the popen function
http://www.php.net/manual/en/function.popen.php
Since all the PHP command-line functions wait for a result, I recommend having a separate page call the mailer script.
In other words, send a request to the email.php page (or whatever) using cURL or the file wrappers, which let you close the connection before receiving a result. The process will look something like this:
page_running.php
      |
cURL call to (email.php?arg1=0)
      |                 |
final output         email.php
                        |
                 calls system("blah.pl arg1")
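A rough sketch of that "close the connection before receiving a result" idea using a raw socket (host, port, and path here are placeholders):

// Fire-and-forget HTTP GET: send the request and close without reading the reply.
function fireAndForget($host, $path)
{
    $fp = fsockopen($host, 80, $errno, $errstr, 5); // 5-second connect timeout
    if (!$fp) {
        return false;
    }
    fwrite($fp, "GET $path HTTP/1.1\r\nHost: $host\r\nConnection: Close\r\n\r\n");
    fclose($fp); // do not wait for email.php to finish
    return true;
}
fireAndForget('localhost', '/email.php?arg1=0');

Note that email.php may need to call ignore_user_abort(true) so it keeps running after the connection drops.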

PHP exec() return value for background process (linux)

Using PHP on Linux, I'd like to determine whether a shell command run using exec() was successfully executed. I'm using the return_var parameter to check for a successful return value of 0. This works fine until I need to do the same thing for a process that has to run in the background. For example, in the following command $result returns 0:
exec('badcommand > /dev/null 2>&1 &', $output, $result);
I have put the redirect in there on purpose, I do not want to capture any output. I just want to know that the command was executed successfully. Is that possible to do?
Thanks, Brian
My guess is that what you are trying to do is not directly possible. By backgrounding the process, you are letting your PHP script continue (and potentially exit) before a result exists.
A workaround is to have a second PHP (or Bash/etc.) script that just does the command execution and writes the result to a temp file.
The main script would be something like:
$resultFile = '/tmp/result001';
touch($resultFile);
exec('php command_runner.php '.escapeshellarg($resultFile).' > /dev/null 2>&1 &');

// do other stuff...

// Sometime later, when you want to check the result...
while (!strlen(file_get_contents($resultFile))) {
    sleep(5);
}
$result = intval(file_get_contents($resultFile));
unlink($resultFile);
And command_runner.php would look like:
$outputFile = $argv[1]; // $argv[0] is the script's own name
exec('badcommand > /dev/null 2>&1', $output, $result);
file_put_contents($outputFile, $result);
It's not pretty, and there is certainly room for adding robustness and handling concurrent executions, but the general idea should work.
Not with the exec() method. When you send a process to the background, it returns 0 to the exec call and PHP continues execution; there's no way to retrieve the final result.
pcntl_fork(), however, will fork your application, so you can run exec() in the child process and leave it waiting until it finishes, then exit() with the status the exec call returned.
In the parent process you can access that return code with pcntl_waitpid().
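A minimal sketch of that fork-and-wait approach (assumes the pcntl extension and the CLI SAPI; 'badcommand' is the placeholder from above):

$pid = pcntl_fork();
if ($pid === -1) {
    die("fork failed\n");
} elseif ($pid === 0) {
    // Child: run the command synchronously and exit with its status code.
    exec('badcommand > /dev/null 2>&1', $output, $result);
    exit($result);
}
// Parent: continues immediately; collect the child's exit code later.
pcntl_waitpid($pid, $status);
$exitCode = pcntl_wexitstatus($status); // 0 on success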
Just my 2 cents: how about using the bash && and || operators?
exec('ls && touch /tmp/res_ok || touch /tmp/res_bad');
And then check for file existence.
