Email when shell_exec is finished [duplicate] - php

I need to run multiple scripts (5 scripts) via the command line, and I want to make sure the second one does not start until the first has finished. So the second should begin only after the first completes, then the third, and so on.
Currently I am using the following code to do this
exec ("php phpscript1.php ");
exec ("php phpscript2.php ");
exec ("php phpscript3.php ");
exec ("php phpscript4.php ");
exec ("php phpscript5.php ");
I think these scripts run asynchronously. Any suggestions on how to make them run synchronously?

PHP's exec will wait until the called program has finished before processing the next line, unless you append & to the command string to run the program in the background.

If I'm getting you right, you're executing PHP scripts from inside a PHP script.
Normally, PHP waits for exec("php phpscript1.php"); to finish before processing the next line.
To avoid this, just redirect the output to /dev/null or a file and run it in background.
For example: exec ("php phpscript1.php >/dev/null 2>&1 &");.

Check out the exec function syntax on php.net.
You will see that exec does not run anything asynchronously by default.
exec has two other parameters. The third one, return_var, receives the exit status of the command, so it tells you whether the script ran successfully or failed. You can use that variable to decide whether to run the succeeding scripts.
Test it and let us know if it works for you.
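For example, a minimal sketch using the script names from the question (the stop-on-failure behaviour is an assumption):
<?php
$scripts = ['phpscript1.php', 'phpscript2.php', 'phpscript3.php', 'phpscript4.php', 'phpscript5.php'];
foreach ($scripts as $script) {
    // exec() blocks until the command exits; $return_var receives its exit code.
    exec("php $script", $output, $return_var);
    if ($return_var !== 0) {
        // Assumption: abort the chain when one script fails (non-zero exit code).
        echo "$script exited with code $return_var, stopping.\n";
        break;
    }
}
?>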

In my opinion, it would be better to run cron jobs. They will execute synchronously. If the task is "on the fly", you could execute a command to add the cron job. More information about cron jobs:
http://unixgeeks.org/security/newbie/unix/cron-1.html
http://service.futurequest.net/index.php?_m=knowledgebase&_a=viewarticle&kbarticleid=30
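For the on-the-fly case, a hypothetical sketch of appending an entry to the current user's crontab from PHP (the schedule and script path are made-up examples):
<?php
// Hypothetical entry: run phpscript1.php every 5 minutes.
$entry = '*/5 * * * * /usr/bin/php /path/to/phpscript1.php';
// Keep the existing crontab lines and append the new one.
exec('(crontab -l 2>/dev/null; echo ' . escapeshellarg($entry) . ') | crontab -');
?>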

Both exec and system wait for the script to finish, unless you fork it into the background.
Check.php
<?php
echo "here ".__LINE__."\n";
exec("php phpscript1.php");   // blocks until phpscript1.php finishes
echo "here ".__LINE__."\n";
system("php phpscript2.php"); // blocks until phpscript2.php finishes
echo "here ".__LINE__."\n";
?>
phpscript1.php
<?php
echo "=================phpscript1.php\n";
sleep(5);
?>
phpscript2.php
<?php
echo "=================phpscript2.php\n";
sleep(5);
?>
Check.php runs phpscript1.php (which takes 5 seconds), then prints the next line number, then runs phpscript2.php the same way, printing the last line number only after it has finished.

Related

php - exec() run in background [duplicate]

I have a process intensive task that I would like to run in the background.
The user clicks on a page, the PHP script runs, and finally, based on some conditions, it may have to run a shell script, e.g.:
shell_exec('php measurePerformance.php 47 844 email#yahoo.com');
Currently I use shell_exec, but this requires the script to wait for an output. Is there any way to execute the command I want without waiting for it to complete?
How about adding
"> /dev/null 2>/dev/null &"
shell_exec('php measurePerformance.php 47 844 email#yahoo.com > /dev/null 2>/dev/null &');
Note this also discards stdout and stderr.
This will execute a command and disconnect from the running process. Of course, it can be any command you want. But for a test, you can create a PHP file with a sleep(20) call in it.
exec("nohup /usr/bin/php -f sleep.php > /dev/null 2>&1 &");
You can also give your output back to the client instantly and continue processing your PHP code afterwards.
This is the method I am using for long-running Ajax calls whose result the client does not need to wait for:
ob_end_clean();      // drop anything already buffered
ignore_user_abort(); // keep running even if the client disconnects
ob_start();
header("Connection: close");
echo json_encode($out);
header("Content-Length: " . ob_get_length());
ob_end_flush();
flush();             // response is sent; the client stops waiting here
// execute your command here. client will not wait for response, it already has one above.
You can find the detailed explanation here: http://oytun.co/response-now-process-later
On Windows 2003, to call another script without waiting, I used this:
$commandString = "start /b c:\\php\\php.EXE C:\\Inetpub\\wwwroot\\mysite.com\\phpforktest.php --passmsg=$testmsg";
pclose(popen($commandString, 'r'));
This only works AFTER changing permissions on cmd.exe: add Read and Execute for IUSR_YOURMACHINE (I also set Write to Deny).
Use PHP's popen command, e.g.:
pclose(popen("start c:\wamp\bin\php.exe c:\wamp\www\script.php","r"));
This will create a child process and the script will execute in the background without waiting for output.
Sure, for windows you can use:
$WshShell = new COM("WScript.Shell");
$oExec = $WshShell->Run("C:/path/to/php-win.exe -f C:/path/to/script.php", 0, false);
Note:
If you get a COM error, add the extension to your php.ini and restart apache:
[COM_DOT_NET]
extension=php_com_dotnet.dll
If it's off of a web page, I recommend generating a signal of some kind (dropping a file in a directory, perhaps) and having a cron job pick up the work that needs to be done. Otherwise, we're likely to get into the territory of using pcntl_fork() and exec() from inside an Apache process, and that's just bad mojo.
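A minimal sketch of that file-drop approach (the directory, file names, and job format are made up for illustration):
<?php
// Web request side: just record that work is needed and return immediately.
$job = ['task' => 'measurePerformance', 'args' => [47, 844]];
file_put_contents('/var/spool/myapp/jobs/' . uniqid('job_', true) . '.job', json_encode($job));
?>
A worker script run by cron (say, every minute) then picks the files up:
<?php
// Worker side: process and remove any pending job files.
foreach (glob('/var/spool/myapp/jobs/*.job') as $jobFile) {
    $job = json_decode(file_get_contents($jobFile), true);
    // ... do the heavy work described in $job here ...
    unlink($jobFile); // remove the flag so the job is not picked up twice
}
?>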
Adding & to run the command in the background will work, but you have to be careful not to overload your server, because every call creates a new background process. If you only ever have one concurrent call at a time, this workaround will do the job.
If not, I would advise running a message queue, for instance beanstalkd/gearman/Amazon SQS.

Are multiple PHP scripts executed one at a time or simultaneously?

When executing multiple scripts within PHP using the exec command, are the scripts run one at a time, one after the other, or are they run simultaneously?
exec('/usr/bin/php -q process-duplicates.php');
exec('/usr/bin/php -q process-images.php');
exec('/usr/bin/php -q process-sitemaps.php');
Just want to make sure they are one after the other before attempting to rewrite my crontabs.
Sure. The only way to run in the background is to add & to the command line, which would put that exec()'d process into the background:
exec("php test.php &");
So you are right, they run one after the other.
NOTE: In your case you shouldn't use &, as it would cause all the scripts to run simultaneously.
exec waits for the script to return; see php.net:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
But as a devops, please please please do not run your cron jobs like this! Create entries in the crontab for each, or put them in a shell script and have cron run the script.

PHP "Cancel" line of code

I am using phpseclib to SSH to my server and run a Python script. The Python script is an infinite loop, so it runs until you stop it. When I execute python script.py via SSH with phpseclib, it works, but the page just loads forever. It does this because phpseclib does not think it is "done" running the line of code that starts the infinite-loop script, so it hangs on that line. I have tried using exit and die after that line, but of course that didn't work, because it hangs on the line before, the one that executes the command. Does anyone have any ideas on how I can fix this without modifying the Python file? Thanks.
Assuming the command will be run by a shell, you could have it execute this to start it:
nohup python myscript.py > /dev/null 2>&1 &
If you put an & on the end of any shell command it will run in the background and return immediately, that's all you really need.
Something else you could have also done:
$ssh->setTimeout(1);
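Putting both suggestions together, a minimal phpseclib sketch (assuming phpseclib 3.x installed via Composer; host, credentials and script path are placeholders):
<?php
require 'vendor/autoload.php';

use phpseclib3\Net\SSH2;

$ssh = new SSH2('example.com');            // placeholder host
if (!$ssh->login('user', 'password')) {    // placeholder credentials
    exit('Login failed');
}

// Don't block forever waiting for output from the long-running command.
$ssh->setTimeout(1);

// nohup + & detaches the script on the server, so exec() can return.
$ssh->exec('nohup python myscript.py > /dev/null 2>&1 &');
?>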

child processes separate from main process bash shell php

I am calling a shell script from my PHP code with:
foreach ($some_array as $code) {
    shell_exec("nohup $code");
}
I want all the shell_exec calls to run independently of the main process, that is, the PHP execution that calls the shell script.
But it's not working as I expected: each shell_exec starts only after the previous one has completed.
So how can I make these shell_exec calls independent child processes that don't wait for each other to complete?
Thanks in advance
Add the '&' to the end of the command you want to execute so it runs in the background.
For a sequence of commands, enclose them within parentheses and then append the & symbol, but be sure to redirect stdout and stderr somewhere, otherwise your script will hang waiting, e.g.:
<?php
exec('( sleep 10; echo "finished" | mail ian#example.com ) &> /dev/null &');
?>
See http://us.php.net/manual/en/function.exec.php
Send them to the background
shell_exec("nohup somecommand &");
^---run job in background
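Applied to the loop from the question, a minimal sketch (assuming each element of $some_array is a complete shell command):
<?php
foreach ($some_array as $code) {
    // Redirect output and append & so each command is detached immediately
    // and the loop moves on without waiting for it to finish.
    shell_exec("nohup $code > /dev/null 2>&1 &");
}
?>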

