I am using phpseclib to SSH to my server and run a Python script. The Python script is an infinite loop, so it runs until you stop it. When I execute python script.py via SSH with phpseclib, it works, but the page just loads forever. It does this because phpseclib does not think it is "done" running the line of code that runs the infinite-loop script, so it hangs on that line. I have tried using exit and die after that line, but of course that didn't work, because it hangs on the line before, the one that executes the command. Does anyone have any ideas on how I can fix this without modifying the Python file? Thanks.
Assuming the command will be run by a shell, you could have it execute this to start it:
nohup python myscript.py > /dev/null 2>&1 &
If you put an & on the end of any shell command, it will run in the background and return immediately; that's all you really need.
Something else you could try:
$ssh->setTimeout(1);
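Putting those together, a minimal sketch using phpseclib 3 (host, credentials, and script name are placeholders for your own):
<?php
// Assumes phpseclib 3 installed via Composer.
require 'vendor/autoload.php';

use phpseclib3\Net\SSH2;

$ssh = new SSH2('example.com');
if (!$ssh->login('user', 'password')) {
    exit('Login failed');
}

// Backgrounding the command with nohup/& lets exec() return immediately.
$ssh->exec('nohup python myscript.py > /dev/null 2>&1 &');

// Or, if the command must stay in the foreground, cap how long exec() may block:
// $ssh->setTimeout(1);
// $ssh->exec('python myscript.py');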
Related
When executing multiple scripts within PHP using the exec command, is each script run one at a time, one after the other, or are they run simultaneously?
exec('/usr/bin/php -q process-duplicates.php');
exec('/usr/bin/php -q process-images.php');
exec('/usr/bin/php -q process-sitemaps.php');
Just want to make sure they run one after the other before attempting to rewrite my crontabs.
Sure. The only way to run in the background is to add & to the command line, which puts the exec()'d process into the background:
exec("php test.php &");
So you are right, they run one after the other.
NOTE: In your case you shouldn't use &, as it will force all the scripts to run simultaneously.
exec waits for the script to return; see php.net:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
But as a devops, please please please do not run your cron jobs like this! Create entries in the crontab for each, or put them in a shell script and have cron run the script.
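For example, separate crontab entries might look like this (the paths and schedules are assumptions; adjust to your setup):
# run each script on its own staggered schedule
0 * * * * /usr/bin/php -q /path/to/process-duplicates.php
15 * * * * /usr/bin/php -q /path/to/process-images.php
30 * * * * /usr/bin/php -q /path/to/process-sitemaps.php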
The command works fine when I execute it from the terminal, but it does not work when I try to execute it from PHP.
I am executing the command from PHP using exec and shell_exec, but it is not working.
Please help me fix this.
My code is as below:
shell_exec("emulator -avd avd8");
and
exec("emulator -avd avd8");
When I execute this from the terminal, it launches the emulator, but when I execute the same from PHP, nothing happens.
You would require asynchronous task execution in PHP.
Right now, when you call exec("emulator -avd avd8"), the command is executed and PHP waits for the output.
Try adding '&' at the end, e.g. exec("emulator -avd avd8 &"), to make it a background process.
Refer to this link for more information:
Asynchronous shell exec in PHP
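If it still appears to hang, redirect the output too; as the PHP manual passage quoted above says, a backgrounded program's output must be redirected or PHP will wait for it to finish. A minimal sketch:
<?php
// Detach the emulator and discard its output so exec() returns immediately.
exec('emulator -avd avd8 > /dev/null 2>&1 &');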
I've written this simple shell script:
#!/bin/sh
STORAGE_PATH=/tmp/;
export STORAGE_PATH;
cd STORAGE_PATH;
perl /{SOME_PATH}/perl-script.pl;
When I call it from the shell, it works perfectly as expected. But from PHP it hangs indefinitely; when debugging, I found that it hangs during the Perl execution, which doesn't really make any sense since it completes when called from the shell.
Did I make any mistake with the shebang #!/bin/sh? I tried with #!/bin/bash too.
I tried all the variations in PHP: exec, system, shell_exec, passthru, but nothing is working.
Did I miss something?
Well, I can't see how it worked in the shell.
You are calling /{SOME_PATH}/perl-script.pl, but the variable SOME_PATH is not defined. Did you mean STORAGE_PATH? Also, you forgot the $ before the {.
While you're at it, check that the user running the webserver has permissions to execute both scripts.
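For reference, a corrected sketch of the script, assuming STORAGE_PATH was the intended variable:
#!/bin/sh
STORAGE_PATH=/tmp/
export STORAGE_PATH
cd "$STORAGE_PATH"                     # note the $, which the original cd also lacked
perl "${STORAGE_PATH}perl-script.pl"   # $ before {; the trailing / is already in the variable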
I have this set in my PHP script to supposedly make it run as long as it needs to while it parses, runs MySQL queries, and fetches images for over 100,000 rows:
ignore_user_abort(true);
set_time_limit(0);
#begin logging output
error_reporting(E_ALL);
ini_set('memory_limit', '512M');
I run the command like this in the shell:
nohup php myscript.php > output.txt
After running for about 8 to 10 hours, this script will still be running, but execution just stops... no more output. It's not a zombie process; I checked top. It hasn't hit the memory limit either, and if it had, wouldn't it exit?
What is going on? It's a real pain to babysit this script and write custom code to nudge it along. I read up on Unix and cleaning up zombies, but it's not a zombie. I know it's not PHP settings, and it's not running through a webserver; it's run from the command line only. So what gives?
It looks like you haven't detached your process correctly. Currently, if your process's parent dies, your process will die too. If you place your process in the background (creating a real daemon), you won't run into such trouble.
You can execute your PHP script this way to really detach it:
php myscript.php > output.txt 2>&1 &
For your information:
> output.txt
will redirect standard output (i.e. your echo, print, etc.) to the output.txt file
2>&1
will redirect error output to standard output, writing it to the same output.txt file
&
is the most important thing in your case: it will detach your process, creating a real daemon.
Edit: if you're having trouble when disconnecting your shell, the simplest fix is to put your command in a bash script, for example run.sh:
#!/bin/bash
php myscript.php > output.txt 2>&1 &
And you'll run your script this way :
bash run.sh &
In this case, your shell will "think" your program has ended at the end of the shell script, not at the end of the PHP daemon.
Long-running PHP scripts shouldn't die or hang without reason. I've had scripts that run continuously for 6 months +. There must be something else going on inside of your script body.
I know I should use a comment to answer this, but I don't have enough reputation to do it...
Maybe your process is consuming 100% of the CPU. I had a similar issue with a while loop that never called sleep() or usleep() at the end of the loop; see the sketch below.
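For illustration, a loop that yields the CPU between iterations (do_work() is a hypothetical stand-in for one unit of your processing):
<?php
while (true) {
    do_work();      // hypothetical: process one row, file, etc.
    usleep(100000); // sleep 100 ms so the loop doesn't pin a core at 100%
}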
I am trying to manage a queue of files waiting to be processed by ffmpeg. A page run via cron works through a database of files waiting to be processed, builds the commands, and sends them to the command line using exec().
However, when the PHP page is run from the command line or cron, it runs the exec() OK but does not return to the PHP page to continue updating the database and other functions.
Example:
<?php
$cmd = "ffmpeg inpupt.mpg output.m4v";
exec($cmd . ' 2>&1', $output, $return);
//Page continues...but not executed
$update = mysql_query("UPDATE.....");
?>
When this page is run from the command line, the command is run using exec(), but then the rest of the page is not executed. I think the problem may be that I am running a command using exec() from a page that is itself run from the command line.
Is it possible to run a PHP page in full from the command line which includes exec()?
Or is there a better way of doing this?
Thank you.
I wrote an article about Running a Background Process from PHP on Linux some time ago:
<?php system( 'sh test.sh >/dev/null &' ); ?>
Notice the & operator at the end. This starts a process that returns control to the shell immediately AND CONTINUES TO RUN in the background.
More examples:
<!--
saving standard output to a file
very important when your process runs in the background
as this is the only way the process can report errors/success
-->
<?php system( 'sh test.sh >test-out.txt &' ); ?>
<!--
saving standard output and standard error to files
same as above; most programs log errors to standard error, hence it's better to capture both
-->
<?php system( 'sh test.sh >test-out.txt 2>test-err.txt &' ); ?>
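In the same spirit, you can also combine both streams into a single file, which is often easier to tail:
<?php system( 'sh test.sh >test-out.txt 2>&1 &' ); ?>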
Have you tried using CURL instead?
Unsure, but that's probably due to the shell constraints of cron processes. If it works as a web page, then use it as a web page: set up a cron job that calls wget wherever_your_page_is. It will be called via your web server and should mimic your tests.
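A sketch of such a cron entry (the schedule is an assumption; keep your own URL in place of the placeholder):
# hit the page every 5 minutes via the web server, discarding the response body
*/5 * * * * wget -q -O /dev/null http://wherever_your_page_is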