I wrote a PHP script that refreshes automatically to carry out a long process.
To do that, I use:
header('refresh: '.$refresh_time);
The time between refreshes changes depending on what happened.
It works well in a regular browser, but now I need to execute this script from the shell.
To do that, I tried:
php my_php_script.php
But this doesn't work, as it executes the script only once (yep, no refresh here).
So, is it possible to honor the headers sent by PHP when calling the script from a command shell?
If yes, please enlighten my knowledge.
If no, BE DOOMED!!!! Damn, I'll have to change my code!
Thanks for the help!
You can achieve that by using a shell command in the PHP file. The following code will do the exact same thing from the shell:
$refresh_time = rand(2,10);  // pick the delay, as the web page would
sleep($refresh_time);        // wait here instead of sending a refresh header
$command = "php my_php_script.php > /dev/null &";  // relaunch the script in the background
shell_exec($command);
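If you'd rather keep the loop outside PHP, the same pattern can be sketched in plain shell. The echo below is a placeholder for `php my_php_script.php`, and the loop is capped at three runs just for illustration (a real runner would loop forever and use a longer pause):

```shell
# rerun a command with a randomized pause between runs, mirroring the
# rand() + sleep() idea above; the echo stands in for `php my_php_script.php`
for i in 1 2 3; do
    echo "run $i"
    sleep $(( (RANDOM % 2) + 1 ))   # 1-2s here for the demo; the answer uses 2-10s
done
```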
On Linux systems you can use cron, and on Windows the Task Scheduler.
Basically, you tell them when to execute your script: every minute, day, week, etc.
Search online for more details.
The tricky part would be changing the time interval; I can't help you much with that, but I'll let you know if I find a solution.
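For instance, a crontab entry along these lines would run the script every minute (the paths are placeholders; adjust them for your setup):

```
* * * * * /usr/bin/php /path/to/my_php_script.php >> /var/log/my_php_script.log 2>&1
```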
I'm answering my own question, but I'd like to thank you all for the helpful tips.
Since HTTP headers can't be used when calling PHP through the shell, I wrote a script that grabs the refresh header that was sent and sleeps for the required time before calling the PHP script again.
#!/bin/bash

my_bash_function()
{
    # -v prints the response headers on stderr, so merge stderr into the capture
    response="$(curl -vs http://localhost/my_php_script.php 2>&1)"
    refresh_regex="refresh:[[:space:]]*([0-9]+)"
    numeric_regex="^[0-9]+$"
    if [[ "$response" =~ $refresh_regex ]]; then
        refresh="${BASH_REMATCH[1]}"
        if [[ "$refresh" =~ $numeric_regex ]]; then
            sleep "$refresh"
            my_bash_function
        fi
    fi
}

my_bash_function
That way I don't have to change the PHP script at all; I just drive it through curl, since neither curl nor the php command will act on the refresh header by themselves.
Am I wrong?
Edit: I was wrong! curl -I doesn't seem to execute the PHP code properly (it sends a HEAD request), it only fetches the headers. curl -vs seems to work, but to capture the header output I had to add 2>&1.
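For what it's worth, the refresh value can also be pulled out with a one-shot pipeline instead of the bash regex. This sketch works on a captured header string; the sample input below is fabricated so the example is self-contained:

```shell
# extract the refresh delay from raw HTTP headers; in the function above
# this string would be the (stderr-merged) output of curl -vs
headers="$(printf 'HTTP/1.1 200 OK\r\nrefresh: 7\r\nContent-Type: text/html')"
delay=$(printf '%s\n' "$headers" | tr -d '\r' | awk -F': *' 'tolower($1) == "refresh" { print $2 }')
echo "$delay"    # prints: 7
```

It does the same job as BASH_REMATCH, just as a flatter alternative.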
Credits: Jigar, Pavan Jiwnani
Related
I have seen this question on here before, so I'm sorry for the repetition, but I still haven't found an answer to my problem.
I have a bash script that takes a while to run. It needs to be passed variables set by a user on a webpage (don't worry, there will be plenty of validation for security, etc.).
I can get the bash file to start, but it dies after maybe 20 seconds when run from the webpage.
When run from the terminal, it runs absolutely fine.
Ok so I have the following:
$bashFile = shell_exec('./CoinCreationBashFile.sh "'.$coinName.'" "'.$coinNameAbreviation.'" "'.$blockReward.'" "'.$blockSpacing.'" "'.$targetTimespan.'" "'.$totalCoins.'" "'.$firstBitAddy.'" "'.$seedNode.'" "'.$seedName.'" "'.$headline.'" ');
Now this executes the bash file, but I read up about shell_exec in PHP with nohup and came up with the following:
$bashFile = shell_exec('nohup ./CoinCreationBashFile.sh "'.$coinName.'" "'.$coinNameAbreviation.'" "'.$blockReward.'" "'.$blockSpacing.'" "'.$targetTimespan.'" "'.$totalCoins.'" "'.$firstBitAddy.'" "'.$seedNode.'" "'.$seedName.'" "'.$headline.'" >/dev/null 2>&1 &');
But this still died after a short time :(
So I read up about set_time_limit and max_execution_time and set these to something like 10000000 in the php.ini file... yet still no joy :(
I just want to run a bash script without it timing out and exiting. I don't really want to have to put an intermediate step in there, but someone suggested I look at ZeroMQ to "detach the worker from the process", so I may have to go that route.
many thanks in advance
Don't try running a script via the browser if it takes more than 60 seconds; instead, run it over SSH or as a cron job.
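If it has to be started over SSH, detaching the job keeps it alive after the session ends. A minimal sketch, where sh -c 'sleep 1; echo done' stands in for the real long-running script (over SSH you would wrap the nohup line in ssh user@host '...'):

```shell
# detach a long job so it survives the terminal/SSH session;
# stdin/stdout/stderr are all redirected so nothing ties it to the tty
nohup sh -c 'sleep 1; echo done' < /dev/null > /tmp/job.log 2>&1 &
wait $!             # only for this demo, so we can inspect the log right away
cat /tmp/job.log    # prints: done
```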
I noticed exec and shell_exec were causing perpetual loading.
Basically, I'm trying to do something as simple as running a PHP script in the background. When I try to do that, it just loads and loads.
My code is as follows
exec('php test.php -- '.escapeshellarg($param1).' > /dev/null ');
I first thought it was my other script, so I pointed it to a file containing just:
echo $argv[1];
But it still loads perpetually.
Don't wait for the process to exit
exec() waits for the process to give an exit code. The link I provided above may help you.
Oh, and since you tagged Linux for whatever reason, I assume you're on a Linux distro.
You could consider this as well:
http://ca1.php.net/pcntl_fork
I want to run wget as follows
shell_exec('wget "'http://somedomain.com/somefile.mp4'"');
sleep(20);
continue my code...
What I want is for PHP to wait for the shell_exec wget download to finish before continuing with the rest of the code. I don't want to wait a set number of seconds.
How do I do this? Because as it is, when I run shell_exec wget, the file starts downloading in the background and PHP continues.
Does your URL contain the & character? If so, your wget might be going into the background and shell_exec() might be returning right away.
For example, if $url is "http://www.example.com/?foo=1&bar=2", you would need to make sure that it is single-quoted when passed on a command line:
shell_exec("wget '$url'");
Otherwise the shell would misinterpret the &.
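You can see the misparse directly in a shell: with the & unquoted, everything before it is run as one backgrounded command, and bar=2 becomes a separate, silent variable assignment:

```shell
# unquoted: the shell backgrounds the command at the & and the URL is cut short
echo http://www.example.com/?foo=1&bar=2
wait    # let the backgrounded echo finish; it printed only ...?foo=1
```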
Escaping command line parameters is a good idea in general. The most comprehensive way to do this is with escapeshellarg():
shell_exec("wget ".escapeshellarg($url));
shell_exec does wait for the command to finish, so you don't need the sleep command at all:
<?php
shell_exec("sleep 10");
?>
# time php c.php
10.14s real 0.05s user 0.07s system
I think your problem is likely the quotes on this line:
shell_exec('wget "'http://somedomain.com/somefile.mp4'"');
it should be
shell_exec("wget 'http://somedomain.com/somefile.mp4'");
I am trying to invoke a script that takes several seconds (web services with a 3rd party) using the PHP exec call. After much struggling, I reduced this to the classic hello world example. The calling script looks like:
exec('/usr/bin/php /home/quote2bi/tmp/helloworld.php > /tmp/execoutput.txt 2>&1 &');
When I run this, execoutput.txt contains a copy of the invoking script's page, not hello world as I expected.
Why can't I get this PHP script to execute using exec? Note that when I change the command to something like ls -l, the output is a directory listing, as expected. By the way, in case it matters, I did chmod the called script to 755...
Update: I moved the exec call to the end of the calling script, and at least now I don't see the calling script executed in the output. Thanks to the posters; I will try some of these ideas.
Help!
Thanks
Steve
I had this issue too, and it turns out this is a bug in PHP (#11430). The fix is to use php-cli when calling another PHP script from within a PHP script. So you can still use exec, but rather than php, use php-cli when calling it in the browser:
exec("php-cli somescript.php");
This worked for me.
What exec is doing is taking the rightmost command and appending it to your destination. If you have a shebang line in your PHP script, you shouldn't need to specify the php interpreter binary.
If you just want the script's output, try:
exec('/home/quote2bi/tmp/helloworld.php > /tmp/execoutput.txt 2>&1 &')
However, if you do not want the errors to end up in the file, you should redirect STDERR before redirecting the output to the file, like so:
exec('/home/quote2bi/tmp/helloworld.php 2> /dev/null > /tmp/execoutput.txt')
The above should send only the "Hello World" to execoutput.
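Since redirections are processed left to right, here is a quick way to check what ends up where (the two echoes stand in for the script's normal output and its errors):

```shell
# stderr is pointed at /dev/null first, then stdout at the file,
# so the file receives only the normal output
sh -c 'echo out; echo err >&2' 2> /dev/null > /tmp/execoutput.txt
cat /tmp/execoutput.txt    # prints: out
```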
Edit:
Interesting that you are getting this behaviour. You stated that the "ls" command worked. Try making an alias for the PHP call and redirecting it to a file, like so:
alias pexec='php /home/quote2bi/tmp/helloworld.php'
then
exec('pexec > /tmp/execoutput.txt 2>&1 &')
It seems to be a problem with the way exec handles the input, as opposed to the shell itself.
-John
The problem is with PHP itself: it treats everything as $argv in the script. It doesn't redirect the output to a file or to /dev/null.
I faced the same problem some time ago. What I did was create a runscript.php in /opt/php-bin and then, inside this script, run what should actually be run. Something like this:
$script = $argv[1];
$params = implode(' ', array_slice($argv, 2));
$cmd = "{$script} {$params} > /dev/null &";
$output = array();
$return = 0;
exec("php {$cmd}", $output, $return);
exit((int)$return);
And then you call it using:
exec('/opt/php-bin/runscript.php /path/to/your/script.php arg1 arg2')
It's the only way I managed to get this working.
To avoid the stated problems of PHP in this area, why not put this in inside a shell script? PHP can then execute the shell script which has all the redirections handled internally.
If you need to dynamically change things, then why not write the shell script and then execute it (and of course, clean up afterwards)?
If you are just running a PHP script, one possible way to execute the entire code is to use include(), which will run the PHP file and output any results. You cannot direct the output to a text file, but it should appear in the browser window. If your Hello World PHP script looks like
<?php echo "Hello World!"; ?>
then it will spit that out in the browser. So your second code would look like
<?php include("helloWorld.php"); echo " PHP ROCKS"; ?>
resulting in a page that looks like:
Hello world! PHP ROCKS
This runs as if you ran the script from the browser.
This came up while working on a project on a Linux platform.
exec('wget "http://<url to the php script>"');
Hope this helps!!
I would like to be able to start a second script (either PHP or Python) when a page is loaded, and have it continue to run after the user cancels or navigates away. Is this possible?
You can send a Connection: close header, which finishes the page for your user but lets you execute things "after the page loads".
There is a simple way to ignore a user abort (see the PHP manual too):
ignore_user_abort(true);
Use process forking with pcntl.
It only works under Unix operating systems, however.
You can also do something like this:
exec("/usr/bin/php ./child_script.php > /dev/null 2>&1 &");
You can read more about the above example here.
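The trailing & together with the redirections is what lets exec() return immediately. A shell-level sketch of the timing, with sleep standing in for child_script.php:

```shell
# without the &, the shell would block for the full 2 seconds
start=$(date +%s)
sh -c 'sleep 2' > /dev/null 2>&1 &
elapsed=$(( $(date +%s) - start ))
echo "$elapsed"    # ~0: the caller did not wait for the child
```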
for keeping the current script:
ignore_user_abort(true);
set_time_limit(0);
for running another script:
see $sock = fsockopen('localhost', 80, $errno, $errstr, 3600); (then write an HTTP request for the script's path) + stream_set_timeout($sock, 3600);
see exec('php path_to_script'); - this will cause your script to run from the CLI, so you'd have to install php-cli on that server.
Another approach, if you can't use the others, is to include an img tag at the bottom of your output that requests a PHP page doing whatever it is you want to do.
It will still show the loading animation though, so I think Karsten's suggestion is probably better (I'll try that next time I need to do this type of thing).