I'm calling a shell script via PHP's shell_exec() function. The script uses the wget command to download a file from another server onto the server hosting this PHP page, and it reports back to my page once the process completes. It works fine.
Here is my TestFile.php file:
<?php
$output = shell_exec("sudo /home/user/myScript");
echo "<pre>$output</pre>";
?>
And here is my bash script, myScript:
#!/bin/sh
sudo wget -O /path/to/download/downloadfile http://example.com/TestFile.zip
But I want to show the real-time progress of the wget operation, as a percentage, on the webpage from which I'm calling the shell script. I'm able to reduce wget's output to just the percentage using this command:
sudo wget -O /path/to/download/downloadfile http://example.com/TestFile.zip 2>&1 | grep -o "[0-9]\+%"
The output looks like this:
2%
4%
6%
8%
...
...
...
95%
97%
99%
100%
But this output only appears after the process has completed; while wget is running, the page shows nothing. That's not what I want.
I want the progress to update in real time while the wget process is running, not after the command has finished.
Could anyone help me solve this problem?
Thanks in advance.
Can't test it at the moment but something like this should work:
<?php
ob_start();
$descriptors = [
    0 => ['pipe', 'r'], // stdin
    1 => ['pipe', 'w'], // stdout
    2 => ['pipe', 'w'], // stderr -- wget writes its progress here
];
$process = proc_open('wget blahblah', $descriptors, $pipes, '/tmp', []);
if (is_resource($process)) {
    // read wget's progress from stderr and push each line to the browser
    while ($line = fgets($pipes[2])) {
        echo $line;
        ob_flush();
        flush();
    }
    fclose($pipes[0]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($process);
}
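Following up on the sketch above: since wget prints its progress to stderr, you can pull out just the percentage figures the question was grepping for in PHP itself. A rough, untested sketch (the URL and paths are placeholders):
<?php
// Rough sketch: stream wget's stderr and echo only the percentage figures.
$descriptors = [
    0 => ['pipe', 'r'],
    1 => ['pipe', 'w'],
    2 => ['pipe', 'w'], // wget writes progress here
];
$cmd = 'wget -O /tmp/downloadfile http://example.com/TestFile.zip';
$process = proc_open($cmd, $descriptors, $pipes, '/tmp', []);
if (is_resource($process)) {
    while (!feof($pipes[2])) {
        $chunk = fread($pipes[2], 1024);
        // grab every "NN%" in this chunk and print the most recent one
        if (preg_match_all('/\d+%/', $chunk, $matches)) {
            echo end($matches[0]) . "\n";
            flush();
        }
    }
    proc_close($process);
}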
I need to make a background script that is spawned by a PHP command-line script and that echoes to the SSH session. Essentially, I need the equivalent of this Linux command:
script/path 2>&1 &
If I just run this command in Linux, it works great. Output is still displayed on the screen, and I can still use the same session for other commands. However, when I do this in PHP, it doesn't work the same way.
I've tried:
`script/path 2>&1 &`;
exec("script/path 2>&1 &");
system("script/path 2>&1 &")
...And none of these work. I need it to spawn the process, and then kill itself so that I can free up the session, but I still want the output from the child process to print to the screen.
(please comment if this is unclear... I had a hard time putting this into words :P)
I came up with a solution that works in this case.
I created a wrapper bash script that starts up the PHP script, which in turn spawns a child script that has its output redirected to a file, which the bash script wrapper tails.
Here is the bash script I came up with:
#!/bin/bash
php php_script.php "$@"
# kill any leftover tail processes from previous runs
ps -ef | grep php_script.log | grep -v grep | awk '{print $2}' | xargs kill > /dev/null 2>&1
if [ "$1" != "-stop" ]
then
    tail -f php_script.log -n 0 &
fi
(it also cleans up "tail" processes that are still running so that you don't get a gazillion processes when you run this bash script multiple times)
And then in your PHP script, you spawn the child script like this:
exec("php php_script.php >> php_script.log &");
This way the parent PHP script exits without killing the child script, you still get the output from the child script, and your command prompt is still available for other commands.
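Assuming the wrapper is saved as wrapper.sh (the filename here is made up), usage would look something like this:
./wrapper.sh           # runs php_script.php, cleans up old tails, then tails php_script.log
./wrapper.sh -stop     # passes -stop through to php_script.php and skips the tail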
I've got a PHP script that executes a bash script on a remote server which fires a number of processes.
<?php
$connection = ssh2_connect('address1.com', 22);
ssh2_auth_password($connection, 'user', 'pass');
$stream = ssh2_exec($connection, '/root/incoming/process.sh');
?>
The bash script process.sh works fine when executed locally on the remote server; no issues.
#!/bin/bash
wget -O /root/incoming/myfile.mp3 http://address2.com/myfile.mp3;
lame --decode /root/incoming/myfile.mp3 - | /usr/settings/stereo_tool_cmd_64 - - -s /usr/settings/setting.sts | lame -b 128 - /var/www/processed/myfile.mp3
But when I try to execute it remotely using the PHP script, it bombs out at various stages of the first process (wget). It doesn't even complete the wget download, stopping at random points in the transfer.
Is this an issue with PHP ssh2_exec?
Or am I missing something?
Found it after much hunting.
My PHP script doesn't require any feedback from the shell script; I just needed to start it and forget about it.
What solved my problem was the following:
$stream = ssh2_exec($connection, "/root/incoming/process.sh &> /dev/null &");
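The redirection and trailing & are what make this work: ssh2_exec() opens a channel that is closed as soon as the PHP script (or the connection) goes away, and any remote process still attached to that channel dies with it. Detaching the script's output and backgrounding it lets it run to completion on its own. If the process still dies early, wrapping it in nohup may help as well (untested here):
$stream = ssh2_exec($connection, "nohup /root/incoming/process.sh > /dev/null 2>&1 &");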
Hope it helps somebody else.
Add set_time_limit(0); to the beginning of the script. PHP usually defaults to timing out after 30 seconds.
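For the script in the question, that would look like this (connection details unchanged):
<?php
set_time_limit(0); // disable PHP's default 30-second execution limit
$connection = ssh2_connect('address1.com', 22);
ssh2_auth_password($connection, 'user', 'pass');
$stream = ssh2_exec($connection, '/root/incoming/process.sh');
?>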
I have 5 PHP scripts that need to be executed one after another. How can I create a single PHP/batch script that executes all of them in sequence?
I want to schedule a cron job to run that file.
#!/path/to/your/php
<?php
include("script1.php");
include("script2.php");
include("script3.php");
//...
?>
Alternative
#!/bin/bash
/path/to/your/php /path/to/your/script1.php
/path/to/your/php /path/to/your/script2.php
/path/to/your/php /path/to/your/script3.php
# ...
If your scripts need to be accessed via HTTP:
#!/bin/bash
# -O /dev/null discards the downloaded response instead of saving it to a local file
wget --quiet -O /dev/null http://path/to/your/script1.php
wget --quiet -O /dev/null http://path/to/your/script2.php
wget --quiet -O /dev/null http://path/to/your/script3.php
# ...
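Since the goal is to run this from cron, the crontab entry would point at the wrapper script; for example (the path and schedule are placeholders):
# run the wrapper every day at 02:00, logging its output
0 2 * * * /bin/bash /path/to/your/wrapper.sh >> /var/log/wrapper.log 2>&1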
I did some testing using the "wait" command; it seems you just put wait between the calls to each script. As a little test, I created databases in PHP scripts, returned some records with a bash script, updated them with another PHP script, then returned the updated results with another bash script, and it all seemed to work fine.
From what I have read, as long as a subscript doesn't call another subscript, the master script will wait if the "wait" command is used between script calls.
Code is as below:
#!/bin/sh
/usr/bin/php test.php
wait
/bin/bash test.sh
wait
/usr/bin/php test2.php
wait
/bin/bash test2.sh
wait
echo all done
Hopefully this will execute all the PHP scripts in sequence.
I have this code:
<?php
exec("sleep 15m; ls -l");
echo "Done";
?>
But the page gets stuck loading until the 15 minutes have passed. Is there any way not to wait for the exec() call to finish?
From the PHP manual on exec():
Note:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
So you must redirect the output and start the command in the background:
exec("(sleep 15m; ls -l) >/dev/null 2>&1 &");
If I understand it correctly, you want to execute your command in the background:
exec("(sleep 15m ; ls -l) &")
I have the following exec() command with an & at the end so the script runs in the background. However, the script is not running in the background: it times out in the browser after exactly 5.6 minutes, and if I close the browser the script doesn't keep running.
exec("/usr/local/bin/php -q /home/user/somefile.php &")
If I run the script via the command line, it does not time out. How do I prevent the timeout? How do I run the script in the background using exec() so that it isn't browser dependent? What am I doing wrong, and what should I look at?
The exec() function handles the output of the program it executes, so I suggest you redirect that output to /dev/null (a virtual writable file that silently discards everything written to it).
Try running:
exec("/usr/local/bin/php -q /home/gooffers/somefile.php > /dev/null 2>&1 &");
Note: 2>&1 redirects error output to standard output, and > /dev/null redirects standard output to that virtual file.
If you still have difficulties, you can create a script whose only job is to execute other scripts. exec() stays attached to a process while it is running, but releases it as soon as the task finishes. If the executed script does nothing but launch another one, the task is very quick and exec() is released just as quickly.
Let's see an implementation. Create an exec.php that contains:
<?php
// Usage: php exec.php /path/to/script.php [arguments...]
if (count($argv) == 1)
{
    die('You must give a script to exec...');
}
// drop our own script name; the rest becomes the command to run
array_shift($argv);
$cmd = '/usr/local/bin/php -q';
foreach ($argv as $arg)
{
    $cmd .= " " . escapeshellarg($arg);
}
// launch in the background with output discarded, so this wrapper exits immediately
exec("{$cmd} > /dev/null 2>&1 &");
?>
Now, run the following command:
exec("/usr/local/bin/php -q exec.php /home/gooffers/somefile.php > /dev/null 2>&1 &");
If you have arguments, you can give them too:
exec("/usr/local/bin/php -q exec.php /home/gooffers/somefile.php x y z > /dev/null 2>&1 &");
You'll need to use shell_exec() instead:
shell_exec("/usr/local/bin/php -q /home/gooffers/somefile.php &");
That being said, if you have shell access, why don't you install this as a cronjob? I'm not sure why a PHP script is invoking another to run like this.
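For example, a crontab entry that runs the script every night at midnight (the schedule is just an illustration):
0 0 * * * /usr/local/bin/php -q /home/gooffers/somefile.php > /dev/null 2>&1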