I am trying to integrate a wget command I have written into a PHP script. The command recursively downloads every HTML/PHP file on a website (functionality I haven't found in file_get_contents()). I have tested the wget command in a terminal window, but when I execute it using either exec() or shell_exec(), nothing happens and I get no errors or warnings.
Here is the command in question:
wget --recursive -m --domains oooff.com --page-requisites --html-extension --convert-links -R gif,jpg,pdf http://www.oooff.com/
I have tried simple wget commands (not as many parameters) from exec(), and shell_exec(), but they also don't work.
If wget isn't an option, I am open to using some other method of downloading a website in its entirety.
The code I have now is:
exec("wget google.com", $array);
When I print the array afterwards, it is empty.
I had to specify a path to wget. New command:
exec("/usr/local/bin/wget google.com", $array);
Invoke wget with the proper options:
-q to suppress its informational output
-O - to write the response to stdout
php -r 'exec("wget -q -O - google.com", $array);var_dump($array);'
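If the array still comes back empty, a minimal diagnostic sketch (assuming wget is on the PATH) is to redirect stderr and capture the exit status, so failures stop being silent:
<?php
// Sketch: 2>&1 folds wget's error messages into the captured output,
// and $status holds the exit code (non-zero means wget failed).
exec('wget -q -O - google.com 2>&1', $array, $status);
var_dump($status, $array);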
In our case wget didn't have permission to save the downloaded file in the current directory. Solution:
$dir = __DIR__.'/777folder/';
exec("/usr/bin/wget -P ".$dir." --other-options");
where
-P, --directory-prefix=PREFIX save files to PREFIX/...
PS: we also added the /usr/bin prefix (found with whereis wget), but on our system it works fine without it.
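A minimal sketch of the same idea, with a writability check up front (the directory name and URL are placeholders):
<?php
// Sketch: make sure the prefix directory exists and is writable by the
// PHP/web-server user before handing it to wget's -P option.
$dir = __DIR__ . '/777folder/';
if (!is_dir($dir) && !mkdir($dir, 0777, true)) {
    die("cannot create $dir");
}
if (!is_writable($dir)) {
    die("$dir is not writable by the PHP user");
}
exec('/usr/bin/wget -P ' . escapeshellarg($dir) . ' -q http://www.example.com/', $out, $rv);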
I am trying to implement a simple backup feature for some directories (mainly directories in /etc), handled by Laravel. Basically I store .tar archives containing the files of specific directories.
This is a command used to create a backup archive of a single directory:
shell_exec("cd {$backupPath} && tar -cf {$dirName}.tar -P {$fullPathToDir}")
This is a command to restore directory from a backup archive:
shell_exec("cd / && sudo tar -xf {$backupPath . $dirName} --recursive-unlink --unlink-first")
For testing purposes I let the http user run sudo tar, though my initial idea was to create a bash script to handle that and add it to sudoers. Running the command directly or via the shell script gives the same errors.
The problem is that if I run it through PHP, I get errors like this:
Cannot unlink: Read-only file system
But if I run it from the command line, it works:
su http -s /bin/bash -c "cd / && sudo tar -xf {$backupPath . $dirName} --recursive-unlink --unlink-first"
Running this on both a full Arch Linux system and an Arch Linux Docker container gives me the same results. I would appreciate any kind of help.
So the issue was with the systemd unit for php-fpm 7.4, where ProtectSystem was set to full; after commenting that out, everything worked as expected:
sed -i 's:ProtectSystem=full:#ProtectSystem=full:' /usr/lib/systemd/system/php-fpm7.service
Sorry for my bad English. I have a PHP file like this:
<?php
exec("sh /tmp/script.sh");
echo "Work!";
?>
and this is the script:
#!/bin/bash
url="http://someweb.com/get.php?user=user&pass=pass";
wget -O /tmp/file.txt "$url"
sed -i 's/#Test_file/Ok_Test_file/' /tmp/file.txt
cp /tmp/file.txt /var/www/_client/personale/file.txt
Now when I load file.php in the browser, the script partially works: the wget and sed commands are performed, but cp does not work and the file is not copied!
If I run the script manually in a terminal (Debian 8), all commands are executed.
Where is the problem?
Thanks.
Joele
PHP likely does not have permission to execute the command. Try using sudo.
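If you do go the sudo route, something like the following sketch avoids a password prompt that would otherwise hang the request (the script path and web user are assumptions for illustration):
<?php
// Sketch: assumes a sudoers entry (added via visudo) lets the web user
// run this exact script without a password, e.g.:
//   www-data ALL=(root) NOPASSWD: /tmp/script.sh
exec('sudo /tmp/script.sh 2>&1', $output, $status);
echo "exit code: $status\n" . implode("\n", $output);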
Hi, I have the following method which, when triggered, should run a command in Git Bash.
function convert($tmpName, $fileName, $fileSize, $fileType){
}
The command will be something like this:
pyang -f yin -o H:\\YangModels\\yin\\ietf-inet-types.yin ietf-inet-types.yang
I was looking at shell commands here but don't know if this relates to Git Bash or not.
Just looking for a way to run commands in Git Bash when a PHP method runs, thanks.
The command has to be run through Git Bash; it will not work through the normal command line or Windows shell.
Edit: I've been looking into it more and found a command that could be similar to what I need. That user seems to be running his command through Cygwin; I'm trying to adapt mine to target Git Bash but haven't figured it out yet.
$result = shell_exec('C:/cygwin/bin/bash.exe /c --login -i git');
exec() or shell_exec() can run any command you would normally run via the command line, e.g.
shell_exec('cd /var/www && /path/to/git pull origin master');
I'm not sure exactly how your code is formed but it might be something like:
function convert($tmpName, $fileName, $fileSize, $fileType){
$output = shell_exec('pyang -f '.$fileType.' -o '.$tmpName.' '.$fileName);
}
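Since the command must run inside Git Bash specifically, one possible approach is to hand the whole command string to bash.exe with -lc. The bash.exe path below is an assumption; adjust it to your Git installation, and note that Git Bash sees H:\ as /h/:
<?php
// Sketch: -l gives a login shell so Git Bash's PATH is set up,
// -c executes the command string; escapeshellarg keeps it one argument.
$bash = '"C:\\Program Files\\Git\\bin\\bash.exe"';
$cmd  = 'pyang -f yin -o /h/YangModels/yin/ietf-inet-types.yin ietf-inet-types.yang';
$result = shell_exec($bash . ' -lc ' . escapeshellarg($cmd) . ' 2>&1');
echo $result;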
Just for fun I am trying to trigger wget downloads from the PHP CLI (/usr/bin/php -a), and it works:
php > `wget -q -nd -O /Users/user/path/to/file.html http:\/\/host.com/some_file.html`;
php > // a HTML file is downloaded and its content is placed into /Users/user/path/to/file.html
However, when I try to do the same thing from a PHP script, it does not work:
<?php
`wget -q -nd -O /Users/user/path/to/file.html http:\/\/host.com/some_file.html`;
// After requesting the script from the browser and after the script is executed, the file doesn't exist on the specified path
I would like to point out that the user which runs Apache (and therefore server-side PHP) is the same user which executes the php command from the command line, so I guess this should not be a permissions problem.
Why is the file not saved when I call wget inside a .php script?
Thanks for the attention!
PS: please do not tell me that I can use cURL for such a task; I know I can. I am just curious how I can do something similar without using PHP's built-in tools, that's it.
Use the full path to wget, since the daemon doesn't run your .bash_profile.
`/opt/local/bin/wget -q -nd -O /Users/user/path/to/file.html http://host.com/some_file.html`;
BTW, there's no need to escape / in shell commands.
The backtick operator runs a shell command and returns the output from the shell command.
In this situation, simply logging (or, if you have to, echo-ing) the result will probably reveal the error. Note that the backtick operator captures only stdout, so drop -q and append 2>&1 to actually see wget's messages. E.g.:
$result = `wget -nd -O /Users/user/path/to/file.html http://host.com/some_file.html 2>&1`;
trigger_error($result, E_USER_NOTICE);
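An exec()-based variant of the same diagnostic, which also surfaces the exit code:
<?php
// Sketch: exec() fills $out with the output lines and $code with the
// exit status; 2>&1 makes wget's stderr part of the captured output.
exec('wget -nd -O /Users/user/path/to/file.html http://host.com/some_file.html 2>&1', $out, $code);
trigger_error("wget exited with $code: " . implode(' | ', $out), E_USER_NOTICE);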
I have installed ffmpeg on my server and it works fine via my terminal. I'm able to successfully convert a file to webm format, so I'm sure the installation is fine. I'm also sure that I only have one installation of ffmpeg installed on my machine.
A problem arises when I try to convert files through PHP via exec(). When I run the same commands I ran in the terminal, nothing happens. I looked around Stack Overflow and other parts of the net for help, then tried this to see the output:
exec($cmd, $out, $rv);
echo "output is:\n".implode("\n", $out)."\n exit code:$rv\n";
The output is: "output is: exit code:127"
The command I'm using is in this format:
ffmpeg -i "sample.mov" -vcodec libvpx -r 30 -b "644k" -acodec libvorbis -ab 128000 -ar "44100" -ac 2 -s "352x198" "sample.webm"
I've tried replacing "ffmpeg" with the full path to FFmpeg but that did not work.
Why isn't the script running the command correctly and converting the files?
Thank you!
Error code 127 means the executable (ffmpeg) couldn't be found. Try specifying the whole path (you can find that out in your terminal with which ffmpeg), or compare the value of the PATH environment variable in your PHP script and in your terminal.
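A quick sketch for comparing the two environments (the ffmpeg path is an assumption; substitute whatever which ffmpeg prints in your terminal):
<?php
// Sketch: show the PATH the web server's PHP actually sees, then call
// ffmpeg by absolute path to take PATH out of the equation entirely.
echo 'PATH: ' . getenv('PATH') . "\n";
exec('/usr/local/bin/ffmpeg -version 2>&1', $out, $rv);
echo implode("\n", $out) . "\nexit code: $rv\n";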
I had a similar problem with Ant target executions from PHP. I couldn't get the whole output from the ant command, only the first few rows, and the Ant target was not executed; in other words, it was partially executed.
With the command below I managed to run it; the output of the command is appended to log_file.log.
$commandString = 'you_command_here >> log_file.log 2>&1 &';
$command = exec($commandString);
Hope this will work for you.
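For what it's worth, the trailing & backgrounds the process, so exec() returns immediately and the log file is the only place the output lands. A sketch of inspecting it afterwards (the command and paths are placeholders):
<?php
// Sketch: fire-and-forget the long-running command, then read the log back.
exec('ant build >> /tmp/log_file.log 2>&1 &');
// ...later, e.g. on a status page:
echo nl2br(htmlspecialchars(file_get_contents('/tmp/log_file.log')));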