I have a website that needs to build a Debian package and move it into a different directory for people to download. I have been able to do this on Linux using bash scripts that build the package, generate a Packages file with dpkg, and compress it. Here's the bash script:
#!/bin/bash
# prompt for the package directory name
echo "Enter app name"
read NAME
cd /home/stumpx/cydia/apps
# build the .deb from the directory named $NAME
dpkg -b "$NAME"
cp "/home/stumpx/cydia/apps/$NAME.deb" "/home/stumpx/cydia/upload/deb/$NAME.deb"
cd /home/stumpx/cydia/upload
# regenerate the Packages index and compress it (-k keeps the original)
dpkg-scanpackages -m . /dev/null > Packages
bzip2 -f -k /home/stumpx/cydia/upload/Packages
It would be nice, I guess, to make the .bz2 files as well.
You forgot to ask a question, but I'll answer it regardless: use exec() to invoke your bash script.
Basically you need to execute system commands, which is done via exec() in PHP. So you will have to write a bash script that does it all (build the package, compress it, and move it) and execute it from PHP.
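For example, a minimal sketch, assuming the script above is saved as /home/stumpx/cydia/build.sh (a hypothetical path) and changed to take the app name as $1 instead of reading it interactively:
<?php
// Hypothetical wrapper around the build script; the path and the
// argument-passing convention are assumptions, not part of the original.
$name = escapeshellarg($_POST['app_name'] ?? '');
exec("/home/stumpx/cydia/build.sh $name 2>&1", $output, $status);
if ($status !== 0) {
    echo "Build failed:\n" . implode("\n", $output);
}
The 2>&1 matters: without it, errors from dpkg go to the web server's error log instead of into $output.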
Related
If I ssh to the server, cd public_html and run the shell script (below), it works fine.
On second thought, it would be easier to just set up a crontab on the server and have it run every day.
But if I run it from the web page outlined below, the zip file 'chessclub.zip' is not created or synced. The bash script is located on the server at '/home/user/public_html/' but it won't be found and executed. How can I get the bash script to execute on the server, not locally?
HTML
<button onclick="location.href='getZIP.php';">ZIP IT</button>
PHP 'getZIP.php'
<?php
shell_exec("/home/user/public_html/backup_cccr");
?>
SHELL SCRIPT ON SERVER ("backup_cccr")
#!/bin/bash
zip -r -9 -FS chessclub.zip * -x chessclub.zip
The best idea was to scrap the PHP zip functions in favor of the zip command in bash, because the bash version works better: (backup_cccr)
#!/bin/bash
zip -r -9 -FS chessclub.zip public_html/* -x 'public_html/chessclub.zip'
cp chessclub.zip public_html/
Copying the updated chessclub.zip into public_html makes the file accessible from a web browser.
I used a daily cron job to create the backup automatically, which is easy to do.
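For reference, a hypothetical crontab entry for this (added with crontab -e); the 02:00 schedule and the script path are assumptions:
# run the backup script every day at 02:00
0 2 * * * /home/user/backup_cccr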
I am trying to implement a simple backup feature for some directories (mainly directories under /etc), handled by Laravel. Basically I store .tar archives containing the files of a specific directory.
This is a command used to create a backup archive of a single directory:
shell_exec("cd {$backupPath} && tar -cf {$dirName}.tar -P {$fullPathToDir}")
This is a command to restore directory from a backup archive:
shell_exec("cd / && sudo tar -xf {$backupPath . $dirName} --recursive-unlink --unlink-first")
For testing purposes I let the http user run sudo tar directly; my initial idea, though, was to create a bash script that handles the restore and add that to sudoers (a hypothetical entry is sketched below). Running the command directly or via a shell script gives the same errors.
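For completeness, a hypothetical sudoers entry for that approach (edited with visudo; the script name and path are assumptions):
# allow the http user to run only the restore script as root, without a password
http ALL=(root) NOPASSWD: /usr/local/bin/restore-backup.sh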
The problem is that if I run it through PHP I get errors like this:
Cannot unlink: Read-only file system
But if I run it from the command line, it works:
su http -s /bin/bash -c "cd / && sudo tar -xf {$backupPath . $dirName} --recursive-unlink --unlink-first"
Running this on both a full Arch Linux system and an Arch Linux Docker container gives me the same results. I would appreciate any kind of help.
The issue turned out to be the systemd unit for php-fpm 7.4, where ProtectSystem was set to full. That setting mounts /usr, /boot, and /etc read-only for the service, which is exactly why tar failed with 'Cannot unlink: Read-only file system' when unpacking into /etc. After commenting it out, everything worked as expected:
sed -i 's:ProtectSystem=full:#ProtectSystem=full:' /usr/lib/systemd/system/php-fpm7.service
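After editing the unit file, systemd has to reload it and the service needs a restart; the unit name here is taken from the path above:
systemctl daemon-reload
systemctl restart php-fpm7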
In PHP running on Ubuntu, I can run exec('npm -v') and the output is good, but I can't run exec('gitbook xxxx').
gitbook is an npm package I installed with
npm install gitbook -g
I can run gitbook xxxx in the Ubuntu terminal; how can I run it from my PHP code?
If you run PHP under nginx or Apache (for example, when you visit example.com/index.php), sometimes you need to export the PATH:
exec("export PATH=/usr/local/bin && gitbook build");
After I added the export PATH, everything worked fine.
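An alternative sketch is to leave PATH alone and call the binary by its absolute path; /usr/local/bin/gitbook is an assumption, so check the real location with which gitbook in a terminal first:
<?php
// the absolute path avoids depending on the web server's PATH
exec('/usr/local/bin/gitbook build', $output, $status);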
I once tried it like this on a UNIX-based OS:
You can run shell commands via the exec() function:
// in a PHP file, launch the node script in the background;
// redirect its output, or exec() will block until the script exits
exec("node yourscript.js > /dev/null 2>&1 & echo $!", $output);
Here $output becomes an array of the command's output lines; because of the appended echo $!, its last element is the background process ID. You can kill the process by that ID as well:
exec("kill " . $processid);
That is how I did it. Apart from this, you can use node supervisor. Hope it helps.
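Putting it together, a minimal sketch (yourscript.js is assumed to exist in the working directory):
<?php
// start the script in the background and remember its PID
exec("node yourscript.js > /dev/null 2>&1 & echo $!", $output);
$processid = (int) end($output);  // the last line comes from `echo $!`
// ... later, stop it again
exec("kill " . $processid);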
How can I launch a PHP script from the Linux console?
For example, I have a myscript.php file. I should be able to launch it as myscript from the console (I can create and use any other scripts).
I should also be able to send parameters to it, like myscript dosome [-n <count>]. Can I do this?
You could find this with a very simple Google search.
First you should install the PHP CLI. For example:
sudo apt-get install php5-cli
After that, write some code in your PHP file:
nano example.php
and write:
#!/usr/bin/env php
<?php phpinfo(); ?>
To run this script directly you should give it execute permission (the shebang line above is what tells the shell to run it with PHP):
sudo chmod 755 example.php
Then you can run it as:
./example.php
To run a PHP script from the shell with arguments, create a bash script, name it myscript, and add this code (use the full path to your PHP binary if it is not on your PATH):
#!/bin/bash
php /path/to/myscript.php "$@"
Using "$@" instead of $* preserves the argument boundaries when arguments contain spaces.
Then add the script's folder to your Linux PATH:
$ export PATH=$PATH:/path/to/folder/
Then you can invoke the script as myscript arg1 -n 20. Inside the PHP script, the arguments are available via the $argv array, as in the sketch below.
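A minimal sketch of reading those arguments (dosome and -n are just the examples from the question):
#!/usr/bin/env php
<?php
// $argv[0] is the script path; the rest are the arguments in order
$command = isset($argv[1]) ? $argv[1] : '';  // e.g. "dosome"
$count = 1;
for ($i = 2; $i < $argc; $i++) {             // scan for -n <count>
    if ($argv[$i] === '-n' && isset($argv[$i + 1])) {
        $count = (int) $argv[$i + 1];
        $i++;
    }
}
echo "command=$command, count=$count\n";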
I am trying to integrate a wget command I have written into a PHP script. The command recursively downloads every html/php file on a website (required functionality that I haven't found in file_get_contents()). I have tested the wget command in a terminal window, but when I execute it using either exec() or shell_exec(), nothing happens. I don't get any errors or warnings.
Here is the command in question:
wget --recursive -m --domains oooff.com --page-requisites --html-extension --convert-links -R gif,jpg,pdf http://www.oooff.com/
I have tried simpler wget commands (with fewer parameters) from exec() and shell_exec(), but they don't work either.
If wget isn't an option, I am open to using some other method of downloading a website in its entirety.
The code that I have now is:
exec("wget google.com", $array);
Then when printing the array, it is empty.
I had to specify the full path to wget. New command:
exec("/usr/local/bin/wget google.com", $array);
Invoke wget with the proper options:
-q to suppress its informational output
-O - to write the response to stdout
php -r 'exec("wget -q -O - google.com", $array);var_dump($array);'
In our case wget didn't have enough permissions to save the downloaded file in the current directory. Solution:
$dir = __DIR__.'/777folder/';
exec("/usr/bin/wget -P ".$dir." --other-options");
where
-P, --directory-prefix=PREFIX save files to PREFIX/...
P.S. We also used the full path /usr/bin/wget (found with whereis wget), but on our system it works fine without it.
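Putting the answers above together, a sketch that combines the full binary path, an explicit download directory, and capturing stderr so failures are actually visible (the paths are assumptions; check them with whereis wget):
<?php
$dir = __DIR__ . '/777folder/';
// 2>&1 folds wget's stderr into $output so errors are not silently lost
exec("/usr/bin/wget -P " . escapeshellarg($dir) . " google.com 2>&1", $output, $status);
if ($status !== 0) {
    echo implode("\n", $output);
}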