I am trying to make a simple program in Laravel that lets the user click a button and have an SQL backup of their database generated locally.
$command = "/usr/local/mysql/bin/mysqldump --user=" . env('DB_USERNAME') ." --password=" . env('DB_PASSWORD') . " --host=" . env('DB_HOST') . " " . env('DB_DATABASE') . " > " . storage_path() . "/" . $filename;
$returnVar = NULL;
$output = NULL;
exec($command, $output, $returnVar);
This is my current code. As you can see, I have specified the entire path to mysqldump; if I don't, it just returns an empty .sql file. The same command (without the full path) runs flawlessly when I run it in a terminal. This code is going to run locally on different environments (Windows, Linux, Mac), and the path to mysqldump will be different on each.
Is there a way I can make this happen without specifying the entire path?
You could use another env variable defaulting to 'mysqldump'.
However, you shouldn't use env variables directly in your code; load them into config parameters in your config files instead, because env() won't work once the config is cached.
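A minimal sketch of that approach (the database.dump_path key and MYSQLDUMP_PATH variable are names I made up for illustration, not Laravel built-ins):
// config/database.php — map the env variable once, with a sane default:
// 'dump_path' => env('MYSQLDUMP_PATH', 'mysqldump'),

// Application code then reads the cacheable config value instead of env():
$command = config('database.dump_path', 'mysqldump')
    . ' --user=' . escapeshellarg(config('database.connections.mysql.username'))
    . ' --password=' . escapeshellarg(config('database.connections.mysql.password'))
    . ' --host=' . escapeshellarg(config('database.connections.mysql.host'))
    . ' ' . escapeshellarg(config('database.connections.mysql.database'))
    . ' > ' . escapeshellarg(storage_path($filename));
exec($command, $output, $returnVar);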
The underlying problem, however, is that /usr/local/mysql/bin is not in the PATH variable for the web user; see "PHP exec $PATH variable missing elements".
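If you'd rather fix PATH itself, you can extend it for the PHP process before calling exec(); a small sketch (using the asker's directory, adjust to your system):
// Append MySQL's bin directory to PATH for this PHP process only.
putenv('PATH=' . getenv('PATH') . ':/usr/local/mysql/bin');
exec('mysqldump --version', $output, $returnVar); // should now resolve without a full path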
<?php // BISMILLAHIRRAHMANIRRAHEEM
error_reporting(E_ALL);
ini_set("display_errors", 1);
// tried the following with soffice, soffice.com, soffice.exe, soffice.bin
// MS-DOS: soffice, soffice.com, soffice.bin all work successfully and generate the pdf
$ex = "soffice.com --headless --convert-to html \"" . getcwd() . DIRECTORY_SEPARATOR . "sources" . DIRECTORY_SEPARATOR . "test.docx\" --outdir \"" . getcwd() . DIRECTORY_SEPARATOR . "results\"";// 2>&1";
chdir("C:\\Program Files\\LibreOffice\\program"); // backslashes escaped so the path survives the double-quoted string
echo $ex . "<br />";
exec($ex, $output);
//exec("mkdir rr", $output); // generates directory
//exec("dir soff*", $output);
//exec("a.bat", $output); // doesn't generate pdf
print_r($output);
?>
Hi, I'm using IIS on Windows 7 with PHP 5.6.39. I am unable to convert a docx document to PDF by calling LibreOffice via PHP. The command echoed to screen in the code example works when executed in an MS-DOS window, but not via PHP's exec() or shell_exec(). The folder seems to have the appropriate permissions, since my upload scripts and exec("mkdir newdir") are working.
exec("dir"); also outputs directory listings for the mentioned path, so DOS navigation commands seem to work as well.
I'd like to avoid the solution of creating another user; it's not elegant. This shouldn't be difficult, INSHAALLAH.
By not working, I mean no document is generated and no output or error is produced by exec() either, which makes it difficult to debug.
Thank you for your time.
Update:
After some fiddling, I've realized the return code is: -1073741515
Google results show that it's apparently a file I/O failure.
Another script I downloaded, which does something similar, gave the same error, so I'm looking into the causes. As I said before, directories can be created and files uploaded, so the permissions are there.
In the link I've posted, there seems to be a need to create a new user. Is it possible that IIS has permissions to the directory but PHP's exec() does not?
On Linux, it gives me error code 77 (permission denied?).
BISMILLAHIRRAHMANIRRAHEEM
ALHAMDOLILLAH, after much searching, I finally found the answer.
Windows:
I had already set some permissions etc. earlier, which should be trivial with IIS. But what finally solved the problem was this.
Apparently you need to specify -env:UserInstallation=file:///C:\some\path, because you cannot run multiple instances of LibreOffice concurrently without specifying a different profile folder for each (Reference).
Also, see the same solution here.
When executing a program via PHP under IIS, the Windows user account used is IUSR.
So,
1. Give IUSR write (and modify? I give full) permissions to be able to write to the C:\inetpub\wwwroot\path directory.
2. Change directory to the LibreOffice path for soffice.com (not needed if providing the full path when executing):
chdir("C:\\Program Files\\LibreOffice\\program");
3. Set the HOME environment variable, as LibreOffice needs a writable temporary path. IUSR must have permissions to this path. I simply re-use wwwroot (this step may be optional):
putenv("HOME=C:\\inetpub\\wwwroot\\temp");
4. Execute in PHP, something like:
exec("\"C:\\Program Files\\LibreOffice\\program\\soffice.com\" --headless \"-env:UserInstallation=file:///C:/inetpub/wwwroot/LibreOfficeProfilePath1\" --convert-to pdf:writer_pdf_Export --outdir \"C:\\inetpub\\wwwroot\\result\" \"C:\\inetpub\\wwwroot\\source\\file.docx\"", $output, $ret);
// Displaying the command output
print_r($output);
echo "<br />\n\n<br />"; // Line breaks for view-source
echo "Return code: " . $ret;
5. Clean up by removing the profile directories for LibreOffice. Since you'd be making unique directories on the fly on a server for concurrent LibreOffice sessions, you'll want to delete them too. The following snippet is from here:
exec("rmdir /S /Q \"C:\\inetpub\\wwwroot\\LibreOfficeProfilePath1\"");
You could generate this profile path automatically, like:
// Do this before executing the soffice command and pass it to -env
$profilepath = "C:/inetpub/wwwroot/LibreOfficeProfilePath" . date("YmdHis") . rand(0, 999999);
// Windows: rmdir needs backslashes, so replace '/' with '\' when deleting
exec("rmdir /S /Q \"" . str_replace("/", "\\", $profilepath) . "\"");
// Linux: rm takes the forward-slash path as-is
exec("rm -rf \"" . $profilepath . "\"");
Or instead use PHP to delete a directory with files in it, as shown here.
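For the pure-PHP route, a minimal sketch using the SPL iterators (no shell call, so the same code runs on Windows and Linux):
// Recursively delete a LibreOffice profile directory without shelling out.
function deleteProfileDir($dir)
{
    $items = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($dir, FilesystemIterator::SKIP_DOTS),
        RecursiveIteratorIterator::CHILD_FIRST
    );
    foreach ($items as $item) {
        $item->isDir() ? rmdir($item->getPathname()) : unlink($item->getPathname());
    }
    rmdir($dir); // remove the now-empty top directory itself
}
deleteProfileDir($profilepath);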
Running LibreOffice as a service and having PHP request document conversions may be a better solution, but this post relates to invoking LibreOffice each time a conversion is needed and then letting it close. A separate-process method may also be used to avoid waiting for LibreOffice to complete the conversion.
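As a sketch of that separate-process idea on Windows (you give up $output and the return code; the paths follow the steps above):
// 'start /B' detaches soffice so PHP returns immediately instead of
// waiting for the conversion to finish.
$cmd = 'start /B "" "C:\\Program Files\\LibreOffice\\program\\soffice.com"'
     . ' --headless --convert-to pdf --outdir "C:\\inetpub\\wwwroot\\result"'
     . ' "C:\\inetpub\\wwwroot\\source\\file.docx"';
pclose(popen($cmd, 'r'));
// On Linux, backgrounding with '&' and redirecting output does the same:
// exec('soffice --headless --convert-to pdf file.docx > /dev/null 2>&1 &');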
Linux:
Similar steps apply for www-data, as that is the user account for Apache on Linux. See the solution here.
www-data does not have a $HOME directory, so it needs to be specified as indicated in step 3 above.
env path would be something like:
\"-env:UserInstallation=file:///var/www/tmp\"
Notice the three slashes after file: regardless of Windows or Linux. In a file:// URL the host part is empty, so the path always begins at the third slash; on Linux that slash is the root of the path itself, while on Windows the slash before the drive letter is effectively ignored. It seemed inconsistent to me at first, though.
INSHAALLAH it'll work after that.
I have a Python script that runs Keras-based neural networks. It runs fine, but when I run the script through shell_exec() in my PHP and var_dump the result, I get NULL.
I have already looked at many solutions. I have increased max_execution_time and max_input_time in php.ini to 5000. I am also giving the full paths of the files and the required environment in the command, and I have given full permissions to the folders (lampp, htdocs, and my PHP file's folder). This is the command that finally gets called in shell_exec():
source /home/characterleveldl/anaconda3/bin/activate /home/characterleveldl/anaconda3/envs/spyderr && python fyp_final.py 5 /opt/lampp/htdocs/charlevel_fyp/train.csv /opt/lampp/htdocs/charlevel_fyp/test.csv 86save_true.pdf 86save_random.pdf 86save_result.txt 2>&1
fyp_final.py, train.csv, and test.csv all exist in the same folder as the PHP file. I ran the exact same command via terminal in root mode and it ran without any error, but the PHP code below returns NULL.
$cmd_initial ='source /home/characterleveldl/anaconda3/bin/activate /home/characterleveldl/anaconda3/envs/spyderr && ';
$cmd = $cmd_initial . 'python fyp_final.py ' . $num_classes . ' ' . $train . ' ' . $test . ' ' . $save_true . ' ' . $save_random . ' ' . $save_text . ' 2>&1';
$command = escapeshellcmd($cmd);
$output = shell_exec($command);
var_dump($cmd);
var_dump($output);
Please help me. Am I doing something wrong, or is it even possible to run machine learning code from PHP at all? The small scripts I tried ran perfectly.
I figured it out. I solved it by putting my command inside a shell script and then executing that script via shell_exec(). It now works fine.
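For anyone hitting the same wall, a minimal sketch of that wrapper approach (the script name run_model.sh is mine). Two likely culprits with the original one-liner: shell_exec() runs through /bin/sh, which on many systems (dash on Ubuntu) does not understand the bash-only source builtin, and escapeshellcmd() escapes the && characters, which breaks the command chaining.
// run_model.sh (note the bash shebang, since 'source' is a bash builtin):
//   #!/bin/bash
//   source /home/characterleveldl/anaconda3/bin/activate /home/characterleveldl/anaconda3/envs/spyderr
//   python fyp_final.py "$@"

$output = shell_exec('bash /opt/lampp/htdocs/charlevel_fyp/run_model.sh '
    . escapeshellarg($num_classes) . ' ' . escapeshellarg($train) . ' '
    . escapeshellarg($test) . ' 2>&1');
var_dump($output);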
I'm using exec() to run mysqldump, but it generates an empty file under Apache. When I use php artisan serve, the file is generated correctly; the output file has the same user and group under Apache as under artisan serve.
Using: Ubuntu 14.04 and XAMPP 5.6.12
$dir = substr(__DIR__, 0, 24).'database/backups/';
$newBackup = Backup::create();
$command = 'mysqldump -uroot lions > '.$dir.$newBackup->getDateTimeString().'.sql';
exec($command);
OK, I was facing the same problem with Laravel 5.6, though running Windows. Before anything else, you should make sure to include the password argument (either as --password or -p), even if it's empty.
$cmd =
"mysqldump -h " . env('DB_HOST') .
" -u " . env('DB_USERNAME') .
" -p\"" . env('DB_PASSWORD') . "\"" .
" --databases " . env('DB_DATABASE');
The main issue is that even though several sources note that exec() waits until the process is complete, there is no way to know for sure. Leaving that aside, since the processes could be owned by different users, it'd probably be better to create a temporary file and then flush the contents of the output array into it yourself, rather than hoping for mysqldump to do it for you.
$output = [];
$name = "your_random_file_name.sql";
exec($cmd, $output);
$tmppath = tempnam(sys_get_temp_dir(), $name);
$handle = fopen($tmppath, "w");
fwrite($handle, implode("\n", $output));
Finally, write the main file to your desired location (in my case I upload it directly to an Amazon S3 bucket) and close the temporary file.
Storage::disk('myS3bucket')
->putFileAs("backups", new File($tmppath), $name);
fclose($handle);
I know it seems redundant but, as I mentioned above, there is no way to know for sure.
Indeed, in a development environment it is customary not to set a password for the database, and that's why your command does not work.
But in a production environment, you are obliged to set a password for the database.
Try it in your production environment (i.e. with a password!), and you will see that the script works.
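A minimal sketch of how you might guard against that, reading from Laravel's config() and adding the password flag only when one is set (the backup.sql name is arbitrary):
$password = config('database.connections.mysql.password');
$cmd = 'mysqldump --user=' . escapeshellarg(config('database.connections.mysql.username'));
if ($password !== '') {
    $cmd .= ' --password=' . escapeshellarg($password); // only pass the flag when a password exists
}
$cmd .= ' ' . escapeshellarg(config('database.connections.mysql.database'))
    . ' > ' . escapeshellarg(storage_path('backup.sql'));
exec($cmd, $output, $returnVar);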
I don't have a solution for this problem, but I do know it should be possible. If I enter the following command in my terminal, the mysqldump works:
mysqldump "--user=root" "--password=" dbname > ~/Desktop/export.sql
But when I use this command in my scheduler as below, I get an empty file.
$schedule->exec("mysqldump \"--user=".env('DB_USERNAME')."\" \"--password=".env('DB_PASSWORD')."\" ".env('DB_DATABASE')." > export.sql")->everyMinute();
Maybe someone knows why this doesn't work? I'm guessing it's a permission issue, but I lack experience with this.
So I have 5 cloud machines running, and my 1st one is set up as an Apache server. My goal is for users to click the submit button on the webpage, which will run parallel-ssh on my 1st machine and launch a script on the other cloud machines. I have the webpage running and the script running, and this is my attempt to launch parallel-ssh from index.php.
So "master.txt" is on the 1st cloud machine that holds on the info about the other cloud machines. StrictHostKeyChecking is used to overlook the security checks. And the perl command is what will be launched on all the cloud machines. I know this question is fairly confusing but I'm new to both php and perl and I really need an answer for this project. Sorry, it's one big command but I had to break them into lines because it wouldn't display right on here.
Maybe you'd have an easier time if you connected to each server through libssh2 or phpseclib and ran the commands on each machine that way?
This is a big command.
Might I suggest the following instead, so it can be better understood.
In the code below, I use escapeshellarg a lot to make sure all the shell arguments are properly escaped and not open to attack. How much this matters also depends on whether your variables are trusted, but it shouldn't hurt unless an argument variable is actually composed of multiple arguments or something similarly uncommon.
<?php
$result = shell_exec(
'parallel-ssh -h master.txt "-O StrictHostKeyChecking=no" ' . // SSH command
'-t 5 ' . // 5 second timeout on each host
'-l divx ' . // User
'-i ' . // Inline mode used for troubleshooting.. Take this out once it works.
'-P ' . // Print the output. This will only return it so it is stored in $result
escapeshellarg(
'perl /mnt/nas-storage/EncoderSetup/commTools/con.pl ' . // Executes a perl file
escapeshellarg($input) . ' ' . // $input arg to perl command
escapeshellarg($output) . ' ' . // $output arg to perl command
escapeshellarg($intraperiod) . ' ' . // $intraperiod arg to perl command
escapeshellarg($res) . ' ' . // $res arg to perl command
escapeshellarg($qp) . ' ' . // $qp arg to perl command
escapeshellarg($framerate) . ' ' . // $framerate arg to perl command
escapeshellarg($startframe) . ' ' . // $startframe arg to perl command
escapeshellarg($numgop) . ' ' . // $numgop arg to perl command
escapeshellarg($enc) . ' ' . // $enc arg to perl command
escapeshellarg($cfg) . ' ' . // $cfg arg to perl command
escapeshellarg($sao) . ' ' . // $sao arg to perl command
escapeshellarg($wafrosync) . ' ' . // $wafrosync arg to perl command
escapeshellarg($amp) . ' ' . // $amp arg to perl command
escapeshellarg($tmvp) . ' ' . // $tmvp arg to perl command
escapeshellarg($transkp) . ' ' . // $transkp arg to perl command
escapeshellarg($fasttranskp) . ' ' . // $fasttranskp arg to perl command
escapeshellarg($goploc) // $goploc arg to perl command
)
);
print $result;
This should work for you but there are some things to consider.
First, execute it and print out $result to see what the actual output is. If you get something like
[FAILURE] server.hostname Exited with error code 255
then it is possible that pssh is asking for a password for each host. I noticed that you are using the -A option, which asks for a password. You can't do that with shell_exec in PHP, because the script will hang and wait forever for a password. Instead, you need to set up SSH keys so that your first cloud server can ssh into the other cloud servers without a password. Setting up SSH public key based authentication is actually pretty easy, though not if you've never done it before; I'm sure there are plenty of posts on how to set it up. The procedure is basically:
Generate a public and private key (No passphrase).
Type in this command at your first cloud server: ssh-keygen
Don't enter a passphrase when it asks you
Copy the id_rsa.pub file to the ~/.ssh/authorized_keys file on each of the secondary cloud servers
Make sure the .ssh folder has 700 permissions on each of the cloud servers
Make sure the .ssh/authorized_keys file has 600 permissions on each of the cloud servers.
If all went as planned, you should be able to execute commands on each of the cloud servers from your main cloud server, securely and without a password. Now you can just run your command and it should work... or at least give you output as to why it didn't, so you can continue troubleshooting.
Another concern is the user shell_exec runs as. If you are running a web server on your main cloud server, you will have to make sure the current user (usually apache) has the id_rsa file in the .ssh folder under your Apache home directory (usually /var/www/). So you would put the id_rsa file in the /var/www/.ssh/ folder and make sure it is owned by apache. Also, make sure it is chmod 600 to protect it.
There are other security concerns, like protecting your id_rsa file. Don't run any untrusted scripts on your server, and don't use virtual hosts where users upload their own files for their own websites. The concern is that any script that runs as apache can easily access, and compromise, your id_rsa file... yikes. Anyone who has access to this file will easily gain access to each of your cloud servers, so protecting it should not be taken lightly.
I'm trying to import lots of CSV files into a MySQL DB programmatically.
After some research I found LOAD DATA, but it's not a possibility, as the server doesn't allow it.
Then I found mysqlimport, a viable alternative.
Here's what is happening:
I download lots of CSV files from an FTP server and, one by one, I execute the following:
exec("ln " . $path . $csv_file. " " . $path . "CSVs.txt");
exec("mysqlimport -u root -ppass --local --ignore-lines=1 --columns=column1,column2 " . $path . "CSVs.txt>" . $path . "_.txt");
unlink($path . "CSVs.txt");
First, I create a hard link to the file with ln (as I didn't manage to use a file name different from the table name).
Then, I execute the mysqlimport command, sending the output do a txt file.
Finally, I remove the link.
When I run it, the CSV temp file is created and deleted, but nothing appears in my DB, nor in the output txt.
If I echo the command and paste it into my terminal, it works flawlessly, but in exec(), nothing happens.
I believe it has something to do with the quotes and backslashes, but I couldn't manage to fix it.
Any help will be appreciated :P
EDIT: I already chmod'ed the mysqlimport binary… still didn't work!
It seems PHP wasn't finding the mysqlimport command, so I had to use its full path to make it work!
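A minimal sketch of the fix (the /usr/bin location is typical; check yours with "which mysqlimport"):
// Use the absolute path so PHP's restricted PATH doesn't matter;
// 2>&1 also captures errors into the output file for easier debugging.
exec("/usr/bin/mysqlimport -u root -ppass --local --ignore-lines=1 "
    . "--columns=column1,column2 " . $path . "CSVs.txt > " . $path . "_.txt 2>&1");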