rsync copy folder hierarchy but only files in last folder - php

I wonder if this is possible. Basically, I am trying to write a very basic script to help me speed up the upgrade process of several installs of a CMS system (where I need to copy across certain folders).
So let's say I have a folder:
/home/sites/domain.co.uk/public_html/folder1/folder2/files
What I need to do is copy the folder tree and only copy the files in the files folder.
At the moment I am using rsync -a via:
<?php
exec("rsync -a /home/sites/domain.co.uk/public_html/folder1/folder2/files /home/sites/domain.co.uk/copy");
?>
The problem with this is instead of creating the following:
/home/sites/domain.co.uk/copy/folder1/folder2/files
it creates
/home/sites/domain.co.uk/copy/files
Is there a way to use rsync to copy the preceding folders/structure as well, so that I end up with the following?
/home/sites/domain.co.uk/copy/folder1/folder2/files
I'm quite happy to use rsync as opposed to creating a full PHP script, etc.

rsync's functionality is to transfer files from one location to another, or from one server to another (if you have access to the other server).
In your example:
<?php
exec("rsync -a /home/sites/domain.co.uk/public_html/folder1/folder2/files /home/sites/domain.co.uk/copy");
?>
There must be a files folder on your server at the given path; all the files it contains will be transferred into the copy folder.
Instead, you can try the command below to transfer your files:
<?php
exec("rsync -auv -e ssh --progress /home/sites/domain.co.uk/public_html/folder1/folder2/files /home/sites/domain.co.uk/copy");
?>
The --progress flag shows which file is currently being transferred. (Note that -e ssh only matters when one side of the transfer is a remote host.)
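If the goal is to recreate the intermediate folders as asked, rsync's --relative (-R) flag does exactly that. A minimal sketch, assuming the same paths as above (the /./ marker tells rsync where the preserved portion of the source path should begin):
<?php
// rsync -R (--relative) recreates the source path under the destination.
// Everything after the "/./" marker is kept, so this produces
// /home/sites/domain.co.uk/copy/folder1/folder2/files
exec("rsync -aR /home/sites/domain.co.uk/public_html/./folder1/folder2/files /home/sites/domain.co.uk/copy");
?>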

Related

hard links between php files in different directories, what should be the expected behaviour of __FILE__

I have two websites that are identical in structure. The index.php file needs to generate different content depending on the domain the page is being served from. I am using Apache 2.2 and created two virtual hosts using two different folders under /var/www (/var/www/site.domain.com and /var/www/site.domain.ca).
I am using __FILE__ to obtain the full path of the executing index.php, and depending on the path I output the correct content.
Since the files are all the same, I wanted to use links to make editing easier: instead of editing one index.php and then copying it to the other directory, I wanted edits to either file to update the other.
I used the cp -al command to copy with hard links:
cp -al /var/www/site.domain.com /var/www/site.domain.ca
The problem is that when I access the index.php file in one site vs. the other, the __FILE__ constant does not reflect the path of the index.php that is executing. Depending on which domain I visit first, __FILE__ will reflect the path of that domain.
I will try using getcwd() to see if that works, but can someone explain why this is happening? Shouldn't __FILE__ reflect the current script that's executing?
These are hard links, not soft links.
Is Apache caching the script? Is APC the source of the problem?
Update: getcwd() worked; it seems to always return the correct current directory.
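A minimal sketch of that workaround; the string test and content strings are placeholders, the domains are from the question:
<?php
// Per the update above, getcwd() reflects the directory of the vhost
// actually serving the request, even when the scripts are hard links.
$docroot = getcwd(); // e.g. /var/www/site.domain.com
if (strpos($docroot, 'site.domain.com') !== false) {
    echo 'Content for the .com site';
} else {
    echo 'Content for the .ca site';
}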
That's how hard links work. The system is not going to scan the entire filesystem to figure out alternative names for the file. A hard link is exactly that: HARD. "This is an official, unchanging, immutable name for this file."
e.g.:
$ cat z.php
<?php
echo __FILE__, "\n";
$ mkdir subdir
$ cd subdir
$ ln ../z.php hardlink.php
$ ln -s ../z.php softlink.php
$ php hardlink.php
/home/marc/test/subdir/hardlink.php
$ php softlink.php
/home/marc/test/z.php
Note how the hard link displays the location of the hard link itself, while the soft link (aka symlink) displays the target of the link, not the link itself.

Setting File Permissions in Windows with PHP

I have a script that uploads a *.csv file to import into a DB table I have, which works great on Linux via chmod($target, 0777); but I can't for the life of me find a way to do exactly this on a Windows-based Apache server.
Some other posts have people responding "don't put in the 0777 and it should work" but that's not the case for me. Thanks!
Thanks to the comment left on my original post I was able to figure it out with a little more help from https://web.archive.org/web/20171121192635/http://www.howyoudo.info/index.php/how-to-fix-windows-server-upload-file-inherit-permissions-error/
The problem only happens when you use PHP to upload a file. When you upload a file, PHP sends the file to a temporary directory on the hard drive (for me it is C:\Windows\Temp) and then copies it over to its intended directory. Once the file has landed in the temporary directory, it is assigned the permissions of that directory. The problem is that when Windows copies that file, it keeps the temporary directory's permissions and doesn't inherit your web directory's permissions.
The easiest way to fix this problem is to add your intended web directory's permissions to the temporary directory. There's no need to erase the permissions already on the temporary directory; just add the web directory's permissions to them. In other words, follow these steps:
1. To change the permissions of your temporary upload directory, find the "upload_tmp_dir" setting in your php.ini file.
2. Set it to a directory of your choosing (outside your web folders, of course) or leave it at the default (for me it is C:\Windows\Temp).
3. Browse to this folder and add the permissions of your web folders to it.
While Brian Leishman's answer will work, if you do not have the ability to edit permissions on the temp folder, you can make your uploaded file inherit permissions from its new location by running the following on the command line:
icacls "target.txt" /q /c /reset
So, using PHP's exec() function:
exec( 'icacls "target.txt" /q /c /reset' );
For details of the various icacls flags, see: https://technet.microsoft.com/en-us/library/cc753525.aspx
Tip: Using the /t flag, you can use pattern matching to process multiple files.
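Putting it together, a hypothetical upload-handler sketch (the destination path and the 'csv' form field name are assumptions):
<?php
// Move the uploaded CSV into place, then reset its ACLs so it inherits
// the destination folder's permissions instead of the temp dir's.
$target = 'C:\\www\\uploads\\import.csv';
if (move_uploaded_file($_FILES['csv']['tmp_name'], $target)) {
    // /q quiet, /c continue on errors, /reset re-applies inherited ACLs
    exec('icacls ' . escapeshellarg($target) . ' /q /c /reset', $output, $status);
    if ($status !== 0) {
        error_log("icacls failed:\n" . implode("\n", $output));
    }
}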

Running pdftohtml with PHP - can't find resulting HTML file

I have a PHP script that runs this command:
system("pdftohtml -i -noframes /home/myacc/pdfs/test.pdf");
When I run this from within public_html, an html file called test.html is successfully created and is placed in the public_html folder. However, when I run this from a cron job outside of public_html, no html file is created in any folder on the server.
Can anyone help me find where the resulting html file will be?
Thank you.
The problem lies in how cron jobs work: they run with a minimal environment and a different working directory than your login shell. Try using the absolute path to pdftohtml.
Also, from your comment, you can't change where pdftohtml writes its output. However, there's a trick to work around this, using a custom shell script:
#!/bin/bash
# pdftohtml writes the resulting HTML to the current working directory,
# so cd to the desired output folder first, then call the binary by its
# absolute path (cron provides almost no PATH).
cd /path/to/output/
/path/to/pdftohtml -i -noframes /home/myacc/pdfs/test.pdf
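The same idea works without a wrapper script; a sketch assuming the HTML should land in public_html and pdftohtml lives in /usr/bin:
<?php
// chdir() so pdftohtml drops test.html where we want it, and call the
// binary by absolute path since cron's PATH is minimal.
chdir('/home/myacc/public_html');
system('/usr/bin/pdftohtml -i -noframes /home/myacc/pdfs/test.pdf');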

scandir outside web tree using PHP5

I'm looking for a way to get a directory listing for files that are outside the webserver's tree. For example, I want to list all files and folders under the '/home' directory and put them in an array (just as scandir does).
I can 'sudo su' to a user that has rights to check the directory's contents, but I don't know how to turn into an array the listing I could get from:
exec('ls -la /home');
Or maybe this could be done with a bash script?
Difficult. The 'su' command does not take the password from stdin for security reasons, and that makes it pretty much impossible to use it from PHP.
You need to find an alternative.
Will the list be huge? Does it have to be real time?
Can you perhaps edit the crontab of that other user, who has permission to list the files? If yes, then you could run a command that writes the list of files to a text file, which you would then be able to read from PHP. Cron runs at most once every minute, but if your directory does not change very often, that would work.
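A hedged sketch of that arrangement, assuming the privileged user's crontab contains a line like * * * * * ls -la /home > /tmp/home_listing.txt:
<?php
// PHP only needs to read the pre-generated listing into an array.
$lines = file('/tmp/home_listing.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
if ($lines === false) {
    die('Listing has not been generated yet');
}
print_r($lines);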

Zipping files through shell_exec in PHP: problem with paths

I'm working on a web backup service part of which allows the user to download a zip file containing the files they have selected from a list of what they have backed up.
The zipping process is done via a shell_exec command. For this to work I have given the Apache user passwordless sudo privileges for the zip command.
My problem is that since I can't do "sudo cd /path/to/user/files" to get into the folder containing the user's backed-up files, I need to specify the full absolute paths of the files to be zipped, and those absolute paths then end up inside the zip file, which I don't want.
Here's what I mean:
The user, 'test1' has their files backed up which are stored as follows
/home/backups/test1/my pics/pic1.jpg
/home/backups/test1/my pics/pic2.jpg
/home/backups/test1/my pics/pic3.jpg
/home/backups/test1/hello.txt
/home/backups/test1/music/band/song.mp3
When these files are zipped up they keep the full paths but I just want it to show:
my pics/pic1.jpg
my pics/pic2.jpg
my pics/pic3.jpg
hello.txt
music/band/song.mp3
Is there some way to tell zip to truncate the paths, or specify a custom path for each file?
Or is there some way I can change the working directory to the user's root backup folder so I can simply specify relative file paths.
Thanks for reading, hope you got this far and it all makes sense!
I have to slightly question why you are making a potentially dangerous system shell call when there are a number of good PHP zipping classes around. The tutorial here shows how to easily create a class that will output a zip file to the browser. There are also a number of classes on phpclasses.org; this seems to be the best one.
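As one illustration, a minimal sketch using PHP's built-in ZipArchive extension; the second argument to addFile() sets the entry name stored in the archive, which sidesteps the absolute-path problem entirely (paths are assumptions):
<?php
$base = '/home/backups/test1/';
$zip = new ZipArchive();
if ($zip->open('/tmp/test1.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
    die('Could not create archive');
}
$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($base, FilesystemIterator::SKIP_DOTS)
);
foreach ($files as $file) {
    // Store "my pics/pic1.jpg" rather than "/home/backups/test1/my pics/pic1.jpg"
    $zip->addFile($file->getPathname(), substr($file->getPathname(), strlen($base)));
}
$zip->close();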
If you have to do it with a system call my suggestions are:
To truncate the path for the file you could use symbolic links.
Can you not increase the permissions of the zip executable so that Apache can use it without sudo?
Have you tried using chdir() to change the working directory?
chdir('/home/backups/test1/');
A better idea may be to make a shell script and grant the webserver sudo access to that shell script. This may be more secure, too:
#!/bin/bash
#
# Zip up files for a given user
#
# usage: backupToZipFile.sh <username>
cd /home/backups/"$1" || exit 1
tar -czf /tmp/"$1".tgz .
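From PHP the wrapper would then be invoked via sudo; a hedged sketch, assuming a sudoers rule such as www-data ALL=(ALL) NOPASSWD: /usr/local/bin/backupToZipFile.sh and that the script is installed at that path:
<?php
// escapeshellarg() keeps a malicious username from injecting shell syntax.
shell_exec('sudo /usr/local/bin/backupToZipFile.sh ' . escapeshellarg('test1'));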
Also, have you considered running sudo as the user whose files you're zipping, instead of as root? This would be more secure as well.
It seems like a simple option, but according to the man page it's just not there. You could of course symlink the stuff you want to archive into the location where you'll create the zip file (I assume you can actually cd to that directory). With the proper options, zip will not archive the symlinks themselves but the linked files instead, using the names of the symlinks.
E.g. create symlinks /tmp/$PID/my pics/pic1.jpg etc., and then zip everything in /tmp/$PID, roughly as sketched below.
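A hypothetical sketch of that approach (paths are assumptions; zip follows symlinks by default, storing the linked files' contents under the symlinks' names):
<?php
$src = '/home/backups/test1';
$tmp = '/tmp/' . getmypid();
mkdir($tmp);
// Mirror the top level of the user's tree as symlinks under the temp dir.
foreach (scandir($src) as $entry) {
    if ($entry !== '.' && $entry !== '..') {
        symlink("$src/$entry", "$tmp/$entry");
    }
}
// Zip from inside the temp dir so the stored paths are relative.
chdir($tmp);
shell_exec('zip -r /tmp/test1.zip .');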
