I'm working on a web backup service, part of which allows the user to download a zip file containing the files they have selected from a list of what they have backed up.
The zipping process is done via a shell_exec() call. For this to work, I have given the Apache user passwordless sudo privileges for the zip command.
My problem is that since I can't do "sudo cd /path/to/user/files" into the folder containing the user's backed-up files, I have to pass the full absolute paths of the files to be zipped, and those absolute paths then end up inside the zip file, which I don't want.
Here's what I mean:
The user, 'test1' has their files backed up which are stored as follows
/home/backups/test1/my pics/pic1.jpg
/home/backups/test1/my pics/pic2.jpg
/home/backups/test1/my pics/pic3.jpg
/home/backups/test1/hello.txt
/home/backups/test1/music/band/song.mp3
When these files are zipped up, they keep their full paths, but I just want the archive to contain:
my pics/pic1.jpg
my pics/pic2.jpg
my pics/pic3.jpg
hello.txt
music/band/song.mp3
Is there some way to tell zip to truncate the paths, or to specify a custom path for each file?
Or is there some way I can change the working directory to the user's root backup folder so I can simply specify relative file paths?
Thanks for reading, hope you got this far and it all makes sense!
I have to slightly question why you are making a potentially dangerous system shell call when there are a number of good PHP zipping classes around. The tutorial here shows how to easily create a class that will output a zip file to the browser. There are also a number of classes on phpclasses.org; this seems to be the best one.
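For instance, here is a minimal sketch using the bundled ZipArchive extension; the output path is an assumption, and note that the PHP process still needs read access to the backup files:
<?php
// Build the archive in PHP rather than shelling out to zip.
$base = '/home/backups/test1/';
$zip = new ZipArchive();
$zip->open('/tmp/test1-backup.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE);

$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($base, FilesystemIterator::SKIP_DOTS)
);
foreach ($files as $file) {
    if ($file->isFile()) {
        // The second argument is the name stored inside the archive,
        // so the /home/backups/test1/ prefix is stripped.
        $zip->addFile($file->getPathname(), substr($file->getPathname(), strlen($base)));
    }
}
$zip->close();
?>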
If you have to do it with a system call, my suggestions are:
To truncate the path for the file you could use symbolic links.
Can you not increase the permissions of the zip executable so that Apache can use it without sudo?
Have you tried using chdir() to change the working directory?
chdir('/home/backups/test1/');
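Since child processes launched by shell_exec() inherit PHP's working directory, something like this sketch (the output path is assumed, file names taken from the question) would let you pass relative paths to zip:
<?php
// zip inherits this working directory, so only relative paths
// end up stored in the archive.
chdir('/home/backups/test1/');
shell_exec('sudo zip -r /tmp/test1-backup.zip ' . escapeshellarg('my pics') . ' hello.txt music');
?>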
A better idea may be to make a shell script and grant the web server sudo access to that shell script. This may be more secure, too:
#!/bin/bash
#
# Zip up files for a given user
#
# usage: backupToZipFile.sh <username>
cd "/home/backups/$1" || exit 1
tar -czf "/tmp/$1.tgz" .
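The web app could then call the script through sudo. A rough sketch, assuming the script lives at /usr/local/bin/backupToZipFile.sh and a sudoers entry like "www-data ALL=(root) NOPASSWD: /usr/local/bin/backupToZipFile.sh" exists:
<?php
// Only the (escaped) username crosses the privilege boundary;
// the script itself controls what gets archived.
$username = 'test1'; // hypothetical user
shell_exec('sudo /usr/local/bin/backupToZipFile.sh ' . escapeshellarg($username));
?>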
Also, have you considered running sudo as the user whose files you're zipping, instead of as root? That would be more secure as well.
It seems like a simple option, but according to the man page it's just not there. You could of course symlink the stuff you want to archive into the location where you'll create the zip file (I assume you can actually cd to that directory). With the proper options, zip will not archive the symlinks themselves but the symlinked files instead, using the names of the symlinks.
E.g. create symlinks /tmp/$PID/my pics/pic1.jpg etc., and then zip everything in /tmp/$PID.
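A sketch of that staging approach in PHP (directory and file names are illustrative):
<?php
// Stage symlinks under a per-request temp directory, then zip from there.
$staging = '/tmp/' . getmypid();
mkdir($staging . '/my pics', 0700, true);
symlink('/home/backups/test1/my pics/pic1.jpg', $staging . '/my pics/pic1.jpg');
// ...repeat for each selected file...
chdir($staging);
// Without -y, zip follows the symlinks and stores the targets' contents
// under the links' relative names.
shell_exec('zip -r /tmp/test1-backup.zip .');
?>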
I'm facing a problem with deleting video files from a folder using PHP's unlink() function. Images delete fine, but when trying to delete videos it says:
unlink(file_path): Permission denied.
You (running your script either through the CLI or a web server) need write access to the directory in which the files are located, so access to the file itself is not enough.
Your image directory is presumably different and writable by the web server or CLI user.
chmod("your/video/dir/path",0777);
Try using the above code before unlinking the video in your script.
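Put together, a hedged sketch of that flow (the paths are illustrative):
<?php
$path = 'your/video/dir/path/video.mp4'; // hypothetical file
$dir = dirname($path);

// unlink() removes a directory entry, so the directory itself
// must be writable, not just the file.
if (!is_writable($dir)) {
    chmod($dir, 0777);
}
if (!unlink($path)) {
    echo "Could not delete $path";
}
?>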
Edit: It looks like you are using Windows. Unfortunately, my answer is for Unix-like operating systems (Linux, macOS). You could try installing the Bash extension for Win8+, but I'm still not sure whether that would work. However, I'm keeping my answer here in case anyone with a similar problem is looking for one.
Changing permissions through PHP might work in some cases, but not always, because if you don't have permissions to delete the file you might also not have permissions to change them.
The best solution is to create a directory where you will keep files to which PHP will have full access. Let's call it dirname. Once you have created a directory, change its owner and group to the one corresponding to your web server user's name (if you're using Apache, it's "www-data"), for example: chown www-data:www-data dirname.
Once you've done that, change the folder's permissions. My suggestion is 744: it ensures that the owner has all permissions, while everyone else can only read. To do that, execute the following command: chmod -R 744 dirname.
Now you should be able to do anything you want with the files in given directory directly from PHP.
I wonder if this is possible. Basically, I am trying to write a very basic script to help me speed up the upgrade process for several installs of a CMS (where I need to copy across certain folders).
So let's say I have a folder:
/home/sites/domain.co.uk/public_html/folder1/folder2/files
What I need to do is copy the folder tree and only copy the files in the files folder.
At the moment I am using rsync -a via:
<?php
exec("rsync -a /home/sites/domain.co.uk/public_html/folder1/folder2/files /home/sites/domain.co.uk/copy");
?>
The problem with this is that instead of creating the following:
/home/sites/domain.co.uk/copy/folder1/folder2/files
it creates
/home/sites/domain.co.uk/copy/files
Is there a way to use rsync to copy the preceding folder structure as well, so I end up with the following?
/home/sites/domain.co.uk/copy/folder1/folder2/files
I'm quite happy to use rsync as opposed to writing a full PHP script, etc.
rsync's purpose is to transfer files from one location to another, or from one server to another (if you have access to the other server).
In your example:
<?php
exec("rsync -a /home/sites/domain.co.uk/public_html/folder1/folder2/files /home/sites/domain.co.uk/copy");
?>
There must be a files folder on your server at the given path containing the files; all of those files will be transferred into the copy folder.
Instead, you can try the command below to transfer your files:
<?php
exec("rsync -auv -e ssh --progress /home/sites/domain.co.uk/public_html/folder1/folder2/files /home/sites/domain.co.uk/copy");
?>
--progress will show which file is being transferred.
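As a side note, if the goal is to keep the folder1/folder2 structure under the destination (as the question asks), rsync's --relative (-R) flag together with a /./ marker in the source path should do it; a sketch, not tested here:
<?php
// The "/./" marks where the preserved path begins, so this should create
// /home/sites/domain.co.uk/copy/folder1/folder2/files
exec("rsync -aR /home/sites/domain.co.uk/public_html/./folder1/folder2/files /home/sites/domain.co.uk/copy");
?>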
I have two websites that are identical in structure. The index.php file needs to generate different content depending on the domain the page is being served from. I am using Apache 2.2 and created two virtual hosts using two different folders under /var/www (/var/www/site.domain.com and /var/www/site.domain.ca).
I am using __FILE__ to obtain the full path of the executing index.php, and depending on the path I output the correct content.
Since the files are all the same, I wanted to use links to make editing the file easier, so instead of editing one of the index.php, then copying it to the other directory, I wanted editing either of files to update the other.
I used the cp -al command to copy with hard links:
cp -al /var/www/site.domain.com /var/www/site.domain.ca
The problem is that when I access the index.php file on one site vs. the other, __FILE__ does not reflect the path of the index.php that is actually executing. Depending on which domain I visit first, __FILE__ will reflect the path of that domain.
I will try using getcwd() to see if that works, but can someone explain why this is happening? Shouldn't __FILE__ reflect the current script that's executing?
These are hard links, not soft links.
Is Apache caching the script? Is APC the source of the problem?
Update: getcwd() worked; it seems to always return the correct current directory.
That's how hardlinks work. The system is not going to scan through the entire filesystem to try to figure out alternative names for the file. A hardlink is exactly that: HARD. "This is an official, unchanging, immutable name for this file."
e.g:
$ cat z.php
<?php
echo __FILE__, "\n";
$ mkdir subdir
$ cd subdir
$ ln ../z.php hardlink.php
$ ln -s ../z.php softlink.php
$ php hardlink.php
/home/marc/test/subdir/hardlink.php
$ php softlink.php
/home/marc/test/z.php
Note how the hardlink displays the location of the hardlink itself, while the softlink (aka symlink) displays the target of the link, not the link itself.
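You can confirm that both names refer to the same underlying file by comparing inodes, e.g. (using the paths from the example above):
<?php
// Hard links share an inode; only the directory entries differ.
var_dump(fileinode('/home/marc/test/z.php') === fileinode('/home/marc/test/subdir/hardlink.php')); // bool(true)
?>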
I have a script that uploads a *.csv file to import into a DB table; it works great on Linux thanks to chmod($target, 0777);, but I can't for the life of me find a way to do exactly this on a Windows-based Apache server.
Some other posts have people responding "don't put in the 0777 and it should work", but that's not the case for me. Thanks!
Thanks to the comment left on my original post I was able to figure it out with a little more help from https://web.archive.org/web/20171121192635/http://www.howyoudo.info/index.php/how-to-fix-windows-server-upload-file-inherit-permissions-error/
The problem only happens when you use PHP to upload a file. When you upload a file, PHP sends the file to a temporary directory on the hard drive (for me it is C:\Windows\Temp) and then copies it over to its intended directory. Once the file has landed in the temporary directory, it is assigned the permissions of that directory. The problem is that when Windows copies that file, it keeps the temporary directory's permissions and doesn't inherit your web directory's permissions.
The easiest way to fix this problem is to add your intended web directory's permissions to the temporary directory. There's no need to erase the permissions already on the temporary directory; just add the web directory's permissions to them. In other words, follow these steps:
To change the permissions of your temporary upload directory, find the "upload_tmp_dir" setting in your php.ini file.
Set it to the directory of your choosing (outside your web folders, of course) or leave it at the default (for me it is C:\Windows\Temp).
Browse to this folder and add the permissions of your web folders to it.
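For reference, the relevant php.ini line might look like this (the path is just an example):
upload_tmp_dir = "C:\Windows\Temp"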
While Brian Leishman's answer will work, if you do not have the ability to edit permissions on the temp folder, you can make your uploaded file inherit permissions from its new location by running the following on the command line:
icacls "target.txt" /q /c /reset
So, using PHP's exec() function:
exec( 'icacls "target.txt" /q /c /reset' );
For details of the various icacls flags, see: https://technet.microsoft.com/en-us/library/cc753525.aspx
Tip: Using the /t flag, you can use pattern matching to process multiple files.
I'm having some problems zipping a directory.
The following line will do the trick, but it also includes the root directory:
exec('zip -r '.$tmp_zip.' '.$filename_no_ext.'/rss-ticker/*');
I only want to zip everything inside the rss-ticker dir.
How do I fix this?
Thanks for your help.
What zip program are you using? tar works the way you want above; then you can gzip the result.
Alternatively, chdir() to the directory you want to zip and specify your path as *; that should only get the files in the current working directory.
If you can't get it to work in the way you want to (or even if you can) try the ZIP Extension or this 3rd party library - doing it in pure PHP will make your code more portable.
Unless you have an overwhelming reason not to (like a desire to cause yourself much pain and headache), you really should be using the ZipArchive toolkit. There is an example of how to do it at this question.
As to getting zip to work with exec, I noticed two points:
You will definitely need the $filename_no_ext variable in there. Without it, the path would start with /rss-ticker, and zip would assume the folder is at the root of the filesystem; I don't think you want that.
You do not need the trailing * to get the command-line zip tool to work.
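Putting both points together, a sketch of the corrected call might be (assuming $tmp_zip holds an absolute path, since the working directory changes):
<?php
// Zip the contents of rss-ticker without the directory itself
// appearing in the archive paths.
chdir($filename_no_ext . '/rss-ticker');
exec('zip -r ' . escapeshellarg($tmp_zip) . ' .');
?>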